
Tuesday, March 29, 2011

Let's Develop Solutions

Tired of the rhetoric? Want to take a stab at cutting costs in Wisconsin public higher education yourself-- or even try increasing productivity?

The Lumina Foundation has supported the development of an amazing interactive tool that helps you do just that.

Here's one result I generated:

Let's say we need to close the 2025 budget gap for Wisconsin public research universities to maintain current spending per FTE student. We can do that by increasing the student/faculty ratio from 13:1 to 17:1. Period. Gap closed. No increases in tuition or state & local revenues necessary. And research suggests that such an increase will come at no significant cost to degree completion rates. If you want to suggest it will hurt instructional quality, you'll need to provide hard causal evidence to support that case--I'd love to see it, so email it to me!

Better yet, let's first increase faculty salaries per FTE to the 75th percentile (an increase of about $1,000 from a starting point of about $6,300) and do the same for student support services. Let's further commit to no tuition increases, and assume no increase in state or local revenues either. We can do ALL that and still have no budget gap if we increase the student/faculty ratio from 13:1 to 19:1.
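
If you want a feel for the arithmetic behind those scenarios, here's a minimal sketch in Python. To be clear, this is not the Lumina tool's model: the assumption that salary cost per student scales inversely with the ratio, and every output figure, are my own illustration, anchored only to the $6,300 and 13:1 numbers above.

```python
# Toy version of the ratio arithmetic above -- NOT the Lumina tool itself.
# The real tool models many more inputs (tuition, state revenues,
# enrollment); this sketch assumes only that faculty salary cost per FTE
# student scales inversely with the student/faculty ratio. The $6,300 and
# 13:1 figures come from the post; everything else is illustrative.

BASE_RATIO = 13.0        # current student/faculty ratio (13:1)
SALARY_PER_FTE = 6300.0  # faculty salary cost per FTE student at 13:1

def savings_per_fte(new_ratio: float) -> float:
    """Per-student savings when the same payroll serves more students."""
    new_cost = SALARY_PER_FTE * (BASE_RATIO / new_ratio)
    return SALARY_PER_FTE - new_cost

for ratio in (15, 17, 19):
    print(f"{ratio}:1 frees up about ${savings_per_fte(ratio):,.0f} per FTE student")
# 15:1 -> ~$840, 17:1 -> ~$1,482, 19:1 -> ~$1,989
```

The magnitudes are invented, but they show the lever: each step up in the ratio frees hundreds of dollars per FTE student, which is why a couple of points of ratio can substitute for a tuition hike in these scenarios.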

What is required to increase the student/faculty ratio? Obviously we either enroll more students, retain more students, or reduce the size of the faculty. Here are the two main challenges:

(1) There is a widely held belief that student/faculty ratio is THE measure of quality in higher education, despite an overwhelming dearth of evidence to support that belief. It's no coincidence that ranking systems rely so heavily on that measure--and that all this talk of being competitive seems to set aside any possible changes to the student/faculty ratio. In fact, since the ratio is interpreted to mean "commitment to teaching," it effectively precludes any real reconsideration, lest we come across as not committed to education! But come on--what evidence is there that the number of faculty allocated to students is the best indicator of commitment? How about the number of highly trained faculty? The amount of professional development offered? The valuation of teaching in tenure decisions? This reeks of a system that responds to the needs of faculty more than students (for more, see my next point). There are alternative ways to measure quality.

(2) Faculty. Faculty at research universities tend to strive for as little student interaction as possible. Yep, I said it. There are some exceptions, but generally we spend our time vying for smaller classes and less advising. Could we learn to teach bigger classes and do it well? Could we be required to do so at least semi-regularly? Could the advising load for undergrads be spread across a wider range of faculty (including those in departments that don't teach undergrads)? Sure. But you'll face resistance.

So let's stop pretending that there's only one way to skin this cat. We don't have to break from the UW System, hike tuition, and/or become semi-private in order to solve our fiscal crisis. What we do have to do is have tough conversations about the best ways to deliver higher education in the 21st century. Sure, that's a tall order--but it's one that the smart communities of Wisconsin's public universities can no doubt handle.

Thursday, January 28, 2010

Making SAFRA Count

The end of last year was a busy time for me as I waited out the birth of my daughter, who decided to spend an extra 10 days lounging in utero before emerging into the Wisconsin winter. I was so focused on strategies to promote her exit (sidenote: talk about an area in need of better research--given gobs of data on live births spanning hundreds of years, docs still refuse to hazard a prediction of labor occurring on any given night!) that I virtually shut out the world of higher education policy. Imagine!

Thankfully, others were hard at work around and over the holidays, thinking about ways to make sure that the substantial, timely, and hard-won investment that will (fingers crossed) soon come to higher education via the Student Aid and Fiscal Responsibility Act (SAFRA) is as effective as possible. Evidence of that work is contained in a December Lumina Foundation memorandum to the U.S. Department of Education, awkwardly (but accurately) titled "Structuring the Distribution of New Federal Higher Education Program Funding to Assure Maximum Effectiveness."

The memo gets it (mostly) right. There's great potential for this money to count, but also a real possibility it will do next to nothing if mismanaged. For example, if definitions of key terms like "college completion" are vague, and standards for "rigorous" research evidence ambiguous, then funds will likely go to continuing business as usual--for example, supporting programs that purport to increase college access while doing little to change rates of success--leading some to ask, access to what?

To avoid this, the Department of Education needs a distribution system based first and foremost on one principle: keep it simple. It should make states define college completion and disseminate that definition--then stick to it. It's easiest to tell whether plans are straightforward and consistent with intended principles if prospective grantees are forced to explain their ideas concisely. Lumina gets this, and its team recommends a two-step process that requires a concept paper in advance of a full proposal.

So the good news is that this Lumina paper hits many of the key issues and makes some solid recommendations. That said, its authors missed an opportunity to address one important issue. The section titled "How will the U.S. Department of Education know if these investments are actually helping to meet the President's goal?" is essential. It goes to the heart of one major goal of SAFRA--to increase the body of knowledge about what works in promoting college completion, and therefore the field's capacity to create lasting change.

As I recommended to ED's Bob Shireman early last year, we can do higher education a great service by holding a high bar for what constitutes research on college completion. Too often research in higher education hypothesizes that policies or practices advance desired outcomes, but utilizes insufficient methods to establish causal linkages between the two. As a result, we often don't know whether the results we see can be directly attributed to the new practice or investment.

In this case, ED should define "research," "researchers," and "evidence," ideally in ways that are consistent with current practices at the Institute of Education Sciences, and require states to use those definitions. There should be a prescriptive process for selecting researchers (so as to make sure that truly independent evaluations are conducted), and proposals that allow for sustained research should be prioritized (e.g., those that leverage supportive foundation funding to continue the work to assess mid- and long-range outcomes). I'd also like to see ED involved in increasing the capacity of researchers to do this kind of work, since it's far from clear how the demand for new work can be met by the current supply of higher education researchers. Maybe an IES pre- and/or post-doc training program targeted to postsecondary education?

Sure, this would require setting aside sufficient funds for the research side of the initiatives--but absent that investment, we'll likely never know whether the money spent on SAFRA-funded programs and policies had any real effect. That would, of course, be business as usual--precisely what we must avoid if we want to make this once-in-a-lifetime opportunity really count.

Tuesday, July 28, 2009

(Re)Focusing on What Matters

Last week I spoke at a meeting of the Lumina Foundation's Achieving the Dream Initiative, which gathered policymakers from 15 states, all working to improve the effectiveness of community colleges. At one point, a data working group shared results of its efforts to create new ways to measure college outputs. This was basically a new kind of report card, one capable of reporting results for different subgroups of students and enabling comparisons of outcomes across colleges. Something like it might someday replace the data collection currently done through IPEDS.

While it's always gratifying to see state policymakers engaging with data and thinking about how to use it in meaningful ways, I couldn’t help but feel that even this seemingly forward-thinking group was tending toward the status quo. The way we measure and report college outputs right now consistently reinforces a particular way of thinking-- a framework that focuses squarely on colleges and their successes or failures.

What’s the matter with that, you’re probably wondering? After all, aren’t schools the ones we need to hold accountable for outcomes and improved performance? Well, perhaps. But what we’re purportedly really interested in—or what we should be interested in—is students, and their successes or failures. If that's the case, then students, rather than colleges, need to be at the very center of our thinking and policymaking. Right now this isn't the case.

Let's play this out a bit more. Current efforts are afoot to find ways to measure college outcomes that make more colleges comfortable with measurement and accountability--and thus help bring them onboard. That typically means using measures that allow even the lowest-achieving colleges at least a viable opportunity for success, and using measures colleges feel are meaningful, related to what they think they're supposed to be doing. An example: the 3-year associate's degree completion rate of full-time community college entrants deemed "college ready" by a standardized test. We can measure this for different schools and report the results. Where does that get us? We can then see which colleges have higher rates, and which have lower ones.

But then what? Can we then conclude some colleges are doing a better job than others? Frankly, no. It's quite possible that higher rates at some colleges are attributable to student characteristics, or to contextual factors outside an individual college (e.g., proximity to other colleges, the local labor market, region) that explain the differences. But that's hard to get people to focus on when what's simplest to see are differences between colleges.

It's not clear that this approach actually helps students. What if, instead, states reported outcomes for specified groups of students without disaggregating by college? How might the policy conversation change? Well, for example, a state could see a glaring statewide gap in college completion between majority and minority students. It would then (hopefully) move to the next step of looking for sources of the problem--trying to identify the areas with the greatest influence and those most amenable to policy. This might lead analysts back to the colleges in the state to look for poor or weak performers, but it might instead lead them to aspects of K-12 preparation, state financial aid policy, the organizational structure of the higher education system, etc. The point is that in order to help students, states would need to do more than simply point to colleges and work to inspire them to change. They'd be forced to try to pinpoint the source(s) of the problems and then work on them. I expect the approaches would need to vary by state.
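
To make the contrast concrete, here's a small sketch of the two reporting frames, using invented records in place of a state's student-level data file; the college names, group labels, and numbers are all hypothetical.

```python
# A sketch of the two reporting frames, using made-up records in place of
# a state's student-level data file. All names and values are hypothetical.

from collections import defaultdict

# (college, student group, completed within 3 years)
records = [
    ("College A", "majority", True), ("College A", "minority", False),
    ("College B", "majority", True), ("College B", "minority", True),
    ("College B", "minority", False),
]

def completion_rate(rows):
    return sum(1 for _, _, done in rows if done) / len(rows)

# College-centered frame: one rate per institution.
by_college = defaultdict(list)
for row in records:
    by_college[row[0]].append(row)
for college, rows in sorted(by_college.items()):
    print(college, f"{completion_rate(rows):.0%}")

# Student-centered frame: one statewide rate per student group,
# with no college breakdown at all.
by_group = defaultdict(list)
for row in records:
    by_group[row[1]].append(row)
for group, rows in sorted(by_group.items()):
    print(group, f"{completion_rate(rows):.0%}")
```

On these fake records, the college-centered view tells you College B beats College A; the student-centered view surfaces what the state actually needs to act on, a wide statewide gap between majority and minority completion.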

Don't get me wrong, I'm not trying to absolve any college of responsibility for educating its students. What I'm suggesting is that we think hard about why the emphasis right now rests so heavily on relative college performance--an approach that embraces and even creates more institutional competition--rather than on finding efficient and effective ways to increase the success of our students. Are we over-utilizing approaches, often adopted without much critical thought, that reify and perpetuate our past mistakes? I think so.

Image Credit: www.openjarmedia.com

Saturday, April 11, 2009

The Best Ideas Seem to Come Out of Thin Air

I was at dinner with a group of community college leaders last month when one administrator began to tell me about an "innovation" his college was trying. I'd asked for his thoughts on the productivity agenda in higher ed, an effort to do more with less. Doug Harris and I are working on a Lumina Foundation initiative and are tasked with identifying the most cost-effective ways to increase the number of college grads--groan, a laudable yet seemingly impossible task.

So this guy starts telling me about his program: a call center operating on a $50,000 per year budget. The center makes two kinds of calls: those focusing on recruitment, and those focusing on retention. The first set includes calls to follow up on whether students took the ACT, completed the FAFSA, finished their application, etc. The second set are efforts to check in: why aren't you enrolled this semester, what's going on with an undecided major, welcome calls to late registrants, etc. A staff of four has made 5,000 recruitment calls and 10,000 retention calls in just over a year.

And lo and behold--it looks like, based on pretty solid evidence, this thing is successful--and paying for itself. Comparisons between students who were called and reached, those who were left a voicemail, and those who were called but for whom no message was left indicate that even the most conservative assessment shows the call center not only covering its own costs but GENERATING new revenue through tuition.
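
For a rough sense of why the math can work, here's a back-of-the-envelope sketch; aside from the $50,000 budget, every number below is a placeholder I made up.

```python
# Back-of-the-envelope break-even check for a call center like this one.
# Only the $50,000 annual budget comes from the conversation; the net
# tuition figure and the enrollment effect are hypothetical placeholders.

import math

ANNUAL_BUDGET = 50_000  # the call center's reported yearly cost
NET_TUITION = 1_500     # assumed net tuition per student recruited or retained
EXTRA_STUDENTS = 40     # assumed enrollments attributable to the calls

revenue = EXTRA_STUDENTS * NET_TUITION
print(f"Revenue ${revenue:,} vs. cost ${ANNUAL_BUDGET:,} "
      f"(net ${revenue - ANNUAL_BUDGET:+,})")

# How many extra enrollments does the center need just to pay for itself?
print(f"Break-even at {math.ceil(ANNUAL_BUDGET / NET_TUITION)} extra students a year")
```

At plausible tuition levels, a few dozen recovered enrollments a year pays the whole bill, which is roughly the claim the college's comparison groups support.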

Surprise, surprise--students respond when someone calls to say "I care." On some calls, students reported problems that administrators were then able to resolve. On others, students gained needed information they would've otherwise gone without.

Doug and I are still working on these cost-effectiveness ratios, but I gotta tell you--right now call centers are right there on top. Who knew? When I asked this talented gentleman (Joe DeHart at Des Moines Area Community College) where the idea came from, he referred me to his president, Rob Denson. Rob reports this was a completely organic process--someone decided to make a few recruitment calls that were well-received, it seemed like a good idea to ramp up, and so they did.

At a recent Lumina Foundation meeting, no one in the room seemed to be able to name other colleges trying this. My question is, why the heck not? Backed by strong evidence that social capital is unequally distributed, that information is invaluable, and that people are often receptive to help, this program may well be succeeding--despite its incredible simplicity.

I love stumbling into new ideas like this. You can bet you'll continue to find me at dinners like these, hoping to come upon another one. In the meantime, if you've got thoughts on innovative programs or policies you've tried out in higher ed, write me a note. If you've got data we can use to estimate both costs and effects-- and oh man is that rare-- so much the better!