Beth Keserauskis

Building relationships and making connections

Forbes College Ranking: True Gauge or Sketchy Data?

Forbes recently released its annual college rankings, calculated in partnership with the Center for College Affordability and Productivity. My institution was conspicuously absent, and I was asked to investigate why. (The responsibility for responding to external credibility surveys lies within my unit.) What I have subsequently found is that the rankings are based entirely on existing, publicly available data; no surveys were sent to institutions requesting information. That in and of itself would not trouble me, except that the data points used are, to me and others, questionable at best. Additionally, items like the infamous “other cost” category, an allowable cost institutions report in the financial aid package at the discretion of the student to cover expenses like mileage to clinicals, internships, etc., are treated as actual billed charges. That inflates the “cost of attendance” and, subsequently, the elusive “net price” calculation and the debt load calculation (a predicted figure, I might add).

I have several excerpts from articles, blog posts, and even their own methodology posted below giving a glimpse of the, in my humble opinion, “sketchiness” of this ranking. I can only hope that Joe and Jane Sixpack are able to sort through the variables…oh wait, they likely can’t. So now we have another “think tank” with a clear agenda (political or otherwise) leveraging a brand like Forbes to advance their cause.

How important are rankings like this one and U.S. News and World Report’s Best Colleges? Only the audiences we are trying to attract can tell us. And believe me, I intend to ask them just that, so we can tailor our approach to these surveys accordingly.

Compiling the Forbes/CCAP Rankings (excerpt from the methodology document; the full document can be found on their site)

By the Staff of the Center for College Affordability and Productivity

 Ranking Factors and Weights

The Center for College Affordability and Productivity (CCAP), in conjunction with Forbes, compiled its college rankings using five general categories, with several components within each general category. The weightings are listed in parentheses:

1. Student Satisfaction (27.5%)

  • Student Evaluations from RateMyProfessor.com (17.5%)
  • Actual Freshman-to-Sophomore Retention Rates (5%)
  • Predicted vs. Actual Freshman-to-Sophomore Retention Rates (5%)

2. Post-Graduate Success (30%)

  • Listings of Alumni in Who’s Who in America (10%)
  • Salary of Alumni from Payscale.com (15%)
  • Alumni in Forbes/CCAP Corporate Officers List (5%)

3. Student Debt (17.5%)

  • Average Federal Student Loan Debt Load (10%)
  • Student Loan Default Rates (5%)
  • Predicted vs. Actual Percent of Students Taking Federal Loans (2.5%)

4. Four-year Graduation Rate (17.5%)

  • Actual Four-year Graduation Rate (8.75%)
  • Predicted vs. Actual Four-year Graduation Rate (8.75%)

5. Competitive Awards (7.5%)

  • Student Nationally Competitive Awards (7.5%)

School Selection

The 650 institutions of higher education in this ranking are schools which award undergraduate degrees or certificates requiring “4 or more years” of study, according to the U.S. Department of Education, and only those schools categorized by The Carnegie Foundation as Doctorate-granting Universities, Master’s Colleges and Universities, or Baccalaureate Colleges are included in this sample of schools.

Of the 650 schools included in the sample, 608 were included in the 2010 college ranking. (A total of 610 schools were ranked in 2010, but two of them, Bryant University and Missouri University of Science and Technology, are now classified as “Special Focus” institutions by the Carnegie Foundation.) We have accounted for any name changes that have occurred over the past year. The 42 schools added this year to the sample are all institutions classified by the Carnegie Foundation as Doctoral/Research Universities and were added based upon undergraduate enrollment size.
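
To make the arithmetic of those weights concrete, below is a minimal sketch, in Python, of how twelve normalized component scores would roll up into a single composite under the percentages listed above. This is my own illustration, not code CCAP publishes; the variable names and the 0–100 normalization are assumptions, since the excerpt does not say how each component is scaled.

# Minimal sketch of a weighted composite ranking score (assumed 0-100 scale
# for each component; weights taken from the Forbes/CCAP category list above).
WEIGHTS = {
    "rate_my_professors_evaluations": 0.175,
    "actual_retention_rate": 0.05,
    "predicted_vs_actual_retention": 0.05,
    "whos_who_listings": 0.10,
    "payscale_alumni_salary": 0.15,
    "forbes_ccap_corporate_officers": 0.05,
    "average_federal_loan_debt": 0.10,
    "loan_default_rate": 0.05,
    "predicted_vs_actual_pct_borrowing": 0.025,
    "actual_four_year_grad_rate": 0.0875,
    "predicted_vs_actual_grad_rate": 0.0875,
    "nationally_competitive_awards": 0.075,
}

def composite_score(components: dict) -> float:
    """Weighted sum of normalized (0-100) component scores."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # the weights total 100%
    return sum(WEIGHTS[name] * components[name] for name in WEIGHTS)

# A hypothetical institution scoring 70 on every component lands at 70 overall.
example = {name: 70.0 for name in WEIGHTS}
print(round(composite_score(example), 1))  # 70.0

Even this toy version makes the weighting visible: the RateMyProfessors component alone carries 17.5% of the final score, twice the weight of either four-year graduation rate component.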

A Little History of the Forbes Rankings from 2008-present, excerpt from a commentary on methodology (full commentary can be found at: http://bestcollegerankings.org/popular-rankings/forbes-college-rankings/)

2008 marked the first year that Forbes entered the college ranking fray. They chose a methodology with the following weights: listings of alumni in the 2008 Who’s Who in America (25 percent); student evaluations of professors from Ratemyprofessors.com (25 percent); four-year graduation rates (16 2/3 percent); enrollment-adjusted numbers of students and faculty receiving nationally competitive awards (16 2/3 percent); and average four-year accumulated student debt of those borrowing money (16 2/3 percent). They did not break colleges down into different categories of schools as U.S. News does, but instead chose to separate private and public colleges.

Methodology: In conjunction with Dr. Richard Vedder, an economist at Ohio University, and the Center for College Affordability and Productivity (CCAP), Forbes inaugurated its first ranking of America’s Best Colleges in 2008. They based 25 percent of their rankings on seven million student evaluations of courses and instructors, as recorded on the Web site RateMyProfessors.com. Another 25 percent depended upon how many of the school’s alumni, adjusted for enrollment, are listed among the notable people in Who’s Who in America. The other half of the ranking was based equally on three factors: the average amount of student debt at graduation held by those who borrowed; the percentage of students graduating in four years; and the number of students or faculty, adjusted for enrollment, who have won nationally competitive awards like Rhodes Scholarships or Nobel Prizes. CCAP ranked only the top 15 percent or so of all undergraduate institutions.

Negative Commentary on the Methodology (excerpt from Suite101.com: The Forbes Best College Rankings 2011: Are They Kidding?)

What Goes in Must Come Out

First of all, a quick review of the Forbes methodology. The goal of the rankings is to evaluate a college as a consumer or investor would evaluate a commercial product. The focus is on return on investment: for what you pay, do you get a good “value”? The most important element in assessing this value is “Post-Graduate Success,” accounting for 30 percent of the total.

This “success” is measured by the salaries of graduates as reported by Payscale.com; membership in “Who’s Who”; and by alumni representation on a list of corporate officers chosen by Forbes and the Center for College Affordability and Productivity (CCAP). CEOs and board members of leading companies are the only persons who are eligible, thereby narrowing the definition of “success” to achievement in the business world only.

It is interesting that Forbes would allow use of “Who’s Who” listings as a measure of college success. In a 1999 article for the magazine called “The Hall of Lame,” Tucker Carlson, a Fox News commentator, derisively showed how inclusion in Who’s Who publications did not require notable achievement.

Another 17.5 percent of the total is based on student evaluations of instructors, taken from the website Ratemyprofessors.com. While student evaluations are useful, they can also lead professors to emphasize popularity at the expense of scholastic rigor.

An additional 17.5 percent of the total comes from actual and anticipated four-year graduation rates. Using four-year rates rather than six-year rates clearly favors colleges that are wealthy enough to subsidize virtually all eligible students based on need or merit, or whose student body is made up of highly-prepared students with sufficient economic support. State universities, whose students often have to work part-time or even take a semester off from school, usually cannot match the four-year graduation rates of private colleges.

Likewise, the rankings penalize colleges whose students have higher student debt loads, and this also slants the rankings toward wealthy colleges and parents.

Academic Reputation—Forget It

The most glaring deficiency of the Forbes survey is that the only standard it uses to assess the intellectual credibility of a college is the data from Ratemyprofessors.com. Academic reputation and faculty achievement count for nothing, even though a recent UCLA study of more than 200,000 freshmen across the country revealed that undergraduate academic reputation was the most important factor for these students when they were choosing a college. Forbes wants to change that perception, but does the magazine really believe that reputation counts for nothing in the business world as well?

It is ironic that a survey that is supposed to be student-centered disregards the one factor that students themselves cite as being most important to them: quality. Interestingly, the UCLA study also showed that prospective students are learning to be guarded in their use of college rankings, a healthy sign indeed.

August 10, 2011 | higher education, marketing, reputation management

University Branding Takes to the Air

I know this is not “new” news, but the recent addition of another Horizon Airlines airplane donning public university branding caught my attention again. The Montana State University Bobcat theme brings the total to eight planes in the airline’s fleet promoting public universities in the Northwest.

How do you measure the success of that brand advertising initiative? More importantly, how do you justify the cost? I have no clue what the price tag was for those branded planes, nor am I aware of the budget situation in those states. I do, however, know about the State of Illinois budget situation and the resulting pinch the public universities are facing. (And pinch is an understatement!) I would find it very hard to justify the cost without being able to prove a direct link to enrolling more students or raising more money for the foundation.

I do also recognize that each of us in the marketing field is trying to be more creative than our competition in getting our message out to our audience. This latest advertising space is certainly creative; I am just not sure how effective it is. Thoughts?

If you are really bored, you can watch the video of the Washington State University plane being painted with the school’s fight song in the background. 

November 13, 2010 | higher education, marketing

Hunger Strike to Challenge College Rankings

Continuing with the season of college rankings, here is an interesting story about a student embarking on a hunger strike to draw attention to the inadequacy of the U.S. News and World Report college rankings process. I don’t know about anyone else, but I think there are more important issues in the world about which we should go on a hunger strike.

Washington Monthly College Rankings

Washington Monthly puts out an interesting college guide. They rate schools based on their contribution to the public good in three broad categories:

  • Social Mobility (recruiting and graduating low-income students)
  • Research (producing cutting-edge scholarship and PhDs)
  • Service (encouraging students to give something back to their country)

This certainly sounds like a much more worthwhile ranking system for prospective students and parents than the U.S. News rankings based on fame, exclusivity and money.

September 7, 2010 | higher education, marketing, public relations

College Rankings: Popularity Contest or External Credibility?

Last week was what many in higher education considered a stressful week. The U.S. News and World Report rankings were released to the schools on Monday (8/16), with a press embargo until midnight Eastern time Tuesday. So most college communications teams spent the day either breathing a sigh of relief and sending out the release announcing that they had achieved a good rank, or frantically scrambling to craft a message drawing attention away from the fact that they had slipped in the rankings.

In addition to the usual stress, U.S. News made significant changes to the methodology and presentation of the rankings this year. Full details can be found on their blog, but in summary they:

  • changed the category names
  • listed all schools, not just the top tier
  • increased the weight of the graduation rate
  • included the opinion of high school counselors in the calculation

There has always been a question about whether rankings like these and countless others are just a popularity contest or a valid external assessment of college choices for prospective students and their parents. The subjective opinions of peers, and now, this year, of high school counselors, factor into the rankings. The chief admissions officers, provosts and presidents of all colleges and universities have the opportunity to provide their opinion of the institutions in their geographic region. This peer assessment variable accounts for 25% of the total score, making it the most heavily weighted variable. If we are trying to assess the outcomes of an institution, why aren’t the hiring managers at the companies employing its graduates asked?

You could argue that this skews the rankings, as surely an institution can influence those opinions through a variety of communication channels timed with the survey response due date. Or, you can view this an opportunity to educate your peers on the accomplishments and accolades your institution has recently achieved, and create a communication strategy for this target audience.

Have you ever noticed how the underdogs who make it to the Sweet Sixteen in the NCAA Division I Men’s Basketball tournament manage to place high in the rankings? (Think Butler and Northern Iowa this year.) And how the tournament is right around the time that the survey is completed? Coincidence? Or is it that there is increased visibility and communication about those schools while they are featured on TV?

Assessment is always a big topic at universities. To me, this is one more way to assess success. There are qualitative and quantitative, objective and subjective, ways to measure nearly everything.

Additionally, when you improve your performance on factors like graduation rate, your overall score increases. So, in theory, would your rank.

Regardless of what side of the fence you fall on, there is something to be said from a marketing perspective about credibility through external validation. Several of the categories, like Up-and-Comers and Focus on Student Success, are great to use in a communication strategy highlighting recent innovations at your institution.

There are also those schools that do not appear in the rankings and try to use that to their advantage. I have seen taglines such as “awards won’t change the world, but our graduates will” on billboards.

Has anyone asked whether prospective students and parents are using these rankings in their decision making process? If you appear favorably in the rankings, are you calling attention to it and asking your prospective students and parents to pay attention?

An article appeared in the Journal of Marketing for Higher Education in 2008, titled De-Mystifying the U.S. News Rankings: How to Understand What Matters, What Doesn’t and What You can Actually Do About It. I highly recommend reading this article.

August 22, 2010 | higher education, marketing, public relations, reputation management

SIUE cited again in U.S. News rankings!

SIUE cited again in U.S. News rankings! http://bit.ly/siueusnews! One to watch for “innovative changes” for 2nd year in a row! #siue

August 17, 2010 | higher education, marketing, public relations

A Few of My Favorite Marketing Resources… What Are Yours?

A friend recently asked me for suggestions of books she could read to help freshen her marketing skills, and bring them up to the bleeding edge of the social media marketing/technology/SEO/SEM world. So I responded to her via email, and then thought I might as well share my thoughts here as well.

My first reaction to that question is that the technology and user interfaces are changing so quickly that books teaching specific applications become obsolete almost as soon as they are published. There are a few, though, that address theory and approach, and those remain applicable no matter what shiny new technology object comes along.

My favorite book:
The New Rules of Marketing and PR: How to Use Social Media, Blogs, News Releases, Online Video, and Viral Marketing to Reach Buyers Directly, 2nd Edition by David Meerman Scott (amazon affiliate link). This book has drastically reframed the way I approach marketing.

Next up on my reading list (after the mindless, yet terribly entertaining, crap I am currently reading):

Tribal Knowledge: Business Wisdom Brewed from the Grounds of Starbucks Corporate Culture by John Moore (amazon affiliate link). This came highly recommended to me by a new colleague as I am navigating the new waters of radio station management.

Also, blogs I follow include Mashable, Jay Baer’s Convince and Convert, David Meerman Scott’s Web Ink Now, Sysomos for research, Dan Zarella, and Marijean Jaggers.

Google has a whole slew of free training videos for Google Analytics and Google AdWords. The Analytics for Dummies book may still be useful, but Google keeps changing the interface of both tools, so books quickly become obsolete. I’d use the free online resources.

LinkedIn groups I belong to: Web 2.0 for Higher Education, Marketing Profs, the Social Media Marketing Group, Southern IL Marketing and Communications. Connect with me if we haven’t already: linkedin.com/in/BethKeserauskis.

July 19, 2010 | engaging, marketing, social media

There’s No Crying in Volleyball…or Marketing

My team and I experienced substantial frustration this weekend as we played in a grass triples volleyball tournament, the US Open of Grass Volleyball, also known as the Waupaca Boatride tournament in WI. Our fatal error, and the cause of the frustration: assuming. Yes, I admit our frustration was largely our own doing, thanks to assuming that the rules would be what we were used to and assuming that our fellow players would have the same integrity and honesty that we did.

Without going into excruciating volleyball detail, our competition was not holding itself to the same high-quality standard of play we have grown accustomed to in outdoor play. Additionally, since the rules were essentially “police yourself,” there were a few dishonest folks who did not call their own net fouls.

So we lost more than we should have. However, that is no excuse for not playing, at minimum, to our potential to overcome it, OR for not changing our strategy to adapt to the “new” rules.

Yes, I am about to turn a volleyball tournament scenario into an analogy for marketing. I can’t help it; it’s what I do.

All too often the rules change at some point during the execution of our carefully planned marketing strategy. What defines us as marketing strategists is whether we can see that the rules have changed and adapt our strategy and course accordingly. So many factors can change: the economy, a natural disaster, a product failure, a PR crisis, etc. We cannot possibly predict all of the options. We can, however, have a system in place to detect the change in rules and help us adapt to a new direction. To me, that is the sign of a top-notch marketing strategist.

Clearly we did not identify the changing landscape during our volleyball tournament and adapt our strategy accordingly. So for a few days I will just complain about the unfairness of the situation to anyone willing to listen. But then, I will be sure that the next time I play, I am ready to meet that challenge.

July 12, 2010 | leadership, marketing

Ed-Glen Chamber Presentation

Today I spoke with the members of the Edwardsville-Glen Carbon Chamber of Commerce. They invited me to speak with them about incorporating social media tools into their marketing strategy. If you missed the event, or are just plain curious, you can download my slide deck on SlideShare, along with several recent presentations I have made on web user behavior (specifically millennials) and more (http://www.slideshare.net/bethkeserauskis).

April 27, 2010 | connecting, marketing, relationships, reputation management, social media

Tree Pollen Distribution Has Similar Success to Direct Mail

It dawned on me the other day, as I was trying without success to get the pollen off my windshield, that the springtime pollen distribution strategy of trees is remarkably similar to the traditional, unsolicited direct mail strategy. My gross overgeneralization of unsolicited direct mail: you cast a HUGE net of direct mail pieces over a very loosely targeted area, reaching people who have never tried to start a dialog with your company or about your product/service, hoping you get a response that will eventually turn into a sale.

Trees release a HUGE amount of pollen over whatever area they can reach, using wind as the distribution strategy. That area is not necessarily the right area, as much of the pollen lands on concrete, houses, cars, etc. Then the tree has to pray that the resulting seeds make it into the ground, that rain falls, and that successful germination occurs. We still haven’t made it to the “sale” part of the equation, because now the tree has to hope that no one pulls the seedling out of the ground or mows the grass before it can grow tall, and that no animal finds it a tasty treat.

See how this is remarkably similar to unsolicited direct mail? The advantages the trees have that have allowed this strategy to remain successful are:

  • They have a lot of time to be patient and wait for success. Their life cycle is long. If they don’t have successful germination and growth in one spring, they can try again for likely several hundred more springs.
  • There are a lot of the same trees out there doing the same thing.
  • They have “ambassadors” in that people are actively planting trees in the spaces they would like them to grow.

So can you and your business afford to spray your direct mail once a year, hope something sticks, and if not, just wait until the next spring? I would venture to guess not. Just one more piece of evidence of why unsolicited, unqualified direct mail cannot be your only strategy.

April 26, 2010 | connecting, higher education, marketing, sales