Elizabeth Keserauskis

Building relationships and making connections

Sample work

Here are a few examples of work we have completed recently.

2012 Chancellor’s Report:

2012 Chancellor's report

2012-2013 Admissions Viewbook

2012-2013 Viewbook

2012 Promotional Video

Promotional video

Sample Advertising

Print Ads:

Print ad 1

Print ad 2

Print ad 3

Outdoor Ads:

Billboard 1

Billboard 2

Billboard 3

Billboard 4

Radio Ads:

STEM thought leadership positioning for the Chancellor, 0:60

Campus growth, 0:15 (voice over by me)

Student “Andrew,” 0:60

Student “Natalie,” 0:60

May 21, 2013 | communication, higher education, marketing, media

Forbes College Ranking: True Gauge or Sketchy Data?

Forbes recently released its annual rankings of colleges, calculated in partnership with the Center for College Affordability and Productivity. My institution was conspicuously absent, and I was asked to investigate why. (The responsibility for responding to external credibility surveys lies within my unit.) What I subsequently found is that the rankings are based on existing, publicly available data; no surveys were sent to institutions requesting information. That in and of itself would not trouble me, except that the data points used are, to me and others, questionable at best. Additionally, figures like the infamous “other cost” category (an allowable cost institutions report in financial aid packages, at the discretion of the student, to cover expenses like mileage to clinicals, internships, etc.) are treated as actual billed charges, therefore increasing the “cost of attendance” and subsequently the elusive “net price” calculation and the debt load calculation (predicted, I might add).
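To make the arithmetic concrete, here is a minimal sketch of how counting a discretionary “other cost” allowance as if it were a billed charge inflates both the cost of attendance and the resulting net price. All dollar figures are hypothetical, invented purely for illustration; they are not any institution’s actual numbers.

```python
# Hypothetical illustration: "net price" is typically computed as
# cost of attendance (COA) minus average grant aid. All figures are
# made up for demonstration purposes only.

def net_price(tuition_fees, room_board, other_cost, avg_grant_aid):
    """Net price = cost of attendance minus average grant aid."""
    cost_of_attendance = tuition_fees + room_board + other_cost
    return cost_of_attendance - avg_grant_aid

# Billed charges only: $9,000 tuition/fees + $8,000 room/board, $6,000 aid
billed_only = net_price(9_000, 8_000, 0, 6_000)

# Same student, but a $3,500 discretionary "other cost" allowance
# (mileage to clinicals, internships, etc.) is treated as a billed charge
with_allowance = net_price(9_000, 8_000, 3_500, 6_000)

print(billed_only, with_allowance)  # 11000 14500
```

The allowance never appears on a bill, yet it raises the reported “net price” dollar for dollar, which in turn inflates any debt-load prediction built on top of it.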

I have several excerpts from articles, blog posts, and even their own methodology posted below giving a glimpse of the, in my humble opinion, “sketchiness” of this ranking. I can only hope that Joe and Jane Sixpack are able to sort through the variables…oh wait, they likely can’t. So now we have another “think tank” with a clear agenda (political or otherwise) leveraging a brand like Forbes to advance their cause.

How important are rankings like this and the U.S. News and World Report Best Colleges? Only the audiences we are trying to attract can tell us. And believe me, I intend to ask them exactly that, so we can tailor our approach to these surveys accordingly.

Compiling the Forbes/CCAP Rankings (excerpt from the methodology document; the full document can be found on their site)

By the Staff of the Center for College Affordability and Productivity

Ranking Factors and Weights

The Center for College Affordability and Productivity (CCAP), in conjunction with Forbes, compiled its college rankings using five general categories, with several components within each general category. The weightings are listed in parentheses:

1. Student Satisfaction (27.5%)

  • Student Evaluations from RateMyProfessor.com (17.5%)
  • Actual Freshman-to-Sophomore Retention Rates (5%)
  • Predicted vs. Actual Freshman-to-Sophomore Retention Rates (5%)

2. Post-Graduate Success (30%)

  • Listings of Alumni in Who’s Who in America (10%)
  • Salary of Alumni from Payscale.com (15%)
  • Alumni in Forbes/CCAP Corporate Officers List (5%)

3. Student Debt (17.5%)

  • Average Federal Student Loan Debt Load (10%)
  • Student Loan Default Rates (5%)
  • Predicted vs. Actual Percent of Students Taking Federal Loans (2.5%)

4. Four-year Graduation Rate (17.5%)

  • Actual Four-year Graduation Rate (8.75%)
  • Predicted vs. Actual Four-year Graduation Rate(8.75%)

5. Competitive Awards (7.5%)

  • Student Nationally Competitive Awards (7.5%)
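For readers who want to see how these weights combine, the composite can be sketched as a simple weighted sum. This is my own reconstruction from the published weights above; the component names and the assumption that each component is normalized to a 0–100 scale are mine, not CCAP’s actual methodology code.

```python
# Sketch of a weighted composite score using the published Forbes/CCAP
# weights. Component scores are assumed to be pre-normalized to 0-100;
# this is a reconstruction for illustration, not CCAP's actual code.

WEIGHTS = {
    "ratemyprofessors_evals": 0.175,
    "actual_retention": 0.05,
    "predicted_vs_actual_retention": 0.05,
    "whos_who_listings": 0.10,
    "payscale_salary": 0.15,
    "corporate_officers_list": 0.05,
    "avg_federal_debt": 0.10,
    "loan_default_rate": 0.05,
    "predicted_vs_actual_loans": 0.025,
    "actual_4yr_grad_rate": 0.0875,
    "predicted_vs_actual_grad_rate": 0.0875,
    "competitive_awards": 0.075,
}

# The published weights should sum to exactly 100%
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9

def composite_score(components: dict) -> float:
    """Weighted sum of normalized (0-100) component scores."""
    return sum(WEIGHTS[name] * score for name, score in components.items())

# A hypothetical school scoring 80 on every component scores 80 overall
example = composite_score({name: 80 for name in WEIGHTS})
print(round(example, 2))  # 80.0
```

Note how heavily the composite leans on third-party proxies (RateMyProfessors evaluations alone outweigh actual retention by 3.5 to 1), which is exactly the “sketchiness” at issue in this post.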

School Selection

The 650 institutions of higher education in this ranking are schools which award undergraduate degrees or certificates requiring “4 or more years” of study, according to the U.S. Department of Education, and only those schools categorized by The Carnegie Foundation as Doctorate-granting Universities, Master’s Colleges and Universities, or Baccalaureate Colleges are included in this sample of schools.

Of the 650 schools included in the sample, 608 were included in the 2010 college ranking. (A total of 610 schools were ranked in 2010, but two of them, Bryant University and Missouri University of Science and Technology, are now classified as “Special Focus” institutions by the Carnegie Foundation.) We have accounted for any name changes that have occurred over the past year. The 42 schools added this year to the sample are all institutions classified by the Carnegie Foundation as Doctoral/Research Universities and were added based upon undergraduate enrollment size.

A Little History of the Forbes Rankings, 2008–Present (excerpt from a commentary on methodology; the full commentary can be found at: http://bestcollegerankings.org/popular-rankings/forbes-college-rankings/)

2008 marked the first year that Forbes entered the college ranking fray. They chose a methodology with the following percentages: listing of alumni in the 2008 Who’s Who in America (25 percent); student evaluations of professors from Ratemyprofessors.com (25 percent); four-year graduation rates (16 2/3 percent); enrollment-adjusted numbers of students and faculty receiving nationally competitive awards (16 2/3 percent); average four-year accumulated student debt of those borrowing money (16 2/3 percent). They did not break colleges down into different schools as U.S. News does, but instead chose to separate private and public colleges.

Methodology: In conjunction with Dr. Richard Vedder, an economist at Ohio University, and the Center for College Affordability and Productivity (CCAP), Forbes inaugurated its first ranking of America’s Best Colleges in 2008. They based 25 percent of their rankings on seven million student evaluations of courses and instructors, as recorded on the Web site RateMyProfessors.com. Another 25 percent depended upon how many of the school’s alumni, adjusted for enrollment, are listed among the notable people in Who’s Who in America. The other half of the ranking was based equally on three factors: the average amount of student debt at graduation held by those who borrowed; the percentage of students graduating in four years; and the number of students or faculty, adjusted for enrollment, who have won nationally competitive awards like Rhodes Scholarships or Nobel Prizes. CCAP ranked only the top 15 percent or so of all undergraduate institutions.

Negative Commentary on the Methodology (excerpt from Suite101.com: The Forbes Best College Rankings 2011: Are They Kidding?)

What Goes in Must Come Out

First of all, a quick review of the Forbes methodology. It is the goal of the rankings to evaluate college as a consumer or investor would evaluate a commercial product. The focus is on the return on investment–for what you pay, do you get a good “value”? The most important element in assessing this value is “Post-Graduate Success,” accounting for 30 percent of the total.

This “success” is measured by the salaries of graduates as reported by Payscale.com; membership in “Who’s Who”; and by alumni representation on a list of corporate officers chosen by Forbes and the Center for College Affordability and Productivity (CCAP). CEOs and board members of leading companies are the only persons who are eligible, thereby narrowing the definition of “success” to achievement in the business world only.

It is interesting that Forbes would allow use of “Who’s Who” listings as a measure of college success. In a 1999 article for the magazine called “The Hall of Lame,” Tucker Carlson, a Fox News commentator, derisively showed how inclusion in Who’s Who publications did not require notable achievement.

Another 17.5 percent of the total is based on student evaluations of instructors, taken from the website Ratemyprofessors.com. While student evaluations are useful, they can also lead professors to emphasize popularity at the expense of scholastic rigor.

An additional 17.5 percent of the total comes from actual and anticipated four-year graduation rates. Using four-year rates rather than six-year rates clearly favors colleges that are wealthy enough to subsidize virtually all eligible students based on need or merit, or whose student body is made up of highly-prepared students with sufficient economic support. State universities, whose students often have to work part-time or even take a semester off from school, usually cannot match the four-year graduation rates of private colleges.

Likewise, the rankings penalize colleges whose students have higher student debt loads, and this also slants the rankings toward wealthy colleges and parents.

Academic Reputation—Forget It

The most glaring deficiency of the Forbes survey is that the only standard it uses to assess the intellectual credibility of a college is the data from Ratemyprofessors.com. Academic reputation and faculty achievement count for nothing, even though a recent UCLA study of more than 200,000 freshmen across the country revealed that undergraduate academic reputation was the most important factor for these students when they were choosing a college. Forbes wants to change that perception, but does the magazine really believe that reputation counts for nothing in the business world as well?

It is ironic that a survey that is supposed to be student-centered disregards the one factor that students themselves cite as being most important to them: quality. Interestingly, the UCLA study also showed that prospective students are learning to be guarded in their use of college rankings, a healthy sign indeed.

August 10, 2011 | higher education, marketing, reputation management

“In God We Trust, All Others Must Bring Data”

I love the quote from W. Edwards Deming, “In God We Trust, All Others Must Bring Data”. [Excerpt from Wikipedia: William Edwards Deming (October 14, 1900 – December 20, 1993) was an American statistician, professor, author, lecturer, and consultant. He is perhaps best known for his work in Japan. There, from 1950 onward, he taught top management how to improve design (and thus service), product quality, testing and sales (the last through global markets) through various methods, including the application of statistical methods.]

I came across a few blog posts/articles recently addressing marketing trends, stats, etc., that I found interesting. Just sharing light reading to consider as we continue our marketing and advertising efforts!

How are we paying attention to data? Do we have the right blend of data versus instinct? For my job specifically, how is higher education marketing and recruiting adapting to changing consumer behavior, particularly on the web, and moving away from the direct mail (“spray and pray,” shotgun) strategy? What percentage of your budget should shift away from the “tried and true” methods to test some of the strategies described in the following posts? So much to consider!

June 17, 2011 | communication, higher education, marketing, social media

The State of Today’s Graduate Seeking Work in Communications

Last night I attended a speed networking event pairing current students (most of whom are about to graduate) with alumni working in various fields. The idea is based on the “speed dating” concept, but in this situation alumni are stationed at tables and students rotate among them for 15-minute networking sessions. The concept is fantastic, and I am so glad I participated and could provide perspective to students entering the workforce.

What concerns me after the event is how prepared these students are to enter the workforce, particularly in communications fields. They have no experience beyond their internships, yet in this economy they are about to compete for the same jobs with people who have far more experience. Internships are all but mandatory for students these days. Many of them realize they will likely have to take full-time internships with companies (many without benefits) in order to get a foot in the door.

But the problem doesn’t stop at their lack of tangible experience. More worrisome to me is that they have not been required to hone their writing skills. Many have not had to compose extensive persuasive papers in their last year or two of school. I have yet to find one who understands just how drastically the internet has changed the strategic communications field. Most of them believe they are going to find a job in “PR.” Well, it’s not just PR anymore. You have to understand how to help a company establish and manage a reputation, among all its audiences and across all media. It is not just traditional media releases and pitching. Most of them don’t grasp the direct-to-consumer conversation the internet makes possible. Many of them give me blank stares when I ask if they understand the basics of SEO and SEM.

The best I can hope for is that the sites where these students are doing their internships give them several sips from the fire hose and give them a chance to realize what they haven’t learned yet. Then the most motivated will make it a priority to teach themselves what they can. And in the meantime, I am dreading the day when I have to hire an entry level communications position. I certainly will have to manage my own expectations!

March 30, 2011 | communication, higher education, reputation management

Knowledge is Power

You might think this post title indicates prose on the importance of continually educating yourself on changing technology, your customer behavior and trends, and other marketing speak. On the contrary, this post is about the importance of institutional climate and culture, and specifically the importance of internal communication at an organization.

Have you ever been in a workplace where people flaunt the fact that they have knowledge about a topic, new process, upcoming change, etc.? Rather than taking the opportunity to educate others, build consensus for the direction of the company, and support the overall mission, people tend to “collect” knowledge the way people in medieval times collected property, slaves, etc., to display their wealth, position, and power. While my life experience is relatively average, I believe this is more rampant in higher education than in any other sector. I also believe that higher education places less emphasis on an internal communication strategy than other industries do. Perhaps the decentralized nature of the typical higher education structure fosters this.

While I spent a good two days stewing over my recent specific experiences with this “knowledge is power” phenomenon, my take-away (or “aha” moment or life lesson or silver lining, blah blah blah) from this is that I need to circle my communication wagons and rejuvenate my push for a more robust, comprehensive internal communication strategy for the institution. I am going to stop wishing that people would just “get it” and stop collecting knowledge as power. Since I obviously have no control over that, I’ll focus on that which I can control (and happen to be good at)–communication.

Any suggestions for how other institutions help proportionately allocate/expend resources on internal communications?? Any help is welcome!!!

February 10, 2011 | communication, connecting, higher education

University Branding Takes to the Air

I know this is not “new” news, but the latest Horizon Airlines airplane to don public university branding caught my attention again. The Montana State University Bobcat theme brings the total to eight planes in the airline’s fleet promoting public universities in the Northwest.

How do you measure the success of that brand advertising initiative? More importantly, how do you justify the cost? I have no idea what the price tag was for those branded planes, nor am I aware of the budget situation in those states. I do, however, know about the State of Illinois budget situation and the resulting pinch its public universities are facing. (And pinch is an understatement!) I would find it very hard to justify the cost without being able to prove a direct link to enrolling more students or raising more money for the foundation.

I do also recognize that each of us in the marketing field is trying to be more creative than our competition in getting our message out to our audience. This latest advertising space is certainly creative; I am just not sure how effective it is. Thoughts?

If you are really bored, you can watch the video of the Washington State University plane being painted with the school’s fight song in the background. 

November 13, 2010 | higher education, marketing

Hunger Strike to Challenge College Rankings

Continuing with the season of college rankings, here is an interesting story about a student embarking on a hunger strike to draw attention to the inadequacy of the U.S. News and World Report college rankings process. I don’t know about anyone else, but I think there are more important issues in the world about which we should go on a hunger strike.

Washington Monthly College Rankings

Washington Monthly puts out an interesting college guide. They rate schools based on their contribution to the public good in three broad categories:

  • Social Mobility (recruiting and graduating low-income students)
  • Research (producing cutting-edge scholarship and PhDs)
  • Service (encouraging students to give something back to their country)

This certainly sounds like a much more worthwhile ranking system for prospective students and parents than the U.S. News rankings based on fame, exclusivity and money.

September 7, 2010 | higher education, marketing, public relations

Would You Like to be Given a D+?

Drake D+ Campaign

Is Drake University’s new ad campaign a bust, or is it successfully getting everyone talking about the school and its benefits to prospective students? It has been the center of significant attention on the web, including several posts I ran across recently.

If their marketing team’s purpose was to create something viral that everyone would talk about, mission accomplished. However, the fact that people are referring to the education you can get at Drake as a “D+” grade is probably not the image they were hoping for.

I also found it interesting that the marketing team did not even include their own staff and faculty in testing the new campaign. Your internal audience does not necessarily have to approve everything you do, but you can certainly create a sense of ownership and buy-in by involving them in the process where appropriate. By not including them, the marketing team had to backpedal and go on the defensive, explaining what the ad campaign was about in an internal email that, of course, someone posted online.

And what does this mean for the reputation of their advertising partner in the process, Stamats? Are they now branded as an agency that can create edgy advertising campaigns, or as a team that didn’t include all stakeholders in the testing process or notice the ramifications of a “D+” grade?

In today’s communications climate, everyone is searching for the story or idea or campaign that is going to get everyone talking about their product/service/school. Was Drake successful? What do you think?

September 5, 2010 | engaging, higher education, marketing, reputation management

College Rankings: Popularity Contest or External Credibility?

Last week was what many in higher education considered a stressful week. The U.S. News and World Report rankings were released to schools on Monday (8/16), with a press embargo until midnight Eastern time Tuesday. So most college communications teams spent the day either breathing a sigh of relief and sending the release announcing a good rank, or frantically scrambling to craft a message drawing attention away from the fact that they had slipped in the rankings.

In addition to the usual stress, U.S. News made significant changes to the methodology and presentation of the rankings this year. Full details can be found on their blog, but in summary they:

  • changed the category names
  • listed all schools, not just the top tier
  • increased the weight of the graduation rate
  • included the opinion of high school counselors in the calculation

There has always been a question about whether rankings like these, and countless others, are just a popularity contest or a valid external assessment of college choices for prospective students and their parents. The subjective opinions of peers, and, new this year, high school counselors, factor into the rankings. The chief admissions officers, provosts, and presidents of all colleges and universities have the opportunity to provide their opinion of the institutions in their geographic region. This peer assessment variable accounts for 25% of the total score, making it the most heavily weighted variable. If we are trying to assess an institution’s outcomes, why aren’t the hiring managers at the companies employing its graduates asked?

You could argue that this skews the rankings, as surely an institution can influence those opinions through a variety of communication channels timed with the survey response due date. Or you can view this as an opportunity to educate your peers on the accomplishments and accolades your institution has recently achieved, and create a communication strategy for this target audience.

Have you ever noticed how the underdogs who make it to the Sweet Sixteen in the NCAA Division I Men’s Basketball tournament manage to place high in the rankings? (Think Butler and Northern Iowa this year.) And how the tournament falls right around the time the survey is completed? Coincidence? Or is it that there is increased visibility and communication about those schools while they are featured on TV?

Assessment is always a big topic at universities. To me, this is one more way to assess success. There are qualitative and quantitative, objective and subjective, ways to measure nearly everything.

Additionally, when you improve on a heavily weighted factor like graduation rate, your overall score increases. So, in theory, would your rank.

Regardless of what side of the fence you fall on, there is something to be said from a marketing perspective for credibility through external validation. Several of the categories, like Up-and-Comers and Focus on Student Success, are great to use in a communication strategy highlighting recent innovations at your institution.

There are also schools that do not appear in the rankings and try to use that to their advantage. I have seen taglines such as “awards won’t change the world, but our graduates will” on billboards.

Has anyone asked whether prospective students and parents are using these rankings in their decision-making process? If you appear favorably in the rankings, are you calling attention to it and asking your prospective students and parents to pay attention?

An article appeared in the Journal of Marketing for Higher Education in 2008, titled “De-Mystifying the U.S. News Rankings: How to Understand What Matters, What Doesn’t and What You Can Actually Do About It.” I highly recommend reading this article.

August 22, 2010 | higher education, marketing, public relations, reputation management

How *NOT* to Ruin Your Reputation Online

I work with the student athletes here on campus to improve their skills in working with the media, but also to help them develop their reputations online. Or, more immediately, how not to ruin those reputations. I am always looking for articles, stories, and examples of how social media can negatively impact a career, an education, or a reputation to pass along to them. These are the most recent ones I have added to the list.

Do you have any articles that would be good for me to share with our students?

August 18, 2010 | higher education, reputation management, resources, social media