
It is again that time of year when parents and prospective students pore over recently published university rankings.

“…US News asked top college officials to identify institutions in their Best Colleges ranking category that are making the most innovative improvements in terms of curriculum, faculty, students, campus life, technology or facilities.”

But should we recommend such rankings, like the one above from US News & World Report? Are college rankings good, bad or ugly, as Yale’s former Dean of Admissions suggests?

Or could recommending them be the single worst advice we could possibly give a high school student?

Fact 1: College rankings generate revenue for publishers

Media houses know very well that university rankings are of great interest to prospective students and parents. So, newspapers like the Financial Times (FT) feature a weekly special section on education. In addition, the paper publishes numerous rankings throughout the year.

Then there are the various feature reports (see the 2015-11-03 FT Special Report on Innovations in Education). Of course, they also carry advertising, like the one below from Thunderbird.

[su_custom_gallery source=”media: 2517″ limit=”7″ link=”image” target=”blank” width=”519px” height=”380px” alt=”FT Special Report Innovations in Education – Thunderbird promotes itself as innovative leader – half page – front page color ad $120,000″]

Looking at the FT Special Reports Ad rates shows that getting involved with educational institutions pays well for media houses. Universities are under pressure to become ever better known in order to attract more resources and qualified students. In turn, advertising in special editions about education is a sure way to reach more of your target audience.

Marketing 101

Publishing college rankings makes sense from a publisher’s perspective. Advertising brings in the revenue needed (FT Special Reports Ad rates), and people read the stuff because as Langville and Meyer (2012, p. 1) suggest:

In America, especially, we are evaluation-obsessed, which thereby makes us ranking-obsessed given the close relationship between ranking and evaluation.

Fact 2: Schools love to use rankings

Everyone certainly loves rankings when they place in the top 10. And regardless of whether we agree with the findings, if we are the top dog, we let the whole world know about it.

The great thing is that such rankings are based on a third party’s opinion, which lends credibility when we advertise our achievement. Arizona State University (ASU) continues to tout its top ranking in the US News & World Report list of most innovative schools.

[su_custom_gallery source=”media: 2514″ limit=”7″ link=”image” target=”blank” width=”519px” height=”381px” alt=”US News & World Report – Most innovative school ranking 2015 – an advertising bonanza”]

Marketing 101

You don’t have to be brilliant or innovative; you just have to convince others that you are. Of course, if you have an external reference point that ranks you highly, such as a well-known publication, so much the better for your recruiters.

Can prospective students trust these school rankings?
Are they useful when choosing a university/college or program of study?

Fact 3: This stuff is less useful than you think

It is best to look at the methodology used in a ranking. What measures were used to conclude that ASU should be considered more innovative than Stanford and MIT? Fair question – let’s see.

US News & World Report asked deans and presidents to rank their peers. The magazine wants to compare apples with apples; hence, national schools such as ASU, Stanford and MIT are ranked against one another.

By the way, did you know that the US News & World Report puts the United States Naval Academy into the category of national liberal arts colleges?

So how does one measure the innovativeness of a university? We are told:

“…2015 survey that received the most nominations by top college officials for being the most innovative institutions. They are ranked in descending order based on the number of nominations they received. A school had to receive seven or more nominations to be listed.”

In plain English, this means you need to get as many high-level university administrators as possible to nominate your university for innovation.

Accordingly, if you manage to make everyone perceive you as innovative, you are. That is all there is to it. Isn’t that wonderful?
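To make the counting rule concrete, here is a minimal Python sketch of the methodology quoted above. Everything in it is invented for illustration – the ballots, the counts, even which schools appear – since US News & World Report publishes only the rule, not the data.

```python
from collections import Counter

# Hypothetical ballots: each top college official nominates the schools
# he or she considers most innovative. All data here are made up.
ballots = (
    [["ASU", "Stanford"]] * 7   # seven officials nominate both schools
    + [["ASU"]]                 # one more nomination for ASU
    + [["MIT"]] * 5             # MIT falls short of the cutoff
)

# Count nominations per school.
nominations = Counter(school for ballot in ballots for school in ballot)

# Keep schools with at least seven nominations, ranked in descending
# order of nominations -- the entire methodology, as quoted above.
MIN_NOMINATIONS = 7
ranking = [(school, n) for school, n in nominations.most_common()
           if n >= MIN_NOMINATIONS]

for rank, (school, n) in enumerate(ranking, start=1):
    print(f"{rank}. {school} ({n} nominations)")
# 1. ASU (8 nominations)
# 2. Stanford (7 nominations)
```

That is the whole calculation: count the nominations, drop anyone below seven, sort. No indicator of innovation ever enters the picture.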

Of course, we have no idea whether a product innovation or a process innovation helped you rank highly. In either case, to claim to have made an invention, and thereby qualify as an innovative university, you should be able to answer questions like: why is this curriculum change an invention? Inventions are typically evaluated according to:

  • novelty (new),
  • inventiveness (i.e. must involve a non-obvious inventive step), and
  • industrial applicability (can be used).

In this case, however, we have no clue what distinguishes a simple curriculum change from an invention.

Marketing 101

US News & World Report rankings illustrate very well that how you measure things matters little. It just has to come across as making sense because 90 percent of readers do not bother to read about your methods or the fine print.

However, if you invest several years of your life in attending a school, while paying through the nose for tuition, fees and so on, you are well-advised to ensure the ranking makes sense to you.

Of course, even if the measure is bad, this does not necessarily mean ASU and Stanford are bad schools. They’re great, but

the US News & World Report’s attempt to measure innovativeness is a useless vanity exercise, to put it politely.

Fact 4: Using just one ranking is the worst

You basically have to do your homework. The five points spelled out in the box below will help you make better sense of any ranking.

Please keep in mind – the perfect ranking does not exist. Each one has strengths and weaknesses, but you can only learn what those are by following these steps.

[su_box title=”5 critical things to do before trusting a college ranking.” box_color=”#86bac5″ title_color=”#ffffff”]

1. Take the time and make the effort to learn about the methodology. Where is the description, and how thorough is it for the ranking we are looking at? An example of a good methods section is the Pew Research Center‘s study on multiracial Americans, which explains how the data were collected, the weaknesses of the study, and so on. It is also easy for the uninitiated to understand.

If you have done this homework, you know better how much weight you should give the rankings in front of you. That is a great start.

2. Does the study measure what it is supposed to (also called validity)? What criteria were used to make up a component in the ranking? Do these make sense to you?

3. Are there components of the ranking that particularly interest you? We may look at costs as an important factor. It could also be interesting to understand how, for instance, a university degree (e.g., undergraduate or graduate) affects one’s career prospects and/or income 10 years after graduation.


4. Come up with a set of criteria that are important to you (see also image below), such as:

4.1 – location (e.g., which country and what area of the country/city), and
4.2 – costs (e.g., tuition, fees, health insurance, accommodation).

5. Write down a set of criteria that are not that critical to you, such as:

5.1 – GPA of incoming class,
5.2 – number and value of student scholarships, and
5.3 – diversity of faculty (e.g., gender, race, country and language).

The above makes it clear that using just one ranking is plain stupid. Using two is risky, and using three or more allows you to pick and choose, thereby empowering you to make the decision that best suits you (one way to do this is sketched right after this box).

If a ranking uses criteria that are of limited importance to you (see point 5), you know what to do – ignore it.

[/su_box]
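As promised above, here is a minimal sketch of how you might act on points 4 and 5: average each component you care about across several rankings, weight it, and ignore the rest. All school names, ranking names, scores and weights are invented for illustration; this is a back-of-the-envelope aid, not a statistical method.

```python
# Invented component scores (0 = worst, 100 = best) for two schools,
# as reported by three hypothetical rankings. Not every ranking covers
# every component.
scores = {
    "School X": {
        "Ranking A": {"location": 80, "costs": 60, "gpa_incoming": 95},
        "Ranking B": {"location": 75, "costs": 55},
        "Ranking C": {"costs": 70, "gpa_incoming": 90},
    },
    "School Y": {
        "Ranking A": {"location": 50, "costs": 85, "gpa_incoming": 70},
        "Ranking B": {"location": 60, "costs": 90},
        "Ranking C": {"costs": 80, "gpa_incoming": 60},
    },
}

# Components that matter to *you* (point 4) and their weights; everything
# else (point 5) is simply ignored -- gpa_incoming never enters the score.
my_criteria = {"location": 0.4, "costs": 0.6}

def personal_score(school: str) -> float:
    """Average each relevant component across rankings, then weight it."""
    total = 0.0
    for component, weight in my_criteria.items():
        values = [r[component] for r in scores[school].values() if component in r]
        total += weight * sum(values) / len(values)
    return total

for school in sorted(scores, key=personal_score, reverse=True):
    print(f"{school}: {personal_score(school):.1f}")
# School Y: 73.0
# School X: 68.0
```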

[su_custom_gallery source=”media: 2536″ limit=”7″ link=”image” target=”blank” width=”520px” height=”414px” alt=”Balancing the worth of education with outcomes.”]

We must balance the resources we put in against the outcomes we hope for. This also means we need to look at several rankings to choose the right university.

Who is number 1? Create the best ranking

Of course, in addition to US News & World Report and the Financial Times, others do not want to be left out of this lucrative business. For instance, The Economist (a weekly magazine) also produces a ranking of MBA programs. So does the Wall Street Journal. Of course, even more rankings exist, such as the best 100 Employers to Work For or the Best Consulting Companies (German-language Handelsblatt).

In the case of the Best Consulting Companies, participants are asked three questions about the firm and voilà, we have the 2015 rankings. This may indicate more about how much you advertise (helps increase brand recognition) than how satisfied your clients are with your work.

As these examples illustrate, anybody and everybody can create a college ranking. However, to avoid becoming a laughingstock, I urge you to follow the nine steps outlined below.


How exact and thorough we are when addressing each step will, in turn, affect the overall quality of our rankings.

[su_box title=”9 steps to develop your favorite ranking system for just about anything.” box_color=”#86bac5″ title_color=”#ffffff”]

1. Write a one-page summary of why this ranking is needed and explain its purpose (to help readers… lose weight, pass the certification exam, purchase the best car, etc.).

2. What can readers do with these data? For example, does studying these data help improve performance? Does it show one’s weaknesses? Does it outline how one can improve (see DrKPI BlogRank)?

3. Come up with some indicators or measures that allow the collection of data from individuals (e.g., salary three years after graduation), the institution (e.g., faculty with doctorate), and possibly other indicators (e.g., inflation rate, purchasing power parity (PPP) data from the International Monetary Fund (IMF) to adjust salaries).

4. Use the indicators to make up components that make sense to the uninitiated (e.g., career progress, quality of faculty).

5. Add up the indicator scores to obtain the overall score each school, firm or student achieves on a component.

FT uses three indicators to make up the “idea generation” component of its MBA rankings.

  • percentage of faculty with doctorates,
  • number of doctoral students who graduated in the last three years, and
  • research output, measured using a set of 45 journals (no Chinese or Spanish research journals need apply).

6. Convert the component scale to a common one such as 0 to 100, whereby the best gets the top score and average performers hover around 50.

7. Determine the importance of each component.

In many cases, some components are weighted more heavily than others. That is a value judgment that warrants an explanation. The same goes if you weight every component equally! Explain your decision to the uninitiated reader.

8. Compute the aggregate score as the weighted sum of the previously scaled component scores (a small sketch of steps 5 to 9 follows this box).

9. Present the aggregate score on the desired scale, such as 0 to 100.

Thanks to Fung (2013, pp. 22-23) for inspiring me to write up this list.

Whenever you look at a university ranking, or any other ranking for that matter, keep the steps above in mind. Is the methodology spelled out, and does it address the issues raised here? If these things are not made transparent, caution is called for.

[/su_box]
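To make steps 5 through 9 concrete, here is a minimal Python sketch of such an aggregate ranking. The schools, indicators and numbers are all made up, and averaging the rescaled indicator scores (rather than simply adding them up) is one defensible choice among several for keeping each component on a 0 to 100 scale.

```python
# Raw indicator values per school -- all numbers invented for illustration
# (step 3). The indicators mimic the FT "idea generation" example above.
raw = {
    "School X": {"pct_faculty_phd": 92, "phd_grads_3yr": 40, "journal_output": 15},
    "School Y": {"pct_faculty_phd": 78, "phd_grads_3yr": 25, "journal_output": 30},
    "School Z": {"pct_faculty_phd": 85, "phd_grads_3yr": 10, "journal_output": 5},
}

# Step 4: group indicators into components the uninitiated can grasp.
components = {"idea_generation": ["pct_faculty_phd", "phd_grads_3yr", "journal_output"]}

# Step 7: component weights -- a value judgment that must be explained.
weights = {"idea_generation": 1.0}

def rescale(values: dict) -> dict:
    """Step 6: min-max rescaling so the best school scores 100, the worst 0."""
    lo, hi = min(values.values()), max(values.values())
    return {k: 100 * (v - lo) / (hi - lo) if hi > lo else 50.0
            for k, v in values.items()}

# Step 5: combine the rescaled indicators into one score per component.
# Averaging keeps the component itself on a 0-100 scale.
component_scores = {}
for comp, indicators in components.items():
    per_indicator = {ind: rescale({s: raw[s][ind] for s in raw}) for ind in indicators}
    component_scores[comp] = {
        s: sum(per_indicator[ind][s] for ind in indicators) / len(indicators)
        for s in raw
    }

# Steps 8 and 9: weighted sum of component scores, presented on a 0-100 scale.
aggregate = {s: sum(weights[c] * component_scores[c][s] for c in components)
             for s in raw}

for school, score in sorted(aggregate.items(), key=lambda kv: -kv[1]):
    print(f"{school}: {score:.1f}")
# School X: 80.0
# School Y: 50.0
# School Z: 16.7
```

Notice how much rides on the choices hidden in this sketch: which indicators enter, how they are rescaled, and what the weights are. Change any of them and the ranking changes too, which is exactly why the methodology must be spelled out.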


What is your take?

– What’s your favorite ranking (e.g., sports), and why do you like it?

– Which university ranking did you use when you applied for college?

– What do you like most about rankings?

– What advice would you give a high school student regarding college rankings?

FT Global MBA Ranking

As I pointed out above, each ranking has something we might be able to use for our own purposes. The one below shows which business school provides the best value (i.e., current income minus tuition, books, wages lost while attending the program, etc.).

Surprising, is it not? The best-known schools rank low. But maybe you want to use different criteria for your ranking… Check it out for yourself.

[su_custom_gallery source=”media: 2552″ limit=”7″ link=”image” target=”blank” width=”497px” height=”650px” alt=”Financial Times Global MBA Ranking – Value for money”]

FT Global MBA Ranking – the winner based on value is the University of Cape Town – Graduate School of Business.
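For readers who want to apply the same idea to their own situation, here is a back-of-the-envelope sketch. The FT does not spell out its exact formula here, so this is only one plausible reading of “value” – current income set against the total cost of the degree – and every figure below is hypothetical.

```python
# All figures are hypothetical and in US dollars.
current_salary   = 120_000   # salary three years after graduation
tuition_and_fees = 90_000    # total for the program
books_and_misc   = 5_000
lost_wages       = 110_000   # salary forgone while studying full time

total_cost = tuition_and_fees + books_and_misc + lost_wages

# One simple reading of "value": dollars of current salary earned per
# dollar invested in the degree.
value_for_money = current_salary / total_cost

print(f"Cost of the degree: ${total_cost:,}")        # Cost of the degree: $205,000
print(f"Value for money:    {value_for_money:.2f}")  # Value for money: 0.59
```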

Things worth reading

1. Fung, Kaiser (2013). Number sense: How to use big data to your advantage. New York: McGraw-Hill. Available from http://www.mheducation.co.uk/9780071799669-emea-numbersense-how-to-use-big-data-to-your-advantage

2. Kenrick, Douglas T. (September 30, 2014). When statistics are seriously sexy: Sex, lies and big data. Psychology Today online. Retrieved November 2, 2015 from https://www.psychologytoday.com/blog/sex-murder-and-the-meaning-life/201409/when-statistics-are-seriously-sexy

3. Kenrick, Douglas T. (June 20, 2012). Sexy statistics: What’s the one best question to predict casual sex? The science of sex, beer and enduring love. Psychology Today online. Retrieved November 3, 2015 from https://www.psychologytoday.com/blog/sex-murder-and-the-meaning-life/201206/what-s-the-one-best-question-predict-casual-sex

4. Langville, Amy N. & Meyer, Carl D. (2012). Who’s #1? The science of rating and ranking. Princeton, NJ: Princeton University Press. Available from http://press.princeton.edu/titles/9661.html

5. Rudder, Christian (September 2015). Dataclysm: Love, sex, race, and identity – what our online lives tell us about our offline selves. New York: Broadway Books. Available from http://www.penguinrandomhouse.com/books/223045/dataclysm-by-christian-rudder/9780385347396/

6. Stake, Jeffrey Evans and Alexeev, Michael (October 30, 2014). Who Responds to U.S. News & World Report’s Law School Rankings? Indiana University School of Law-Bloomington Legal Studies Research Paper No. 55. Available at SSRN: http://ssrn.com/abstract=913427

Single worst advice – the answer

After reading this blog entry, it is obvious that using a single ranking is not smart.
Use a few and be aware of each one’s weaknesses and strengths.

Choose the component that helps you the most. If by any chance two rankings use the same component (e.g., salary), compare the numbers and smile.

Nothing is perfect. And since you read all the way to the end, why not write a comment and subscribe to our newsletter?


Facebook engaged in a large study to see whether users’ emotional states could be affected by the content of their news feeds.

  • Consent of human subjects: subjects were not asked for permission first.
  • Findings: extremely small effects.
  • Research methodology: poor algorithms were used, and the findings are questionable.

Key finding: a reduction in negative content in a person’s Facebook news feed increased positive content in that person’s own posts by about 1/15 of one percent!

We address 3 questions

1. Why did some of the checks and balances possibly fail?
2. Should we worry about the study’s findings?
3. What benefits do Facebook users get out of this study?

Non-techie description of the study: “News feed: ‘Emotional contagion’ sweeps Facebook”

1. Some checks and balances failed

Following the spirit as well as the letter of the law is the key to successful compliance. In turn, any system of governance depends on the participants doing their jobs thoroughly and carefully.

In this case, the academics thought this was an important subject that could be nicely studied with Facebook users. They may not have considered how much it might upset users and the media.

Cornell University has a procedure in place for getting approval for research with human subjects. The researcher is expected to reflect on the project and, if in doubt, ask for help.


The university points out that it did not review the study. Specifically, it did not check whether the study met university guidelines for research with human subjects. The reasons given were that its staff:
