0924: First panel of the day: Do rankings in the media drive university priorities? Moderator: Scott Jaschik; Bob Morse, US News; John O'Leary, Times Good University Guide; Simon Beck, Globe & Mail; Mary Dwyer, Macleans; Indira Samarasekera, University of Alberta; David Naylor, University of Toronto.
Bob Morse: Citation analysis; are we creating bad behaviour? There's a large impact on academia from our Best College Rankings, no doubt. Many academic studies show that the US News Best College Rankings influenced decisions within the university, influenced priorities and strategies. These actions the schools take could be argued as both positive and negative. An example from a recent study (May 2010) by NACAC (admissions counsellors in high school, Canada & US) on admissions counsellors' perspectives on the US News rankings: rankings have grown in influence over the past 5 years. The majority hold a negative opinion of the rankings. 90% believed the rankings put pressure on schools to maintain their ranking; 46% believed that their schools make programmatic changes ("other schools do it; we don't").
From a ranker's perspective, the academic reality is that you can be heavily criticised by the Provost and President while the campus is still using the rankings for marketing and alumni relations. Is this hypocrisy?
Rankings should be viewed as part of the US higher ed accountability movement: education policies, funds expended, how much students learn, whether students earn enough to pay off their loans.
Rankings created a competitive environment in higher ed that didn't exist before; some see this as an improvement. Rankings are now an annual public benchmark against which academics measure themselves. Moving up in the rankings has often become a very public goal for universities.
College presidents are able to say that rankings have become a management tool: if they move up in the rankings, that means their educational policies have worked; they've made "progress". Do rankings make administrators do the wrong thing? Is the sole purpose to improve in the rankings? Are the decisions good for students; do they foster learning? Are those policy choices good or not? When a school makes an effort to improve graduation rates... [he assumes this means that schools fund more classes; but I would disagree.] Students "benefit" from the rankings: schools can attract better faculty and students.
Some call ranking a case of extreme unintended consequences; there have been a lot of these consequences. But rankings have become a reality and they've become the forefront of higher ed. I think the rankings are here to stay.
09:32: John O'Leary: It would be naive to believe that there's been no effect. A very different effect in the UK, but still a major influence. Started in 1993. An influence that "no ranker wants": most don't want any influence, because it distorts the ranking process as well as distorting education. But they do have an effect. What isn't affected are the universities in a position of strength at the top: Oxford and Cambridge have been first and second every year.
Southampton University was deciding whether to take in a college of education and was concerned about the effect this would have on its ranking. It recognised the risk and still took in the college. Its ranking dropped but then rose again after a few years.
Main drivers are the governors of universities and to some extent the alumni.
Effect on applications, but not for all universities. The rankings do affect prestige and the international market. Concern that they'll distort the mission of universities further down the tables, particularly those that want to widen access (this would lower the average entry grade of incoming students). Research is only one of eight measures in the domestic (UK) case.
Some beneficial effects: the UK rankings' most heavily weighted factor was teaching quality, which made universities pay far more attention to it than they had before the rankings existed. Eventually the universities had that assessment system abolished; there are now student satisfaction measures, which also have the effect of making universities pay more attention to students. Happy to admit that there is an effect on behaviour, but not all of it negative.
Simon Beck: Canadian University Report, Globe & Mail, for the past 10 years; not a ranking but a student satisfaction survey. The annual survey of students reports its results as grades; we do compare and contrast universities, but it's based purely on a survey of undergraduates.
Larger schools tend not to do as well on the survey.
One past UT President's reaction to not getting a good result on the survey: "Why the hell should we care what students think about the burgers in our cafeteria?"
Canadian universities have increasingly responded to the student survey. Student satisfaction has gone to the top of universities' agendas. "You're providing a service to a consumer; students are paying customers and their quality of life is important." [Note: annoying when things like class size are linked to consumerist attitudes. Class size is important, but why should this be driven by a consumerist perspective?]
Universities have been influenced by rankings but this is not always a positive thing [all panelists seem to agree on this so far. Ironic?]
Focusing attention on the quality of life of undergraduate students. Criticism of international rankings is that there's too much emphasis on research. As long as universities are paying attention to rankings for the right reasons this is a positive thing [what exactly are the right reasons?]
[Note: it's very evident from what these panelists are saying that rankings contribute significantly to marketisation of university education, including the references to students as consumers.]
09:45: Mary Dwyer, Macleans: 2 decades of ranking. I gather that the rankings DO have an influence on university priorities, to what extent and effect, it's harder to say. I've heard of both positive and negative effects.
No perfect mechanism for comparing universities across the country; the universities vary quite a bit. When we set up the rankings we had many consultations with universities and education experts to decide what should be included.
Is there too much focus on research? At Macleans, 5 years ago we changed how we did the ranking: we switched from sending a long survey to universities to collecting third-party data. We now have to work with the data that's available.
At Macleans, there IS more of a focus on research funding.
What would be some of our "dream" indicators? What about quality of teaching? We can look at faculty teaching awards, but teaching quality is a very difficult thing to measure; there's no data that simply shows what that quality level is. Same with student outcomes and student satisfaction; interested in the results of the NSSE survey [too bad there have been many methodological flaws in that one as well].
Rankings are just one tool that can get the ball rolling for students. The rankings issue still sells very well after 20 years; there's a strong interest in this information. Students can look at data for every indicator and see the numbers. They can compare the schools [again: this is a marketisation tactic--comparing schools means "consumer choice" is invoked.]
09:56: Common university data for Ontario. One of the organisers behind CUDO said: "we don't really expect students and their parents to be looking up this data." A lot of this kind of information does get presented in the media, but interested readers can dig deeper.
Indira Samarasekera, President, University of Alberta: Speak to the Canadian context; students tend to stay close to home, to go to school in their home market. Because of this the comparison of universities across Canada has little effect on the majority of students.
University priorities: David Naylor and I decided that we didn't want to use public money to support rankings that we, at the time, didn't believe were serving the university mission of teaching and research. We boycotted the rankings; the publicly available data was used instead. I think we stood for a principle: rankings shouldn't consume university resources.
Second difficulty: individual indicators are potentially useful because they show what colleagues and peers are doing; but put "in a blender" they become a meaningless number. Our students never pay attention to this; we pay no attention either. No one on campus noticed when we went up in the rankings. It depends on how institutions have viewed the rankings.
Rankings drive university priorities only to the extent that we value the data Macleans puts out, because it provides comparison to peers, not because we want to change our position. Our priorities are really driven by our teaching and research mission; we're most concerned about the undergrad student experience. Funding in Canadian universities has been on the decline.
If they did use measures that were meaningful, maybe we would use them to make changes.
People use proxies for educational quality rather than actual measures of it; you can't measure educational quality directly. Student-faculty ratio? One metric for a whole university? Same with class sizes. An outstanding professor with 1,000 students is better than a crap professor with a class of 2. Students have no way of comparing their experiences to those of someone in another class; they tend to respond according to the easiness of the class, whether they got a good grade, and so on.
10:05: David Naylor: compares university rankings to a colonoscopy ;-)
It's the aggregation of a mixed bag of measures--even if they were perfect measures--that causes concern. You're looking at a certain reductionism.
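[Note: the "blender" worry can be made concrete with a toy calculation. The indicators, weights, and scores below are entirely hypothetical (no real ranking's methodology is implied); the sketch just shows how a weighted composite can assign identical scores to institutions with opposite strengths:]

```python
# Hypothetical indicators and weights -- invented for illustration only.
weights = {"research": 0.4, "teaching": 0.4, "resources": 0.2}

# Two imaginary universities with opposite strength profiles.
uni_a = {"research": 90, "teaching": 50, "resources": 70}  # research-heavy
uni_b = {"research": 50, "teaching": 90, "resources": 70}  # teaching-heavy

def composite(scores, weights):
    """Blend per-indicator scores into a single weighted number."""
    return sum(weights[k] * scores[k] for k in weights)

print(composite(uni_a, weights))  # 70.0
print(composite(uni_b, weights))  # 70.0 -- the two profiles become indistinguishable
```

Disaggregated tables, which several panelists praise, avoid exactly this loss of information: readers can see that the two 70s were earned in very different ways.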
There is a certain cynicism about this.
The issue of measurement is inevitable; it's part of the ethos of our time. Public accountability is inevitable and reasonable. But if you don't have good measures you can end up with a quite misleading portrait of an institution. Ranking agencies that provide disaggregated data: a good idea, very helpful for students and families.
Yes, I worry about the burgers. The reality is that food service is a part of the student experience.
Broad academic priorities: the rankings don't drive what we do. We also respond to labour markets, to research priorities, and so on.
Disaggregate rankings by types of institutions. Yes the categories are arbitrary but at least we can try to avoid the comparison of highly different institutions.
Yes, universities shamelessly flog the rankings when it's to their advantage. You want to call this hypocrisy? I call it creative adaptation.
10:15: New open sourced modes of collaboration for academics online. Speakers: John Willinsky, UBC & Stanford; Mia Quint Rapaport.
Emphasis on many kinds of "openness", focusing in on specific projects.
Publishing as a form of scholarly communication.
The journal is a carry-over from the 17th century; how is open source changing that? What are the instruments used in open collaboration?
Public Knowledge Project: an urge to do something about sharing knowledge.
Faculty of education: we try to get teacher candidates interested in research before they get sent out into the classroom.
Something wrong with this picture: we want to share knowledge, but you're not allowed to share academic knowledge because it's restricted. Contradiction in terms and in practice.
What would it take to make research available to the public?
1999: How can we get our journals online? We've always been a print journal!
Undergrads explained the new open source movement. This gave a focus: to build something that could be shared.
Open Journal Systems: a platform for people to publish; a way to carry on a traditional practice. You pretend to change only one thing. Say to journal editors: there's only one thing that'll be different. You can still follow all your traditional practices, but you'll have a copy online. The platform was free and distributed for free; shared software.
Scholarly communication needs a series of platforms, places where we can come together and work, to reduce costs.
What's your excuse for not sharing your knowledge? What are the technical barriers?
"Everyone will download and no one will buy the journal anymore." But this isn't what happened; print subscriptions have continued.
9,300 journals have used the software. What are the implications of this? The starting figure in journal publishing was... we don't know how many journals there are. About 25,000; some say 50,000.
New kinds of platforms create new communities, more forms of communication and collaboration. We couldn't even send people an email unless they asked for it. They could download the software without us knowing. It could be modified; it was theirs to develop and build upon. Open source economy is very different.
10:45: Scholar-publishers: an ancient phenomenon that virtually disappeared in the face of the commercialisation of journals. Losing access because of the high price of journals; corporate consolidation, with large publishers increasingly buying up smaller ones. Creating an alternative channel: a non-proprietary, non-market economy. 9,000 journals that aren't part of the 25,000 journals; a good proportion were new, but a good number were also already "alternative", outside the notion of commercialism.
Scholars come together--low barrier to publishing; ability to circumvent both commercial publishers and societies. Built in all the processes used in journal publishing, emulating them in a workflow, including double-blind peer review and so on.
Half of the journals using OJS are in developing countries. 4,500 visible journals, visible and searchable on Google Scholar (for example). The biggest continental "user" is Latin America. 30% have more than five editors; a collaborative basis; once it's on the web, people can edit from anywhere. Rejection rates vary across the journals, ranging from 70% down to 30%; a profile that matches traditional journals.
In the "old days" you'd find scholars in the print shops [note: this is a great point; printers, writers, and others would all mingle in the print shop as a space of meeting and collaboration, discussion, debate.]
How do people contribute in terms of the software? Now, with OJS, editors are taking things back into their own hands. The core team of developers is run through the SFU library. 3,500 people participate in the online forum, providing code, plug-ins, and constructive criticism.
11:02: No official university policy around developing open source software. This is very important for biological and scientific research. Online global communities of academics have been developing.
Google Scholar approached us: they wanted to improve the indexing of the OJS journals. Open source projects are often under the radar. Google worked closely with us.
11:11: Emphasis on non-commercial vs. anti-commercial. Most important issue of academic freedom; open source software is at least one part of the future of academic freedom. To have your work reviewed and respected for what it's worth.
11:20: New panel: What are the emerging issues in higher education that the media could cover? Moderator: Stella Hughes, UNESCO. Panelists: Jane Knight, OISE; Vanessa Bridge, U of Leeds; Paul Fain, Widmeyer Communications; Philip Fine, University World News; Glen Jones, OISE; Mike Schoenfeld, Duke University.
Biggest story that I see right now is the lack of trust in higher ed. There are a ton of stories about the higher ed bubble. [Disruptive innovation!]
Major topics and themes in higher education right now: sports, salacious behaviour, salaries, tuition and cost, and for elite national media, the constant competition for admission to most selective universities in U.S. Policy environment, also media coverage, shows a huge amount of skepticism, but there's still a very high degree of trust in higher education institutions.
Economic impact of higher education: institutions of higher ed, especially those with medical centres, have become some of the largest employers in the US. But you wouldn't know that from higher education news coverage. Linkage between K-12 and higher education: coverage tends to treat the two as discrete, different entities, with no connection between them.
Changes in the nature of teaching and learning. Media coverage tends to be focused on what are they not learning; or how is technology going to change the way we teach and learn. There's a lot of interesting things happening out there, that are NOT part of the media lexicon. [Very, very good points here!]
Stella Hughes: What do you think is one of the most significant issues that could suddenly come into the spotlight in the media? "Stir of interest".
Glen Jones: Quality of national data about higher education. The state of national data infrastructure has been in severe decline. E.g. Statistics Canada long form; [also YITS and others have lost funding; Canada Council for Learning]. The way we make policy decisions is based on data, but we know almost nothing about students and faculty and this is a huge detriment. There are some provincial data systems but most provinces are reliant on national data. Government is increasing release times on data as well; this is a very important story about how we make policy decisions; but it's a "dull" story, so it doesn't tend to make the media.
11:45: Higher ed should be more aggressive in trying to tell the stories about what it actually does.
Media are "transfixed by a very traditional notion of higher education". It's a romantic, quaint notion, describing a rapidly shrinking minority of the students engaged in higher education. No story arc for the less traditional forms of higher education. Students are going to come back to college/university multiple times; so the 4-year degree with 18-year-olds is becoming a very outdated notion.
Glen Jones: Internationally, the notion of growing markets for faculty and for students is where things are going. Student recruitment, student mobility, the international student market. But less coverage of the parallel story of faculty, the "arms race" for top faculty [see yesterday's blog for more on that!]. Increasing differentiation of faculty careers between and within countries. Opportunities vary a lot for different groups/people. It's not just a matter of getting more students in; we have to provide a good learning environment as well.
"Incredible renewal" of faculty upcoming. [Problem is that we've been hearing this for years and years.] More diversity.
Lack of understanding of who faculty members are. In the US we haven't had the conversation about this. Universities are hiring adjuncts in huge numbers without discussing whether that really makes sense.
Glen Jones: Lots of focus in the media on research universities, but relatively little conversation about changes to publication, of ratings and rankings and research productivity. Most major publications are associated with only a few large international companies. Destabilisation of traditional mechanisms of higher education. What does this mean for tenure and promotion, broader hierarchies of institutions, etc.
Paul Fain: "DIY U" [book that's out right now.] But a lot of free lectures are created by higher education research institutions.
StraighterLine: breaks courses into individual, cheap online courses that you can buy from the company.
Mike Schoenfeld: Crowdsourcing, Wikipedia, etc. have already created an environment that's de-linking knowledge from credentials. Now you can get the knowledge without the credential. Where will the value end up? This is a new issue that the media will grapple with over time.
Paul Fain: It sends a powerful message: Peter Thiel offered scholarships of $100K to young people if they don't go to university. One of the most public examples at the moment.
Stella Hughes: What political battles are on the horizon, for faculty and students, national and international?
3:27PM: Wrap up panel: The role of media and higher education in promoting democratic culture. George Fallis, York University; John Burness, Duke University; moderated by Noreen Golfman.
George Fallis: Democratic culture: what is a democracy? How do we define it? Is Canada a democracy? The idea is rooted in political equality, an idea that all human beings are equal. (Universal Declaration of Human Rights) That's the root of a democracy. The idea is that people can govern themselves because they're free and endowed with reason [Liberal political discourse].
3:35: Academic literature: basic intuition is fine, but the actual definition of democracy is always ongoing. Not synonymous with freedom; negative and positive liberty [freedom from; freedom to]. "Thick" and "thin" definitions of democracy. Who are the opponents of democracy? It's about people governing themselves, so one of the opponents of democracy is "experts". It's never achieved; it's an ideal, and it's always under pressure.
There's even a ranking of democracy, and it looks somewhat like the university rankings. E.g. minority rights, peaceful transfer of power, and so on. Another category in the index: freedom of speech, thought, and association, equality before the law, and so on. What about the political culture of a country? You can have the institutions, but you might not have a vibrant democracy without the characteristics of civil society.
3:37: The press/the media are clearly "there" in a democracy; everybody understands the role of the media in a democracy [really? I'd contest that one!]. The media monitor the state of democracy in a country [again--highly contested idea]. New technologies are opening up possibilities for a more vibrant discussion. The notions of author and distributor are being broadened.
Universities are virtually never mentioned in the literature about democracy. And in the definitions and index of democracy they say very little about education. What's the role that professors can play in democratic life? They contribute in ways that can be very like what the media do.
A government FOR the people? A basic characteristic of democracy: provide the positive liberty so that people can flourish to their full potential. The university has a significant role in this. But we haven't begun to reflect much on how good citizens are "created". The historic literature on education has much to contribute; the Greek notion of education was rooted in the idea of how good citizens are created.
3:41: Liberal education: the most important role that universities can play in a democracy is in how they educate their students; we're now doing a poor job because we've pushed this away, focusing on employability, on research culture, and so on. While universities must conceive of and evaluate themselves as institutions of democracy, we must be honest that our record on supporting democracy is not that great--universities weren't places where the transformation to democratic life took place; and though education breaks down some inequality, it also creates another level of inequality around "merit". So while we have equality of opportunity, we're creating inequalities through a meritocratic system.
We have to acknowledge that much of what we do creates an inequality that's problematic in a democracy. We also create experts who try to shape/frame the public debate, whereas democracy is about the wisdom of citizens to govern themselves.
Academic freedom: in a democratic society, when the government provides the money that allows us to do what we do, there's a deep tension between parliament's responsibility for accountability and our desire for academic freedom. Some of the tensions we're facing are laudable, in the sense that a democratic society is asking to understand what we do, whether we do it well, and whether the outcomes we claim follow from our work, are what people want, and are being achieved. Tension between government support and academic freedom.
John Burness: Importance of linguistics. What do the terms mean that we're using? E.g. very different ideas of who the media are and what they do.
In the U.S. higher education is such a diverse enterprise that the label "higher education" is seen as an aggregated enterprise, when it's anything but. It's not a monolith, but it tends to be seen that way in a lot of the reporting that happens.
University mission statements: is the promotion of democracy part of the mission statements of universities? Academic freedom: encourages academics and others within the university to have the kinds of debates that are supposed to happen in the broader societies. Universities are places where people are encouraged to disagree. Younger people should be able to take these viewpoints and come to their own conclusions [critical thinking].