
UBC Reports | Vol. 52 | No. 11 | Nov. 2, 2006

The Student Experience in a Rankings Wrangle

November is the traditional month of undergraduate student-focused Canadian university rankings. This year, UBC and 21 other Canadian universities declined to participate in the Maclean’s rankings because of methodological concerns.

At the same time, UBC has made the National Survey of Student Engagement (NSSE) results available on the Web (http://www.pair.ubc.ca/studies/nsse.htm). In 2003, UBC became the first large Canadian research university to subscribe to this US-based survey instrument that is widely recognized as a valid and reliable tool for providing benchmarks for the effectiveness of undergraduate education.

NSSE results consist of the summaries of students’ responses to about 90 questions, which NSSE aggregates into five broad areas of student engagement, referred to as “benchmarks”:

1 – Level of academic challenge
2 – Active and collaborative learning
3 – Student-faculty interactions
4 – Enriching educational experiences
5 – Supportive campus environment

UBC Reports asked Dr. Anna M. Kindler, Vice Provost and Associate Vice President, Academic Affairs, to discuss surveys such as NSSE, and what UBC is doing with the data.

Why do so many universities believe that Maclean’s rankings are not a good guide to the student undergraduate experience?

The universities’ decision to withdraw from participation in the 2006 Maclean’s questionnaire was motivated by concerns over the methodology used in the survey and the validity of some of its measures. The paramount issue was the inappropriateness of aggregating data across a wide range of programs to arrive at a simple ranking. This problem is of particular concern in the context of large, comprehensive universities, where the level of excellence of individual programs naturally varies. Averaging the strength of all programs does not provide helpful information to prospective students, as it overstates the quality of some programs and understates the quality of others.

Concerns were also expressed about partial accounting and other methodological flaws, which made it inappropriate for universities, as institutions dedicated to the highest standards of scholarship, to continue to contribute to a process that does not conform to those standards.

Is it really possible to measure the student experience through instruments such as NSSE?

No single measure can fully account for the quality of the student experience. However, surveys such as NSSE provide valuable insights that can help universities assess and improve their teaching and learning environments. In the case of NSSE, the data focus on aspects of student engagement that research has identified as important factors leading to positive learning outcomes.

How should students and their parents use the NSSE data?

I would suggest that prospective students or their parents may wish to look at the particular items or sets of related items of the survey that seem most relevant to their individual learning needs. In other words, the question “What really matters to me/my son or daughter when we think about the quality of a learning environment?” should be asked when considering the NSSE data. This issue of a “personal fit” is paramount in deriving meaningful conclusions from surveys such as NSSE.

I would also suggest that they consider NSSE as only one piece of information in making decisions about post-secondary education. It is important to remember that NSSE is a survey of engagement, not outcomes. Prospective students and their parents may thus also wish to look at data on graduation rates; graduate and professional program admissions; employment success and income of graduates; community involvement; and reports by alumni reflecting on the value of their university experience. All of these may provide very useful insights into the decision-making process.

Does a research-intensive university like UBC inherently lessen the student learning experience?

Absolutely not. As exciting sites of knowledge creation, research-intensive universities have a unique potential to offer learning environments that are intellectually stimulating, challenging and engaging. The presence on campus of world class researchers in a broad range of academic disciplines and interdisciplinary fields allows universities to design and implement cutting-edge courses where the breadth and depth of curriculum are informed by both history and innovation.  A research-intensive environment is also conducive to setting high standards of scholarship and building a culture of rigorous academic discourse across all programs.

The challenge for research-intensive universities is to optimize this potential and seek creative ways of encouraging and supporting researchers’ active involvement in teaching — not only through graduate supervision and mentorship but through a wider repertoire of pedagogical engagement. I speak of this as a challenge because traditional teaching and learning models have often led to a polarization of faculty members’ teaching and research priorities and have created competing time pressures.

Recently, UBC has made this challenge one of its priorities and through approaches such as the Carl Wieman Science Education Initiative and the recently announced President’s Teaching and Learning Enhancement Initiative will be piloting new models of engaging outstanding researchers in undergraduate teaching.

The NSSE results show that Canadian universities score poorly compared to their US counterparts, and that in Canada, UBC is slightly behind Canadian peers. Why do US universities do such an apparently better job of student engagement? And what does NSSE tell us about UBC?

There are several possible explanations that may account for this difference between US universities and their Canadian counterparts. The level of funding for public universities on each side of the border suggests a particularly plausible one. Many aspects of learning environments are tied to the issue of resources. Universities’ ability to hire and retain outstanding faculty, upgrade teaching infrastructure, enhance the provision of technology-based learning environments that facilitate engagement, expand the repertoire of relevant student services or build inviting informal learning spaces are all a function of the available resources. According to a recent report by the Association of Universities and Colleges of Canada (AUCC), based on data from the National Center for Education Statistics in the US and Statistics Canada, government funding of public four-year colleges and universities increased by 25 per cent in the US between 1980 and 2004/2005, compared to a 20 per cent reduction in Canada.

Because engagement in learning is very closely related to student-faculty interactions, data on enrolment and faculty growth in the US and Canada may further help explain the NSSE results. Between 1986 and 2003, as reported by AUCC, growth in student numbers in the US was closely paralleled by growth in full-time faculty, allowing US universities to maintain relatively low student/faculty ratios. Over the same period in Canada, we experienced 45 per cent growth in student numbers but only about seven per cent growth in faculty. This uneven growth could naturally be expected to negatively affect at least some aspects of student engagement.

As to UBC’s performance relative to our peers: on most of the NSSE indicators where UBC is positioned slightly behind its counterparts, the differences are very small when the effect size is considered — which suggests that large Canadian research-intensive universities face similar challenges in the provision of undergraduate education. Having said that, we recognize that there are areas where we specifically need to focus our efforts to improve the quality of the student experience, and the recent re-surfacing of undergraduate teaching and learning as one of UBC’s key priorities is an indication that we are serious about achieving progress on this front.

What is UBC doing with what it has learned from the NSSE data?

NSSE data feeds directly into our SHINE 2010 (Students Horizons In Education) initiative, which commits the university to supporting, and measuring the impact of, a range of undertakings specifically aimed at enhancing the quality of teaching and learning at UBC. These initiatives range from a review of our internal quality assurance processes with respect to the performance of academic units; through expanded professional development opportunities in teaching for faculty members and Teaching Assistants, including a new Certificate Program for TAs; and support for faculty engagement in the scholarship of teaching and learning; to additional funding directed to efforts focused on teaching improvement — some of which I have already mentioned.

On the measurement side, NSSE is one of the assessment instruments we have selected for our internal benchmarking, allowing us to tune our initiatives to yield optimal outcomes. We have also struck a joint Senate committee, with student participation, to enhance the process of gathering and analyzing student evaluations of teaching so that this data, in combination with NSSE and other forms of assessment, can be used more effectively to guide improvement.

While SHINE 2010 is a university-wide effort, there are also numerous new initiatives recently championed by Faculties to enhance undergraduate education. For example, the Faculty of Arts has engaged in an extensive undergraduate curriculum re-visioning process, which has already resulted in significant enhancements of the first-year experience for students enrolled in the Coordinated Arts Program. In partnership with the Arts student government, the faculty has also implemented several projects to help build a sense of community in this very large, diverse unit, including the ArtsPeak events for the graduating class.

The Faculties of Science, Arts and the Sauder School of Business have embarked on new forms of collaboration to expand the repertoire of available courses and majors/minors. The Faculty of Forestry has incorporated development of effective communication skills into all of its programs. The Faculty of Land and Food Systems has developed a Career Ambassador Program in collaboration with UBC’s Career Services Office and enhanced its tri-mentoring program that prepares students for “real-world” careers.

While none of these initiatives has been directly prompted by NSSE, the NSSE data, along with other measures, has helped to focus and fine tune these undertakings, and, very importantly, will continue to help us assess their impact over time.

Our interest in NSSE has been focused on how the information that it offers can contribute to making UBC learning environments as effective and responsive to the needs of our students as they can be. We have recently engaged with the Deans in considering additional ways of using the NSSE data in the context of specific priorities of individual faculties and the UBC Trek 2010 goals.

NSSE Director on Maclean’s Rankings

The following letter was printed July 27, 2006, in the University of Calgary’s student newspaper The Gauntlet. It is re-printed with permission of Prof. George Kuh.

I am dismayed that Maclean’s used a few results from the National Survey of Student Engagement (NSSE) in its [Spring 2006] rankings of Canadian universities. NSSE has always eschewed this idea, posting the reasons why on its website (http://nsse.iub.edu/html/usingst.cfm). Rankings are inherently flawed because they reduce complex dimensions of university life to a single number. Ranking Canadian institutions is especially problematic because they have different missions, offer different majors, and enroll different mixes of younger and older full-time and part-time students and transfers. These and many other features affect student engagement and make it possible for institutions to offer different, yet rich, nuanced and meaningful educational experiences for their students.

Rankings may sell magazines but they do little to help the public understand what makes for a high-quality undergraduate experience. Rankings also have the potential to discourage universities from serious efforts to discover what their students are doing and learning, and then using this information to improve. Forcing universities to release their student engagement results before institutions have had a fair opportunity to understand and use the data to get better may mean some schools will forgo using NSSE or other assessment tools in the future. That outcome would be an ironic tragedy, contrary to the public interest.

Public disclosure is good, and we need more of it. Indeed, NSSE strongly encourages individual institutions to make available their student engagement results so that over time prospective students and others will become better informed about what to look for when choosing a university and the kinds of educational activities that matter to their learning. But estimates of university quality must be based on more information than rankings based primarily on student satisfaction indicators.

George D. Kuh,
Chancellor’s Professor and Director,
National Survey of Student Engagement
Indiana University, Bloomington
