Posted on 14 Jun 2021 at 11:57 by Vicky Lewis
The first part of my research project on UK university global engagement strategies showed that there’s often a mismatch between the global rhetoric in strategy documents (with its emphasis on making a positive contribution to the world) and the measures of international success that are selected (which tend to be more about building institutional profile, reach and income).
I therefore probed this area with my interviewees, generating a variety of responses. This blog seeks to tease out some of their different perspectives and suggests some alternative approaches to measuring success.
It draws on the final chapter (Chapter 12) of my report – UK Universities’ Global Engagement Strategies: Time for a rethink? (There’s an overview of all the sections and chapters in the report here.)
As one interviewee pointed out: ‘Ultimately we’re seeking transformational change, placing global at the heart of our thinking, services, provision. But that’s hard to measure’. The fundamental conundrum is that KPIs need to be measurable, but impact is difficult to assess. The large number of variables involved in most international goals also makes it challenging to attribute impact to any single cause. As one person observed: ‘You can’t draw a straight line between, say, participation in international experience and performance’.
Some felt that, behind the ‘soft power’ rhetoric, it is important to have some hard measures, driven in part by the need for financial sustainability and the expectations of external bodies. Sometimes the choice of KPIs is a pragmatic response based on the data that are available.
Others noted that, when you go beyond numbers and income to experience and development, the impact will be less tangible but more meaningful and enduring. It was suggested that ‘internationalisation strategies need to be able to acknowledge fuzziness and leave some open space for things that can’t be neatly measured’.
On the whole, interviewees advocated using a balanced range of KPIs – some softer and more qualitative, others harder and more quantifiable.
Even if rankings are not used as a formal (or publicly stated) KPI, most UK HEIs are acutely aware of their position in national league tables and, especially for more prestigious institutions, global rankings. One interviewee suggested that the sector has failed to communicate a clear purpose (what we want to be measured by), so league tables have filled the vacuum.
Many interviewees agreed that league tables represent unsatisfactory and blunt proxies and are not very good at driving what you do. One noted that the efforts made to improve one’s position ‘take up too many resources and focus energies on the wrong things’.
This suggests that they may be more about ‘looking good’ than ‘being good’, falling into the category of vanity metrics. The value of being in a club of peer institutions was also questioned since ‘everyone benchmarks against each other and mimics each other’, resulting in strategies and behaviours that resemble one another.
While there was a cautious welcome to the THE Impact rankings (launched in 2019), which rank institutions (based on information supplied by them) on their progress towards the UN’s Sustainable Development Goals, one interviewee noted that ‘there is now a kind of fatigue with the global rankings’. This is reflected in much media commentary, where the very notion of forcing institutions into a pecking order over which they have little direct control (their position depends on how others do) is frequently questioned. (For a great explanation of the differences between rankings and ratings, see this piece by QS.)
Several interviewees thought it unlikely that the current rankings model would be disrupted and one stated that ‘we need to start pushing the boundaries a bit more to get the mainstream global league tables to broaden their perspective’.
All in all, interviewees felt that, whatever stance one’s institution took towards rankings, it was important to maintain independent thought. As one interviewee put it: ‘I think inner confidence is more important than actually competing – being able to excel on your own terms’.
Several people observed that metrics should reflect what makes individual HEIs unique – or at least distinctive; and some seemed hopeful that the next generation of strategies would embrace a more nuanced set of metrics.
As one said, ‘We’ve used the same indicators for decades, but I think this will change. The current ones are not fit for purpose’. Another observed: ‘We’re not getting it right yet. The measurement doesn’t speak to the value of internationalisation. Success is defined mainly from the institutional perspective, not from that of individuals engaged in the internationalisation process’.
Some interviewees made explicit mention of alternative measures. One mentioned the International Integrated Reporting Framework, championed by Advance HE, which focuses on the real value of higher education (including its contribution to sustainable development), rather than comparative metrics. Another mentioned the Positive Impact Rating, a new tool designed to capture the voice of students to evaluate the positive societal impacts of Business Schools across a range of criteria, to foster collaboration and inspire deep change.
Other indicators with a more directly international focus were mentioned. These included the Education Insight Global Engagement Index (GEI), launched in October 2020. This ‘aims to capture the state of international engagement across 30 measures, which have varying relevance to the UK’s diverse higher education sector’ and seeks to spark conversations about global engagement in support of institutional strategy development. Measures are proxies for different aspects of internationalisation and include: geographical diversity of international students, transnational education for capacity building, international student success (including graduate outcomes), international themes within curricula, environmental impact, sustainable development and engagement with ODA countries, and impact of research produced in international collaboration. The GEI is freely available and draws on data that are already either reported or publicly accessible. As a result, it rates the vast majority of UK higher education institutions.
A commercial i-graduate product (not mentioned by any interviewees), which takes a completely different approach, is the Global Education Profiler (GEP). This has a robust academic underpinning, having been developed at the University of Warwick to focus on the factors that lead to ‘community internationalisation’. These include levels of intercultural interaction and integration across the diverse university community. There is an emphasis on developing global graduate skills and achieving more meaningful internationalisation by closing the gap between what students – and staff – value and what they are actually experiencing. As such, it is firmly focused on diagnosing areas for development and tracking institutional improvement. To date, it has experienced better take-up outside the UK than it has among UK HEIs. However, it aligns well with the ‘internationalisation for all’ approach that many UK universities are now saying is important to them.
Many HEIs use their own internally developed surveys (and other, qualitative mechanisms such as focus groups) as ways of identifying areas for improvement and measuring progress over time. The important thing is to ensure that strategic insights gained from such exercises are fed into overarching strategy. Just as metrics need to align with the priorities articulated in a strategy document, strategy needs to be rooted in the realities of practice and experience.
Universities currently reviewing their strategies should ask themselves the following questions:
The full Global Strategies report can be downloaded from Global Strategies Report – April 2021.
That page also includes download buttons for the Executive Summary and for an Overview of key questions for HEIs to ask, as leaders develop, review and consult on strategy.