Introduction and aim
Once an institution’s strategic plan is agreed, governors are responsible for ensuring that the plan is delivered and that planned outcomes are achieved. This note examines how governors can effectively discharge these responsibilities.
Monitoring strategic plans
Once the institution’s strategic plan is signed off, the executive is responsible for its implementation and for delivering the agreed strategic priorities and objectives.
For governors to be able to hold the executive to account a process of reviewing institutional performance against planned outcomes needs to be established. This should allow areas of weak implementation to be easily and quickly identified.
Both the Committee of University Chairs (CUC) and Scottish higher education codes of governance highlight that the governing body should review the implementation of the strategic plan and institutional performance. The CUC higher education code of governance states that the governing body ‘must rigorously assess all aspects of the institution’s sustainability, in the broadest sense, using an appropriate range of mechanisms which include relevant key performance indicators (KPIs) and other performance measures adopted in a risk-based framework.’ The Scottish code states there should be an annual process focusing on whether ‘long-term strategic objectives and short-term KPIs’ have been met.1
Rationale for measuring
The rationale for measuring, monitoring and assessing performance is that: ‘Measurement provides focus and feedback. Focus comes from an awareness that outcomes will be examined, and success and failure noted, creating a personal incentive to perform well.’2
Unless set up as for-profit alternative providers, higher education institutions are not-for-profit ‘public’ organisations, providing education for public benefit. While institutions exist for educational purposes, they are also autonomous and independent legal entities and, over the longer term, need to operate as sustainable businesses in order to survive. This highlights the importance of institutions having multiple objectives, and the need to manage performance for both today and tomorrow.
Institutions need to give attention to their longer-term sustainability, as well as to short-term performance. They therefore need to adopt performance indicators that enable informed judgments to be made about their longer-term, as well as shorter-term, performance.
To monitor performance over the longer term (eg three years), institutions may wish to adopt two summary (or ‘super’) performance indicators as follows:3
Academic profile and market position: primarily a contextual indicator, reflecting the institution’s academic aspirations and its intended position within the higher education sector. It seeks to capture the academic character and positioning of the institution.
Institutional sustainability: sustainability is not the same as survival, and although frequently framed in financial terms, it is not just about money. Institutional sustainability is about operating in ways which do not impair the institution’s future capabilities. It includes the ability to attract and retain the staff required to deliver its vision and goals, and to generate sufficient cash to support strategic investments and manage risk.
Linked to the summary indicators, a set of KPIs supporting the achievement of top-level indicators, but focusing on a shorter time period, should be identified. These indicators will reflect performance in more specific areas of institutional activity.
The balanced scorecard4
To accommodate multiple objectives many institutions have adopted a balanced scorecard (BSC) or dashboard to report performance.5 The idea of the BSC is simple. Rather than relying only on a set of financial measures to assess and monitor performance, non-financial measures are also introduced to give a more rounded or balanced basis on which to judge performance.
Adopting a balanced scorecard
The BSC approach starts with the institution’s strategic plan and detailed strategic priorities and objectives. Typically, this might involve looking at objectives in respect of, say, four key areas. For example:
For each strategic priority and associated objective the idea is to select KPIs that offer a reliable basis for judging performance. For example, a focus on student recruitment and experience could lead to indicators covering recruitment, retention and satisfaction, eg:
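As a purely illustrative sketch, a scorecard of this kind can be thought of as a simple mapping from strategic priorities to their supporting KPIs. The priority names and indicators below are invented examples for illustration, not a list taken from this note:

```python
# Illustrative only: a minimal balanced-scorecard structure mapping
# strategic priorities to KPIs. All names are hypothetical examples.
scorecard = {
    "Student recruitment and experience": [
        "Applications per place",
        "First-year retention rate (%)",
        "NSS overall satisfaction (%)",
    ],
    "Research": ["Research income (£m)"],
    "Financial sustainability": ["Operating surplus (% of income)"],
    "Staff": ["Staff turnover (%)"],
}

# A governors' report would typically summarise, not list, the detail.
for priority, kpis in scorecard.items():
    print(f"{priority}: {len(kpis)} KPI(s)")
```

Keeping the structure this simple at the top level reflects the note’s point that governors should review a small number of indicators, with fuller detail cascaded to operational management.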
Agreeing the KPIs
Governors should be engaged in the process of agreeing the KPIs. Governors should satisfy themselves that the chosen indicators are not simply things which can be easily measured, but reflect key priorities and objectives. When selecting individual indicators governors should consider:
Governors should focus on a small number of top-level indicators (both summary and short-term), concentrating attention on the key priorities and objectives and reducing the risk of governors being swamped by excessive data.
While governors should only review a small number of indicators, this does not preclude the cascade of the KPIs for the purposes of operational management, enabling, for example, the executive team to link operational performance at faculty/school/department level to the institution’s strategic priorities and objectives.
Normally, a performance target covering the period of the institution’s strategic plan will be agreed for each KPI. Setting the target (objective) may be informed by comparisons with the institution’s past performance (historical trend) or by the level of performance achieved by competitors (peer referencing). The institution may also agree milestones – intermediate targets – for each year of the strategic plan.
If the intention is to improve on current performance then a target may be set which is stretching. Alternatively, where current performance is judged to be good and matches competitors, attention may be directed at maintaining the current level.
Targets may be expressed in absolute terms (eg the score achieved in the NSS), as a percentage change, or as a percentage point improvement (eg movement of 5 percentage points from, say, 70% to 75%).
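The distinction between a percentage change and a percentage point improvement is easily confused, so the arithmetic for the 70% to 75% example above is worth setting out explicitly (the figures are the note’s own illustration):

```python
# The same movement in a satisfaction score, expressed three ways.
baseline = 70.0  # current score (%)
target = 75.0    # target score (%)

absolute_target = target                                  # 75% satisfaction
point_improvement = target - baseline                     # 5 percentage points
percentage_change = (target - baseline) / baseline * 100  # relative increase

print(f"Absolute target: {absolute_target}%")
print(f"Percentage point improvement: {point_improvement}")
print(f"Percentage change: {percentage_change:.1f}%")
```

Note that a 5 percentage point improvement from 70% is roughly a 7.1% relative increase; reports to governors should make clear which form is being used.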
Governors should ensure that appropriate performance measures and targets are adopted by the institution, and that the data is presented in a form which enables lay governors to make informed judgments about the institution’s progress even though they may not have professional expertise in the area to which the indicator applies.
How often will KPIs be updated?
The frequency with which individual KPIs are updated will depend on the availability of data. For instance, institutions can track applications to full-time courses throughout the annual recruitment cycle, enabling judgments to be made during the year about whether recruitment is on target or whether action is required to correct an adverse trend. Data for other indicators, for example the NSS, is only collected and published annually.
Review of KPIs by governors
Governors will normally review the institution’s KPIs several times a year. Examination of the data should focus on assessing whether satisfactory progress is being made towards the agreed strategic priorities and objectives.
When reviewing outcomes, many institutions use a traffic light system to highlight performance. At its simplest, a three-colour red-amber-green (RAG) system is used: if an indicator is significantly off target it is shown as red; if there are some concerns, amber; and if performance is judged to be on target, green.
If an indicator is red or there is a marked deterioration in the level previously achieved, governors should expect to receive a clear explanation of the reasons from management and what action, if any, is required. In exceptional circumstances, if poor performance is judged to be outside of the control of the institution, the target itself may be reset.
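One way such a RAG judgment might be automated is sketched below. The thresholds (a shortfall of up to 5% of target as green, up to 10% as amber) are assumptions chosen for illustration, not values from this note; in practice each institution would set its own tolerances per indicator:

```python
def rag_status(actual: float, target: float) -> str:
    """Classify performance against a target as green, amber or red.

    Thresholds are illustrative assumptions: within 5% of target is
    green, within 10% is amber, anything worse is red.
    """
    if target == 0:
        raise ValueError("target must be non-zero")
    shortfall = (target - actual) / target  # fraction below target
    if shortfall <= 0.05:
        return "green"   # on or near target
    elif shortfall <= 0.10:
        return "amber"   # some concerns
    else:
        return "red"     # significantly off target

print(rag_status(74, 75))  # small shortfall
print(rag_status(66, 75))  # significant shortfall
```

A real dashboard would also flag a marked deterioration against the previous period, not just distance from target, in line with the point above.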
What can go wrong?
Typical problems that can occur are that the ‘wrong’ areas are monitored; inappropriate indicators are chosen; there are too many indicators; the data is hard to understand and interpret; governors fail to understand the links between different measures; and insufficient attention is given to institutional sustainability or to academic profile and market position.
Questions to consider
End notes and further reading
Associate Director, Governance
Aaron Porter was appointed as Associate Director, Governance at the Leadership Foundation (LF) in January 2014. He is also a higher education consultant and a freelance journalist, having previously been president of the National Union of Students (NUS) from 2010 to 2011. He is also an associate for the LF and the Higher Education Academy (HEA), on the advisory network for the Office for Fair Access (OFFA) and CFE research and consultancy, alongside a number of other portfolio roles.
During his high profile term at NUS, he was the first NUS President to be invited as an observer to the board of the Higher Education Funding Council for England (Hefce) and to address the annual Universities UK Conference in September 2010. In addition he served as a non-executive director on the boards of UCAS, the HEA and Endsleigh Insurance. He also co-chaired the Beer/Porter Student Charter group which reported to Higher Education Minister David Willetts in January 2011, and was a member of the Hefce Online Learning Taskforce and the review of External Examiners chaired by Dame Janet Finch both conducted in 2010/11.
Previous to his term as NUS President, Aaron served two successful terms as NUS Vice-President (Higher Education), helping to build NUS’ reputation with the sector. He also served as a non-executive board member for the Office of the Independent Adjudicator (OIA) and on the board of the European Students’ Union (ESU). He was also a member of the Burgess Implementation Steering Group and the National Student Survey Steering Group. In 2009, he was part of the UK delegation to the European higher education ministerial summit in Leuven, Belgium.
Aaron studied BA English at the University of Leicester, graduating in 2006. He then spent two years as a sabbatical officer and trustee of the students’ union, and was the founding chair of Unions94 (the students’ unions of the 1994 Group). As a student he was editor of the student newspaper, ‘The Ripple’.
Governance Web Editor
David Williams is Governance Editor for the LF website. He has over 25 years’ experience of working in higher education, both as an academic and as a senior manager. During this time he has worked closely with governing bodies, contributing to and supporting their work in a variety of ways.
As Governance Editor, David works with the wider LF community and its members to ensure the governance website offers a repository of information and signposts recent developments in the field of governance.