
IT benchsharing – Preliminary results from UCISA’s HEITS survey

March 19, 2012

Reasons for a new survey

In 1996-7 UCISA began gathering statistical information about IT services in higher education. This became an annual exercise, but over time we noted a decline in the number of institutions responding to the survey.

IT Directors were telling us that a comparison of Key Performance Indicators (KPIs) was needed for the sector. Anecdotal evidence revealed that while HE IT departments could easily identify IT payroll costs, not many were collecting data that showed, by IT staff member, the services provided to academics, administration staff and students.

It was therefore decided that the survey would be redesigned.   A member of the UCISA Executive Board proposed that UCISA adopt a benchsharing methodology.  Benchsharing is like benchmarking but with a greater emphasis on sharing what is learned across all groups, rather than simply creating a league table of who is doing what best:  www.chriscollison.com/pib.html 

A more detailed, broken-down survey would enable IT Directors to make meaningful comparisons of resources (including staff). They would be able to see how much time and money their institution spent on a specific area of activity per head compared with other universities.

HESA would continue to provide a total number of FTE staff and students for our survey, but the collection of additional data would mean that comparisons of actual activities would be possible rather than just a comparison of total institutional spend.

For example, if institution A had six dedicated audio visual (AV) staff and institution B had none (but supported AV to a similar level), and both quantified their activity and compared results, then A might consider adopting B’s approach (outsourcing, perhaps) or institution B could use A’s figures to argue for greater staff numbers.

The UCISA Executive Committee felt that the benchsharing model would work well as there was already a strong culture of sharing information amongst UK HE IT Directors, both at UCISA events and through UCISA email distribution lists.

What’s new?

The format of the survey and the survey questions were changed radically. We launched the survey using Vovici survey software, which enabled respondents to complete the questionnaire online and forward it to colleagues for completion. The most significant change was to split the survey into two distinct parts: Tier 1 and Tier 2.

Tier 1

Answers to Tier 1 questions would provide simple metrics to enable immediate comparisons between universities and colleges. This part of the survey, which would provide data on basic key performance indicators, would be made available to HESA.  It would therefore need to be completed by the Director of IT. 
We were hopeful that the relatively small number of questions in Tier 1 (thirty) would encourage a high completion rate. We encouraged all institutions to complete Tier 1 as a minimum.

Tier 2

Directors who wished to drill down to the next stage of more detailed questions and complete them (or ask staff to complete them on their behalf) would be able to make more meaningful benchsharing comparisons with other institutions.

For example, an institution could simply answer yes to “Do you have 24/7 student support for IT?” in Tier 1, but an IT Director who was interested in help desks could also drill down further in Tier 2 to answer questions on outsourcing student support and whether, for example, the help desk fixes students’ own laptops.

Directors could choose to answer all of Tier 2, some of Tier 2 or none of it.

A list of all the questions, as well as some of the high level results can be found on the UCISA website: www.ucisa.ac.uk/heits

UCISA’s special interest groups were consulted to ensure we asked a range of relevant questions from networking to procurement and from corporate information systems to staff development. 

The survey was set up so that an IT Director could forward on sections of it to different members of the IT team for completion.

Examples from Tier 1

Question 6 Total number of student workstations available across the whole institution?
Question 9 Do you have 24/7 student support for IT?
Question 12 What was the overall capital spend on IT projects in this academic year?
Question 15 Does your institution calculate and use at-seat costs?
Question 22 What is the performance of the IT department measured against? Please select all that apply: Service Level Agreement (SLA); last year’s annual report; external satisfaction survey, e.g. National Student Survey; internal satisfaction survey; strategic objectives; it is not measured; other, please list (free text field).
Question 23 Does the central IT department have a Service Level Agreement (SLA)?
Question 24 Is the IT department’s power consumption measurable?
Question 28 Does your institution have a specialist networking team? If yes, how many FTE staff are employed in the networking team?

Examples from Tier 2

Question 31 Do you use eduroam?
Question 38 Do you have a system for sending text messages to students?
Question 40 Does your institution have VOIP?
Question 49 How are management information systems linked to statutory returns? Multiple choice: fully linked reporting; we make manual returns on an ad hoc basis; other please list (free text field). 

Question 78 What is your institution’s policy on shutting down accounts (and email) for students when they graduate? (select all that apply): it is immediate; students retain accounts for a period after graduating; an auto-forward is put in place; email addresses are retained or modified for alumni; other, please list (free text field).

Results

Participation was voluntary, but we were pleased to see that 60 out of 140 UK higher education institutions participated in the survey. This was a marked improvement on the previous HEITS survey, where numbers had sometimes fallen to as low as 40 institutions.

The results presented are for the academic year 2009/10. 

Some results were expected, others were not.

The following health warning should be observed: the conclusions drawn from the results give useful indications about trends within the sector, but as only 43% of all HEIs in the UK took part, they are not definitive.

The expected…some highlights

Many of the results we saw reflected trends already observed in other research, such as the UCISA Top Concerns survey and the UCISA Technology Enhanced Learning survey.

For example we found, as expected, that for virtual learning environments (VLEs) a mixture of proprietary and open source products was used (20 institutions used open source solutions, 40 institutions used commercial solutions).

UCISA asked institutions to indicate which services they were delivering through shared services or outsourcing. We knew, anecdotally, that outsourcing of services was fairly immature in the sector. This was confirmed by the survey results, with one notable exception: nearly half of all respondents indicated that student email had been outsourced at their institution. I had been aware that many universities had moved (or were considering moving) their student email to providers such as Microsoft and Google, but I was surprised that the number of institutions who had moved from in-house arrangements was so high.

Out-of-hours student support was the most popular service to outsource (11/60 institutions).

Few respondents (4/60 IT departments) told us that they calculated at-seat costs. On the one hand this is not surprising, given the difficulty of determining this metric in a complex environment. On the other hand, this figure is likely to grow over the coming years as IT departments come under increasing pressure to demonstrate how they add value to the student experience and to the institution’s bottom line. I think too that we will see a greater number of IT Directors join HE from industry, and that there will be an attendant increase in performance measurement.

The unexpected…some highlights

I was very surprised to see that while 33 institutions used eduroam, 21 did not.  This might indicate that further marketing is required by Janet to promote eduroam (in addition to the work undertaken in collaboration with the UCISA Networking Group: www.ucisa.ac.uk/groups/ng/resources.aspx )

Measuring performance

I was interested to see that 7/60 responding IT departments did not measure their performance in any way (the multiple choice options given included fairly “soft” options such as measuring against last year’s annual report).

We also noted that 38 out of 60 IT departments did not have an SLA for the IT department. This seemed fairly high. We observed that SLAs were more widely used by the universities from the University Alliance mission group.

Power Management Software

We had some really encouraging results on the use of power management software (PMS). The survey showed that two thirds of respondents had implemented, or were in the process of implementing, PMS. This is great news and, for me, really demonstrates the sector’s readiness to adopt good ideas. Many universities have opted to use the University of Liverpool’s PowerDown code, which is freely available here: www.liv.ac.uk/csd/greenit/powerdown/index.htm

PowerDown switches off idle student PCs when no one is logged in, saving the University of Liverpool over a million hours of unnecessary PC running time per month. PowerDown uses simple scripts which are based on existing Microsoft Windows features and freeware utilities.
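As an aside, here is a minimal sketch of the general idea behind this kind of power management script. It is not the University of Liverpool’s actual PowerDown code; the choice of Python, the reliance on the standard Windows quser and shutdown commands, and the 60-second grace period are all assumptions for illustration only.

    # Illustrative sketch only -- not the actual PowerDown scripts.
    # Assumes a Windows lab PC where the standard "quser" command lists
    # logged-on sessions and "shutdown /s" powers the machine off.
    import subprocess

    def anyone_logged_in() -> bool:
        """Return True if quser reports at least one logged-on session."""
        result = subprocess.run(["quser"], capture_output=True, text=True)
        # quser exits non-zero when no one is logged on; otherwise it prints
        # a header line followed by one line per session.
        return result.returncode == 0 and len(result.stdout.strip().splitlines()) > 1

    def power_down_if_idle() -> None:
        """Shut the PC down if no user is currently logged in."""
        if not anyone_logged_in():
            # /s = shut down, /t 60 = allow a 60-second grace period.
            subprocess.run(["shutdown", "/s", "/t", "60"])

    if __name__ == "__main__":
        power_down_if_idle()

In practice a script like this would be run on a schedule (for example via Windows Task Scheduler) and would typically also consult lab timetables or quiet-hours settings so that machines are not switched off just before they are needed.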

Liverpool was awarded the prestigious Green Gown Award for this activity and has presented the results of PowerDown at a number of UCISA events.

PMS is a double win, in that the university addresses the green agenda whilst making efficiency savings.  It’s likely that we will see even greater adoption in the coming years.

The unexpected…more highlights

We asked whether institutions had a system for sending text messages to students (used, for example, to advise of campus closure due to adverse weather conditions). Of those who responded, 41 did and 11 did not. It’s hard to know if this figure is representative of the whole sector, but I was surprised that this “push” method of communication was so popular. In the UK higher education sector we have already seen other trends emerging, such as the use of institutional mobile apps and Twitter accounts.

Only nine institutions provided 24/7 student support for IT. Fifty institutions did not. I thought this figure was rather low, given the number of institutions engaged in distance learning programmes, online courses and expansion to overseas campuses. Of course, it should be remembered that this data is for academic year 2009/10; the incidence of 24/7 support may well now be higher. For example, we have observed this year that a number of institutions have joined the NorMAN Out of Hours Helpline service, a shared service providing first-line IT support to students and staff of participating universities.

Only eight institutions had campus-wide attendance monitoring systems; fifty did not. This was unexpected, I thought, given the increased attendance reporting requirements placed on universities and colleges by the UK Border Agency. I believe that institutions will look to further automate attendance reporting for two reasons: to lessen the data burden of attendance monitoring (for international students) and to increase overall student retention (through early intervention for non-attendance).

Next steps: survey results as a change agent 

Question 65 Are questions about IT included in your institution’s student satisfaction survey?

In one sense the question itself is less important than the reader’s reaction to the “outlier” in the pie chart.

In this scenario the presentation of the chart may lead the university or college to reflect on why it does not gauge student satisfaction with IT through feedback, especially given that IT is measured in the influential National Student Survey. And perhaps, if the institution chose to start collecting this information, it could approach a fellow respondent for help or advice (suggestions on approaches to student surveys for IT, perhaps).

The results as a whole are intended to encourage IT Directors to consider:

How similar/ different are the results of my institution?
What can I learn from another institution that will improve my IT department?
What can I do with the same resources?
What am I doing really well?

And to then make use of UCISA’s formal and informal networks to share information with their peers.

————————————————————————————————————————————–

This text is the narrative of a presentation on IT Benchmarking in the UK given by Anna Mathews to ZKI in Hamburg (2nd March 2012).

2 Comments
  1. April 28, 2012 11:22 am

    Great to see you putting benchsharing into action! Well done!

    • July 18, 2012 8:58 am

      A belated reply. Thanks for the comment. I’d be interested to know how other organisations are using your concept, especially if you have any examples from IT, or from public facing institutions.
