Survey of the State of Analytics in UK HE and FE institutions

Link: Survey of the State of Analytics in UK Higher and Further Education Institutions 2013 (pdf).
Link: Survey of the State of Analytics in UK Higher and Further Education Institutions 2013 (MS Word docx).

An informal survey was undertaken by Cetis in May and June 2013. Subscribers to a number of email circulation lists – with members drawn largely from institutional IT, administration and educational technology roles – were invited to respond.
The purpose of the survey was to:

  • Assess the current state of analytics in UK FE/HE.
  • Identify the challenges and barriers to using analytics.

Chart showing reported data sources for analytics

For the purpose of the survey, we defined our use of “analytics” to be the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data. In practical terms, it involves trying to find out things about an organisation – its products, services and operations – to help inform decisions about what to do next.

Activity Data and Paradata

Illustration of activity data and paradata
Link: Activity Data and Paradata (pdf)
Link: Activity Data and Paradata (MS Word .docx)

This briefing introduces a range of approaches and specifications for recording and exchanging data generated by the interactions of users with resources.

Such data is a form of Activity Data, which can be defined as “the record of any user action that can be logged on a computer”. Meaning can be derived from Activity Data by querying it to reveal patterns and context; this process is often referred to as Analytics. Activity Data can be shared as an Activity Stream, a list of recent activities performed by an individual. Initiatives such as OpenSocial, ActivityStreams and the TinCan API have produced specifications and APIs to share Activity Data across platforms and applications.
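To make the actor–verb–object shape of such records concrete, the sketch below builds a minimal Activity Stream entry as a Python dictionary. The field names follow the JSON Activity Streams convention, but the specific values are invented for illustration.

```python
import json

# A minimal Activity Stream entry: who did what, to which resource.
# Field names follow the JSON Activity Streams convention; the values
# here are invented for illustration.
activity = {
    "actor": {"objectType": "person", "displayName": "A. Learner"},
    "verb": "post",
    "object": {"objectType": "note", "content": "First reflection on week 1"},
    "published": "2013-06-01T10:30:00Z",
}

# Serialise for exchange between platforms and applications.
print(json.dumps(activity, indent=2))
```

Appending entries like this one to a list, most recent first, gives the Activity Stream itself.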

While Activity Streams record the actions of individual users and their interactions with multiple resources and services, other specifications have been developed to record the actions of multiple users on individual resources. This data about how and in what context resources are used is often referred to as Paradata. A specification for recording and exchanging paradata has been developed by the Learning Registry, an open source content-distribution network for storing and sharing information about learning resources.
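By way of contrast with an individual's Activity Stream, a paradata record typically aggregates the actions of many users on a single resource. The sketch below is loosely modelled on the Learning Registry paradata format; the field names and values are illustrative rather than a definitive rendering of the specification.

```python
import json

# A paradata-style record: aggregated usage of one resource by many
# users, rather than one user's stream of actions. Field names are
# loosely modelled on the Learning Registry paradata format; the
# values are invented for illustration.
paradata = {
    "activity": {
        "actor": {"objectType": "educator", "description": ["teacher"]},
        "verb": {
            "action": "downloaded",
            "measure": {"measureType": "count", "value": 144},
            "date": "2013-05-01/2013-05-31",
        },
        "object": "http://example.org/resources/fractions-worksheet",
    }
}

print(json.dumps(paradata, indent=2))
```

The `measure` element is what distinguishes this from a single-user activity record: it summarises how often, and in what period, the resource was used.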

Cetis Analytics Series: Case Study, Acting on Assessment Analytics

Link: Cetis Analytics Series Vol 2, No 2. Acting on Assessment Analytics (pdf)
Link: Cetis Analytics Series Vol 2, No 2. Acting on Assessment Analytics (MS Word docx)

Over the past five years, as part of its overall developments in teaching and learning, the University of Huddersfield has been active in developing new approaches to assessment and feedback methodologies. This has included the implementation of related technologies such as e-submission and marking tools.

In this case study Dr Cath Ellis shares with us how her interest in learning analytics began and how she and colleagues are making practical use of assessment data both for student feedback and overall course design processes.

Cetis Analytics Series: Case Study, Engaging with Analytics

Link: Cetis Analytics Series Vol 2, No 1. Case Study, Engaging with Analytics (pdf)
Link: Cetis Analytics Series Vol 2, No 1. Case Study, Engaging with Analytics (MS Word .docx)

Jean Mutton, Student Experience Project Manager at the University of Derby, shares with us some approaches she has been spearheading in using data and analytics to help improve the student experience. Through their participation in Jisc development programmes, Jean and her team (including paid student interns) have taken a service design approach that focuses on the needs of the end user first.

This case study explores the wider issues around using data to inform decision making, and the strategies the University of Derby are developing to improve their student enhancement processes by addressing key questions such as:

  • What is actually happening to students, and how can we find out?
  • What are the touch points between students and the institution?
  • What are the institutional “digital footprints” of our students?
  • What really matters to our students?


CETIS Analytics Series: Infrastructure and Tools for Analytics

Link: CETIS Analytics Series Vol 1, No 11. Analytics Tools and Infrastructure (pdf)
Link: CETIS Analytics Series Vol 1, No 11. Analytics Tools and Infrastructure (MS Word .docx)

Analytics is notable in that it is a headline-grabbing trend in many domains, but it has also been around for a long time under various other labels. One consequence of that longevity is that there is a bewildering array of tools available that can support an analytics process in some way.

An exhaustive overview of all such tools is near impossible, and probably out of date the moment it’s finished. What is possible, however, is to provide a map of the major categories of tools, and highlight some landmark tools that are available now.

Because of the diverse history and practice of analytics, many different categorisations are possible, but we choose to group them by tradition, or established approach. One reason is that such an approach makes tools more easily comparable, because they have been developed to meet the needs and expectations of their communities over time. The other reason is that it tallies closely with other papers in the CETIS Analytics Series of which this briefing is a part.

CETIS Analytics Series: The impact of analytics in Higher Education on academic practice

Link: CETIS Analytics Series Vol 1, No 10. Analytics for Teaching Practice (pdf)
Link: CETIS Analytics Series Vol 1, No 10. Analytics for Teaching Practice (MS Word .docx)

Many strong claims have been made for Learning Analytics and its potential to transform the education system. These claims deserve to be treated with caution, particularly where they concern teaching practice.

The introduction of these techniques cannot be understood in isolation from the methods of educational management as they have grown up over the past two centuries. These methods are conditioned by the fact that educational managers are limited in their capability to monitor and act upon the range of states which are taken up by teachers and learners in their learning activities. Strategies for simplification have been developed which classify the range of knowledge as a number of subjects, reduce the subjects to courses, and assign students to cohorts which carry out the same activities. Teachers, meanwhile, deal as best they can with the full variety of learners’ needs in their practice. Over the years, an accommodation has developed between regulatory authorities, management and teaching professionals: educational managers indicate the goals which teachers and learners should work towards, provide a framework for them to act within, and ensure that the results of their activity meet some minimum standards. The rest is left up to the professional skills of teachers and the ethical integrity of both teachers and learners.

This accommodation has been eroded by the efforts of successive governments to increase their control over the education received by both school and higher education students. Learning Analytics radically reduces the effort involved in gathering information on the way in which lecturers deliver the curriculum, and makes it possible to automate the work of analysing this information. Together, these two trends have the potential to constrain teaching practice, and it is therefore necessary to take a systemic view when assessing the impact of analytics on teaching practice.

Three types of analytics intervention are discussed, in terms of their impact on practice.

  • efficiency in the wider functioning of the institution, which has few implications for teaching practice,
  • enhanced regulation of the teaching and learning environment, which has potentially negative impact on teaching practice,
  • methods and tools intended to help lecturers carry out their tasks more effectively, which have the potential to be a useful tool in teaching practice.

It is concluded that Learning Analytics should not be seen as a short cut to providing teaching professionals with universal advice on ‘what works’, and that its use to increase the accountability of teachers to management may have unintended negative consequences. Rather, the most promising area for enhancing teaching practice is the creation of applications which help teachers identify which of the many interventions open to them are most worthy of their attention, as part of an on-going collaborative inquiry into effective practice.


CETIS Analytics Series: A Brief History of Analytics

Link: CETIS Analytics Series vol 1, No 9. A Brief History of Analytics (pdf)
Link: CETIS Analytics Series vol 1, No 9. A Brief History of Analytics (MS Word .docx)

The potential of analytics, according to this definition, is to help us to evaluate past actions and to estimate the potential of future actions, so as to make better decisions and adopt more effective strategies as organisations or individuals. Analytics allows us to increase the degree to which our choices are based on evidence rather than myth, prejudice or anecdote.

Several factors are coming together at the moment to stimulate interest in making more use of analytics. One of these is the increased availability, detail, volume and variety of data from the near-ubiquitous use of ICT (Information and Communication Technology) throughout almost all facets of our lives. This aspect tends to be the focus of the news media but data alone is not enough to realise benefits from analytics. A less popularised factor driving effective exploitation of analytics is the rich array and maturity of techniques for data analysis; a skilled analyst now has many disciplines to draw inspiration from and many tools in their toolbox. Finally, the increased pressure on business and educational organisations to be more efficient and better at what they do adds the third leg to the stool: data, techniques, need.

This paper, one of the CETIS Analytics Series, is aimed at readers who wish to be introduced to the range of techniques that are being pieced together and labelled as Analytics. It does this by outlining some of the most important communities – each with their own origins, techniques, areas of limitation and typical question types – and suggests how they are contributing to the future, with special reference to the context of post-compulsory education.

The diversity and flexibility of some of the techniques lined up under the analytics flag is evidenced by the numerous different applications of analytics: financial markets, sports analytics, econometrics, product pricing and yield maximisation, fraud and crime detection, spam email filters, marketing, customer segmentation, organisational efficiency and even tracking the spread of infectious disease from web searches. Behind these applications we can find the roots of analytics in the birth of statistics in the eighteenth century, but since then different applications of statistics and IT have led to different communities of practice that now seem to be merging together. We see that Web Analytics pioneers are now exploiting data from the “social web” by using Social Network Analysis, and that the techniques of Information Visualisation are supporting interactive and exploratory forms of analysis rather than just the graphs in management reports. Subjects that some see as old-hat, such as Operational Research, and others that are often perceived as futuristic, such as Artificial Intelligence, are each making contributions in surprising ways. Meanwhile, the education community has made its own contributions; Social Network Analysis and Artificial Intelligence have both emerged from academic research, and we are now beginning to see sector-specific variants of analytics being put to work in the form of Educational Data Mining, Learning Analytics and bibliometrics.

CETIS Analytics Series: Institutional Readiness for Analytics

Link: CETIS Analytics Series Vol 1, No 8. Institutional Readiness for Analytics (pdf)
Link: CETIS Analytics Series Vol 1, No 8. Institutional Readiness for Analytics (docx)

This briefing paper is written for managers and early adopters in further and higher education who are thinking about how they can build capability in their institution to make better use of data that is held on their IT systems about the organisation and provision of the student experience. It will be of interest to institutions developing plans, those charged with the provision of analytical data, and administrators or academics who wish to use data to inform their decision making. The document identifies the capabilities that individuals and institutions need to initiate, execute, and act upon analytical intelligence.

For the purpose of this paper, the term Learning Analytics (LA) is used to cover these activities, with the following definition:

Analytics is the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data. (CETIS, 2012)

The proposition behind learning analytics is not new. In the school sector particularly, good teaching practice has long involved record keeping – with pen and paper and, more recently, with technology – and the analysis of and reflection on this data to inform courses of action. Similarly, in different ways, all higher education (HE) and further education (FE) institutions use data to inform their decision making in assessment boards and course committees. However, as institutions increasingly use technology to mediate, monitor, and describe teaching, learning and assessment through Virtual Learning Environments (VLEs) and other systems, it becomes possible to develop ‘second generation’ learning analytics. The large data sets being acquired are increasingly amenable to new techniques and tools that lower the technical and cost barriers to undertaking analytics. This allows institutions to experiment with data to gain insight, to improve the student learning experience and student outcomes, and to identify improvements in the efficiency and effectiveness of provision.
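As a minimal sketch of the kind of ‘second generation’ analysis described above, the following Python fragment summarises hypothetical VLE access logs and flags students whose activity falls well below the cohort average. All identifiers and the threshold are invented for illustration; this is a crude sketch, not a validated intervention rule.

```python
from collections import Counter

# Hypothetical VLE access log: (student_id, resource) pairs, of the kind
# a VLE might export. All data here is invented for illustration.
access_log = [
    ("s01", "week1_notes"), ("s01", "week1_quiz"), ("s01", "week2_notes"),
    ("s02", "week1_notes"),
    ("s03", "week1_notes"), ("s03", "week2_notes"), ("s03", "week2_quiz"),
]

# Count VLE interactions per student.
activity = Counter(student for student, _ in access_log)

# Flag students whose activity is below half the cohort mean -- an
# illustrative threshold only.
mean = sum(activity.values()) / len(activity)
flagged = sorted(s for s, n in activity.items() if n < mean / 2)

print(flagged)  # -> ['s02']
```

In practice the interesting work lies in validating such thresholds against outcomes and deciding what intervention, if any, a flag should trigger.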


CETIS Analytics Series: A Framework of Characteristics for Analytics

Link: CETIS Analytics Series Vol 1, No 7. A Framework of Characteristics for Analytics (pdf)
Link: CETIS Analytics Series Vol 1, No 7. A Framework of Characteristics for Analytics (MS Word docx)

This paper, the seventh in the CETIS Analytics Series, considers one way to explore the similarities, differences, strengths, weaknesses, opportunities and so on of actual or proposed applications of analytics. It is a framework for asking questions about the high-level decisions embedded within a given application of analytics and assessing the match to real-world concerns. The Framework of Characteristics is not a technical framework.

This is not an introduction to analytics; rather, it is aimed at strategists and innovators in the post-compulsory education sector who have appreciated the potential for analytics in their organisation and who are considering commissioning or procuring an analytics service or system that is fit for their own context.

The framework is conceived for two kinds of use:

  1. Exploring the underlying features and generally-implicit assumptions in existing applications of analytics. In this case, the aim might be to better comprehend the state of the art in analytics and the relevance of analytics methods from other industries, or to inspect candidates for procurement with greater rigour.
  2. Considering how to make the transition from a desire to target an issue in a more analytical way to a high level description of a pilot to reach the target. In this case, the framework provides a starting-point template for the production of a design rationale in an analytics project, whether in-house or commissioned. Alternatively it might lead to a conclusion that significant problems might arise in targeting the issue with analytics.

In both of these cases, the framework is an aid to clarify or expose assumptions and so to help its user challenge or confirm them.


CETIS Analytics Series: Analytics for Understanding Research

Link: CETIS Analytics Series Vol 1, No 4. Analytics for Understanding Research (pdf)
Link: CETIS Analytics Series Vol 1, No 4. Analytics for Understanding Research (MS Word .docx)

Analytics seeks to expose meaningful patterns in data. In this paper, we are concerned with analytics as applied to the process and outputs of research. The general aim is to help optimise research processes and deliver improved research results.

Analytics is the use of mathematical and algorithmic methods to describe part of the real world, reducing real-world complexity to a more easily understandable form. The users of analytics seek to use the outputs of analytics to better understand that part of the world; often to inform planning and decision-making processes. Applied to research, the aim of analytics is to aid in understanding research in order to better undertake processes of planning, development, support, enactment, assessment and management of research.

Analytics has had a relatively long history in relation to research: the landmark development of citation-based analytics was approximately fifty years ago. Since then the field has developed considerably, both as a result of the development of new forms of analytics and, recently, in response to new opportunities for analytics offered by the Web.
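As a small worked example of citation-based analytics, the sketch below computes the h-index – the largest h such that h of an author's papers each have at least h citations – from a hypothetical list of citation counts.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank  # at least `rank` papers have >= `rank` citations
        else:
            break  # counts are sorted, so no later paper can qualify
    return h

# Hypothetical citation counts for one researcher's papers.
print(h_index([25, 8, 5, 3, 3, 1]))  # -> 3
```

The example also illustrates the reduction the surrounding text warns about: six papers with very different citation profiles collapse to a single number.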

Exciting new forms of analytics are in development. These include methods to visualise research for comparison and planning purposes; new methods – altmetrics – that exploit information about the dissemination of research that can be extracted from the Web; and social network and semantic analysis. These methods promise to broaden markedly the application areas of analytics.

The view here is that the use of analytics to understand research is now an established part of contemporary research, at researcher, research group, institutional, national and international levels. Given the fundamental importance of the assessment of research and the role that analytics may play in it, it is of paramount importance for the future of research to construct institutional and national assessment frameworks that use analytics appropriately.

Evidence-based impact agendas are increasingly permeating research, and adding extra impetus to the development and adoption of analytics. Analytics that are used for the assessment of impact are of concern to individual researchers, research groups, universities (and other institutions), cross-institutional groups, funding bodies and governments. UK universities are likely to increase their adoption of Current Research Information Systems (CRIS) that track and summarise data describing research within a university. At the same time, there is also discussion of increased ‘professionalisation’ of research management at an institutional level, which in part refers to increasing standardisation of the profession and its practices across institutions.

The impetus to assess research is, for these and other social, economic and organisational reasons, inevitable. In such a situation, reduction of research to ‘easily understandable’ numbers is attractive, and there is a consequent danger of over-reliance on analytic results without seeing the larger picture.

With an increased impetus to assess research, it seems likely that individual researchers, research groups, departments and universities will start to adopt practices of research reputation management. However, the use of analytics to understand research is an area fraught with difficulties that include questions about the adequacy of proxies, validity of statistical methods, understanding of indicators and metrics obtained by analytics, and the practical use of those indicators and metrics in helping to develop, support, assess and manage research.

To use analytics effectively, one must at least understand some of these aspects of analytics, and certainly understand the limitations of different analytic approaches. Researchers, research managers and senior staff might benefit from analytics awareness and training events.

Various opportunities and attendant risks are discussed in section 5. The busy reader might care to read that section before (or instead of) any others.