Geraldine Foley, Assistant Learning Technologist, and Athina Chatzigavriil, Senior Learning Technologist, share their thoughts on the Learning from Digital Examinations conference, held on 26th April 2018.

Learning from Digital Examinations, a one-day conference organized by Brunel University, brought together practitioners from different universities across the country and from abroad. It was a great opportunity to share best practice and lessons learnt, and it provided detailed examples of the complexities involved with digital examinations as well as some of the potential benefits.

Students are used to typing their work electronically and the majority have their own devices, yet when it comes to exams at LSE and elsewhere in the UK the standard expectation is still to hand-write responses for final examinations. This is due to multiple reasons, including infrastructure, regulations, spaces and facilities. However, some universities have started to shift to electronic examinations, so we went along to find out more and to present on the pilot projects we have run here at LSE (more details below).

Brunel University began researching digital examinations in 2015, using WISEflow, a platform provided by the Danish company UNIWise. Taking a Bring Your Own Device (BYOD) approach with students’ own devices, they ran one exam (115 students) in 2016. Following this successful proof of concept they moved to a pilot with 1,300 students in 2016/17, and the university then began a staged implementation of the assessment platform in September 2017. WISEflow was the platform highlighted at the conference both for digital examinations and for Electronic Management of Assessment (EMA).

Quite a few institutions at the conference have already moved wholesale to typed examinations, while others are just starting out. There also seems to be growing interest among institutions in moving towards EMA approaches to assessment, not just replacing handwritten examinations with typed ones. Line Palle Andersen described how staff at University College Copenhagen, Denmark use WISEflow to support flows for other forms of assessment (such as oral exams and MCQs) and how their staff are involved in marking and feedback provision, taking advantage of the extensive feedback features available.

The full conference programme and the presentation slides can be viewed online, but some general themes and questions from the day are discussed here.

  • Bring your own device (BYOD)

    Space and facilities tend to be limited in HE, so the majority of institutions appear to be adopting the BYOD approach. In Norway and Denmark, where the move to typed exams was a nation-wide project, it is mandatory for all students to have a device for their studies. UK universities using the BYOD approach provide support, such as loans and grants, for students who do not have their own devices, and keep a small number of spare devices for those who experience problems on the day of the exams.

  • Student training and support are essential… and students can help!

    Students need chances to test out and get used to any new system or approach. Unsurprisingly, those students who didn’t attend support sessions tended to be the ones who needed more support. Brunel University employed students as assistant learning technologists to run drop-in support sessions in the lead-up to the examinations, so students could install and test the software on their devices; these student assistants also worked with invigilators to offer technical support during the exams. This model has been used successfully in Denmark and Norway too. Dr Liz Masterman from the University of Oxford presented a literature review of studies from 2000 onwards on typed exams, assessing whether the psychological and academic aspects of handwritten and typed examinations are equivalent. The various studies surveyed yielded inconsistent results; nevertheless, the findings prompt a number of questions for consideration when moving essay-based examinations to typed ones.

  • Change requires strong project management

    Assessment processes involve multiple stakeholders and facilitators: professional support staff, admin staff, estates, IT, academic staff, students, and invigilators all need to be involved, informed and on board in order to move successfully to digital assessment. Learning technology and educational development staff have a critical role in working with academics to ensure that they engage with the process and don’t just replicate existing practice. Moving online should present an opportunity to design assessment that is in line with the course learning outcomes, has clear links between formative and summative assessments, and is balanced across the course.

  • Electronic assessment may lead to more inclusive assessment

    In his keynote on why universities should digitise examinations, Dr Torben K Jensen raised the ‘generation argument’ in terms of fairness: handwritten exams are far removed from students’ everyday activities. Making spell checkers, screen readers, remote assessment and other assistive technology available to everyone can reduce the need for individual adjustments. More work is needed to establish the impact of moving to electronic assessment, but Brunel University reported that they received no appeals relating to the move to electronic exams. As mentioned above, changing the mode of assessment can provide an opportunity to rethink assessment and even move away from examinations. Many institutions demonstrated digital assessment in various forms, including oral presentations, video submissions, multiple choice questions, simulations and group projects.

  • Feedback can be electronic too!

    Feedback on work in HE has been similarly slow to move to electronic form, yet handwritten comments are often hard to read and slow to produce and distribute compared with typed comments. Many institutions moving to electronic assessment are shifting the entire process online. Professor Denise Whitelock from the Open University presented the final keynote on the various ways technology can be used to train and support teachers to give useful and supportive feedback. She has been involved in creating several automated feedback tools for students and highlighted the impact feedback can have on students’ learning.

Pilot e-exams at LSE

Our presentation focused on three past LSE pilots that took place in order to:

  • Explore students’ perceptions of typing versus handwriting exams.
  • Test out online examination software.
  • Evaluate the requirements for online examinations, including security, regulations, facilities, training and support.

All three pilots were for formative assignments that provided feedback in preparation for the final examinations. In each case various software packages were compared and the departments made the final selection of platform in line with their individual requirements.

Two of the pilots were in the Law department, for take-home mock examinations using the software Examsoft, which allowed students to access the examination questions and type one essay response (from a choice of three) within two hours. Students were given five working days in which to access the questions and it was up to them to find a suitable space to type their response (see full report here).

The third pilot was with the Government department, for a mock on-campus invigilated examination using the software Exam4 (see full report here). Students brought their own devices to type four essay questions (from a choice of 16) within three hours. Exam questions were given out in hard copy, with extra information provided to invigilators. In both the Law and Government pilots students were given opportunities to test out the software in advance, and both were evaluated through questionnaires and focus groups with students and feedback from staff.

Overall students welcomed the typed examinations, and many appreciated producing a typed script that was more legible for examiners to read. Some students, however, had concerns about the expectations of examiners, who might assume typed answers should be of better quality even though they were produced under exam conditions. Several students found editing their examination answers easier when typing, but others felt penalized by their slow typing skills. Some students believed the cognitive process of typing an exam answer differed from handwriting one, and that grammar and spelling errors were harder to spot when typing. The institutional implications identified for scaling up typed examinations include a substantial overhaul of the regulations, provision for students who cannot use their own device, and adequate student support and training. The full evaluation reports of the pilots can be found on LSE Research Online.

Next steps

The conference gave lots of detailed examples of the complexities involved with digital assessment as well as some of the potential benefits. Going forward at LSE, the Assessment Service Change Project (ASCP), led by Cheryl Edwardes, Deputy Head of Student Services, is collaborating with staff and students to design enhanced assessment processes and systems which incorporate best practice and expert knowledge from across the School community and the wider HE sector. If you wish to learn more and/or share your views, you can sign up to attend any of the Validation Workshops. In addition, the Assessment Working Group, led by Dr Claire Gordon, Head of the Teaching and Learning Centre, is taking forward work on the following areas: i) assessment principles, ii) good practice in assessment design, iii) inclusive practice in assessment, and iv) quality assurance and regulatory arrangements in assessment. The Law department is also currently trialing a small-scale proof-of-concept exam using DigiExam with iPads and keyboards, providing the devices for students.

LTI is involved in all of the above initiatives, supports courses and programmes in the use of electronic assessment, and is working with several departments to move their processes online. Please contact LTI.Support@lse.ac.uk if you would like to discuss this further with us.