Clement House

From interviews to Instagram, how did we engage students in the evaluation of Clement House?

This article is one of three blog posts on the newly refurbished learning spaces in Clement House. It is written by Emma Wilson, Graduate Intern for LTI. You can find her on Twitter (@MindfulEm). For more information about the Clement House evaluation, please take a look at our final report.


Working with students as partners in the development of their university experience should form an integral part of any institution’s set of policies. However, securing a sufficient level of student engagement, which is also meaningful, poses a challenge across the sector.

Within the evaluation process for Clement House, we have been keen to use a wide array of communication channels – including some innovative new approaches involving social media. By complementing the old with the new, our mixed-method approach to data collection secured the involvement of 196 students. In addition, we carried out 67 non-participant observations; as such, the Clement House evaluation benefited from 263 pieces of data for analysis.

How did we publicise the work and recruit volunteers?

Put simply: targeted and personalised communications. Which departments are the most active users of Clement House? Where are students most likely to pay attention to posters on the wall? What incentives would attract students to participate? If students want to get involved, how would they like to do so? With the never-ending stream of emails, how do we know which ones students will actually pay attention to, and what alternative channels of communication exist?

Taking the time to consider the above makes it far more likely that students will be willing to engage with a project evaluation.

The use of visual communications has been a core component of this project evaluation. Posters were displayed in strategic locations throughout the project, each carrying a QR code and a bespoke hashtag (where applicable). These posters appeared on all floors of the Student Union building, and electronic versions were shown on screens in the library and in Clement House (including the International Relations Department, which is based there).
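For anyone replicating this approach, generating a QR code for a recruitment poster takes only a few lines. Below is a minimal sketch using the open-source Python qrcode package; the survey URL shown is a placeholder, not the link used in this evaluation.

# Minimal sketch: generate a QR code image for a recruitment poster.
# Requires the open-source "qrcode" package (pip install qrcode[pil]).
# The URL below is a placeholder, not the actual survey address.
import qrcode

survey_url = "https://example.com/clement-house-survey"  # placeholder link
img = qrcode.make(survey_url)                  # returns a PIL image object
img.save("clement_house_poster_qr.png")        # image to drop into the poster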

Poster One: Seeking student engagement in an online survey
Poster Two: Seeking student engagement in a social media competition


Findings based on method of engagement

We created an online and a paper version of the survey. The questions were identical, although the online survey provided space for additional comments. We received 55 responses to the paper survey and 45 via the online survey. The social media campaign ran outside of term time, over a shorter period (2.5 weeks), and received 12 responses. This data was supplemented by 74 structured interviews of 1–3 minutes, carried out during the 67 non-participant observations that took place across 4 weeks.
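For readers wanting to reconcile these figures with the headline totals above, the short tally below adds them up. It assumes (and this is an assumption, not something stated in the text) that the 10 workshop participants described in the companion post are counted among the 196 students.

# A minimal tally of the data sources described above.
# Assumption: the 10 workshop participants (see the companion post on
# cognitive and photographic mapping) are counted within the 196 students.
responses = {
    "paper survey": 55,
    "online survey": 45,
    "social media campaign": 12,
    "structured interviews": 74,
    "workshop participants": 10,  # assumed to be included in the student total
}
observations = 67  # non-participant observations (not individual student responses)

students = sum(responses.values())        # 196
pieces_of_data = students + observations  # 263
print(students, pieces_of_data)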

Key findings from the evaluation can be found in our report and in our other blog posts (see links). We have also drawn together a selection of Tweets and Instagram responses and displayed them as a collection on Storify. A sample of Tweets and Instagram posts can also be viewed in the slideshow below.

Sample of Tweets and Instagram posts 

What lessons have we learned?

A mixed approach to data collection enabled us to strike a balance between a purely qualitative and a purely quantitative approach. Whilst interviews provide an opportunity to understand how and why a student feels a certain way, the use of closed-ended survey questions ensures a degree of objectivity in particular instances. For example, in the survey it was useful to provide students with four options when asked about the purpose of their visit to the learning space, which allowed comparability across floors. However, it was the richness of data collected from the subsequent open-ended questions (whether in the interview or survey) that enabled us to fully understand why a student feels the way they do.

With a mixed method approach, it is important to ensure consistency of methodology across data collection methods. Do you have the same questions for the paper and online versions of the survey? If not, why not? How can any differences be taken into account?

Looking ahead, I would be keen to encourage the future use of a mixed-methods approach to data collection. If carrying out a social media campaign, it is important to consider the time of year in which the campaign is launched; if it is outside of academic teaching, many students will not be on campus, and you will have to place a greater reliance on online promotion. It is also useful to check whether the university is conducting any other surveys – such as the NSS or end-of-year departmental feedback questionnaires – to ensure that students are not overwhelmed by the number of surveys they are being asked to complete.

There is no one-size-fits-all solution to successful student engagement and it is important to consider the following:

  1. Know your audience
    • Who are you trying to secure engagement from? (Students? If so, are you seeking feedback from those in a particular department or academic year?)
    • When might they be most willing to get involved? (Whilst waiting for their next class? As a break or distraction from revision? During a particular event?)
    • What are the incentives for them to get involved? (Focus on your language – emphasise the power of the student voice in contributing towards policy change; offer students the chance to win a voucher; if running a workshop, say that it’s an opportunity to network with peers and even make new friends)
  2. Think about the ways in which they can get involved
    • Will canvassing a busy student before class necessarily be more effective than a survey that can be filled out in their own time?
    • Is the university keen to promote engagement through Instagram or Snapchat? Can your project also utilise these platforms?
  3. Connect with colleagues across departments and student groups or societies
    • Partnerships and collaborative working are great ways to contact groups of students who might be harder to reach.
    • Think about your audience – who are they likely to be in contact with? If students, do they have a student representative for their academic course?
    • Make contact with the university’s Student Union (SU); for example, their student engagement and communications officer. Getting some publicity on their website, social media feeds and newsletters is great for exposure. Asking to place posters around the SU building is a good way to reach more students.

Ultimately, this project revealed a positive message: students are keen to get involved in sharing their views on the teaching and learning experience at LSE.

Don’t be scared to pilot a new approach to student engagement. Understand your audience, think about how they interact in the university community, and take advantage of the new channels of communication. Over the next few years, we are likely to witness a changing landscape in higher education as Generation Z bring to their university a whole set of new expectations, skills and approaches to life in an ever-evolving digital environment. It is an exciting time for universities to engage with students and discuss the potential and opportunities for the future of higher education.  By approaching engagement in a creative way, we are more likely to kickstart a widespread conversation across the entire learning community.

 

Links

Other blogs in the LSE 2020 series: (see here and here)

Understanding student use of informal learning spaces with cognitive and photographic mapping

This article is one of three blog posts on the newly refurbished learning spaces in Clement House. It is written by Emma Wilson, Graduate Intern for LTI. You can find her on Twitter (@MindfulEm). For more information about the Clement House evaluation, please take a look at our final report.


In 2016, LSE unveiled six refurbished informal learning spaces in Clement House.

As part of this process, we sought to uncover how spaces such as these fit into the day-to-day life of a student. To help with our enquiry, we decided to design and deliver a one-hour interactive workshop with students at LSE. We had three objectives:

  • To better understand the behaviours, attitudes and preferences of LSE students using informal learning spaces such as those within the Clement House rotunda. Specifically, to better understand how, what, when, where and why students use particular learning spaces.
  • To compare the original design intentions for each floor at Clement House to how these spaces are viewed by students.
  • To better understand how the Clement House spaces fit into the overall student learning experience at LSE.

The workshop was divided into two parts:

  1. A cognitive mapping exercise
  2. A photographic mapping exercise

 

1. Cognitive Mapping

The first part of the workshop, adapted from work undertaken in the ERIAL project and developed by Donna Lanclos at UNC Charlotte, aimed to explore how the learning spaces at Clement House fit into the overall learning journey at LSE. Each student was provided with a blank sheet of A3 paper and four different coloured pens. The first part of the activity required students to list all the places they go to study – from the library, to a local café, to halls of residence. This part of the exercise took 6 minutes in total; every 2 minutes, students were asked to switch pen colour in this order: blue, red, black. After 6 minutes, students were asked to annotate their maps using a green pen, explaining why they chose these spaces and what they do in them (individual reading, group work, essay writing, and so on). By using different coloured pens, it was possible to see which locations came to the forefront of students' minds when asked to think about places they go to study. Students were then asked to discuss their maps with the group.

 

2. Photographic Mapping

Building on work at the University of Rochester (Briden, 2007), the photographic mapping activity asked students to take photographs of the Clement House spaces in response to a list of prompts:

  1. Something you would like to see replicated on other parts of campus.
  2. Something you think could be improved.
  3. Your favourite piece of technology.
  4. Your favourite piece of furniture.

Students worked in pairs and were asked to write down reasons for their photos. Following the exercise in photographic mapping, students were asked about the design intentions of each floor. This was a useful opportunity to compare student opinion with original design intentions.

Full size images of the exercise in photographic mapping can be found here: 1, 2, 3.

 

This workshop was an opportunity to engage with students using a creative and interactive approach. The 10 workshop places were filled in a short space of time; recruitment was conducted via Twitter and departmental newsletters, with sign-up managed through Eventbrite. Participants were given £10 for their time. Looking ahead to future sessions, it would be worth allowing 1.5 hours for the workshop in order to leave more time for discussion.

The cognitive mapping exercise provided some insightful data, and it was interesting to see the different approaches that students took to presenting their mind maps. In total, over 40 different areas, both on- and off-campus, were cited as places where students choose to study. This signals two things: firstly, there is a diversity of preference that moves beyond traditional learning spaces such as the library; secondly, the learning environment reaches far beyond the classroom walls and stretches across the day – from checking emails on the morning commute, to finding a study space after, or in between, classes in an area such as Clement House.

The photographic exercise was another interesting activity that highlighted the diversity of student preferences within the built learning environment. It was important to ask students the reasons behind their choices. When looking at possible improvements to the learning spaces, all of the students focused their discussion on spatial factors, such as maximising the number of tables and chairs.

Regarding the second objective – to compare the original design intentions for each floor at Clement House with how these spaces are viewed by students – it was interesting that student perceptions of each space contrasted with the original design intentions. Given that many of these spaces were designed to facilitate interactive group working rather than individual self-study, this difference is not surprising at this stage: students are currently more familiar with traditional methods of learning than with using interactive or static whiteboards for group discussion. Looking ahead, it is hoped that these spaces will be well suited to LSE's increasing focus on assessment diversification, particularly projects that involve group work.

Further information about the workshop (including methodology, references, complete findings and discussion) can be found in the report. If you have any questions on the recruitment process or any of the activities, contact Emma Wilson (e.wilson2@lse.ac.uk).

A full version of the evaluation report, complete with the methodology used for the workshop, can be found here.

 

References

ASHER, A. & MILLER, S. 2011. A Practical Guide to Ethnographic Research in Academic Libraries: The ERIAL Project.

BRIDEN, J. 2007. Photo surveys: eliciting more than you knew to ask for. In: FOSTER, N. F. & GIBBONS, S. (eds.) Studying Students: The Undergraduate Research Project at the University of Rochester. Chicago: Association of College and Research Libraries.

LANCLOS, D. 2013. Playing with Cognitive Mapping. The Anthropologist in the Stacks [Online].