“Connecting, Listening, and Enhancing”

Earlier this year, colleagues at University College Cork (UCC) conducted a review of learner feedback processes, particularly in terms of how and why student surveys are undertaken at their institution compared to other universities. They recently published their findings, much of them transferable to other institutions, in an eminently accessible report entitled Connecting, Listening, and Enhancing: Placing Student Perceptions of their Educational Experience at the Heart of Decision Making at UCC.

Apart from encouraging QA@NCI blog readers to have a look through the report themselves, this posting highlights one or two of the many takeaways from this important piece of work. For example, without proper oversight, there are dangers inherent in over-surveying our students, leading to survey fatigue and/or low (and, by definition, less representative) response rates. In turn, it can prove genuinely frustrating to demonstrate that the feedback loop is being closed effectively, particularly where a culture of disengagement or indifference has taken hold. On the challenges facing learner survey activity at UCC, the review’s statement on ‘closing the loop’ resonates particularly loudly:

There is limited evidence that the analysis of data arising from some student surveys is being fed back to relevant stakeholders. Although data may be analysed, action may not be taken to make relevant changes, and the findings may not be disseminated appropriately.

This UCC review is also proving a useful prompt for us to examine our own practice and to determine whether or not it is effective. For instance, it highlights the role now being played nationwide by the Irish Survey of Student Engagement (ISSE), noting that, up to this point, analysis at UCC has centred on the quantitative – but not necessarily on the qualitative – data. Clearly, there is real potential in considering the ‘words’ our students are offering through ISSE, as well as the ‘numbers’, which is why an “Irish Survey of Student Engagement (ISSE) 2017” report using NCI’s own quantitative and qualitative data is currently being put together; it will appear here on the QA@NCI blog in due course.
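
By way of illustration only (a sketch, not the actual ISSE analysis pipeline), a first pass at the ‘words’ could be as simple as counting recurring terms across anonymised open-text comments. The sample comments and stop-word list below are invented for demonstration:

```python
import re
from collections import Counter

# Invented sample of anonymised open-text survey comments; in practice
# these would be exported from the survey dataset.
comments = [
    "More timely feedback on assignments would help my learning.",
    "The feedback I get on exams arrives too late to be useful.",
    "Group projects were great, but assignment deadlines often clash.",
]

# Deliberately tiny stop-word list, for demonstration purposes only.
STOP_WORDS = {"the", "on", "to", "be", "my", "i", "but", "too", "would", "were", "get"}

def term_frequencies(texts):
    """Count how often each (non stop-word) term appears across all comments."""
    counts = Counter()
    for text in texts:
        counts.update(
            word for word in re.findall(r"[a-z']+", text.lower())
            if word not in STOP_WORDS
        )
    return counts

# The most frequent terms give a crude first signal of recurring themes.
print(term_frequencies(comments).most_common(5))
```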

It is clear that Higher Education institutions – both their staff and learners – up and down the country are starting to recognise the power of, as well as the richness in, the student voice. The transparency exemplified by this UCC review is a very welcome addition to our body of knowledge, as is its advocacy of due process. With any luck, but more likely through ongoing efforts to persuade, convince and cajole, it should help to ensure that ever more effective practices are enshrined when it comes to learner feedback in particular, and student engagement more generally. If we connect and listen, then we can also enhance.

Further details and resources related to the UCC ‘Review of student feedback’ are available here.

 

EvaSys resources … sometimes we forget what we have

Earlier this week, we hosted a couple of colleagues from EvaSys/Electric Paper here at NCI as part of an annual touching-base, face-to-face catch-up … that’s a lot of hyphens. This meeting was also part of a ‘spring-cleaning’ process that QA undertakes each summer in terms of data management, as well as an opportunity to be brought up to speed on developments in this whole area of learner feedback.

EvaSys supply the software and back-up for our formal module evaluation processes, but over time we have also learned how to extend these opportunities for interaction with students and colleagues to include programme evaluations, feedback on services, 360° surveys, etc. Ultimately, we are limited only by our imagination and skills when it comes to running online and/or paper surveys (e.g. in terms of questionnaire design, the audiences we need or want to hear from, how to put the information received to effective use, etc.).
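
To make the questionnaire-design point a little more concrete, here is a purely hypothetical sketch of a survey described as structured data before it is loaded into whatever tool runs it; the field names are illustrative inventions and bear no relation to EvaSys’s actual configuration format:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical, tool-agnostic survey definition; these field names are
# illustrative and are not EvaSys's actual configuration format.

@dataclass
class Question:
    text: str
    kind: str              # "likert" or "open"
    scale_points: int = 0  # only meaningful for Likert-style questions

@dataclass
class Survey:
    title: str
    audience: str          # e.g. "students", "staff", "service users"
    questions: List[Question] = field(default_factory=list)

programme_evaluation = Survey(
    title="Programme Evaluation 2016-17",
    audience="students",
    questions=[
        Question("The programme is well organised.", "likert", scale_points=6),
        Question("How could this programme be improved?", "open"),
    ],
)

print(f"{programme_evaluation.title}: {len(programme_evaluation.questions)} questions")
```

Thinking of a questionnaire as data in this way is one reason the same underlying process can stretch from module evaluations to programme evaluations, service feedback and beyond.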

In reconsidering our policies and guidelines this summer as part of our Quality Assurance Review, it is always worthwhile to have another look back through the literature, including those resources already employed in informing practice. Thus, it’s no harm to signpost people towards EvaSys publications such as those listed chronologically below:

  • Smith, Phil, & Owen Morris (eds.), Effective Course Evaluation: The Future for Quality and Standards in Higher Education (London: Electric Paper, 2011)
  • Murphy, Hannah, & Phil Smith (eds.), Closing the Loop: Are Universities Doing Enough to Act on Student Feedback from Course Evaluation Surveys? (London: Electric Paper, 2013)
  • Kenny, Jordan (ed.), Breaking Down the Barriers: How to Deliver Best Practice in HE Course Evaluation (London: Electric Paper, 2015)
  • Lim, Helena (ed.), The Devil Is in the Data: How HE Providers Can Benchmark Their Course and Module Performance (London: Electric Paper, 2015)

These resources have proven useful in shaping our own thinking, and they help to reinforce many of the reasons why we undertake this work. Further details regarding the kind of research being undertaken by EvaSys/Electric Paper are available online.

Staff Feedback Forum

As part of its QA Review, and as further evidence of our ongoing NStEP activities, it is worth recording here that NCI held a Staff Feedback Forum on March 22nd, 2017, in effect running a staff session in parallel with the student event held a fortnight previously (see the related Student Feedback Forum post for more details).

Attended by a dozen NCI colleagues in total, including both academic and administrative staff as well as the QASS facilitators, this meeting gave staff a focus for getting even more directly involved in the learner feedback discussions now taking place, dialogues which encompass a number of stakeholders, including the students themselves. Taken together, these discussions are helping to inform the QA Review with regard to, but by no means restricted to, feedback from students.

As with the student session, a set of introductory PPT slides was again used to help frame the conversations that followed (click on Staff Feedback Forum). The forum itself centred on a staff discussion of who, from their perspective, students should be giving feedback to and about what, as well as how they might do so and, in turn, what needs to happen as a result.

This Staff Feedback Forum thus builds on a conversation already taking place across NCI, while also offering a steer on how learner feedback processes might operate in the future. Further staff and student feedback on, and reflection upon, these processes will continue to be sought ahead of revised QA procedures and guidelines being introduced next academic year.

 


 

Learner feedback – module evaluations

Given the lack of QA@NCI postings over the past month, it may look like it has been a quiet period since mid-March … in fact, rather the opposite is true. With any luck, there will be some catching up in the weeks ahead regarding processes undertaken, projects advanced, etc., in this intervening period. But, for now, some reflection upon a quality assurance perennial, i.e. module feedback.

Currently, one of the significant pieces of work being undertaken within the context of learner feedback is a series of module evaluations from Semester 2, 2016-17. According to the Quality Assurance Handbook, which itself is up for review at this time, these questionnaires are typically completed by NCI students just over halfway through the semester. This formal feedback is then reflected upon by the lecturers directly concerned, as well as by colleagues more widely, and used to help inform ongoing and future learning and teaching developments. In turn, the students also need to be involved in this process in an effort to close the feedback loop (i.e. to provide them with feedback regarding their, err, feedback).

Responses to these surveys are anonymous, and the main purpose is to gather learner views on each module experience. In essence, each student is asked to give feedback on each module they undertake using a six-point Likert scale for 25 quantitative questions, covering issues ranging from their own learning to the module’s organisation, from their own participation to the presentation style employed by the lecturer(s), and from the module content to their experience of it. In turn, two qualitative questions ask ‘What was especially good about this module?’ and ‘How could this module be improved?’, inviting students to offer constructive, free-flowing comment.

Once the survey window closes – two weeks is the normal timescale offered to students – reports based upon the aggregated and anonymised feedback are sent directly to the module convenors concerned, primarily (though not necessarily only) for their use. It doesn’t stop there, but that is the essence of the process. In future posts, this QA@NCI blog will explore how this particular learner feedback process was established, how it has evolved, and where it might be heading. But, for now, it’s just nice to be back blogging!
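
For readers curious about what ‘aggregated and anonymised’ might look like in practice, the sketch below (entirely illustrative, and not the actual EvaSys report format) regroups per-student Likert ratings by question and computes a simple per-question summary alongside the response rate:

```python
from statistics import mean

# Invented example: each inner list is one anonymous student's ratings on
# the six-point Likert scale, one value per question (four questions here
# for brevity; the real instrument has 25).
responses = [
    [5, 4, 6, 5],
    [4, 4, 5, 3],
    [6, 5, 6, 4],
]

def module_report(responses, enrolled):
    """Summarise anonymised Likert ratings for one module."""
    by_question = list(zip(*responses))  # regroup ratings question-by-question
    return {
        "respondents": len(responses),
        "response_rate": round(len(responses) / enrolled, 2),
        "question_means": [round(mean(q), 2) for q in by_question],
    }

print(module_report(responses, enrolled=30))
# {'respondents': 3, 'response_rate': 0.1, 'question_means': [5.0, 4.33, 5.67, 4.0]}
```

Because only aggregates leave this step, individual students cannot be identified in what the module convenor receives, which is precisely the point of the anonymisation described above.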

 

Student Feedback Forum

As part of its Quality Assurance Review, and in line with its participation in the National Student Engagement Programme (NStEP), NCI held a Student Feedback Forum yesterday afternoon, March 8th, 2017, hopefully the first of many such pizza’n’policy meetings.

Attended by 20 students from a range of academic programmes, this meeting was facilitated by colleagues from both NCISU and NCI. The call to students to get involved, to have their say, and to make a difference clearly resonated with those present, and the feedback received will help to inform the Quality Assurance Review with regard to, but by no means restricted to, feedback from students.

Opening with some introductory PPT slides to help frame the conversations that followed (click on Student Feedback Forum), the forum centred on two exercises which saw students (1) reflecting upon how they have given feedback in the past (e.g. to teaching staff, through class representatives, etc.) before (2) considering what they want to give feedback on in the future (including frequency, means, etc.).

This forum continues a conversation that has already been taking place across NCI, but yesterday’s session offered an ideal opportunity to lend more focus and purpose to learner feedback processes, while also embodying the concept of students as partners and co-creators. More reflection on these matters will follow in the weeks ahead, along with opportunities for more students and staff to get very directly involved.

Student Feedback Forum (updated PPT slides)

 

 

Opportunities and limits regarding the use of learner feedback data

In order to support a conversation in the Centre for Research and Innovation in Learning and Teaching (CRILT) Research Group regarding the use of learner feedback data, a set of PowerPoint slides was used earlier this afternoon (see below).

If the animated and informed interactions and discussions these generated are anything to go by, there is a lot of potential in this area, as well as considerations that merit further thought.

Considering the opportunities and limits of using learner feedback data need not feel like opening Pandora’s box; there may be rich seams of data – ‘jewels’, if you like – available for us to mine, to display and to wear lightly.

Sincere thanks to the CRILT Research Group for the invitation to present this lunchtime … let’s see where this conversation takes us!

Opportunities and limits regarding the use of learner feedback data