Module Evaluations – a snapshot

It is current NCI policy, as outlined in the Quality Assurance Handbook, to evaluate every module every time it runs. This feedback is given anonymously by students, and the resulting quantitative and qualitative data are shared with the module convenor and their Dean of School in the form of aggregated reports for each module iteration.

In terms of learner feedback processes – which also include class representative meetings, service evaluations, and the Irish Survey of Student Engagement (ISSE) – it is worth reproducing in full the existing College guidance regarding module evaluations:

Module Evaluation

This is carried out in Week 8 of the Semester by the School. The survey is anonymous. The primary objective of this survey is to obtain the views of Learners on their experience in the module delivered. The learner is invited to assign a rating to a range of issues relating to the presentation of a module or module component as he/she experienced it. The completed questionnaires and analysis are returned to the Dean of School and to the individual lecturer. The Dean of School reviews the results of the survey with the individual lecturer.

Results of the survey are communicated to Associate Faculty by post.

Over the past couple of years, this statement regarding the processes involved in gathering and responding to learner feedback has been interpreted somewhat loosely so that it retains practical application; for example, the results – i.e. the individual module evaluation reports – are now disseminated by email rather than by post.

As part of the Quality Assurance Review which has been taking place this past year, it is expected that these statements regarding learner feedback will be updated so that they reflect best practice (e.g. by formally incorporating ISSE into the guidance), what is happening in reality (e.g. advice to students on providing constructive feedback, and suggestions to lecturing staff on how that feedback might be employed), and overarching efforts to improve the processes involved (e.g. by updating the questionnaire being used).

In essence, this is not just an attempt to build on the Feedback from students project work; it also seeks to extend its findings and recommendations so that the data and analysis produced by surveys have real value and support effective responses to what we are hearing and learning.

Over the past 2½ years under consideration here, engagement in support of learner feedback and the feedback loop has steadily grown; this is encapsulated in increasing acceptance of the principles underpinning this whole process, such as those outlined in Embedding the Principles of Student Engagement:

Feedback and feedback loop

Institutions will welcome and encourage open and prompt feedback from students. Suitable measures will be put in place across the institution to ensure that students are facilitated in providing feedback in a safe and valued manner. Feedback practices will be transparent and the feedback loop will be closed in a timely fashion.

In truth, the willingness of students and staff to take part in this process, as revealed in statistics such as the steadily rising response rates (see Table 1 below), is testament to the trust being built on existing firm foundations.

Table 1 – First-semester response rates across three academic years (average response rates & total numbers of responses)

                        Semester 1, 2015-16     Semester 1, 2016-17     Semester 1, 2017-18*
School of Business      10% (772 responses)     17% (1,255 responses)   22% (1,675 responses)
School of Computing     12% (696 responses)     25% (1,251 responses)   24% (835 responses)
Learning & Teaching     33% (84 responses)      38% (138 responses)     27% (100 responses)

* this particular round of surveys has not yet finished
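
For what it is worth, a response rate here is, in essence, the number of completed questionnaires divided by the number of students invited to respond. The minimal sketch below shows that arithmetic; the invitation figures are invented purely to reproduce the School of Business percentages above, and are not real NCI numbers.

```python
# Minimal sketch: deriving semester response rates from counts.
# Only the response totals come from Table 1 (School of Business row);
# the 'invited' figures are hypothetical, chosen to reproduce the
# published percentages.
responses = {"2015-16": 772, "2016-17": 1255, "2017-18": 1675}
invited = {"2015-16": 7720, "2016-17": 7382, "2017-18": 7614}  # hypothetical

for year, n in responses.items():
    rate = n / invited[year]
    print(f"Semester 1, {year}: {n} responses, {rate:.0%} response rate")
```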

We look forward to the quantitative and qualitative data offered by our learners being used ever more effectively and transparently as part of the quality assurance process – by individual members of academic staff, by the teams to which they belong, and by those analysing and considering the data.


QA network #3: ʼTis the season?

No, we don’t necessarily mean the forthcoming holidays, even if that time of the year seems to start earlier and earlier.

Instead, we’d like to talk about learner feedback. It went to the heart of the most recent QA network meeting, hosted by Hibernia College, and the national conversation will turn to this subject in the coming weeks with the impending publication of the ISSE 2017 report. At this, our third meeting, we had the opportunity to compare and contrast aspirations, ideas, and practices, with a particular focus on module (and programme) evaluations supplied locally by our students.

The QA network allows us to share our experiences and to learn from one another, but where it is arguably most valuable is in consistently revealing that, while we may be encountering curiously similar issues, we are also gaining real insights from each other regarding potential next steps. Ah, the value of networking!

Our institutions have quite a degree of flexibility when it comes to learner feedback, which normally reaches us via both formal and informal means. In terms of formal mechanisms, local practices and experiences are not all that different: QA expends much of its effort gathering data (usually on paper and/or online, module by module), while also advocating that the data be utilised appropriately, and be seen to be used.

If we could all get what we want this Christmas, it would doubtless be that the feedback loop is closed ever more effectively. We may well need to revisit this whole issue of learner feedback again soon; it is a perennial, not just a seasonal, concern.

Il Caffè di Napoli – Neapolitan coffee

“Connecting, Listening, and Enhancing”

UCC learner feedback review

Earlier this year, colleagues at University College Cork (UCC) conducted a review of learner feedback processes, particularly in terms of how and why student surveys are undertaken at their institution compared to other universities. They recently published their findings in an eminently accessible report with transferable findings entitled Connecting, Listening, and Enhancing: Placing Student Perceptions of their Educational Experience at the Heart of Decision Making at UCC.

Apart from encouraging QA@NCI blog readers to have a look through the report themselves, this posting wishes to highlight one or two of the many takeaways from this important piece of work. For example, without proper oversight, there are dangers inherent in over-surveying our students, leading to survey fatigue and/or low (and, by definition, less representative) response rates. In turn, there can be real frustration in evidencing that the feedback loop is being closed effectively, particularly where a culture of disengagement or apathy has taken hold. Referring to the challenges facing learner survey activity at UCC, the review’s statement on ‘closing the loop’ resonates particularly loudly:

There is limited evidence that the analysis of data arising from some student surveys is being fed back to relevant stakeholders. Although data may be analysed, action may not be taken to make relevant changes, and the findings may not be disseminated appropriately.

This UCC review is also proving to be a useful means for us to consider our own practice and to help determine whether it is effective. For instance, it highlights the role now being played nationwide by the Irish Survey of Student Engagement (ISSE), noting that, up to this point, analysis at UCC has centred on the quantitative – but not necessarily on the qualitative – data. Clearly, there is real potential in considering the ‘words’ our students are offering through ISSE, as well as the ‘numbers’, which is why an Irish Survey of Student Engagement (ISSE) 2017 report using NCI’s own quantitative and qualitative data is currently being put together; it will appear here on the QA@NCI blog in due course.
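
To make that concrete, the sketch below shows one simple way the ‘words’ in open-ended survey comments might be surfaced alongside the ‘numbers’. It is purely illustrative: the comments and stop-word list are invented, and this is not the tooling actually used for ISSE analysis.

```python
# Minimal sketch (hypothetical data): surfacing common terms from
# free-text survey comments. Not the actual ISSE analysis tooling.
from collections import Counter

comments = [
    "More worked examples would help in the labs",
    "The labs were great, but more feedback on assignments please",
    "Assignments need clearer marking criteria and faster feedback",
]

# A tiny, invented stop-word list; real analysis would use a fuller one.
stop_words = {"the", "in", "on", "and", "but", "more", "would", "were", "need"}

words = (
    word
    for comment in comments
    for word in comment.lower().replace(",", "").split()
    if word not in stop_words
)

# Print the five most frequent remaining terms with their counts.
for term, count in Counter(words).most_common(5):
    print(term, count)
```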

It is clear that Higher Education institutions – both their staff and learners – up and down the country are starting to recognise the power of, as well as the richness in, the student voice. The transparency exemplified through this UCC review is a very welcome addition to our body of knowledge and to its advocacy of due process. With any luck, but more likely through ongoing efforts to persuade, convince, and cajole, it should help to enshrine ever more effective practices in learner feedback in particular, and in student engagement more generally. If we connect and listen, then we can also enhance.

Further details and resources related to the UCC ‘Review of student feedback’ are available here.


EvaSys resources … sometimes we forget what we have

Feedback

Earlier this week, we hosted a couple of colleagues from EvaSys/Electric Paper here at NCI as part of an annual touching-base, face-to-face, catch-up … that’s a lot of hyphens. This meeting was also part of a ‘spring-cleaning’ process that QA undertakes each summer in terms of data management, as well as the opportunity to be brought up-to-speed on developments in this whole area of learner feedback.

EvaSys supply the software and support for our formal module evaluation processes, but over time we have also learned how to extend these opportunities for interaction with students and colleagues to include programme evaluations, feedback on services, 360° surveys, and more. Ultimately, we are limited only by our imagination and skills when it comes to running online and/or paper surveys (e.g. in terms of questionnaire design, the audiences we need or want to hear from, and how to put the information received to effective use).

In reconsidering our policies and guidelines this summer as part of our Quality Assurance Review, it is always worthwhile to have another look back through the literature, including those resources already employed in informing practice. Thus, it’s no harm to signpost people towards EvaSys publications such as those listed chronologically below:

  • Smith, Phil, & Owen Morris (eds.), Effective Course Evaluation: The Future for Quality and Standards in Higher Education (London: Electric Paper, 2011) – click here for access
  • Murphy, Hannah, & Phil Smith (eds.), Closing the loop: Are universities doing enough to act on student feedback from course evaluation surveys? (London: Electric Paper, 2013) – click here for access
  • Kenny, Jordan (ed.), Breaking down the barriers: How to deliver best practice in HE course evaluation (London: Electric Paper, 2015)
  • Lim, Helena (ed.), The devil is in the data: How HE providers can benchmark their course and module performance (London: Electric Paper, 2015)

These resources have proven useful in shaping our own thinking, and they help to reinforce many of the reasons why we undertake this work. Further details regarding the kind of research being undertaken by EvaSys/Electric Paper are available online.

Staff Feedback Forum

Feedback from staff

As part of its QA Review, and as further evidence of our ongoing NStEP activities, it is worth recording here that NCI held a Staff Feedback Forum on March 22nd, 2017, in effect running a staff session in parallel with the student event held a fortnight previously (see the related Student Feedback Forum post for more details).

Attended by a dozen NCI colleagues in total, including both academic and administrative staff, as well as the QASS facilitators, this meeting offered a focus for staff to get even more directly involved in the learner feedback discussions now taking place, dialogues which encompass a number of stakeholders, including the students themselves. Taken together, these discussions are helping to inform the QA Review with regard to, but by no means restricted to, feedback from students.

As with the student session, a set of introductory PPT slides was again used to help frame the conversations which followed (click on Staff Feedback Forum), but the forum itself centred on a staff discussion regarding who, from their perspective, students should be giving feedback to, and regarding what, as well as how they might do it and, in turn, what needs to happen as a result.

This Staff Feedback Forum thus builds on a conversation that is already taking place across NCI, while also offering a steer regarding how learner feedback processes might operate into the future. Further staff and student feedback and reflection upon these processes will continue to be sought ahead of revised QA procedures and guidelines being introduced next academic year.

Staff Feedback Forum

Learner feedback – module evaluations

Learner feedback

From the lack of QA@NCI postings over the past month, it may look like it has been a quiet period since mid-March … in truth, rather the opposite. With any luck, there will be some catching up in the weeks ahead regarding processes undertaken, projects advanced, etc., in this intervening period. But, for now, some reflection upon a quality assurance perennial: module feedback.

Currently, one of the significant pieces of work being undertaken within the context of learner feedback is a series of module evaluations for Semester 2, 2016-17. According to the Quality Assurance Handbook, which is itself up for review at this time, these questionnaires are typically completed by NCI students just over halfway through the semester. This formal feedback is then reflected upon by the lecturers directly concerned, as well as by colleagues more widely, and used to help inform ongoing and future learning and teaching developments. In turn, the students also need to be involved in this process in an effort to close the feedback loop (i.e. to provide them with feedback regarding their, err, feedback).

Responses to these surveys are anonymous, and the main purpose is to gather learner views on each module experience. In essence, each student is asked to give feedback on each module they undertake using a six-point Likert scale across 25 quantitative questions, covering issues ranging from their own learning to the module’s organisation, from their own participation to the presentation style employed by the lecturer(s), and from the module content to their experience of it. In turn, two qualitative questions – ‘What was especially good about this module?’ and ‘How could this module be improved?’ – invite students to offer constructive, free-flowing comment.
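
For a flavour of the aggregation behind the resulting reports, here is a minimal sketch of turning anonymous six-point Likert responses into per-question averages. The data and question labels are invented for illustration, and this is a simplification rather than the EvaSys implementation.

```python
# Minimal sketch (hypothetical data): aggregating anonymous six-point
# Likert responses (1 = strongly disagree ... 6 = strongly agree) into
# per-question means for a module report. Not the EvaSys implementation.
from statistics import mean

# Each inner dict is one student's anonymous response; the question
# labels are invented for illustration.
responses = [
    {"Q1 organisation": 5, "Q2 content": 4, "Q3 participation": 6},
    {"Q1 organisation": 4, "Q2 content": 5, "Q3 participation": 5},
    {"Q1 organisation": 6, "Q2 content": 4, "Q3 participation": 4},
]

for question in sorted(responses[0]):
    scores = [r[question] for r in responses]
    print(f"{question}: mean {mean(scores):.2f} (n={len(scores)})")
```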

Once the survey window closes – two weeks is the normal timescale offered to students – reports based upon the aggregated and anonymised feedback are sent directly to the module convenors concerned, primarily (though not necessarily only) for their use. It doesn’t stop there, but that is the essence of the process. In future posts, this QA blog will explore how this particular learner feedback process was established, how it has evolved, and where it might be heading in the future. But, for now, it’s just nice to be back blogging!


Student Feedback Forum

Feedback from students

As part of its Quality Assurance Review, and in line with its participation in the National Student Engagement Programme (NStEP), NCI held a Student Feedback Forum yesterday afternoon, March 8th, 2017, hopefully the first of many such pizza’n’policy meetings.

Attended by 20 students from a range of academic programmes, this meeting was facilitated by colleagues from both NCISU and NCI. The call to students to get involved, to have their say, and to make a difference clearly resonated with those present, and the feedback received will help to inform the Quality Assurance Review with regard to, but by no means restricted to, feedback from students.

Opening with some introductory PPT slides to help frame the conversations that followed (click on Student Feedback Forum), the forum centred on two exercises which saw students (1) reflecting upon how they have given feedback in the past (e.g. to teaching staff, through class representatives, etc.) before (2) considering what they want to give feedback upon into the future (incl. frequency, means, etc.).

This forum is the continuation of a conversation that has already been taking place across NCI, but yesterday’s session offered an ideal opportunity to lend more focus and purpose to learner feedback processes, while also embodying the concept of students as partners and co-creators. More reflection on these matters will follow in the weeks ahead, as will opportunities for more students and staff to get very directly involved.

Student Feedback Forum (updated PPT slides)