It is current NCI policy, as outlined in the Quality Assurance Handbook, to evaluate every module each time it runs, for this feedback to be given anonymously by students, and for the quantitative and qualitative data to be shared with the module convenor and their Dean of School in the form of aggregated reports for each module iteration.
Learner feedback processes also include class representative meetings, service evaluations, and the Irish Survey of Student Engagement (ISSE); on module evaluations specifically, it is worth reproducing the existing College guidance in full:
This is carried out in Week 8 of the Semester by the School. The survey is anonymous. The primary objective of this survey is to obtain the views of Learners on their experience in the module delivered. The learner is invited to assign a rating to a range of issues relating to the presentation of a module or module component as he/she experienced it. The completed questionnaires and analysis are returned to the Dean of School and to the individual lecturer. The Dean of School reviews the results of the survey with the individual lecturer.
Results of the survey are communicated to Associate Faculty by post.
Over the past couple of years, this statement of the processes for gathering and responding to learner feedback has been interpreted somewhat loosely so that it remains practically applicable; for example, the results – i.e. the individual module evaluation reports – are now disseminated by email rather than by post.
As part of the Quality Assurance Review that has taken place over the past year, it is expected that these statements on learner feedback will be updated to reflect best practice (e.g. by formally incorporating ISSE into the guidance), current practice on the ground (including advice to students on providing constructive feedback, suggestions to lecturing staff on how that feedback might be employed, etc.), and overarching efforts to improve the processes involved (e.g. by updating the questionnaire in use).
In essence, this is not just an attempt to build on the Feedback from students project work; it also seeks to extend that project's findings and recommendations so that the data and analysis produced by the surveys have real value and support effective responses to what we are hearing and learning.
Over the past 2½ years under consideration here, engagement in support of learner feedback and the feedback loop has steadily grown; this is encapsulated in increasing acceptance of the principles underpinning this whole process, such as those outlined in Embedding the Principles of Student Engagement:
Feedback and feedback loop
Institutions will welcome and encourage open and prompt feedback from students. Suitable measures will be put in place across the institution to ensure that students are facilitated in providing feedback in a safe and valued manner. Feedback practices will be transparent and the feedback loop will be closed in a timely fashion.
In truth, the willingness of students and staff to take part in this process, as reflected in increasingly representative response rates (see table 1 below), is testament to the trust being built on existing firm foundations.
table 1 – first semester response rates across three academic years
| Average response rates & total numbers of responses | Semester 1, 2015-16 | Semester 1, 2016-17 | Semester 1, 2017-18* |
| --- | --- | --- | --- |
| School of Business | 10% | | |
| School of Computing | 12% | | |
| Learning & Teaching | 33% | | |
* this round of surveys had not yet finished at the time of writing
We look forward to the quantitative and qualitative data offered by our learners being used ever more effectively and transparently within the quality assurance process – by individual members of academic staff, by the teams to which they belong, and by those analysing and considering the data.