NCISU class rep training, 13 October 2017

The annual invitation for QA@NCI to meet and speak with the new NCISU class representatives, to answer their questions and to establish a point of contact with them, was no less welcome this year than it has been in the past. Indeed, this opportunity to get directly involved, to listen to what our student volunteers are telling us, and to consider what we might want to do next is one of the highlights of the academic calendar, so sincere thanks to Stephen Cleary (NCISU president) and his team for inviting us back again this year.

With 2017-18 now unfolding, we were asked to speak to NCISU class reps on the topic of NStEP and NCI’s role within it. This was very timely, given the potential that this national programme is now realising across the sector, as well as locally at institutions like our own.

The training and development of class reps is an NStEP priority; the fact that it aligns so closely with NCISU’s own imperatives means that progress is now being made in our understanding of a number of student engagement principles including ‘transparency’, ‘collegiality and parity of esteem’, and ‘feedback and feedback loop’ (see Taking Next StEPs for more details).

NCI and NCISU have been asked to lead on the NStEP project entitled “the role and recruitment of class representatives”, more of which in due course. But, if the outcomes of the national project are anything like the openness on display at today’s training event, as demonstrated in the willingness of our class reps to get involved and their developing awareness of the possibilities inherent in learner representation, then it should prove to be a success.

NStEP and NCI's role within it

QA network #2: re-engaging

QA network

Sometimes, a good idea proves to be just that. So, particular thanks to Hugh for continuing to drive this network forward and, even more usefully, to himself and Ruth for the opportunity to touch base, to look at a specific issue in a bit more depth, as well as the chance to place it within its wider context, and, of course, to drink coffee at 8:30am in a nice city centre hotel.

Our previous meeting – see QA network #1 for details – had proven to be so successful that it felt like (i.e. in the sense that it actually was!) a meeting with trusted sectoral colleagues where there are opportunities to give and to take, to learn and to impart, to share and to develop.

This time around, the subject was Re-engagement with QQI, and it was immediately striking how our experiences thus far are leading to increasing levels of resource sharing, informed interaction across and within our institutions, and an awareness that we are actually part of a wider learning community rather than just being employees of different HEIs.

Ideally, our little group will expand to involve other HE colleagues working within the QA space, but we are determined to meet again, next time to talk about learner feedback systems and experiences. That third QA network meeting is expected to take place in six weeks’ time or thereabouts in one of our three locations; if you’d like to get involved, you’d be very welcome, just get in touch.

QA fuel

“Connecting, Listening, and Enhancing”

UCC learner feedback review

Earlier this year, colleagues at University College Cork (UCC) conducted a review of learner feedback processes, particularly in terms of how and why student surveys are undertaken at their institution compared to other universities. They recently published their findings in an eminently accessible report with transferable findings entitled Connecting, Listening, and Enhancing: Placing Student Perceptions of their Educational Experience at the Heart of Decision Making at UCC.

Apart from advocating that QA@NCI blog readers should themselves have a look through the report, this posting wishes to highlight one or two of the many takeaways from this important piece of work. For example, without proper oversight, there are dangers inherent in over-surveying our students, leading to survey fatigue and/or low (and, by definition, less representative) response rates. In turn, there can be real frustrations involved in showing evidence that the feedback loop is being closed effectively, particularly where there is an entrenched culture of disengagement or indifference. Referring to the challenges facing learner survey activity at UCC, the review’s statement on ‘closing the loop’ resonates particularly loudly:

There is limited evidence that the analysis of data arising from some student surveys is being fed back to relevant stakeholders. Although data may be analysed, action may not be taken to make relevant changes, and the findings may not be disseminated appropriately.

This UCC review is also proving to be a useful means for us to consider our own practice and to help in determining whether it is effective or not. For instance, it highlights the role now being played nationwide by the Irish Survey of Student Engagement (ISSE), noting that, up to this point, analysis at UCC has centred on the quantitative – but not necessarily on the qualitative – data. Clearly, there is some real potential in considering the ‘words’ our students are offering through ISSE, as well as the ‘numbers’, which is why an “Irish Survey of Student Engagement (ISSE) 2017 – report” using NCI’s own quantitative and qualitative data is currently being put together; indeed, it will appear here on the QA@NCI blog in due course.

It is clear that Higher Education institutions – both their staff and learners – up and down the country are starting to recognise the power of, as well as the richness in, the student voice. The transparency exemplified through this UCC review, and its advocacy of due process, are very welcome additions to our body of knowledge. With any luck, but more likely through ongoing efforts to persuade, convince and cajole, it should help to prompt ever more effective practices being enshrined when it comes to learner feedback in particular, and student engagement more generally. If we connect and listen, then we can also enhance.

Further details and resources related to the UCC ‘Review of student feedback’ are available here.

 

QA network #1: contact initiated!

QA network

It’s always good to receive an invitation for cake and coffee at a meeting with QA colleagues drawn from other institutions, so sincere thanks to Hugh Sullivan (Quality Assurance Officer) from Hibernia College Dublin for taking this initiative, and also for the opportunity to connect and meet with Ruth Ni Bheoláin (Quality Assurance and Enhancement Officer), our Griffith College counterpart, earlier this week.

Just as academic staff interact with colleagues from their own subject area at other institutions, or our sabbatical officers connect with their peers to learn and to share, this chance to meet up informally with QA colleagues was just too good an opportunity to miss.

Although not all aspects of the Chatham House Rule will necessarily apply to these meetings (indeed, all participants are happy for this blog posting to go ahead), it is only fair to limit this post to saying that the conversation was broad, open and constructive: exactly what was intended. This dimension – i.e. the chance to touch base with colleagues working in analogous areas in other institutions – will doubtless prove to be the most useful element for all involved long into the future. It certainly is good to talk, to listen and to learn! Next time we meet, we’re going to go into a bit more depth on Re-engagement with QQI, a very timely opportunity to build upon the briefings held earlier this spring (see QQI briefing regarding the new Validation Policy and Criteria and the pilot process for Re-Engagement for more details).

In sum, this QA network should offer us the chance to consider a wide range of topics as they emerge in the weeks and months ahead. If you’re working in the QA space and interested in joining us next time, feel free to let us know.

Faculty Induction Day, 18 August 2017

Staff

Faculty Induction Day presents the opportunity to orientate new and recently arrived academic staff, to introduce them to colleagues from within their own location and from across NCI, and to draw their attention to established practices and systems. But induction should never be seen as a one-off event; done well, it is an ongoing and iterative process which takes time and effort. Indeed, the rewards can be profound for the individuals directly concerned, as well as for their colleagues and, in turn, their students.

The Quality Assurance presentation, typically delivered in a one-hour slot after lunch, not only gives us the chance to discuss what we do in our area of expertise with these new colleagues, how we support them in their work, etc. It also offers us in QA the opportunity to reflect upon what it is that we think we are doing, to consider more deeply how we might best offer the backing required, etc. But, most importantly of all, it is about creating a connection.

Quality Assurance presentation, 18 August 2017

Just as the new colleagues had questions today and, even more significantly, will have more queries as they crop up individually or collectively long into the future, the same applies to our students. Thus, a key role of induction is not just addressing immediate entreaties; it is about equipping and supporting people to provide, look for, and find the answers beyond induction day.

Knowing that those with (at least some of) the answers are accessible, supportive, and open to learning themselves is just one of the functions of those facilitating and helping to fulfil Faculty Induction Day. After all, we were each that new colleague on our first day too.

Faculty Induction Day, 18 August 2017

EvaSys resources … sometimes we forget what we have

Feedback

Earlier this week, we hosted a couple of colleagues from EvaSys/Electric Paper here at NCI as part of an annual touching-base, face-to-face, catch-up … that’s a lot of hyphens. This meeting was also part of a ‘spring-cleaning’ process that QA undertakes each summer in terms of data management, as well as the opportunity to be brought up-to-speed on developments in this whole area of learner feedback.

EvaSys supply the software and back-up for our formal module evaluation processes, but over time we have also learned how to extend these opportunities for interaction with students and colleagues to include programme evaluations, feedback on services, 360° surveys, etc. Ultimately, we are only limited by our imagination and skills when it comes to running online and/or paper surveys (e.g. in terms of questionnaire design, the audiences we need or want to hear from, how to put the information received to effective use, etc.).

While reconsidering our policies and guidelines this summer as part of our Quality Assurance Review, we have found it worthwhile to have another look back through the literature, including those resources already employed in informing practice. Thus, it’s no harm to signpost people towards EvaSys publications such as those listed chronologically below:

  • Smith, Phil, & Owen Morris (eds.), Effective Course Evaluation: The Future for Quality and Standards in Higher Education (London: Electric Paper, 2011) – click here for access
  • Murphy, Hannah, & Phil Smith (eds.), Closing the loop: Are universities doing enough to act on student feedback from course evaluation surveys? (London: Electric Paper, 2013) – click here for access
  • Kenny, Jordan (ed.), Breaking down the barriers: How to deliver best practice in HE course evaluation (London: Electric Paper, 2015)
  • Lim, Helena (ed.), The devil is in the data: How HE providers can benchmark their course and module performance (London: Electric Paper, 2015)

These resources have proven to be useful in terms of shaping our own thinking and help to reinforce many of the reasons why we undertake this work. Further details regarding the kind of research being undertaken by EvaSys/Electric Paper are available online.

Taking Next StEPs: … and inevitably leads to a second!

It’s nice when a plan starts to come together, one may even begin to believe that it was, err, pre-planned. But, as is almost always the case, plans take time to take shape, to develop and, in turn, to become some sort of reality. And, invariably, they bear only a distant resemblance to what was originally conceived.

To be fair, the PPT slides which we developed for the 2017 EAIR Forum presentation were drafted early on, indeed they helped to shape the paper, and they also drew heavily upon the original conference proposal. Turning back to these PPTs, one sees the weaknesses in our initial efforts to say, in a dozen or so slides, just what it is that we have been doing across the past 15 months, and why, as well as what we see happening now, what we hope will take place next, etc.

Our EAIR presentation in early September is due to take half an hour in total, with twenty minutes spent talking to these revised PPT slides, drawing out the main points of the paper, all the while answering a set of questions posed by the organisers, as well as anticipating and responding to those put by colleagues in attendance. The guidance supplied by EAIR certainly helped to frame and focus what we want to say, and the resulting slides will also, hopefully, help to engage the audience, not just for the twenty-minute presentation, but also for the ten-minute Q&A session that will bring it to a close.

All in all, this has been a very interesting process, especially in terms of personal and professional development. The initial value will be seen early next month when it comes to how our presentation is executed, received and, in truth, with regard to what comes next for us with NStEP. If nothing else, it has certainly been very illuminating to be on this journey, and it is particularly nice to have the company of others while proceeding along this path.

2017 EAIR Forum presentation – FitzGerald & O’Sullivan

Taking Next StEPs – proposal