Faculty Induction Day presents the opportunity to orientate new and recently arrived academic staff, to introduce them to colleagues from within their own location and from across NCI, and to draw their attention to established practices and systems. But induction should never be seen as a one-off event; done well, it is an ongoing and iterative process which takes time and effort. Indeed, the rewards can be profound for the individuals directly concerned, as well as for their colleagues and, in turn, their students.
The Quality Assurance presentation, typically delivered in a one-hour slot after lunch, gives us the chance to discuss with these new colleagues what we do in our area of expertise, how we support them in their work, etc. It also offers us in QA the opportunity to reflect upon what it is that we think we are doing, to consider more deeply how we might best offer the backing required, etc. But, most importantly of all, it is about creating a connection.
Just as the new colleagues had questions today and, even more significantly, will have more queries as they crop up individually and collectively long into the future, the same applies to our students. Thus, a key role of induction is not just addressing immediate questions; it is about equipping and supporting people to provide, look for and find the answers beyond induction day.
Reassuring people that those with (at least some of) the answers are accessible, supportive, and open to learning themselves is just one of the functions of those facilitating Faculty Induction Day. After all, we were each that new colleague on our first day too.
Earlier this week, we hosted a couple of colleagues from EvaSys/Electric Paper here at NCI as part of an annual touching-base, face-to-face, catch-up … that’s a lot of hyphens. This meeting was also part of a ‘spring-cleaning’ process that QA undertakes each summer in terms of data management, as well as an opportunity to be brought up to speed on developments in this whole area of learner feedback.
EvaSys supply the software and back-up for our formal module evaluation processes, but over time we have also learned how to extend these opportunities for interaction with students and colleagues to include programme evaluations, feedback on services, 360° surveys, etc. Ultimately, we are limited only by our imagination and skills when it comes to running online and/or paper surveys (e.g. in terms of questionnaire design, the audiences we need and want to hear from, and how to put the information received to effective use).
In reconsidering our policies and guidelines this summer as part of our Quality Assurance Review, it is always worthwhile to have another look back through the literature, including those resources already employed in informing practice. Thus, it’s no harm to signpost people towards EvaSys publications such as those listed chronologically below:
Smith, Phil, & Owen Morris (eds.), Effective Course Evaluation: The Future for Quality and Standards in Higher Education (London: Electric Paper, 2011)
Murphy, Hannah, & Phil Smith (eds.), Closing the loop: Are universities doing enough to act on student feedback from course evaluation surveys? (London: Electric Paper, 2013)
Kenny, Jordan (ed.), Breaking down the barriers: How to deliver best practice in HE course evaluation (London: Electric Paper, 2015)
Lim, Helena (ed.), The devil is in the data: How HE providers can benchmark their course and module performance (London: Electric Paper, 2015)
These resources have proven useful in shaping our own thinking and help to reinforce many of the reasons why we undertake this work. Further details regarding the kind of research being undertaken by EvaSys/Electric Paper are available online.
It’s nice when a plan starts to come together; one may even begin to believe that it was, err, pre-planned. But, as is almost always the case, plans take time to take shape, to develop and, in turn, to become some sort of reality. And, invariably, they bear only a distant resemblance to what was originally conceived.
To be fair, the PPT slides which we have developed for the 2017 EAIR Forum presentation were drafted early on; indeed, they helped to shape the paper, and they also drew heavily upon the original conference proposal. Turning back to these PPTs, one sees the weaknesses in our initial efforts to say in a dozen or so slides just what it is that we have been doing across the past 15 months, and why, as well as what we see happening now, what we hope will take place next, etc.
Our EAIR presentation in early September is due to take half an hour in total, with twenty minutes spent talking to these revised PPT slides, drawing out the main points of the paper, all the while answering a set of questions posed by the organisers, as well as anticipating and responding to those raised by colleagues in attendance. The guidance supplied by EAIR certainly helped to frame and focus what we want to say, and the resulting slides will also hopefully help to engage the audience, not just for the twenty-minute presentation, but also for the ten-minute Q&A session that will bring it to a close.
All in all, this has been a very interesting process, especially in terms of personal and professional development. The initial value will be seen early next month in how our presentation is delivered and received and, in truth, in what comes next for us with NStEP. If nothing else, it has certainly been very illuminating to be on this journey, and it is particularly nice to have the company of others while proceeding along this path.
We’re not sure where the past month has gone, but it has seen drafts, feedback, redrafts, further feedback, etc., in an ever-increasing cycle, frenzy and, err, recycle. But the good news is that we have a paper to submit – hurrah and huzzah!!! In fact, even better than that, it’s been submitted!
As with many deadlines, the time between committing to the promise and actually meeting it was readily filled. But the opportunity to reflect, think things through, ask for (and receive) the views of others, etc., has all helped us to take the first big step, i.e. sending off the paper to be presented at the EAIR 2017 Forum by the submission deadline, which, coincidentally, ahem, was today, July 31st.
Over the next fortnight, we’ll also be working on the PPT slides which will accompany and support the paper presentation. A first draft of the PPT has been completed, and it has indeed helped to inform and structure the paper, but a few more bells and whistles might prove useful before it, too, is sent in to the conference organisers.
Given that the PPT deadline is August 14th, there is a suspicion here in QA@NCI that the next fortnight will witness a similar process of cycle, frenzy and recycle to that noted above, but our hope is to have something coherent to say, and to show, as a result of our efforts. After all, the first substantive step in any process is invariably followed by a second … even if that is sometimes backwards!
Having attended the 38th EAIR Forum Birmingham 2016 as delegates, we think that an ideal opportunity has now presented itself for QA@NCI to speak further afield about our work as part of the NStEP project. Our conference proposal, entitled “Taking Next StEPs – a case study: NCI staff and students working together”, has been accepted, so roll on the 39th EAIR Forum Porto 2017!
The truth is that, in one way or another, we undertake this kind of work all the time as part of our jobs, both here at NCI and externally. We talk about what we do with staff and students on an ongoing basis, often in relatively informal settings, sometimes in a more representative capacity, but normally with our QA – sometimes with our QC, QE and/or QI – hats on. In turn, we systematically produce guidelines, update policies, write papers, etc., on a range of quality assurance subjects, though again usually for internal rather than external consumption.
Having restricted ourselves to talking about our work as part of NStEP, all the while delivering upon the project’s priorities – including regularly, if figuratively, on this blog – we have now also put down some words on electronic paper as part of a conference proposal. It’s not just a case of Portugal here we come, more’s the pity. Yes, our paper proposal has been accepted, we’ve been given guidance regarding how to present our findings, we’ve booked our flights and accommodation, etc. But there is also very suddenly a dawning realisation: all we have to do now is to prepare what it is that we are going to say to an audience of our peers and to put it down in writing … yikes!!!
In short, between now and our departure in early September, there is the not inconsiderable matter of researching and writing a 3,000-5,000 word paper, all the while coupling it with a 30-minute PowerPoint presentation. Still, at least the two front pages have been drafted. Hmm, thinking about it, we’d better get cracking.
This post complements the recently published ‘NStEP working groups – the past year in focus’, though, in addition to expanding upon what has been taking place in terms of the national project, it concentrates a little more on what we have been doing here at NCI. Thus, our direct involvement in NStEP has allowed us to listen and to learn, as well as to contribute and to discuss, a process which we aim to build upon across 2017-18.
As with the other pilot project members, we have obviously been using this project to reflect upon our own practices, recognising where we might wish (and need) to improve, as well as gauging where we fit in terms of established effective practice. And, just as the NStEP working group has met on three occasions – i.e. on 17th May 2016, 10th November 2016 and 5th April 2017 – we have responded to the national meetings by ensuring that we regularly convene our own institutional implementation group; in fact, these meetings took place on 1st June 2016, 11th January 2017 and 31st May 2017.
Looking back upon it now, we should perhaps be conducting these local meetings more regularly. This being said, we have been holding smaller working party meetings (i.e. involving the two student representatives from NCISU and the two staff members from NCI) on a much more frequent, if informal, basis in order to support the project both locally and nationally. This framework, combining formal with more informal meetings, seems to be working for us, so let’s see how that evolves into next academic year.
With its expansion from a pilot to a truly national project, the guidance on establishing and running an institutional implementation group has itself undergone some considerable development across the intervening period, as is detailed below:
It is only fair to say that our own experience has not necessarily always aligned with this ideal. Yet, at the same time, the organic nature of the project’s evolution, implementation and outcomes here at NCI has allowed us to move at a pace which we feel is appropriate.
The fact is that the three institutional implementation group meetings that have been conducted at NCI have themselves been very open in nature. We deliberately decided not to employ fixed agendas or to record formal minutes, and this approach has been reaffirmed at each of the three meetings held.
In truth, we have used the institutional implementation group meetings to ask questions, to share ideas and, in turn, we have followed these up with emailed notes shared between students and staff, as well as reporting regularly to NCI’s Learning, Teaching and Assessment Committee. And, while we do record who attends our Institutional Implementation Group meetings, the point of holding them is to enable but not to constrain, to share, to disseminate and to encourage, not to rein in, to limit or to hinder.
Our guiding principle remains the working definition of student engagement, that it is the “investment of time, effort and other relevant resources by both students and their institutions intended to optimise the student experience and enhance the learning outcomes and development of students, and the performance and reputation of the institution” (see Enhancing Student Engagement in Decision-Making for more details).
The proof therefore of NStEP’s impact upon NCI will not necessarily or ultimately be found in what we say, but in what we do.
Across the last 12 months, NCI has, alongside the other four pilot project members (CIT, LVIT, NUIG & WIT), participated in three formal National Student Engagement Programme (NStEP) Working Group meetings. Each of these meetings has given us the opportunity to contribute to the development and shaping of NStEP, to gain a real insight into the thinking of colleagues – both staff and student representatives – at other institutions and across the sector, and, in turn, to reflect upon our own practices.
This QA@NCI posting is a little longer than those which have gone before, but it allows us to examine the path we have been following, as well as the choices that lie ahead for us. A future post will evaluate NCI’s own project implementation group meetings which have been taking place locally, two of which have already been held, with a third planned, but the focus here in this posting is on the three national working group meetings.
Looking back upon it now, that first meeting on 17th May 2016 offered us the chance to explore the concept of student engagement, before considering how it might be engendered and supported in each of the participating institutions, as well as more generally. Key outputs identified at that stage included the institutional representative training programme and the strategic analysis sessions to be conducted at each of the five locations, as well as the development of work streams and the identification of national events to support the overall project.
The agenda from that first NStEP Working Group meeting follows below:
NCI subsequently held its own first NStEP Institutional Implementation Group Meeting a fortnight later on 1st June, more on which in a future QA@NCI posting when this blog will consider the local impact of the project in a little more depth.
Following what might at first appear to be a hiatus over the summer and autumn, a second NStEP Working Group meeting was held on 10th November 2016. Coming one day after our own institution’s Strategic Analysis Workshop – NCI being the first of the five HEIs to undertake this session facilitated by sparqs – that second national meeting proved to be an ideal opportunity to share our initial experiences of the project with our fellow participants. At that point, we had not yet availed of the class rep training programme, but the meeting did give us a real insight into how other members were beginning to conduct or plan for theirs; indeed, facilitated by an NStEP trainer and with support from NCISU colleagues, a well-attended class rep training event was held on 8th December. Meanwhile, a first NCI class rep system focus group meeting involving student representatives and staff was held four days later – further details are available here – while the wider project featured prominently at the 2nd QQI conference on Quality Enhancement held on 15th December. It was becoming very clear that NStEP was having both a local and a national impact.
The agenda from that second NStEP Working Group meeting follows below:
In due course, NCI held its own second NStEP Institutional Implementation Group Meeting early in the New Year on 11th January 2017, a deeper reflection upon which will follow in that aforementioned future QA@NCI blog posting.
However, it was the second NCI class rep system focus group meeting involving student representatives and staff which, held on 9th March, really accelerated our use of NStEP to promote developments at our own institution; as noted earlier, further details regarding this student representative system review are available here. At the same time, the QA Review has allowed a Student Feedback Forum and a Staff Feedback Forum to be conducted on 8th March and 22nd March respectively, with NStEP again featuring prominently, thereby illustrating again how this national project is impacting locally.
Most recently, the third NStEP Working Group meeting on 5th April 2017 also introduced a real step change into the national pilot project. In considering how the programme is now in the process of being rolled out nationally, we had the chance to examine the main outputs from the five institutional strategic analyses and the opportunity to identify future work streams. These new national work streams were subsequently agreed as: (1) the role and recruitment of class representatives; (2) the design, review and delivery of programmes; (3) student feedback opportunities, data and follow up; (4) students in formal system level procedures, strategy and decision making; and (5) staff roles and capacity building.
The agenda from the third NStEP Working Group meeting follows below:
Not surprisingly given the previous pattern outlined above, NCI is due to conduct its third NStEP Institutional Implementation Group Meeting on 31st May 2017; in turn, a considered reflection upon our local experiences will be offered here on the QA@NCI blog.
Over the past number of weeks, not only has the pilot project been rolled out nationally at an Induction Event for the NStEP Student Training Programme held on 6th April, but a third NCI class rep system focus group meeting was held on 3rd May, resulting in eight recommendations being made – more information relating to the student representative system review is available here.
These are clearly energising times in terms of student engagement both locally and nationally. Having benefitted from our participation in the pilot project, we now have the opportunity to continue to share what we have learned, to support future processes across the sector, and to implement the most effective practices here at NCI.