Risk and innovation

15 January 2015

As I noted in my recent teaching excellence application, one of the challenges of introducing an innovation is that it comes with a degree of risk.

In thinking about my teaching effectiveness and my propensity to innovate, I am acutely aware of the tension between teaching effectiveness and introducing any innovation. There are risks associated with any innovation; it may not work despite one's best efforts. Furthermore, my research into the professions (law, medicine, engineering, higher education, and so on) suggests that innovation in those contexts is specifically problematic; we professionals are well-practised in our current way of doing things, and necessarily less skilled in new approaches. For example, if one is good at classic lecturing, it is very hard to be immediately as proficient when first undertaking, say, case method teaching. It takes time to develop skills in any new way of teaching. If being effective as a teacher is hard, then being innovative and effective presents extra challenges. However, the benefits and payoffs of innovation in teaching for students can be significant. Therefore, I think the risk is worth it.

So, how do I go about mitigating that risk? One of the ways is to be upfront with students, and to be able to respond/change/adapt if things do not go the way one expected.

So, I always create flexibility in my courses, and rapport with students, so that, when things occasionally do not work out, there is room to gracefully recover without inhibiting either the students' learning or their chance of performing well in the course.

However, even before introducing an innovation or making a significant change, it behoves one to 'talk it through'. For example, last semester the trend of increasing class size in MGMT 300 meant that I needed to make changes; the previous model was becoming untenable.

Talking it through

At the end of each course we complete a self-review of our teaching; what went well, what was not so good, what were the problems, and what were the outcomes (grades etc). These self-reviews are an important input to our Assurance of Learning process. For the past couple of years, I have been noting the changes in the course and my intentions to make changes (innovations).

In the previous year (2012), at the relevant Examiners' Meeting, I raised my intent to make changes and explained both my rationale and the nature of those changes. At that time, no comments were made by the examiners and assessors regarding the changes. Course self-reviews also form part of this process. The changes I proposed were not seen as being problematic (i.e., they were not too risky).

Every course has an Examiner and an Assessor. For a small course, the Examiner is often the person delivering the course, but the Assessor is always someone other than the Examiner. Ideally, the Assessor provides oversight on the course. I had discussions regarding those changes with my assessor (both with the assessor for the year before the changes happened, and with the new assessor going into teaching MGMT 300 in 2014).

I had a number of conversations with the Director of Teaching and Learning (in the Business School) about the changes I was hoping to make. Coming out of those discussions, I asked for support to ensure I wasn't 'getting ahead of myself'. I was fortunate enough to have one of her staff members, Emma, ride alongside me for the duration of the course. Indeed, we had many conversations before the start of the class, and a number of changes regarding transparency were introduced. Emma was very insightful and pragmatic; she attended many of the class sessions and read most of the students' weekly learning journals (something over 1000 of them). Both she and I kept the Director of Teaching and Learning 'in the loop'.

I had a number of discussions with staff of CLeaR regarding the changes and the risks and practicalities associated with them. At all times, they were supportive of the changes and eager to find out their outcomes.

So, I talked a lot about the changes with a wide variety of colleagues before I made the change or introduced the innovation. I also talked with students about the changes. At the Student Staff Consultative Committee meetings, I asked questions about changes that I might make the following year. When the changes seemed undesirable to students, I would re-think them. Of course, the students who participate in the SSCC are rarely representative of the class as a whole, despite their efforts to represent them. Even so, this seems like a reasonable way to check out changes with students.

The course in progress

To return to the student feedback: during the course, no concerns were raised through the SSCC process about the changes. The changes that were requested (e.g., lab time) were made and, where possible, 'institutionalised' for 2016. With regard to the qualitative feedback from the course and lecturer evaluations, my initial interpretation is that a significant number of the 100+ students liked the changes. Only two or three would have liked more 'lecturing'. Other concerns the students raised were about previously existing features of the course (which I am now reflecting on).

Finally, with weekly feedback coming from the class (through their learning journals), I felt well positioned to spot any problems arising from the changes … and, as already discussed, I have not been made aware of any.

The results

It is probably appropriate to look at student satisfaction with the course: the overall course evaluation scored 91.7 (A+SA), and the overall lecturer evaluation scored 82.5 (A+SA).

Overall, I hope that I have shown that I try to be sensible vis-à-vis risk when introducing significant changes to my courses, so as to minimise the likelihood of problems.