Twelve years ago I wrote a piece for The Sport and Exercise Scientist entitled ‘Preparing Students for the Real World’. It addressed what was, at the time, an all too evident skills gap between what sport and exercise sciences programmes were producing and what many employers in industry and public health needed. A lot has happened since, and it could be argued that universities have addressed the problem. In this article, however, I’d like to draw attention to a second and different problem that has emerged, potentially as a result of the methods many universities have used to address the first.

I have been working in the physical activity and health sector since 1992. In two senior roles in industry I have employed over 3,000 health and fitness professionals, the majority of whom were graduates. I have personally delivered and assessed continuing professional development (CPD) courses, usually 3-5 days of intensive contact focusing on applied and interdisciplinary sport and exercise, for a further 1,000+ candidates, again the majority of whom were graduates. All of the above were working in capacities such as fitness instructor, personal trainer, sports development officer, sports therapist, and nutritionist. Given the public health issues the UK is currently facing, these professionals play an increasingly important role.

The early stages of my CPD work in industry were shaped by my belief that knowledge was an important factor in determining the quality of exercise prescription, nutritional analysis, etc. It became increasingly evident, however, that the best practitioners were not necessarily the most knowledgeable; in fact, an almost textbook knowledge of physiology or nutrition was often relatively unrelated to success. Gaining more coal-face experience of the applied settings in which our graduates are often employed, I began to orientate my teaching towards the idea that it was the practitioner’s ability to communicate with clients that was in fact the critical factor, and that the best practitioners were characterised by a ‘sweet spot’ where adequate levels of knowledge met good communication skills. But this idea was also found wanting. I became increasingly aware that many effective and successful practitioners were in fact not especially good communicators.

After perhaps 15 years of working in industry (while at the same time holding posts in HE), I realised that the core indicator of effectiveness was the practitioner’s ability to find things out: to identify what information s/he needed, to know where to find it, and, once found, to discriminate good information from bad. This idea also extended to information about the client: what are the key variables that underlie the client’s current health status and their goals? How can these be manipulated towards a successful outcome? How can they be reliably assessed over time?

In short, the best practitioners are good at research methods. That doesn’t mean that they’re good statisticians, that they can necessarily define epistemology, or that they can argue the relative merits of quantitative versus qualitative methods. But they do understand the ideas that underpin all of these, and often they’ve been doing it so consistently and for so long that this understanding has become automatic and intuitive as opposed to deliberate and formal.

Now, all well and good. All - or certainly most - undergraduate programmes in the UK have a significant Research Methods component.
Over and above this, research methods are learned explicitly and implicitly in other modules such as biomechanics, nutrition, physiology, psychology and sociology. We are therefore surely preparing students for the real world by the criteria I’ve presented above? Ten years ago I would have agreed. Now, I’m less convinced.

There has certainly been a shift in the way we teach the undergraduate curriculum, with an evident focus on greater real-world relevance. But at the same time, ‘real-world relevance’ is morphing into ‘employability’. The evolution of the graduate skills agenda is entirely consistent with my 2005 paper described above. That the graduate skills component so often appears to be embedded in the Research Methods components of programmes is not. In fact, I would argue that it is entirely counter-productive and entirely at odds with what employers need and expect of graduates.

It is increasingly my experience that Research Methods modules are front-loaded with content aimed at orienting the student to the higher education environment and back-loaded with content aimed at enhancing employability (I will return to this poorly used term below). In some programmes I have seen, this process has reduced the Research Methods content by over 50%. This comes with two clear problems. First, students are not receiving the breadth and quality of teaching that the subject deserves. Second, students come to see research methods per se as less important than other areas of content. Whereas historically Research Methods constituted up to 30% of some programmes - especially if a dissertation is factored into the equation - in some cases it is now less than 10%, and as low as 5% in real terms.

“But employability has to go somewhere, and Research Methods presents the best place” is an argument I often hear. Well, I beg to differ. Firstly, one of the reasons that employability and skills get dumped into Research Methods modules is that, unlike biomechanics, nutrition, physiology, psychology and sociology, all of which tend to have discipline-specific teaching staff who will often defend their ‘air time’ vigorously, many departments have no such dedicated staff for Research Methods. In short, there is no-one to defend Research Methods in the ‘where shall we put employability’ debate. Secondly, and most importantly, Research Methods, if taught and assessed appropriately, is employability. And unlike much of what passes for it these days, it is real employability. If we reduce the total content of Research Methods to increase employability content, we are de facto throwing the baby out with the bathwater.

How many graduates will actually use the knowledge base of biomechanics, nutrition, physiology, psychology and sociology in their future careers? Some, for sure, but all things considered, it’s a relatively low number. How many graduates will use research skills? Probably all of them… And here’s why (and I know I’m preaching to the converted here). We live in an age in which there has been an explosion of often conflicting information media. At the same time, there is an emerging post-truth landscape in which it is considered OK for politicians to tell outright lies to win elections or referendums. Further still, individuals and groups are increasingly being handed responsibility for their own health under the guise of prudent healthcare (a proxy for lower-cost healthcare).
In these contexts, the graduate with research skills is not only going to be more employable, more effective and more successful, but is also going to be able to make better informed decisions about her or his life (and those of their families) in relation to health, law, finance, and many other core aspects of life.

Now, all of the above is well and good, and I doubt many would disagree. Here is the sting in the tail. Employers are increasingly saying that graduates are better able to get the job but less able to do it (a comment made to me a while back summed it up: “in the old days good candidates often had poor CVs but you knew what to look for; now everyone, even the worst candidate, has a good CV”). The employability agenda is working, but it might be counterproductive in the long term. If sport and exercise sciences graduates are seen as being less effective in the workplace than graduates of other disciplines, we are doing them, and ourselves, a disservice.

So what’s the solution? First, defend Research Methods modules. Research Methods is the science in sport and exercise science, arguably more so than any other discrete component of the programme. Do not let it be encroached upon by generic and often significantly less valuable content. Second, assess Research Methods broadly; students work hard to learn and understand content on which they are assessed (or at least, the majority work harder on content that is assessed), but, as importantly, students also see the content on which they are assessed as the important stuff! Lastly, interrogate what your employability content is really achieving: is it providing students with skills that employers need, or is it ticking boxes defined by units in universities which, in many cases, do not have one staff member who has worked outside the university sector? Speak to employers, speak to alumni, speak to the individuals with whom our graduates are increasingly working, for example, the inactive or at risk.

And this last point is crucial. In a recent conversation it became clear to me that the concept of employability that defines the approach of many academics is itself defined by the university context. Whilst interrogating assignments based on 2,000-word essays and 10,000-word dissertations, I pointed out that the need ever to write anything of such length in the ‘real world’ was minimal. The response was that all academics need to be able to write! Agreed, but must all undergraduates aspire to academic skills? No, of course not (and how many journals would accept a 10,000-word paper these days anyway?). Let’s not assume that employability means giving students the skillsets we ourselves need.

I’ve interviewed hundreds of graduates and non-graduates. What’s my favourite response, irrespective of the question? “I don’t know the answer, but I do know how I would find it.” Give a student a piece of reliable information and they’ll eat for a day. Teach them how to find that information for themselves, to discriminate it from poor information, and how and when to apply it, and, well, you know the rest…