Four findings from research on employers and apprenticeships

Over the summer we conducted three separate research projects for different clients on the subject of apprenticeships.

While each project was tailored to the client and included different elements – such as a major catchment area review in one and lead generation in another – they all included detailed qualitative research among (different types of) employers.

What I personally find most interesting are the similarities between projects. Here are four trends, and why we think they are important:

  1. Creating a typology of organisations most likely to take on apprentices is difficult – for a good reason

Pages 18 to 27 of BIS research paper 204 (December 2014) provide a detailed profile of the types of apprenticeship employer; the paper also includes data on what employers will look like in the future (as does the UKCES Employer Perspectives Survey 2014). It is tempting for individual providers to think that their local or regional profile will match this national data.

That is very unlikely because:

  1. Individual catchment areas don’t typically mirror England as a whole in terms of industrial sectors, and it is (the patterns of) these sectors that most heavily influence what apprenticeship provision looks like.
  2. When you research at a smaller scale it quickly becomes clear that organisational culture, personal experiences of apprenticeships, the frequency of junior vacancies and preconceptions about how young people behave are as important as organisational structure when it comes to predicting the propensity to take on an apprentice.

To put it another way, among supporters of apprentices there’s typically a commitment that extends beyond the practical. While (in our research and the literature) businesses most commonly hire apprentices in order to fill specific current or projected skills gaps (and generally not, by the way, in order to get hold of cheap labour), the decision-makers very often want to get involved because they, or the wider business, have an affinity with youth development or workforce diversity as social issues – and they believe apprenticeships are a positive force in this regard.

There are two reasons why we would argue it is imperative that providers understand this distinction. Firstly, it’s vital to understand how the practical drivers for taking on an apprentice differ from the emotive ones in order to tailor marketing communications and your broader employer engagement. Ultimately people make decisions, not business units. Secondly, the passion that apprentice employers display provides a major opportunity for providers: which of these might volunteer to help improve your tutor CPD, or provide equipment in kind, in order to ensure their apprentices are getting the highest quality training?

  2. Apprentices (and providers) are ambassadors in more ways than you might think

We were particularly struck during all three projects by the extent to which the views of decision-makers were shaped by their previous experience of individual apprentices. This included businesses that had never employed an apprentice but whose HR directors, say, had previously worked for an organisation that had. More surprising still was the influence of training provider sales teams, and the frequency with which, across all three projects, employers said that their view of apprenticeships as a concept had been negatively affected by pushy, poorly informed sales teams. In this respect, all providers are in this together and your competitors are shaping your reputation.

  3. Understanding the word does not mean businesses understand the concept

A quick look at Google Trends will demonstrate the growth in public awareness of apprenticeships as a term. That awareness does not necessarily translate into detailed understanding. Confusion among businesses not currently employing apprentices (85% of organisations nationally) was widespread in our studies, with a particularly noticeable propensity to conflate apprenticeships and internships. Awareness of higher apprenticeships was markedly low, and again this is borne out by the literature. There is obviously still much work to be done in this regard despite the very laudable efforts of Government, agencies and PR consultancies over the past decade. This point is related to our findings at #1 – if first-hand experience of an apprentice is a major driver of awareness and opinion, and the vast majority of organisations do not yet employ an apprentice, it’s not much of a surprise.

  4. Population change as a short-term threat

There is a lot of talk in HE circles at the moment about the impact on university recruitment of the big drop in the youth population up to 2020 – the population of UK 18-year-olds is set to fall by around 80,000 (11%) by that time. There appears to be less noise in relation to apprenticeships, but it’s just as big an issue for under-19 and, later on, 19+ apprenticeship recruitment. If you are a college or training provider and not taking population change into account in your planning, you’re probably making a big mistake.


The tyranny of Eduspeak – how jargon and acronyms corrode understanding and reputation

In 2008 a charity called the Learning and Skills Network published a guide to improving communication within the further education sector.

‘It’s a communication jungle out there’ reported the results of a survey of just under 1000 further education lecturers, support staff and managers. The vast majority of respondents felt that jargon and acronyms were far too common and inhibited effective communication.

As the then Chief Executive John Stone wrote (with admirable clarity), the findings posed a challenge to departments, agencies and educators. “[They] show that jargon isn’t just an annoyance; it’s a genuine problem that acts as a real barrier to understanding… At its worst, this shared language can colour our communications with communities, employers and even learners – ultimately shutting out the very people we exist to serve.”

What has changed? Not much, it seems. Fresh jargon and acronyms blossom across agency, department, college and private provider communication like annuals in summer. Perennials return; ‘NEETS’ (Not in Education, Employment or Training) and ‘blended learning’ were among the various banes of 2007, still growing strong today.

The problem, of course, is not exclusive to further education. As a relatively new primary school governor I find this new blend of Eduspeak[i] – from SATs (Standard Assessment Tests) to CAFs (Common Assessment Frameworks) and SENCOs (Special Educational Needs Co-ordinators) – a constant challenge. Higher and secondary education have their own argot, of course.

Teaching is a specialised profession. As is research. The management of institutions that educate people typically requires a team of diverse professionals, from accountants to facilities managers, human resources practitioners to caterers (in some settings now referred to as ‘midday supervisors’, as if they were The Guardians of Noon). It would be naïve to suggest that these groups would not, and should not, develop a professional lexicon. The problems start when – as the Learning and Skills Network study showed – exclusivity inhibits understanding.

A 2012 study of further education employees revealed that one of the main aspirations that teaching staff have for their management teams was that they ‘protect’ the institution from Government policy changes (and the language they come wrapped in). I read John Stone’s use of ‘learner’ with a wry smile; I would argue that the word is a fine example of a failure to protect, of the linguistic imperialism successfully employed by dominant coalitions.

‘Learner’ is commonly used by Government and agencies as a unit of measurement or summation – from ‘learner participation statistics’ to ‘learner voice’. It has now been assimilated into the vocabularies of further education institutions; colleges use it with, to, about and for their students. The problem is – it is jargon. A bean counter’s word, algorithmic, denominational and a barrier to meaning. You can test this by walking into a pub or a corner shop and starting a conversation about education. Count the minutes until someone – barring the presence of a tired-looking lady who teaches in further education – uses the word ‘learner’. You will wait a long time. Moreover, it can alienate; in the (admittedly few) studies we have undertaken which include a question on the subject, a sizeable majority of post-16 respondents prefer to be called (surprise, surprise) ‘students’. Yet many spend this stage of their education being labelled with a term they do not prefer. If you don’t believe me, try a straw poll among your …ahem… students.

Life gets even harder when a school, college, university, agency or Government tries to communicate with people who neither receive nor work in education.

“The way to gain a good reputation is to endeavour to be what you desire to appear,” said Socrates. Or, in the words of Stephen Covey: “What we are communicates far more eloquently than anything we say or do.”

Language that obscures meaning poses problems for reputation in the way that Socrates and Covey describe it. It is hard to be authentic when no-one understands what you are trying to be. For example, over half of the general public mistakenly think that further education colleges are still under local authority control. It is one of a number of challenges colleges face as a result of widespread misunderstanding of terms like FE and ‘further education’ and it matters because these misconstructions power the reputations of institutions among important publics like parents, press and politicians. These in turn power staff and student recruitment, funding, sponsorship, the policy environment and the melee of other influences that determine what matters most – the quality of education.

These aren’t sixth form centres. They are sixth form colleges. (But the sub-editor who wrote the headline either does not understand the difference or thinks her readership does not.) These aren’t bogus colleges. They are, typically, private language schools. This fantastic paper by the late David Watson highlights common category mistakes made in relation to higher education[ii]. The ‘appalling ignorance among decision makers’ about the world of further education, as reported by Helena Kennedy, does not relate exclusively to their educational experience. It is in part fuelled by a system still overburdened with qualifications and the jargon that surrounds them. A-level. Degree. GCSE. Got it. BTEC, FD, HNC, HND, NVQ, SVQ, functional skills, traineeships… not so clear.

There is nothing like enough space here to address all the ways in which a college, university or school might improve understanding and aid clarity. But here are three suggestions to start with:

Firstly, test the manner in which you communicate with those with whom you are communicating. For a large organisation this might involve qualitative and quantitative research among those groups most important to you[iii]. For a little primary school it might be a few minutes at a coffee morning asking parents if they understood this letter or that policy and, if not, why not. At worst, supply or suggest a glossary.

Secondly, use the professional expertise at hand. If you are a college or university and you have a communications team, it is likely to include someone with professional copywriting experience or qualifications, most commonly gained during an earlier career as a journalist. Are they involved in auditing internal and external communications and, if so, with what level of autonomy and impact? Typically schools – though blessed with experts in the application of English – may have to be more creative as regards this type of audit.

Thirdly, and most importantly, be as suspicious of neologisms and neophilia as your bones tell you to be. The exciting new initiative announced by the Minister this morning – whether it is a TechBac, EBacc or Teaching Excellence Framework – will suffer from reputation lag. It could be years before internal and external audiences understand the concepts that underpin it. (Colleges bear witness to this every time they are referred to as ‘The Tech’). Or to put it another way, whither the Diploma?[iv] The jargon and acronyms used to decorate these initiatives will further hinder a common understanding. Accordingly, institutions that uncritically welcome the new with open arms, and adopt without reflection the language in which it is packaged, do their staff, students and broader communities a great disservice.


[i] I am a hypocrite. In using the phrase Eduspeak I am of course sacrificing clarity on the altar of vanity, trying to look smart with a nod to George Orwell. Moreover I am guilty of using jargon and acronyms in my professional career – although I am now trying to kick the habit.

[ii] Which includes a dissection and argument against the use of the term ‘sector’ – a sin I have committed here. It’s a hard habit to kick.

[iii] ’Key publics’ in public relations terminology. There I go again.

[iv] This is particularly hypocritical rhetoric, given that I was involved in supporting a national promotional campaign for that qualification.

#PRstack – new ebook on PR (and other free tools)

I’m among the contributors to a new guide to modern PR tools published today, the My #PRstack ebook.

There are 18 contributors and 40+ practical examples of tools used in public relations, content marketing and search engine optimisation (SEO).

You can download the ebook for free.

The section I’ve written focuses on how PR practitioners can use the Government’s Nomis data tool to help define or understand the publics with whom they need to engage.
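For those who want to go beyond the Nomis website itself, Nomis also exposes a query API for programmatic access. As a minimal sketch, here is how a query URL might be assembled in Python; the dataset identifier and geography code below are purely illustrative, so check the Nomis API documentation for the codes relevant to your own publics.

```python
from urllib.parse import urlencode

def nomis_query_url(dataset: str, **params: str) -> str:
    """Build a Nomis API query URL that returns a dataset as CSV."""
    base = "https://www.nomisweb.co.uk/api/v01/dataset"
    return f"{base}/{dataset}.data.csv?{urlencode(params)}"

# Illustrative only: a dataset ID and geography code would come from
# the Nomis documentation for the data you actually need.
url = nomis_query_url("NM_1_1", geography="2092957697", date="latest")
```

The resulting URL can then be fetched with any HTTP client (or pasted into a browser) to pull the data into a spreadsheet or analysis script.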

If you are interested in public relations research and evaluation, here’s something I wrote in 2013 about free online research resources for PR; it includes data sources on attitudes, media consumption, political opinion and trust.

Other free tools that I have come across since writing that piece include:

  • the fun YouGov profiler, a free-to-use app built to showcase (paid-for) YouGov Profiles, a segmentation and planning tool that allows users to build target profiles using the data from 200,000 YouGov members. (Just don’t mention the election).
  • the London datastore for demographic data from the capital

I’d be very pleased to hear about other free demography, awareness, attitude and behaviour resources that practitioners find useful.

Creating online surveys that work – 15 quick tips

Most of our research work involves in-depth qualitative interviews – over the telephone or in person. But when the need arises we design and implement online surveys for clients.

These can be exceptionally good value research activities, particularly if a client wants an overview of opinions among a large group of respondents within a particular target group.

There are a wide range of really useful websites giving hints and tips about different elements of online surveying, and I’ve published a sample of these at the base of this article.

But I thought it might be useful for fellow communications, PR and marketing practitioners if I listed some tips that you won’t necessarily find anywhere else.

Our surveys tend to be quite complex affairs and part of a wider body of study, so I’d recommend approaching an agency if that’s your requirement (I would say that, wouldn’t I?). However, if you are doing something fairly straightforward in-house then these may come in handy…

1. Incentivisation – we find that mixed models of incentivisation work best, in which a respondent has the chance to gain something personally (e.g. entry into a prize draw) as well as raise money for a charitable cause by completing the survey.

2. Charity support – spending time considering which charity to support is a worthwhile investment; if it resonates strongly with the respondent group, it can have a really positive impact on response rates.

3. Split testing – where a target group is particularly large, it’s worth split-testing different email introduction and subject heading types as part of a pilot stage, and running with the most successful for the bulk of the campaign.

4. Pilots – running a two-stage pilot process can add significant value to a project. Typically this involves: an initial test of the survey with a friendly group of respondents (it’s important this is neither you nor the ‘client’, as you will both be too close at this stage to see errors or issues); and split testing of invitation types (see above).

5. Watch out for betas – the functionality of most of the current crop of paid-for online survey tools is impressive. However, take care when using ‘beta’ test versions of sites, as there is (naturally) a higher risk of glitches that could impact your campaign. Current issues with fonts in the SurveyMonkey beta email tool are a good case in point.

6. Invitations – there are lots of useful blogs about what to include in an email invitation, some of which are listed below. In crude summary:

  • Personalisation [First Name] etc
  • Thank you
  • Why you’re doing it, who for and how results will be used
  • Length of time to complete
  • Incentivisation
  • Deadline
  • Confidentiality assurances
  • Contact details

7. Subject headings – again, hints and tips abound (see below). In the end it comes down to good copywriting skills: brevity, relevance and an answer to the inevitable ‘what’s in it for me?’ question.

8. Invitation timings – again, it’s worth splitting invitation email dispatch between different days and times of day.

9. Reminders – in-house email management or survey software email tools will often build in reminders for you, and send only to those who haven’t completed the survey. Reminder timings should be closely related to deadlines – work backwards from the deadline when setting them, rather than forward from the dispatch time.
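Working backwards from the deadline is easy to automate if your email tool lets you schedule sends. A small sketch (the offsets of 7, 3 and 1 days are an illustrative choice, not a recommendation from the original tips):

```python
from datetime import date, timedelta

def reminder_schedule(deadline, offsets_days=(7, 3, 1)):
    """Work backwards from the survey deadline to fix reminder dates.

    Returns reminder send dates, earliest first.
    """
    return [deadline - timedelta(days=d) for d in offsets_days]

# For a survey closing on 30 September 2015, reminders would go out
# on 23, 27 and 29 September.
schedule = reminder_schedule(date(2015, 9, 30))
```

Combined with a send-only-to-non-completers rule, this keeps reminder pressure highest just before the deadline, when it matters most.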

10. Deadlines – beware the distant survey closure deadline, a gift to procrastinators. The vast majority of responses will be collected in the two or three days after an invitation or reminder is received.

11. In whose name should the email be sent? The person or organisation best known to the respondent group, even if you are using an agency like us, as this will boost response rates while minimising problems with spam or junk filters.

12. Don’t spam – not only is it poor practice, it’s a waste of everybody’s time. Only send survey invites to people who have agreed to receive communications from the organisation or, if you’ve bought a list, contact the recipients through your own email client to get their consent before sending the survey invite.

13. Beware contact lists provided by online survey companies – I have yet to hear a positive story about these. (Please do get in touch if you have a different tale to tell.)

14. Watch out for public links – on one occasion a client contact accidentally posted the link to an invitation-only, incentivised online survey in a public forum. It was very quickly attacked by spam bots and it took quite a while to clean up the results. When sharing links to surveys, it’s best to publish them only to a closed group.

15. Place the most important questions at the front of the survey. Sometimes this may involve pushing demographic data down the running order.

Some useful links:

How do I write an effective survey introduction?

5 key messages to an online survey introduction

6 simple tips to write perfect subject lines