#PRstack – new ebook on PR (and other free tools)

I’m among the contributors to a new guide to modern PR tools published today, the My #PRstack ebook.

There are 18 contributors and 40+ practical examples of tools used in public relations, content marketing and search engine optimisation (SEO).

You can download the ebook for free.

The section I’ve written focuses on how PR practitioners can use the Government’s Nomis data tool to help define or understand the publics with whom they need to engage.

If you are interested in public relations research and evaluation, here’s something I wrote in 2013 about free online research resources for PR; it includes data sources on attitudes, media consumption, political opinion and trust.

Other free tools that I have come across since writing that piece include:

  • the fun YouGov Profiler, a free-to-use app built to showcase the (paid-for) YouGov Profiles service, a segmentation and planning tool that allows users to build target profiles using data from 200,000 YouGov members. (Just don’t mention the election.)
  • the London datastore for demographic data from the capital

I’d be very pleased to hear about other free demography, awareness, attitude and behaviour resources that practitioners find useful.


Creating online surveys that work – 15 quick tips

Most of our research work involves in-depth qualitative interviews – over the telephone or in person. But when the need arises we design and implement online surveys for clients.

These can be exceptionally good value research activities, particularly if a client wants an overview of opinions among a large group of respondents within a particular target group.

There is a wide range of really useful websites giving hints and tips about different elements of online surveying, and I’ve published a sample of these at the base of this article.

But I thought it might be useful for fellow communications, PR and marketing practitioners if I listed some tips that you won’t necessarily find anywhere else.

Our surveys tend to be quite complex affairs and part of a wider body of study, so I’d recommend approaching an agency if that’s your requirement (I would say that, wouldn’t I?). However, if you are doing something fairly straightforward in-house, then these may come in handy…

1. Incentivisation – we find that mixed models of incentivisation work best, in which a respondent has the chance to gain something personally (e.g. entry into a prize draw) as well as raise money for a charitable cause by completing the survey.

2. Charity support – spending time considering which charity to support is a worthwhile investment; if it resonates strongly with the respondent group, it can have a really positive impact on response rates.

3. Split testing – especially where a target group is large, it’s worth split-testing different email introductions and subject-heading types as part of a pilot stage, and running with the most successful for the bulk of the campaign.
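As a rough sketch of how a pilot split test might be set up (the variant copy, addresses and function name are all illustrative assumptions, not a prescription – most survey or email tools will do this for you), shuffling the pilot recipients and dealing them out round-robin keeps the comparison fair and the groups evenly sized:

```python
import random
from collections import Counter

# Hypothetical subject-line variants to pilot
variants = [
    "Help shape our services – 5 minutes, prize draw entry",
    "Your views matter: quick survey (closes Friday)",
]

def assign_variants(recipients, variants, seed=42):
    """Shuffle recipients, then deal them out round-robin so each
    subject-line variant gets an (almost) equal share of the pilot."""
    rng = random.Random(seed)  # fixed seed keeps the assignment reproducible
    shuffled = list(recipients)
    rng.shuffle(shuffled)
    return {email: variants[i % len(variants)] for i, email in enumerate(shuffled)}

pilot = ["a@example.com", "b@example.com", "c@example.com", "d@example.com"]
groups = assign_variants(pilot, variants)
print(Counter(groups.values()))  # check the split is even before dispatch
```

After the pilot, whichever variant produces the better completion rate is the one to run for the bulk of the campaign.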

4. Pilots – running a two-stage pilot process can add significant value to a project. Typically this involves: an initial test of the survey with a friendly group of respondents (it’s important this is neither you nor the ‘client’, as you will both be too close at this stage to see errors or issues); and split testing of invitation types (see above).

5. Watch out for betas – the functionality of most of the current crop of paid-for online survey tools is impressive. However, take care when using ‘beta’ test versions of sites, as there is (naturally) a higher risk of glitches that could impact your campaign. Current issues with fonts in the SurveyMonkey beta email tool are a good case in point.

6. Invitations – there are lots of useful blogs about what to include in an email invitation, some of which are listed below. In crude summary:

  • Personalisation [First Name] etc
  • Thank you
  • Why you’re doing it, who for and how results will be used
  • Length of time to complete
  • Incentivisation
  • Deadline
  • Confidentiality assurances
  • Contact details
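The checklist above lends itself to a simple mail-merge. A minimal sketch, in which every placeholder name and value is an assumption of mine rather than a prescribed format:

```python
from string import Template

# Hypothetical invitation covering the checklist items above:
# personalisation, thanks, purpose, length, incentive, deadline,
# confidentiality and contact details.
invite = Template("""Dear $first_name,

Thank you for agreeing to hear from us. We're running a short survey on
behalf of $sponsor to find out $purpose; results will be reported in
aggregate only and your answers are confidential.

It takes about $minutes minutes, and completing it enters you into our
prize draw. Please respond by $deadline.

Questions? Contact $contact.
""")

message = invite.substitute(
    first_name="Sam",
    sponsor="Example College",
    purpose="how applicants choose a place of study",
    minutes="5",
    deadline="Friday 12 June",
    contact="research@example.com",
)
print(message)
```

Using `Template.substitute` (rather than `safe_substitute`) has the useful side effect of raising an error if any checklist field is left unfilled.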

7. Subject headings – again, hints and tips abound (see below). In the end it comes down to good copywriting skills: brevity, relevance and an answer to the inevitable ‘what’s in it for me?’ question.

8. Invitation timings – again, it’s worth splitting invitation email dispatch between different days and times of day.

9. Reminders – in-house email management or survey software email tools will often build in reminders for you, and send only to those who haven’t completed the survey. Reminder timings should be closely related to deadlines – work backwards from the deadline when setting them, rather than forward from the dispatch time.
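Working reminder timings backwards from the deadline, as suggested above, can be sketched in a few lines (the one-week/three-day/one-day offsets are illustrative assumptions, not a recommendation):

```python
from datetime import date, timedelta

def reminder_schedule(deadline, offsets_days=(7, 3, 1)):
    """Return reminder dates counted backwards from the survey deadline,
    rather than forwards from the original dispatch date."""
    return [deadline - timedelta(days=d) for d in offsets_days]

# Example: a survey closing on 26 June 2015
for reminder in reminder_schedule(date(2015, 6, 26)):
    print(reminder.isoformat())
```

Anchoring the schedule to the close date means a deadline change only requires moving one value, and the final reminder always lands when responses are most likely – in the last day or two before close.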

10. Deadlines – Beware the distant survey closure deadline, a gift to procrastinators. The vast majority of responses will be collected in the two or three days after an invitation or reminder receipt.

11. In whose name should the email be sent? The person or organisation best known to the respondent group, even if you are using an agency like us, as this will boost response rates while minimising problems with spam or junk filters.

12. Don’t spam – not only is it poor practice, it’s a waste of everybody’s time. Only send survey invites to people who have agreed to receive communications from the organisation or, if you’ve bought a list, contact the recipients through your own email client to get their consent before sending the survey invite.

13. Beware contact lists provided by online survey companies – I have yet to hear a positive story about these. (Please do get in touch if you have a different tale to tell.)

14. Watch out for public links – on one occasion a client contact accidentally posted the link to an invitation-only, incentivised online survey in a public forum. It was very quickly attacked by spam bots and it took quite a while to clean up the results. When posting links to surveys, it’s best to publish them only to a closed group.

15. Place the most important questions at the front of the survey. Sometimes this may involve pushing demographic data down the running order.

Some useful links:

How do I write an effective survey introduction?

5 key messages to an online survey introduction

6 simple tips to write perfect subject lines


FE isn’t a brand – and why that matters

Earlier this month the TES published a double-page spread (and splashed the story) about a six-month study of further education reputation undertaken by Richard Gillingwater, of corporate communications agency Acrue Fulton.

In the article Richard ‘says FE’s national brand needs to be rebuilt, and unveils his plan to help the sector make people sit up and take notice’ (to quote the TES blurb).

While I applaud any media taking interest in further education and recognise Richard’s impeccable credentials, the available evidence suggests that it is impossible to rebuild the further education brand. That is because further education, with one important exception, is not a brand.

It is at best a sector and most probably a system.

There are numerous, occasionally conflicting, definitions of ‘brand’. It is one of those words, as Jerry McLaughlin delicately puts it, ‘that is widely used but unevenly understood’. Where academics and practitioners tend to agree is that a brand is a product, concept or service publicly distinguished from other products, concepts or services. “A brand is what a firm, institution, or collection of products and services stands for in the hearts and minds of its target audience.”[1]

Brands, as the derivation from branding-iron suggests, are commonly expressed through the medium of a brand name, a trademark, a logo.

FE is not ‘publicly distinguished’. It has no recognised logo, no trademark. More importantly, all but one of its target audiences (the exception being those who work in it) are insufficiently aware of it – who it serves, its constituent parts, its ‘key facts’ for want of a better phrase – for it to qualify as a brand.

In the past decade a handful of studies examining FE’s reputation have been commissioned. They all tell pretty much the same story – like this one from 2007. If you look under the bonnet of each of those studies, the respondents typically have some understanding of further education and FE as a concept[2]. Sometimes this is deliberate, as in the case of this 2012 study of FE employees.

To my knowledge (and according to reviews of available literature like this one from Anne Parfitt at Huddersfield Uni and this paper from David Roberts at the Knowledge Partnership) there has been no audit of further education’s reputation among a general population. By that I mean parents, students, prospective students, client and non-client employers. More basically, people who don’t work in organisations involved in the delivery or receipt of further education.

No such study has been commissioned, I’d suggest, because potential investors think it would be a waste of time and money. In 2011, the Association of Colleges and polling company ICM undertook a study of college reputations among such a general public. Two thirds of respondents thought Trinity College Cambridge was an FE college, and half said that colleges are still under local authority control and not inspected by Ofsted. In those other studies among ‘stakeholder’ audiences, respondents demonstrate a higher level of awareness of colleges than they do of FE. So it follows that a general public would demonstrate an even lower level of awareness of FE than they did of colleges in 2011.

You can undertake a completely unscientific test of this proposition yourself by asking three people who aren’t an FE lecturer, manager or service provider the question: “What is further education?” If any of the answers correspond, buy yourself a drink.

None of this is meant to detract from Richard Gillingwater’s research and points about FE reputation per se. It’s just that when it comes to branding, FE never made it onto the ranch. This matters, because if Government or its agencies (for instance) want to bolster reputations they should focus on FE’s constituent parts rather than the whole. And in doing so they should recognise that a strong brand depends on a minimal level of awareness – which, by the way, is why the continued, deliberate fragmentation of the term ‘college’ through the proliferation of new forms of institution is likely to prove so corrosive in the longer-term.





[1] A quote from Luc Speisser of Landor – whose 2012 blog entry on explaining a brand I would highly recommend.

[2] Take a look, for instance, at the list of respondents on page 3 of this 2007 study, commissioned from Ipsos Mori by then head of the Learning and Skills Council Mark Haysom (who, by the way, now writes critically acclaimed novels).

Who and what influences choice in further education?

In the past couple of years we have specialised in helping clients study attitude, awareness or behaviours among groups important to their organisation.

We also help clients adapt according to the results of the research.

Projects include studies for further education (FE) colleges – typically focusing on recruitment and seeking to help a client understand and respond to who and what influences student choice in their area.

We’ve found a number of patterns across our work in this field and thought that FE colleagues might find it useful if we set ten of them out here.

  1. The decline of the influencer. In 2012 a national study of students aged 11 to 21 and their parents (in which I was involved) indicated that parents exerted a high level of influence on student choice of institution[1]. In our subsequent studies on behalf of colleges – as the agency YouthSight suggests in relation to university applicants – the influence of other people on post-16 student choice of place of study appears to be in general decline. In our latest study (of a 3000+ population of higher education applicants to a large GFE) just under half of respondents said they had not been influenced by anyone. Where third parties do influence choice, mum and dad and family friends most commonly top the rankings.
  2. The rise of search. Online search is overtaking the prospectus as the channel applicants find the most useful for finding out about a prospective place of study. This shift and trend #1 are probably linked – rather than asking or expecting advice from friends or family on study options, students are more commonly actively searching online for institutions which fit their requirements. So colleges need to know what information potential applicants are looking for in order to make an informed decision, and ensure it is easy to find on their website. Online search is commonly also the most useful channel for applicants who have yet to commit to an institution and want more information – so keeping a website up to date may be the most effective ‘keep warm’ tactic for any college. Online search, by the way, dominates where full cost recovery provision is concerned. Social media discussions, adverts and newspaper articles are typically cited as the least useful sources of information about a prospective place of study.
  3. The power of course. A good reputation for teaching is, typically, the third most important factor for 16+ students considering where to study. Locational factors – where a college is based and the transport network which feeds it – are commonly cited as the second most important factor. Course most regularly tops the rankings. Students may compromise on sports opportunities, on the time taken to travel, on the way buildings look or the facilities within them, but they are unlikely to make concessions on the subject and type of course they want to study. Which highlights the importance of teaching excellence and market research for colleges – while providing another depressing piece of evidence for those of us concerned about the black hole that is schools-based careers advice.
  4. Gender differences in influence. Where we have explored this issue, we’ve seen notable differences in the way males and females make decisions about where to study. In crude summary, female applicants to further education courses are more discerning – they commonly take more factors into account than males when considering their options. They are also more likely to be informed in their institutional choice by school or college tutors than their male counterparts, who are more inclined to be influenced by friends.
  5. Hedging bets. This phenomenon first came to light in a study we undertook of a population of 7000 students who applied to a college but enrolled elsewhere in early 2013. 20% of applicants considered the college as a ‘back-up’ choice. In the majority of cases, according to qualitative responses, they were encouraged by school tutors to apply to more than one institution. There appears to be a corresponding general growth in the number of institutions applied to – but we haven’t adequately tested that proposition to be sure. There are ramifications for conversion rates here, and related expectations about the effectiveness and performance of recruitment activities.
  6. Last-minute change of mind. In the same piece of research, 10% of applicants changed their mind about the course they wanted to study in the period between applying to college/s and enrolling. This change of mind led to a change of institution (because, as we have seen, course is the most important factor in choice, and in the case of this 10% they – rightly or wrongly – didn’t think the college in question delivered the course they had now settled on). Which means that colleges need to make applicants aware of the broad range of courses available (or at least of the mechanism for finding out), even if an applicant seems pretty sure about what she wants to do with the rest of her life.
  7. Uncommon applications. Looking for ‘insurance’ offers may sound more like the behaviour of a university applicant than a prospective college student. Whereas university applicants have a system in place – UCAS – to standardise applications, that is not the case for (non-HE) college applicants. The differences in the application processes between colleges and schools can be confusing, and the more students ‘shop around’ the more puzzling it can be. In one study among non-enrolled applicants, a significant minority expressed low levels of awareness of the particular hoops – application, assessment, interview or audition – that constituted the application process according to course type. Setting out the college processes – including what applicants can expect in terms of entry requirements, timings for interviews and communications from the college – in ways that are easy for applicants to understand is clearly important.
  8. Silence is goodbye. We are sometimes asked to test (via mystery shopping or quantitative research) if open day, interview and enrolment practices are up to scratch. Where colleges most commonly ‘lose’ applicants is in the period between application and interview, when the responsibility for a prospective student is passed from (say) a central recruitment or marketing department to administrators in a school or course area responsible for booking interviews. Where there is a delay in processing an application (say, prior to interview) most students do not follow this up – they assume they have not got a place. Similarly, where there is no response after an interview, most will have been offered a place at another college or school, and they don’t chase the college in question either but apply elsewhere. The impact of poorly managed communications is clear.
  9. But goodbye may not be forever. In two separate studies this year (2014) we’ve asked non-enrolled applicants whether they would be interested in hearing about courses at the institution they rejected for another. In both cases a significant minority said they would. A majority of alumni, asked a similar question, were interested in further study. When applicants reject an institution in favour of another it does not necessarily mean they have a low opinion of that college – it may be a case of right place, wrong time. Or that the course they wanted to study was not available. These results also hint that FE alumni networks may be significant and (as yet) overlooked sources of recruitment.
  10. The timing of communications matters. The majority of our education research is undertaken among groups of students aged between 16 and 21. We have experimented with different methodologies depending on the client, the geography and types of students. Generally speaking, research is most fruitful when we’re contacting respondents by mobile phone between 5pm and 8pm. Where colleges are able to raise an expectation among student groups that they may be asked to take part in research, the response rate is (much) better. There are ramifications for data management and protection and communications planning here.

[1] ‘Parent Power Dominates Education Choices’ – Chartered Institute of Public Relations Education and Skills Group.

Why understanding what makes MPs tick might help repair our relationship with politics

When Ipsos Mori published its annual veracity index in February last year, politicians emerged as the least trusted profession. Just one in five Britons said they trusted MPs to tell the truth, a fairly damning judgement when you consider the importance of trust in relationships and reputation.

In the wake of the expenses scandals, Chris Huhne’s perversion of the course of justice, contemporary perspectives of Plebgate and (probably most significantly) the long tail of political scepticism in the UK and beyond, the results weren’t much of a surprise.

Indeed, Ipsos’ own social trends data shows that our suspicions simmered at the same level thirty years ago as they do today. However, if you look further back, our disenchantment with the political classes appears to have become more entrenched over time. In 1954, 38% of us thought that our MP was doing a good job. In 2012 that figure shrank to 15%, according to comparisons made in Peter Kellner’s excellent Democracy on Trial.

In Decca Aitkenhead’s recent interview with Rory Stewart, MP for Penrith, she wonders why on earth this man would want to work in the House of Commons. Here is a diplomat, adventurer, bestselling author, philanthropist and polyglot, to name a few of his achievements. On this evidence he is (almost brutally) honest. Like those politicians interviewed in Tony Russell’s Commons People he hopes for the bauble of a ministerial post. Yet he believes that ‘anyone running a small pizza business has more power’ than he does as a constituency MP and he works for people who, on the whole, wouldn’t trust him (at least in the abstract) as far as they could throw him.

Which begs a second, more useful question: why would anyone in their right mind want to be an MP in that environment, particularly someone demonstrably able to excel in other walks of life? In answer to which, you’ll find a morass of opinion. On one side, MPs are lazy, bossy, dogmatic, power-hungry and narcissistic, and the job description fits. On the other (less fashionable) side, MPs are motivated by high levels of energy, ideals of public service and an ambition to change people’s lives for the better.

There is little evidence available to silence the dull rumble of conjecture and supposition. While studies into political enfranchisement and participation among UK voter groups are relatively common, contemporary research into motivating factors among aspirant, current and former Members of Parliament is thin on the ground and primarily anecdotal.

Ipsos Mori investigated the changing demography of MPs after the 2010 election but haven’t studied MP motivating factors along these lines in recent memory, says research director Carl Phillips. YouGov, who interviewed over 5000 Britons as part of the Kellner study (and who, like Ipsos Mori, regularly survey MPs on behalf of clients), looked at what factors would deter members of the public from standing for election in a December 2012 study. Oliver Rowe, YouGov’s reputation research director, says that broader questions about MP motivations would be interesting and probably ones people would like to see answered. No member of the British Polling Council appears to have published a study of this type in the past decade*.

Democracies, so the saying almost goes, get the politicians they deserve. To what extent might our distrust of politicians deter aspirant MPs? How does it impact on the motivations of current incumbents, if at all? And how might this affect the types of people who want to become MPs and their performance in the role? Does our cynicism seal a vicious circle? A robust (anonymous) study among MPs, aspiring and past politicians would help us answer some of these important questions, replacing supposition with evidence.  Here’s hoping that the pollsters take up the challenge.

*I have tried my best to find them via an online and British Library search, and called up most companies to check, but if I’m wrong and there is something out there please let me know and I’ll blog about the results.

Why a change is not as good as a rest when it comes to education policy

Year on year the revolutions of educational change appear (at least to me) to spin faster, backwards and forwards, like an enormous washing machine. Centralisation, decentralisation, diplomas on, diplomas off, EMA to bursary and perhaps back again, three more types of colleges to add to the confusion,  hello quango, goodbye quango, new curricula, frameworks, white papers, funding streams, fee and floor targets, subsidies, tables, measures, rules.

Aside from how confusing all this must be for the lay-person and the boring but true fact that it takes years to see the impact of most policy changes (and so many seem to be changed before the effect can be felt, as if someone were impatiently switching between washing programmes), what’s the impact on people who work in schools, colleges and universities?

Last year a group interested in reputation and communications in further education (of which I was a member) commissioned a piece of research looking at what people who work in FE think of their jobs, the sector they work in and the people who lead it. We were particularly keen to understand whether employees were likely to act as advocates for their college and FE in general and, if not, why not.

The result was a forensic examination of what gets people out of bed in the morning to work in a college and what makes them want to stay under the duvet. One surprisingly – at least to me – common source of discontent was government policy. Or, more specifically, its fluctuations. Staff ‘are challenged by the changing nature of government policy [my italics] and the effect this has on working conditions and the quality of delivery to students’ according to the research (which included a survey of over 1300 staff and a complementary qualitative study). ‘Employees commented on leaders being overly concerned with government targets and being reactive rather than pro-active.’ Conversely, strong leaders were praised for ‘protecting’ their staff from the slings and arrows of policy changes.

Frederick Herzberg (who you can see doing his best impression of Columbo in this BBC 2 film) developed a theory of job satisfaction in the 1950s which split the needs of the working woman and man into ‘motivating’ and ‘hygiene’ factors. Hygiene factors – such as working conditions, salary, the way we are supervised – keep us from being unhappy (as Professor Herzberg says in the film). Motivating factors – responsibility, the meaning and significance of a role – make us ‘want to do it’.

The FE staff study suggests that there is a strong connection between rapidity of change in the context of  education policy and the stripping away of hygiene factors for people who work in that system, to such an extent that roles become denuded of meaning and, correspondingly, motivating factors.

The story for schools is not very different. This paper on ‘the emotional state of teachers during educational policy change’ presented at the European Conference on Educational Research in 2003 by Brigitte Smit ends with the warning:

“This inquiry revealed that educational policy change creates considerable uncertainty and even ambiguity among teachers. This was evidenced in teachers’ anxiety, professional isolation, and loss of connection and trust in the education system. If policy is serious about implementation, policy makers need to take cognisance of teachers’ emotional responses and dispositions towards educational change.”

With the advent of any new government there is a tradition of pejoratively labelling those resistant to change. They are the lump, the mass, the blob, them. There is also a cultural cliché of the moaning teacher, taking too many holidays, insufficiently progressive (whatever that means). And while there may certainly be more than a grain of truth in the assertion that professions (of whatever type) are inherently conservative, name-calling misses the point.

Even if every significant change to education policy over the past fifty years was benign, enlightened and conceptually the right thing to do at the time, those changes could still collectively amount to a list of mistakes as a result of their sheer ubiquity.

If what you are doing (or perhaps more tellingly, how you are doing it) de-motivates a significant proportion of that workforce then something is wrong, surely? A more useful question – if the impact of hasty change is generally accepted to be negative, why does it happen so regularly?

I’m not on first name terms with very many politicians (OK, none) but they don’t strike me as (generally) deluded or unintelligent. Nor do the people who advise them. But when they are proposing and promoting new policies do they sincerely believe that they are introducing a change for the better or are they rolling with the political cycle? This is a system that promotes differentiation and punishes stasis – the irrepressible one-upmanship of party politics, engineered by the need, desire (call it what you will) to be re-elected and powered by wonks, think tanks, columnists, editors, analysts, advisors and a cast of many thousands, including people like me who make a living helping institutions respond to change.

Or, to put it another way, if you put too many clothes in a washing machine they come out creased.

I’m not advocating the dislocation of the political from education policy; we spend so much on education as a state that this would seem at best naive. Change can, of course, be genuinely progressive – witness the Education Acts of 1902, 1918, 1921 and 1936, for instance, and the implausibility of Wackford Squeers in an era of Teaching Councils.

But we do need a better system for testing and regulating the impact of policy (per se) on the people who work in schools, colleges and universities as well as on pupils, students and their parents.

More on what research tells us about how policy change affects the reputations of education systems among the latter anon.