Making Marketing Training Work: Closing Skills Gaps, Proving Value

Does investing in employees’ marketing skills pay off? Or is it just a waste?

Businesses spent nearly $90 billion on corporate training in 2018, according to Training magazine’s annual industry report. Per employee, that spending ranged from $1,046 at large companies to $1,096 at smaller organizations.

Within marketing departments, an estimated 3.9% of the total marketing budget now goes to training programs, up from 2.7% in 2014. The Association of National Advertisers’ CMO Talent Challenge Playbook highlights success stories from marketing training investments.

What about your team? Are you spending too much? Too little? Are you training the right skill sets? Is training a good investment? A wasteful expense? Do you even know?

The State of Marketing Training

Some 23% of marketers surveyed by HubSpot identified “training our team” as a top challenge. “Hiring top talent” was immediately below, at 22%.

Companies trying to hire their way out of a skills gap face a competitive marketplace. LinkedIn’s May 2018 Workforce Report revealed a 230,000-person shortage in the United States for marketing skills, with demand highest in major cities.

That tight talent market has pushed training to the forefront, especially training to develop new marketing capabilities. The biannual CMO Survey anticipated a 6.5% increase in investment for “how to do marketing.” No other increase in marketing knowledge development—such as the transfer of internal knowledge or honing of market research skills—topped 3.9%.

As a segment of the training market, however, marketing lags behind the stalwarts of the industry—sales and leadership training:

Search volume for “marketing training” has consistently trailed other disciplines. Source: Google Trends.

Still, marketing’s share of the corporate training budget is significant: $17 million annually for large companies (10,000+ employees), per Training magazine. Mid-size companies (1,000–9,999) spend an average of $1.5 million per year; small companies (100–999) invest around $375,000.

Those figures are supported by similar findings from a 2016 Brandon Hall Group Benchmarking Study, which surveyed training spending for equivalent tiers: $13 million (10,000+ employees), $3.7 million (1,000–9,999), and $290,000 (100–999).

In recent years, most of that money has tried to close a single marketing skills gap: digital.

A Skills Gap Marketers Don’t Know They Have

The Digital Marketing Institute’s 2016 report “Missing the Mark: The Digital Marketing Skills Gap in the USA, UK & Ireland” lays bare marketers’ shortcomings. Only 8% of those tested achieved entry-level digital marketing skills, and the perception of skill exceeded performance: 51% of U.S. marketers perceived a skill level that only 38% demonstrated via testing.

A 2018 analysis of client data by General Assembly—which has benchmarked more than 25,000 marketers with its “Digital Marketing Level 1” skills assessment—found no correlation between seniority and expertise (among those below the vice-presidential level), and cited “data and measurement” as the biggest skills gap.

“It’s not uncommon for us to hear, ‘We don’t know what we don’t know,’” noted Alison Kashin, an Education Product Manager at General Assembly who focuses on digital marketing training. Kashin elaborated:

Most corporate marketers have outsourced digital execution to agencies, and clients now realize they’re too far removed to be effective. It’s hard to give direction, ask the right questions, or make confident decisions if you don’t know how something works.

Marketers’ Self-Inflicted Wound

The yawning skills gap is, in part, self-inflicted. As the Digital Marketing Institute’s report notes, “The general consensus among employees is that the pace of technological and digital change within their organizations is too slow, and that factors such as a fear of loss of control, especially among employees aged 35–49 years, is hindering its adoption.”

The push to close the skills gap also has the potential to create tension with agency partners, who at times transfer knowledge that reduces the need for their services. As Rhea Drysdale, CEO of Outspoken Media, explained:

Companies want to train their team so they can handle more internally, and that makes sense. They see our work as a means to an end. More often than not, that end is team growth.

“This exact scenario happened last year with an enterprise-level professional services company,” Drysdale continued. “Our advocate went from managing one person to a dedicated team that included a data person, an SEO, an editor, and developers. We’re still working with them but as a consultant on project scopes.”

Digital marketing isn’t the only skills gap disrupting the industry, in-house and agency alike.

Further Fronts in Marketing Training

Niches like account-based marketing (ABM) have seen rapid growth in recent years as well.

“The top question we get around education and training development is account-based marketing,” stated Rob Leavitt, Senior Vice President of the Information Technology Services Marketing Association (ITSMA). “There is a hunger and demand for ABM, and it’s far beyond us.”

Search volume for “account based marketing” began a dramatic rise in 2016. Source: Google Trends.

Leavitt believes ABM training has been a reaction to the digital wave, which can confuse interested individuals with interested accounts:

If I download four whitepapers to understand something relevant to my client, I look really interested—but I’m not a relevant account for you. So how do we take what we’ve learned in digital and overlay an account-based strategy and approach?

At times, the skills gap comes full circle. Just as experienced marketers may hesitate to invest themselves in digital, newer marketers, Leavitt cautioned, risk undervaluing traditional skill sets: “More experienced people feel more comfortable with soft skills—collaboration, leadership, teamwork, etc.”

For every marketer, there’s need. For every facet of marketing, there’s training. But can training close the skills gap?

Does Marketing Training Work?

Few executives know.

In Learning Analytics: Measurement Innovations to Support Employee Development, authors John Mattox, Jean Martin, and Mark Van Buren reveal that, when it comes to training, some 96% of CEOs want to measure one aspect more than any other: impact.

How often is it being measured? Just 8% of the time. Another 74% of CEOs want to connect money spent on training to money earned—the return on investment (ROI). It’s measured just 4% of the time.

Measuring the business impact of training is possible. But individual knowledge gains don’t guarantee company-wide improvements.

Recognizing the Limits of Training

On August 8, 1963, a band of 15 robbers stole £2.6 million in cash from a mail train traveling between Glasgow and London. Media outlets dubbed the heist “The Great Train Robbery.”

In 2016, Harvard Business School (HBS) Professor Michael Beer and TruePoint researchers Magnus Finnström and Derek Schrader reappropriated the moniker to allege a similarly monumental fraud: “The Great Training Robbery.”

Despite the ominous title, the authors are less critical of training programs per se than the “fallacy of programmatic change,” which mistakenly focuses on individual behavior change as a way to shape institutions. Their findings suggest the inverse is true:

The organizational and management system—the pattern of roles, responsibilities and relationships shaped by the organization’s design and leadership that motivates and sustains attitude and behavior—is far more powerful in shaping individual behavior.

Evaluating the Corporate Climate

Additional work by another HBS professor, Amy Edmondson, distills the prerequisites for effective training programs down to a single metaphor: the need for a corporate climate to provide “fertile soil”—a psychologically safe environment in which subordinates can voice opinions freely. Only fertile soil, in turn, can allow the “seeds” of individual training to germinate.

Beer et al.’s work found that just 10% of training programs surveyed had the fertile soil necessary to derive value from training. Too often, they lament, the rush to invest in individual training protects obdurate executives—or the HR representatives who would need to confront them—rather than addressing core organizational or leadership issues.

Those findings align with Kashin’s experience working with clients at General Assembly:

There are layers of team structures, technology, planning processes, etc., that need to be re-examined to be successful in digital. Most corporate programs have an element of change management. The most success occurs when we support a larger change-management effort that has been set in motion with strong internal leadership.

Measuring the Success of Marketing Training

Even with strong organizational support, how do you know if a marketing training program works?

“It usually looks like ‘program success,’” according to ITSMA’s Leavitt. “Clients look at basic satisfaction with the education training: Did it seem like a good use of time? Have we been able to develop the program and succeed? Are we hitting our targets?”

For Kashin, numbers are only part of the picture: “At the core of every one of our success stories are individuals who were motivated to learn and change, and highlighting their stories is one of our most powerful and rewarding ways of showing value.”

“The reality of a lot of these programs,” Leavitt summarized, is that “education training is hard to measure. A lot of it is qualitative, informal. We know it when we see it. We’ve not cracked the code.”

Jack Phillips believes he has. Phillips, an expert on determining the value of training programs with a doctorate in Human Resource Management, is chairman of the ROI Institute:

We don’t like ‘estimates,’ but our choices are to do nothing or claim it all. Neither one is any good. Quantitative data is more believable. Executives understand it quite clearly. Our challenge is to make and defend credible estimates if quantitative data isn’t available.

That combined measurement—exhausting quantitative data sources while communicating qualitative ones persuasively—begins with the identification of KPIs.

Identifying KPIs for Marketing Training

“Sales and marketing tend to have the same metrics,” explained Phillips. “Increase existing customers, acquire new customers, increase client quality, etc.”

(A scan of marketing-specific KPIs highlighted by training firms also reveals a list of familiar metrics: number of qualified leads generated, cost per qualified lead, marketing staff turnover rate, and marketing staff productivity.)

When attempting to identify KPIs, a common mistake is not translating a problem into its underlying metric. For example, “poor copywriting” may be a marketing problem, but improvement can’t be measured unless marketing executives identify an underlying business metric—like conversion rate—that can show the effects of successful training.
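
For illustration, here is a minimal sketch of that translation; the traffic and conversion figures below are hypothetical, not from the article’s sources:

```python
# Minimal sketch (hypothetical numbers): translating a "poor copywriting"
# problem into its underlying business metric, conversion rate.

def conversion_rate(conversions: int, visitors: int) -> float:
    """Conversions as a fraction of visitors."""
    return conversions / visitors

before = conversion_rate(conversions=320, visitors=16_000)  # pre-training period
after = conversion_rate(conversions=410, visitors=16_400)   # post-training period

print(f"Before training: {before:.2%}")                     # 2.00%
print(f"After training:  {after:.2%}")                      # 2.50%
print(f"Relative lift:   {(after - before) / before:.0%}")  # 25%
```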

According to Phillips, identifying KPIs is far easier than parsing the influence of factors that may affect them: What if an improvement to an ad campaign drives more qualified visitors to a landing page? Or a recent website redesign increases site speed?

Isolating the impact of marketing training, Phillips asserted, is the key to unlocking assessment methods that can demonstrate ROI. Still, the math can quickly become complex, and measurement itself carries a cost: on average, companies spend just 4% of the total training budget on measurement, and most spend less than 1%.

Many models, Phillips’ included, outline progressive levels of measurement to help companies scale accountability based on resources.

The Phillips Measurement Model

The methodology behind the Phillips Measurement Model. Source: Phillips, Jack. Measuring the Success of Sales Training.

Phillips uses a five-level model (an optional sixth level assesses intangible values—job satisfaction, organizational commitment, teamwork, etc.):

  1. Reaction: Did participants like it?
  2. Learning: Did they learn from it?
  3. Application: Did they apply their new knowledge on the job?
  4. Impact: Did the training have a business impact?
  5. ROI: What was the value of that impact, and was it a good investment?

While authoritative, Phillips’ model is not the only one. Models by Donald Kirkpatrick and Josh Bersin are also widely used. (General Assembly uses a version of the Kirkpatrick model.) The Kirkpatrick model allows for immediate post-training measurement, while the Bersin model folds values such as efficiency and utility into Phillips’ approach.

Levels 1–3: Generating a Baseline Measurement

The initial levels of measurement include assessments such as post-training surveys to measure trainee satisfaction as well as tests or instructor evaluations to measure knowledge transfer.

Phillips believes the first two levels are sufficient for a baseline measurement of knowledge transfer. Additional levels of measurement connect training outcomes more closely with business metrics and monetary returns, but those insights come at a cost.

Kashin concurred: “Measuring behavior change and business impact is something we always encourage, but it requires a fair bit of investment on the client side.”

Measuring behavior change (“Application” in the Phillips model) also requires a time lapse—Phillips suggests three months—but can be a simple retest of training knowledge or a follow-up survey about trainees’ perception of its enduring value.

Levels 4–5: Bridging the Gap between Training Costs and ROI

To complete a five-level measurement with “Impact” and “ROI,” companies must identify a business outcome (e.g. web leads), assign it an accurate monetary value (e.g. dollar value of a web lead), and isolate the impact of training from other factors.

Phillips offers quantitative and qualitative options to isolate the impact of training:

Quantitative

To identify outside variables that affect progress toward business metrics, Phillips leans on experts within the organization.

Qualitative

If, say, a digital marketing training program, an online advertising campaign, and a website redesign all launched in the past three months, ask marketing staff to weight the effect of each and multiply each weighting by their confidence in it, as in the sketch below.
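
As a minimal sketch of that arithmetic: assume a lift of 200 qualified leads per month, with the shares and confidence levels below (all of these figures are hypothetical, for illustration only):

```python
# Minimal sketch (hypothetical numbers): confidence-adjusted attribution
# of a lift in qualified leads across three concurrent initiatives.
monthly_lead_lift = 200  # observed increase in qualified leads per month

estimates = [
    # (initiative, estimated share of impact, confidence in that estimate)
    ("digital marketing training",  0.50, 0.80),
    ("online advertising campaign", 0.30, 0.70),
    ("website redesign",            0.20, 0.60),
]

for initiative, share, confidence in estimates:
    adjusted = share * confidence          # discount each estimate by confidence
    leads = monthly_lead_lift * adjusted   # leads credited to this initiative
    print(f"{initiative}: {adjusted:.0%} of lift, ~{leads:.0f} leads/month")
```

Because the confidence discount shrinks every estimate, the adjusted shares sum to less than 100%, which keeps the credit claimed for training deliberately conservative: here, 40% of the lift, or about 80 leads per month.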

As Phillips argued:

When you combine these estimates from a group of people, they become powerful. Our effort is always to go to the most credible process first. If we can’t use a mathematical approach, we’ll use estimates—and we’ll defend them.

Time and again, Phillips has seen the “confidence” adjustment account for human error effectively. (Phillips cited Jack Treynor’s jelly bean experiment as corroborating evidence.)

“The key is to ask the right person and collect it the right way,” Phillips explained. Finding the “right” person or conducting a survey the “right” way is open to interpretation. But, Phillips is adamant, it is no less necessary:

You have to do it. You can’t just say, ‘We’ll take full credit for it,’ and life is good. Executives will require you to sort it.

Translating the business impact into ROI requires two additional steps:

  1. Converting the business impact to a monetary value (e.g. the dollar value of a whitepaper download)
  2. Determining how many dollars are returned above and beyond the initial investment in marketing training

Importantly, an ROI calculation differs from a benefit-cost ratio in format (percentage versus ratio) and formula (subtracting program costs from benefits):

How ROI differs from a benefit-cost ratio. Source: Phillips, Jack. Measuring the Success of Sales Training.
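
A minimal sketch with hypothetical cost and benefit figures shows how the two calculations diverge:

```python
# Minimal sketch (hypothetical figures): benefit-cost ratio (BCR) vs. ROI.
program_costs = 50_000       # fully loaded cost of the training program
monetary_benefits = 120_000  # business impact converted to dollars

bcr = monetary_benefits / program_costs                          # a ratio
roi = (monetary_benefits - program_costs) / program_costs * 100  # a percentage

print(f"BCR: {bcr:.1f}:1")  # 2.4:1 -- benefits divided by costs
print(f"ROI: {roi:.0f}%")   # 140% -- net benefits divided by costs
```

In this example, the same program returns $2.40 in benefits for every dollar spent (the benefit-cost ratio) but a 140% return above and beyond the initial investment (the ROI).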

Even without a complete ROI calculation, assessing the “business impact” of training—when supplemented with a list of intangible benefits—can be a powerful defense at multiple levels within an organization, c-suite included.

Conclusion

Need alone—digital marketing skills today, account-based marketing skills tomorrow—may continue to fill training budgets and grow training programs. Measurement challenges will not fix or excuse skills gaps in marketing departments.

“To some extent,” Leavitt concluded, ruminating again on the question of ROI, “when your clients come back for more, they’re happy with what they got the first time.” It’s a purely qualitative measurement.

Still, robust, quantitative ROI models, though more persuasive in the c-suite, lean on qualitative components, too. All measurements can be defended; all surpass a failure to measure anything at all.

No assessment, however, can answer other, broader questions: Is training currently the best use of marketing resources? Does the commitment to change extend to the highest levels of the organization?

In short, is it the right season? Is the soil fertile? If yes, then plant the seed. And grow.
