What "Working" Actually Means

This post originally appeared in my Substack newsletter, The Work Behind the Work.

Every marketing team I’ve worked with has had this meeting.

It happens at the end of the month or quarter. Someone assembles a report – slides, charts, maybe a dashboard screenshot – and presents the numbers: impressions, engagement, clicks, new followers, emails sent, open rates. The report looks complete; the data appears accurate. Managers nod along, and then somebody asks the question that makes everyone fall silent:

“So, is it working?”

The pause is the giveaway. The report can prove that marketing happened and that the team was busy, but it often can’t show – and, honestly, most marketing reports aren’t designed to show – whether any of it made a difference.

This is the outcomes problem: not a shortage of data, but a shortage of clarity about what the data is supposed to mean.

Photo by Jon Tyson on Unsplash

The activity trap

A lot of marketing measurement is, in essence, activity logging dressed up as performance. The team ran three campaigns, published 40 social posts, sent 12 emails, and produced a new product spec. Done, done, done, done. The work got done.

But “the work got done” is an execution metric, not an outcomes metric. It tells you the machinery was running. It doesn’t tell you it was running in the right direction.

I’ve fallen into this trap myself. Early in my career, I judged success by output – how many things we shipped, how fast we shipped them, how polished they were. And those things do matter. But they are inputs, not outcomes. A flawlessly executed campaign that doesn’t change anything meaningful is still a campaign that didn’t work; it just looked good while failing.

The activity trap is seductive because activity is easy to measure and easy to take credit for. Outcomes are harder. They show up more slowly, are harder to attribute, and are often shared across teams. Which brings me to the uncomfortable truth most marketing measurement is built to avoid.

The attribution trap (and why it’s a distraction)

Attribution is the holy grail of marketing measurement, and I believe the obsession with it does more harm than good.

The promise is simple: if we can trace every dollar of revenue back to the marketing touchpoint that created it, we can prove ROI and optimize accordingly. The reality is messier. Buyers don’t move in straight lines. A prospect might see your ad, read your blog post six weeks later, get a recommendation from a colleague, stop by your trade show booth, and then take a sales call – and the CRM gives credit to whichever touch happened last.

This isn’t a technology problem. It’s a complexity problem. And the more energy teams spend trying to perfect attribution, the less energy they spend on what actually matters: figuring out whether marketing is moving the business forward and what to do next.

I’m not saying don’t measure. I’m saying the goal of measurement should be learning, not credit. The question isn’t “which campaign gets the revenue?” The question is “what is working, what isn’t, and what should we do more of?”

What “working” looks like

If we look past what merely looks good and stop trying to tie every dollar spent precisely to a result, what should we actually be looking for? Here is what I think:

Is the pipeline moving? Marketing doesn’t close deals – that’s sales’ job. But marketing should make deals easier to close. Are qualified leads entering the pipeline? Do they arrive better informed? Is the sales cycle getting shorter? If you’re launching new products, do the right people know the right details before the salespeople call? These aren’t numbers you’ll find on a social media dashboard, but they are the ones the business cares about.

Is demand growing? Not clicks, not impressions – actual demand. Are more people searching for you? Is website traffic trending up? Are you getting more demo requests, more invitations to bid, more people at your booth who already know who you are? Demand builds slowly and is hard to attribute to any single campaign, which is exactly why so many teams underinvest in it. But it’s the clearest sign that marketing is working at a fundamental level.

Is the brand stronger than it was six months ago? This is the hardest item on the list to measure, and I’m including it deliberately. Brand strength shows up in places no dashboard captures: the quality of candidates you attract, whether prospects take your meetings, whether analysts and journalists know who you are, whether customers recommend you unprompted. You can’t easily put a number on it, but you can feel the difference between a brand that is known and one that isn’t. And if your marketing is working, the brand should be getting better known.

Are you learning? This is the one I think people miss most. Every campaign, every launch, every quarter should leave the team knowing something it didn’t know before. Which messages landed and which fell flat? Which channels delivered and which were just noise? What worked at the trade show, and what was a waste of booth space? If the team finishes a quarter and can’t say what it learned, the measurement isn’t doing its job – no matter what the numbers say.

The reporting problem

Most marketing reports are activity narratives: here’s what we did, how many people saw it, and how it compares to last period. That’s useful information, but it isn’t an outcomes story.

An outcomes story answers a different set of questions: What were we trying to achieve? What happened? What did we learn? What will we do next?

The difference sounds subtle, but it changes everything about how a marketing team talks to leadership. Activity reports put marketing on the defensive – here’s proof we were busy. Outcomes stories put marketing in a strategic position – here’s what we learned and here’s what we recommend.

I’ve sat in rooms where a marketing team presented a report full of good numbers and leadership walked away unconvinced. And I’ve sat in rooms where the numbers weren’t great, but the team told a clear story about what they learned and what they would do differently, and leadership walked away confident that marketing was headed in the right direction.

The numbers matter. But the story matters more.

Building outcomes into the work

If outcomes feel like something you bolt on at the end of the quarter – something you scramble to prove – the fix isn’t better reports. It’s building outcomes into the work from the start.

This comes back to clarity. If the brief defines what success looks like before the work begins, measurement isn’t something you figure out afterward. It’s part of the plan. Was the campaign designed to drive demo requests – did it? Was the launch meant to build awareness with a specific type of buyer – did that buyer’s interest rise? Was the social strategy meant to support the sales team – are they using the content, and is it helping?

When you define outcomes up front, you give yourself permission to measure what matters and ignore what doesn’t. You stop producing reports that prove marketing happened and start producing insights that tell you what to do next.

That’s what outcomes are really for. They aren’t an ending; they’re what makes the next round of understanding sharper. That sharper understanding makes the next round of execution more focused. And that, in turn, drives the next cycle of progress.

The cycle keeps turning when your measurement helps you learn, not merely confirms that you did something. A marketing report that only proves marketing happened isn’t a report; it’s an invoice.

Next

Why Your Marketing Resets Every Quarter (and How to Stop It)