12 December 2023

What so many of us get wrong about research

By Tracey Wallace

It’s a problem most research reports have: they don’t really tell you anything new.

That’s a crazy statement, right? Because by its very definition, research is done to tell you something new, to give you datapoints to back it up, to be compelling, and to break through the clutter of ish on the internet.

And still, most reports just reiterate the data back to you. In fact, here’s a recent one from LinkedIn. It’s a report on AI, and yet, reading it, there’s very little new to take away. Sure, there are datapoints, and you might be able to use them to create your own story, often confirming your own bias, but I doubt that was the goal for LinkedIn.

What they didn’t do was use this data to tell a bigger story about why AI matters, and what the data suggests will happen next (or might currently be happening and we just don’t see it yet).

Go ahead, read it. You’ll see that most paragraphs summarize the data found in the charts. They state the percentage and tie it back to an overarching theme––but nothing really drives the conversation forward or takes a new stance. At best, it backs up what we already know; at worst, it leaves you thinking: “So, what?”

There is a CTA at the end on how LinkedIn can help––and perhaps that is their answer to “So, what?”

But I’d argue that LinkedIn, and every company that does research like this (whether it’s their own data or commissioned), was looking to use this report to help strengthen their status as a thought leader. Having access to, and presenting, data on topics relevant to your audience does just that.

But it’s also really important that you take a stance. Why? Well, because:

  • It makes the content a whole hell of a lot more interesting. If you’ve done data visualization well, people can look at a chart and gather the summaries for themselves (that’s the point of data visualization). Your copy should add to that data, and make it clear to the reader what it means, why it matters, and what comes next.
  • It positions you as the thought leader––not just a company doing a research piece, which so many do these days, but one that is willing to put a stake in the ground, to analyze the data, and then do the work to take it a step further and add to the conversation. This might mean you remove data from the piece. This might mean you rework the visualization. This might mean you need to get CEO approval and align with your brand team on overall messaging. It also means you are offering something unique and new to the audience––not just a nicely packaged data dump.
  • It makes sales enablement much easier. A data dump isn’t an easy thing for sales teams to use with prospects. Sure, it gives them data to pull from, but data doesn’t tell a story on its own. And salespeople are in the business of stories. Position your research right and it’ll be easier to sell it to your sales team, have them remember it, and have them use it in outreach to prospects (all helping your team have a greater impact on the bottom line).

Why is all of this coming up now? Well, because my team and I got caught in the data-dump mindset with our BFCM report a few weeks ago. And why wouldn’t we? We had six hours from getting the final data in to analyze it all, write about it, and turn around all the graphics and GTM assets to support it. And those six hours kicked off at 4 p.m. Central time.

But it was my CMO who pulled up the doc on a call and said: “Can it be more of a story?” And she was right. We were regurgitating the data on the page in each section, very eloquently describing what the chart already clearly showed. We weren’t alone. A ton of research reports do this.

And beyond a time crunch, the other reason reports often end up like this is that content folks don’t have permission to write a story and connect the larger dots. The brand in general is too afraid to take a stance. And once you get into the habit of doing work like this, well, it’s hard to break it.

But our CMO did break it. She gave us permission to put a story to the data, to look at it more holistically and ask, “What the hell does this even mean? What is the data telling us?” And then to tell that story for our audience so they didn’t have to ask themselves those same questions.

Here is what we ended up with. 

But weeks before this report, I was inspired by OGM’s research and their storytelling around it. In it, Nigel, the author, doesn’t regurgitate a single stat. He leans heavily on the charts to tell the story for him, and draws conclusions based on that data all around them. It’s masterful. It’s helpful. It’s interesting.

And it’s how we should all be doing our reporting on research.