Friday, October 8, 2010

My First Four Analytics Lessons

I had a good meeting the other day with a very smart friend of mine (I have lots of friends who are way smarter than I am -- it's a secret to success). This buddy began his career as a graphic designer and is now a Brand Strategist for a well-known consumer packaged goods (CPG) company here in town. My intent was to pick his brain about his career -- what kind of training he needed to do his job, what he looks for in a job opportunity, what the economics of his segment are, how one might enter this segment, etc.

But our conversation, interestingly, swiftly turned towards analytics. I mentioned my interest in visual analytics and described what the field is. He was intrigued. We talked a lot about consumer marketing campaigns and how to measure their effectiveness, particularly when you start rolling social media into the mix. How do you know your marketing spend mix is optimized? How can you measure its impact when we're talking about hearts and minds?

I won't recount the minutiae of the conversation, but I did walk away with several insights that I want to record for later. The early lessons are often the truest.

  • You need data in order to create analytics. OK, this sounds self-evident, but in some cases (e.g., social media) gathering the data and its relationships might be the hardest part of the process. You have to get it, you have to taxonomize or structure it, and you have to be able to navigate it heuristically.
    Lesson: There is almost certainly significant value in understanding which data to collect, how to get it, and how to normalize/structure it prior to analysis. That might be a sub-specialty in itself. (A rough sketch of such a normalization step appears after the footnote below.)
  • How do you know if you have bad data? If Data + Analytics + Visualization = Insights, then what happens if the data is just "bad" (e.g., unrepresentative of the population at large, or falsified)? And how do you know whether your resulting conclusions are wrong? I strongly suspect this is a sub-field of statistics that I've not yet encountered; companies like Gallup and Nielsen MUST have solved this by now.
    Lesson: Don't forget about the credibility of the data. (The second sketch below shows one simple statistical sanity check.)
  • The analytics process is context sensitive. While I know there are platforms out there for visualizing complex data sets, I believe the context of the associated business decision alters the entire process significantly. Let me give you two examples: Retail-to-Consumer and Internal Security & Compliance. In the first example, you want to understand and measure the impact of marketing campaigns on your bottom line or, failing that, on some metric that indicates the strength of your brand. The data you capture and the analysis you perform will revolve around that business logic. For Internal Security & Compliance, you're measuring your security posture, your compliance status, and, possibly, the financial risk that each brings to your bottom line. Again, the data and how you manhandle it will vary significantly from the first case, as will the timing of the analyses: the marketing data might be updated hourly while the security and compliance data might be updated quarterly.
    Lesson: Work top-down, not bottom-up. Let the business question drive the data and the analytics; don't let the available data dictate which questions you are answering. (The third sketch below illustrates the idea.)
  • Product Managers are going to need analytics skill sets. I predict that, within five years, the truly effective product managers will be the ones with strong data analytics skill sets. This is not the case today[1], but I'm seeing a lot of companies opening up analytics positions for functions that are product-manager-ish in nature. In fact, I wonder whether the need for data analytics might make current product management approaches obsolete. Any good PM is always looking to make his or her product obsolete, and the same should be true of his or her current skill set. What's gonna replace us?
    Lesson: See that data analytics meteorite? It's coming for you, dino-boy. Start thinking mammal.
[1] The sad truth is that there is NO objective, a priori measurement of Product Management effectiveness today. In other words, when you're hiring a product manager, there's really no good way to know whether he or she is any good. This is not true for sales, engineering, marketing, finance, or executive leadership. In fact, the job of PM tends to vary significantly from company to company and, when hiring, the first question I always ask is "What did 'Product Management' mean at your last job?" Maybe the ability to collect, normalize, analyze, and make sense of large quantities of raw data will become the bar that PMs have to clear to demonstrate basic competence. Right now, it's frosting and not the cake.
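
To make the first lesson concrete, here is a minimal sketch (in Python) of the normalize-before-you-analyze step. Everything in it -- the sources, the field names, the common schema -- is a made-up assumption for illustration; real feeds (tweets, blog comments, survey forms) each arrive in their own shape.

    # Minimal sketch: collapse differently shaped raw records into one schema.
    # All sources and field names here are hypothetical.
    from datetime import datetime, timezone

    def normalize_mention(raw, source):
        """Map one raw record from a given source into a common schema."""
        if source == "twitter":
            return {
                "source": source,
                "author": raw["user"],
                "text": raw["status"].strip().lower(),
                "timestamp": datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
            }
        if source == "blog_comment":
            return {
                "source": source,
                "author": raw.get("name", "anonymous"),
                "text": raw["body"].strip().lower(),
                "timestamp": datetime.fromisoformat(raw["posted_at"]),
            }
        raise ValueError("unknown source: %s" % source)

    # Two differently shaped records, one schema -- ready for counting,
    # filtering, and visualization downstream.
    mentions = [
        normalize_mention({"user": "@jane", "status": "Love the new flavor!",
                           "ts": 1286500000}, "twitter"),
        normalize_mention({"body": "Where can I buy this?",
                           "posted_at": "2010-10-07T14:30:00+00:00"}, "blog_comment"),
    ]
    for m in mentions:
        print(m["source"], m["timestamp"].isoformat(), "-", m["text"])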
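
For the second lesson, one standard sanity check on data credibility is a chi-square goodness-of-fit test: does the sample's demographic mix match the known population mix? The counts and proportions below are invented for illustration, and this catches only one flavor of "bad" (unrepresentative samples); falsified data needs other tools.

    # Minimal sketch: test whether a survey sample's age mix matches the
    # population's. All numbers below are invented.
    from scipy.stats import chisquare

    observed = [180, 420, 260, 140]              # sample counts: 18-24, 25-39, 40-59, 60+
    population_share = [0.15, 0.35, 0.30, 0.20]  # assumed census proportions

    total = sum(observed)
    expected = [p * total for p in population_share]

    stat, p_value = chisquare(f_obs=observed, f_exp=expected)
    print("chi-square = %.1f, p = %.4f" % (stat, p_value))

    # A tiny p-value means the sample's mix differs from the population's,
    # so conclusions drawn from it need reweighting or at least a caveat.
    if p_value < 0.05:
        print("Sample looks unrepresentative -- treat conclusions with care.")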
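
And for the third lesson, a toy illustration of working top-down: write the business question first and let the data sources, metrics, and refresh cadence fall out of it. The two contexts and their fields are assumptions, not a real schema.

    # Minimal sketch: the business question drives the data and the cadence.
    from dataclasses import dataclass, field

    @dataclass
    class AnalyticsPlan:
        business_question: str
        data_sources: list = field(default_factory=list)
        key_metrics: list = field(default_factory=list)
        refresh_cadence: str = "quarterly"

    retail = AnalyticsPlan(
        business_question="Which campaigns are moving the revenue needle?",
        data_sources=["point-of-sale feed", "campaign spend", "social mentions"],
        key_metrics=["revenue lift per campaign", "brand sentiment"],
        refresh_cadence="hourly",
    )

    security = AnalyticsPlan(
        business_question="What is our compliance posture, and what financial risk does it carry?",
        data_sources=["audit logs", "policy scans"],
        key_metrics=["open findings", "estimated exposure ($)"],
        refresh_cadence="quarterly",
    )

    # Same pipeline shape, very different data and timing -- because the
    # question, not the available data, determined both.
    for plan in (retail, security):
        print(plan.business_question, "->", plan.refresh_cadence)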
