February 20, 2006
In our computer-driven, paperless-but-not, automated world, it seems like we should have all the data we need when making a decision. Signing up for new cellular service? Analyze your usage patterns for the past few months and then map this to different plans. Trying to predict product sales for the coming year? Just analyze past sales and figure out how to trend it.
The reality is, data is much harder to analyze than it seems like it should be. And, years before Staples introduced the “Easy Button” in their advertising campaign, I attended a conference on Web analytics where the presenter had a slide with a big red button on it that said “Analyze.” His point was that there is no such thing — not in Excel, not in more sophisticated tools like SAS and SPSS.
I constantly run into people, both at work and elsewhere, who have a need, pain, or problem that is somewhat defined, and their first instinct is to pull data for analysis. Time and again, I’ve watched data be pulled without a clear plan for using it. And, time and again, when the person looking at the data realizes that insights have not magically emerged, he asks for more data!
I’ve got a pretty large soapbox on this subject, and I’m destined to revisit it in greater detail. For now, I’ve got some targeted funding applications to review for a nonprofit I work with. The applications are just a half-dozen pages long. Frankly, based on past experience with this sort of thing, that’s about all I can wrap my head around, and I’d bet I make a pretty good call on them.