Data Just Is
Data is not an answer. It is not proof. It is not direction. It is not a plan. It is not a strategy. Data just is.
Data is a Tool.
On its own, it is not a solution, an explanation, or an excuse. Data is a term used interchangeably with information and results, both of which it is, but on its own, it’s useless. It’s quite like a hammer. I have a hammer. It does not make me a carpenter. It doesn’t mean I know how to use it. In the wrong hands, or in the hands of an uncoordinated or inexperienced user such as myself, danger and accidents will likely ensue.
The same goes for data.
Like a hammer, data can provide results: insights, overviews, trends, any number of historical analyses, and opportunities for predictive modeling. A hammer can build a house or a masterpiece of art. Without knowledge, awareness, or talent, both can destroy.
The biggest mistake people make with data is reviewing it in a vacuum. Without understanding what influenced the snapshot in question, the data is effectively useless for the desired analysis, the overview and outlook on performance.
One Snapshot Doesn’t Tell the Full Story
I worked in an organization that brought in new leadership to help out, as some of the brands under the umbrella were faltering. They pored over data in their own world, excluded from outside influences, and emerged, bleary-eyed, with their plan. With no insights or background, they reviewed it cold. And based on their analysis, they proudly announced that despite these being retail brands, this space did not have any real Black Friday seasonality.
The data showed it. Even historical data (two years of it) was compared and reviewed. No uptick, nothing; in fact, it showed a downturn during that time period. Therefore, despite objections and at the dismissal of suggestions and advice, a new plan would be put in place to push sales later in the season, to capture that audience.
It failed. Common sense would have suggested it would. A toddler would have suggested it would. Cash-infused egos, armed with data as “proof,” knew better. Had they thought to consider external influences, factors, or behaviors, or had they even ASKED a question, especially considering the outcome was so diametrically opposed to the expectation, they would have known there were significant issues at play for two of the brands at that time. And yes, it happened two years in a row. Significant technical issues that brought the servers down.
The audience, in fact, does follow the standard retail trend of significant Black Friday shopping. But our brands now faced a third year in a row of not being at the party. Pushing our efforts later in the season, of course, also did not help: the buyers had bought during the Black Friday weekend hullabaloo and so were not interested in the later offerings our new consultants planned out. (Our audience, while they love a deal, did not rely on one. It was the experience of the season that got them to buy then, and we were not able to offer them that. The rest was custom work, which does not lend itself to fast and flashy sales late in the season!)
The data was right all along. Because it is always right. It is exactly what it says it is: a snapshot of historical activity. Our brands had not experienced an uptick in sales during an expected shopping season over the previous two years. The data was right. The analysis was wrong, because there wasn’t any. They never asked why. They used the data as the analysis in and of itself.
That’s not how data analysis works. But that’s too often how it’s done.
There is no why inherent in data.
What are some of your data fail stories?