Big Data & What it Means For Today’s TV Industry

Since I began working at Bluefin Labs, the industry that has grown up around big data has fascinated me. I wrote this blog post for Bluefin a couple of weeks ago, summarizing what big data is and how it's affecting the social TV industry today. Check it out!

What’s the Big Deal with Big Data?

These days it seems like you can’t check Twitter or browse news articles without coming across the term “big data” at least a couple times.

But big data isn’t exactly new, so why all the hubbub now? Well, for starters, the scale of data collection today is of legendary proportions.

To call what’s happening a torrent or an avalanche of data is to use entirely inadequate metaphors. This is a development on an astronomical scale. And it’s presenting us with a predictable but very hard problem: our capacity to collect digital data has outrun our capacity to archive, curate and – most importantly – analyse it.

– John Naughton, “Why Big Data is now such a big deal”, The Guardian

When I say legendary, I mean it – we’ve had to adopt a whole new set of prefixes to describe the volume of data being collected, each a thousand times larger than the last: terabytes, then petabytes, exabytes, zettabytes and yottabytes!
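To put those prefixes in perspective, here’s a quick back-of-the-envelope sketch (just an illustration; the byte counts use the standard decimal definitions of the prefixes):

```python
# Rough sense of scale: each prefix is 1,000x the one before it (decimal SI units).
UNITS = ["terabyte", "petabyte", "exabyte", "zettabyte", "yottabyte"]

bytes_per_terabyte = 10**12  # 1 TB = 10^12 bytes

for i, unit in enumerate(UNITS):
    size_in_bytes = bytes_per_terabyte * 1000**i
    print(f"1 {unit} = 10^{12 + 3*i} bytes ({size_in_bytes:,} bytes)")

# 1 yottabyte = 10^24 bytes, i.e. a trillion terabytes.
```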

Ok, we’ve got a boatload of information. So…now what? John Naughton again:

Hidden in those billions of haystacks there may be some very valuable needles. We saw a glimpse of the possibilities when Google revealed that by analysing the billions of queries it handles every hour it could predict flu epidemics way ahead of conventional epidemiological methods.

Retailers such as Walmart, Tesco and Amazon do millions of transactions every hour and store all the data relating to each in colossal databases they then “mine” for information about market trends, consumer behaviour and other things. The same goes for Google, Facebook and Twitter et al. For these outfits, data is the new gold.
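To make Naughton’s flu example a bit more concrete: the basic idea was to fit search-query volumes against officially reported flu incidence, then use that fit to estimate current activity before the surveillance reports arrive. Here’s a minimal, toy sketch of that kind of analysis (the numbers are made up purely for illustration):

```python
import numpy as np

# Toy data: weekly counts of flu-related search queries (in thousands)
# and the flu incidence rate later reported by health agencies.
query_volume = np.array([12, 15, 22, 30, 41, 55, 48, 35])              # hypothetical
reported_flu_rate = np.array([1.1, 1.3, 1.9, 2.6, 3.5, 4.7, 4.1, 3.0])  # hypothetical

# Fit a simple linear model: flu_rate ~ a * query_volume + b.
a, b = np.polyfit(query_volume, reported_flu_rate, 1)

# Official surveillance data lags by a week or two; search data is immediate,
# so the fitted model can "nowcast" flu activity from this week's queries.
this_weeks_queries = 60
estimated_flu_rate = a * this_weeks_queries + b
print(f"Estimated flu rate this week: {estimated_flu_rate:.2f}")
```

Google’s real system reportedly fit many query terms against official surveillance data rather than one toy series, but the principle is the same: the searches act as a leading indicator.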

Mr. Naughton may not be exaggerating about data being the new gold. In a GigaOM post, Robert Greene notes:

According to a 2011 study by the Aberdeen Group, organizations that effectively integrate complex data are able to use up to 50 percent larger data sets for business intelligence and analytics, to integrate external unstructured data into business processes twice as successfully, and to slash error incidences almost in half. The connection between a company’s success and its ability to leverage big data is very clear.

The new gold, indeed. But let’s circle back to one of Mr. Naughton’s initial observations: “it’s presenting us with a predictable but very hard problem: our capacity to collect digital data has outrun our capacity to archive, curate and – most importantly – analyse it.”

In an Ad Age blog post, CRM expert Hal Brierley calls this an “Achilles heel of [the advertising/marketing] business: we don’t have enough specialized knowledge to adequately draw insights from all the data we have. Frankly, we are amid a talent crisis.”

On Monday, eMarketer released a report outlining how marketers are struggling to connect their digital data to the larger big data picture.

“Without the ability to integrate big data collection and usage processes, companies are certain to fall short in delivering a truly personalized customer experience integrated across ad formats and channels. Such a mandate is of significant importance as consumers increasingly interact with brands across multiple channels and screens.”

Continue reading on the Bluefin blog…