
Mailing List Mavens to Geospatial Gurus: Big Data in Market Research

Marketing Mavens of Old

Not so long ago, marketing research meant old-school tools and techniques. One current example from the global corporate pantheon is Experian. After acquiring TRW Credit, Experian became known in the US as a source of credit reports, but for most in marketing research, Experian once meant mailing lists. Today, list management probably brings to mind Constant Contact and Mailchimp, but their lists are generally web opt-in and membership lists. Before Mailchimp and Mad Mimi made the scene, list builders and brokers held the keys to a certain type of actionable market intelligence.

Before internet search, before Google AdWords, before SEO, even before websites, a primary source for direct marketing was lists of company names, people, addresses and phone numbers. But these lists were costly, produced relatively low yields, did not target with great specificity and, most of all, did not allow for prospect engagement. Product managers had to identify market opportunities, anticipate market size and the chances of success, and do it all with limited data.

So how have things changed?

Segmentation Sagas

A major factor in the change is expanded use of multiple data sources. In Spring 2014, Joel Rubinson wrote in the Green Book blog about Nielsen’s data matching capabilities:

Individually matching TV viewing, Facebook, digital clickstream, and radio listening to frequent shopper data. Nielsen has combined their audio panel with frequent shopper data via Catalina and their own Homescan panel.

Nielsen had been developing its Big Data sources (and cultivating big customers) since before anyone called it Big Data. So it was only a matter of time before Facebook signed its 2013 deals with Datalogix, Acxiom, Oracle’s BlueKai and Epsilon to leverage “off-Facebook” purchasing.
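The matching Rubinson describes boils down to joining disparate data sets on a shared household or person identifier. Here is a minimal sketch of that idea in Python with pandas; the column names and the tiny in-memory tables are hypothetical placeholders, not anything from Nielsen’s actual pipeline:

    import pandas as pd

    # Hypothetical viewing-panel records: which households saw a TV campaign.
    viewing = pd.DataFrame({
        "household_id": [101, 102, 103, 104],
        "campaign":     ["spring_promo", "spring_promo", "none", "spring_promo"],
    })

    # Hypothetical frequent-shopper records: what those households purchased.
    purchases = pd.DataFrame({
        "household_id": [101, 102, 102, 103, 104],
        "spend":        [23.50, 8.99, 14.25, 6.40, 41.00],
    })

    # Match ad exposure to purchase behavior on the shared household identifier.
    matched = viewing.merge(purchases, on="household_id", how="left")

    # Compare average spend for exposed vs. unexposed households.
    summary = (matched.assign(exposed=matched["campaign"] != "none")
                      .groupby("exposed")["spend"]
                      .agg(["count", "mean"]))
    print(summary)

In production this kind of match runs across millions of households and is usually brokered through a match partner, but the join-and-compare logic is the same.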

Traditional views of customer segmentation still support major brands, but there is a recognition that some segments move and shift rapidly. Data to support that movement must be disseminated quickly into organizations so that it can be rapidly converted to product or service design intelligence.

So what is on the horizon? Here are seven trends frequently mentioned:

  1. Adoption of standard practices to develop increasingly sophisticated customer clickstream models (see the sketch after this list)
  2. Competitive product / brand profiling, with allowances for shorter life-cycle churn
  3. Increased use of “first-party data”: deep analytics of current and past customers, approaching 1:1 profiling
  4. 360-degree prelaunch product / service feature-level market research, not limited to a few focus groups or small surveys
  5. Sentiment analysis beyond the classic Net Promoter Score, toward other metrics such as the ForeSee Word of Mouth Index
  6. Increased use of collateral data sets, especially geospatial and purchase history, but expanding (for some industries) into the Internet of Things
  7. Hybrid affiliate / partnership data sharing that cuts across traditional organizational and discipline boundaries, such as across a supply chain
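To make trend 1 concrete, here is a minimal sketch of sessionizing a raw clickstream, the usual first step before any customer clickstream model. The event fields, the tiny sample, and the 30-minute inactivity cutoff are illustrative assumptions, not a standard:

    import pandas as pd

    # Hypothetical raw clickstream: one row per page view.
    events = pd.DataFrame({
        "user_id": ["a", "a", "a", "b", "b"],
        "timestamp": pd.to_datetime([
            "2015-03-01 09:00", "2015-03-01 09:05", "2015-03-01 13:00",
            "2015-03-01 10:00", "2015-03-01 10:02",
        ]),
        "page": ["home", "pricing", "home", "home", "checkout"],
    }).sort_values(["user_id", "timestamp"])

    # Start a new session whenever a user is idle for more than 30 minutes.
    gap = events.groupby("user_id")["timestamp"].diff()
    new_session = gap > pd.Timedelta(minutes=30)
    events["session_id"] = new_session.groupby(events["user_id"]).cumsum()

    # Per-session features a segmentation or clickstream model might consume.
    sessions = (events.groupby(["user_id", "session_id"])
                      .agg(pages=("page", "count"),
                           start=("timestamp", "min"),
                           reached_checkout=("page", lambda p: "checkout" in set(p))))
    print(sessions)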

Geospatial Gurus

Today, market research is available for hire. Like Gartner and Forrester in information technology, industry specialists provide results tailored to industries they know well. Even as Oracle BlueKai promotes its general-purpose Audience Data Marketplace as “DaaS for Marketing,” outfits like Health Market Science are betting that pharma is more likely to outsource market research to a specialist that speaks its language.

Google Earth tutorial catering to geospatial market analysis (Image: Google)

A 2013 Microsoft white paper aimed at the oil and gas industry identifies new applications for Big Data in that sector: equipment maintenance, production / price optimization, and improved use of weather and environmental data streams to strengthen safety records. How is the oil and gas industry to accomplish this? Microsoft cites Big Data’s general-purpose resources: real-time analytics, complex event processing, improved business processes and self-service business intelligence.

By contrast, Bain goes to the core of the matter: recommending data-driven subsurface geology analytics for individual well performance tuning, live 3D imaging over fiber to improve well delivery performance, and Big Data for operational intelligence in well spacing. Bain’s pitch to this industry doesn’t skip over data warehouse fundamentals, observing that many oil and gas systems lack metadata on wells, machinery and drilling operations.
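As a rough illustration of the well-spacing piece, the sketch below computes pairwise great-circle distances between wells from latitude/longitude coordinates and flags pairs tighter than a chosen threshold. The coordinates and the 1 km cutoff are made-up placeholders, not Bain’s methodology:

    import math
    from itertools import combinations

    # Hypothetical well coordinates: (name, latitude, longitude) in degrees.
    wells = [
        ("W-01", 31.9686, -102.0779),
        ("W-02", 31.9702, -102.0751),
        ("W-03", 31.9900, -102.1100),
    ]

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in kilometers."""
        r = 6371.0  # mean Earth radius, km
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    # Flag well pairs spaced more tightly than an arbitrary 1 km threshold.
    for (n1, la1, lo1), (n2, la2, lo2) in combinations(wells, 2):
        d = haversine_km(la1, lo1, la2, lo2)
        if d < 1.0:
            print(f"{n1} and {n2} are only {d:.2f} km apart")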

Bain suggests a two-track approach: make incremental improvements in legacy applications while investing in Big Data infrastructure, as when:

one leading international oil company implemented a sophisticated Hadoop analytic platform on Amazon Web Services’ cloud infrastructure and limited the touchpoints with its legacy technology infrastructure – a strategy that helped keep it cost-effective and agile.
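One hedged reading of “limited touchpoints” is that the legacy estate exposes nothing more than a periodic file export, which the cloud environment pulls and processes on its own, along the lines of the sketch below. The bucket, key, and column names are hypothetical, and AWS credentials are assumed to be configured in the environment:

    import boto3
    import pandas as pd

    # Hypothetical S3 location where the legacy system drops a nightly extract.
    BUCKET = "example-analytics-landing"      # placeholder bucket name
    KEY = "legacy/wells/daily_extract.csv"    # placeholder object key

    # The single touchpoint with the legacy estate: pick up its exported file.
    boto3.client("s3").download_file(BUCKET, KEY, "/tmp/daily_extract.csv")

    # Everything downstream runs in the cloud, untangled from legacy systems.
    # Hypothetical columns: well_id, reading_time, output_bbl.
    df = pd.read_csv("/tmp/daily_extract.csv", parse_dates=["reading_time"])
    daily_output = (df.groupby(["well_id", df["reading_time"].dt.date])["output_bbl"]
                      .sum()
                      .rename("daily_output_bbl"))
    print(daily_output.head())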

The Geospatial Gurus are coming. They may already be at work in your enterprise. As Peter Daboll wrote in Forbes, “much of the trend and pattern identification is already done by the time it gets to them.” They’re just too busy slicing, dicing, predicting and segmenting to tell you about it.

One way to keep it “cost-effective and agile” is to consider pairing Syncsort’s Ironcluster ETL with Amazon EC2.
