2016 Data Strategy Checklist for Digital Marketers

Originally published on the PMG Blog on December 1, 2015

Congratulations! You’re just wrapping up a (hopefully) successful 2015. You’re also well into 2016 planning. Based on what we’re hearing and seeing, 2016 seems to be the year that advertisers will put data strategies in the lead, more so than in any year we’ve seen. These data strategies can take the form of new kinds of CRM activation, new ways to visualize holistic business performance, attribution, and advanced data activation.

Silicon Valley has also noticed this trend. Most notable is the growth of cloud-based data services such as Amazon Web Services, Google and Microsoft Azure. There is also a renaissance in data discovery tools, from larger companies such as Tableau to upstart dashboarding companies such as Beckon, Origami Logic and Domo.

You can boil this down to one simple reason: there is simply so much data in an advertiser’s hands. Having all of this data feels like a good thing, but in practice it is one of the digital industry’s biggest sources of stress on an organization and its leadership. That stress is the realization that you have an underutilized asset with a high perceived cost and complexity to turn into a competitive advantage.

Insights and data activation (not just the data itself) are the currency that powers today’s digital media channels. Without insights and the ability to activate data into media channels in a curated and automated manner, advertisers will have that persistent feeling of being left behind.


How usable is my data?

This is the eternal question of whether the data is clean and can be trusted, and it is indeed the biggest question to answer.

To answer it, an advertiser needs to conduct a comprehensive audit. PMG approaches a client data audit as a clear breakdown of dimensions (the columns in a data table) and measures (the numbers and attributes that populate those columns) across your key data sources, showcasing errors visually to convey the gravity of each error and the priority its resolution should be assigned.

At its simplest, an audit comprises three pieces:

– Dimension Map – A tree hierarchy of key dimensions to identify mapping errors or points of inconsistency. This highlights areas of improvement in the data system being audited or in the trafficking process when advertisements go out the door.

– Measure Audit – This is where we take measures that appear across multiple data systems (such as sales) and compare the totals (see the sketch after this list). The common misconception is that these numbers SHOULD total up to the same amount. The reality is that this is rarely the case. Simply put: different data sources see the world in different ways. It is vital to have experience in your organization or in your data services partner to understand the nuance and know whether an inconsistency is expected or is an actual problem.

– Resolution Plan – The plan addresses how you turn unclean legacy data into usable data and how to keep the data clean going forward. What may surprise you is that this is not always as complicated as it seems. Understanding the error in the data typically leads to a method of repackaging legacy data to make it usable. It’s never “what’s done is done, so let’s move on.” We’ve also found that automated processes or a process playbook (with alerting when errors are generated) will typically address data cleanliness going forward.
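
To make the measure audit concrete, here is a minimal sketch of the totals comparison described above. The source names, columns, and the 5% tolerance are illustrative assumptions, not PMG’s actual audit logic.

```python
# Hypothetical measure audit: compare a shared measure ("sales") across two
# data systems and flag days where the totals diverge beyond a tolerance.
import pandas as pd

ad_server = pd.DataFrame({
    "date": ["2015-11-01", "2015-11-02", "2015-11-03"],
    "sales": [1200.0, 950.0, 1100.0],
})
site_analytics = pd.DataFrame({
    "date": ["2015-11-01", "2015-11-02", "2015-11-03"],
    "sales": [1150.0, 940.0, 720.0],
})

audit = ad_server.merge(site_analytics, on="date", suffixes=("_adserver", "_analytics"))
audit["pct_diff"] = (audit["sales_adserver"] - audit["sales_analytics"]).abs() / audit["sales_analytics"]

# Different sources see the world differently, so some divergence is expected;
# only rows outside the tolerance warrant investigation.
TOLERANCE = 0.05  # an assumed threshold; tune to what your systems normally disagree by
print(audit[audit["pct_diff"] > TOLERANCE])
```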


How much labor is involved in utilizing my data?

Efficiency is the key when it comes to utilizing your data. The saying “too many chefs in the kitchen” is extremely applicable here. If your data automation is lacking, you will simply need more bodies to do the work, and the more bodies you have, the more likely you are to create divergent processes and manual errors. Investing in a solution to automate your data is therefore key to data survival.

At PMG, we’ve made a multi-million-dollar investment and dedicated significant time and resources to our homegrown, proprietary data platform, which automates data from our clients’ disparate systems: more than 20 platform-specific data sources such as Google Analytics, Facebook Atlas, and Adobe Marketing Cloud, with built-in data blending and standardization. Our combined data output is then easily accessible by platforms such as RStudio and Tableau. We’ve also constructed this data platform on a scalable, cloud-based foundation so that we can scale up or down to best suit our clients’ needs and provide them with enterprise-grade reliability.
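
The standardization step such a platform performs can be sketched simply: map each source’s native column names onto one shared schema, then stack the results so downstream tools see a single table. The mappings and field names below are hypothetical, not our platform’s actual schema.

```python
# Hypothetical schema maps from two sources' native field names to a shared schema.
import pandas as pd

SCHEMA_MAPS = {
    "google_analytics": {"ga:date": "date", "ga:sessions": "visits", "ga:transactionRevenue": "revenue"},
    "facebook": {"date_start": "date", "clicks": "visits", "purchase_value": "revenue"},
}

def standardize(df: pd.DataFrame, source: str) -> pd.DataFrame:
    out = df.rename(columns=SCHEMA_MAPS[source])[["date", "visits", "revenue"]].copy()
    out["source"] = source  # keep lineage so blended rows stay traceable
    return out

ga = pd.DataFrame({"ga:date": ["2015-11-01"], "ga:sessions": [5000], "ga:transactionRevenue": [5200.0]})
fb = pd.DataFrame({"date_start": ["2015-11-01"], "clicks": [1800], "purchase_value": [2100.0]})

# Stack the standardized sources into one blended table.
blended = pd.concat([standardize(ga, "google_analytics"), standardize(fb, "facebook")], ignore_index=True)
print(blended)
```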

If PMG is not your agency, don’t worry (well, maybe worry just a little). There are many emerging and established solutions in the market. If you’re a small or medium-sized business, DataHero is a great, inexpensive solution for automating data from sources such as Salesforce, Google Analytics and Marketo, with automated data visualization built into the product. For marketing organizations in large enterprises, solutions such as BIME and Beckon offer access to APIs and dashboarding services. It’s getting easier every day to automate and utilize your data. So don’t just have a plan to get to your data. Have a plan to use it to the fullest.


How impactful is my data?

This is where the art and science come together. The opportunity to make data impactful is significantly aided by data cleanliness (trust) and automation (ease of data sourcing). The rest is left to nuance and context. The digital marketing technology landscape is extremely fragmented, and a commoditized view of the technologies that create the data can be a huge pitfall for advertisers.

Understanding the ins and outs of data sources leads to unique ways to blend different data sources (joining or stacking complementary data sources together) to create unseen opportunities or identify areas of improvement.
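
As a toy illustration of blending, here is a join of a media dataset with a complementary non-marketing dataset (invented weather readings) that can surface a relationship neither source shows alone. All figures are made up.

```python
# Hypothetical blend: join daily search performance with weather by date.
import pandas as pd

search = pd.DataFrame({
    "date": ["2015-11-01", "2015-11-02"],
    "spend": [500.0, 500.0],
    "orders": [40, 22],
})
weather = pd.DataFrame({"date": ["2015-11-01", "2015-11-02"], "high_temp_f": [48, 74]})

blended = search.merge(weather, on="date")
print(blended)  # flat spend, but orders move with the weather
```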

This nuance and context also enable statistical modeling. Because models typically need to be contextually fit to be useful, understanding the source data is imperative if statistical modeling is to become a competitive advantage.
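
A toy example of what “contextually fit” means in practice: the same spend data supports very different conclusions depending on whether a promotion period is modeled explicitly. The numbers below are invented for illustration.

```python
# Fit sales against spend with and without a promotion flag, via least squares.
import numpy as np

spend = np.array([10, 20, 30, 40, 50, 60], dtype=float)
sales = np.array([100, 180, 260, 520, 610, 700], dtype=float)  # a promo ran in the back half
promo = np.array([0, 0, 0, 1, 1, 1], dtype=float)

# Naive model (sales ~ spend): the promo lift gets misread as media efficiency.
naive_slope, _ = np.polyfit(spend, sales, 1)

# Contextual model (sales ~ spend + promo) separates the two effects.
X = np.column_stack([spend, promo, np.ones_like(spend)])
(slope, promo_lift, _), *_ = np.linalg.lstsq(X, sales, rcond=None)

print(f"naive spend coefficient:      {naive_slope:.1f}")
print(f"contextual spend coefficient: {slope:.1f} (promo lift: {promo_lift:.1f})")
```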


Lastly, data visualization is an iterative and contextual game. While the large-scale dashboarding companies provide efficiency value that can be easily quantified, we have found that your data visualization and your digital strategy should be highly interconnected.

This is why our data team at PMG has seen the success and generated the client satisfaction that it has over the past 18 months: nuance, context and a keen understanding of the media programs we operate for our clients (and of the rare few that we don’t).

To account for this nuance and context, it is imperative that you have a data lead on your team who has a comprehensive understanding of your business and is highly experienced in the nuances of digital media. That will ensure whatever solution you choose generates not just efficiency, but impact.

In summary, get ahead of your data challenges in 2016. Once you do, your stress level will drop and your optimism will grow. And isn’t that what we’re all looking for as we ring in the new year?


Facebook's Atlas (Re)Launch and 6 Points to Consider

[Originally published on the PMG Blog on October 6, 2014]

You may have seen yesterday that Facebook brought Atlas out of the skunk works to reemerge as a revamped competitor in the digital ad delivery space. As this conveniently surfaced during Adweek, there will no doubt be momentum behind it, for a couple of reasons:

  • We digital marketing dinosaurs fondly remember Atlas’ heyday as a best-in-class ad server. It will (hopefully) be great to see another major player in the ad delivery market. Press announcements evoking fond nostalgia never hurt anybody.
  • The tracking methodology being used by Atlas will hopefully usher in a new period of coexistence between the privacy-focused consumer and the data-hungry advertiser.

For now, let’s focus on the second point. The tracking methodology serving as the primary mechanism (based on news reports) is geared towards Facebook’s omnipresent view of an online user’s “Sign-In State.” This Sign-In State Tracking (SST) standard is exciting because it not only moves marketers away from an antiquated cookie standard of tracking, but also hopefully gives the consumer greater control over what they want to share, if they choose to share any information about themselves at all.

Marketers now have a more reliable source of cross-device tracking of their customers and prospects, as it is extremely rare for a consumer to A) not be on Facebook (Facebook reaches 83% of all online users per comScore) and B) not be actively logged in via phone, desktop or both. As there are very few entities that have this kind of SST scale plus the ability to execute ad delivery (Google and Facebook), and a few who may be hunting for an ad server to match their SST scale (Apple, Twitter and potentially Adobe), this simplifies the game…to a point.

Unless the future SST players band together to create a standardized data exchange, there will be a new market for data “last milers” who want to bridge these data sets. This is potentially a growth opportunity for DMPs and the evolving tag management companies that are chasing the elusive “unified customer ID” for advertisers and then porting that ID to data players. Nevertheless, this market will come together and create greater transparency than the existing 3rd-party (and perhaps 1st-party) cookies provide.

For consumers, trying to opt out of tracking today is either a web browser settings exercise (not fun for the less tech-inclined) or a negotiation directly with data brokers (less fun than a root canal). SST opens up the game for privacy control by simplifying who the primary holders of your data are. As consumers are already training themselves on privacy controls in social media and across the Google ecosystem, this blends well into their current privacy protection behavior.


This should help soothe tensions in the great privacy debate, so long as the SST players adhere to the mantra “with great power comes great responsibility.” Here’s your chance to be benevolent and just overlords of my data.

Facebook already looks like it sees the bigger opportunity, with announcements of partnerships with optimization platforms like Kenshoo and Marin and with agencies. From the views of the user interface in the press, it also looks like the UI received a major overhaul, which hopefully means we may see some new ad server innovation beyond just look and feel.

For advertisers, there is a lot to consider here. But there is also much more to hear before you start your ad server shopping, such as:

  • While the privacy changes in iOS 8 are not as strong as predicted, does the SST methodology outweigh the planned focus on making mobile users anonymous at the hardware level, and can it provide true cross-channel attribution?
  • What are the plans of the other potential SST players such as Google?
  • What are the plans for those on the SST fringe (SST scale but no ad delivery of substance) such as Adobe and Twitter?
  • Will Apple get into the game or is Tim Cook’s stance in the press lately a sign that Apple will bow out of this data arms race?
  • Beyond the tracking methodology innovation, how does Atlas stack up to other ad servers, considering its many years of relative dormancy from a roadmap perspective?
  • How will existing analytics leverage SST, and how long will it take for the information to bubble into analytics packages such as Adobe and IBM, or into attribution platforms such as VisualIQ, Convertro, and Adometry by Google? I have a feeling that the last one may be covered if Google enters the game as expected.


Long story short, advertisers should get excited about the wave of innovation to soon come…but you probably want your procurement folks to stay on Alert 5 for the near term until some of these key questions have solid answers.

eTail

It was a great few days at eTail West in Palm Springs. I am always humbled to be in the same setting with so many of the best minds in retail strategy, marketing, and technology. I especially appreciated my fellow panelists on the “Adding Value To Your Business And Getting Started With Attribution Modeling To Allocate Dollars And Resources Most Effectively” panel. Thank you @brandonproctor, @eskolfield, and to Mike Sands from BrightTag for his excellent moderating and for his microbrew analogy of attribution.

Advertisers...Understand Your Options With Tracking

While I would not consider myself a technical wizard by any stretch, 14 years in digital advertising has forced me to get my hands dirty and understand the methods of tracking online users. This is to ensure that the brands I have represented are aware of issues near and dear to consumer privacy advocates.

Recently, the FTC settled with online advertising company ScanScout regarding its use of Flash cookies. While Flash cookies may carry more nefarious aliases such as “zombie cookies” or “supercookies,” they are essentially data files associated with Adobe Flash, the popular web multimedia platform typically associated with web video, animation, and interactive applications. These data files act like standard cookies, but since they do not live in the same location as standard cookies, they are not removed, either manually or automatically, by typical web browser controls. They can also be utilized to respawn standard cookies that were deleted at some earlier point. Only recently have web browsers been able to identify and manage them in a similar manner to standard cookies.
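
The respawning behavior is the crux of the complaint. Here is a schematic sketch of the mechanism; the function and field names are invented for illustration, not taken from any actual tracker.

```python
# Schematic of cookie "respawning": if the standard cookie was deleted but the
# Flash-side copy survives, the tracker silently recreates the identifier.
def handle_request(browser_cookies: dict, flash_storage: dict) -> dict:
    if "tracking_id" not in browser_cookies and "tracking_id" in flash_storage:
        # The user cleared their cookies, but the Flash object was untouched,
        # so the old identifier comes right back.
        browser_cookies["tracking_id"] = flash_storage["tracking_id"]
    return browser_cookies

# A user who deleted all browser cookies is re-identified on the next request.
print(handle_request({}, {"tracking_id": "user-1234"}))
```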

In non-technical speak, Flash cookies give advertisers, networks, and tracking mechanisms the ability to follow online users while limiting those users’ control over their own privacy, since the cookies cannot easily be removed. For those that see the situation as such…it’s the equivalent of putting a “kick-me” sign on somebody’s back while that somebody wonders why they’re getting so much attention. I speak for many when I say that just ain’t right, and it’s not good for our industry.

While Adobe has been making strides to resolve this use of Flash, there are always going to be those who continue to push the envelope on tracking methodologies, which ultimately are not in the control of online users.  This type of behavior is what will continually push legislators to take a more heavy-handed approach to regulating online privacy.  In essence, the cops will break up the party before it even gets off the ground. 

An example is the utilization of fingerprinting. Fingerprinting leverages information published by the browser, such as IP address, browser version and time zone, to create a “fingerprint” of the online user’s machine with the intent of re-identifying them at a later date. To be fair, this methodology seems like a challenge, and I respect the technical prowess of an organization that can deploy it in a scalable and accurate form. I have heard arguments from both sides pertaining to the accuracy or inaccuracy of fingerprinting, especially over lengthy periods of time. So, I feel the jury is out on this methodology from an accuracy perspective.
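
To show why fingerprinting leaves the user nothing to delete, here is a bare-bones sketch of the idea. Real implementations draw on many more signals (fonts, plugins, screen size); the attributes and hashing choice here are illustrative assumptions.

```python
# Hash attributes the browser already exposes into a single stable identifier.
import hashlib

def fingerprint(ip: str, user_agent: str, timezone: str, language: str) -> str:
    raw = "|".join([ip, user_agent, timezone, language])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()[:16]

visit_1 = fingerprint("203.0.113.7", "Mozilla/5.0 (Windows NT 6.1)", "UTC-6", "en-US")
visit_2 = fingerprint("203.0.113.7", "Mozilla/5.0 (Windows NT 6.1)", "UTC-6", "en-US")
print(visit_1 == visit_2)  # True: the same machine re-identifies itself with no cookie to clear
```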

In my mind, regardless of the level of accuracy, you are essentially tracking consumers without their permission or ability to opt out. As the information is based on standard communications between the web browser and the website, the user will not (at least not easily) have control over it, and this will provoke similar resistance from consumer advocacy groups, leading to either legislation or the elimination of passing information from the browser to the site. Now, the issue with the latter is that websites would no longer know browser type, location, OS, etc., and would thereby present a very impersonal experience to users. That will not make a web operations group very happy.

As an advertiser, it is imperative that you request documentation from your partners on how they collect information on users and what users are able to do in order to opt out of tracking. If it appears that users do not have control over the tracking, it is best to err on the side of caution. Why? Do a search on Google for “flash cookie lawsuits” and that should be all the evidence you need. While some of these cases have fallen apart, it is nevertheless bad press for the advertisers involved.

Lastly, it is imperative that advertisers realize that the digital world is not 100% trackable. Following that line of thinking, the end solution will likely encompass standard tracking alongside quantitative approaches (similar to media mix modeling in the traditional media world) to ensure that the “untrackable” is accounted for. This obviously makes the end solution less simple to construct, as the skillset becomes highly specialized. However, user control of privacy is paramount, and advertisers have the responsibility to drive their dollars towards providers utilizing responsible tracking mechanisms.
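
As a back-of-the-envelope illustration of blending tracked data with a modeled estimate of the untrackable, tracked conversions can be scaled by a coverage rate estimated from holdout or panel studies. The 70% coverage figure below is a made-up assumption.

```python
# Scale observed conversions by an estimated tracking coverage rate.
tracked_conversions = 4200
estimated_tracking_coverage = 0.70  # assumed share of conversions the tags actually see

total_estimate = tracked_conversions / estimated_tracking_coverage
untracked_estimate = total_estimate - tracked_conversions
print(f"Estimated total conversions: {total_estimate:.0f} (~{untracked_estimate:.0f} untracked)")
```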

If you are an advertiser, be wary of “too good to be true” tracking methodologies that are outside of the average online user’s control. Either the leak will be plugged at some point or, in the worst-case scenario, you could be seen as an advertiser uncaring of consumer privacy. In the court of public opinion, the “I was not aware” defense does not carry much weight in the area of consumer privacy.