“Nature abhors a vacuum.” Whether you first heard this phrase from Jeff Goldblum in the 1993 “Jurassic Park” film, or you actually remember your professor crediting Aristotle in your 200-level philosophy course, there are parallels to what we are experiencing in adtech.
The business corollary is that markets abhor inefficiency. Markets, like nature, seek ways to bring more efficiency and transparency to processes.
As this evolution happens in our industry, it would be wise to remember that the value proposition behind programmatic advertising was to bring advertisers an efficient application of data for targeting, with the promise of measurement. Programmatic is an innovation that brought great value and efficiency to a previously arcane process marked by Yellow Pages subscriptions and hanging digital flyers on the homepages of portals.
And while we are still inclined to pat ourselves on the back for our innovation, it would be pragmatic to start listening to the concerns being expressed about the state of the system we have created.
Any of us who have been to an industry conference or met with a brand advertiser recently have heard the growing complaints over the rising costs of the adtech stack. Popularly referred to as the “adtech tax,” it is liberally associated with the rising costs of media platforms, ad servers, data, first-party onboarding, and fraud and viewability monitoring. And of course, there is still the attribution and measurement problem, which remains unsolved and sticks out like a sore thumb.
Yes, there are likely too many hands in the pie, and yes, the path to market efficiency will involve consolidation. But we should be wary that, as that process evolves, we don’t paper over our original value proposition. It would be wise to lean into innovation to fix our shortcomings and to protect the core values of data transparency, optimization and measurement.
We should also guard against the dilution of targeting and measurement by black-box automation solutions touting “AI” labels or boiler room campaign operation schemes. Solutions that are more opaque and that dilute those core values are not the right answer.
One way to innovate around the overpopulation of the ad stack is to bring efficiency back to digital advertising through attribution and measurement. The attribution and measurement conversation has been years in the making. The walled gardens, device deprecation and the cookieless future managed to throw some shade on (that is, provide cover for) the issue for the last few years. However, the acceleration of connected TV and omnichannel digital advertising has once again exposed measurement as a key issue. Attribution and measurement need to go mainstream.
But, that will require overcoming three key challenges.
The first is uniform coverage across all platforms – postal, email, display, native, mobile, social and television.
The second is timing. Each platform in the advertiser’s omnichannel journey is standalone. Different platforms and devices require different match capabilities. Some are browser- or publisher-based, some have device identifiers and some don’t. Online, offline, living room or mobile phone – all of the touchpoints count and all have to be supported.
And the third is determining where in the stack this capability should live.
Focus on the audience
Standalone attribution solutions have been tried, but they require negotiating data ownership and collection issues that typically arise post-campaign. Knowing the path the horse took after it escaped the barn only works for really big-ticket advertisers.
Bringing multiple platforms’ exposure files together at a media-platform level is completely impractical. We are in an era in which walled gardens stand in the way, and CTV matching has neither cookies nor universal adtech IDs to lean on.
The promise of a true universal cookie-free identity remains 50% press release and 50% fact – a reality that, even if the platforms played well together, is too inefficient for omnichannel measurement.
This leaves us with a third option as the focal point of measurement – the target audience. Whether it occurs as an extension of onboarding or as a feature of the customer data platform (CDP), the logical focal point for measurement and optimization is to measure at the audience level.
Interestingly enough, the most serious impediment to fulfilling the promise of targeting across platforms for the entire market is time and money. Done right, an advertiser should know the next day exactly who in their target audience has been served and who hasn’t. They’d know how many times a record was served and on which platform. They’d know which platform delivers for which audience profile and which creative, so that audiences can be reconfigured to meet demand.
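In data terms, that next-day readout – who was served, who wasn’t, how often, and on which platform – amounts to joining exposure logs against the target audience and aggregating at the record level. Here is a minimal sketch; the audience IDs, platform names and log layout are all hypothetical, not any vendor’s actual schema:

```python
from collections import Counter, defaultdict

# Hypothetical target audience and exposure log: (audience_id, platform).
target_audience = {"a1", "a2", "a3", "a4"}
exposures = [
    ("a1", "ctv"), ("a1", "display"), ("a2", "ctv"),
    ("a2", "ctv"), ("a5", "display"),  # a5 falls outside the target audience
]

# Keep only exposures that match a record in the target audience.
matched = [(aid, p) for aid, p in exposures if aid in target_audience]

# Who in the audience was served, and who wasn't.
served = {aid for aid, _ in matched}
unserved = target_audience - served

# Frequency per audience record, and reach per platform.
frequency = Counter(aid for aid, _ in matched)
platform_reach = defaultdict(set)
for aid, platform in matched:
    platform_reach[platform].add(aid)

print(sorted(served))       # ['a1', 'a2']
print(sorted(unserved))     # ['a3', 'a4']
print(frequency["a2"])      # 2
print({p: len(ids) for p, ids in platform_reach.items()})
```

The point of the sketch is that once measurement is anchored to the audience record rather than to each platform’s exposure file, the per-platform and per-creative breakdowns fall out of the same aggregation.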
There is a lot of upheaval, and many questions are being asked, about the adtech tax and proof of performance in omnichannel. If optimization were a predictable result of measurement services, and it happened in near real time, the value of data would be far more obvious, and attribution and measurement would be far more likely to attract investment.
Ray Kingman is the CEO and co-founder of Semcasting, a data-as-a-service (DaaS) provider. He leads the company in the development and commercialization of its automated targeting and data offerings. As an experienced innovator in content management, analytics and data visualization fields, Ray directs the day-to-day operations of Semcasting. His extensive experience working in the marketing and advertising industry field allows him to speak confidently on matters of consumer data privacy, identity resolution, brand empowerment, customer acquisition and digital and online marketing.
If you like this article, sign up for the SmartBrief on Social Business email newsletter for free.