Is SPO the silver bullet to programmatic’s transparency challenges?
The ongoing header bidding revolution might be bringing joy to publishers, but it’s giving the buy side of programmatic a bit of a headache.
While the technique allows demand-side platforms (DSPs) to help advertisers access a wider range of inventory, this amplified scale is creating issues. Not only does the bid duplication header bidding generates mean DSPs must deal with higher volumes of requests, but greater adoption is disrupting auction mechanics and leading supply-side platforms (SSPs) away from the traditional second-price model. As a result, programmatic processes are becoming muddied, limiting transparency, especially for advertisers and agencies.
Yet the good news is a new hero has arrived in ad tech town that promises to reorder the market by simplifying interactions and boosting clarity: supply-path optimisation (SPO).
Or has it?
SPO: a brief overview
Firstly, SPO doesn’t refer to a single, optimally efficient method that everyone should follow. It describes the process of streamlining connections with SSPs, where the details of implementation depend on what a specific DSP wants to achieve. Objectives are diverse, ranging from identifying the SSPs that offer the most relevant bids to cutting ties with those using certain auction models, such as first-price.
What are the benefits?
Despite its myriad forms, the driving goal of SPO is the same: keeping the process fair and inventory prices down for the buy side, and maximising revenue for the sell side. To understand how it can do both, we need to explore the situation as it stands.
Where once a DSP might have handled 1.3 million impressions per second, mass adoption of header bidding means the figure is now closer to 4 or 5 million, and this is putting pressure on DSP resources. Not only is there a cost to responding to each request, but many platforms lack the technical capacity to deal with impressions at such scale. Consequently, some are dropping requests that exceed their limits, which lowers participation rates and makes them less attractive to SSPs. Simultaneously, with SSPs increasingly using first-price models, where buyers pay their full bid amount, impression prices are rising. And as winning bids go up, so does the cost per acquisition (CPA), meaning buyers looking for the lowest cost will divert spend. There are also the issues of potential bid duplication, fraud, and latency caused by long and complex supply chains.
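The pricing difference described above can be made concrete with a minimal sketch. The bid values and function names here are illustrative assumptions, not figures from any real platform; the point is simply that under first-price rules the winner pays their full bid, while under second-price rules they pay just above the competition, which is why rising winning bids push up CPA.

```python
# Illustrative sketch: how the clearing price differs between auction models.
# All bid values below are hypothetical CPM figures in dollars.

def second_price_clearing(bids, floor=0.0):
    """Winner pays the higher of the runner-up bid and the floor price."""
    ordered = sorted(bids, reverse=True)
    runner_up = ordered[1] if len(ordered) > 1 else floor
    return max(runner_up, floor)

def first_price_clearing(bids):
    """Winner pays exactly what they bid."""
    return max(bids)

bids = [2.40, 1.10, 0.95]  # hypothetical competing bids for one impression

print(second_price_clearing(bids))  # prints 1.1 (pays just above the runner-up)
print(first_price_clearing(bids))   # prints 2.4 (pays the full bid)
```

The same winning advertiser pays more than double under first-price rules, which is the buy-side cost pressure the paragraph above describes.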
Enter SPO. Most versions of the process are automated, using algorithms to gather information, such as win patterns and traffic, and select the SSPs that maximise the chances of realising desired outcomes, although in some cases analysis and changes are managed manually. The idea is that by deploying data-driven decision-making, buyers can filter supply and connect only with SSPs that meet stringent criteria, such as low fraud rates, no use of first-price auctions or hidden fees, and inventory related to key topics. This is intended to increase the likelihood of winning the right bids at reasonable prices, as well as to reduce the cost of processing sub-optimal impressions. There are also efficiency gains from using fewer SSPs and from automatically avoiding duplication.
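The filtering idea above can be sketched in a few lines. Everything here is a hypothetical assumption for illustration: the SSP names, the thresholds, and the fields tracked per SSP are invented, and real implementations weigh far more signals.

```python
# A minimal sketch of SPO-style supply filtering: keep only SSPs that meet
# the buyer's criteria, then rank the survivors by historical win rate.
from dataclasses import dataclass

@dataclass
class SSPStats:
    name: str
    fraud_rate: float   # fraction of impressions flagged as fraudulent
    win_rate: float     # fraction of submitted bids that win
    first_price: bool   # whether the SSP runs first-price auctions
    hidden_fees: bool   # whether undisclosed fees were detected

def select_ssps(ssps, max_fraud=0.01, min_win_rate=0.05):
    """Filter SSPs against stringent criteria and rank by win rate."""
    eligible = [s for s in ssps
                if s.fraud_rate <= max_fraud
                and s.win_rate >= min_win_rate
                and not s.first_price
                and not s.hidden_fees]
    return sorted(eligible, key=lambda s: s.win_rate, reverse=True)

ssps = [
    SSPStats("ExchangeA", 0.005, 0.12, False, False),
    SSPStats("ExchangeB", 0.030, 0.20, False, False),  # excluded: fraud too high
    SSPStats("ExchangeC", 0.002, 0.08, True,  False),  # excluded: first-price
]
print([s.name for s in select_ssps(ssps)])  # prints ['ExchangeA']
```

In practice the criteria and their weights would be tuned continuously from observed win patterns rather than fixed, but the shape of the decision is the same.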
And the drawbacks…
SPO has its failings. Firstly, its algorithmic form may inadvertently cause overpayment by cutting off SSPs that could have presented a better deal. Secondly, the technique encourages penalisation of SSPs using first-price auctions, which isn’t necessarily good. Second-price auctions aren’t perfect either: in many header bidding implementations, winning bids still compete with demand sources in publishers’ ad servers, so they may not be the final victors. This method also stops publishers from gaining a true view of impression value, unlike first-price, which fuels transparency because buyers pay what they bid.
Finally, the rapid nature of real-time bidding (RTB) means it is unrealistic to monitor and delete duplicates manually. And even with intelligent automation, the presence of multiple SSPs can create latency problems. Successful de-duplication should also take into account criteria such as exchanges’ fees and costs, transparency, take rates for publishers, and cookie matching (where IDs for publisher data sync exactly with buyers).
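As a simplified illustration of the de-duplication step, the sketch below assumes (hypothetically) that duplicate requests for the same impression can be recognised by a shared impression ID, and reduces the multi-factor choice the paragraph above describes to a single criterion, the lowest exchange fee. Real systems would also weigh transparency, take rates, and cookie-match quality.

```python
# Minimal sketch: collapse duplicate bid requests that reach the DSP through
# multiple SSPs, keeping one supply path per impression (here: the cheapest).

def deduplicate(requests):
    """Keep one request per impression ID, preferring the lowest fee."""
    best = {}
    for imp_id, ssp, fee in requests:  # each request: (impression_id, ssp, fee)
        if imp_id not in best or fee < best[imp_id][2]:
            best[imp_id] = (imp_id, ssp, fee)
    return list(best.values())

requests = [
    ("imp-1", "ExchangeA", 0.15),
    ("imp-1", "ExchangeB", 0.10),  # same impression via a cheaper path
    ("imp-2", "ExchangeA", 0.15),
]
print(deduplicate(requests))
# prints [('imp-1', 'ExchangeB', 0.1), ('imp-2', 'ExchangeA', 0.15)]
```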
The verdict: hero or not?
Overall, the impact of SPO is mixed. The procedure relieves the burden placed on DSPs by header bidding: allowing them to refine supply chains and obtain relevant impressions at a fair price. It is also arguable that publishers should be embracing an initiative with the power to facilitate ad fraud reduction, bolster buyer value, and guarantee long-term interest from demand.
But it is important to recognise that SPO has downsides, particularly the restricted control over bidding decisions made by machines, and it isn’t really new either. As a technique that encourages DSPs to be selective about their supply partners, it is fundamentally just the latest version of working smarter. So, it is likely that SPO, albeit with some adjustments to counter its flaws, will radically alter the ecosystem, making header bidding more efficient and strengthening relations between publishers and advertisers. But we mustn’t make the mistake of dubbing it a saviour that will resolve all programmatic challenges. That is still yet to come.