The sheer volume of carbon-copy bids produced by programmatic auctions threatens to drag down the entire programmatic industry.

Buyers are starting to notice the strange behavior that severe bid duplication creates.

Demand-side platforms can’t scale campaigns because, rather than seeing the entire universe of bid opportunities, they see a small slice of inventory replicated many times over.

Their attempt at a remedy (processing fewer bid requests to create a cleaner path, a technique called traffic shaping) only eliminates worthwhile bid opportunities in the process.

It’s no secret that bid duplication causes problems. But, as is often the case in ad tech, there are drawbacks to acting first. Any single publisher that tries to solve the problem alone would see its revenue drop too sharply to sustain the change. Meanwhile, eliminating bid duplication waste could squeeze SSPs, which currently get a shot at selling everything.

READ MORE: At CES, NBCUniversal Shows Off Its Programmatic Abilities

Still, many in the advertising industry would like to rein in bid duplication. But first, more people need to understand what they’re looking at.

The basic complexity of duplicate bids

Buyers tend to assume programmatic auctions follow a simple logic: Publishers ping SSPs, SSPs collect DSP bids for the ad slot and the highest-bidding buyer wins.

The reality is far messier. Bid requests sent through different SSPs are indistinguishable from one another (although an ID exists that could fix this). Buyers must sort through dozens of identical bid requests with no way to tell them apart. Twenty bid requests from one SSP for a single ad can look exactly the same as 20 bid requests for 20 distinct ads.
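To make the problem concrete, here’s a minimal sketch (in TypeScript, with illustrative field names) of how a buyer could collapse duplicates if every SSP populated OpenRTB’s shared transaction ID, `source.tid`, with the same value for the same underlying auction. The point is that when that ID is missing, the loop below has nothing to join on and every copy looks unique.

```typescript
// Minimal sketch: collapsing duplicate bid requests by a shared transaction ID.
// Assumes every SSP populates OpenRTB's source.tid with the same value for the
// same underlying auction -- which, per the article, often doesn't happen.

interface BidRequest {
  id: string;  // request ID, unique per SSP call
  ssp: string; // which exchange sent it
  tid?: string; // OpenRTB source.tid: shared ID for the underlying auction
}

function dedupeByTid(requests: BidRequest[]): BidRequest[] {
  const seen = new Set<string>();
  const unique: BidRequest[] = [];
  for (const req of requests) {
    // Without a tid there is nothing to join on, so the request passes
    // through and its duplicates become invisible -- the core problem.
    if (!req.tid || !seen.has(req.tid)) {
      if (req.tid) seen.add(req.tid);
      unique.push(req);
    }
  }
  return unique;
}

// One ad slot, three SSPs: with a shared tid the DSP sees one auction, not three.
const requests: BidRequest[] = [
  { id: "a1", ssp: "ssp-alpha", tid: "auction-123" },
  { id: "b7", ssp: "ssp-beta", tid: "auction-123" },
  { id: "c3", ssp: "ssp-gamma", tid: "auction-123" },
];
console.log(dedupeByTid(requests).length); // 1
```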

Bid duplication has risen in lockstep with header bidding, which is now the standard way large publishers configure their programmatic ad stacks. With header bidding, a publisher pings every one of its SSP partners to ask for a bid on the same ad slot.
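As a rough illustration of the resulting fan-out (the numbers below are assumptions, not measurements): one ad slot offered to a handful of SSPs, each of which re-broadcasts the request to its own DSP connections, multiplies into a pile of requests for a single impression.

```typescript
// Illustrative header bidding fan-out; all partner counts are hypothetical.
// One ad slot is offered to every SSP partner, and each SSP typically
// re-broadcasts the request to its own DSP connections.

const sspPartners = ["ssp-a", "ssp-b", "ssp-c", "ssp-d", "ssp-e"]; // 5 SSPs
const dspsPerSsp = 20; // assume each SSP forwards to ~20 DSP endpoints

// One impression on the page becomes this many publisher-side calls...
const requestsLeavingPublisher = sspPartners.length; // 5
// ...and this many bid requests arriving at DSPs:
const requestsArrivingAtDsps = sspPartners.length * dspsPerSsp; // 100

console.log(
  `${requestsLeavingPublisher} publisher calls -> ` +
  `${requestsArrivingAtDsps} DSP-side requests for ONE impression`
);
```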

DSPs, however, haven’t been able to adjust in turn. To them, it’s like browsing a grocery store aisle where every can of soup looks identical.

For an illustration of just how noisy the landscape is, consider PubMatic, which claimed to have processed 56 trillion impressions in Q3 2022, up 33% from the previous year. That works out to an average of 7,000 ads for every person on the planet. That stat only makes sense if there is rampant bid duplication, which ultimately serves nobody (except, perhaps, PubMatic).

PubMatic told AdExchanger that the number of bid requests it rejects is higher still. “We routinely reject impressions that don’t meet our inventory quality standards or that we don’t believe we can monetize effectively,” a spokesperson said.

READ MORE: A Look Ahead At Ad Tech As The New Year Gets Started For CTV, Data, And Programmatic

Here’s another scenario: It would be reasonable to assume that a publisher receiving 100 bid requests from an SSP is being offered 100 distinct ad slots. Not so. “One problem we’ve seen is that [SSPs] send 30% of the publisher’s traffic – the stuff they think is best – and they send it three times,” said Will Doherty, VP of inventory development at The Trade Desk.

Most DSPs, including The Trade Desk and Google DV360, prohibit SSPs from sending multiple bid requests for the same ad impression. But that just means certain SSPs try to fly under the radar or dress up the practice as yield optimization. (FreeWheel was recently called out for submitting repeated bid requests for the same inventory using a tactic dubbed “smart bidding.”)

Sonja Kristiansen, chief business officer at TripleLift, stated that these kinds of auction shenanigans “are the exact reason DSPs won’t explain or make public why they make filtering decisions.”

Traffic shaping: The medicine that makes matters worse

To cope with bid duplication, DSPs have turned to traffic shaping, which filters excess bid requests using a mix of algorithmic and manual selection to decide which inventory buyers see.

But the process doesn’t truly deduplicate impressions for DSPs. And, ironically, the assumptions baked into traffic shaping can make duplication worse.

It treats a symptom of the disease rather than curing it.

Processing billions of bid requests costs money. Both SSPs and DSPs have adopted traffic-shaping technology as a cost-cutting measure in response to rapidly rising cloud bills. Magnite (then Rubicon Project), for example, bought nToggle in 2017 for exactly this purpose.

In practice, a DSP might instruct an SSP to send it no more than 6 million QPS (queries per second). The SSP then tries to send the DSP its “best” inventory, meaning whatever the SSP believes has the highest chance of transacting within that limit.
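That kind of cap is typically enforced with a rate limiter along the lines below; this token bucket is a generic TypeScript sketch, not any particular platform’s implementation.

```typescript
// Generic token-bucket sketch of a QPS cap, the mechanism behind
// "send me no more than N queries per second." Parameters are illustrative.

class QpsLimiter {
  private tokens: number;
  private lastRefill = Date.now();

  constructor(private readonly qps: number) {
    this.tokens = qps;
  }

  // Returns true if the request may be forwarded, false if it must be dropped.
  allow(): boolean {
    const now = Date.now();
    // Refill tokens in proportion to elapsed time, capped at one second's worth.
    this.tokens = Math.min(
      this.qps,
      this.tokens + ((now - this.lastRefill) / 1000) * this.qps
    );
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

// An SSP holding itself to a 6 million QPS contract drops everything over the
// cap -- which is why it tries to spend that budget on its "best" inventory.
const limiter = new QpsLimiter(6_000_000);
```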

READ MORE: Google Is Introducing Programmatic Bidding For Limited Ads

Microsoft Advertising’s SSP, for instance, offers a self-service facility that lets its DSP partners specify the kind of supply they want, such as CTV inventory, banner inventory or particular domains. Microsoft also uses data science to reject ad requests that are malformed or missing data.

Traffic-shaping algorithms rely on historical data about which inventory was bought in the past to decide what to send buyers in the future. That approach risks narrowing buyers’ view of what’s available and preventing them from discovering fresh inventory.

Because exchanges base their choices on the expected revenue of each impression, “what ends up happening is that every exchange chooses the same impression,” said Chris Kane, founder of Jounce Media.
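Here is a hedged sketch of that feedback loop, with hypothetical placements: if an exchange ranks inventory purely by historical performance and keeps only what fits its QPS budget, placements with little or no bidding history never make the cut, and every exchange running the same ranking converges on the same impressions.

```typescript
// Sketch of the feedback loop the article describes: an exchange that shapes
// traffic by historical performance keeps sending what already sold, so new
// or low-history inventory never gets a chance to prove itself.

interface Impression {
  placementId: string;
  historicalWinRate: number; // fraction of past auctions this placement won
}

// Keep the top `qpsBudget` impressions by past performance -- the naive
// expected-revenue ranking every exchange is incentivized to use.
function shapeTraffic(pool: Impression[], qpsBudget: number): Impression[] {
  return [...pool]
    .sort((a, b) => b.historicalWinRate - a.historicalWinRate)
    .slice(0, qpsBudget);
}

const pool: Impression[] = [
  { placementId: "big-news-homepage", historicalWinRate: 0.4 },
  { placementId: "big-news-article", historicalWinRate: 0.3 },
  { placementId: "niche-auto-blog", historicalWinRate: 0.05 },
  { placementId: "diverse-owned-pub", historicalWinRate: 0.0 }, // no history yet
];

// With a tight budget, low- or no-history placements are filtered out forever:
console.log(shapeTraffic(pool, 2).map((i) => i.placementId));
// ["big-news-homepage", "big-news-article"]
```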

DSPs shape traffic, too. A DSP might claim to be listening to a bid request but decide not to process it. As an efficiency play that saves money, many DSPs and SSPs filter impressions before they ever reach buyers, using their own algorithms and criteria.

Control that QPS.

Traffic shaping exists because handling the millions of queries per second generated by duplicate ad requests would be prohibitively expensive. The total number of impressions grows roughly in line with web traffic, but the number of bid requests for those impressions has grown exponentially.

“I see it as inflation,” Doherty said. “All they’re doing is printing more and more copies.” SSPs are expanding supply, and diluting its value, in a market where supply already vastly exceeds demand.

Buyers and sellers must sort through an ever-growing pile of duplicate impressions to purchase the same amount of ad inventory, which drives up server costs. Call it a margin problem or a sustainability problem. Either way, operating expenses (to borrow the CFO’s vernacular) are skyrocketing.

When a smaller DSP asks an SSP to limit the QPS it sends, it is being realistic about what it wants to buy and also protecting its margin, since cloud servers are typically among the largest expenses for a DSP or SSP.

When the SSP EMX filed for bankruptcy a year ago, it owed $900,000 to cloud provider Amazon Web Services, which bills monthly. And The Trade Desk spent $264 million on platform operations during the first nine months of 2023, nearly $1 million per day, a figure that covers all the costs of operating a DSP, including server fees paid to Databricks and Amazon Web Services.

According to Alex Chatfield, head of Microsoft and Netflix ad sales for Microsoft Advertising, which runs both a DSP and an SSP as part of its Xandr acquisition, “header bidding has materially increased the operating costs of the platform.”

And because most ad tech companies no longer maintain their own servers, relying on cloud providers instead, they can dial QPS up or down in response to business conditions. During a peak marketing moment like Black Friday, a business might allow more QPS; at other times, it might throttle QPS to rein in costs.

Duplicate bids hurt buyers.

Add bid duplication to traffic-shaping technology and the net result is that desirable inventory gets removed. Anything considered a niche site, including minority-owned publishers, becomes too difficult for buyers to scale against.

READ MORE: CTV/OTT Programmatic Advertising

Several buyers confirmed that traffic-shaping technology disproportionately filters out diverse-owned media companies, even though many brands have pledged to spend with them.

According to Emily Kennedy, SVP of programmatic partnerships at Dentsu Media US, minority-owned publishers and content creators get filtered out because they often don’t generate enough bid volume or must route their inventory through multiple layers of technology to make it available programmatically.

When AdExchanger raised the harmful effect of filtering technology on minority-owned publishers with a number of industry professionals, many were unaware of the problem or couldn’t see how the technology could produce that outcome.

But minority-owned publishers aren’t the only ones affected. An auto insurer trying to buy on specialist sites that reach car shoppers, for example, might not be able to scale if the “medium to low value” audiences on those sites are shut out.

“DSPs make a lot of network-level decisions that don’t always align” with what a marketer needs, said Lara Koenig, global head of product at media buying firm MiQ. “Having a one-size-fits-all supply strategy on the DSP side won’t work” for marketers of consumer packaged goods, auto companies or advertisers that buy a lot of mobile gaming inventory, she said.

Put another way: Before a niche buyer even gets the chance to raise its hand and say what it wants, the inventory it needs may already have been deemed “too unique” and filtered out.

And good luck to any brand that wants to refine its inventory selection further: “It can be challenging if I want to serve on this very specific [publisher] brand and specific DMAs and add on brand safety elements,” said Kennedy of Dentsu.

Scale problems are difficult to troubleshoot since duplication and filtering occur at every stage.

Is it the publisher’s fault if inventory gets filtered out before it reaches a buyer? Or the SSP’s, for sending too little inventory to satisfy a DSP’s QPS caps? Or did the DSP itself filter out the impressions?

Often, the best course of action is to bypass traffic-shaping and filtering algorithms entirely. Buyers who move to programmatic guaranteed or private marketplaces typically circumvent the technology that removes the inventory they want to buy.

According to Koenig, this strategy works because it bypasses the “underlying algorithms that cause favoritism.”

Buyers also contend that bid duplication and traffic filtering themselves drive up CPMs.

Kennedy remarked, “You are recycling the same stuff—and paying more for it because the pool is smaller.”

Is there a way to stop bid duplication?

The more bids in an auction, the more revenue it generates. Bid duplication persists because publishers and SSPs would lose money if they stopped, and because DSPs haven’t changed their algorithms so that sellers who stop aren’t penalized.

For a moment, let’s focus on DSP algorithms.

Publishers and SSPs have found that DSPs have a volume bias. Many DSPs treat a publisher with 30 million bid requests as bigger and more desirable than a publisher with 10 million bid requests, even when both publishers actually have just 3 million ad slots for sale.

That bias stems from the way DSPs manage campaigns.

If a DSP receives 12 bid requests for a single impression, it may return 12 bids, and the publisher can then select the highest rather than settle for less. Sending more copies of an impression typically squeezes out more bids, so SSPs that send more have better odds of winning.
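The arithmetic behind that incentive is simple. Under the toy assumption that each copy of a request draws an independent bid, uniformly distributed between 0 and 1, the expected winning (maximum) bid rises with every extra copy:

```typescript
// Back-of-the-envelope sketch of the volume bias: the publisher keeps the
// maximum bid, and the expected maximum of n independent Uniform(0, 1) bids
// is n / (n + 1) -- so every extra duplicate nudges revenue up.

function expectedWinningBid(copies: number): number {
  return copies / (copies + 1);
}

for (const copies of [1, 3, 12]) {
  console.log(
    `${copies} copies -> expected clearing price ` +
    expectedWinningBid(copies).toFixed(2)
  );
}
// 1 copies -> 0.50, 3 copies -> 0.75, 12 copies -> 0.92:
// sellers that duplicate more earn more, so duplication continues.
```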

These auction economics are difficult to alter, so bid duplication will persist until DSPs find a way to eliminate the volume bias in their algorithms.

“Trust the bidder, not the DSP,” Kristiansen of TripleLift said.

For example, a DSP may claim to demand quality, but if its bidder is instructed to scoop up cheap inventory, pay attention to what the bidder is actually buying rather than what the DSP is saying.

If DSPs truly want to reduce duplication, they should stop rewarding volume in their bidders.

But this issue can’t be solved by algorithmic tweaks alone. Policy has to change, too.

Algorithms and policy

Of the two major DSPs, The Trade Desk has been the most outspoken on policy. It asks SSPs to begin using its Global Placement IDs (GPIDs), which help deduplicate impressions. DSPs can also push SSPs to adopt the transaction ID defined in the OpenRTB standard.
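For illustration, here’s what a shared placement ID buys a DSP, sketched with hypothetical field names rather than The Trade Desk’s actual schema: requests from different SSPs that carry the same GPID can be recognized as one ad slot instead of several.

```typescript
// Sketch of GPID-based grouping: requests from different SSPs carrying the
// same Global Placement ID are recognized as the same ad slot, rather than
// counted as distinct inventory. Field names are illustrative.

interface ShapedRequest {
  ssp: string;
  gpid?: string; // Global Placement ID, if the SSP passes it through
}

function groupByPlacement(requests: ShapedRequest[]): Map<string, ShapedRequest[]> {
  const groups = new Map<string, ShapedRequest[]>();
  for (const req of requests) {
    // Requests without a GPID can't be deduplicated and inflate apparent supply.
    const key = req.gpid ?? `unknown:${req.ssp}`;
    groups.set(key, [...(groups.get(key) ?? []), req]);
  }
  return groups;
}

const shaped: ShapedRequest[] = [
  { ssp: "ssp-a", gpid: "example.com/homepage#leaderboard" },
  { ssp: "ssp-b", gpid: "example.com/homepage#leaderboard" },
  { ssp: "ssp-c" }, // no GPID: looks like extra inventory
];
console.log(groupByPlacement(shaped).size); // 2 placements, not 3
```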

But three years in, it’s not clear how many publishers and exchanges are following The Trade Desk’s guidelines.

The independent DSP also recently decided to ignore price floors, in an effort to discourage SSPs and publishers from shopping the same impression at different floor rates.

According to Microsoft’s Chatfield, “publishers have a right to be worried” about efforts aimed at addressing duplicate inventories.

He stated, “If it’s done everywhere, there shouldn’t be any difference because the actual supply is the same; it just needs to be widely adopted.” “It turns into a game theory issue where the early movers incur penalties.”

The Trade Desk recognizes that disadvantage, which is why it tries to persuade publishers to follow its new policies. “Listening to more traffic when publishers grant you greater access to it and taking their traffic into consideration more frequently are two ways you can reward publishers who do the right thing,” Doherty said.

To cover its bases, The Trade Desk built OpenPath, a direct publisher connection that gives it a baseline of deduplicated inventory and data it can use to ensure SSPs provide it with the cleanest possible supply path.

“OpenPath provides us with clarity,” Doherty said. Because OpenPath inventory is neither duplicated nor traffic-shaped, The Trade Desk can use it to “judge the efficacy of all other paths.”

Google, where are you?

Unlike The Trade Desk, Google has not made any announcements on policy pertaining to duplicate bids.

And Google is unlikely to change anything that would hurt publishers, given the antitrust trial looming this spring, which will specifically examine its rigid ad tech stack. You know, the one that drove the ad tech industry to invent header bidding, and with it bid duplication, in the first place.

Google Vice President of Global Ads Dan Taylor stated, “We are walking a tightrope where we want publishers to support monetization that best supports their goals.”

Any action from Google, Taylor suggested, would need to come through a trade association. “I do believe it would be beneficial to have an industry conversation about what are the appropriate practices in this situation,” he said. “I don’t believe there is an ongoing conversation right now.”

Policing bid duplication may end up being the job of an industry organization.

“We need industry standards, and The Trade Desk – albeit in line with their own goals – is the closest we’ve gotten to defining those,” Kristiansen said, alluding to the company’s GPID mandate. “It practically must be at the IAB level.”

Is 2024 the beginning of the end for duplicate bids?

Many in the ad tech industry are troubled by the jury-rigged state of today’s programmatic market. It is inelegant and unsustainable, both environmentally and structurally.

“Such waste and inefficiencies are not present in mature markets,” Doherty of The Trade Desk stated.

Still, the industry isn’t yet working together on a solution to bid duplication. “Not enough people are wrapping their heads around this enormous industry dynamic,” Kane said.

But things can change. Programmatic has overhauled its core mechanics before.

The waterfall setup was superseded by header bidding. Second-price auctions gave way to first-price auctions.

What buyers need now is an approach that combines policy and algorithm changes to defragment the auctions happening for the same ad slot across dozens of SSPs.

The hard part will be putting that approach into practice in a way that improves the problem rather than makes it worse.
