Recently I took over another agency’s client, where they ran YouTube. The client is in the sports nutrition vertical.
I sat through the onboarding meeting with them, where they shared a 60%+ completion rate on their skippable YouTube ads.
All I could say to myself was “either their videos are the most amazing commercials ever, or someone doesn’t know what the skip button is.”
Turns out it was the latter.
Having reviewed the brand's "Where ads showed" report, we discovered that more than 40% of their ad spend went to kids' content (i.e., nursery-rhyme cartoons and those really obnoxious videos of kids unboxing toys so other kids can bug their parents to buy them).
This content is typically watched by children under age five, who were now being served ads for pre-workout and muscle-building protein.
Needless to say, it wasn’t a good look.
Similarly, back in September, I saw a news story about a major city police department running a recruitment ad.
It was a Google Display Network (GDN) effort, and sure enough, the ad triggered on a known right-wing website.
While trying to test the veracity of issues like these, I inadvertently triggered an online mattress ad on a news article about a mattress store employee who was murdered at the store.
This was then followed by an ad for a cruise aggregator – whose ad triggered on an article about a toddler dying on a cruise ship.
All of this was via GDN, and all of it was preventable. The advertisers simply failed to take the necessary post-launch steps to avoid these crises.
We all perform pre-launch and post-launch QA (or at least we should), but all too often, after that, repeat visits to the work get neglected, and/or those performing QA get "account fatigue" (when someone is so accustomed to, and burned out by, an account that they can't see what is right in front of them).
But what happens three weeks, two months, or a fiscal quarter later?
Are you still going back to do an audit of your work (or have someone not on the account do it for you)?
If not, you'd be amazed by how much a bit of data can shed light on mistakes and no-no's that weren't caught during initial QAs.
Consider a scenario: you're a pharma company promoting a prescription medication necessary for surviving an allergic reaction.
In YouTube, you set content to limited inventory, target ages 25-54 who are parents, and apply 400+ negative keywords, in addition to more than 2,000 channel exclusions.
In week 1 of data, you primarily show on videos from a well-known talk show discussing your topic, so you ignore the "Where ads showed" data for a while after that.
Later, you discover that, due to a lack of exclusions, your #3 spender over the past three weeks is animated content for kindergartners learning to read, and your #4 is a sing-along channel with a cartoon walrus.
Spend was high because those viewers don't know what a skip button is.
Had this been monitored, the data would have surfaced within two weeks, an immediate investigation would have been launched, and the path of least resistance taken to prevent the ads from showing on this content.
This can and should be done for GDN and YouTube, and even a modified version of it for Microsoft Search Syndication.
If possible, have someone who doesn’t normally work on the business, step in and do an audit. This prevents “account fatigue”.
Follow these simple steps:
- Log in and set a date range of data that meets a minimum logical traffic threshold, but doesn’t overlap with your last audit (usually 2 weeks at a minimum, but keep it under 3 months).
- Pull a geo-report of your campaigns, to ensure no substantial amount of traffic is coming from outside your intended geo-target.
- Repeat this step as needed for devices, topics, HHI, Age, Gender, Parental Status, etc.
- Click on “Placements”, and then click on “Where Ads Showed.” (Unless you are targeting exact videos or domains, you are going to see a lot of placement locations you weren’t expecting.)
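As a rough illustration, the placement step above can be semi-automated against an exported "Where ads showed" report. This is a hedged Python sketch, not a platform integration: the column names, placement names, and watch-list terms below are hypothetical and would need to match your actual export.

```python
import csv
import io

# Hypothetical sample of an exported placements report.
# Real exports have different column names per platform and report version.
SAMPLE_CSV = """placement,cost
Well-Known Talk Show Channel,1200.50
Kindergarten Reading Cartoons,800.00
Sing-Along Walrus Songs,650.25
Industry News Site,90.10
"""

def top_spenders(csv_text, top_n=3):
    """Return the top-N placements by spend, highest first."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    rows.sort(key=lambda r: float(r["cost"]), reverse=True)
    return [(r["placement"], float(r["cost"])) for r in rows[:top_n]]

def flag_placements(ranked, suspicious_terms):
    """Flag any top spender whose name contains a watch-list term."""
    flags = []
    for name, cost in ranked:
        lowered = name.lower()
        if any(term in lowered for term in suspicious_terms):
            flags.append((name, cost))
    return flags

if __name__ == "__main__":
    ranked = top_spenders(SAMPLE_CSV)
    # Watch-list terms for an advertiser who should never show on kids' content.
    flagged = flag_placements(ranked, ["kindergarten", "sing-along", "nursery"])
    for name, cost in flagged:
        print(f"REVIEW: {name} (${cost:,.2f})")
```

Run on a schedule against each fresh export, this turns the manual "scan the placements column" step into an alert you can't forget to do.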
Once those sets of data have been pulled and analyzed, one of two things will likely run through your head:
- “Damn, I am good at what I do, will recheck in a few more weeks” or
- “What the $*@#!, something is not right here”
If you ended up with #2, then you have just reinforced the need for auditing and QA.
Have no fear, the data sets you pulled will provide you direction. Merely look at the placements and that will start telling you what to do.
- Any ad showing against a news article about a murder = negative keyword "murder"
- Any ad showing on a toy unboxing video when you don't sell toys = exclude "Toys" in topics
- Medicare ads showing on video game content = exclude ages under 55
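The findings-to-fixes mapping above can be sketched as a small rule table. This is a minimal illustration in Python, assuming hand-rolled rules over a simple finding dict; the field names and rule logic are illustrative, not any ad platform's API.

```python
# Each rule pairs a predicate over an audit finding with a recommended fix.
# Field names ("context", "topic", "vertical") are hypothetical.
RULES = [
    (lambda p: "murder" in p["context"].lower(),
     "Add negative keyword: murder"),
    (lambda p: p["topic"] == "toy unboxing" and p["vertical"] != "toys",
     "Exclude topic: Toys"),
    (lambda p: p["vertical"] == "medicare" and p["topic"] == "video games",
     "Exclude ages under 55"),
]

def recommend(finding):
    """Return every recommended fix that an audit finding triggers."""
    return [action for matches, action in RULES if matches(finding)]

if __name__ == "__main__":
    finding = {"context": "News article about a murder at a mattress store",
               "topic": "news", "vertical": "mattresses"}
    print(recommend(finding))  # -> ['Add negative keyword: murder']
```

The point is the shape, not the three toy rules: every bad placement you find in an audit becomes a reusable rule, so the same mistake gets caught automatically next time.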
Proactive auditing eliminates bad ad placements and helps prevent brand safety issues from happening as well.
But most importantly, it helps refine your pre- and post-launch QA process and checklist.
As you expand your pre- and post-launch QA list based on your audits, it should cover the following (and a lot more):
- Frequency capping: No one needs to see your same ad 300+ times per day, in an inappropriate spot.
- Topics: Target relevant content if the campaign isn't retargeting, but definitely use topics to exclude inappropriate content.
- Placements: If you’re targeting specific placements, that’s fine, but keep a list of places you never want your ad on (you’d be amazed how they can get around topic categorization).
- Demographics: Not everyone necessarily needs to see your ads (Ask any pharma, luxury goods, or financial service advertiser).
- Keywords: Build keyword audiences for targeting, but more importantly use them for negatives.
- Geotargeting: This defaults to the entire country, make sure that is what you want.
- Languages: YouTube does not default to a specific language; it defaults to all languages. Make sure your creative aligns with the languages you target.
- Content Exclusions: You may not want to show for the “Sexually Suggestive Content.”
- Placement Types: Embedded videos, live streams, content with a DL-MA rating... let's just say they aren't right for everyone.
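One way to keep a checklist like this honest is to encode it as automated checks. Here is a hedged Python sketch that assumes a simple campaign-settings dict; the field names and values are illustrative, not any ad platform's actual API or defaults.

```python
# Encode the QA checklist as checks over a campaign-settings dict.
# All field names below are hypothetical stand-ins for platform settings.
def qa_issues(settings):
    """Return a list of checklist items the campaign settings fail."""
    issues = []
    if settings.get("frequency_cap_per_day") is None:
        issues.append("No frequency cap set")
    if not settings.get("excluded_topics"):
        issues.append("No topic exclusions")
    if not settings.get("negative_keywords"):
        issues.append("No negative keywords")
    if settings.get("geo_target") in (None, "ALL_COUNTRIES"):
        issues.append("Geotargeting left at default")
    if settings.get("languages") in (None, "ALL"):
        issues.append("Language targeting left at all languages")
    if "sexually_suggestive" not in settings.get("content_exclusions", []):
        issues.append("Sexually suggestive content not excluded")
    return issues

if __name__ == "__main__":
    # An untouched campaign fails every check.
    for issue in qa_issues({}):
        print(f"FAIL: {issue}")
```

Whether you run this as code or as a paper checklist, the discipline is the same: every launch gets graded against the same list, every time.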
The ultimate takeaway is: QA is more important than you will ever realize.
But failing to go back and evaluate what you set up, with actual flowing data, is far worse.
So let auditing influence your QA process, and remember to do it regularly!
All screenshots taken by author, January 2020