
Brad Geddes (Adalysis): ‘The development of Google Ads since 2000 is huge’

29 January 2020

It is almost impossible not to know you if you work in the field of PPC. I personally am a big fan of your book Advanced Google AdWords and of your blog at bgtheory.com. For the people who don’t know you, what should they know about Brad Geddes?

I’ve been involved in PPC since 1998, before Google even had paid search. Over the years I have gained a wide variety of experience: working on reseller accounts (we managed 42,000 PPC accounts), building two agencies, and working on accounts that spend over $100 million a year. That means I often switch between thinking about scale and thinking about efficiency, depending on the situation.

I’m a logic-driven creative thinker, so I like to look at the available features of a system and think of creative ways to put strategies together for accounts. I’m also a time-management geek, so I like to temper the strategies with scale and time considerations so that everything can get done efficiently.

As a side note, these days I write frequently on the Adalysis blog, if you want to follow some of my thoughts and ideas.

Google Ad(word)s has been serving ads since the year 2000. What do you think are the most important updates and/or developments throughout these years?

If we look at the top developments that made Google what it is, we need to go back to 1943, when Walter Pitts and Warren McCulloch created the first computational model of a neural network, the foundation of today’s deep learning. It’s the work of those early computer science pioneers that allowed Google to exist as we know it today.

Before Google could grow, it needed to monetize its traffic. In 1998 Bill Gross launched the first PPC engine, GoTo.com, which was the first time ads were served based upon user intent – the search term. Google would later adopt this model to create its billions of dollars in revenue.

Google’s first innovation was launching AdWords Select (what we now call Google Ads) with a CPC × CTR positioning formula in 2002. In 2005, that formula morphed into Quality Score, which went on to become Google’s first machine learning algorithm. That same year, Google acquired Urchin, which became Google Analytics. Pairing SEM with easily integrated analytics created a wave of data-driven marketers.
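
To make that early positioning formula concrete, here is a minimal sketch (the advertisers, bids, and CTRs are hypothetical) of how ranking by CPC × CTR could let a more relevant ad outrank a higher bid:

```typescript
// Illustrative sketch of the early AdWords Select ranking: score = max CPC bid x CTR.
// All numbers are hypothetical, for demonstration only.
interface Ad {
  name: string;
  maxCpc: number; // advertiser's maximum cost-per-click bid, in dollars
  ctr: number;    // observed click-through rate
}

const ads: Ad[] = [
  { name: "Advertiser A", maxCpc: 2.0, ctr: 0.01 }, // score 0.020: higher bid
  { name: "Advertiser B", maxCpc: 1.0, ctr: 0.04 }, // score 0.040: more relevant ad
];

// Sort descending by bid x CTR: the more clicked ad (B) outranks the higher bid (A).
const ranked = [...ads].sort((a, b) => b.maxCpc * b.ctr - a.maxCpc * a.ctr);
ranked.forEach((ad, i) =>
  console.log(`#${i + 1}: ${ad.name} (score ${(ad.maxCpc * ad.ctr).toFixed(3)})`)
);
```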

With a nod to eFrontier, which had launched portfolio-style automated bidding back in 2002, Google launched CPA bidding in 2010. That gave smaller companies access to automated bid tools and would forever change how much time companies spent setting bids.

In 2007, Google launched Google Website Optimizer. This was the first free, widely accessible website optimization tool based upon multivariate testing. It took the small market of conversion optimization and spun out an entire sub-industry of marketing.

In 2012, Google launched AdWords scripts. While many of us had been working with the API for years and doing things at scale, the ability to create your own small-scale automation and analysis meant that companies of any size could put their own spin on automation.
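
As a flavour of what such a script looks like, here is a minimal sketch in the style of a Google Ads script (these run as JavaScript inside the Google Ads interface; the conditions and thresholds below are hypothetical examples, not a recommendation):

```typescript
// Minimal Ads-script-style sketch: pause keywords that have spent heavily
// without converting. Thresholds are hypothetical; adjust to your account.
// Ads scripts run as JavaScript inside Google Ads; AdsApp and Logger are
// provided by that environment, declared here only to keep TypeScript happy.
declare const AdsApp: any;
declare const Logger: any;

function main(): void {
  const keywords = AdsApp.keywords()
    .withCondition("Status = ENABLED")
    .withCondition("Clicks > 100")     // enough traffic to judge...
    .withCondition("Conversions = 0")  // ...but nothing to show for it
    .forDateRange("LAST_30_DAYS")
    .get();

  while (keywords.hasNext()) {
    const keyword = keywords.next();
    Logger.log("Pausing: " + keyword.getText());
    keyword.pause();
  }
}
```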

While there have been many innovations over the years (audience targeting, DSAs, responsive ads, call tracking, Google local, Google Shopping, and so forth), the big turning points can really be attributed to machine learning, conversion optimization, and data-driven marketing.

There seems to be a divide between PPC experts who are for and who are against Google’s automation, especially when it comes to Google’s smart solutions. Where do you stand on this, and why?

I think this is a resource question. If you are a large company and you want to use proprietary data in your marketing, then you are going to build a lot of your tools in-house, as they can be customized to your needs. Google has to build for the masses; you can build to custom specifications.

If you are a company that doesn’t have the resources to build your own technology, then leveraging Google’s tools is a wonderful idea.

I don’t think this is a question of “do you trust Google with your data?” Instead, it’s a question of how you allocate your time and money, based upon the resources you have, to make an account efficient.

What do you think of the statement from “Black Box to Black Hole” regarding smart automation?

I think it’s a very smart statement. The biggest issue with automation right now is that you can’t audit its decisions or learn from its data, since it’s a black box.

This means that a marketer using smart automation can’t get better on their own at thinking about marketing, strategies, and customer interaction, as they don’t get to analyze, audit, or learn from the decisions being made by the algorithms.

In addition, you can’t ‘help’ the algorithm along by introducing your own variables or custom data. This is often why larger companies like to build their own systems: they can introduce variables that Google can’t use (as they aren’t consistent across all accounts) or variables that Google doesn’t know about.

Lastly, since the data is locked into a single system, you can’t transfer what’s been learned to other platforms.

Google seems to be taking away control from the keyword-based advertiser with the latest updates to Google Ads. Why do you think this is happening, and do you consider it a bad thing?

I think it happens for a few reasons. The first is that Google thinks it is smarter than everyone else and trusts deeply in its machines to figure out what is right and wrong, even if in the short term that hurts a lot of companies. Google is usually looking at the long term, once the machine has figured out the variables, while marketers are looking at their results on a daily basis.

The second is that with desktop, mobile, and voice inputs, the universe of search terms has changed, and Google is trying to accomplish an early vision of marketing from Larry Page and Eric Schmidt in which Google was smart enough to simply connect searchers with companies without marketers needing to do any work. To get there, you need to understand the intent and anticipate the need. Google is nowhere near that right now, but it thinks it can get there, so it is taking keyword control away from advertisers because it believes it can discern intent from the keywords.

Overall, the new match types are not working well for many advertisers, but it can be a huge amount of work to analyze and fix it all. When you put a lot of work in front of people, their options are to do the work or let Google eventually figure it out. We see people following both paths right now.

A common practice for a lot of search advertisers is using Single Keyword Ad Groups (SKAGs), and for e-commerce advertisers also Single Product Ad Groups (for Shopping). Do you think this practice is still valuable?

I was never a fan of this practice. It’s called an ad group for a reason – it’s all about the ad. Is the ad relevant to the user? Does the ad help the user to make a decision and cause them to take action?

A keyword just says ‘show an ad’ or ‘don’t show an ad’. If an ad is relevant to one keyword, then you can have a single-keyword ad group. If it’s relevant to 100 keywords, then you can have 100 keywords in an ad group.

SKAGs often create more work than their results are worth. There are always exceptions, such as high-value terms or brand terms you need to watch closely, so you might have some SKAGs in your account – and that is fine.

The way Google’s match types are working right now, anyone using SKAGs often has the same search term showing up in many ad groups. That means your query bidding and ad testing aren’t working very well, since you need to aggregate data across many ad groups to do something as simple as setting a bid modifier.
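
To illustrate that aggregation problem, here is a minimal sketch (the report rows and figures are hypothetical) of rolling the same search term up across ad groups before making any bid decision:

```typescript
// Sketch: the same search term matched in several SKAGs only tells you
// something once you aggregate it. Rows are hypothetical report data.
interface SearchTermRow {
  query: string;
  adGroup: string;
  clicks: number;
  conversions: number;
}

const rows: SearchTermRow[] = [
  { query: "blue widgets", adGroup: "SKAG - blue widget",  clicks: 40, conversions: 1 },
  { query: "blue widgets", adGroup: "SKAG - widgets blue", clicks: 35, conversions: 0 },
  { query: "blue widgets", adGroup: "SKAG - buy widgets",  clicks: 25, conversions: 2 },
];

// Roll up clicks and conversions per query across all ad groups.
const totals = new Map<string, { clicks: number; conversions: number }>();
for (const row of rows) {
  const t = totals.get(row.query) ?? { clicks: 0, conversions: 0 };
  t.clicks += row.clicks;
  t.conversions += row.conversions;
  totals.set(row.query, t);
}

// Only the aggregate (100 clicks, 3 conversions) is a sound basis for a bid
// decision; each ad group's slice alone is too thin to judge.
for (const [query, t] of totals) {
  console.log(`${query}: ${t.clicks} clicks, ${t.conversions} conversions`);
}
```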

Google advises using broad match keywords when an automated bid strategy is active. What would you advise when the campaign goal is performance-based?

There are a few additional factors to this question:
How often are your search terms in multiple languages at once (e.g. half German and half Arabic)?
How well does Google understand your language?
What’s your available volume?

For instance, if you are using English keywords in a large market, then broad match is usually a terrible idea, as you end up with more useless queries than good ones. You are paying for Google to learn. If you start with more restrictive match types, even modified broad, then you are letting Google learn from a more relevant keyword universe, and you usually waste less money while its system learns.

If you are in a small niche market, then you might need to use some broad match to get volume. If you have a lot of queries that cross languages, broad match has historically been better at matching across languages than other match types. We are seeing the other match types start to match across languages (and even product names to product part numbers), so this might not be fully true in another few months with regards to multi-language search terms.

If you are advertising in Japanese, and Google doesn’t understand the nuance of the language very well, then broad match often does work well.

Lastly, if you just want to reach everyone (which means you are also looking at branding, impression, and reach metrics), then broad match can be useful.

When bidding and targeting are automated using bid strategies, what do you consider most important to guide the machine in the right direction?

Incredibly accurate data. Are all your conversions being tracked? Is the proper data being moved between systems and back into Google’s algorithm? Are there conversion leaks? The better your data, the better the machine can make decisions.

Refinement. Are you looking at what is not working and helping the machine refine its targeting?

Ad and landing page testing. The better your data becomes, the more the algorithm can optimize just for you.

As co-founder of the PPC management tool Adalysis, you work with a lot of agencies and clients. Do you expect the work of agencies to be totally different, or maybe even non-existent, in the near future? What do you think the future of the agency landscape will look like?

I think everyone is going to have a job if they can think creatively and strategically. Machines are not good at interpreting human behavior. They aren’t good at being creative. Machines work from a set of inputs to create outputs.

If your job is to push button A when X happens and push button B when Y happens, then you should already be out of a job. If your job is to create the same reports month over month, then you shouldn’t have a job now.

If you are guiding the machine, creating innovative marketing campaigns, and thinking about how all the data fits together to create new strategies for your clients, then you’ll have a job for a long time.

To quote Rob Norman (long-time CEO of GroupM North America):
‘To my peers and friends who are still worried that their jobs may be replaced by machines, I’d offer this: 30 years ago we were information workers, then machines beat us at processing. So, for the last decade, we’ve adapted to become intelligence workers. Now it’s time to adapt again. And in this new age of assistance, I believe we’re called to be imagination workers.’

Any job where I can be called an imagination worker, I’m quite happy to have.

What can people expect from your session at Friends of Search?

I’m going to dig into the match type changes, the good and the bad, and show how search terms and keywords need to be evaluated and managed in light of Google’s recent changes.

I’ll get into pivot table analysis and year-over-year keyword comparisons, and show how to keep your search terms relevant and organized as Google keeps changing match types.
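
As a taste of that kind of analysis, here is a minimal sketch (keywords and click counts are hypothetical) of pivoting keyword data by year and computing the year-over-year change:

```typescript
// Sketch of a year-over-year keyword comparison: pivot clicks by keyword
// and year, then compute the change. Data is hypothetical.
interface KeywordYearRow { keyword: string; year: number; clicks: number; }

const data: KeywordYearRow[] = [
  { keyword: "running shoes", year: 2018, clicks: 1200 },
  { keyword: "running shoes", year: 2019, clicks: 900 },
  { keyword: "trail shoes",   year: 2018, clicks: 300 },
  { keyword: "trail shoes",   year: 2019, clicks: 650 },
];

// Pivot: keyword -> (year -> clicks).
const pivot = new Map<string, Map<number, number>>();
for (const row of data) {
  const years = pivot.get(row.keyword) ?? new Map<number, number>();
  years.set(row.year, (years.get(row.year) ?? 0) + row.clicks);
  pivot.set(row.keyword, years);
}

// Year-over-year change per keyword flags what is growing or declining.
for (const [keyword, years] of pivot) {
  const prev = years.get(2018) ?? 0;
  const curr = years.get(2019) ?? 0;
  const change = prev === 0 ? Infinity : ((curr - prev) / prev) * 100;
  console.log(`${keyword}: ${prev} -> ${curr} clicks (${change.toFixed(0)}% YoY)`);
}
```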

Unfortunately, we have decided not to organise a Belgian edition of Friends of Search in 2024. We are organising the 11th edition of Friends of Search in the Netherlands, and Belgian visitors are of course more than welcome. Join us on March 21, 2024, in the Kromhouthal in Amsterdam.
