Mr Jonathan Jones

SEO & Digital Consultant

The Google Search Central Conference December 2024 – Q&A Edition: Highlights from Zurich

A few days ago, I had the opportunity to attend the Google Search Central Conference in Zurich—a highly anticipated event (at least for me) that brought together leading experts in search, SEO, and web performance. Hosted by John Mueller, Martin Splitt, and Daniel Waisberg, the conference offered a packed agenda of insights, technical expertise, and candid discussions. From structured data to the evolving role of AI in content creation, the day provided invaluable takeaways.

The event had an air of familiarity, reminiscent of the 2019 edition of this conference held in the same location and hosted by the same Search Advocates. Nearly five years later, it was fascinating to see how both the industry and the event itself had evolved (zero talk of AI, more around technical SEO, and more talk of cheese [thanks to John]). For a glimpse into that earlier conference, you can read my reflections here: Google Webmaster Conference Zurich 2019 (then called the ‘Google Webmaster Conference’ before the re-branding to ‘Search Central’).

Agenda Highlights

The conference kicked off with sessions led by Google’s search advocates, focusing on topics crucial for navigating today’s SEO landscape. Irina Tuduce shared advanced tips on leveraging structured data for shopping platforms, while Daniel Waisberg delved into the intersection of Google Analytics and Google Search Console (GA & GSC), offering practical strategies for maximising site performance.

This post, however, homes in on the key takeaways from the Q&A sessions. Featuring insights from Danny Sullivan, John Mueller, and Martin Splitt, the discussions explored the nuances of Google’s messaging and their broader implications for publishers and SEOs. From practical advice on testing new content areas to critical perspectives on algorithmic transparency, the Q&A sessions uncovered both opportunities and ongoing challenges within the search ecosystem.


Earlier Q&A Highlights: Martin Splitt and John Mueller

The first Q&A session of the day saw Martin Splitt and John Mueller fielding questions on topics ranging from AI to search in general. Here are some of the key takeaways:

1. Is SEO Dead?

The session began with a timeless question: “Is SEO dead?” The answer? A resounding “No!”, delivered with a mix of humour and conviction.

Martin and John emphasised that while SEO continues to evolve rapidly—driven by advancements in AI and shifting user expectations—it remains as crucial as ever. Their playful tone set the stage for an engaging and thought-provoking discussion.

I delve deeper into this topic later in the article, particularly in the section on “Artificial Intelligence’s Role in SEO.”

2. Gemini and Search Console

During the Q&A, attendees asked about the integration of Google’s AI tools like Gemini into Search Console or Google Analytics:

“When will we see Gemini in GSC or GA4 directly?”

Daniel Waisberg responded:

“We don’t add things to Search Console unless they’re stable and final. It’s going to take some time.”

This response underscored Google’s cautious approach to integrating experimental AI features into its platforms, prioritising stability before full implementation. It’s evident that a dedicated AI Overview traffic filter isn’t on the horizon anytime soon, with the Google Search Console team based in Israel.

Gemini was presented as a distinct application, operating separately from Google’s AI Overview, which is more integrated into Search. A notable limitation is that Gemini currently does not include UTM source referral tracking, complicating attribution efforts. When testing Gemini on the web, browser referral tracking data remains the primary method to trace user activity, but this often gets blocked, reducing accuracy.

In comparison, ChatGPT’s search product, SearchGPT, appends ?utm_source=chatgpt.com to referral links, enabling clearer tracking:

https://www.petplan.co.uk/?utm_source=chatgpt.com 

By contrast, attribution becomes significantly more challenging for users accessing Gemini through its app on the App Store or Google Play Store. Visits from these platforms are frequently misclassified as direct traffic, leading to gaps in data granularity. ChatGPT’s use of UTM tags (short for Urchin Tracking Module, the tagging convention used by Google Analytics) provides a stark contrast, offering more precise attribution capabilities.

Similarly, Google Discover also lacks UTM tagging, making it difficult for publishers to analyse traffic sources. Adding parameters like ?utm_source=google_discover would improve transparency and tracking.

These gaps highlight the need for consistent UTM tracking across platforms like Gemini and Discover to ensure accurate attribution and actionable insights.
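
To make that attribution gap concrete, here’s a rough sketch (in Python, purely illustrative and not any official Google or OpenAI tooling) of how a publisher might bucket landing-page hits by UTM parameter first, then referrer, then fall back to “direct”. The classify_visit helper and the example referrer values are assumptions made for the demo.

from typing import Optional
from urllib.parse import urlparse, parse_qs

def classify_visit(landing_url: str, referrer: Optional[str]) -> str:
    """Bucket a visit into a coarse traffic source (illustrative rules only)."""
    params = parse_qs(urlparse(landing_url).query)
    utm_source = params.get("utm_source", [None])[0]
    if utm_source:   # explicit tagging, e.g. SearchGPT's ?utm_source=chatgpt.com
        return f"utm:{utm_source}"
    if referrer:     # browser referral data, when it isn't stripped or blocked
        return f"referral:{urlparse(referrer).netloc}"
    return "direct"  # no UTM and no referrer: attribution is effectively lost

print(classify_visit("https://www.petplan.co.uk/?utm_source=chatgpt.com", None))   # utm:chatgpt.com
print(classify_visit("https://www.petplan.co.uk/", "https://gemini.google.com/"))  # referral:gemini.google.com
print(classify_visit("https://www.petplan.co.uk/", None))                          # direct (un-tagged app or Discover visits)

In practice the middle case is the fragile one: as noted above, Gemini’s referral data is often blocked, which is why so much of that traffic collapses into the “direct” bucket.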

Side note: Wil Reynolds posted this great article here by Jonathan Wehausen at Seer Interactive, titled “Deep Dive: Tracking How ChatGPT + Search & Others Send Users To Your Site”.

Questions for Google:

  • Would adding UTM parameters (e.g., ?utm_source=google_discover) to Google Discover not improve transparency and data granularity for traffic sources?
  • With ChatGPT leveraging UTM tags for more precise attribution, how does Google plan to ensure its platforms maintain competitive analytics capabilities?
  • How can publishers ensure they’re not losing critical data granularity when users access Google platforms via mobile apps (e.g., Gemini)?

3. AI-First Content Creation

A hot topic was the role of AI in content production:

“Are we heading towards AI-first content production?”

Martin responded candidly:

“Hopefully not. AI can assist in content creation, but replacing human-authored content entirely often backfires. Nothing can replace human-to-human interaction and creating content for humans.”

“Massively generating pages with AI often backfires. AI should aid content creation, not be the primary driver.”

He made it clear that while AI can streamline workflows, replacing human-authored content entirely often leads to subpar results. His tone was critical of using AI in ways that remove the human element, arguing that this approach undermines trust and quality.

“If you’re generating hundreds or thousands of pages just because you can, without considering if they provide real value, that’s a slippery slope. Users aren’t fooled by that kind of content, and neither are we.”

This comment was aimed directly at those using AI + datasets to churn out massive quantities of pages without adding meaningful insights or serving user intent.

“AI is a tool, not a substitute for understanding your audience. The moment you let AI run the show without human oversight, you lose the connection with your users.”

4. Disallowing AI Crawlers

Another key question revolved around managing AI crawlers:

“Will disallowing AI crawling affect normal crawling?”

John clarified:

“No, unless you do it wrong. For instance, disallowing all crawlers could interfere, but generally, the two are separate.”
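
In practice, keeping the two separate mostly comes down to which user agents you name in robots.txt. Below is a minimal sketch using the publicly documented Google-Extended and GPTBot tokens while leaving Googlebot untouched; treat this as an illustration and double-check each vendor’s current documentation before deploying anything like it.

# Illustrative robots.txt: block AI-training crawlers without touching Search crawling.
# Verify the user-agent tokens against each vendor's current documentation.

User-agent: Google-Extended
Disallow: /

User-agent: GPTBot
Disallow: /

User-agent: Googlebot
Allow: /

# "Doing it wrong", per John's warning, would be a blanket rule such as:
# User-agent: *
# Disallow: /
# which blocks Googlebot and everything else along with the AI crawlers.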

5. People First, Not Google First

On the question of creating content for Google versus users, John’s advice was clear:

“If you’re only writing for Google, you’re missing the point. Always ask yourself, ‘Would I do this for my readers if Google wasn’t here?’”

This user-centric philosophy remains central to Google’s approach, and I delve deeper into it in the section on User-First Philosophy.


Insights from Danny Sullivan: Site Reputation, Content Strategy, and Algorithm Nuances

The day culminated in a virtual appearance by Danny Sullivan, who joined for a thought-provoking Q&A moderated by Martin Splitt. Tackling more strategic and philosophical questions, Danny shared invaluable insights:

1. Quality Over Tactics: Algorithms and Content Credibility

Danny addressed the question:

“How do algorithms distinguish between surface-level SEO optimisation and truly high-quality, user-focused content?”

He explained:

“It’s not that we’re trying to understand, ‘Oh, is this SEO versus content that’s not SEO.’ We’re trying to understand if we think that content is helpful, reliable, useful, and satisfying to people.”

This reaffirms Google’s commitment to rewarding content that prioritises user needs over SEO tactics.

2. Expanding into New Content Areas: Build or Split?

One attendee asked via the open question form:

“If a site wants to expand into new content areas, what’s the best way to establish credibility and ensure alignment for search ranking expectations?”

Danny’s advice was practical:

“You would do the things that people are used to expecting from your existing site, assuming it has this established reputation. If you’re venturing into something entirely different, consider creating a separate site to build its own reputation and identity.”

Martin Splitt added:

“Think of it like a YouTube channel evolving over time. If you’re covering drone photography and then pivot to motorcycle travel, you might lose your audience’s trust unless the transition feels natural.”

3. Core Updates and Algorithm Transparency

Danny used the session to announce that the December 2024 Core Algorithm Update was officially live. The update was shown on-screen during the Q&A session, aligning with Google’s iterative approach to updates. He likened updates to phone operating systems:

“At any given day, there are lots of small changes happening—thousands of them annually. Core updates are larger shifts, but they’re part of that same iterative improvement process. Ideally, we’d like to reach a point where updates are so regular that people stop noticing them altogether.”

Google Webmaster Conference, Zurich 2019 vs the Google Search Central Conference, Zurich 2024. 2019: “More core algorithm updates”. 2024: more core updates, and more often too.

Core Update Prediction from 2019 to 2024

The comparison between the December 2019 Google Webmaster Conference and the December 2024 Google Search Central Conference reveals how Google’s predictions for search evolution have materialised. In 2019, John Mueller highlighted the future importance of more frequent core algorithm updates as seen in the image.

Fast forward five years, and Google has delivered on those predictions. The December 2024 Core Algorithm Update marks the 18th core update since 2019, aligning with Google’s increased cadence of updates.

Date                            Core Update Name
June 3–8, 2019                  June 2019 Core Update
September 24, 2019              September 2019 Core Update
January 13–17, 2020             January 2020 Core Update
May 4–18, 2020                  May 2020 Core Update
December 3–16, 2020             December 2020 Core Update
June 2–12, 2021                 June 2021 Core Update
July 1–12, 2021                 July 2021 Core Update
November 17–30, 2021            November 2021 Core Update
May 25–June 9, 2022             May 2022 Core Update
September 12–26, 2022           September 2022 Core Update
March 15–28, 2023               March 2023 Core Update
August 22–September 7, 2023     August 2023 Core Update
October 5–19, 2023              October 2023 Core Update
November 2–28, 2023             November 2023 Core Update
March 5–April 19, 2024          March 2024 Core Update
August 15–September 3, 2024     August 2024 Core Update
November 11–December 5, 2024    November 2024 Core Update
December 12, 2024               December 2024 Core Update

This reflects ongoing discussions in the SEO community about the future of core updates. Notably, Glenn Gabe, speaking at the News and Editorial SEO Summit (NESS) in October 2023, predicted that Google might shift to running core updates continuously rather than at scheduled intervals.

At the conference, Danny Sullivan reinforced this shift, stating:

“We’re always improving our systems to deliver better search results. If an update is ready, why wait? The goal is to make updates routine and continuous, so they’re no longer seen as major events.”

This marks a clear departure from Google’s historical approach. In 2013, Matt Cutts, then head of Google’s webspam team, emphasised that updates were intentionally minimised during the holiday season to avoid disruptions for businesses (source).

While Sullivan’s “why wait?” philosophy prioritises users by delivering improvements as soon as they’re ready, it raises valid concerns for businesses, particularly during critical commercial periods like Black Friday and Cyber Monday.

Google has addressed the question of updates during the holiday season before. In one statement on their Search Central site, they state:

“Aren’t there supposed to be no updates during the holiday shopping season?”

“We do try to avoid having updates during the late-November to mid-December period when possible. But it’s not always possible. If we have updates that can improve Search, that have been developed over the course of several months, we release them when they’re ready.”

Google Search Central (Source)

This acknowledgment reflects Google’s evolving priorities, focusing on delivering improvements for users even if it may cause temporary disruptions for businesses.

However, this commitment to avoiding holiday-season updates is contradicted by the timing of the November Core Algorithm Update, which began rolling out on 11 November 2024, and the December Core Algorithm Update, released on 12 December 2024.

Both updates fell squarely within the peak shopping period, with the November update disrupting critical preparation time for Black Friday and Cyber Monday, and the December update affecting businesses during the final rush before Christmas.

These actions seem to disregard Google’s stated effort to minimise disruption during the late-November to mid-December window. While the updates may enhance the user experience, the timing reinforces the growing tension between Google’s user-first philosophy and the operational realities faced by businesses reliant on stable search visibility during the holidays.

4. Third-Party vs In-House Content (“Sites within a site”)

Danny also touched on Google’s perspective on third-party content, stating:

“If you’ve built up site signals based on first-party content and then third-party content takes advantage of those signals, it creates confusion for users. Treating distinct areas as ‘sites within a site’ can help address this issue.”

He acknowledged that freelance content is not inherently problematic but emphasised the importance of ensuring that third-party content aligns with the site’s purpose and reputation to maintain user trust and prevent abuse of ranking signals.

Expanding on this, Danny suggested that publishers exploring entirely new areas might consider moving those efforts off-domain:

“If you’re venturing into something entirely different, consider creating a separate site to build its own reputation and identity.”

This approach echoes how Google treats Blogger subdomains, where individual blogs (e.g., example.blogspot.com) are evaluated independently of the main blogspot.com domain.

Vanessa Fox, former Product Manager of Google Webmaster Central, explained (referencing a 2007 PubCon talk by Matt Cutts) that Google recognises these subdomains as distinct entities managed by different individuals. Each subdomain must establish its own signals to rank effectively, as they do not automatically inherit the authority or trust of the parent domain. She gave the following example (even if I don’t think it is entirely relevant today):

“Use subdomains when you have very disparate content that you feel searchers would find relevant. For instance, videos are very different from articles, so it may make sense to separate those into subdomains.”

Vanessa Fox, former Product Manager of Google Webmaster Central

Matt Cutts, former Head of Web Spam at Google, described subdomains in 2007 as useful for separating content that is “completely different.” As he noted:

“A subdomain can be useful to separate out content that is completely different. Google uses subdomains for distinct products such as news.google.com or maps.google.com, for example.”

Matt Cutts, former head of web spam

Google’s “sites within a site” framework builds on this principle, suggesting that subfolders or sections on a domain that diverge significantly from its core purpose—such as those focused on affiliate content or niche topics—might be treated as standalone entities. These areas may not benefit from the main site’s authority and will need to build their own trust and relevance. For publishers, this could result in a “reset” in rankings or performance until those sections prove their value.

For example, a site expanding into niche content might initially gain visibility through its parent domain. However, as the content grows and diverges, Google may begin evaluating it as a separate property. This raises the strategic question of whether publishers should continue hosting such content on the primary domain or transition it to a dedicated domain with a clearer focus. While moving off-domain allows for greater alignment and independence, it also involves the challenge of starting from scratch to establish authority.


Key Themes and Takeaways

Throughout the day’s engaging sessions and candid Q&As, several key themes emerged, providing not just practical insights but also a roadmap for navigating the rapidly evolving SEO landscape:

1. Artificial Intelligence’s Role in SEO

AI is no longer just a buzzword; it’s a powerful tool that can augment human creativity and efficiency. From Gemini’s gradual integration to structured data now supporting AI-generated elements like product images, Google’s message was clear: AI is here to enhance—not replace—human effort. As Martin Splitt succinctly put it: “AI can assist in content creation, but nothing replaces human-to-human interaction.”

Liz Reid, speaking at Fortune’s Brainstorm AI 2024, commented on this too:

“AI-generated content isn’t intrinsically bad. It can help create amazing information. But scaled content abuse is a challenge we take seriously.”

Elizabeth Reid, VP of Search, Google

Related: Brainstorm AI 2024: “Combating the ‘AI Slop’ Problem” with Elizabeth Reid

AI’s role is not without its pitfalls, as low-quality, mass-produced AI content can harm a site’s reputation. Responsible adoption is critical, leveraging AI to streamline workflows and improve user experiences without compromising authenticity.

The presentation on Retrieval-Augmented Generation (RAG) highlighted a key evolution in how AI models, including ChatGPT and Google’s own LLMs, integrate more recent and reliable information into their outputs. By layering AI-trained data with real-time web content accessed through a vector database, these systems address a fundamental limitation of static training datasets: outdated information.

Related: Artificial Intelligence Optimization (AIO) — Embrace or Resist?

This approach allows AI to fetch snippets of up-to-date data from trusted sources, ensuring that responses are not only accurate but contextually relevant to the latest trends. As the diagram shows, search results provide metadata and context, which the AI uses to refine its outputs—making it a dynamic tool for surfacing timely and useful information.
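
As a rough illustration of that layering (and emphatically not Google’s or OpenAI’s actual pipeline), the sketch below pulls the best-matching snippet from a tiny in-memory corpus and folds it into the prompt before generation. A real system would use a vector database and learned embeddings; the naive word-overlap scoring, the corpus entries, and the prompt format here are all assumptions made for the example.

# Minimal retrieval-augmented generation (RAG) sketch: retrieve, then prompt.
CORPUS = [
    {"url": "https://example.com/core-update-dec-2024",
     "text": "The December 2024 core update began rolling out on 12 December 2024."},
    {"url": "https://example.com/searchgpt-tracking",
     "text": "SearchGPT appends utm_source=chatgpt.com to outbound referral links."},
]

def retrieve(query: str, k: int = 1) -> list:
    """Rank documents by naive word overlap with the query (a stand-in for vector search)."""
    q_words = set(query.lower().split())
    ranked = sorted(CORPUS,
                    key=lambda doc: len(q_words & set(doc["text"].lower().split())),
                    reverse=True)
    return ranked[:k]

def build_prompt(query: str) -> str:
    """Layer retrieved, up-to-date snippets on top of the model's static training data."""
    snippets = "\n".join(f"- {d['text']} (source: {d['url']})" for d in retrieve(query))
    return ("Answer using the retrieved context below and cite the sources.\n"
            f"Context:\n{snippets}\n\nQuestion: {query}")

print(build_prompt("When did the December 2024 core update start rolling out?"))

The retrieval step is what keeps the answer current: swap the toy corpus for live search results and the overlap score for embeddings, and you have the general shape of the systems described in the presentation.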

While this enhances AI’s potential, it also underscores the growing reliance on high-quality web content to fuel these advancements. For SEOs and publishers, the message is clear: maintaining authoritative, up-to-date content is more important than ever to remain visible and valuable in an AI-driven search ecosystem.

Ilya Sutskever discussing AI and data as fossil fuel

Ilya Sutskever, a renowned machine learning scientist and former chief scientist at OpenAI, aptly described data (which, in my opinion, largely includes web content, structured datasets, user-generated data, and interaction data) as the “fossil fuel of AI” during his presentation. As advancements in compute power, algorithms, and clusters continue to grow, the availability of new data has not kept pace. As he put it:

“We have but one internet… That data is the fossil fuel of AI. It was created somehow. And now we use it.”

This statement underscores the increasing reliance on existing, high-quality web content to fuel AI systems. The web is finite (in terms of usefully diverse, high-quality data), and with pre-training widely believed to be reaching its limits, AI’s evolution will depend on continually refining and optimising the datasets it draws from. For SEOs and publishers, this reality drives home a critical message: maintaining authoritative, reliable, and up-to-date content is no longer optional—it’s essential.

Future Visibility: As AI-driven search experiences like Google’s Gemini and OpenAI’s SearchGPT evolve, the relevance of publishers in these systems will depend on their ability to provide fresh, valuable insights. However, there will have to be a value exchange—publishers cannot simply serve as data providers for AI without receiving tangible benefits in return. This could include better attribution, traffic generation, or revenue-sharing models that ensure the sustainability of quality content creation (the fossil fuel that a lot of us are sitting on).

2. Content Credibility and Reputation

Google underscored the importance of building trust through high-quality, user-focused content. The Site Reputation Update and discussions around third-party content highlighted that trust and transparency are paramount in today’s SEO environment. As Danny Sullivan explained:

“If you’ve built up site signals based on first-party content and then third-party content takes advantage of those signals, it creates confusion for users.”

Experimentation and innovation remain encouraged, but substantial pivots may necessitate creating dedicated platforms or treating new areas as ‘sites within a site’ to ensure clarity for both users and algorithms.

Danny’s mention of ‘sites within a site’ suggests that Google may treat subfolders or subdomains differently, especially when their focus significantly diverges from the main site’s purpose. This ensures that unrelated or niche content, such as affiliate-driven sections, must establish their own reputation rather than relying solely on the parent domain’s authority.

AI-generated content also plays a significant role in maintaining or eroding credibility. As Martin Splitt emphasised, using AI to mass-generate content without oversight can harm a site’s reputation and lead to issues with Google’s scaled content abuse policies. High-quality, user-focused content must remain the priority, even when leveraging AI tools.

3. User-First Philosophy

At the heart of every conversation during the conference was a consistent reminder: always prioritise your audience. Whether discussing content strategies, core updates, or AI integration, Google’s representatives reinforced the importance of user-first thinking. As Danny Sullivan aptly stated:

“Would you create this for your readers if Google wasn’t here?”

Danny Sullivan, Google Search Liaison

This philosophy challenges publishers to create content that genuinely serves user needs rather than focusing solely on search engine performance. However, this raises important questions about how Google’s dominance in the search ecosystem influences the way content is created:

  • If Google prioritises user-first content, why does its SERP design often prioritise ads and Google-owned properties over organic results?
  • How can publishers align with a “user-first” philosophy when zero-click searches often prevent users from fully engaging with publisher content?
  • What specific metrics does Google use to evaluate ‘helpful’ and ‘satisfying’ content, and how transparent is this evaluation process?

Critics argue that Google’s user-first philosophy, while aspirational, is often at odds with its own practices. As Nate Hake (via Twitter/X) points out, Google has shaped user expectations by training them to expect specific types of content through search—such as recipes, travel guides, and how-tos. Producing such content is not inherently manipulative; it reflects the need to meet users where they are and provide the answers they seek.

Sullivan’s statement also ties into Google’s focus on “satisfying content,” a concept recently highlighted in their updates. This isn’t just about answering questions; it’s about ensuring users feel fully served when they land on a page. Yet, this raises additional questions:

  • Should sites focus solely on creating content that aligns with Google’s expectations—prioritising user satisfaction as if created by a hobbyist with no ulterior motives like revenue or monetisation—or should they take a broader approach that balances these ideals with business goals and user needs across platforms like YouTube, LinkedIn, Bing, DuckDuckGo, Facebook, TikTok, ChatGPT, Perplexity and X/Twitter? This is especially challenging when Google controls 89.33% of the search market and acts as the primary gateway to the web.

For publishers, these questions are especially relevant.

Large-scale publishers face significant challenges in managing their reputations, balancing the need to expand into new areas with the responsibility of maintaining trust and quality. The best publishers go further—they create insights and value so unique, impactful, and expensive to replicate that their presence becomes indispensable. The aim is to operate at a level where the value you deliver makes it impossible for Google not to show your content. Without it, Google’s results would be noticeably worse, diminishing the quality and relevance users expect. This is the standard that publishers and businesses alike must strive for in today’s world.

At the same time, smaller publishers must compete against the scale, authority, and resources of established players, which can make standing out in competitive spaces particularly difficult. Navigating these dynamics while aligning with Google’s expectations requires thoughtful strategy in the current world.

Ultimately, while these questions highlight areas of tension, Sullivan’s guidance reinforces a key principle: the best-performing content aligns with user intent. By prioritising clarity, trustworthiness, and relevance, publishers can create content that not only serves users but also aligns with Google’s mission to surface high-quality results.

“to organize the world’s information and make it universally accessible and useful.”

Google’s Mission

Final Thoughts

The Google Search Central Conference in Zurich offered a glimpse into Google’s vision for the future of search (Read: Google’s Elizabeth Reid on the Future of Search) while also highlighting the challenges publishers face in adapting to its evolving expectations. While Google’s emphasis on user-first content and quality is commendable, the discussions revealed critical gaps in clarity, consistency, and practicality for those working within its ecosystem.

The suggestion to move substantial content expansions to separate sites—while logical in theory—presents significant logistical and financial hurdles for publishers who have already built established businesses with large head counts and operational capabilities. Building a new domain’s reputation from scratch requires time, resources, and sustained effort that many organisations may find difficult to justify.

A recurring tension throughout the conference was Google’s aspirational messaging versus the realities of modern publishing. Issues such as ambiguous enforcement of site reputation metrics, the complexities of managing freelancers and third-party content, and inconsistent algorithmic behaviour continue to make aligning with Google’s expectations a challenge for publishers of all sizes. Greater transparency and consistency remain essential to help publishers make informed decisions and navigate these challenges effectively.

Ultimately, the conference reinforced a critical takeaway: publishers must find a balance between relying on Google and building independent value for their audiences. While Google remains the dominant gateway to the web, publishers must prioritise trust, quality, and sustainability in their strategies. Diversification away from Google Search has never been more important.

With the advent of AI, frequent algorithmic changes like core updates, and new policies, the need to broaden focus across multiple platforms and ecosystems has never been greater. A willingness to challenge the status quo, adapt to evolving user needs, and maintain alignment with business goals across platforms is key to thriving in an ever-changing digital landscape, especially with the significant advancements of artificial intelligence.

If you need to submit corrections, please send an email titled ‘Corrections’ to the address listed on my contact page.
