
Comment

This week's issue has been made easy by the quality of the articles published in the last week or so. Our Post of the Week, Algorithm Analysis in the Age of Embeddings, by one of my favourite SEO bloggers, AJ Kohn, looked at the obsession with Google's Rater Guidelines and why we should stop focussing on E-A-T in particular (which sparked a bit of a Twitter debate here).

I have a similar opinion. Google's Rater Guidelines are only interesting for seeing where Google is heading: its aspirations. Too many SEOs are making groundless recommendations based on August's update and the change to the Rater Guidelines, without taking a step back and considering how Google might reflect E-A-T algorithmically.

Rand recently tweeted this, which I think perfectly sums up our industry's reaction to Google updates:

"Google launches an algo update:

  • (1) 99% of the time, 99% of marketers should do nothing differently.
  • (2) 1% of SEOs will dig deep, form theories, & write about it (which is fine)
  • (3) That 1% biases the 99% to think they're supposed to make big changes (that's not so good)"

Making reactive changes in response to updates is often the worst thing we can do. After all, Google now makes thousands of updates each year. What we should be doing is learning more about the direction of the updates it's making, which can often be done by looking at the SERPs.

Here's a bad analogy about the weather: to work out whether it's going to rain in the next half hour, we can usually just look up at the sky. Based on what we've seen before, there are certain characteristics we associate with rain clouds: dark grey, low to the ground, blanket-like across the sky, and so on. There are then technical characteristics forecasters can use to make even better predictions, such as low surface pressure, wind direction and wind speed.

So where the hell am I going with this? I think the same way of breaking down a problem can be applied when analysing updates and rankings. First and foremost, the SERPs give us the majority of the information we need. The pages that are ranking show:

  • What Google believes to be the intent of the search query.
  • Common characteristics of top performing content we can assume Google deems important.

To expand on this, we can then use some of the tools at our disposal to augment our understanding of the problem, e.g. are links still important for ranking head terms in top positions? (Check Tom's post below 😉)

Please ignore the '5 ways to optimise your website for E-A-T' posts; head over to the SERPs and do your own testing.
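If you want to make that testing slightly more systematic, here's a minimal, hypothetical sketch in Python: paste in the URLs you see ranking for a query, fetch each page, and tally a few crude on-page characteristics you might compare across the top results. The query, URLs and traits below are placeholders for illustration, not a recommended methodology.

    # A rough sketch of "do your own SERP testing": given a hand-collected list
    # of top-ranking URLs for a query, record a few simple, observable traits.
    import requests
    from bs4 import BeautifulSoup

    QUERY = "example head term"          # hypothetical query
    TOP_RESULTS = [                      # paste in the URLs you see ranking
        "https://example.com/guide",
        "https://example.org/comparison",
    ]

    def page_traits(url: str) -> dict:
        """Fetch a ranking page and record a few crude characteristics."""
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        text = soup.get_text(" ", strip=True)
        title = (soup.title.string or "") if soup.title else ""
        return {
            "url": url,
            "https": url.startswith("https://"),
            "word_count": len(text.split()),
            "query_in_title": QUERY.lower() in title.lower(),
        }

    if __name__ == "__main__":
        for traits in map(page_traits, TOP_RESULTS):
            print(traits)

It's deliberately crude: the point is to look at what's actually ranking and form your own hypotheses, rather than acting on someone else's checklist.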

Andrew Charlton

Post of the Week

Tech


Tools

Hiring