Google never, ever does something for no reason. Sometimes it’s just a matter of waiting patiently to figure out what that reason is.
In May, Google created a fetch and render tool in Google Webmaster Tools that was built to render web pages properly for GoogleBot. At the time, it was unclear why the company was introducing the tool, though Google hinted at future plans that would involve fetch and render.
On Oct. 27, we got a definitive answer.
So the tool that was put out a few months earlier was basically a warmup: it can be used to make sure GoogleBot is rendering your web pages correctly.
All of it is part of a drive toward better user experience, which is ultimately what is behind the changes Google has made.
The Nitty-Gritty of the Changes
Google’s indexing systems now render web pages much the way a modern browser would, with CSS and JavaScript turned on. That’s a big change from before, when Google’s indexing systems were more like text-only browsers; Google cites Lynx as an example. The search engine says that approach no longer made sense, since users experience pages as modern browsers render them, and indexing based on the rendered page better reflects what visitors actually see.
The search engine offers a few suggestions for optimal indexing, including:
- Getting rid of unnecessary downloads
- Using the progressive enhancement guidelines in your web design
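Google’s announcement paired these suggestions with a related warning: disallowing crawling of the JavaScript or CSS files a page depends on directly harms how well its algorithms can render and index that page. As an illustration only, a robots.txt that keeps those resources open while restricting an unrelated area might look like this (all directory names here are placeholders):

```
# Illustrative robots.txt -- directory names are placeholders
User-agent: *
Allow: /css/
Allow: /js/
Disallow: /admin/
```

Everything is crawlable by default, so the Allow lines only matter when a broader Disallow would otherwise cover those paths; the real point is simply to never Disallow the scripts and styles your pages need in order to render.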
What This Means
With any Google change, the real question is what it means in practice. How will it impact webmasters, and what sort of effect could it have on SEO?
The answer to that second question is clear: sites that do not adhere to the suggested guidelines will see their search rankings suffer. Make sure your webmaster fully understands what Google is asking for, and discuss what type of changes should be implemented and how they could affect your Google rankings.
Your aim is to create crawlable content, and that means doing whatever Google suggests. Use the fetch and render tool to make sure everything on your site is in order; it will crawl and display your site just as it would come up in your target audience’s browsers. Does the rendered result match what you see in your own browser?
If yes, you are in good shape. If no, you need to figure out what tweaks to make so that Google sees the same thing you do. Here are potential problems that could be making your site’s content non-crawlable:
- Your server can’t handle the number of crawl requests you receive
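Another culprit worth checking is a robots.txt rule that blocks the scripts or stylesheets a page needs to render. As a quick sketch, Python’s standard `urllib.robotparser` can show how a crawler would interpret such a rule; the rules and URLs below are hypothetical, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks a scripts directory
robots_txt = """
User-agent: *
Disallow: /js/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under the wildcard rule here, so the blocked
# script file is not fetchable -- and a page that depends on it
# may render (and be indexed) very differently than you expect.
print(parser.can_fetch("Googlebot", "https://example.com/js/app.js"))   # False
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```

If a resource your page relies on comes back as non-fetchable, that rule is a likely reason the fetch and render tool shows something different from what your visitors see.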
Why These Changes, Why Now
Google always has intent behind what it does, and here’s my read on its intent with these changes: it’s making user experience a bigger factor in its search rankings. Think about it. The new emphasis on page loading and rendering represents two major steps in that direction.
That has also prompted speculation that the company could start using mobile user experience in its rankings as well. In recent months, as mobile usage begins to overtake desktop, many have predicted that Google will shift its focus to the mobile web for search engine optimization.
So could this be one of the first steps on the way to those big changes? Perhaps. I always think it’s dangerous to try to get too many steps ahead of Google; the search engine likes to reverse course and throw people off from time to time. It does not like it when SEOs make changes in anticipation of its actions, preferring to dictate the course itself. And I do think the idea behind the crawlable-non-crawlable content changes makes sense. You have to keep up with the times.
But others could argue that keeping up with the times is exactly what Google will be doing by putting greater emphasis on mobile user experience.
The Bottom Line
Like any change from Google, this one will require adjustment and a fair bit of vigilance. I think it’s mostly a sign of things to come. User experience is really important to Google these days, and you would be wise to start looking at your mobile site in those terms. Make sure that you are doing everything you can to make your site mobile friendly, while still presenting a great desktop experience.
That way if Google does actually start penalizing based on poor mobile user experience, you will already be two steps ahead.