This document contains advice from a presentation on JavaScript rendering and SEO. It recommends: (1) Being mindful that Google has not fully updated its tools to the latest JavaScript rendering capabilities. (2) Avoiding client-side JavaScript rendering and instead implementing server-side rendering or static sites to ensure content is crawlable. (3) Introducing and adhering to web performance budgets and limiting client-side JavaScript to improve performance for users and search engines.
16. @bart_goralewicz
"We decided to try to understand pages by executing JavaScript. It's hard to do that at the scale of the current web, but we decided that it's worth it."
2014
18.
"Times have changed. Today, as long as you're not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers."
2015: Google claims they are generally able to render JavaScript
22.
2018: Google wants to render ALL WEBSITES on its own and stop using the old AJAX crawling scheme.
"Googlebot will render (...) directly, making it unnecessary for the website owner to provide a rendered version of the page."
SOURCE: https://webmasters.googleblog.com/2017/12/rendering-ajax-crawling-pages.html
27.
"Today, we are happy to announce that Googlebot now runs the latest Chromium rendering engine (74 at the time of this post)"
7th of May 2019
https://webmasters.googleblog.com/2019/05/the-new-evergreen-googlebot.html
33.
It's great that Google wants to support new JavaScript features! But for the average SEO or developer, IT DOESN'T CHANGE MUCH.
Reality?
43.
GOOGLEBOT IS BASED ON THE NEWEST VERSION OF CHROME, BUT IS NOT A REAL BROWSER!
• Googlebot declines user permission requests
• Cookies, local & session storage are cleared across page loads
• A real browser always downloads all the resources; Googlebot may not
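Because Googlebot clears storage between page loads and declines permission prompts, content should never depend on persisted state being there. A minimal sketch of this defensive pattern (the helper name and `visits` key are illustrative, not from the talk):

```typescript
// Never gate primary content on permissions or persisted state:
// Googlebot starts every page load with empty storage, so the
// "storage unavailable / empty" path is the one it always takes.
function getVisitCount(
  storage: Pick<Storage, "getItem" | "setItem"> | null
): number {
  // Fall back gracefully when storage is unavailable.
  if (!storage) return 1;
  const previous = Number(storage.getItem("visits") ?? "0");
  const current = previous + 1;
  storage.setItem("visits", String(current));
  return current;
}
```

The page should render the same critical content whether this returns 1 (Googlebot's case, every time) or a higher count.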
59.
Client-side rendering:
• Both users and search engines have to render JS on their own
• Default practice; usually the most problematic one for Google
Dynamic rendering:
• Bots get a static version, easy to crawl and index
• Users get a fully-featured JavaScript website
Isomorphic JS:
• Initial, server-side rendered HTML is sent to users and search engines
• Then JavaScript is loaded on top of that
Static-site rendering:
• All the HTML files are built with data BEFORE they are uploaded to a server
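The core of dynamic rendering is a user-agent check that routes bots to prerendered HTML and everyone else to the JavaScript app. A minimal sketch of that decision, assuming an illustrative (not exhaustive or official) token list and function name:

```typescript
// Illustrative bot tokens; real setups match against a maintained list.
const BOT_TOKENS = ["googlebot", "bingbot", "linkedinbot", "twitterbot"];

// Decide whether a request should get the prerendered static version.
function shouldServePrerendered(userAgent: string): boolean {
  const ua = userAgent.toLowerCase();
  return BOT_TOKENS.some((token) => ua.includes(token));
}
```

In a server, this predicate would sit in middleware: when it returns true, serve the cached prerendered HTML; otherwise, serve the normal client-side app.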
64.
DYNAMIC RENDERING
"is a workaround, but (...) a stepping stone towards improving your website because server-side rendering and hydration, there's a bunch of work to do."
Martin Splitt
68.
2. AVOID CLIENT-SIDE JS RENDERING
We saw a few cases where 100% CSR websites would rank, BUT:
• they are rather an exception than a rule
• all the websites were fairly small and static
72.
3. INTRODUCE (AND STICK TO) A WEB PERFORMANCE BUDGET AND LIMIT CLIENT-SIDE JAVASCRIPT
for:
• Web performance
• Crawl budget
• Social media
• Bing and Altavista
P.S. Don't pay your JavaScript developers based on "the number of lines of code per day" ☺
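A performance budget only helps if it is enforced. A minimal sketch of a budget check that could run in CI (the 170 KB ceiling is a commonly cited recommendation for compressed JS, not a number from the talk):

```typescript
// Hypothetical budget shape for illustration.
interface Budget {
  maxScriptKb: number;
}

// Returns true when the combined client-side JS stays within budget.
function isWithinBudget(scriptSizesKb: number[], budget: Budget): boolean {
  const total = scriptSizesKb.reduce((sum, kb) => sum + kb, 0);
  return total <= budget.maxScriptKb;
}
```

A CI step would feed this the sizes of the built bundles and fail the build when it returns false, which is what "stick to" the budget means in practice.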
77.
5. DON'T USE FEATURES TARGETED ONLY AT COMPLYING WITH CHROME 41 ANYMORE
But experiment and test before implementing changes. It's new territory.
80.
Share these #SMXInsights on your social channels!
ALWAYS:
1. Check if Google can render your website.
2. Make sure that Google can index your content quickly. Use the "site:" command.
3. Make sure that Google can discover your links.
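One quick way to check point 1 is to verify that critical content already appears in the raw server HTML, rather than only after client-side rendering. A sketch of that check (the helper name and phrases are illustrative):

```typescript
// Returns the critical phrases that are MISSING from the raw server
// response; a non-empty result means that content depends on
// client-side rendering to appear.
function contentInRawHtml(
  rawHtml: string,
  criticalPhrases: string[]
): string[] {
  const haystack = rawHtml.toLowerCase();
  return criticalPhrases.filter((p) => !haystack.includes(p.toLowerCase()));
}
```

Run it against the HTML returned by a plain fetch of the URL (no JS execution), then compare with what Google's URL Inspection rendering shows.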