Learn how to properly set up your Google Webmaster Tools. Get the data, tools, and diagnostics for a healthy, Google-friendly site. Here are some of the things you can do with Google Webmaster Tools:
1. Check your site's health for potential issues that Google has detected.
2. Understand your search traffic and learn how users are finding your site.
3. Make optimizations to help Google better understand and represent your site.
Watch this presentation to get started!
Presentation by Ben Lewis at www.Boostability.com
3. Choose a verification method
HTML file: upload it via FTP or file manager.
HTML tag: copy and paste it into the code just before </head>, via FTP or file manager.
Click verify after the code is in place.
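As an illustration of the HTML tag method, the verification meta tag goes just before the closing </head>. Note that Google generates a unique content value for your account; the token below is only a placeholder:

```html
<head>
  <title>Example Site</title>
  <!-- Placeholder token: use the exact tag Google gives you -->
  <meta name="google-site-verification" content="PLACEHOLDER_TOKEN" />
</head>
```

With the HTML file method, you instead upload the verification file Google provides (named something like googleXXXXXXXX.html) to the site's root folder.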
4. Verify both sites (www and non-www)
After the first site is verified, click on “Webmaster Tools”. Click on the Add A Site button and add the domain again. If the first domain included the www, the second will be non-www. Choose the same method of verification as the first domain and click verify (you do not need to upload a new file or HTML code; they will be the same).
5. Add a sitemap to both domains
On the site dashboard, click on Sitemaps, then click to add the XML sitemap.
6. Add a sitemap to both domains
Enter only the file name and extension (sitemap.xml), not the whole URL. Repeat for both the www and non-www domains.
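For reference, a minimal sitemap.xml follows the sitemaps.org protocol; the URLs below are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/about</loc>
  </url>
</urlset>
```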
7. Set the preferred domain
Navigate to one of the domains within Webmaster Tools. Click on the gear icon and select Site Settings. Select the preferred domain (www or non-www). The preferred domain is how the domain will appear in search engines. It also tells Google that both the www and non-www domains are the same. It is better to have all traffic coming from the same domain to consolidate domain and page authority.
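The preferred-domain setting does not itself redirect visitors. If the site runs on Apache (an assumption; other servers use different configuration), a site-wide 301 from non-www to www can be sketched in .htaccess, with example.com as a placeholder:

```apache
# Sketch: permanently redirect non-www requests to the www domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Reverse the host names if the non-www version is your preferred domain.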
8. Test robots.txt
Navigate to Crawl > robots.txt Tester. Enter the robots.txt URL (it should be in the root folder). Click TEST to make sure Google is able to crawl the website. Make sure the XML sitemap is included in the robots.txt file. If it is not, add it via FTP or file manager.
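A robots.txt that allows crawling and declares the sitemap might look like this (the domain is a placeholder):

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

An empty Disallow line means nothing is blocked; the Sitemap line is what points Google at the XML sitemap.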
9. Check for crawl errors
Navigate to Crawl > Crawl Errors. Check for 404 (not found) errors. Check a few of the recent 404 errors to make sure they are still 404s and not just a server glitch. Check by visiting the URL in the browser. If it is a missing page, an error message or 404 page will appear. If the page is not missing, select the box next to the URL in WMT and click the MARK AS FIXED button. If the page is missing, 301 redirect it to a relevant page.
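On Apache (again an assumption about the server; the paths are placeholders), a single-page 301 redirect can be sketched in .htaccess:

```apache
# Permanently redirect a removed page to the most relevant live page.
Redirect 301 /old-page.html /new-relevant-page.html
```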
10. Fetch the domain
After the XML sitemap has been uploaded, the robots.txt file checked, and the 404 pages fixed, navigate to Crawl > Fetch as Google. Without entering a URL, click the Fetch button. This will tell Google to test whether it can crawl the domain. You can also test one page at a time by entering a page URL.
Once the fetch returns a Success status, move on to the next step.
11. Crawl the domain and all linking pages
Click the “Submit to index” button. A pop-up window will appear. Select the second option, “Crawl this URL and its direct links.” Click the Go button.
12. Google WMT is set up!
You’ve made sure Google can crawl the site and pointed Google to the XML sitemap through the robots.txt file. You’ve told Google where all the pages are on the site and how the site is structured by submitting the XML sitemap. You’ve helped consolidate domain and page authority by telling Google that the www and non-www websites are the same site, and told Google how to display the domain in search results (*you still have to make sure the domain is canonical by 301 redirecting non-www to www or vice versa). You’ve fixed the 404 errors to get the site ready for Google to crawl and index updated pages. Finally, you submitted the domain and all linking pages to Google to crawl. You’ve done your part. Now it’s up to Google to decide when to crawl the site.