Nine (mostly technical) ways to ruin your search engine rankings and kill your traffic with a site redesign, relaunch or migration … and how to avoid them. This talk was originally presented at WordPress DC.
@melaniephung
A few fun facts to set the stage:
● 40-60 billion searches are conducted on Google in the U.S. per month
● 51% of all web traffic comes from organic search
● More than 40% of online revenue is captured by organic traffic
● Nearly 25% of all search volume happens outside the top 100 million keywords (that’s a long tail!)
● As of 2016, Google has indexed 130 trillion web pages
● Organic search results still get ~20x more clicks than PPC ads
Sources: Jumpshot/Moz.com/BrightEdge/Search Engine Journal
Mistake 1: Failing to redirect properly
Example: the British royal website was migrated to a new domain name. Rather than doing 1:1 redirects, the site 301’d EVERY URL from the old site to the new homepage.

The result: an 80% drop in traffic.

Similar fail: setting rel=canonical for every page to the homepage.

Example and image from: OmiSido
Have a redirection plan to avoid losing traffic
Failing to redirect, or doing it badly, can cost you most of your traffic.
● Bring in a content strategist or SEO to conduct a comprehensive content audit well in advance of site launch.
● Have a migration plan that includes a full redirect mapping.
● Document and communicate requirements clearly with your dev team.
● QA the heck out of your migration (as per your content migration plan).

Related: Be hyper-vigilant about canonicals. If you don’t know how to manage them, it’s better to avoid touching them at all -- a mismanaged canonical is one of the most powerful ways to deindex pages.
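A redirect mapping can be QA’d mechanically before launch. As a sketch (plain Python, with hypothetical URLs), this function takes an old-URL → new-URL map and flags self-redirects, chains (a target that is itself redirected again), and loops:

```python
def audit_redirect_map(redirects):
    """Flag problems in an old-URL -> new-URL redirect map (a dict).

    A chain exists when a redirect target is itself a key in the map;
    a loop exists when following targets returns to a URL already seen.
    """
    problems = []
    for old, new in redirects.items():
        if old == new:
            problems.append(("self-redirect", old))
            continue
        seen, current, hops = {old}, new, 0
        # Follow the map hop by hop to detect chains and loops.
        while current in redirects:
            hops += 1
            current = redirects[current]
            if current in seen:
                problems.append(("loop", old))
                break
            seen.add(current)
        else:
            if hops:
                problems.append(("chain", old))
    return problems
```

Run it against the full mapping exported from your content audit; every flagged entry is a redirect that will either waste a hop or never resolve.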
Mistake 2: Not using SSL/TLS/HTTPS
Why Use HTTPS?
● HTTPS has been a Google ranking factor since 2014.
● Starting in July 2018, Chrome (version 68) will show a strong warning on non-secure pages.
● Migrating from HTTP to HTTPS after a site has been
live for a while is a messy PITA!
(You should still do it though!)
Stats:
● By last year, 50% of the results on the first page of Google’s SERPs were encrypted.
● In the U.S., HTTPS usage in Chrome is at 82%.
Sources: Moz.com; transparencyreport.google.com
How to avoid a bad HTTPS migration
● Create a sitewide 301 redirect rule for all URLs
● Ensure all media files are served over HTTPS also (no “mixed content” warnings)
● Avoid redirect chains -- Keep everything to one hop if possible
○ Yes:
http://www.domain.com → https://www.domain.com
and
http://domain.com → https://www.domain.com
and
http://www.domain.com/index.php → https://www.domain.com
○ No:
http://domain.com/index.php → http://www.domain.com/index.php → http://www.domain.com → https://www.domain.com
● Do not let both versions continue to resolve (return a 200 code)
● Do not let URLs on the old version return an error (404 code)
● Work with a professional who has experience with HTTPS migrations!
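The one-hop rule is easy to get wrong once www, index.php, and scheme variants pile up. As a minimal sketch of the target-computation logic (the host `www.domain.com` is the placeholder from the examples above; substitute your own), every legacy variant should map straight to its single canonical HTTPS form:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url, canonical_host="www.domain.com"):
    """Rewrite any legacy URL variant directly to its canonical
    HTTPS form, so the server can redirect in a single hop."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    if path in ("", "/index.php"):   # collapse index.php and bare hosts
        path = "/"
    # Force HTTPS and the canonical (www) host in one rewrite.
    return urlunsplit(("https", canonical_host, path, query, fragment))
```

If a redirect rule computes this final target for every incoming variant, no request ever needs more than one hop.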
Mistake 3: Blocking critical URLs in robots.txt
Very common mistake: Blocking any crawling of the entire site at launch
More insidious mistake:
● Blocking directories instead of specific pages
● Blocking JS and CSS
Consequences:
● Chunks of your site can’t be found in search results
● Google can’t render your site

Example provided by CognitiveSEO
How to prevent robots.txt errors
1. Learn, live, love RegEx
2. Test a LOT of URLs against robots.txt in Search Console
3. Make updating robots.txt part of every launch checklist
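Step 2 can also be scripted with the Python standard library. One caveat: `urllib.robotparser` implements the original robots.txt rules (simple prefix matching), not Google’s `*` and `$` wildcard extensions, so keep Search Console’s tester as the source of truth for complex patterns. A sketch with a hypothetical robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: blocks one directory, leaves assets crawlable.
ROBOTS_TXT = """\
User-agent: *
Disallow: /internal-search/
Allow: /assets/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# CSS/JS stay crawlable so Google can render the site.
assert rp.can_fetch("*", "https://www.example.com/assets/site.css")
# The blocked directory is the only thing disallowed.
assert not rp.can_fetch("*", "https://www.example.com/internal-search/q")
```

Feeding your launch checklist’s critical URLs through a loop like this catches an overbroad Disallow before it ships.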
Mistake 4: NOT blocking dev environments
Definitely do have one or more development environments! But don’t let them get indexed. Once Google starts crawling and indexing your testing sites, getting those links out of the index takes a bit of work.
Why You Don’t Want Dev Environments Indexed
Consequences:
● Duplicate content
● Dev/staging site can outrank your production site
● Customers will see your WIP site/broken experience/old site
● Customers will try to engage/transact/convert on a not-fully-functional site
● Analytics will break
(Real-world example: 37 versions of the same page indexed in Google. Details redacted to protect the guilty -- trust me, the same page showed up across 37 subdomains.)
How to prevent dev domains from getting indexed
Ahead of deploying:
1. Password-protect the dev environment(s)
2. Block non-internal IPs
3. Put robots “noindex” meta tag on every page
-or-
4. Block entire subdomain from being crawled in robots.txt
Be aware that robots.txt and the robots meta tag do different things!
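That difference trips people up constantly: robots.txt controls crawling, while the noindex meta tag controls indexing -- and the tag only works if the crawler is allowed to fetch the page and see it. A toy model of the interaction:

```python
def will_drop_from_index(blocked_in_robots_txt, has_noindex_meta):
    """Toy model of crawling vs. indexing directives.

    A noindex meta tag only takes effect when the crawler is allowed
    to fetch the page and actually see the tag. A robots.txt block
    stops the fetch, so the tag goes unseen and an already-indexed
    URL can linger in search results.
    """
    if blocked_in_robots_txt:
        return False  # the crawler never sees the noindex tag
    return has_noindex_meta
```

This is why combining options 3 and 4 backfires, and why password protection or IP blocking (options 1 and 2) come first: they hide the content without depending on crawler behavior at all.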
But remember: don’t transfer the block to your LIVE site!
Mistake 5: Generating duplicate content
Allowing multiple URLs to load the same page content creates a “duplicate content” issue. Duplicate content can impact rankings, and it also creates a messy analytics challenge.
Common examples:
● Dev environments (see Mistake #4)
● http AND https
● domain.com/index.php AND domain.com/
● /directory/page-name/ AND /node/2345/
● /results/ AND /results/?sort=default
Will the canonical Slim Shady please stand up?
How to prevent and identify duplicate pages
Prevention:
● Understand how your CMS handles slugs and URIs
● Avoid letting internal site search results get crawled
● Have a strategy for handling query parameters
○ Search Console
○ Robots.txt
○ Rel=canonical

Research & mitigation:
● Use the “site:” search operator to find on-site dupes
● Audit Google Analytics & Search Console data
● Crawl the site with tools like Screaming Frog
● 301 redirect duplicates to the strongest version of the page (if possible)
● Use rel=canonical (correctly!)
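When auditing a crawl export (from Screaming Frog or your analytics data), duplicate variants can be grouped by a normalized key. A sketch, assuming the parameters in `IGNORED_PARAMS` are non-significant for your site -- adjust both the parameter list and the normalization rules to match your own URL scheme:

```python
from collections import defaultdict
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Assumed non-significant (sort/tracking) parameters -- site-specific.
IGNORED_PARAMS = {"sort", "utm_source", "utm_medium", "utm_campaign"}

def dedupe_key(url):
    """Normalize a URL so duplicate variants collapse to one key."""
    scheme, netloc, path, query, _ = urlsplit(url)
    host = netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if path.endswith("/index.php"):          # /index.php duplicates /
        path = path[: -len("index.php")]
    if not path:
        path = "/"
    params = [(k, v) for k, v in parse_qsl(query) if k not in IGNORED_PARAMS]
    return urlunsplit(("https", "www." + host, path, urlencode(sorted(params)), ""))

def find_duplicates(urls):
    """Return groups of URLs that normalize to the same key."""
    groups = defaultdict(list)
    for url in urls:
        groups[dedupe_key(url)].append(url)
    return {key: group for key, group in groups.items() if len(group) > 1}
```

Each group in the output is a set of candidates for a 301 to the strongest version, or for a shared rel=canonical.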
Mistake 6: Picking a bad theme and plugins
Your code and plugins need to be:
● Crawlable by search engines
● Not riddled with malware
● Mobile-friendly (preferably also compliant with accessibility standards)
● Written in clean, semantic code
● Fast to load
● Patched and updated if standards change
Mistake 7: Not Paying Attention to Analytics
You can use various analytics tools to tell you, among other things:
● Which landing pages are generating the most traffic.
These are your top priority for migrating and redirecting
● Which keyword rankings you have.
Tells you which URLs & content you need to protect. (Use Search Console or paid tools, not GA)
● Where your referral traffic is originating.
These could be important sites to reach out to about your relaunch or for future link-building
● Which pages have high bounce rates.
These are pages you may need to improve
● What your traffic and performance trends are over time.
Key Performance Indicators and dashboards can help you identify problems … and progress!
Tip: Set up and use Google Search Console
Common issues:
● Duplicate metadata
● More URLs indexed than should exist
● Fewer URLs indexed than were submitted
● Server response errors
● Pages blocked in robots.txt
● Spam links warning
● Hacked site warning
Search Console is where to learn if there are major problems with the site.
Mistake 8: Not having a helpful 404 Error Page
Most site owners assume users hitting error pages are an extreme edge case. But 404s are almost guaranteed to happen with redesigns and site migrations. A poorly thought-out error page is a lost opportunity.
… But are they helpful?
Useful 404 pages should:
● Make it easy to get to the page the visitor wanted
● Provide helpful suggestions for other content that may be of interest
● Keep the visitor on the site

A very high bounce rate on the error page is a lost opportunity.
How to avoid unhelpful 404 pages
Prioritize the user’s needs.
Simple ideas for being helpful:
● Include a search box
● Include a high-level sitemap
● Showcase popular content
● Show relevant or recent content
● Be easy to use on mobile!
Bonus points for:
● Customizing the error depending on what the user was looking for
● Being on-brand, funny, beautiful and delightful (but be careful about being too clever)
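One technical detail worth getting right: serve the helpful page with a real 404 status code. Returning 200 for missing pages creates “soft 404s” that get indexed. A minimal sketch using Python’s standard http.server (the markup and link URLs are placeholders; a production WordPress site would do this in its 404 template instead):

```python
import http.server

# Hypothetical helpful-404 markup: search box plus popular links.
HELPFUL_404 = """\
<html><body>
  <h1>Page not found</h1>
  <form action="/search"><input name="q" placeholder="Search this site"></form>
  <p>Popular pages: <a href="/blog/">Blog</a> <a href="/contact/">Contact</a></p>
</body></html>
"""

class HelpfulHandler(http.server.SimpleHTTPRequestHandler):
    """Serve a custom, useful 404 page -- with a real 404 status."""

    def send_error(self, code, message=None, explain=None):
        if code != 404:
            return super().send_error(code, message, explain)
        body = HELPFUL_404.encode("utf-8")
        self.send_response(404)  # keep the 404 status; a 200 would be a "soft 404"
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
```

Search engines drop the URL because of the status code; humans stay because of the content.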
Mistake 9: Thinking a plugin does SEO for you
Things plugins can’t do:
● Advise you on what keyword phrases to target and why
● Tell you the user intent of a search engine user
● Create GOOD content that meets users’ needs
● Identify weak experiences causing users to bounce or abandon
● Create the optimal internal linking strategy for your site
● Benchmark what competitors are doing and what it’ll take to beat them
● Show you how to avoid any of the mistakes we’ve covered so far...
Recap: 9 terrible, horrible, no good ways to tank your search traffic
1. Failing to redirect properly
2. Not using HTTPS to start
3. Blocking critical URLs in robots.txt
4. Not blocking dev environments
5. Generating duplicate content
6. Picking a bad theme
7. Not paying attention to analytics
8. Not having a helpful 404 page
9. Thinking a plugin does SEO for you
If web traffic is important to your business ...
● Educate yourself on what SEO is and isn’t -- there are good FREE resources available… but also a lot of nonsense
● Use the free tools that Google makes available to site owners, especially Google Analytics, Search Console, and its educational resources!
● Invest in the SEO channel just as you do your other marketing channels
● Track meaningful metrics and performance indicators
● Hire or partner with a reputable SEO specialist if it makes sense for your business or organization
What kinds of specializations are there?
● Technical - Enterprise? Ecommerce? JavaScript? Will they tell you why they want to fix things like schema, spider traps, duplicate content, sitemaps, redirect chains, page speed, etc.?
● Editorial - How do they do keyword research? Are they “spinning” content? What do they say the goal of content is? Can they explain what the Panda algorithm is?
● Link-building - What’s their outreach approach? How do they evaluate link opportunities? Are they transparent about where they are getting links? Are they using shady PBNs?
● Local - How do they build citations? What tools do they use for managing NAPs? Schema? City pages?
● International - What can they tell you about subdomains vs. subdirectories vs. separate domains? Content localization, translation, hreflang, canonicalization?
When working with an SEO consultant
They should:
● Understand your business goals
● Set proper expectations
● Clearly communicate what their specialty is and isn’t
● Be upfront about what work they outsource
● Be transparent about what they are working on
● Provide regular reporting
● NOT deliver a “one size fits all” approach
When working with an SEO consultant
You should:
● Do your due diligence -- don’t just pick the cheapest option
● Clearly communicate your needs and your constraints
● Have realistic expectations
● Implement the things that are your responsibility to implement
● Not look for shortcuts
● Pay attention to what your consultant is sharing and reporting
● ASK QUESTIONS
A Couple of Resources
SEO success factors: https://zyppy.com/seo-success-factors/critical/
Google Webmaster Help videos: https://www.youtube.com/user/GoogleWebmasterHelp/
Screaming Frog: https://www.screamingfrog.co.uk/seo-spider/
Google Lighthouse: https://developers.google.com/web/tools/lighthouse/
Pingdom (Site Speed Test): https://tools.pingdom.com/
Structured Data Tool: https://search.google.com/structured-data/testing-tool/
Robots.txt Tester: https://technicalseo.com/seo-tools/robots-txt/
Fetch & Render: https://technicalseo.com/seo-tools/fetch-render/