
SEO Data - The Circle of Trust


As SEO tools get outpaced by Google's constantly evolving signals, what SEO data is worth making bets on?

Published in: Technology


  1. THE SEO CIRCLE OF TRUST. Matthew Brown, MJBLabs, @MatthewJBrown, MatthewBrownPDX
  2. Where we went wrong
  3. SEOs and our “data”
  4. Google
  5. We aren’t ‘proving’ ranking factors. Solving algorithm updates ended with Caffeine.
  6. Server Logs / Search Console
  7. Server Logs
  8. Google’s Pipeline. Optimizing this means less crap hits the parser/index.
  9. Googlebot needs a diet. “Crawl budget” isn’t the issue here. GSC won’t tell you. Server logs don’t lie! (See the log triage sketch after this list.)
  10. Buy this product! (not an ad)
  11. Log Timeframes/Splits. Small timeslices are easier to work with. Compare different months, 7d, 14d. Google’s crawl cycles aren’t static. (See the timeslice sketch after this list.)
  12. What I’m NOT looking for: Don’t freak out about a few UTMs. Everyone will be mad at you.
  13. Match up links with crawl. Crawled pages and pages with links should be close to 1:1 (also sketched after this list).
  14. Find and break Googlebot’s habits. 4xx/5xx/soft 404s: catnip for Googlebot. Do not get me started on Bingbot.
  15. Here’s the good stuff
  16. More good stuff
  17. What we did: No more API endpoints for Google. Optimized IA for a smaller # of hubs.
  18. robots: First noindex, THEN robots.txt block. (See the robots and parameters sketch after this list.)
  19. Parameters: Browse/Search entry points have link value. *Block carefully*
  20. What we did: Left the main Browse q= URL alone. Blocked all ?q=. Unblocked ?q= that had links.
  21. The Results: noindex and robots.txt block
  22. Important note: Patience. (Chart annotations: “noindex and robots.txt block changes implemented here”; “The Yikes Zone”.)
  23. My takeaways: Shoot for the fewest # of pages per keyword target. Ruthlessly eliminate everything else from Google’s crawl, quarterly.
  24. Search Console
  25. About Rankings, Crawl Stats, Links: Reliability
  26. Good!
  27. Oh No! (times 6000)
  28. URLs: Do they receive traffic? Links? (this can get tricky) Internal searches? Social shares? Cited/referenced by other pages?
  29. The results:
  30. Takeaways: Segment XML sitemaps into as many index files as you can for granular data. Qualify your sitemap URLs based on metrics. (See the sitemap index sketch after this list.)
  31. Thoughts
  32. My best recent tech SEO successes were based on server log data (with a little GSC).
  33. Data is going to get worse. GSC isn’t getting better. Your own data is the key.
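
The log triage from slides 9 and 14, as a minimal Python sketch. It assumes the standard Apache/nginx combined log format; the path access.log is a placeholder, and a real pipeline should also verify Googlebot by reverse DNS rather than trusting the user agent alone.

    import re
    from collections import Counter

    # Combined log format: host ident user [time] "request" status bytes "referer" "agent"
    LINE = re.compile(
        r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<url>\S+) [^"]*" '
        r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
    )

    status_counts = Counter()
    url_counts = Counter()

    with open("access.log") as log:  # placeholder path
        for raw in log:
            m = LINE.match(raw)
            if not m or "Googlebot" not in m.group("agent"):
                continue
            status_counts[m.group("status")] += 1
            url_counts[m.group("url")] += 1

    # Slide 14: 4xx/5xx hits are catnip for Googlebot and should trend toward zero.
    print("Googlebot hits by status:", status_counts.most_common())
    for url, hits in url_counts.most_common(20):
        print(f"{hits:6d}  {url}")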
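
The timeslice splits from slide 11 and the links-vs-crawl check from slide 13, sketched as set arithmetic. The URL sets here are invented for illustration; in practice the crawl sets come from the log extraction above and the linked set from your own crawler’s export.

    def compare_slices(slice_a, slice_b):
        """Churn between two Googlebot crawl windows (e.g. two 7-day slices)."""
        print("crawled only in A:", len(slice_a - slice_b))
        print("crawled only in B:", len(slice_b - slice_a))
        print("crawled in both:  ", len(slice_a & slice_b))

    # Hypothetical example data.
    crawl_week_1 = {"/", "/hub/a", "/hub/b", "/old-page"}
    crawl_week_2 = {"/", "/hub/a", "/hub/c"}
    compare_slices(crawl_week_1, crawl_week_2)

    # Slide 13: crawled pages and internally linked pages should be close to 1:1.
    linked = {"/", "/hub/a", "/hub/c", "/hub/d"}
    print("crawl waste candidates:", sorted(crawl_week_2 - linked))
    print("linked but not crawled:", sorted(linked - crawl_week_2))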
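
The ordering from slide 18 and the parameter handling from slides 19-20, sketched with hypothetical paths. Step one: serve noindex while the pages are still crawlable, so Googlebot can fetch the directive and drop the URLs from the index.

    <meta name="robots" content="noindex">

Step two, only after the URLs have left the index: block the crawl in robots.txt. Blocking first traps the pages in the index, since Googlebot can no longer see the noindex. Google resolves conflicting rules by the most specific (longest) match, so a targeted Allow can carve out the ?q= URLs that earned links, as on slide 20.

    User-agent: *
    # Block parameterized internal search results...
    Disallow: /*?q=
    # ...but keep specific ?q= entry points that have external links crawlable.
    Allow: /browse?q=widgets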
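
The sitemap segmentation from slide 30, as a sitemap index. The file names are hypothetical; the payoff is that Search Console reports indexation per sitemap file, so slicing by section or template turns one opaque sitewide number into granular coverage data.

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap><loc>https://example.com/sitemap-hubs.xml</loc></sitemap>
      <sitemap><loc>https://example.com/sitemap-browse.xml</loc></sitemap>
      <sitemap><loc>https://example.com/sitemap-products-recent.xml</loc></sitemap>
      <sitemap><loc>https://example.com/sitemap-products-archive.xml</loc></sitemap>
    </sitemapindex>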
