2020-03-05 Custard - SEO vs PWAs
1. SEO vs. Progressive Web Apps
What could possibly go wrong?
Chris Smith
custard.co.uk
2. About Me
• Manchester born and raised
• Studied Electronic Engineering at York
o Heavy software element
o Machine learning & genetic programming
• First business was a web design agency
• Custard is a technically focused, data-driven
digital marketing agency
22. But…
It took the best part of a year to get back to square 1
+PWA
RECOVERY
phase
DISASTER
phase
23. But…
It took the best part of a year to get back to square 1
+PWA
RECOVERY
phase
DISASTER
phase
OK
THIS DID HAPPEN ON OUR WATCH!!
The page at custard.mancseo.be-impossible.com:443 says:
x
25. When you want to do complicated things to your
website, not only should you employ an SEO, but you
also need to implement their advice
90. Step 3: Execute
Todo list:
1. Agree specific outcomes
2. Produce roadmap
3. Execute
4. Monitor results
91. Step 4: Monitor
Todo list:
1. Check tracking software (keywords & preferred URLs, mobile vs. desktop)
2. Check Search Console (traffic levels, rankings)
3. Don’t be afraid to adjust or change your course
92. Question 7: What exactly did we do to salvage this mess?
#repost #tbt
94. Take-Aways
1. PWAs & SEO can be made compatible, but it’s unlikely to happen automatically
2. Just make sure that Googlebot can understand your site by:
1. Considering caching your pages for crawlers
2. Making the site work without JS enabled (if all else fails)
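The first take-away, caching (pre-rendering) pages for crawlers, usually hinges on a simple user-agent check at the server. Here's a minimal sketch of that decision; the bot list and the `should_prerender` helper are illustrative assumptions, not something from the talk, and a real setup would also verify crawler IPs.

```python
import re

# Hypothetical sketch: decide whether to serve a pre-rendered (cached)
# HTML snapshot instead of the JS-driven PWA shell. The bot list is
# illustrative only; production setups should verify crawlers properly.
BOT_PATTERN = re.compile(
    r"googlebot|bingbot|yandex|duckduckbot|baiduspider",
    re.IGNORECASE,
)

def should_prerender(user_agent: str) -> bool:
    """Return True if the request looks like a search crawler."""
    return bool(BOT_PATTERN.search(user_agent or ""))
```

A middleware would call `should_prerender(request_ua)` and, when it returns True, serve the cached static HTML rather than the JS app shell.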
Good evening
I’m Chris Smith from Custard
And I’m here to present my talk – [read title]
First, a little bit about me.
I’m Manchester born and bred.
I started my career as a web developer, and was lucky enough to take the lead on some pretty hefty ecommerce and bespoke web application development projects.
I’ve spent the last 4.5 years transforming Custard into a technically focused, data-driven agency.
I’ve put together this presentation with the intention of giving everyone something to take away – not just heavyweight technical SEOs.
I’m going to show you…
…and most importantly
A PWA is a hybrid of a website and a native application.
In theory, it offers you the benefits of both and the downsides of neither, but in reality it all boils down to the specific implementation.
Here’s a quick table comparing features of the three categories:
Another illustration, this one by Google themselves
I’m not here to sell you a PWA – there are plenty of resources out there which explain the commercial pros and cons but in a nutshell…
But in a nutshell, when done well a PWA can offer similar ROI benefits to using native apps, whilst at the same time offering the lower ownership cost of a website
So for the sake of this talk…
What are the risks to SEO?
As SEOs, PWAs present us with the mother of all risks…
That Google may no longer understand your website
Kind of obvious really
It’s a mobile first world, so if you don’t have mobile, you’ve got nothing.
Pretty bad
Visibility is vanity, I’m told.
So what about traffic?
In this case, it’s actually worse.
Non-brand traffic halved.
This occurred in the space of 2 months heading into Black Friday.
The good news is yes.
Another client who came to us in the late stages of moving to a new site with a similar architecture went through the same loop and has since recovered.
No consideration for SEO, and it absolutely decimated their non-brand organic visibility.
Now, this did happen on our watch.
But none of what we recommended was adopted before the launch. It was added in retrospect.
This was a costly mistake.
Moral of the story
So…
You need a meticulous and systematic approach.
I’m going to talk you through ours step by step.
Do not trust anything your client says unless you’ve seen it with your own eyes.
It goes without saying that we want GA and Search Console access, but if your client has their own keyword tracking tools, those can be very helpful too.
Failing that, Sistrix provides up to 10 years of historic data and there’s a reasonable level of detail in there.
Helps us build a picture of what has happened
Ensures that we aren’t taking what the client tells us is happening as read – we need to see it first hand.
Compare YoY – account for seasonality
Other factors – such as algo updates, content changes, site alterations, migrations and merges
Brand – this traffic tends not to be directly impacted by issues such as these, since the client will (almost) always be the most relevant result for their own brand
Trends – which types of keyword are affected, which types of page are affected, have there been any changes in preferred URL in the SERPs? Has mobile been hit worse than desktop? Almost certainly yes!
The recovery of the client in this presentation.
The client is in education, so they receive lots of brand traffic from their student base looking to complete online exams etc
Checking for issues with URLs in Search Console used to be easier than it is now
This means that if a non-canonical version somehow ends up in the search index and is receiving traffic, you can’t see it in Search Console.
This is where server logs can be really handy.
I’m going to show you how to use Google’s cache to help with this too
Do we want to recover for specific keywords?
Are the wrong pages being served?
SERP features lost?
If during impact assessment you established that a particular type of keyword has been hit, then you want to track that type as a single group and, if appropriate, in sub groups.
Good for split testing and measuring collateral impact
You can easily set up sub-accounts which you can bill to the client, or have the client grant you access to their own account.
Collateral impact is important – if you boost traffic for one page but lose it elsewhere (either for the same keywords or other ones), then that’s not really a win, is it? This way we can measure overall gain.
Most tools don’t offer historic data so you will need to allow enough time for data to gather before you start making changes. Perhaps a week or two, depending on the site’s normal rhythm.
This is where most of the value adds are going to be in this talk, so try not to nod off in this bit
Most of what I tell you is useful for any mobile website, not just PWAs
Determine how the PWA is served.
Emulate it with Chrome
Response codes
PWA logic. If you find that the client is doing anything differently for Googlebot versus real users, treat with extreme prejudice.
Geolocation can be a nightmare
Chrome emulation how to – dev tools, user agent switcher
Responses (cache header plugin) – issue with JS redirects not being ‘proper’
This isn’t true mobile emulation, however, because your browser still declares that it isn’t running on a physical mobile device
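Beyond dev tools, you can script the same user-agent switch to compare what a crawler is served against what a normal browser gets. A minimal sketch, assuming a current smartphone Googlebot UA string (check Google's documentation for the up-to-date value before relying on it):

```python
import urllib.request

# Illustrative smartphone Googlebot user-agent string; Google updates
# the Chrome version token over time, so treat this as an assumption.
GOOGLEBOT_SMARTPHONE = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.92 "
    "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
    "+http://www.google.com/bot.html)"
)

def fetch_as(url, user_agent):
    """Fetch a URL with a spoofed user agent; return (status, body)."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.status, resp.read()

# Example (requires network; compare against a normal-browser fetch):
#   status, html = fetch_as("https://example.com/", GOOGLEBOT_SMARTPHONE)
```

Fetching the same URL twice, once with a crawler UA and once with a browser UA, and diffing the two bodies quickly reveals whether the site is serving crawlers something different.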
Spot the difference
Totally different HTML when you change user agent
Next thing to be careful of is that when you ‘view source’ of a page, you don’t necessarily see the same code as what’s behind the rendered page
Left is raw source. Right is rendered source, accessed using the ‘inspect’ tool.
Look how different these two are – both are the desktop homepage of Debenhams.com.
On the left, none of the JS has fired.
On the right, the browser has run all the JS, manipulated the DOM and potentially served a completely different site.
Text editor such as Notepad++ is very handy for doing automated comparisons on very similar text files.
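If you'd rather script the comparison than eyeball it in a text editor, the standard library can diff raw source against the rendered DOM directly. The two HTML strings below are a made-up example of a JS-populated page, not taken from any client site:

```python
import difflib

def html_diff(raw, rendered):
    """Unified diff between raw HTML ('view source') and the rendered
    DOM (copied out of the browser's 'inspect' tool)."""
    return list(difflib.unified_diff(
        raw.splitlines(), rendered.splitlines(),
        fromfile="raw", tofile="rendered", lineterm="",
    ))

# Hypothetical example: a page whose body is populated only by JS.
raw = "<html><body><div id='app'></div></body></html>"
rendered = "<html><body><div id='app'><h1>Products</h1></div></body></html>"
```

Lines prefixed `+` in the diff exist only after JavaScript has run — exactly the content a crawler that fails to render the page would never see.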
I mentioned earlier about user agent being declared via the ‘headers’. What are they?
Metadata relating to the communications between the user’s browser and the web server
We see a lot of PWA sites using JS redirects and we’ve seen multiple instances where Google doesn’t seem to understand these.
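A proper redirect is an HTTP 301/302 status with a `Location` header, which crawlers understand reliably; a JS redirect only happens after the page body is downloaded and executed. A rough sketch of spotting the latter in a page body — the patterns here are illustrative assumptions and far from exhaustive:

```python
import re

# Illustrative patterns only: real-world client-side redirects take
# many forms (frameworks, router libraries, obfuscated code, etc.).
JS_REDIRECT_PATTERNS = [
    re.compile(r"window\.location(?:\.href)?\s*="),
    re.compile(r"location\.replace\s*\("),
    re.compile(r"<meta[^>]+http-equiv=[\"']refresh", re.IGNORECASE),
]

def looks_like_js_redirect(html):
    """True if the body contains a common client-side redirect pattern.
    A server-side redirect would instead be a 301/302 response with a
    Location header and no meaningful body."""
    return any(p.search(html) for p in JS_REDIRECT_PATTERNS)
```

When a page returns 200 but its body matches one of these patterns, you're relying on the crawler to execute the redirect — which is exactly where we've seen Google stumble.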
So now we have the skills we need to test the site. We need to go and work out what Google is doing with all this.
Is it seeing what we see? Or is it seeing something completely different?
The first place to look is the SERPs.
Anything unusual really – a lot of this is common sense
Empty snippets which shouldn’t be empty are usually a good sign that something is wrong.
Remember that we can now emulate mobile users, so check the snippets on mobile results pages too – you never know what differences you might find.
When you find a problem…
Google maintains a copy of the web pages it finds on your site. You can view this copy by accessing the ‘cache’ from this dropdown arrow on the SERPs.
The bit at the top contains useful information about which page was crawled and when.
Take it with a pinch of salt though – this isn’t necessarily the most recent crawl, and we’ve seen issues with Google showing old page versions after the date/time they’ve been changed.
But this is only the desktop cache
No arrow on the mobile SERPs
Could change user agent…
But that alone means you have to keep turning it off and on again when you go back to the SERPs to click another result
So, type cache: followed by the URL you want into your browser.
This means you can cut the SERPs out altogether – which is great when you’re trying to see if there’s a cache copy of a page which has dropped way down the rankings.
Well, you’re basically looking for anything that suggests Googlebot isn’t interpreting the site correctly.
Sometimes it’s pretty obvious.
This is what we saw in the mobile cache for every page of one client’s shiny new PWA-enabled site.
Viewing the source returned a single <div> with no content.
Ember framework.
Here’s another example.
If you see a spinner, then that might be all that Googlebot is seeing too!
This is what we saw in the cache of a client site using a platform as a service solution, which hadn’t undergone any organic specification or analysis.
BUT you need to also be wary of your browser loading scripts over the top of the page and interfering with what you see in the cache.
This is a bit complicated to get into now, but…
Well, you’re basically looking for anything that suggests Googlebot isn’t interpreting the site correctly.
In this case, they’re the same. But we had an issue with a client recently where they had a URL parameter which they could use to force the responsive site to be served for mobile users.
This helped bridge the gap between certain product configuration functionality which wasn’t implemented in the PWA.
But Googlebot found these URLs, and when it couldn’t interpret the PWA version correctly, it started preferring the parameterised versions over the canonical ones.
Get to grips with large files. We received 7 days of log files earlier this week and they came to just under 15GB once unpacked. It’s handy to know Perl (or, in my case, PHP) for distilling these files down: filtered for Googlebot and Bingbot, they came to under 80MB and less than a million rows, so I could open them in Excel.
Use Excel. I don’t often use a log file analyser for this kind of work because I find it easier to spot issues when scanning directly down the entries. It’s fairly easy to dissect the data, adding columns, filtering for particular strings that catch your eye. You can then build a picture of how much of an issue something is. Once you’ve categorised your data then you can use slicers and pivot tables to make sense of it.
Trawl. See what URLs it’s crawling. Does it make sense?
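The distilling step above can be sketched in a few lines of any scripting language; here's a streaming version in Python rather than the PHP I'd normally use. The bot regex is a deliberate simplification — anything claiming to be Googlebot or Bingbot is kept, with no reverse-DNS verification:

```python
import re

# Simplified filter: keeps any line whose UA mentions Googlebot/Bingbot.
# Real analysis should verify crawler IPs, since UAs can be spoofed.
BOT_UA = re.compile(r"Googlebot|Bingbot", re.IGNORECASE)

def filter_bot_lines(in_path, out_path):
    """Stream a large access log, keeping only search-bot requests.
    Returns the number of lines kept. Reading line by line keeps
    memory flat even for multi-gigabyte files."""
    kept = 0
    with open(in_path, encoding="utf-8", errors="replace") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            if BOT_UA.search(line):
                dst.write(line)
                kept += 1
    return kept
```

The resulting file is small enough to open in Excel, where columns, filters, and pivot tables take over.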
Well, it took quite a lot of work to investigate and determine what had gone wrong.
But the outcome was actually pretty simple.
However, this took 6 months to get over the line. The client lost a lot of traffic in the interim.
Your mileage may vary.
Not a massive social media type – Insta is your best bet