In our most recent SEO Reality Show episodes, we took readers through the agency’s process for setting up and launching Edelweiss Bakery’s blog:
- Finding potential topics
- Preparing a content plan
- Choosing an impactful blog theme
- Creating technical instructions for third-party content writers tasked with bringing the content to life
In the meantime, the bakery’s web developers installed the blog theme on a test domain to ensure that there were no technical errors before officially launching the blog on the live website.
You may remember that earlier in the SEO Reality Show, the agency conducted a technical site audit of the bakery’s homepage. They’ll use a similar process here to ensure the blog performs well in Google search.
The Challenge
Technical audits almost always turn up issues that need to be resolved, and this process was no exception. The agency received a great deal of feedback from the client on how the bakery wanted their business to be represented on the blog and site. This is a crucial part of the process, where the agency can build trust with its client by compromising where appropriate and by explaining the importance of the technical aspects it feels will strongly benefit the bakery. The agency concluded that, moving forward, they’d need to come up with meaningful solutions to client concerns. This would involve the following steps:
- Coordinate with the client in stages to avoid lengthy delays in the site design process. This would allow the client to give feedback prior to live site changes (when it would be difficult and time-consuming to make more changes).
- If necessary, postpone non-critical edits until after the release of the live site. Businesses get busy and can’t always give in-depth feedback on the agency’s timeline. Flexibility is key on the part of both the client and the agency.
The Process
Technical audits involve checking the bakery’s site for potential issues that could hamper Google’s ability to index the site, which in turn could prevent it from showing up in organic search results.
During the audit, SEO experts check many things, including:
- Setting up the website’s main mirror
- Checking robots.txt
- The availability of a correct sitemap.xml file
- Mobile rendering
- Page response codes
- Canonical pages and their alternate versions
- Page loading speed
- The presence of duplicates and broken links
Let’s take a look at how the agency conducted a technical blog audit for Edelweiss Bakery’s website.
Step 1 — Setting up the site’s main mirror
A site’s main mirror is the primary version of its address, the one search engines should index and that all other variants should redirect to (for example, https://site.com rather than http://www.site.com). If the site is still on a test subdomain, there is no need to check the main mirror’s configuration, because no SSL certificate is installed and no redirects are configured yet.
When you’re setting up a site’s main mirror, remember to use https://httpstatus.io for verification. You must check the following combinations:
- site.com with www and without www
- site.com with HTTPS and HTTP (if you have an SSL or TLS certificate installed)
- site.com/index.html
- site.com/index.php
- site.com/index.htm
- site.com/home.html
- site.com/home.php
- site.com/home.htm
There may be additional options depending on your CMS. If the main mirror is not configured, task your developer with setting up 301 redirects from every variant to the main version.
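If you’d rather script this check than paste URLs into a browser tool, the loop below is a minimal sketch of the same verification. It assumes https://site.com/ is the chosen main mirror; swap in your own domain and variants.

```python
# Minimal main-mirror check: every variant should 301-redirect to the
# chosen main mirror. "site.com" is a placeholder domain.
import requests

MAIN_MIRROR = "https://site.com/"
variants = [
    "http://site.com/",
    "http://www.site.com/",
    "https://www.site.com/",
    "https://site.com/index.html",
    "https://site.com/index.php",
    "https://site.com/home.html",
]

for url in variants:
    r = requests.get(url, allow_redirects=False, timeout=10)
    location = r.headers.get("Location", "(no redirect)")
    ok = r.status_code == 301 and location.rstrip("/") + "/" == MAIN_MIRROR
    print(f"{'OK ' if ok else 'FIX'} {url} -> {r.status_code} {location}")
```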
Step 2 — Robots.txt compilation
Earlier, the agency worked on a new robots.txt file for the bakery’s website. Because the blog page had been created on the main domain before a theme was chosen, the agency advised the client to keep the blog section closed to indexing. Now that the finished blog section has been transferred to the main website, the agency has opened it up for indexing.
When you’re creating your own blog, don’t forget to open it up to indexing after you transfer it to your main website so that you can take advantage of all the potential SEO benefits when you’re ready for it to go live. To do so, simply remove the Disallow: /blog* line from your robots.txt file.
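Once the change is live, you can confirm it with Python’s built-in robots.txt parser. This is a quick sketch that assumes the blog lives under /blog/ on a placeholder domain:

```python
# Verify that the live robots.txt no longer blocks the blog section.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://site.com/robots.txt")  # placeholder domain
rp.read()

# After removing "Disallow: /blog*", this should print True.
print(rp.can_fetch("Googlebot", "https://site.com/blog/"))
```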
You must prevent search engines from indexing test domains. In WordPress, you can do this under Settings > Reading by checking “Discourage search engines from indexing this site.”
Step 3 — Checking the sitemap.xml file
An XML sitemap is not required for sites with 1,000 pages or fewer and an excellent internal linking structure. Blogs on WordPress have an XML sitemap created automatically. Check that your sitemap is free of errors and contains only the pages that need indexing. When our partner agency ran the technical audit for Edelweiss, the bakery had published three articles. No issues were found while testing the sitemap, so WordPress did its job well.
When checking for errors, keep in mind that a sitemap can be split across several levels (a sitemap index file that links to child sitemaps), so be sure to check every level.
The agency ran the file through an online sitemap validator to make sure it was in good technical shape.
They also used the Semrush Site Audit tool, which can also check for broken links, oversized XML sitemap files, and orphaned pages. Each of these can cause its own problems. Fortunately, no issues were found.
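If you want a scriptable second opinion alongside a validator and an audit tool, the sketch below fetches a sitemap, lists every URL it declares, and confirms each one responds with a 200. It assumes a single flat sitemap at /sitemap.xml on a placeholder domain; a nested sitemap index would need one extra level of recursion.

```python
# Fetch sitemap.xml and spot-check that every listed URL returns 200.
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

resp = requests.get("https://site.com/sitemap.xml", timeout=10)  # placeholder domain
root = ET.fromstring(resp.content)

for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    print(url, status)  # pages meant for indexing should all return 200
```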
Step 4 — Mobile Rendering
Before launching the site, you always want to make sure the search robot perceives the site the same way visitors will, especially visitors on mobile devices.
The agency used Google’s Mobile-Friendly Test tool for this purpose. When you’re assessing how your site renders, the agency recommends you pay close attention to the screenshot showing the Google Search Robot’s view of your website. If it differs from your mobile device’s view, you’ll need to identify and correct errors that prevent normal page rendering.
This happened with the test version of the Edelweiss blog. Although Google’s tool reported the page as mobile-friendly, the rendered screenshot did not match what visitors actually saw.
They determined that this was caused by missing Allow directives for CSS, JS, and image files in robots.txt, so the file’s directives needed to be expanded.
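After the developer expands robots.txt, the fix is easy to verify with the same standard-library parser used above. The resource paths here are hypothetical WordPress examples on a placeholder domain:

```python
# Confirm that Googlebot may now fetch CSS, JS, and image files.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://site.com/robots.txt")  # placeholder domain
rp.read()

for resource in [
    "https://site.com/wp-content/themes/bakery/style.css",  # hypothetical paths
    "https://site.com/wp-includes/js/jquery/jquery.min.js",
    "https://site.com/wp-content/uploads/2021/bread.jpg",
]:
    print(rp.can_fetch("Googlebot", resource), resource)  # all should be True
```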
Step 6 — Page Response Codes
Page response codes (also known as status response codes) play an important role in the functionality of your site. When a user tries to access a page, the server receives the request, processes it, and returns a status code along with its response. Most of the time users won’t see the codes, just a working page, but sometimes they’ll run into a 404 error.
This step of the audit is simple. You need to ensure that most pages return a 200 status code and that non-existent pages return a 404 status code.
To check, use the Bulk URL HTTP Status Code, Header & Redirect Checker Tool, or a Chrome plugin like Link Redirect Trace.
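If you’d rather script it, a few lines of Python can run the same spot check; the URLs below are placeholders:

```python
# Expect 200 for real pages and 404 for a deliberately non-existent URL.
import requests

checks = {
    "https://site.com/": 200,
    "https://site.com/blog/": 200,
    "https://site.com/this-page-does-not-exist/": 404,
}

for url, expected in checks.items():
    status = requests.get(url, allow_redirects=False, timeout=10).status_code
    flag = "OK" if status == expected else "MISMATCH"
    print(f"{flag}: {url} returned {status}, expected {expected}")
```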
After moving from the test domain to the live one, check the response codes of the site pages because the server settings may be different!
Step 7 — Canonical tags and their alternate versions
Canonical tags point search robots to the preferred version of a specific page, which mitigates issues around duplicate content. While checking them, you should also scan the source code for links to non-existent alternate versions.
You can check the source code directly or use any convenient Chrome plugin (the agency uses META SEO inspector).
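For a scriptable alternative to a browser plugin, here is a small standard-library sketch that prints every canonical and alternate link tag found in a page’s source (the article URL is a placeholder):

```python
# List canonical and alternate <link> tags from a page's HTML.
import requests
from html.parser import HTMLParser

class LinkTagParser(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attrs = dict(attrs)
            if attrs.get("rel") in ("canonical", "alternate"):
                print(attrs.get("rel"), "->", attrs.get("href"))

html = requests.get("https://site.com/blog/sourdough-guide/", timeout=10).text
LinkTagParser().feed(html)
```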
During their audit, the agency discovered:
- There were no canonical tags
- The alternate tags contained incorrect feed links
This was something that the web developer would need to tackle.
Step 8 — Page Loading Speed
Page loading speed is known to be important to the user experience, and it’s also a site ranking factor. The faster your site loads, the better.
Use Google PageSpeed Insights to check loading speed parameters for both the mobile and desktop versions of your site. Check the key landing pages, the categories, and the blog’s main page.
The agency discovered that the speed indicators for the mobile and desktop versions of the pages were performing at an acceptable level, but that they could be better.
To improve them, they looked at the tips that Google recommended.
The FCP (First Contentful Paint), LCP (Largest Contentful Paint), and TTI (Time to Interactive) metrics needed to be improved on the bakery’s test website.
To improve these parameters on your own site, follow Google’s recommendations, which include:
- Eliminate render-blocking resources
- Ensure text remains visible while web fonts load
- Serve images in WebP format
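If you’d like to pull these metrics programmatically rather than through the web interface, the PageSpeed Insights v5 API returns the same Lighthouse data. Below is a minimal sketch (the page URL is a placeholder, and the exact response fields may evolve over time):

```python
# Query the PageSpeed Insights v5 API for FCP, LCP, and TTI on mobile.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://site.com/blog/", "strategy": "mobile"}
data = requests.get(API, params=params, timeout=60).json()

audits = data["lighthouseResult"]["audits"]
for metric in ("first-contentful-paint", "largest-contentful-paint", "interactive"):
    print(metric, audits[metric]["displayValue"])

# The overall performance score is reported on a 0-1 scale.
print("performance:", data["lighthouseResult"]["categories"]["performance"]["score"])
```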
Step 9 — Duplicates and broken links
Normally, Google Search Console is the best way to check for broken links, because it shows the exact errors that Google’s search robots find. But a test site that is closed to indexing is invisible to it, and you still want to identify and fix any errors before the blog officially launches.
To do this, you can use a crawler that can bypass indexing restrictions. The agency used Semrush’s Site Audit tool for this purpose during their audit, which showed that there were no linking issues on the test version of the Edelweiss blog.
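If you don’t have access to such a tool, a bare-bones crawler is easy to sketch in Python. The one below deliberately ignores robots.txt restrictions, so it works on a closed test site; the start URL is a placeholder, and real use would want politeness delays and scope limits.

```python
# Tiny broken-link crawler that ignores robots.txt (fine on your own test site).
from urllib.parse import urljoin, urlparse
from html.parser import HTMLParser
import requests

START = "https://test.site.com/blog/"  # placeholder test domain

class HrefCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and not href.startswith(("mailto:", "#", "javascript:")):
                self.links.append(href)

seen, queue = set(), [START]
while queue:
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    r = requests.get(url, timeout=10)
    if r.status_code >= 400:
        print("BROKEN:", url, r.status_code)
        continue
    if urlparse(url).netloc != urlparse(START).netloc:
        continue  # report external links' status, but don't crawl them
    collector = HrefCollector()
    collector.feed(r.text)
    for href in collector.links:
        queue.append(urljoin(url, href))
```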
The agency strongly advises using Google Search Console to check the index for duplicate pages. If duplicates are found, they must be closed to indexing with a meta robots noindex tag, or the source of the error must be identified and fixed.
Next Up
At this point, the agency has completed its technical SEO audit of the bakery’s new blog. They compiled all of their findings into a report for the bakery’s site developers to tackle. After the developers make those changes, the agency will do a final recheck before making the site live.
So what happens next? The agency is going to check the settings of the “Shop” section, the commercial part of the site with categorized product cards. It’s a vital part of the site for the business, because this is where the largest share of conversions happens. Stay tuned so you don’t miss out!
Source: 9 Steps to Perform a Technical SEO Audit for a Client’s Blog