Improve SEO Results With a Technical Website Audit
Tony Patrick, Director of SEO and Analytics, Content & PR Division • Intero Digital • April 22, 2020
Searcher behavior has been disrupted. What people were searching for a couple of months ago has been turned on its head. But that doesn’t mean search engine optimization should stop. Businesses can take some time to work on their own websites and make sure they’re set up to maximize their potential — especially if they’re in an industry that’s not generating as much business at this time.
Think of it this way: If you don’t take a look under the hood, you won’t know whether the components of your website are working properly and actually helping you attract website visitors, close new business, and see measurable content marketing ROI. And what better time is there for a website audit than during a season of slower business?
What Is a Technical Website Audit?
While your website might look amazing and seem to function well, it can still harbor technical issues that quietly put your organic traffic and search rankings at risk.
A technical website audit is the process of identifying a website’s technical issues that could hurt the user experience and diminish its ability to rank well in organic search results. Issues can include improper 301 redirects, slow page speed, poor URL structures, and many other factors that can impact how well a website performs.
This audit can be helpful even during business slowdowns because your team might have more time to understand and fix the issues on your site. Making improvements now, rather than waiting, can also be beneficial because it can take some time for the site to see the full impact of these changes. So by the time business picks up again, your site will have had time to be crawled by search engines and start reaping the rewards of your SEO efforts.
10 Elements of a Technical Website Audit
So now you see why technical website audits are important, but the specifics might be a little murky, and you might feel like you need a dictionary to keep up with all the SEO terminology. At Intero Digital’s Content & PR Division, we look at the following components of our clients’ websites and then investigate deeper, depending on our findings, to help them sharpen their SEO techniques:
1. Metadata
Metadata includes page titles, meta descriptions, and image alt text.
When people perform a search, the first things they’ll see in search results are the page title and meta description. Optimized page titles and meta descriptions will help searchers understand what they can expect as a result of clicking the link. And if the information is relevant to the search terms they used, they’ll be likelier to click through to your site.
In terms of images, each should include an alt attribute, which serves as a text description of that image. This helps search engine crawlers better understand what images contain, and it helps people who use screen readers grasp what an image is communicating.
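If you want to spot-check these elements yourself, here’s a minimal sketch using only Python’s standard library. The URL is a placeholder, and a production audit tool would handle edge cases this doesn’t:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class MetadataAuditor(HTMLParser):
    """Collects the page title, meta description, and images missing alt text."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None
        self.images_missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content")
        elif tag == "img" and not attrs.get("alt"):
            # Note: an empty alt="" is legitimate for purely decorative images.
            self.images_missing_alt.append(attrs.get("src", "(no src)"))

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

url = "https://example.com/"  # placeholder: substitute a page from your own site
auditor = MetadataAuditor()
auditor.feed(urlopen(url).read().decode("utf-8", errors="replace"))

print("Title:", auditor.title.strip() or "MISSING")
print("Meta description:", auditor.meta_description or "MISSING")
print("Images missing alt text:", auditor.images_missing_alt or "none")
```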
2. Page speed
Page speed is exactly what it sounds like: how fast your website pages load. Not only does page speed affect SEO, but it’s also important to all visitors who land on your site — no matter how they got there.
Search engines take page speed into account because the speed at which a page loads has a huge impact on the user experience. Ideally, search engines aren’t sending searchers to sites that will frustrate them, so fast-loading sites are perceived as more valuable and have the potential to rank higher. That’s why a slow website — especially on mobile — will struggle to improve its keyword rankings and overall SEO performance.
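Dedicated tools like Google’s PageSpeed Insights measure rendering in detail, but as a rough first check, a sketch like this (placeholder URL, standard library only) can time how quickly your server returns the HTML:

```python
import time
from urllib.request import urlopen

url = "https://example.com/"  # placeholder URL
start = time.perf_counter()
response = urlopen(url)                 # returns once headers arrive
ttfb = time.perf_counter() - start      # rough time to first byte
body = response.read()                  # download the HTML itself
total = time.perf_counter() - start

print(f"TTFB ~{ttfb:.2f}s, HTML downloaded in {total:.2f}s ({len(body)} bytes)")
```

Keep in mind this measures only the server response and HTML download, not images, scripts, or rendering, which is where much of a page’s real load time lives.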
3. Duplicate elements
Duplicate elements are snippets of content used multiple times in different areas of the website. Duplicate titles, meta descriptions, H1 and H2 tags, or URLs can harm SEO visibility, reduce click-through rates, cost you chances to segment keywords, and confuse search engines.
When multiple pages on your website use duplicate elements, search engines have a hard time determining which version of a page to index, which page should receive the link authority, which version should rank for a relevant search query, and how to judge the quality of the content on the page.
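One way to surface duplicates is to compare pages in bulk. Here’s a rough sketch; the URL list is a placeholder that would normally come from your sitemap or a crawl:

```python
import re
from collections import defaultdict
from urllib.request import urlopen

# Placeholder URLs: in practice this list would come from your sitemap or a crawl.
urls = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/services",
]

pages_by_title = defaultdict(list)
for url in urls:
    html = urlopen(url).read().decode("utf-8", errors="replace")
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    title = match.group(1).strip() if match else "(missing)"
    pages_by_title[title].append(url)

for title, pages in pages_by_title.items():
    if len(pages) > 1:  # the same title on more than one URL
        print(f"Duplicate title {title!r} on: {', '.join(pages)}")
```

The same grouping approach works for meta descriptions and H1 tags.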
4. Insecure content
Mixed content occurs when the initial HTML loads over a secure HTTPS connection, while other resources — like images, style sheets, videos, or scripts — load over an insecure HTTP connection. All components of your site should load over a secure HTTPS connection in order to signal to search engines that your site is safe.
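To get a quick read on whether a page loads anything over plain HTTP, a sketch like this can scan the HTML for insecure src attributes. It’s a heuristic, not a substitute for the mixed-content warnings built into browser developer tools:

```python
import re
from urllib.request import urlopen

url = "https://example.com/"  # placeholder: an HTTPS page on your own site
html = urlopen(url).read().decode("utf-8", errors="replace")

# Flag resources explicitly loaded over plain HTTP. This only checks src
# attributes (images, scripts, iframes); stylesheets pulled in via
# <link href> would need a similar check.
insecure = sorted(set(re.findall(r'src=["\'](http://[^"\']+)', html, re.I)))
for resource in insecure:
    print("Insecure resource:", resource)
print(f"{len(insecure)} insecure resource(s) found.")
```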
5. Redirects
Redirecting one URL to another can be appropriate, but if redirects are performed incorrectly, they can cause issues. Two common examples of improper redirects are redirect chains and redirect loops.
Long redirect chains and infinite loops can damage SEO rankings, as you lose about 10% to 15% in link authority during each redirect. Redirect chains make it tough for search engines to crawl your site, which affects crawl budget usage and webpage indexing. They also slow down page load speed, which hurts rankings and the user experience.
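To see a redirect chain hop by hop, you can stop the client from auto-following redirects. Here’s a sketch using Python’s standard library; the starting URL is a placeholder:

```python
import urllib.error
import urllib.request
from urllib.parse import urljoin

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None makes urllib raise instead of following

opener = urllib.request.build_opener(NoRedirect)

def trace_redirects(url, max_hops=10):
    """Follow a redirect chain one hop at a time, flagging loops."""
    seen = set()
    for hop in range(max_hops):
        if url in seen:
            print("Redirect loop detected at", url)
            return
        seen.add(url)
        try:
            response = opener.open(url)
            print(f"hop {hop}: {url} -> {response.getcode()} (final)")
            return
        except urllib.error.HTTPError as e:
            if e.code in (301, 302, 303, 307, 308):
                target = urljoin(url, e.headers.get("Location", ""))
                print(f"hop {hop}: {url} -> {e.code}, redirects to {target}")
                url = target
            else:
                print(f"hop {hop}: {url} -> {e.code}")
                return
    print(f"Gave up after {max_hops} hops; chain is too long.")

trace_redirects("http://example.com/")  # placeholder starting URL
```

A healthy chain resolves in one hop; anything longer is a candidate for pointing the original URL directly at the final destination.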
6. Broken links
Broken links lead to nonexistent webpages. When a link to your website is broken, it means authority that would’ve been passed from third-party websites to your website is lost. Broken links also lead search engine crawlers and users to 404 pages on your website, which results in a poor user experience. When visitors reach an error page, they often don’t want to stick around to explore the website further. And when visitors spend less time on a site, search engines assume the site doesn’t provide a good experience, which could lead a search engine to value the site less.
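A simple way to hunt for broken links is to collect a page’s hrefs and request each one. This sketch uses HEAD requests against a placeholder page; note that some servers answer HEAD with 405 even when the page exists:

```python
import re
import urllib.error
import urllib.request
from urllib.parse import urljoin

page = "https://example.com/"  # placeholder: a page on your own site
html = urllib.request.urlopen(page).read().decode("utf-8", errors="replace")

# Crude href extraction; a real crawler would parse the DOM instead.
links = {urljoin(page, href)
         for href in re.findall(r'href=["\']([^"\'#]+)', html)
         if not href.startswith(("mailto:", "tel:", "javascript:"))}

for link in sorted(links):
    try:
        request = urllib.request.Request(link, method="HEAD")
        status = urllib.request.urlopen(request, timeout=10).getcode()
    except urllib.error.HTTPError as e:
        status = e.code  # e.g., 404 for a broken link
    except urllib.error.URLError as e:
        status = f"unreachable ({e.reason})"
    if status != 200:
        print(f"{status}: {link}")
```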
7. Schema
Schema, or structured data, is markup added to a page’s code that provides more context to search engines about what’s on the page. Marking up structured data on the back end of a website helps search engines better understand what each page is about, improves keyword rankings, and helps capture rich snippets in search results.
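Structured data is commonly expressed as JSON-LD. Here’s a minimal schema.org Article object built in Python; every value is a placeholder:

```python
import json

# A minimal schema.org Article object; every value here is a placeholder.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Improve SEO Results With a Technical Website Audit",
    "author": {"@type": "Person", "name": "Tony Patrick"},
    "datePublished": "2020-04-22",
}

# The resulting JSON is what you would embed in the page's head inside a
# <script type="application/ld+json"> tag.
print(json.dumps(article_schema, indent=2))
```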
8. Security
The future of the web is a secure one, so websites have to be safe for visitors. Utilizing HTTPS and making sure your website is secure are imperative. Search engines use security as a ranking factor.
HTTPS doesn’t provide a major rankings boost on its own, but when it’s used as part of a broader SEO strategy alongside other ranking factors, sites are bound to see improvement. And if you’re reviewing Google Analytics for an HTTP site, traffic arriving from secure referral sources can show up as “direct” traffic because the referrer data is dropped; on an HTTPS site, that referral data is preserved.
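A quick way to confirm your HTTP pages hand visitors off to HTTPS is to fetch the insecure URL and see where it lands. A sketch, with a placeholder domain:

```python
import urllib.request

# Placeholder domain: check that the plain-HTTP version hands off to HTTPS.
response = urllib.request.urlopen("http://example.com/")
final_url = response.geturl()  # urllib follows redirects; this is the landing URL
secure = final_url.startswith("https://")
print("Lands on:", final_url, "(secure)" if secure else "(NOT secure)")
```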
9. Technical files
XML sitemap and robots.txt files improve a site’s accessibility for crawlers, helping ensure search engines can find and index its pages. These files act as a guide so search engines know which pages they should — and shouldn’t — care about.
Your sitemap.xml should include the links you want search engines to find and index, and the HTTPS version of the site should be set as the preferred one to avoid duplicate content, link dilution, and wasted crawl budget. The robots.txt file helps search engines understand what to crawl and what to avoid on your site. Listing the XML sitemap link in the robots.txt file will improve the site’s accessibility for search engines.
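Python’s standard library ships a robots.txt parser, which makes it easy to confirm what your file actually allows. A small sketch with placeholder URLs:

```python
from urllib import robotparser

# Placeholder URLs: substitute your own domain.
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# Is a given page open to a given crawler?
print(rp.can_fetch("Googlebot", "https://example.com/private/report.html"))

# Sitemap links listed in robots.txt (Python 3.8+); None if there are none.
print(rp.site_maps())
```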
10. Canonical tags
A canonical tag (rel=canonical) tells search engines which URL represents the master copy of a page. This tag helps prevent problems that can occur when duplicate content shows up on multiple URLs.
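To check which canonical a page declares, you can pull the tag out of the HTML. A rough sketch with a placeholder URL; the regex assumes rel appears before href, so a real audit would parse the DOM instead:

```python
import re
from urllib.request import urlopen

url = "https://example.com/services?utm_source=newsletter"  # placeholder URL
html = urlopen(url).read().decode("utf-8", errors="replace")

# Pull the canonical link out of the head (assumes rel appears before href).
match = re.search(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html, re.I)
if match:
    canonical = match.group(1)
    print("Canonical:", canonical)
    if canonical != url.split("?")[0]:
        print("Canonical differs from the requested URL (often intentional,")
        print("e.g., stripping tracking parameters like utm_source).")
else:
    print("No canonical tag found.")
```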
Addressing each of these website components and SEO techniques can have various levels of impact. But by understanding SEO and avoiding any problems in these areas, you can set up your website for the most success once business picks up again.