Search engine optimization (SEO) is what puts brand content in front of target audiences. 

But it takes more than on-page SEO to appease the great and powerful Google — ruler of search in the land of engines. 

If you truly want to rank, you also need technical SEO.

What is technical SEO?

Technical SEO is the optimization of the foundational elements of a website to enhance its visibility in search engine results. 

It focuses on improving site structure, page speed, mobile responsiveness, and other technical factors that impact search engine crawling and indexing. 

A well-executed technical SEO strategy ensures that search engines can effectively interpret and rank your content, laying the groundwork for your high-quality content to rank well.

Google and other search engines have their own bots (also called “spiders”) that crawl websites via your site’s source code. (Image: Unsplash)

The 6 most common technical SEO issues

There are plenty of mistakes site owners can make when it comes to SEO. But here are the top six high-priority technical SEO issues Veronica Baas, Lead Strategist at HawkSEM, sees most often:

  1. Sitemap.xml crawl errors
  2. Duplicate content issues
  3. Unintentional temporary redirects
  4. Unoptimized images
  5. Poorly coded JavaScript & CSS
  6. Broken internal and external links

Let’s break it down.

1. Sitemap.xml crawl errors

Sitemaps often list outdated pages that are broken or redirect elsewhere. Removing these is critical if you want Google’s spiders to crawl and index your pages efficiently.

Note that larger sites have bigger sitemaps, and a chaotic, bloated sitemap can stall indexing. Crawlability directly affects whether search engines can even find the pages you’d like to rank.
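
For reference, a clean sitemap.xml is just a list of live, canonical URLs. A minimal sketch (the domain and dates are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only live, canonical pages that return 200 (no redirects or 404s) -->
  <url>
    <loc>https://www.mysite.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.mysite.com/blog/technical-seo-guide/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```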

2. Duplicate content issues

Content marketers know duplicate content is a big fat NO. But sometimes, duplicate pages are necessary for site functionality—for example, page two or three of a blog or e-commerce product category.

These pages can co-exist, but they don’t need to be indexed for search, so you can use a robots.txt file to block them from being crawled. However, if you want them added to Google, differentiate the content enough that it no longer counts as duplicate.

Another duplicate content issue is having both www and non-www (or http and https) versions of your domain. You only need the HTTPS version, served at either the www or non-www address.

For instance: https://mysite.com or https://www.mysite.com (one or the other, not both). When both versions resolve independently, Google can treat them as two separate domains (though it usually detects the duplicate and picks one version to index).
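
If both versions do resolve, a rel="canonical" tag in each page’s head is one hedge, telling Google which version you prefer (hypothetical domain):

```html
<!-- On every duplicate variant, point search engines to the one preferred URL -->
<link rel="canonical" href="https://www.mysite.com/">
```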

3. Unintentional temporary redirects

A 302 or 307 redirect is temporary, while a 301 redirect is permanent. The difference: a temporary redirect is for taking down a web page you’ll republish one day. For example, a Black Friday sale page.

A permanent redirect, such as a 301, tells Google the page is going down forever and won’t be coming back… ever (hit the road, Jack).

“When using 301 redirects, Google will pass the web page authority of the old page to its new redirect destination, giving the new page a boost,” says Baas. “Temporary redirects don’t have this same effect, so they can be a big waste of page authority.”
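
If your site runs on Apache, a permanent redirect can be a single line in your .htaccess file (the paths here are hypothetical; other servers and CMSs have their own equivalents):

```apache
# Permanently (301) redirect the retired page to its replacement
Redirect 301 /old-sale-page/ https://www.mysite.com/new-sale-page/
```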

4. Unoptimized images

Unoptimized images can negatively impact site speed and load stability. This can happen when site owners:

  • Add ginormous images to web pages
  • Lack proper caching and/or don’t have a lazy loading setup
  • Use the wrong image file format (next-generation file formats are optimal for speed)
  • Don’t set height and width HTML attributes (see the sketch after this list)
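
Here’s a rough sketch of an optimized image tag combining a next-gen format, explicit dimensions, and native lazy loading (the file names are hypothetical):

```html
<!-- WebP with a JPEG fallback; width/height prevent layout shift; lazy loading defers offscreen images -->
<picture>
  <source srcset="/images/hero.webp" type="image/webp">
  <img src="/images/hero.jpg" alt="Product hero shot"
       width="1200" height="630" loading="lazy">
</picture>
```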

5. Poorly coded JavaScript & CSS

Now for some “geek” talk. If you’re a coder or familiar with programming languages, this is for you. When JavaScript and CSS (common languages used to build websites) are poorly coded, they bog down site speed.

This can happen when there are:

  • Unused JavaScript/CSS files
  • Unminified JavaScript/CSS files
  • Render-blocking JavaScript (see the sketch after this list)
  • CSS resources blocking the first paint
  • An oversized DOM tree (too many nodes)
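
A hedged sketch of one common fix, using minified files, the defer attribute, and a non-blocking stylesheet load (file names are hypothetical):

```html
<!-- defer downloads the script without blocking HTML parsing and runs it after parsing finishes -->
<script src="/js/app.min.js" defer></script>

<!-- Load the stylesheet without blocking the first paint; inline only the critical CSS -->
<link rel="preload" href="/css/styles.min.css" as="style"
      onload="this.onload=null;this.rel='stylesheet'">
<noscript><link rel="stylesheet" href="/css/styles.min.css"></noscript>
```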

6. Broken internal & external links

It happens to the best of us, even those with Trello boards organized to a T.

You move a page, forget to switch all the links to your site, and now you have a broken link problem. It’s an easy fix if you catch it early and use an SEO tool like Screaming Frog to spot broken links.

Then, there are broken links outside of your realm. External links change, and when they break, it signals to Google that the site is outdated, hurting crawling efficiency and the user experience.

In other words, bad news for your core web vitals and page ranking.

The first step to improving your technical SEO strategy is with an SEO audit. (Image: Rawpixel)

How can you improve your technical SEO?

Dealing with poor site performance ends here. It’s time to take the steps necessary to make your site worthy of visitors (and the SERPs). 

Here’s a technical SEO checklist you can use now to improve your site today.

1. Audit your current technical SEO efforts

The first step to improving your technical SEO strategy: conduct an SEO audit. A proper SEO audit is a mix of a manual walk-through of your site coupled with the use of trusted tools, such as Google’s Core Web Vitals report, SEMRush, and Screaming Frog, to find common technical issues.

Some issues auditing tools can detect include:

  • Duplicate content
  • Broken internal links
  • Invalid robots.txt format
  • Non-secure pages
  • Slow page load speed
  • Multiple canonical URLs

After the audit, identify what’s broken and address those errors first. Some audits reveal a ton of red flags, which can be overwhelming. That’s why it’s a good idea to have an SEO expert and a web developer review and help you address the more technical issues and advise which to prioritize.

Pro tip: Register your site with Google Search Console and Bing Webmaster Tools. These free tools can find technical issues on your website.


2. Understand how search bots crawl and index

Google has billions of pages indexed. It and other search engines use bots that crawl sites via their source code. But these bots don’t “see” web pages the same way humans do.

It takes more than just producing great content for Google to find it, rank it, and reel in traffic. (Although great content is an important piece of the puzzle.) 

Search engine spiders perceive websites through a technical lens, primarily analyzing the HTML code, meta tags, and other elements in a structured manner. 

Unlike human visitors, they lack the interpretive skills to understand content nuances, visual aesthetics, or interactive elements. So, if the main content on your page is visual, make sure you have the correct structured data to go with it so a spider can also interpret the content.

Despite how powerful search engines are, their bots crawl a finite number of pages across the internet. Because of this, you want to make sure they’re crawling the most critical, high-quality pages on your site — not wasting time on low-quality pages. This is referred to as “crawl budget.”

A crawl budget is key for larger websites. If you have thousands of pages but your crawl stats show Google is only crawling a portion of them each day, its bots are missing significant parts of your site. You can improve your crawl budget by excluding crawlers from irrelevant pages (a robots.txt sketch follows the list). These could be:

  • Admin or login pages
  • “Thank you” or confirmation pages
  • Paginated pages
  • Testing and development pages
  • PPC landing pages
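
A minimal robots.txt sketch along those lines (the paths are hypothetical; audit what actually exists on your site before blocking anything):

```txt
# Applies to all crawlers
User-agent: *
Disallow: /admin/       # admin or login pages
Disallow: /thank-you/   # confirmation pages
Disallow: /dev/         # testing and development pages
Disallow: /lp/          # PPC landing pages

Sitemap: https://www.mysite.com/sitemap.xml
```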

Pro tip: Check which pages are indexed in search engines by doing a simple site:yourdomain.com search in Google. You can click through all the indexed results to see if a chunk of pages is missing or if there are pages that shouldn’t be indexed.

3. Implement structured data

One way to improve how bots understand your website content is through structured data or schema markup. This is important for SEO and to prepare for the future of search, as Google and other engines continue to personalize the user experience and answer questions directly on their search engine results pages.

There are hundreds of different schema types, and the best fit for your website depends on your product or service, industry, and the type of content you offer.

Google’s Structured Data Markup Helper is a handy tool if you’re unfamiliar with structured data. It walks you through the steps to add structured data to your site, notes which items need to be marked up, and creates the HTML for you.
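
If you’d rather hand-roll it, here’s a minimal sketch of Article markup in JSON-LD (the values are placeholders drawn from this post):

```html
<!-- JSON-LD structured data describing the page as an Article -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What is Technical SEO?",
  "author": { "@type": "Person", "name": "Shire Lyon" },
  "datePublished": "2022-11-01"
}
</script>
```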

Structured data helps you stand out in the organic search results and increases the likelihood of your site appearing in SERP features like rich results (aka rich snippets) or People Also Ask.

This can be hugely beneficial for your site. If you already have structured data on your site, you can check whether it’s working properly by using Google’s Rich Results Test tool.

4. Secure your site

The “http” in URLs stands for Hypertext Transfer Protocol, and it allows for information to be passed between web servers and clients. The “S” in “https” stands for secure. 

If a website doesn’t have an SSL certificate, it isn’t secure and won’t have the small padlock icon in the URL bar that you see in Google Chrome. This essentially means that any information a user inputs on the page (like their name, address, or credit card details) is not protected and can be stolen.

On a secure website, data passed between the browser and the server is encrypted. This means sensitive information can’t be intercepted in transit. Having a secure site can also give you a small ranking boost.

Plus, web browsers like Chrome are getting more aggressive about letting searchers know if they’re on a non-secure site and could be at risk, with 95% of websites on Google now using https.

Check if your website is secure by looking in your browser. If your site has a padlock next to the URL, it’s secure. Secure domains will also show “https” in the search bar, vs. just “http.”
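
If you spot non-secure pages, a common fix (assuming an Apache server with mod_rewrite; other stacks have equivalents) is a site-wide 301 redirect from HTTP to HTTPS:

```apache
# Send every http:// request to its https:// equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```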

5. Ensure your site is mobile-friendly

Fun fact: 45% of web traffic comes from mobile users (are you surprised?). 

Because of this, Google takes a mobile-first indexing approach — meaning it predominantly crawls and indexes the mobile version of your site.

Not having a mobile-friendly website is no longer an option if you want to rank in search engines. You can test your mobile-friendliness by using Google’s Mobile-Friendly Test tool.

But it’s not enough for a site to be simply mobile responsive. Your site should also have a positive overall mobile user experience. Mobile users are very fickle and will bounce quickly if they can’t find what they’re looking for fast.
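
Responsiveness itself starts with the viewport meta tag plus CSS media queries. A minimal sketch (the breakpoint and class name are made up):

```html
<!-- Tell mobile browsers to render at device width instead of shrinking a desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Stack side-by-side columns vertically on small screens */
  @media (max-width: 600px) {
    .columns { flex-direction: column; }
  }
</style>
```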

By optimizing for mobile users, you can push your site toward the ideal 1%-20% bounce rate range, helping both your ranking and user experience. Unfortunately, it’s a step many marketers forget to take since we often work on desktops.

Google Search Console can also alert you to any mobile usability issues, like clickable elements being too small or content being too close to the screen’s edge.

Examples of sites that are mobile-friendly (left) and… not so much (right).

But don’t stop here — accessibility extends beyond mobile devices. 

“Accessibility is a technical SEO topic that continues to be important,” says Baas. “While Google says website accessibility is not a direct ranking factor because it’s difficult to quantify, its tools, such as the PageSpeed Insights test, have a TON of information, including a section on accessibility, because technical SEO and accessibility go hand in hand.”

“My recommendation to SEOs would be to focus more on the optimization opportunities for accessibility laid out in this test,” Baas advises.

6. Review your website architecture

The goal of your site architecture is to make navigating your website easy, clear, and intuitive while making it easier for search engines to crawl your pages. The main components of website architecture are:

  • Navigation
  • Internal links
  • URL structures
  • Metadata

Navigation

Navigation is important for user experience as well as search engines. Google bots crawl links and your XML sitemap, but they also use navigation to determine how important certain pages are on your site.

Because of this, you want your most important pages linked as “tier 1.” Ideally, you don’t have more than seven tier 1 items unless you have a large website. To avoid clutter, it’s usually best not to link tier 4 pages and beyond in the navigation.

It’s also important to have footer navigation that lives on every page of your site. That way, when bots are crawling, they’re crawling your footer links. It’s common to link your privacy policy, support page, local info, and social media profiles in the footer.

An example of tiered navigation on Zappos.com.

Internal links

When bots are crawling your content, they’re following both internal and external links. Because of that, you want to use internal links to guide them to the important pages on your site.

You usually don’t need to link to your homepage internally since it’s going to be your highest authority page anyway. You should, however, link to internal content, such as product pages and blogs.

Also, be sure to use keywords in your anchor text instead of generic phrases like “learn more” or “click here.” Bots use anchor text to determine the topic of the content you link to.
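
For example (the URL and page are hypothetical), keyword-rich anchor text gives bots far more context than a generic label:

```html
<!-- Generic: tells bots nothing about the destination -->
<a href="/blog/technical-seo-checklist/">Click here</a>

<!-- Descriptive: the anchor text signals the linked page’s topic -->
<a href="/blog/technical-seo-checklist/">technical SEO checklist</a>
```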

URL structures

If your website host automatically creates URLs for you when you add new pages to your site, you may not think about URL structures much. But these structures are yet another signal that explains what your page is about to search engine bots. Check out these two examples:

  • https://www.imdb.com/title/tt0120338/
  • https://hawksem.com/blog/b2b-paid-social-media-marketing-strategies/

Not to toot our own horn here, but one of those URLs gives a much clearer picture of what the page will be about than the other.

Also, use keywords in your URLs when possible, and make URL structures follow your navigation’s structure (note how the blog title above comes after the “/blog/” root category).

Pro tip: Avoid underscores in your URLs. Bots ignore underscores and will think anything separated by an underscore is one long word, so use hyphens instead.

Metadata

Metadata refers to elements like your site’s page title and meta description, which summarize the page’s content. These elements improve your click-through rate when you follow best practices (see the sketch after this list) like:

  • Including keywords
  • Adding meta tags
  • Using pipes or hyphens to separate words
  • Keeping titles under 60 characters
  • Keeping meta descriptions under 155 characters
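
Here’s a sketch of how that metadata looks in a page’s head (the copy is a placeholder):

```html
<head>
  <!-- Title: under 60 characters, keyword first, pipe-separated brand -->
  <title>Technical SEO Checklist | HawkSEM</title>
  <!-- Meta description: under 155 characters, summarizes the page with the keyword -->
  <meta name="description" content="Use this technical SEO checklist to fix crawl errors, speed up pages, and climb the SERPs.">
</head>
```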

The page title may get cut off in search results if it runs too long, especially on mobile. It’s also worth noting that Google now displays the URL above the page title on the SERP. This is another reason URL structure is important and should be easy to read.

7. Optimize for page speed

Page speed is another ranking factor that can help or hurt your site’s position on search engine results pages. If your site is too slow, it’ll fall in the ranks—bad news for site owners wanting to create an interactive experience using moving graphics.

Those graphics and various other elements can bog down your page load time, including:

  • Large videos
  • Unoptimized images
  • Too many graphics
  • Excessive ads
  • Clunky or outdated plugins
  • Too much Flash content

“Page speed updates in the last year have really made this more important,” says Baas. “Google has improved their tools for webmasters to use to improve page speed, such as the Google Search Console page experience report and the PageSpeed Insights test.”

To improve your page load speed, first identify what’s slowing it down. Then you can remove, optimize, or reduce those elements. For example, resizing images to reduce file size improves website performance, as does compressing PNGs with a tool like TinyPNG.

So, what’s the target page load speed? Roughly a quarter of websites load within five seconds, and if you want to beat 75% of the web, your site should load within 2.9 seconds. Only 6% of the web loads in 0.8 seconds or less.

Now you have an idea of how fast your site needs to be to compete with the majority of the web.

8. Check backlink quality

Lightning strikes. Tornadoes. Earthquakes. Some things are outside of our control. And the same holds true for your link-building strategy. You choose which internal and external links to include on your site. 

But what about the websites that decide to link to yours? 

Google and other search engines use backlinks to gauge the quality of your site. It used to be that the more backlinks you had, the higher your site ranked. However, after link farms sprung up and polluted the internet with spammy backlinks, Google had to change its tune.

Today, backlinks only count for your site if the linking site is relevant to the linked page or blog. Remember, Google is all about helping users find information. So, every link should lead the user to more information they’ll find helpful.

To check your site’s backlink profile, you can use site audit tools like Ahrefs, Moz, and Google Search Console. 

Although you can’t control who links to your site, you can improve your backlink profile by doing things like:

  • Writing guest blog posts that link back to your site
  • Creating informative posts other authority sites will want to link to
  • Finding mentions of your brand or product and asking for a backlink (if the site is relevant and high-quality)

Once you make improvements to your backlink profile, ask Google to re-crawl your site so it can re-rank it.

Why technical SEO matters

You built a monster of a content marketing strategy that includes publishing high-quality, in-depth content every week. Plus, you optimize the on-page elements of each post with SEO best practices.

But your content barely makes it to the second page of Google, even though it’s 10x better than what’s on the first page.

What’s happening?

Diving head-first into your technical SEO will shed some light on the problem. 

For instance, maybe your site’s performance is hurting the user experience (UX) — a critical Google ranking factor. 

Google states in its SEO fundamentals, “You should build a website to benefit your users and gear any optimization toward making the user experience better.”

If your site takes too long to load or has broken links, people will bounce away. A whopping 40% of consumers will wait no more than three seconds for a site to load before abandoning it.

And it’s not only your customers that you lose when they bounce, but also your SERP ranking.

When you create great UX (and equally excellent content) on the front and back end, the odds of people sticking around increase.

A website with a poor user experience is like landing in a new city to find it set out like a maze with confusing pathways. You may enter with curiosity, but the frustration of getting lost or navigating through unclear routes makes you hesitant to explore further. 

Just as a well-designed city that hands you a clear map as soon as you step off the plane encourages exploration, an intuitive and user-friendly website design invites users to journey through its pages seamlessly.

But most websites don’t do that. In fact, only 35% of sites see high page views per visit, according to the HubSpot Inbound Marketing Trends Report. That’s a sign that most site experiences don’t encourage visitors to explore more content.

If you ignore your technical SEO, your on-page optimization efforts won’t be enough to increase your search engine results page (SERP) position, traffic, and retention. Then comes the potential demise of your site’s ranking.

On-page SEO vs. technical SEO: What’s the difference?

On-page SEO and technical SEO help your website rank in search engines like Google, Yahoo!, and Bing. But how you use them to improve your site is where the two differ.

On-page SEO is all about content. Starting with keyword research and content creation, on-page SEO requires:

  • Incorporating keywords into the URL, page title, H1, H2s, and throughout on-page content
  • Writing metadata, such as meta descriptions and meta titles, that includes the keywords
  • Inserting relevant internal and external links into blog posts and web pages with keyword-rich anchor text
  • Using canonical tags to identify original content and avoid duplicate content

In contrast, technical SEO involves:

  • Improving core web vitals, especially page load speed
  • Ensuring the navigation and site structure are correct and intuitive
  • Making the site user-friendly for mobile devices with responsive design
  • Optimizing images and videos
  • Using schema markup
  • Adding “author” HTML tags to author bio sections
  • Creating an accurate sitemap
  • Adding hreflang tags (for international sites)
  • Making sure the pages you want indexed are, and the ones you don’t want aren’t
  • Using noindex tags when you don’t want search engines to index certain pages

This is a wildly abbreviated overview of the work that goes into both strategies, but in short: one makes site changes you can usually see (on-page SEO), and the other makes background site changes you experience (technical SEO).

Is technical SEO difficult?

Technical SEO is doable when you have the right tools (and the time) to run an audit and make improvements. It’s a tedious task that requires understanding why something is broken and how to fix it. 

Unfortunately, most website owners are too busy to audit, analyze, and improve their technical SEO. Even with tools like Screaming Frog to run crawls, you still have to make the decisions and enhancements yourself.

In some cases, this may create a long to-do list of tasks, like resizing thousands of images, redesigning your site for mobile-friendliness, and updating broken links. 

Don’t have the time (or mental capacity) to do it all yourself? Most don’t. And it’s a good reason to enlist the help of a full-service digital marketing agency like HawkSEM. Not only do we offer SEO (and technical SEO) services — we also manage content marketing, PPC, and conversion rate optimization.

But there’s more (wink). 

At Hawk, we also have ConversionIQ, a proprietary reporting software that connects all your marketing platforms and reporting data onto one easy-to-navigate dashboard. This allows you to pull data from your performing keywords to influence on-site title tags and H1 content to drive more conversions.

The takeaway

The more technical side of SEO can be intimidating. After all, it’s filled with code, jargon, algorithms, and robots.

But by getting a handle on your technical SEO, you can be confident that your efforts are more thorough, well-rounded, and poised for maximum search engine visibility.

Ready to increase your tech SEO know-how?

Check out this webinar recording, “The Importance of Technical SEO,” for even more insights. Need help with your technical SEO? Get in touch.

This post has been updated and was originally published in November 2022.

Shire Lyon

Shire is a passionate writer and digital marketer with over eight years of experience. She’s well-versed in SEO, PPC, and social media, helping businesses both big and small grow and scale. In her downtime, she enjoys hiking, cooking, gardening, reading, and sailing.