In 2020, SEO is arguably one of the most influential and consistent factors in an online marketing strategy. And, though the rules and governing algorithms may change, for site owners who want to build a robust and lucrative online presence, understanding and employing technical SEO basics is essential.

Whether you’re just starting to optimize your site, or you’ve been at it for a while, these 20 basics will help you solidify your SEO foundation and improve search results. You can have the best content online, but if these 20 technical website elements are causing issues, your rankings will suffer. This can quickly translate into a loss of ROI for all that influential content you have created.

1. Prioritize Mobile

Google once recognized the desktop version of a site as the primary version, but that's shifting, and quickly. Google is transitioning to mobile-first indexing, meaning the mobile version of a site is the one crawled and indexed first. Not having a mobile version doesn't mean your site won't be crawled, but it may push it down in the rankings, even if the content is strong. If you don't have a mobile version, it's time to get one. If you do, make sure it's optimized for the mobile experience, and that includes content as well as page load speed.
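As a starting point, the mobile-friendly basics include a viewport declaration in your HTML; a minimal sketch:

```html
<!-- In the <head>: tells mobile browsers to render the page at the
     device's width instead of scaling down a desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```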


2. Optimize for Local Search

With mobile being a priority for both search engines and users, it’s no wonder that local SEO optimization plays such a huge role in rankings. Claim your business on Google My Business, and make sure that, if relevant, your business information is correctly listed on social media networks like Facebook and Yelp. 
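One concrete way to reinforce local listings is structured data on your own site. A sketch of LocalBusiness markup, where every business detail is a placeholder to swap for your own:

```html
<!-- Example LocalBusiness structured data; all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "telephone": "+1-555-555-0100"
}
</script>
```

Keeping these details identical to your Google My Business and social listings helps search engines trust the data.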

3. Clean Up Your Title Tags

Title tags in your HTML play a vital role in helping search engines and users determine what a page is about, and since relevance is a huge part of the ranking equation, it makes sense that they carry real weight in your SEO efforts. Missing title tags, tags that are too short (or too long), duplicate tags, etc., can all cause problems, making an SEO audit and subsequent cleanup essential to success.
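To see how an audit might flag these issues, here's a minimal Python sketch using only the standard library. The 30–60 character range is a common rule of thumb, not an official Google limit:

```python
from html.parser import HTMLParser


class TitleParser(HTMLParser):
    """Collects the text of the first <title> element on a page."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


def audit_title(html, min_len=30, max_len=60):
    """Return a list of problems with a page's <title> tag.

    The min/max lengths are rule-of-thumb values, not Google rules.
    """
    parser = TitleParser()
    parser.feed(html)
    title = parser.title.strip()
    problems = []
    if not title:
        problems.append("missing title")
    elif len(title) < min_len:
        problems.append("title too short")
    elif len(title) > max_len:
        problems.append("title too long")
    return problems
```

Run this across every crawled page and you have the start of the audit described above; dedicated crawlers do the same thing at scale.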

4. Optimize Headline (H1, H2, etc.) Tags

Title tags are important, but so are headline tags, which are visible to users on the page and factored into search engine analysis. Optimizing your headline tags with proper keywords will impact your search rankings, and these tags also play a vital role in UX. Make sure all your pages have H1s, which carry more weight than the H2, H3, etc., tags, and also make sure the headline tags aren't duplicated.
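A well-structured page keeps one H1 and nests subtopics beneath it; the headings below are placeholder examples:

```html
<!-- One H1 per page; subtopics descend through H2, H3, etc. -->
<h1>Technical SEO Basics</h1>

<h2>Prioritize Mobile</h2>

<h2>Optimize for Local Search</h2>
<h3>Claim Your Google My Business Listing</h3>
```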

5. Use Alt Text in All Images

Let’s talk about alt text, or alternative text, which is also known as the alt description or alt attribute (and sometimes, not quite accurately, as the alt tag). Search engines can’t read an image, so alt text provides the info they need to properly understand what the image is about (and help index it properly). These attributes are often forgotten, which can cause issues; search engines will likely overlook images without them, which leaves valuable keyword potential on the table.
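The difference is one attribute; file names and descriptions here are illustrative:

```html
<!-- Descriptive alt text a search engine can index -->
<img src="red-trail-running-shoes.jpg"
     alt="Red lightweight trail running shoes">

<!-- Missed opportunity: no alt text, nothing to index -->
<img src="IMG_0042.jpg">
```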

6. Fix Broken Internal Links

Dead ends are annoying, and for users and search engines alike, 404s are just that. Broken internal links lead to poor user experience and a high bounce rate, both of which will cost you in the end. Take the time to tidy up your site and fix broken links.
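Fixing broken links starts with collecting them. A stdlib-only Python sketch that gathers the internal URLs worth checking for 404s (example.com is a placeholder domain):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkParser(HTMLParser):
    """Collects href values from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def internal_links(html, base_url):
    """Resolve every <a href> against base_url and keep only links
    on the same host -- the set worth testing for 404 responses."""
    parser = LinkParser()
    parser.feed(html)
    host = urlparse(base_url).netloc
    resolved = [urljoin(base_url, href) for href in parser.links]
    return [url for url in resolved if urlparse(url).netloc == host]
```

From there, requesting each URL and flagging 4xx/5xx responses gives you the fix list; crawlers like Screaming Frog automate exactly this loop.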

7. Take Advantage of Meta Character Lengths

In late 2017, Google increased the number of characters it includes in its snippets from a maximum of roughly 160 to 320 characters, give or take a few. That’s not to say you need to hit 300+ characters every time, but it does mean you have enough space to turn meta descriptions into rich snippets that contain highly useful content and keywords. Keywords that match the search query also appear in bold, improving the UX of your search listing. Google says meta descriptions are not a ranking factor, but they are a huge part of presenting marketing messages and influencing prospects to click on your result.
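A meta description sits in the page head; the copy below is a placeholder example:

```html
<!-- Room for a genuinely useful pitch plus keywords -->
<meta name="description"
      content="Learn 20 technical SEO basics, from mobile-first indexing
      to clean URLs, that help your pages get crawled, indexed, and clicked.">
```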

8. Improve Page Speed

Page speed has been and continues to be a priority for Google, and users, as well as search engines, are quick to abandon pages that load slowly. There are a variety of speed-testing tools available, but it’s worth using Google’s own tool, PageSpeed Insights, which will test and analyze page speed and offer suggestions on how to improve.


9. Monitor Page Indexing

If Google indexes pages, organic traffic will come.  If pages are not indexed, well, even the best content will be left to lead a lonely existence, unseen by deserving audiences. Use Google Search Console to see how many pages are submitted and indexed. Also, you can use an SEO crawler like DeepCrawl or Screaming Frog to make sure that the content you submitted to the index is indexed, obviously excluding any disallowed pages.

10. Know When and Where to Use Robots.txt

Improperly configured or missing robots.txt files can quickly cause a lot of damage. Robots.txt files help web-crawling software determine where it can and cannot crawl, and when implemented incorrectly, search engines won’t be able to crawl your site as you wish – a problem that could put other SEO efforts in jeopardy. Over the years, we’ve seen robots.txt files inadvertently tell search engines to ignore important pages; this is a powerful element of SEO, and one bad character can create real confusion.
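A small robots.txt sketch; the paths are examples only, and the comments flag how little it takes to go wrong:

```text
# robots.txt lives at the site root, e.g. https://www.example.com/robots.txt
User-agent: *
Disallow: /admin/        # keep crawlers out of admin pages
Disallow: /cart/         # and out of checkout flows

# One stray character is all it takes: "Disallow: /" on its own
# would block crawlers from the entire site.

Sitemap: https://www.example.com/sitemap.xml
```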

11. Check for Coding Problems

If you really want to decimate your SEO efforts, have some messy source code. If you want to avoid that scenario, make sure seemingly small tags like NOINDEX, rel=canonical, and disallow are only used where necessary and correctly implemented. Coding issues will surely surface during a tech audit and should be addressed ASAP.

12. Keep XML Sitemaps Up to Date

Your site architecture offers search engines a lay of the land and helps them identify the best “landmarks,” or pages worthy of traffic.  As you can imagine, an outdated sitemap can be problematic, leading users and search engines to broken or irrelevant URLs.
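For reference, an XML sitemap is a simple list of canonical URLs with optional last-modified dates; the URLs and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2020-01-10</lastmod>
  </url>
</urlset>
```

Most CMS platforms can regenerate this file automatically, which is the easiest way to keep it current.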

13. Dump Duplicate Content

Duplicate site content can lead to a decrease in rankings and thus a decrease in traffic.  While duplication can originate externally (site scrapers, syndicated product information, etc.), if it’s happening on your own site, you can resolve it by canonicalizing content.  To do this, use the rel=canonical attribute on pages that can cause duplicate content issues; it points search engines to the version of the page you want to rank.
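The canonical link goes in the head of the duplicate page; the URL here is a placeholder:

```html
<!-- On the duplicate (e.g., a print or URL-parameter version of a page),
     point search engines at the version you want to rank -->
<link rel="canonical" href="https://www.example.com/blue-widgets/">
```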

14. Avoid Multiple Homepage Versions

Along the same lines as duplicate content, but significant enough to merit its own number in our list, is the issue of duplicate homepage versions.  This can happen when you run traffic to your site through two URLs, or it can simply be an issue of serving both HTTP and HTTPS versions. Either way, it will lead to the indexing of multiple versions of your homepage, which can in turn decrease or water down your visibility.
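If your server runs Apache with mod_rewrite, one common fix is a 301 redirect to a single canonical host. A sketch, assuming the https://www version is the one you want:

```apache
# .htaccess sketch (Apache, mod_rewrite); adjust to your canonical host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

The equivalent exists for Nginx and most CMS platforms; the point is that every variant should 301 to one version.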

15. Avoid Redirect Chains

Using 301s to permanently redirect traffic can certainly be useful, but when redirects stack up, they can cause technical as well as UX issues.  When auditing your site, look for extensive chains of redirects (three or more). If they are occurring, take the proper steps to address and permanently solve the problem.

16. Consider Using the Disallow Rule

Depending on your site structure, you may have many pages that are extremely important to your online presence. But most sites have a few that are just taking up space and ultimately eating into what’s considered your crawl budget, or the number of pages a search engine will crawl at a given time.  Old promotional pages, terms-and-conditions pages, or policy pages (e.g., shipping and returns) are all pages that don’t necessarily need to be crawled, and if you find there is an issue with indexing, consider disallowing them through the robots.txt file mentioned in #10 above.

17. Beware of On-Page Link Saturation

Links can be a great boon to your organic presence, but a ton of links?  Not so much. The only links on a page should be those that are relevant to your user’s experience and/or your SEO efforts.  Avoid the “more is better” mentality, and don’t overload.


18. Use HTTPS

Security and the internet now go hand in hand, and that’s particularly true for Google – especially when it comes to Chrome, which has become the leading web browser.  If you have password or credit card input fields on your site, it’s imperative to use HTTPS, which you can do by installing an SSL certificate through your web hosting company. Failure to do so will result in a warning to users and, since Google prefers trusted sites, a nasty blemish on your ranking factors.
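Once the certificate is installed, plain HTTP traffic should be redirected. A sketch for Apache with mod_rewrite; this assumes SSL is already working on your host:

```apache
# Force HTTPS (Apache, mod_rewrite); an SSL certificate must
# already be installed for the redirect target to load
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```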

19. Practice Ethical Link Building

Link building is a great way to increase rank and traffic, but only if it’s done ethically.  Questionable practices, like buying links or using automated link-building programs, are black hat strategies that have long been ousted by top search engines. It may take longer, but the value of a few quality links will outweigh the damage a surplus of questionable ones will cause.

20. Use Clean URLs

Whenever possible, rely on clean URLs that incorporate relevant keywords.  Yes, it may be easier to use things like automatically generated URLs, site facets, and parameters, but messy URLs don’t score big with Google.  
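To make the contrast concrete (both URLs are invented examples):

```text
Messy:  https://www.example.com/index.php?id=182&cat=7&sessionid=x9f2a
Clean:  https://www.example.com/mens-trail-running-shoes/
```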

There are a lot of things you can do to improve your place in search results, and though some may seem simple, many can have a lasting effect on your online presence.  Get back to the technical SEO basics to build an even better future for your site.