Google will be Indexing HTTPS pages by default

Dec 19, 2015 5:21:00 PM / by Vu Long Tran

Google has just announced an important update that affects search engine optimisation (SEO): its search engine will now look to index a website's HTTPS (SSL-secured) address over its HTTP address.

Here are the details directly from Google's Webmasters team:

"[Google will] tart crawling HTTPS equivalents of HTTP pages, even when the former are not linked to from any page. When two URLs from the same domain appear to have the same content but are served over different protocol schemes, we’ll typically choose to index the HTTPS URL if:
  • It doesn’t contain insecure dependencies.
  • It isn’t blocked from crawling by robots.txt.
  • It doesn’t redirect users to or through an insecure HTTP page.
  • It doesn’t have a rel="canonical" link to the HTTP page.
  • It doesn’t contain a noindex robots meta tag.
  • It doesn’t have on-host outlinks to HTTP URLs.
  • The sitemaps lists the HTTPS URL, or doesn’t list the HTTP version of the URL
  • The server has a valid TLS certificate.

Although our systems prefer the HTTPS version by default, you can also make this clearer for other search engines by redirecting your HTTP site to your HTTPS version and by implementing the HSTS header on your server." - Source: Google Webmasters Central Blog
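If you want search engines to settle on the HTTPS version, the two measures Google mentions at the end of the quote - redirecting your HTTP site to HTTPS and implementing the HSTS header - might look like the sketch below. This assumes a small Python Flask app (the app and handler names are just for illustration; most web servers and frameworks have equivalent settings):

# A minimal sketch, assuming the site is served by a Flask app.
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_https():
    # Send any plain-HTTP request to its HTTPS equivalent with a permanent redirect.
    if not request.is_secure:
        return redirect(request.url.replace("http://", "https://", 1), code=301)

@app.after_request
def add_hsts(response):
    # HSTS tells browsers to use HTTPS for all future visits to this host.
    response.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
    return response

@app.route("/")
def index():
    return "Hello over HTTPS"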

What does indexing HTTPS pages by default mean for me?

This only applies to websites that serve an HTTPS version of their pages.

So if you have HTTPS, you need to think about where you want your search traffic to go. You can consider the following:

If you want people to go to the HTTP version of your website

If you have an HTTPS version of your website but want people to go to the HTTP version, make sure you put references on your HTTPS pages that guide search engine crawlers to the HTTP version instead. Examples include the following (a short code sketch follows the list):

  • a rel="canonical" link to the HTTP page -
    this directs the site crawler to index your HTTP version of your
    website instead of your HTTPS. You put this on your HTTPS website (as
    you're telling it to index the other website to drive traffic to that
    website, not the HTTPS website). A reason you might consider this, is if
    you have a login section of your website that you don't want indexed by
    search engines.
  • a noindex robots meta tag - as the name suggests, so that the HTTPS version of your site isn't index
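One way to send either of these signals without editing every page's HTML is to add them as HTTP response headers on the HTTPS site: Google also understands a rel="canonical" value in the Link header and a noindex directive in the X-Robots-Tag header. Here is a minimal sketch assuming a small Python Flask app (the app and handler names are illustrative, and in practice you would usually pick one of the two signals rather than both):

# A minimal sketch, assuming the HTTPS version of the site is served by a Flask app.
from flask import Flask, request

app = Flask(__name__)

@app.after_request
def point_crawlers_to_http(response):
    # Only tag responses that were actually served over HTTPS.
    if request.is_secure:
        # Header equivalent of <link rel="canonical" href="http://..."> in the page head.
        http_url = request.url.replace("https://", "http://", 1)
        response.headers["Link"] = f'<{http_url}>; rel="canonical"'
        # Header equivalent of <meta name="robots" content="noindex">.
        # Normally you would choose either the canonical hint or noindex, not both.
        response.headers["X-Robots-Tag"] = "noindex"
    return response

@app.route("/")
def index():
    return "Served over HTTPS, but search traffic should go to the HTTP version"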

Topics: tech, web development, google, ssl

Written by Vu Long Tran

Solutions Engineer APAC. ex-@Forrester consultant. Writing on #cloud #howto guides and #tech tinkering!