8 website factors that Google loves (updated from 6 to 8)

No 1: Your website is responsive and mobile friendly

These days the majority of users access websites via a smartphone or tablet, and Google keeps separate search rankings depending on whether or not you’re searching from a mobile device. The May update to Google’s algorithm meant that mobile-friendly pages would get higher rankings.

Google even has a free tool that can help you find out whether your pages are mobile friendly, which you can access here: “Google’s mobile friendly test tool”. If your site isn’t mobile friendly, you really should make sure it is.
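
One of the basic things responsive pages do is include a viewport meta tag in the page head so that browsers scale the layout to the screen. As a minimal sketch (this on its own doesn’t make a site mobile friendly, it’s just the usual starting point):

<head>
  <!-- Tell the browser to fit the layout to the device width -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>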

No 2: Your website is delivered via HTTPS/SSL

In September 2016 Google announced on its security blog that if your website is not using SSL (which means https:// rather than http://), then when they implement their latest changes in 2017 (now), your website’s credibility and search engine rankings may be affected. Chrome, Google’s web browser, has already begun to mark non-secure pages containing certain input fields as “Not Secure” in the URL bar.
You can read more about it in this article on “why SSL is a must-have for SEO/Google results”.
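
Once you have an SSL certificate installed, you’ll usually also want to redirect plain http:// requests to their https:// equivalents so visitors and Google only ever see the secure version. As a rough sketch, assuming an Apache server with mod_rewrite enabled (your host or CMS may offer its own setting for this instead), an .htaccess rule could look like:

# Redirect all http:// requests to https:// with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]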

No 3: XML Sitemaps

Sitemaps are a way of letting Google know about all of your pages rather than just hoping that Google will find the links to them, follow them and index each page. You can think of a sitemap as a book’s index or table of contents, telling you all the sections/pages of the book or website.
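
To make that concrete, here is a minimal sketch of an XML sitemap following the standard sitemap protocol, using the same made-up domain as the robots.txt example further down this list (only the <loc> tag is required for each entry; <lastmod> is optional):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.tipsforyourwebsite.com/</loc>
    <lastmod>2017-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.tipsforyourwebsite.com/stuff/stuff.html</loc>
  </url>
</urlset>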

Most CMSs can automatically create their own sitemaps, or you can use a plugin to generate one, but you need to make sure it is a valid XML sitemap by running it through a validator or by testing the sitemap in Google Webmaster Tools.

If your sitemap validates and you want to signal to Google that you’re ready for indexing, you can submit your sitemap to Google Webmaster Tools. This is a great step if you’ve spent a lot of time revamping your site behind the scenes and are ready to go live and want your website re-indexed with all of your new content.

HTML sitemaps are also really useful for sites that have a lot of dynamic content, as they list all the pages on the site by hierarchy, which can help human visitors, who should not be confused with the robot visitors detailed further down this list.
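
An HTML sitemap, by contrast, is just an ordinary page of links organised by section. A tiny hand-written sketch with made-up pages might look like:

<ul>
  <li><a href="/services/">Services</a>
    <ul>
      <li><a href="/services/web-design/">Web design</a></li>
      <li><a href="/services/seo/">SEO</a></li>
    </ul>
  </li>
  <li><a href="/contact/">Contact us</a></li>
</ul>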

No 4: Site Speed

Thanks to the internet and our expectation of instant gratification, it is important that your webpages load fast, especially on phones and other devices that may be accessing them over slow data connections. For this reason alone, your site absolutely must load quickly.

Google has a tool to help you measure page speed for your site so you can tell if you’re too slow for their tastes. It will score your site for both mobile and desktop and give you tips on how to improve the speed. If you want to go a little more advanced, consider Accelerated Mobile Pages (AMP).
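
Two of the quick wins these speed tools commonly suggest are compressing text responses and letting browsers cache static files. As a rough sketch, again assuming an Apache server (this time with the mod_deflate and mod_expires modules available), the .htaccess rules could look something like:

# Compress text-based responses before sending them
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache images for a month instead of re-downloading them
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
</IfModule>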

No 5: Relevant and specific Titles and Meta Descriptions

Google says it doesn’t use meta descriptions for ranking, but titles and descriptions are quite important for readers who want to know what a page is about. So even where they aren’t a direct ranking signal, they matter for usability and for getting user traffic from search engines. Titles are ideally under 60 characters to prevent Google from cutting them off. Meta descriptions should describe what the page is about and be between 150 and 160 characters.

If you don’t put in a meta description, Google will try to pull one from the text. Sometimes this is good enough for a page that’s trying to target a long-tail keyword, but it’s good practice to write your own titles and meta descriptions for all your pages in order to improve click-through rates in organic search results.
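
Both live in the <head> of each page. A hand-written example with placeholder wording (obviously tailor it to the actual page) looks like this:

<head>
  <!-- Title: aim for under roughly 60 characters so it isn't cut off -->
  <title>Local Plumber in Springfield | Example Plumbing Co</title>
  <!-- Description: roughly 150-160 characters summarising the page -->
  <meta name="description" content="Emergency and scheduled plumbing services across Springfield. Fixed-price quotes, 24/7 call-outs and a 12-month guarantee on all repair and installation work.">
</head>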

No 6: Good Content Quality

When it comes to on-page SEO it is important to have good quality content, which in Google’s eyes means:

  • Original content
  • Enough content: how much is “a lot” or “enough” is relative, but the amount needs to be enough to get the overall message across.
  • Fresh or regularly updated content: the expression “Google loves fresh content” didn’t come from nowhere.
  • Content that is media rich enough for today’s standards, so include pictures, video, etc. that enhance it.
  • Clear navigation

No 7: Robots.txt

Robots.txt files tell search engine spiders which of your content you want to be indexed. The spiders want to index as much content as possible, but there may be pages or content that you do not want to appear in the search engines. For example, for usability reasons you may not want your admin pages to appear in Google’s index, or you may not want content or webpages that are still in development to be indexed.

You do this by including commands in the file that tell the search engine spiders whether they are allowed or disallowed to index particular files and folders. This is called the Robots Exclusion Protocol.

It works like this: a robot wants to visit a website URL, say https://www.tipsforyourwebsite.com/stuff/stuff.html. Before it does so, it first checks https://www.tipsforyourwebsite.com/robots.txt and finds in it:

User-agent: *
Disallow: /

The “User-agent: *” means this section applies to all robots. The “Disallow: /” tells the robot that it should not visit any pages on the site.

Or the robots.txt may contain:

User-agent: googlebot
Disallow: /admin
Disallow: /stuff
Disallow: /cgi-bin

In this one the robots.txt is giving instructions only to Google’s spider (Googlebot), telling it not to index the /admin, /stuff and /cgi-bin folders.

There are two important considerations when using /robots.txt:

  • robots can ignore your /robots.txt; in particular, malware robots that scan the web for security vulnerabilities and email address harvesters used by spammers will pay no attention to it.
  • the /robots.txt file is a publicly available file. Anyone can see what sections of your server you don’t want robots to index.

So don’t try to use /robots.txt as a security feature.

No 8: Good quality reference links and citations

These are also often referred to as citation links. While you won’t be penalised by Google for not including them, they do provide supporting information and validation for your content, and they can be useful for your users.
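
In practice these are just ordinary outbound links to the source you’re drawing on, placed next to the claim they support. For example, with a placeholder statistic and the reserved example.com domain used purely for illustration:

<p>The majority of web traffic now comes from mobile devices
(source: <a href="https://example.com/mobile-traffic-report">Example Analytics mobile traffic report</a>).</p>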
