Are There Technical SEO Tricks That Can Improve Rankings?

January 03, 2017

If there were one technical trick (or two, or three) that served as a silver bullet for search rankings, SEOs would have a much easier job. Unfortunately, Google Webmaster Trends Analyst John Mueller made it clear in a Webmaster Central hangout session that technical tricks won’t fool Google into thinking a website is high quality.

Why Technical Tricks Aren’t SEO Magic

Google has been very clear that high quality content is what gets a website top rankings. But that doesn’t mean webmasters aren’t looking for easy ways to make a website appear better to Googlebot. John pointed out a few of the technical tricks that people mistakenly think will move a site up the rankings.

High Word Count

There’s actually no magical number of words a page needs for it to be considered high quality. Just because there are 325 words on a page doesn’t mean those words add value. As John rightly pointed out, users aren’t counting the words on a page to decide whether the content is good enough to link to or share with others.

There have been instances where an entire page is blocked by robots.txt. Even though Googlebot can’t read the content on the page, it can still rank fairly well for a specific search query because other signals tell Google it’s high quality.
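For reference, blocking a page from crawling comes down to a one-line robots.txt rule. This is a minimal sketch; the path is a made-up example:

```
# robots.txt, served at the site root (the /members-only/ path is hypothetical)
User-agent: *
Disallow: /members-only/
```

Googlebot won’t fetch the content behind a disallowed path, but the URL itself can still be indexed and ranked on the strength of outside signals, such as links pointing to it.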

Keyword Density

The number of times a keyword shows up on a page has also been debunked as a ranking factor. Even if a page hits a target like a 1.2% keyword density, that isn’t an indication of quality for that keyword. Keyword usage can indicate the topical matter of the page, but quality doesn’t lie in how many times a keyword is used.
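To put a number like that in context, keyword density is conventionally calculated as keyword occurrences divided by total word count, times 100. Here’s a minimal sketch of that formula in Python; the sample text and keyword are invented for illustration, and it only handles single-word keywords:

```python
import re

# Conventional keyword density formula:
#   density (%) = keyword occurrences / total word count * 100
# The sample text and keyword below are made-up illustrations.
def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words) if words else 0.0

text = "SEO tips: good SEO starts with content, not with SEO tricks."
print(f"{keyword_density(text, 'SEO'):.1f}%")  # 27.3% -- a repetition count, not a quality signal
```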

Meta Tag Algorithms

Contrary to what many people believe, John indicated that there is no meta tag algorithm gauging the quality of a website. Googlebot certainly reads the meta tags, but no algorithm assigns a quality score based on them.
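For context, these are the kinds of meta tags Googlebot does read and act on; the content values here are placeholders. The description can feed the search snippet and the robots tag controls indexing, but neither translates into a quality score:

```html
<!-- Meta tags Googlebot reads; the content values are placeholder examples -->
<meta name="description" content="A short summary that may be shown as the search snippet.">
<meta name="robots" content="index, follow">
<!-- The keywords tag is read but, per Google, not used for ranking -->
<meta name="keywords" content="seo, tricks, rankings">
```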

User Signals

Google does try to pick up on whether users are happy or unhappy with a web page they landed on after a search, using signals like links and shares. However, John noted that Google steers clear of using user signals like bounce rate to determine quality. User behavior is used to improve the algorithms, but not to assign quality rankings on a per-page basis.

Users take in your web page as a whole to determine whether it’s high quality or subpar. Googlebot does the same, using hundreds of signals to figure out what a web page is about and the value it provides.

A technically sound site is necessary to move up in the rankings, but there are no technical tricks that make a website appear to be high quality. High quality websites incorporate many different factors, which is why SearchRPM focuses on the technical build of the website, design, quality content and outreach. Everything from the backend build to inbound links adds up to make a website high quality.

Discover how many factors can be improved on your website. Get your FREE SEO Report today!

By Michael Ramirez
SearchRPM Founder

Michael Ramirez is the Founder of SearchRPM, an Austin, TX-based search marketing company that’s well-versed in Search Engine Optimization best practices. You can follow Michael Ramirez on Twitter @openmic0323 or on Google+ to see what he’s up to next.