
Top SEO Ranking Factors [Auditing Core Website Elements]

By Admin Baby Drivers  |  August 12, 2017

Top SEO Ranking Factors [Auditing Core Website Elements] - These are items to check right at the start. They relate to the larger, site-wide elements and give you a bird's-eye view of the audit before you dig deeper into the other elements.


Site / Domain Age
Sites that have been registered for a long time (without letting them drop or expire) have more domain age. Aged sites have more established trust in the eyes of Google, if and only if all other site metrics - on-page and off-page - are healthy, and they have not been spammed or had any shady (black-hat) SEO links built to them.

Once a site is spammed and lands in an algorithmic or manual penalty - no matter how old or aged the site is - it can never regain its original level of trust in the eyes of Google. Age in this scenario does not matter and will not score you any advantage. A spammed and penalized site is useless to rank from an SEO perspective.

Also, a dropped domain (one whose renewal was not done) loses its age when it is re-registered. When you register it after it drops, you gain its backlink power (if you set it up correctly), but you lose its age trust signal. The domain can still be used as long as it does not have a historical record of spam or de-indexation.

Canonical Lookup
Where does the site resolve to? Do both the www and non-www versions of the site resolve to the same location (either www or non-www)?

In rare cases other subdomains are used, but that is an advanced topic for now. Simply access both the www and the non-www URLs and check whether the site resolves to one common URL in both cases (watch the address bar to see which URL finally shows up when the page loads).

You should also check whether the site loads as http or https and whether the http-to-https redirection works correctly.
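To make the idea of "resolving to one common URL" concrete, here is a small Python sketch of a canonicalizer: it maps any www/non-www, http/https variant of an address to the single canonical form that every variant should 301-redirect to. The domain example.com and the chosen canonical form (https, non-www) are illustrative assumptions, not fixed rules.

```python
from urllib.parse import urlparse

def canonicalize(url, use_www=False, scheme="https"):
    """Map any www/non-www, http/https variant of a URL to one
    chosen canonical form (the form every variant should 301 to)."""
    parsed = urlparse(url)
    host = parsed.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if use_www:
        host = "www." + host
    path = parsed.path or "/"
    return f"{scheme}://{host}{path}"

# All four common variants should collapse to the same address:
variants = [
    "http://example.com",
    "http://www.example.com",
    "https://example.com",
    "https://www.example.com/",
]
print({canonicalize(v) for v in variants})
```

On a live site, the server (not a script) does this mapping via 301 redirects; the function simply shows the target each variant should end up at.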

Robots.txt
How well structured is the robots.txt file? Does it exclude any important pages from being crawled by mistake? Does it block pages that produce duplicate content (tag and category archives, etc.)?
Duplicate pages may not harm you if they are not excessive, but it's always better to keep the site clean of internal duplicate content pages.

You can exclude pages from crawling via your robots.txt file, and from indexing via the noindex meta tag inside the page itself.
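As an illustration, here is a hypothetical robots.txt that keeps crawlers out of tag and category archives while leaving the rest of the site crawlable. The paths and domain are placeholders; adjust them to your own site's structure.

```
User-agent: *
Disallow: /tag/
Disallow: /category/

Sitemap: https://example.com/sitemap.xml
```

Note that robots.txt only blocks crawling; to keep an already-crawlable page out of the index, use the noindex meta tag on the page itself.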

.htaccess files
The .htaccess file tells your server what to do when your site is accessed by browsers. Using simple or complex rewrite rules, you can define whether certain URLs on your site should resolve to other URL structures.

You can also password-protect certain files or folders, or block them from the public or from certain bots. For example, if you are still setting up a site and don't want it accessed, you can block it. This is very useful when building your Private Blog Network, because you can block tools like Ahrefs and Majestic from crawling your PBN site and hence hide any backlinks to your main money site from being discovered by your competitors (and therefore hide your PBN entirely). You can read up on Private Blog Networks and how to build them in my PBN guide.
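As a sketch, here is what such rules could look like in an Apache .htaccess file: one block forces https and strips www in a single 301 redirect, and another denies a backlink crawler by its user-agent string. The domain and the bot name are illustrative; test any rewrite rules on a staging copy before deploying them.

```apache
RewriteEngine On

# Force https and strip www in one 301 redirect (example.com is a placeholder)
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]

# Deny a backlink crawler by its user-agent string
RewriteCond %{HTTP_USER_AGENT} AhrefsBot [NC]
RewriteRule .* - [F,L]
```

The [F] flag returns a 403 Forbidden to the matched bot, while ordinary visitors are unaffected.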

Friendly URLs
Is the core URL structure free of URL parameters? For example, text in the URL of pages (the address bar of the browser) like - ?postid=23345
Proper implementation of the permalink structure is important so Google can understand the intent of your pages. The title and related keywords in the URL help, but you need to eliminate any strange-looking URL parameters that appear in them. If a URL looks ugly to people, Google assumes people won't like it - so why should it?
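A quick way to spot unfriendly URLs during an audit is a small check like the Python sketch below. The rule it applies - no query parameters, and a last path segment that is words rather than a bare numeric ID - is a simplified heuristic for this audit step, not a Google-defined standard.

```python
from urllib.parse import urlparse, parse_qs

def is_friendly(url):
    """A URL counts as 'friendly' here if it carries no query
    parameters and its last path segment is not a bare numeric ID."""
    parsed = urlparse(url)
    if parse_qs(parsed.query):          # e.g. ?postid=23345
        return False
    last = parsed.path.rstrip("/").split("/")[-1]
    return not last.isdigit()           # /23345 is no better than ?postid=23345

print(is_friendly("https://example.com/?postid=23345"))        # False
print(is_friendly("https://example.com/seo-audit-checklist/")) # True
```

Run it over a crawl export of your URLs to get a quick list of pages whose permalinks need fixing.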
 
Sitemap
Your sitemap needs to be planned and structured properly. A sitemap tells Googlebot which pages to reach, how to reach them, and the date they were created or updated.

You can also indicate which pages don't need to be crawled or are not important. You can ask Googlebot to crawl and index your site from inside Google Search Console. However, note that although Google "looks" at your sitemap, it is more interested in doing a raw crawl of your site - jumping from one link to another to spider all the pages into its database. By doing that, it also builds a link map of your site in its own index, which tells it which pages on your site are the most important (the ones that receive the most, and the most prominent, links).

So it's very important to internally link to your most important pages the most, and to link to them from the pages that have the strongest traffic, content, and user metrics.
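The link-map idea above can be sketched in a few lines: count how many internal links point at each page, and the pages with the highest inbound counts are the ones your site structure is telling Google matter most. The graph below is made-up sample data, not a real crawl.

```python
from collections import Counter

# Hypothetical internal link graph: page -> pages it links to
internal_links = {
    "/": ["/services/", "/blog/", "/contact/"],
    "/blog/": ["/services/", "/blog/post-1/"],
    "/blog/post-1/": ["/services/", "/contact/"],
    "/services/": ["/contact/"],
}

# Inbound link count per page - a rough proxy for internal importance
inbound = Counter(target for targets in internal_links.values() for target in targets)
for page, count in inbound.most_common():
    print(page, count)
```

In practice you would build the graph from a site crawl; the counting step stays the same.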

Tip - while building internal links, use the keywords you want the target page to rank for as the anchor text on the linking page, without the fear of over-optimization you would have when creating off-page links. Do, however, make sure your anchors are rotated among the target page's different primary keywords. If the anchors appear in sitewide link areas (such as the header and footer), it's fine for the same anchor to repeat; Google is OK with that.
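Rotating anchors is easy to do systematically: cycle the target page's primary keywords across the pages that will link to it, as in this small sketch. The page paths and keywords are hypothetical placeholders.

```python
from itertools import cycle

def assign_anchors(linking_pages, primary_keywords):
    """Rotate a target page's primary keywords as anchor texts
    across the pages that will link to it."""
    anchors = cycle(primary_keywords)
    return {page: next(anchors) for page in linking_pages}

pages = ["/blog/post-1/", "/blog/post-2/", "/blog/post-3/"]
keywords = ["seo audit", "site audit checklist"]
print(assign_anchors(pages, keywords))
```

With two keywords and three linking pages, the first keyword simply comes around again on the third page - the point is that no single anchor dominates.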

If your site is large and you need sub-sitemaps, that is also fine. WordPress plugins like Yoast SEO handle the splitting of sitemaps automatically.
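For reference, a minimal sitemap entry looks like the fragment below, with <lastmod> carrying the created/updated date mentioned above. The URLs and dates are placeholders; plugins like Yoast SEO generate this file for you.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2017-08-12</lastmod>
  </url>
  <url>
    <loc>https://example.com/seo-audit-checklist/</loc>
    <lastmod>2017-08-01</lastmod>
  </url>
</urlset>
```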

Site in Google Index
Checking if your site is indexed properly is essential. You can do this inside Search Console. You need to make sure that all your pages are crawled and indexed and that you don't have any 404 errors or other page indexing issues - including AMP (Accelerated Mobile Pages) indexing issues, should you have already submitted your AMP-powered site for the mobile index.

To check if the site is in the Google index, simply do a search like this: site:yourdomain.com

If your site appears as the first result, it is indexed. If nothing shows up, the site is either not yet indexed or has been removed from the index - possibly "de-indexed" through a manual penalty.

Basic Penalty Check
Google has a manual webspam team and a complex automated spam-checking algorithm, and it sends out notices to webmasters through Search Console warning them about issues it has found with a site's profile for ranking in its SERPs.

Checking whether there is a penalty on your site - algorithmic or manual - is essential, if you haven't already done so.

You can do this by checking inside Search Console. You need to see if you received any notices from Google.

The notices Google sends are of two kinds - "Action against Site" or "Action against links".

If you have an "Action against Site" notice, your site drops out of the SERPs entirely and you have essentially been de-indexed. There will be a notice from the manual webspam team (a real person) in your Search Console messages. If this happens, you cannot do much other than fix things and then send a reconsideration request to Google, literally begging them to put your site back in their index because you have cleaned up everything you did (or your SEO company did) to your site.

If you get an "Action against Links" notice, there is no plea you can send. Your site, or some of its pages, will basically have lost a lot of rankings. To fix things you need to clean up your links and any other on-page issues and wait for the re-crawl to happen. Since the Penguin algorithm is now real-time, you will see ranking gains faster and do not have to wait for a data refresh to see the benefits of your link purge or SEO fixes.

If you have not set up a Search Console account, you can check whether your site is penalized by searching Google for the title of any page or post in quotes and checking whether the appropriate page/post shows up as the first result. If not, start checking the severity of the penalty. You can do this by entering your domain name directly in the search and seeing what happens, or by searching for your brand name without the TLD, or with the TLD after a space separator.

Total No. of pages Indexed
It's very important to check and compare the total number of your site's pages that have been indexed. This is a key factor, because if there are pages that should be indexed but are not, you need to analyze why Google is not indexing them.

You can also use this data to compare how many content-rich, keyword-rich pages you have versus thin pages across your site. Then measure this against your competition (the top-ranking pages in your niche that keep appearing for your selected keyword searches) as a key ranking factor.

SSL Certificate
An SSL certificate is an absolute must. Even if you are not giving visitors a login to access certain areas of your site, getting an SSL certificate is now essential, boosts your trust, and helps you rank higher. For ecommerce sites and other sites that provide login areas it's non-negotiable - otherwise Chrome will flag your site as "Not Secure" when users access it.


