15 Important Factors of SEO Structure – On-Page, Off-Page Optimization

How seriously a website is taken depends on its design, content, accessibility, rankings and its overall SEO structure. Many websites compromise on their on-page and off-page optimization and still expect quick results. This is not possible unless your website follows a proper SEO structure, as most search engines look at a website's back-end SEO structure to rank it highly. Your website should be able to guide the search engine bots and tell them what to index and what to leave out. Below are the 15 important factors of SEO structure you should follow in your site's on-page and off-page optimization. Followed properly, these techniques give amazing results, and the beauty of these practices is that they won't cost you a penny unless you hire someone to do the work.

15 Important Factors of SEO Structure: On-Page and Off-Page Optimization

The on-page and off-page optimization factors below are arranged alphabetically for easy reference. Since both practices are crucial for a website's success, it is better to follow them all without skipping anything.

Anchor Text:

Search engines give a lot of weight to anchor text when building their SERPs (Search Engine Results Pages). Even if a website is not properly SEO structured, it can still rank well with a good number of anchor text links pointing to a page. Whenever you link to a particular page, use a relevant keyword as the anchor text. So instead of linking the words "Click here to know more", use the actual keyword you want that page to rank for.
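As a quick illustration (the URL and keyword below are only placeholders), compare a generic anchor with a keyword-rich one:

  Generic: <a href="http://example.com/seo-guide.html">Click here to know more</a>
  Better:  <a href="http://example.com/seo-guide.html">SEO structure guide</a>

The second version tells search engines exactly what the linked page is about.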

Screenshot: Google search results for "click here", with Adobe Reader ranked first

The more links point to a page, the higher that page ranks in Google. Also make sure to add proper alternate text to images that carry links, so you don't miss the chance to rank through images as well. In the above screenshot, you may have noticed that Adobe Reader is ranked number 1 for the keyword "Click here". This means a lot of websites point to the page get.adobe.com/reader/ with the anchor text "Click here".

Critical Errors:

Your website should avoid critical HTTP client and server errors if it is to rank high in search engines. When your website throws these errors at visitors, it creates a bad impression and makes them leave the site immediately. This can skew your website analytics and will damage your reputation in the long run. Avoiding these kinds of critical errors keeps your site clean in the eyes of the search engines and helps it rank well over time. Some of the critical client and server errors are:

  • 400 – Bad Request
  • 401 – Unauthorized
  • 403 – Forbidden/Access Denied
  • 404 – File Not Found
  • 408 – Request Timeout
  • 500 – Internal Server Error
  • 501 – Not Implemented
  • 502 – Bad Gateway (Service Temporarily Overloaded)
  • 503 – Service Unavailable

Note: Take quick action on 404 and 500 errors on your website.

Canonical Link Element:

The canonical link element is used to clean up duplicate URLs on a website. If a site generates ugly or varying URL structures, the rel="canonical" element declares the actual URL of the page and helps eliminate duplicate URLs within the website. If your website produces several different URLs for the same page, search engines may treat it as a duplicate-content issue and filter those pages out of the results. So always keep an eye on the canonical element to avoid these issues.

Syntax: <link rel="canonical" href="http://example.com/page.html" />

Example:

  • http://www.example.com/page.html?pid=fgq3304 (duplicate URL, without the rel="canonical" element)
  • http://www.example.com/page.html (preferred URL, declared with rel="canonical")

You can learn more about the canonical link element and canonical HTTP headers from Google itself.

Do-Follow & No-Follow Attributes:

Do-follow and no-follow are the two most important link attributes you should understand before structuring your site's SEO. A do-follow link tells the spiders to follow the link and crawl as much as possible; this is why most websites link internally, so that the bots can crawl throughout the site. A no-follow link, on the other hand, tells the bots to ignore the link. Always balance the number of do-follow and no-follow links on your website, as too many of either can affect your site in the long run.
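As a small sketch (example.com is a placeholder), a plain link is followed by default, while adding rel="nofollow" asks the bots not to follow or count it:

  <a href="http://example.com/about.html">About us</a>
  <a href="http://example.com/login.php" rel="nofollow">Login</a>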

Duplicate Pages:

Google doesn't show any mercy to websites that are careless about duplicate content. A website should keep its pages, content and links clean, or it risks being dropped from Google's search listings. Although there are many ways to find duplicate issues within a website, I personally use the Screaming Frog SEO Spider tool to detect and eliminate duplicate pages on my site.

Image Source: WebSEOAnalytics

External Links:

External links are simply hyperlinks that point to another domain. If your website links to pages on another website, those are external links; likewise, when another website links to yours, that is an external link from their side. Keep track of all the followed links and their status codes going out of your website, as it is important to know how your site is connected to other domains. Never link to bad sources in search of quick results; that practice can ruin the whole website in the long run.

Image Source: SEOmoz

You should also keep an eye on the inlinks and outlinks of all the pages linking to a URI.
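For instance (both domains below are placeholders), follow outbound links to sources you trust, and mark links you don't want to vouch for with rel="nofollow":

  <a href="http://trusted-source.example.com/report.html">Industry report</a>
  <a href="http://sponsor.example.net/offer.html" rel="nofollow">Sponsored offer</a>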

File Size:

A website should always keep an eye on its page file sizes, as they affect loading speed and can hurt the site's performance in many ways. Google pays close attention to page speed because it is directly connected to user experience and bounce rate. If visitors experience slow pages even on a fast connection, they are likely to leave before the page finishes loading. This is very bad for your site's reputation, and you might be losing loyal visitors until the issue is fixed. You can avoid these kinds of problems with changes such as page compression, smaller images and trimming unnecessary code.
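As a small illustration (the file names are placeholders), deferring a non-critical script and giving a compressed image explicit dimensions both reduce what the browser has to load before the page appears:

  <script src="scripts.js" defer></script>
  <img src="banner-compressed.jpg" width="600" height="200" alt="Site banner">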


Header Tags:

Whenever you use H1 and H2 tags in your on-page optimization, follow a few guidelines. Never skip writing an H1 and an H2 for your page content, and when you write an article, follow the H1/H2 pattern with relevant keywords rather than stacking up multiple header tags. Also make sure a header tag is not duplicated within a page and does not exceed 70 characters.
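A minimal pattern for a post might look like this, with the keyword in the H1 and supporting topics in the H2s:

  <h1>15 Important Factors of SEO Structure</h1>
  <h2>On-Page Optimization Factors</h2>
  <h2>Off-Page Optimization Factors</h2>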


Images:

Keep track of your site's images by checking for files larger than 100 KB, images missing alt text, and images whose alt text runs over 100 characters. These factors are very important for image SEO, as images can bring a good amount of traffic in the long run.
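For example (the file name is a placeholder), a compressed image with short, descriptive alt text covers all three checks:

  <img src="seo-structure-diagram.jpg" alt="On-page and off-page SEO structure factors">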

Meta Data:

Meta title, meta description and meta keywords are the most important on-page elements you need to get right. An optimized page title should be genuine, match the H1 and stay within 70 characters, and it should not be duplicated across pages. Also write the meta description within 156 characters and don't reuse it anywhere else. Google doesn't pay much attention to meta keywords these days, though other search engines such as Yahoo have historically given them more weight.
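A head section that follows these limits might look like this (the description text is only an example):

  <title>15 Important Factors of SEO Structure</title>
  <meta name="description" content="On-page and off-page optimization factors that give your website a clean SEO structure.">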


Page Depth Level:

Page depth level is a tricky concept for novice users. It describes how many levels deep a search engine has to crawl your website to reach and index the content. If your content is buried deep inside folders, it is a little harder for search engines to get to the data. It also depends on how many clicks the page sits away from the homepage. For example,

  • example.com/rootpage.php (linked directly from the homepage, so it is fine)
  • example.com/deep/deep/deep/deeppage.php (4 clicks away from the homepage, which might create a problem)

These kinds of page-depth issues can pile up without you noticing. Keep track of how deep your pages sit and take steps to bring the important ones closer to the homepage.
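One simple fix is to link important deep pages straight from the homepage, so they sit only one click away even if the folder path is long (the path below is a placeholder):

  <a href="/deep/deep/deep/deeppage.php">Featured deep page</a>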

Redirects:

Redirects are very useful when you move your website to a new domain or change a page's URL. A redirect forwards one URL to a different URL, and there are three major kinds:

  1. 301 Redirect – Moved Permanently
  2. 302 Redirect – Found or Moved Temporarily
  3. Meta Refresh

Syntax: wp_redirect( get_permalink( $post_id ), 301 ); exit;


There are many WordPress plugins that can help you redirect a particular page to another page without any issues.

Robots File: 

You should learn how to write a robots.txt file, as it is the most important way to stop search engine bots from crawling and indexing everything that comes their way. A robots.txt file mainly consists of User-agent, Disallow, Allow and Sitemap rules, while directives such as index, noindex, follow, nofollow, noarchive, nosnippet, noodp and noydir belong in the robots meta tag. Each of these commands is important in its own way. Try to understand the robots topic well to keep your site's SEO structure in perfect shape.
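For example, a robots meta tag that asks bots not to index a page but still follow its links looks like this:

  <meta name="robots" content="noindex, follow">

And a minimal robots.txt that blocks one folder (the folder name is a placeholder) would be:

  User-agent: *
  Disallow: /private/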


X-Robots-Tag HTTP Header:

The X-Robots-Tag HTTP header is an advanced version of the robots meta tag: it lets you send the same directives in the HTTP response, which also works for non-HTML files that cannot carry a meta tag. Keep in mind that if a page has many links pointing to it, it can still show up in search results even when it is blocked in robots.txt, because the bots never get to see a noindex directive on it. So you cannot really hide anything from search engines with robots.txt alone. Since this topic is a little complicated, I would suggest reading about it only if you are interested.
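For example, sending the following response header tells bots not to index that resource or follow its links:

Syntax: X-Robots-Tag: noindex, nofollow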

XML Sitemap Generator:

Any website that doesn't have an XML sitemap tends to perform very poorly in all aspects. Your website should have a proper XML sitemap, and it must be submitted to Google via Webmaster Tools for good results in the future. You can create a basic XML sitemap using the Google XML Sitemaps plugin for WordPress.
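A bare-bones sitemap entry following the standard sitemap protocol looks like this (the URL and date are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.example.com/page.html</loc>
      <lastmod>2014-01-01</lastmod>
    </url>
  </urlset>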

These are the 15 important factors of SEO structure you should follow for better results. Some of the topics mentioned above might be completely new and confusing; don't put too much effort into anything that goes beyond your own site's optimization needs. Please leave your valuable comments and queries if you are confused about any part of this article. Happy Blogging!
