On page SEO is the foundation stone on which your website must be built for better search engine rankings. On page and off page SEO are interdependent: if either one is missing, your website's chances of ranking high in search engines are slim.
Off page SEO consists of link building strategies whose results are not always in your hands. The good part about on page SEO, however, is that it is under your control. On page SEO applies to content that consists of both the visible and the coded parts of your webpage, and it matters to both users and search engines.
This checklist is in the form of questions to which the right answer should be a 'YES'. We have also listed an explanation below the checklist that gives you a gist of what you should know about that particular on page SEO element.
It's also worth noting that Google has a track record of ranking long content above short, assuming that the content is well written. Studies have shown that on average, the top ten search results include pages that contain 2,000 words! So let's begin.
1. Title Tag

The title tag is the most important on page SEO element, and the target keywords it contains help search engines return relevant pages.
- Do you have your most important keyword in your website title?
- Do you have the keyword placed towards the beginning of the title tag?
- Is your title tag within 70 characters?
- Is your title content unique?
The title tag is a standalone tag and is often confused with the meta title. Both have a similar function and location, and some sites use the same content for both. But even if you do not have a meta title, you MUST have a title tag.
You must have your most important keyword in your website's title, and the keyword should be placed towards the start (within the first 70 characters). Staying within 70 characters is not compulsory, but it helps usability because longer titles get truncated in search results. Do not duplicate title content anywhere else; keep it unique and understandable by humans.
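As a minimal sketch (the store name and keyword here are just placeholders), a title tag following these rules might look like this:

```html
<head>
  <!-- Target keyword first, whole title within 70 characters, unique to this page -->
  <title>Handmade Leather Bags | Example Store</title>
</head>
```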
2. Meta Description

The description tag is important for your site visitors, as it is the gist of the webpage content that they see on search results. Keywords in the meta description used to help websites rank in the earlier days, but not so much anymore, because this element was heavily manipulated for SEO in the past.
- Does your meta description tag contain your targeted keywords?
- Is your meta description content within 150 characters?
- Do you use synonyms in the description content to help users understand better?
- Are they unique to every web page?
Similar to the title, the character count is not compulsory, but anything beyond it is cut off with an ellipsis and is of no value to anyone. And you must use either the keywords or synonyms in the description, as they are usually highlighted in relevant search results.
It is not known to what extent the keywords in description help SEO, but the description itself helps a lot of sites to improve the click through rate on search results. What you could not describe in your title, you could describe in the meta description as it allows more characters than the title. Lastly, just make sure you do not duplicate your meta descriptions and keep them unique.
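To illustrate (the wording and products are hypothetical placeholders), a meta description within the limit that uses both the keyword and a synonym could look like:

```html
<head>
  <!-- Unique per page, within ~150 characters, keyword plus synonyms for the reader -->
  <meta name="description" content="Shop handmade leather bags and hand-stitched purses. Free shipping on all totes and satchels.">
</head>
```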
3. Heading Tags

Unlike the title and description, the heading tag is visible on the webpage to human visitors. It is also important from a search engine's point of view, as search engine bots acquire information about the page from the heading tags, especially the h1 tag.
- Do you have a h1 tag?
- Do you have your targeted keyword in the h1 tag and is it placed towards the beginning of the tag?
- Is the heading tag content unique?
The heading tag content sums up the intent of the page in a few words. There is no character limit, since it occupies your website's real estate, but the norm is to keep it as short as possible to capture the visitor's eye. To help with SEO, the heading tag must contain your targeted keyword, and it is recommended that the keyword be placed towards the start of the heading.
You can have more than one heading tag, but they should follow the hierarchy of h1 to h6, and a page should have only one h1 tag. Also, do not duplicate content from the title tag in the heading tag, even though their intent is quite the same. Having keywords in h2 and h3 tags (if you happen to have them) can also pass a weak relevancy signal to search engines.
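A heading hierarchy following these rules might be sketched like this (the page topic is a placeholder):

```html
<body>
  <h1>Handmade Leather Bags</h1>        <!-- only one h1, keyword at the start -->
  <h2>Leather Totes</h2>                <!-- h2 and h3 can carry keywords too -->
  <h3>Full-Grain Leather Totes</h3>
</body>
```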
NOTE: There are many meta tags that contribute to giving information about your site to search engines but they do not necessarily help in SEO.
4. Content

The content of a page must contain the keywords chosen for that particular web page. These could be targeted keywords, long tail keywords or synonyms. Just make sure the keyword density is not too high, which means use them only where they make sense to the human reader.
- Do your main keywords feature in the first few lines of your website content?
- Do you maintain an optimum keyword density in the content body?
- Is your content more than 500 words long?
- Do you have original and unique content for each webpage?
It is a common best practice to keep your content above 500 or even 1,000 words, as search engines love longer content. But that does not mean you have to turn your business website into a blog with filler everywhere. Keep your content as long as it needs to be, while making sure readers get exactly the information they need without having to scroll through a long essay.
If you search about content length affecting SERPs, you will find that the threshold is higher. But we are sure Google knows the difference between a blog and a business website while ranking them on this particular signal. This is what Google has to say about avoiding thin content, just to make things clearer.
As far as keyword density is concerned, there is no exact number to aim for. All you need to know is that in the average two to four paragraphs of content on a web page, you should use keywords wherever they are relevant to the human reader. To learn more about keyword density, we recommend this video from Google engineer Matt Cutts, who talks about ideal keyword density.
You can always use latent semantic indexing keywords that search engines also use to analyze the relevancy of your content to your target keywords.
Lastly, keep your content original. Do not duplicate content from your own pages or from third-party pages, as Google does consider duplicate content bad for SEO. When a search engine finds duplicate content, it chooses one version as the original and hides the others. The duplicates are omitted and displayed at the end of the original search results, as shown below:
5. URLs

Keep your URLs clean and fill them with keywords or words that aptly describe the content. URLs say a lot about a site's architecture and are often considered important for SEO as well as usability. Make sure each page can only be accessed via a single URL.
- Do you have short and clean URLs?
- Do you have keywords in your URLs placed towards the beginning?
- Do you use hyphens in URLs?
- Is your content available via a single URL?
URLs are also one of the most important on page SEO elements. In the earlier days, keyword domains used to rank higher in search. However, keyword domains are not a factor in search ranking anymore. What you need to have is keywords in the rest of the URL, that goes beyond the domain name.
A clean URL, free of unnecessary URL parameters, is easier for search engine bots to crawl. Also, URLs with hyphens are preferred over URLs with underscores.
And if possible, you should place your target keywords towards the start of the URL, as search engines seem to give the first few words priority when deducing the relevance of the content behind the URL. Even if you cannot do that, it is okay; your URL must still contain words that describe the content in the best possible manner.
Also, you must take care of URLs that have the same content. Use rel="canonical" tags on duplicate pages and let search engines know which page to consider as the original location for the content.
It's easy to wind up inadvertently hosting duplicate content due to your content management system, syndicated content or e-commerce shopping systems. In these cases, use the rel="canonical" tag to point search engines to the original content. When search engines see this annotation, they know the current page is a copy and pass link juice, trust and authority to the original page.
When deciding on a canonical URL, pick the one that’s best optimized for users and search engines, and points to content that has optimized on page elements. To implement a www resolve, set a preferred domain in Google Search Console. Google will take your preferred domain into account while crawling the web, so when it encounters a link to example.com, it will pass the link juice, trust and authority to your preferred domain www.example.com. It will also display your preferred domain in search results.
On HTML pages the rel="canonical" tag is implemented in the <head> of the page, while non-HTML pages should put it in the HTTP header:

HTML: <link rel="canonical" href="https://www.example.com"/>
HTTP: Link: <https://www.example.com>; rel="canonical"

The URLs you use in your canonical tags have to be 100% exact matches to the actual canonical URLs. Google will see http://www.example.com, https://www.example.com and example.com all as separate pages, and won't be able to tell that http should really be https. You can only use one rel="canonical" tag per page. If you use more than one, or point to a URL that returns a 404 error, Google will simply ignore your canonical annotations.
6. Image Alt Text

Search engines cannot read text on images, but they can certainly read the text about images written in the alt attribute.
- Do you have alt text for all images on your website?
- Do the alt text for images on your website contain keywords?
- Do your image file names contain keywords or relevant text?
Alt attributes for images are good not only for SEO but also for usability. If a site visitor cannot see but only hear, a screen reader will read the image's alt text, so the content still makes sense. If a visitor has disabled images or is on a slow internet connection, the alt attribute content will be displayed in place of the image and will still make sense to them.
As for search engines, the keywords in image alt text are not only signals for search ranking, but they also help your images rank high in Google image search.
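As a quick sketch (the file name and wording are placeholders), an image tag with a descriptive file name and keyword-bearing alt text would look like:

```html
<!-- File name and alt text both describe the image in relevant, human-readable terms -->
<img src="handmade-leather-tote.jpg" alt="Brown handmade leather tote bag">
```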
7. Keyword Consistency
More than a search ranking signal, it is common sense to keep the keywords used across the various on page SEO elements of a website consistent.
- Do your target keywords appear in the title, description, heading, content, URL and image alt attribute, consistently?
It is imperative to keep a certain keyword consistency among the spiderable content on your website. Although it is not compulsory, it is considered a best practice.
8. Internal Links
Internal links allow the smooth flow of link juice.
- Do your internal links ensure a smooth flow of link juice?
- Do you use targeted keywords as anchor text on internal links?
Your internal linking architecture must ensure a flow of link juice through every inner webpage. Although there are ways to popularize your inner pages by linking to them externally, internal linking is also essential for easy crawling and indexing of all the web pages on your site.
And if you want to improve the search ranking of a certain new webpage on your site, you must ensure that it is linked to other popular pages on your site. As it is your own site, you can choose the anchor texts for the internal links. So instead of using generic words such as 'click here' or 'read more' you can use target keywords in your anchor texts.
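For example (the URL and anchor text are hypothetical), a keyword-rich internal link instead of a generic one:

```html
<!-- Instead of generic anchor text such as: <a href="/guides/1">click here</a> -->
<a href="/guides/leather-bag-care">leather bag care guide</a>
```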
9. XML Sitemap

Every site must have a sitemap. Sites without one still get indexed by search engines, but the presence of a sitemap makes it a little easier for search engines to find and index pages.
- Do you have an XML Sitemap?
- Have you submitted the sitemap to Google Search Console?
- Have you referenced your sitemap in your robots.txt file?
A sitemap tells search engines two important things: the location of all the internal links and the priority of the different pages of a website. The former improves the indexing of all internal pages, paving the way for search engine visibility of inner pages, while the latter helps search engines understand the information architecture.
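A minimal XML sitemap illustrating both pieces of information (example.com and the priority values are placeholders) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <priority>1.0</priority>  <!-- homepage: highest priority -->
  </url>
  <url>
    <loc>https://www.example.com/products/</loc>
    <priority>0.8</priority>  <!-- inner page: lower priority -->
  </url>
</urlset>
```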
A sitemap location can also be specified in robots.txt file, which is usually the first file a search engine crawler looks for while crawling a site.
10. Robots.txt

Robots.txt is a file which, if present on your website, is the first file to be crawled by search engines. It contains information about pages that should not be crawled, and generally guides search engines to crawl and index the site efficiently, including pointing to the sitemap location.
- If you have your robots.txt file, have you checked it for any incorrect blocked pages or blocked search bots?
The presence of a robots.txt file does not directly affect your website's SEO. If you do have one, make sure that no page you want indexed is blocked. If you have crawling issues reported in Google Search Console, double check your robots.txt file.
Also, to help search engine bots crawl your site more efficiently, you can keep unnecessary pages from being crawled by listing them in your site's robots.txt file.
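A simple robots.txt illustrating both ideas (the blocked paths and domain are placeholders) could look like:

```
# Served at https://www.example.com/robots.txt
User-agent: *
Disallow: /cart/        # keep thin, unnecessary pages out of the crawl
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
```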
11. Page Speed
Google announced back in 2010 that it uses page loading speed in search ranking. This leaves you with no choice but to get and keep your website's loading time down to a couple of seconds.
- Does your page load in four seconds or less?
Removing code bloat, optimizing image sizes, optimizing caching and reducing DNS lookups are some of the many things that affect the page loading time of a website. Improving page loading speed improves the user experience. According to an article in The Guardian, internet users' attention spans are decreasing as technology advances. A faster loading page captures the visitor's attention more efficiently, enriching the user experience. Search engines such as Google also take the usability of a site into consideration when ranking it in search (you will see that in point 19 below).
12. In-Page Links
In the earlier days, Google recommended that websites have no more than 100 links per page, on account of crawling limitations. But there is no definite recommended number of in-page links anymore, since Google has acquired more sophisticated and well-equipped technology to crawl and index links.
- Do you keep the number of links on each webpage of your site under 100 to 200?
Even though search engines such as Google have removed the threshold of 100 links per web page, it is important to note that the fewer the links on a page, the more concentrated the link juice each one passes. If you link unnecessarily from your page to other pages on your own site or third-party sites, a certain percentage of link equity is lost.
Thus it is advisable to keep your web page links under 100 to 200.
13. Page Indexing
Make sure all your site's pages are indexed. If indexing problems are reported in Google Search Console, they must be corrected immediately.
- Are all the pages of your site indexed?
If all or most of the pages of your site are not indexed, it may indicate that something is wrong with your internal linking architecture. Check the number of indexed pages in the WooRank Website Review, as shown below:
You can also check it in Google Search Console. You can find the indexing status under Google Index > Index Status, as shown in the screenshot below:
If indexing problems are caused by broken links, get them fixed immediately. Problems in the sitemap could also cause fewer of the site's pages to be indexed. Also, look out for inbound and outbound links: the internal linking structure of a page determines how well a site gets indexed.
If all the pages on your website are indexed and there is an increase in the indexation rate, it can indicate that the site receives fresh content, which is a positive ranking signal for search.
14. Text to HTML Ratio

More content is generally encouraged to help websites rank higher in search. But is this content the HTML code or the text?
- Is the Text to HTML ratio of your site high?
Search engines focus on the user's experience on a site. A site with a decent amount of text is an indication that it is built for people. Some sites keep text inside image files, which is of no use if you want that text to be visible to search engines.
Increasing the text to HTML ratio not only serves better SEO but also tends to improve page load speed through the elimination of unwanted code.
15. Use of Multimedia
The use of multimedia enhances the search engine positioning of a site. Multimedia comes in the form of images and videos and, according to this Searchmetrics ranking factor study, the higher the number of images used, the better the ranking.
- Does your website contain a good amount of multimedia?
This factor has a lot to do with the user experience of the site. Images and videos improve user interaction with the site, thus reducing bounce rates. High bounce rates cannot be good for a site's SEO, so embellish your content with relevant multimedia elements and make it more inviting for your visitors.
Just remember not to overload your site's pages with multimedia as it can lower the site speed.
16. Social Media Signals
You need social media share and like icons on your web page content, especially if you generate content at regular intervals in the form of blogs or press releases.
- Have you integrated social media buttons?
It is wise to tell search engines how many times your content has been liked or shared on social media. You can do that on the pages of your site by integrating social media share and connect buttons. You can also go a step further and add a blog to your website to increase your site's social media popularity. Social media is an important signal for enhanced SEO, and this is the onsite way to use it.
Other offsite ways include having a presence in social media sites and getting shares and likes on your content through them.
17. HTML Errors
There is no excuse for having HTML Errors on your site.
- Does your site code pass W3C validation?
Faulty coding and lots of errors in the code can indicate poor quality in a website. Although not a strong ranking factor in search, it is one of the many factors that can cause a drop in rankings. Hence, it is important to fix any HTML errors on your site on a regular basis.
Check if the code of your site is W3C Valid.
18. Site Uptime
Frequent maintenance issues on your server lead to site downtime, and that is not good for on page SEO.
- Is your website's downtime less than 24 hours?
Short periods of downtime are okay, but if your site tends to be down for long periods and at high frequency, it is time to consider changing servers and redirecting to a new IP.
Having a WooRank account will give you site downtime alerts via email. You can also use any of these free uptime monitoring tools to keep a track of your site's uptime and downtime rate.
19. Usability

Usability is a ranking factor for search engines and is controlled by on page elements.
- Is your site easy to navigate and understand?
- Is your site's bounce rate low?
- Is your content relevant to the target search keywords?
- Is the information on your site easily accessible?
- Does your site have trust signals?
- Is your site compatible on all browsers?
- Is your site user-friendly when viewed on smaller screens (mobile)?
- Is your site devoid of pop-ups and splash pages?
Apart from the answers to the above questions, there are more factors that affect the usability of a site. How a search engine user engages with the link returned in search is also taken into consideration.
For instance, imagine that you are a searcher: you click on a link and immediately click the back button. Your engagement with that site is poor. This may be due to poor user experience, and such engagement metrics may lower search ranking.
20. Outbound Links
Outbound links play a small role in search ranking, and they are totally under your control. These are links you create from your website to third-party websites with relevant content.
- Do you link to authority sites in your niche from your site?
When your site links to other sites in your niche with higher authority, it helps search engines better understand the relevance of your content. For instance, if your site has the keyword 'House' and it links to sites such as metacritic.com, search engines understand that your site is about House the TV series and not an actual house.
Having way too many outbound links can backfire though, as they will be considered a distraction from the main content. Also, do not nofollow all your outbound links, as that gives the impression that you are PageRank sculpting.
These were some of the most important on page SEO checklist items that will set your website on the right path towards search engine optimization. There are more factors in your on page SEO plan to consider as well, so make sure to follow our blog and stay updated on new posts.
Before you set your website on its on page SEO journey, you need to make sure it is connected to the analytics tools of the search engines you need traffic from. Since Google is the most popular search engine in the world, you will benefit from setting up a Google account for your website and installing and configuring Google Analytics and Google Search Console. Having an account with these free tools provided by search engines makes it easier for them to access your site's information, and some SEOs believe this also helps with ranking.
You must also have your target keywords ready, as they are an integral part of on page SEO. Learn how to do keyword research the smart way using the right tools.