SEO audit of a website - A checklist for SEO audit



Presentation Transcript

slide 1: What is an SEO Audit
An SEO audit of a website can be defined as the 'complete health check-up of a website' in terms of traffic, crawlability by bots, quality of content, backlinks, loading speed, and various other technical factors that decide the overall search-engine friendliness of your website.

What are the steps in an SEO audit?
 Authority check
 Keyword analysis
 Competitor analysis
 Technical analysis
   Keyword cannibalization
   Duplicate content
   XML sitemap
   Robots.txt
   Mobile friendly check (Google)
   Speed
   404 errors / redirect issues
 Backlink profile analysis
   Quality of backlinks
   Total unique referring domains

A Guide to a Complete SEO Audit of Your Website

slide 2:
 On-page factors
   Meta descriptions
   Title tags
   URL optimization
   Keyword mapping and density

1. Authority Check
Authority indicates the overall trust and quality of a website. You can install the MozBar Chrome extension to find the authority of your website, called Domain / Page Authority. The tool shows two metrics: Domain Authority (DA) for the complete website and Page Authority (PA) for an individual page. Generally, a DA of more than 30 is considered good.

2. Keyword Analysis
Keyword analysis involves checking the relevancy of your keywords. Analyze your current set of keywords and make sure you are targeting the right ones. This also involves examining the current rankings and search volume data of all the keywords in your list. You can use tools like SEMrush or the Google AdWords Keyword Planner to get the search traffic and positions of your current keywords. Once you have the traffic data, find the keywords that have relatively low volume and target them, as they are less competitive than the high-search-volume keywords.

3. Competitor Analysis
Competitor analysis helps us understand how we stand compared to our competitors. It paves the way to include in our strategy the best practices our competitors are using. Here again you can use SEMrush to find your top organic competitors. Once you know your top competitor, analyze the competitor's domain to decode their keywords. This way you can get new keyword ideas to add to your list. You can even find some low-hanging fruit: competitors' keywords that have low competition. Examine the backlinks of your competitors and make a plan to acquire backlinks from those domains.

4. Technical Analysis
1) Keyword cannibalization - Keyword cannibalization occurs when two pages compete for the same keyword. It can confuse Google about which page should rank for that particular keyword. Use the Screaming Frog SEO Spider to find such issues.
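If you already have a crawl export as URL-to-title pairs (from Screaming Frog or any crawler), the duplicate-title side of this check can be scripted in a few lines. A minimal sketch — the page data below is hypothetical:

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group URLs by their page title and return titles used on more than one page."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical crawl export: URL -> <title> text
pages = {
    "/shoes": "Buy Running Shoes Online",
    "/running-shoes": "Buy Running Shoes Online",
    "/contact": "Contact Us",
}
print(find_duplicate_titles(pages))
# {'buy running shoes online': ['/shoes', '/running-shoes']}
```

Any title that maps to more than one URL is a candidate cannibalization issue worth reviewing.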
For example, once you scan your domain in Screaming Frog by entering the URL in the box, you will see the data in different tabs. Go to Page Titles and set the filter drop-down to 'Duplicate'. You will see the list of pages that share the same title, which is not good for SEO. Make sure you map a unique keyword to each page title.

2) Duplicate content - Duplicate content hurts a website's SEO. Make sure you write unique meta descriptions for each page, as well as unique content within each of your website's pages. Here again you can use Screaming Frog: go to the Meta Description tab and filter the results by 'Duplicate' to view the pages that share the same meta description. In order to check duplicate content

slide 3: within your pages themselves, or in the blog posts you publish, use a tool like Copyscape or a plagiarism checker to verify the originality of the content.

3) XML sitemap - If you have linked your website to Google Search Console (formerly Google Webmaster Tools), you can use it to check whether a sitemap has been submitted. Or you can simply open 'https://yourdomain/sitemap.xml' in the browser to find whether your website has a sitemap. In some cases you will see a list of sitemaps at 'https://yourdomain/sitemap_index.xml'. Make sure all the pages are indexed; you can check that in Google Search Console as well.

4) Robots.txt - Similar to the sitemap, check for a robots file at 'https://yourdomain/robots.txt'. Robots.txt is an effective way to give crawl instructions to search engine bots, but sometimes webmasters accidentally block important pages. So carefully examine your robots file, look for 'Disallow: /' directives, and check whether you have blocked any important resource.

5) Mobile Friendly Check - Run Google's Mobile-Friendly Test to see how your page is displayed on mobile devices.

6) Speed - Website loading speed is an important factor in SEO. You can check it with various free tools; I would recommend Pingdom and Google's PageSpeed tools.

7) 404 Errors - Check for 404 errors in Screaming Frog, or use Google Search Console and go to the crawl errors section. You can then fix the errors with 301 redirects or custom navigation on 404 pages.

5. Backlink Profile Analysis
1) Quality of links - Not all backlinks are good for SEO. Examine the DA or trust score of all the domains pointing to you. You can first download the list of domains from SEMrush and then run a bulk check on Majestic to find the trust score of the links. Also check that only relevant domains point to you; you can disavow spam backlinks in Google Search Console.
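The robots.txt spot-check can be automated with Python's standard-library robots parser. A minimal sketch, using a hypothetical robots.txt that accidentally blocks the blog section:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content (in practice, fetch https://yourdomain/robots.txt)
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /blog/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Spot-check that important sections are still crawlable
for path in ["/blog/seo-audit", "/products/shoes", "/admin/login"]:
    allowed = rp.can_fetch("*", path)
    print(f"{path}: {'crawlable' if allowed else 'BLOCKED'}")
```

Here `/blog/seo-audit` comes back blocked, which is exactly the kind of accidental `Disallow` this step is meant to catch.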
2) Total unique referring domains - The number of unique domains referring to your website matters more than having many backlinks from only a few domains.

6. On-Page Factors
1) Meta descriptions - The ideal length of a meta description is 160 to 170 characters. Also check that the target keyword is included in the meta description.

2) Title tags - The ideal length for a title tag is 60 to 70 characters. Include the target keyword naturally in the title tag. Title tags should be placed within the head section of the page.

3) URL optimization - The URLs of all your pages should be clean and clear. Avoid special characters within URLs; you can use hyphens to separate words. It's good practice to include target keywords within URLs, but don't over-stuff them. Keep the URL structure clean and neat.

4) Keyword mapping and density - Make sure every page on your website is mapped to a unique keyword. You can check that with Screaming Frog by analyzing the titles, meta descriptions, and URLs of all pages. Also check the keyword density within the content. Ideally, use the keyword once in the title of a post, once in the first paragraph, and then different variations of the keyword two to three times within the body. Check whether you have keyword-stuffed content and rewrite it, as keyword stuffing is bad for SEO.
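The title-tag and meta-description checks above can be scripted against raw page HTML with Python's standard-library HTML parser. A minimal sketch — the sample page and its text are hypothetical:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collect the <title> text and the meta description from raw HTML."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Hypothetical page HTML (in practice, fetch each URL from your crawl list)
page_html = """<html><head>
<title>SEO Audit Checklist | Example Site</title>
<meta name="description" content="A step-by-step SEO audit checklist covering crawlability, content, backlinks and on-page factors.">
</head><body></body></html>"""

audit = OnPageAudit()
audit.feed(page_html)
print(f"title: {len(audit.title)} chars, meta description: {len(audit.meta_description)} chars")
```

Run this over every page and flag titles or meta descriptions that fall outside the length ranges given above, or that are missing the page's target keyword.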

slide 4: These are some basic steps in the SEO audit of a website. You can also purchase tools for a complete SEO audit. An SEO audit helps us understand the current state of our website on the internet, and this paves the way for further improvement. You can then hire a digital marketing services company to solve the problems your website is facing and rank better in the search engine results. You can also map your website's overall digital roadmap with a digital intelligence tool like TrinityX to digitally transform your business. Hope this checklist helps you perform an effective audit of your website.
