Jun 13, 2018

100+ Google SEO Ranking Factors


May 23, 2017

XPath for Extracting Tags from Webpages Using Screaming Frog

Screaming Frog's custom extraction feature lets you pull specific tags from pages using XPath expressions, for example:
//meta[(@name='robots')]/@content - extracts the meta robots tag from each page
//link[(@rel="alternate")]/@hreflang - identifies the hreflang HTML tags on pages
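As a quick stand-alone sanity check of these expressions, here is a small Python sketch using the standard library's ElementTree parser on a made-up sample page. ElementTree supports only the element-predicate part of XPath, so the trailing /@content and /@hreflang attribute steps are done with .get() instead (in Screaming Frog itself you would use the full expressions above).

```python
import xml.etree.ElementTree as ET

# Made-up sample page for illustration only.
sample = """
<html>
  <head>
    <meta name="robots" content="noindex, follow"/>
    <link rel="alternate" hreflang="en-us" href="http://www.example.com/"/>
    <link rel="alternate" hreflang="de" href="http://www.example.com/de/"/>
  </head>
  <body/>
</html>
"""

root = ET.fromstring(sample)

# Equivalent of //meta[@name='robots']/@content
robots = [m.get("content") for m in root.findall(".//meta[@name='robots']")]
# Equivalent of //link[@rel='alternate']/@hreflang
hreflangs = [l.get("hreflang") for l in root.findall(".//link[@rel='alternate']")]

print(robots)     # ['noindex, follow']
print(hreflangs)  # ['en-us', 'de']
```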


Sep 14, 2012

Components of Google's Ranking Algorithm


Jun 28, 2012

Add Multiple XML Sitemap Files to robots.txt

The robots.txt file tells search engine bots whether a webpage or a directory is allowed or disallowed for crawling.

A simple robots.txt file looks like this:

User-agent: *
Disallow: /abc/

In detail: User-agent: * means this section applies to all robots.

Disallow: /abc/ means that robots should not visit any pages in the mentioned folder.

If you have one or more XML sitemaps, you can list them in the robots.txt file.

Below is an example of how multiple XML sitemaps can be added to robots.txt:

User-agent: *
Disallow: /includes/
Disallow: /scripts/
Sitemap: http://www.yoursite.com/sitemap1.xml
Sitemap: http://www.yoursite.com/sitemap2.xml
Sitemap: http://www.yoursite.com/sitemap3.xml
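If you want to verify rules like these before uploading the file, Python's built-in robots.txt parser can check them locally. A small sketch, using the placeholder URLs from the example above:

```python
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /includes/
Disallow: /scripts/
Sitemap: http://www.yoursite.com/sitemap1.xml
Sitemap: http://www.yoursite.com/sitemap2.xml
Sitemap: http://www.yoursite.com/sitemap3.xml
""".strip().splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Blocked and allowed paths behave as the directives say:
print(parser.can_fetch("*", "http://www.yoursite.com/includes/menu.js"))  # False
print(parser.can_fetch("*", "http://www.yoursite.com/about.html"))        # True

# site_maps() (Python 3.8+) returns every Sitemap line found in the file:
print(parser.site_maps())
```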


Feb 16, 2012

301 Redirects using htaccess, 301 Redirects Tips, 301 Redirects SEO

301 Redirects Tips
Now I want to share how to redirect pages via 301 redirects using Apache's .htaccess. As we know, a 301 redirect is the search-engine-friendly method for web page redirection. It is not difficult to implement and it will conserve your search engine rankings as well. If your old link structure differs from the new one, you should have redirects in place. This will help you maintain your search engine rankings and reduce the chance of someone reaching a 404 page.

For example:

When someone tries to access your website without the www, it will redirect the URL to the www version of your website. We should stick with one pattern, because having your site accessible both ways is not good from an SEO (Search Engine Optimization) point of view: Google may treat the two versions as duplicate content. To avoid these errors and make the process SEO friendly, implement a 301 redirect for your website.

Follow these instructions:

1. Open Notepad or any other text editor and copy and paste the commands below.
2. Change yourdomain to your domain name.
3. Save the file with the name .htaccess (no other extension).
4. Upload it to the root directory of your website.

Options +FollowSymLinks
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]
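To see what the rule does without a running Apache server, here is a small Python sketch that models the same host check and substitution. This mirrors the logic for illustration, it is not how Apache executes it, and yourdomain.com is the placeholder from the snippet above.

```python
import re

def rewrite(host, path):
    """Return the 301 target URL if the host matches the bare domain,
    otherwise None (meaning: no redirect is issued)."""
    # RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC] -> case-insensitive host match
    if re.match(r"^yourdomain\.com$", host, re.IGNORECASE):
        # RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]
        return "http://www.yourdomain.com/" + path.lstrip("/")
    return None

print(rewrite("yourdomain.com", "/about.html"))      # redirected to the www URL
print(rewrite("www.yourdomain.com", "/about.html"))  # already canonical: None
```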


Feb 15, 2012

Google Analytics - How to Manually Track Clicks on Outbound Links

Google Analytics Event Tracking Tips
Suppose you want to use Google Analytics event tracking to record all clicks on outbound links from your site, but you also want those outbound links to open in a new window.
If you simply follow the standard instructions for tracking outbound links, you will discover that your links no longer open in a new window (regardless of whether you are using JavaScript or target="_blank" to open the window).

And here's the solution:

Apply the JavaScript to your hyperlink as usual (the URL and event labels below are just placeholders):

<a href="http://www.example.com" onclick="recordOutboundLink(this, 'Outbound Links', 'example.com'); return false;">your link text</a>

...but modify the tracking function as follows...

function recordOutboundLink(link, category, action) {
  try {
    var pageTracker = _gat._getTracker("XX-XXXXXXX-XX");
    pageTracker._trackEvent(category, action);
  } catch (err) {}
  // Open the outbound link in a new window shortly after the event fires,
  // instead of replacing the current page:
  setTimeout(function() { window.open(link.href); }, 100);
  return false;
}

(Don't forget to replace XX-XXXXXXX-XX with your tracking code.)

...and that's all: outbound links with Google Analytics tracking, opening in a new window.


Jan 6, 2012

10-Point Google Panda Checklist, Panda SEO Checklist

For the past few months we have been hearing about Panda, the Google Panda update. Panda has made a dramatic change in search engine results. The main reason behind the Google Panda update is to get rid of duplicate content, content farms, and scraper sites.
Google is still working on new techniques to weed out content farms and scraper sites. I have found a few things which I want to share with you. Below is the Google Panda checklist:
1). Do you have a page with low search performance?
Status - Yes / No
Note: You need to work on the quality of the page, or you may delete it. One poor-quality page should not drag down any number of quality pages.
2). Does your site load slowly?
Status - Yes / No
Note: The site load time should not be high. You need to optimize your images and code to improve the site's load speed.
3). Do you have well-researched keywords related to your web page content?
Status - Yes / No
Note: Without enough quality content, do not optimize your page titles for search keywords. A page will keep being listed for users' keywords only when it is genuinely useful to search users. Do not end up on signed-in users' personalized blocked-sites lists; at some point, the search algorithm will take blocked sites into account.
4). Do you have a high bounce rate?
Status - Yes / No
Note: The site needs well-structured related links within or around the content. The navigational structure of the site should be clear, with quality content. If the content is clear and interesting, the bounce rate will decrease, because users will be interested in visiting other relevant links.
5). Do users spend very little time on your page(s)?
Status - Yes / No
Note: For a good site, the average time users spend on a page should be high. This depends on the amount of quality content and the other relevant content pages. You need to work on your content to hold users on the page(s) for a longer period.
6). Do you have images related to the content, and do you own the copyright?
Status - Yes / No
Note: If you use images with your content, they should be relevant to it; an image should speak to the content. Use your own images, or make sure you have the copyright for the images you use.
7). Do you have a clean design?
Status - Yes / No
Note: User experience is one of the main factors in the Panda update. Make sure the content is readable and the website structure is user friendly, with the right combination of colours for a proper look and feel. Balance the user experience with the search engine experience. Also make sure you include a unique title, meta tags, H tags, P tags, a readable font, alt attributes, and anchor text.
8). Do you have a meta description that summarises your page content?
Status - Yes / No
Note: Make sure the meta description is not stuffed with keywords and that it gives a brief summary of the page content.
9). Have you done a grammatical check on your web page content?
Status - Yes / No
Note: Try to avoid grammatical mistakes in your content, and proofread before the content goes live. Clear content has a higher chance of ranking at the top.
10). Do you have many backlinks from sites on the same C-class IP?
Status - Yes / No
Note: Natural links are always better than artificial backlinks; natural links are risk free. Update your site often with quality content so that your readers will want to link to it from their own site(s). Artificial backlinks from directory submission sites have a high chance of coming from the same C-class IP unless the directories are well researched. A bunch of links from the same C-class IP carries little weight and can even be considered spam.
I hope this Google Panda SEO checklist helps you. The main factor for Panda recovery is to avoid duplicate content and to remove or update poor-quality content. Feel free to share your experience and any additional factors.


60 SEO Tips and Tricks, best SEO Tips and Tricks

Here are 60 of the latest SEO tips and tricks to help your website rank higher and gain visibility in search engines.

• Write great content
• Write unique content
• Add new content all the time
• Create a great keyword phrase
• Choose a phrase that is popular
• Write an accessible site for search engine spiders and screen readers
• Use the keyword phrase in your title tag
• Get a domain with your keyword phrase
• Use the keyword phrase in your URL

• Use your keyword phrase a lot, but not too much
• Use your keyword phrase in headlines
• Use your keyword phrase in anchor text of links
• Ask other people for links to your page
• Get your keyword phrase inside incoming links
• Try to get links from reputable sites
• Get links from similar sites
• Create as much content as you can
• Keep your site content inside one theme
• Keep your site live as long as possible
• Create a sitemap
• Create an XML sitemap
• Use 301 redirects for permanent redirects
• Use 302 redirects only for temporary moves
• Get as many inbound links as you can
• Put your keyword phrase in the first paragraph
• Put your keyword phrase at the top of the HTML
• Put your keyword phrase in alternative text
• Increase the font size of your keyword phrase
• Format your keyword phrases to stand out
• Write a descriptive meta description
• Link to your page from within your site
• Put up links that flow within the text
• Keep asking for inbound links
• Get linked in DMOZ and Yahoo!
• Periodically check your outbound links for pagerank
• Link all major images
• Keep your pages up-to-date
• If you use frames, always use the noframes tag
• If you use Flash, always include alternative text
• Use Flash for non-critical pieces of a page
• Keep your pages close to the root directory
• Use the meta keywords tag and include your keyword phrase
• Keep your keywords together
• Use your keyword phrase in your meta description
• Set your language meta keyword
• Optimize for a few secondary keywords
• Use your keyword phrase in named anchors
• Use synonyms for your keywords
• Don't link a lot to external sites
• Register a separate domain instead of a sub-domain
• Register a .com domain over a .biz or .us domain
• Use hyphens to separate words in domains
• Use hyphens or underscores to separate words in URLs
• Don't write your content with JavaScript
• Don't have more than 10 words in your URL
• Don't use dynamic URLs
• Don't use session IDs
• Don't link to link farms
• Don't have broken links on your site
• Don't use the meta refresh tag to redirect users


Aug 10, 2011

Another Way to Remove a URL from Google

You use the following URL when logged into Google Webmaster Tools:


Then replace {YOUR_URL} with a URL you control within Webmaster Tools, and replace {URL_TO_BLOCK} with the URL of the site you want to block.

You can block a whole site, a section, or a single page this way, depending on how you enter the URL. To block a site, use the top-level domain (e.g. http://www.someurl.com/); to block a section (subfolder), use a subfolder URL (e.g. http://www.someurl.com/somefolder/); and to block a page, use the specific page URL (e.g. http://www.someurl.com/somefolder/somepage.html).

I am waiting for an update from Google on why this happened, whether sites were impacted, and how long this was an issue.

