You can block your entire website, or a particular directory/folder, from search engines such as Google, Bing, Yahoo, Ask and many more, and there are several methods for doing so. Two common reasons to block a website from search engines are protecting your content from direct search traffic and avoiding duplicate content issues.
Methods of Blocking Websites:
Method 1: Blocking a Single Page. If you want to block a single page from search engines, add the HTML tag below to the particular page you want to block.
<meta name="robots" content="noindex, nofollow">
If you want to block the page from Google only, you can add the HTML tag below instead:
<meta name="googlebot" content="noindex, nofollow">
Method 2: If you want to block your entire website from search engines, you can restrict search engine bots through your website's robots.txt file:
User-agent: *
Disallow: /
If you want to disallow a particular search engine from crawling, name its bot:
User-agent: Baiduspider
Disallow: /
Disallow all search engines from particular folders:
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
Disallow: /tmp/
Disallow all search engines from particular files:
User-agent: *
Disallow: /contactus.htm
Disallow: /index.htm
Disallow: /store.htm
Disallow all search engines but one:
User-agent: *
Disallow: /private/
User-agent: Googlebot
Disallow:
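Before publishing a robots.txt, it helps to check how crawlers will read it. The sketch below is a minimal example using Python's standard urllib.robotparser against the "all engines but one" rules above; the domain and page names are hypothetical.

from urllib.robotparser import RobotFileParser

# The same rules as the example above: block everyone from /private/, except Googlebot.
rules = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow:
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot matches its own record, where an empty Disallow means "allow everything";
# every other bot falls back to the wildcard record and is blocked from /private/.
print(parser.can_fetch("Googlebot", "http://www.abc.com/private/page.htm"))  # True
print(parser.can_fetch("Bingbot", "http://www.abc.com/private/page.htm"))    # False
print(parser.can_fetch("Bingbot", "http://www.abc.com/index.htm"))           # True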
Method 3: Blocking a Single Outgoing Link. To hide a single link on a page from search engines, simply use the rel="nofollow" attribute as shown:
<a href="http://www.abc.com" rel="nofollow">ABC</a>