Google and Automated SEO Tools
This is what Matt Cutts had to say:
Hi Scott. Google does use algorithms and different techniques to block excessive automated queries and scraping, especially when someone is hitting Google quite hard. The reason is that scraping consumes server resources. We don't want real users to be slowed down or affected just because a bot is sending bunches of automated queries to Google.
We do turn off a number of tools/bots/IP addresses that scrape us too heavily. It's a common enough phenomenon that we did a blog post on Google's Online Security Blog about the subject. In fact, I know that just a week or so ago our algorithms turned off an IP belonging to one of the entities that you mentioned in your post.
In general, I would approach the bizdev folks at Google about how to send automated queries to Google with permission. Failing that, be aware that if a tool sends too many queries to Google, we do reserve the right to disable the IP address(es) of that tool. One thing I would *not* recommend, if a tool is blocked for bad behavior, is trying to make the tool more "sneaky" (e.g. trying to make the tool look closer to a web browser). Attempts to fake out Google and pretend to be more like a web browser (after you've been blocked once already) are an example of the sort of thing that is really bad in our opinion.
JohnMu further stated:
This is an interesting topic. I'm not quite sure how all those links apply to our terms of service, which do not allow these kinds of automated tools.
A tool accessing other websites should try to obey the rules set forth by that website. In general, these rules are described in several ways:
1. The robots.txt covers which URLs may be accessed and which ones are disallowed. You'll notice that in our robots.txt we explicitly disallow "/search", which is what most of the ranking tools generally try to access.
2. The server result codes give more information when a URL is accessed. When our network recognizes automated queries, it may return a result code of 500 or similar.
3. The HTTP headers returned by the server can provide information through the "x-robots-tag".
4. An HTML page may provide information through a "robots" (or in our case, "googlebot") meta tag.
As far as I am aware, there are no "SEO-tools" that have permission to access our web-search results in an automated way. I am also not aware of any plans to change that in the near future.
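To make JohnMu's list concrete, here is a minimal sketch of how a well-behaved tool might honor the first three signals before and during a request. It uses only Python's standard library; the user-agent string and target URL are illustrative assumptions for the example, not anything quoted above.

```python
import urllib.robotparser
import urllib.request
import urllib.error

USER_AGENT = "example-seo-tool/1.0"          # hypothetical tool identifier
TARGET = "https://www.google.com/search?q=example"

# 1. robots.txt: Google explicitly disallows "/search",
#    so can_fetch() should report False for this URL.
rp = urllib.robotparser.RobotFileParser("https://www.google.com/robots.txt")
rp.read()

if not rp.can_fetch(USER_AGENT, TARGET):
    print("Blocked by robots.txt - do not fetch:", TARGET)
else:
    req = urllib.request.Request(TARGET, headers={"User-Agent": USER_AGENT})
    try:
        with urllib.request.urlopen(req) as resp:
            # 2. Server result codes: urlopen() only reaches this point
            #    on a successful (2xx) response.
            # 3. HTTP headers: honor an X-Robots-Tag such as "noindex".
            x_robots = resp.headers.get("X-Robots-Tag", "")
            if "noindex" in x_robots.lower():
                print("X-Robots-Tag forbids indexing this page.")
    except urllib.error.HTTPError as err:
        # A 500-style (or 429) response is the server telling the
        # tool to stop or back off, as described in point 2.
        print("Server refused the request with status", err.code)
```

The fourth signal, the "robots"/"googlebot" meta tag, would additionally require parsing the returned HTML, which is omitted here for brevity.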
About the Author: Learn about http://www.hirank.com/ at http://www.hirank.com/blog/