# Define access restrictions for robots/spiders
# http://www.robotstxt.org/wc/norobots.html

# By default we allow robots to access all areas of our site
# already accessible to anonymous users
User-agent: *
# Ask crawlers that honor Crawl-delay to wait 5 seconds between requests
Crawl-delay: 5
Disallow: /*sendto_form$
Disallow: /*folder_factories$
Disallow: /*manage_translations_form$
Disallow: /*folder_contents
Disallow: /*article-view-components

# Deny these crawlers access to the whole site
User-agent: AhrefsBot
Disallow: /

User-agent: Baiduspider
Disallow: /

User-agent: Ezooms
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: YandexBot
Disallow: /

# Googlebot matches this group and ignores the "*" group above, so the
# wildcard rules are repeated here to exclude forms that are repeated for
# each piece of content in the site. The wildcard syntax is an extension
# supported by Googlebot.
# http://www.google.com/support/webmasters/bin/answer.py?answer=40367&ctx=sibling
User-agent: Googlebot
Disallow: /*sendto_form$
Disallow: /*folder_factories$
Disallow: /*manage_translations_form$
Disallow: /*folder_contents
Disallow: /*article-view-components
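
# Illustrative note on the wildcard syntax (the URLs below are hypothetical
# examples, not additional rules): /*sendto_form$ blocks
# /news/my-story/sendto_form, because $ anchors the pattern to the end of
# the URL, but does not block /news/sendto_form/print; a pattern without $,
# such as /*folder_contents, also blocks /a/folder_contents/view.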