Filtering web robots

You can use the web robots settings to improve community performance when heavy traffic from bots and spiders is hitting your site.

Fastpath: Admin Console: System > Settings > Web Robots

If you find that web robots account for a large share of site traffic, you can reduce the overhead these visits cause by setting rules that identify bots based on the User-Agent value in the HTTP request header. The rules you set determine how page views are reported and analyzed. Because bots change constantly, you may need to modify or add rules from time to time.

When you create rules, you can specify a literal substring, or you can use a fully-anchored regular expression for pattern matching. For the Java regular expression grammar, see the java.util.regex.Pattern class documentation on the Oracle portal.
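The two rule types behave differently: a literal substring matches if it appears anywhere in the User-Agent value, while a fully-anchored regular expression must match the entire value. The sketch below illustrates that distinction using the Java regex classes the documentation points to; the rule strings shown (such as "Googlebot" and the crawler/spider pattern) are hypothetical examples, not the product's predefined list.

```java
import java.util.List;
import java.util.regex.Pattern;

public class BotFilterExample {
    // Hypothetical literal-substring rules: match anywhere in the User-Agent.
    private static final List<String> SUBSTRING_RULES = List.of("Googlebot", "bingbot");

    // Hypothetical fully-anchored regex rule: must match the whole User-Agent,
    // hence the leading and trailing ".*" around the alternation.
    private static final List<Pattern> REGEX_RULES = List.of(
        Pattern.compile(".*(crawler|spider).*", Pattern.CASE_INSENSITIVE)
    );

    // Returns true if the User-Agent value matches any configured rule.
    static boolean isBot(String userAgent) {
        for (String s : SUBSTRING_RULES) {
            if (userAgent.contains(s)) {
                return true;
            }
        }
        for (Pattern p : REGEX_RULES) {
            // matches() anchors at both ends, unlike find().
            if (p.matcher(userAgent).matches()) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(isBot("Mozilla/5.0 (compatible; Googlebot/2.1)"));
        System.out.println(isBot("Mozilla/5.0 (Windows NT 10.0) Firefox/115.0"));
    }
}
```

Note that because `matches()` is anchored, a regex rule written without the surrounding `.*` would only match a User-Agent consisting solely of the word "crawler" or "spider".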

All the strings defined in this list are used to filter out bots, so you must remove any strings you don't want to match. You can also restore the predefined list at any time.