Filtering Web Robots

You can use the Web Robots settings to improve community performance when heavy traffic from bots and spiders is hitting your site.

Fastpath: Admin Console: System > Settings > Web Robots

If you find that web robots account for a large share of site traffic, you can reduce the reporting overhead these visits cause by defining rules that identify bots based on the User-Agent value in the HTTP request header. The rules you set here determine how page views are reported and analyzed. Because bots change constantly, you may need to modify or add rules from time to time.

When you create a rule, you can specify either a literal substring or a fully anchored regular expression for pattern matching. Regular expressions use Java syntax.
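
To illustrate the difference between the two rule types: a literal substring matches anywhere within the User-Agent value, while a fully anchored regular expression must match the entire value. The following Java sketch is only an illustration of that matching behavior, not the product's actual implementation; the class name and the sample rule strings are hypothetical.

    import java.util.List;
    import java.util.regex.Pattern;

    public class RobotFilterExample {
        // Hypothetical rules: two literal substrings and one fully anchored regex.
        private static final List<String> SUBSTRING_RULES = List.of("Googlebot", "bingbot");
        private static final List<Pattern> REGEX_RULES = List.of(
            // Anchored: this must match the whole User-Agent value to count as a hit.
            Pattern.compile("Mozilla/5\\.0 \\(compatible; \\w+bot/.*\\)")
        );

        static boolean isRobot(String userAgent) {
            // A literal substring matches anywhere within the User-Agent header.
            for (String s : SUBSTRING_RULES) {
                if (userAgent.contains(s)) {
                    return true;
                }
            }
            // matches() anchors the pattern to the entire input string.
            for (Pattern p : REGEX_RULES) {
                if (p.matcher(userAgent).matches()) {
                    return true;
                }
            }
            return false;
        }

        public static void main(String[] args) {
            // Prints "true": the User-Agent contains the "Googlebot" substring.
            System.out.println(isRobot(
                "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"));
        }
    }
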

Every string defined in this list is used to filter out bots, so remove any strings you don't want to match. To restore the predefined list, click Reset to Defaults.