Filtering web robots

You can use the web robots settings to improve community performance when your site is experiencing heavy traffic from bots and spiders.

Note: This is an optional feature available as a module. For more information, ask your Jive account representative.

If you find that web robots account for a large part of site traffic, you can limit the extra reporting overhead these visits cause by setting rules that identify bots based on the User-Agent values found in the HTTP request header. The rules you set determine how page views are reported and analyzed. Because bots change constantly, you may need to modify or add rules from time to time.

When you create rules, you can match the User-Agent value against either a literal substring or a fully-anchored regular expression.
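
For example, a literal substring rule such as Googlebot matches any User-Agent value that contains that text, while a fully-anchored regular expression must match the entire value (for instance, .*Googlebot.* rather than Googlebot on its own). The Java sketch below illustrates only this matching behavior; it is not the product's own rule syntax or implementation, and the helper method names are hypothetical.

    import java.util.regex.Pattern;

    public class RobotRuleExample {
        // Hypothetical helper: a literal-substring rule matches if the
        // User-Agent value contains the rule text anywhere.
        static boolean matchesSubstringRule(String userAgent, String rule) {
            return userAgent.contains(rule);
        }

        // Hypothetical helper: a fully-anchored regex rule must match the
        // entire User-Agent value, not just part of it.
        static boolean matchesRegexRule(String userAgent, String ruleRegex) {
            return Pattern.matches(ruleRegex, userAgent);
        }

        public static void main(String[] args) {
            // A typical bot User-Agent value, shown here only as an example.
            String ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

            System.out.println(matchesSubstringRule(ua, "Googlebot"));     // true: substring is present
            System.out.println(matchesRegexRule(ua, ".*Googlebot.*"));     // true: pattern covers the whole value
            System.out.println(matchesRegexRule(ua, "Googlebot"));         // false: anchored pattern must match the full value
        }
    }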