There is no requirement to do anything with this feature. You could simply leave it as it is after you install kiwitrees. It was designed so that kiwitrees will work perfectly if you just ignore it.
The page consists of two parts:
Exactly what you do with this information depends on how aggressively you want to restrict access to the site.
If you do want to actively manage things, you should start by finding out a little more information about each of your “unrecognised visitors”:
Remember also that user-agent strings are easily faked. Anyone can send a request with a UA string of: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
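One way to test such a claim is the forward-confirmed reverse DNS check that Google itself recommends: reverse-resolve the visitor's IP address to a hostname, check that the hostname belongs to Google, then resolve that hostname back and confirm it yields the same IP. A minimal sketch in Python (the function names are our own, not part of kiwitrees):

```python
import socket

def is_google_hostname(host):
    """Genuine Googlebot hosts reverse-resolve to names under these domains."""
    return host.endswith((".googlebot.com", ".google.com"))

def verify_googlebot(ip):
    """Forward-confirmed reverse DNS check for a visitor claiming to be
    Googlebot: reverse lookup, domain check, then forward confirmation."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse lookup: IP -> hostname
    except OSError:
        return False
    if not is_google_hostname(host):
        return False
    try:
        # forward lookup: hostname -> IPs; the original IP must be among them
        return ip in {info[4][0] for info in socket.getaddrinfo(host, None)}
    except OSError:
        return False
```

A visitor that presents a Googlebot UA string but fails this check can reasonably be treated as an impostor and blocked.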
Most robots are harmless. They only tend to cause problems when they visit too frequently and consume too many server resources. But it is important to identify them. For example, you don’t want them to have access to the calendar page: by following its links they would end up with an almost infinite number of pages. The “Site Access Rules” page helps to control access by universally allowing, denying, or limiting access for IP address ranges, but you also need a “robots.txt” file in place to control more specifically which pages robots are denied access to.

If you have had a robots.txt file in place for some time, you should review it. The latest version of kiwitrees (3.1) includes a new example robots.txt file (robots-example.txt) in its installation package. This reverses the previous file’s “white-list” rules in favour of a simpler “black-list” approach. This ensures search engines such as Google can access all site resources in order to accurately assess SEO rankings, while still denying them access to pages that are unnecessary for them and that might consume excessive server resources if crawled repeatedly. These are mainly pages with large numbers of links to, for example, hundreds of individual pages that the robot will find anyway.
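As an illustration of the black-list approach, such a robots.txt might look like the fragment below. The paths shown are only examples; the robots-example.txt file shipped with kiwitrees is the authoritative list for your version:

```
User-agent: *
Disallow: /login.php
Disallow: /calendar.php
```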
This file needs to be copied, renamed, and placed in your web site’s domain root directory, such as “www.example.com/robots.txt”. It will not work in a subdirectory, such as “www.example.com/kiwitrees/robots.txt”. If you do need to move it, remember to adjust the paths as well, e.g. “Disallow: /login.php” becomes “Disallow: /kiwitrees/login.php”.
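If your installation lives in a subdirectory, that path adjustment can be scripted rather than done by hand. A sketch in Python, assuming the subdirectory is “/kiwitrees” (adapt to your own layout; in practice you would run this over the contents of robots-example.txt before saving the result as robots.txt in the domain root):

```python
def prefix_rules(robots_text, subdir="/kiwitrees"):
    """Prepend the installation subdirectory to each Disallow path."""
    out = []
    for line in robots_text.splitlines():
        if line.startswith("Disallow: /"):
            line = line.replace("Disallow: /", f"Disallow: {subdir}/", 1)
        out.append(line)
    return "\n".join(out)

print(prefix_rules("Disallow: /login.php"))
# → Disallow: /kiwitrees/login.php
```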