rejetto forum
General => Everything else => Topic started by: teeburd105 on July 26, 2007, 05:50:53 PM
-
Is there any way to block Google's bots from scanning my server without creating user groups?
I like people to be able to access it without passwords. I know that with some web servers a file called robots.txt with specific command lines is used. Is there any option like that in HFS?
thanks, tony
-
yes, it's the same. create a robots.txt and put it in the virtual file system (under the home).
-
User-agent: *
Disallow: /
Save that as robots.txt, and put it on the root of your hosting directory
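If the goal is only to keep Google's crawler out while still letting other bots in, standard robots.txt syntax also lets you target a single user agent (this is plain robots.txt convention, nothing HFS-specific):

```
User-agent: Googlebot
Disallow: /
```

Note that robots.txt is only a request: well-behaved crawlers honor it, but it doesn't actually enforce anything.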
-
I've made the robots.txt file, but I don't use virtual folders, so where on my hard drive do I place it?
Is there an ini file i've missed?
tony
-
decide for yourself, it doesn't really matter. Just drag the file into the root of your VFS.
-
the home is like a folder, you can add anything in it.
and, for your information... surprise, you can add files to real folders too. the folder will display its content from the disk plus the added file.
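If you want to sanity-check the rules before a crawler actually visits, Python's standard library can evaluate a robots.txt. A quick sketch, assuming the "block everything" file suggested above:

```python
from urllib.robotparser import RobotFileParser

# The same rules as the robots.txt suggested above
rules = ["User-agent: *", "Disallow: /"]

rp = RobotFileParser()
rp.parse(rules)

# Every path is disallowed for every crawler, Googlebot included
print(rp.can_fetch("Googlebot", "/files/backup.zip"))  # False
```
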
-
thanks M8
tuskenraider