rejetto forum
Software => HFS ~ HTTP File Server => Topic started by: maverick on September 03, 2006, 12:39:22 PM
-
Does anyone know how to stop spiders from indexing an https site?
robots.txt and <META NAME="ROBOTS" CONTENT="NOINDEX,NOFOLLOW"> only work for an http site.
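For reference, the standard deny-all robots.txt (served from the site root) that I mean is just:

User-agent: *
Disallow: /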
Anyone know? ???
-
I would make all subdirectories password-protected (login required) and put the user:pass in a gif/jpg. Only humans can read it and log in, which keeps the spiders stuck on your first page.
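A minimal sketch of rendering such a credentials image with Python and Pillow (the username, password and filename here are just placeholders, not anything HFS provides):

from PIL import Image, ImageDraw

# Hypothetical credentials -- use whatever account you actually set up in HFS.
USERNAME = "guest"
PASSWORD = "s3cret"

# Draw the text onto a small white image so crawlers only see pixels, not text.
img = Image.new("RGB", (260, 40), "white")
draw = ImageDraw.Draw(img)
draw.text((10, 12), "login: " + USERNAME + "   password: " + PASSWORD, fill="black")

# Save as jpg (or gif) and link it from the public front page.
img.save("login-hint.jpg")

Then you only show that image on the unprotected front page; everything behind it stays behind the password prompt.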
-
Thanks.