# On gopher-to-http proxies

I have done some searching on the www search engines, and my (and everyone else's) content is live on the web because robots.txt is not set up correctly on the proxies. When I got to the last sentence I realized that I am part of the problem: I had forgotten about the early version of the RPoD proxy living on SDF. This is now fixed with:

    User-agent: *
    Disallow: /

So, I want to apologize, and also to ask everyone running a gopher proxy to create a robots.txt file in their www root to keep gopher and the web away from each other.
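In case it helps, here is a minimal sketch (Python, standard library only) of how an operator could check whether a given proxy already serves a robots.txt that blocks all crawlers. The host name is a placeholder, not a real proxy:

    # Minimal sketch: check whether a host's robots.txt disallows
    # everything for every user agent. "proxy.example.org" is a
    # hypothetical host, not a real proxy.
    from urllib.robotparser import RobotFileParser

    def blocks_all_crawlers(host: str) -> bool:
        """Return True if host's robots.txt disallows '/' for all agents."""
        parser = RobotFileParser(f"https://{host}/robots.txt")
        parser.read()  # fetch and parse the file
        # can_fetch("*", "/") is False when 'User-agent: *' plus
        # 'Disallow: /' applies, i.e. the whole site is off limits
        return not parser.can_fetch("*", "/")

    if __name__ == "__main__":
        print(blocks_all_crawlers("proxy.example.org"))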

Thanks