I have some webpages on htohananet.com that I want to modify or remove, but hawaiiantel cut off my access to them a couple of years ago without telling me. (Well, it was announced someplace on their website, but who looks there? I found out only after the fact.)
Anyway, hawaiiantel now says they no longer have access to those pages. And what's worse, they couldn't direct me to anybody who does.
Meanwhile, the site has been spidered by Google. And Google effectively says virtually anything on the web is up for grabs (except for confidential information like Social Security numbers).
Well, to keep this from happening again, I want to prevent the search engines from finding certain webpages when I upload them to another site.
And here's a page that tells me how.
From what I gather, all you have to do is put a robots.txt file in the site's root directory with the following directives:
User-agent: *
Disallow: /
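
One caveat I picked up along the way: Disallow only asks well-behaved crawlers not to fetch the pages. Google can still list a blocked URL (without its content) if some other site links to it. Also, if you only want to hide part of a site, you can disallow just that path. As a minimal sketch, assuming a directory I'm calling /private/ (a made-up name) holds the pages to hide:

# Let crawlers see everything except /private/
User-agent: *
Disallow: /private/

Everything else on the site stays crawlable.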
Alternatively, you can insert this meta tag in the HEAD section of the page:

<meta name="robots" content="noindex">
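
Just to show where it goes, here's a bare-bones page (the title and body text are placeholders of my own):

<!DOCTYPE html>
<html>
<head>
  <!-- tell search engines not to index this page -->
  <meta name="robots" content="noindex">
  <title>A page I want hidden</title>
</head>
<body>
  <p>Not for search engines.</p>
</body>
</html>

One catch: for the tag to work, the page can't also be blocked in robots.txt, since the crawler has to be able to fetch the page to see the tag.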
Here are Google's instructions.