Ovid (publius_ovidius) wrote,

A New Program?

When you create a Web site, you can add a file called "robots.txt" to tell search engines which pages they should and should not visit. This has plenty of legitimate uses. What I'm curious about, though, is why the White House has a robots.txt file that seems geared toward ensuring that search engines don't index anything it says about Iraq.
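For anyone who hasn't looked at one, a robots.txt file is just a plain-text list of User-agent and Disallow rules. Here's a quick sketch (Python, purely illustrative; the URL is real, but whatever rules it prints depend on what the file contains when you run it) that shows which Iraq-related paths are blocked:

```python
import urllib.request

# Fetch the site's robots.txt and print any Disallow rules mentioning "iraq".
with urllib.request.urlopen("http://www.whitehouse.gov/robots.txt") as resp:
    for line in resp.read().decode("utf-8", errors="replace").splitlines():
        if line.lower().startswith("disallow") and "iraq" in line.lower():
            print(line)
```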

Now I'm thinking about writing a small program that will fetch all of the "disallowed" pages, archive them by date, and later check for text differences to find out whether the White House is redacting those documents.
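A minimal sketch of what that might look like, again in Python. Everything here is my own scaffolding, not a finished tool: the BASE constant, the archive/YYYY-MM-DD directory layout, and the helper names (disallowed_paths, snapshot, diff_snapshots) are all made up for illustration.

```python
import datetime
import difflib
import hashlib
import pathlib
import urllib.parse
import urllib.request

BASE = "http://www.whitehouse.gov"   # site whose robots.txt we archive
ARCHIVE = pathlib.Path("archive")    # local directory for dated snapshots

def disallowed_paths():
    """Yield every path listed in a Disallow rule of the site's robots.txt."""
    with urllib.request.urlopen(BASE + "/robots.txt") as resp:
        for line in resp.read().decode("utf-8", errors="replace").splitlines():
            key, _, value = line.partition(":")
            if key.strip().lower() == "disallow" and value.strip():
                yield value.strip()

def snapshot():
    """Save today's copy of each disallowed page under archive/YYYY-MM-DD/."""
    today = ARCHIVE / datetime.date.today().isoformat()
    today.mkdir(parents=True, exist_ok=True)
    for path in disallowed_paths():
        # Hash the path to get a safe, stable filename for each page.
        name = hashlib.md5(path.encode()).hexdigest() + ".html"
        try:
            with urllib.request.urlopen(urllib.parse.urljoin(BASE, path)) as resp:
                (today / name).write_bytes(resp.read())
        except OSError:
            pass  # bare directories and vanished pages just get skipped

def diff_snapshots(old_dir, new_dir):
    """Print a unified diff for every page that changed between two dates."""
    for old_file in pathlib.Path(old_dir).glob("*.html"):
        new_file = pathlib.Path(new_dir) / old_file.name
        if not new_file.exists():
            continue
        old = old_file.read_text(errors="replace").splitlines()
        new = new_file.read_text(errors="replace").splitlines()
        for line in difflib.unified_diff(old, new, str(old_file), str(new_file)):
            print(line)

if __name__ == "__main__":
    snapshot()
```

Run it once a day (a cron job would do), then call diff_snapshots("archive/2003-10-27", "archive/2003-10-28") or similar to see exactly what changed between any two dates.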