An update to Google’s privacy policy suggests that the entire public internet is fair game for its AI projects.

    • renrenPDX@lemmy.world · 1 year ago
      It’s a plain text file hosted on your site that should be visible to the internet. It basically allows or disallows search engines from scraping parts of your site.
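      For instance, a minimal robots.txt might look like this (the paths here are purely illustrative):

      ```
      # Rules for all crawlers
      User-agent: *
      # Keep bots out of the admin area
      Disallow: /admin/
      # Everything else may be crawled
      Allow: /

      # Optional: point crawlers at the sitemap
      Sitemap: https://example.com/sitemap.xml
      ```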

    • sudo@lemmy.fmhy.ml · edited · 1 year ago

      Here’s an example: https://www.google.com/robots.txt

      Basically, it’s a file people put in the root directory of their domain to tell automated web crawlers which sections of the website they may access, and which kinds of crawlers are allowed to access those resources at all.
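      As a rough sketch, a well-behaved crawler might consult that file before fetching anything, e.g. with Python’s built-in urllib.robotparser (the user-agent name here is a made-up placeholder):

      ```python
      from urllib.robotparser import RobotFileParser

      # Download and parse the site's robots.txt
      rp = RobotFileParser()
      rp.set_url("https://www.google.com/robots.txt")
      rp.read()

      # Ask whether our crawler may fetch a given URL
      user_agent = "ExampleCrawler"  # placeholder, not a real bot
      url = "https://www.google.com/search"
      if rp.can_fetch(user_agent, url):
          print("robots.txt allows fetching", url)
      else:
          print("robots.txt disallows fetching", url)
      ```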

      It isn’t legally binding; it’s more of a courtesy. Some sites may block your traffic if they detect prohibited behavior, so it gives your crawlers an idea of what’s okay so they don’t get blocked.
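      Tying it back to the article: crawlers identify themselves by user-agent, so a site can single out a specific bot. For example, OpenAI’s crawler identifies itself as GPTBot, and a site that wanted to opt out of it entirely (assuming the bot honors the file) could add:

      ```
      # Block OpenAI's crawler from the whole site
      User-agent: GPTBot
      Disallow: /

      # All other crawlers unaffected
      User-agent: *
      Allow: /
      ```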