Cloudflare Tool Can Prevent AI Bots from Scraping Websites

Cloudflare has released AI Audit, a free set of tools designed to help websites analyze and control how artificial intelligence models use their content. The company describes it as “one-click blocking” of unauthorized AI scraping, and says the tools will also make it easier to identify which content bots scan most, so site owners can wall it off and negotiate payment in exchange for access. To help its clients toward a sustainable future, Cloudflare is also creating a marketplace where sites can negotiate fees based on AI audits that trace bots’ footprints across server files.
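Cloudflare has not published how its one-click blocking works internally, but conceptually, blocking AI scrapers amounts to filtering requests whose user-agent strings match known AI crawlers. A minimal sketch under that assumption (the `should_block` helper is illustrative, not Cloudflare code; the bot tokens are user-agent substrings their operators publicly document):

```python
# User-agent tokens of publicly documented AI crawlers (illustrative list).
AI_BOT_TOKENS = ("GPTBot", "CCBot", "ClaudeBot", "Bytespider")

def should_block(user_agent: str) -> bool:
    """Return True if the request's User-Agent matches a known AI scraper."""
    return any(token in user_agent for token in AI_BOT_TOKENS)

# An AI crawler is blocked; an ordinary browser is not.
assert should_block("Mozilla/5.0 (compatible; GPTBot/1.0; +https://openai.com/gptbot)")
assert not should_block("Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0")
```

Real deployments would also verify crawlers by IP range, since user-agent strings are trivially spoofed.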

Mind Your Facebook Comments: Soon Accessible via Google Search

  • Google has developed a new indexing plan that marks a shift in its traditionally passive approach.
  • “Mind what you say in Facebook comments,” reports Wired. “Google will soon be indexing them and serving them up as part of the company’s standard search results.”
  • “Google’s all-seeing search robots still can’t find comments on private pages within Facebook, but now any time you use a Facebook comment form on other sites, or a public page within Facebook, those comments will be indexed by Google.”
  • The article suggests the new policy may upset developers and users alike.
  • “There are two primary requests you can initiate on the Web,” explains Wired. “GET requests are intended for reading data, POST for changing or adding data. That’s why search engine robots like Google’s have always stuck to GET crawling. There’s no danger of the Googlebot altering a site’s data with GET, it just reads the page, without ever touching the actual data. Now that Google is crawling POST pages the Googlebot is no longer a passive observer, it’s actually interacting with — and potentially altering — the websites it crawls.”
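Wired’s GET/POST distinction can be demonstrated with a small sketch (the in-memory comment list and toy server below are illustrative, not Google’s or Facebook’s code): repeated GET “crawls” leave the site’s data untouched, while a single POST changes it.

```python
import http.server
import json
import threading
import urllib.request

# In-memory "site data": a comment list that only POST requests can mutate.
comments = ["first!"]

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # GET only reads: crawling this endpoint never changes the data.
        body = json.dumps(comments).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        # POST mutates: a crawler issuing this request would alter the site.
        length = int(self.headers.get("Content-Length", 0))
        comments.append(self.rfile.read(length).decode())
        self.send_response(201)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}/"

# Two GET "crawls": a passive observer, data unchanged.
before = json.load(urllib.request.urlopen(base))
after = json.load(urllib.request.urlopen(base))
assert before == after == ["first!"]

# One POST: the crawler is now interacting with -- and altering -- the site.
req = urllib.request.Request(base, data=b"new comment", method="POST")
urllib.request.urlopen(req)
print(json.load(urllib.request.urlopen(base)))  # the data now includes the posted comment

server.shutdown()
```

This is exactly why search robots traditionally stuck to GET: crawling POST endpoints risks writing to the sites being indexed.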