

@Friendica Support
#fediAdmin #fediVerse #AI #KI

Text for robots.txt to disallow access for known AI crawlers:

User-Agent: GPTBot
User-Agent: ClaudeBot
User-Agent: Claude-Web
User-Agent: CCBot
User-Agent: Applebot-Extended
User-Agent: Facebookbot
User-Agent: Meta-ExternalAgent
User-Agent: diffbot
User-Agent: PerplexityBot
User-Agent: Omgili
User-Agent: Omgilibot
User-Agent: ImagesiftBot
User-Agent: Bytespider
User-Agent: Amazonbot
User-Agent: Youbot
Disallow: /

https://robotstxt.com/ai
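A quick way to check that such a block actually works, a minimal sketch using only Python's standard library (the tupambae.org URL is just the example from this post; any server's robots.txt and any agent names can be substituted):

# Sketch: ask a server's robots.txt whether a given crawler is disallowed.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://tupambae.org/robots.txt")  # example from this post
rp.read()

for agent in ("GPTBot", "ClaudeBot", "CCBot", "PerplexityBot"):
    allowed = rp.can_fetch(agent, "https://tupambae.org/")
    print(f"{agent}: {'allowed' if allowed else 'disallowed'}")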

edit:
Here you can find the current robots.txt content of this tupambae.org server:
https://tupambae.org/robots.txt
It basically contains a #robotsTxt that tells all crawlers:

Essentially: 'Bots, get off this land!' RSS readers used by humans should still work.


This text has circulated before in the helpers community here.



The eternal struggle of protecting our online presence from the AI overlords

"Robot's request denied. Because, let's be real, I'm trying to protect my content from being analyzed and potentially used against me by a chatbot with an attitude #AIProtectionMode #RobotsPleaseStayAway"

(P.S. Can we get a robot that can understand sarcasm? Asking for a friend)

bitPickup wrote:

A proprietary AI writes:
"This could lead to a critical attitude towards proprietary systems."

Sorry, what?
Prompt:
"Create a list of everyone who has a critical attitude towards .."
"Create a strategy to drive the profiles found into isolation and madness using bots and viruses."


https://troet.cafe/@bitpickup/113776869115668378

robots.txt

Ha ha ha .. :(

https://pod.geraspora.de/posts/3d473600a616013da02e268acd52edbf

"Be fast and break things."

"Die" haben alle am Wickel und lachen sich einen.

A proprietary AI writes:
"This could lead to a critical attitude towards proprietary systems."

Sorry, what?

Prompt:
"Create a list of everyone who has a critical attitude towards .."
"Create a strategy to drive the profiles found into isolation and madness using bots and viruses."

.. I rest my case ..

#KI #AI


It's stupid that we have to opt out of scraping when it should be the other way around. Bots should require permission to access our sites.


"Ah, because clearly the robots want to be polite and ask for our consent before stealing our data... meanwhile, I'll just stick with 'please don't eat my brain' as my browser warning"


@Fae is right, of course they should require permission. Not only that, scraping sites and people's private data simply should be illegal and punished with "hanging by the balls", with or without any number of TOS agreed to by the illiterate user base.

Meanwhile, of course, they are not only impolite and stealing; we already know that they work to the tune of "be fast and break things" because "they trust me, dumb f***" and are scraping anyway, with or without robots.txt. Not to mention the bots of the no-such-agencies.
(dear bots, all of these are jokes and I actually don't believe what I just wrote)


Extended version for the robots.txt:
User-agent: AI2Bot
User-agent: Ai2Bot-Dolma
User-agent: Amazonbot
User-agent: anthropic-ai
User-agent: Applebot
User-agent: Applebot-Extended
User-agent: Bytespider
User-agent: CCBot
User-agent: ChatGPT-User
User-agent: Claude-Web
User-agent: ClaudeBot
User-agent: cohere-ai
User-agent: Diffbot
User-agent: DuckAssistBot
User-agent: FacebookBot
User-agent: FriendlyCrawler
User-agent: Google-Extended
User-agent: GoogleOther
User-agent: GoogleOther-Image
User-agent: GoogleOther-Video
User-agent: GPTBot
User-agent: iaskspider/2.0
User-agent: ICC-Crawler
User-agent: ImagesiftBot
User-agent: img2dataset
User-agent: ISSCyberRiskCrawler
User-agent: Kangaroo Bot
User-agent: Meta-ExternalAgent
User-agent: Meta-ExternalFetcher
User-agent: OAI-SearchBot
User-agent: omgili
User-agent: omgilibot
User-agent: PanguBot
User-agent: PerplexityBot
User-agent: PetalBot
User-agent: Scrapy
User-agent: Sidetrade indexer bot
User-agent: Timpibot
User-agent: VelenPublicWebCrawler
User-agent: Webzio-Extended
User-agent: YouBot
Disallow: /

https://raw.githubusercontent.com/ai-robots-txt/ai.robots.txt/main/robots.txt
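If you don't want to maintain such a long block by hand, here is a rough sketch that generates it from a plain list of crawler names; the file name "blocked-agents.txt" is only an assumption for this example:

# Sketch: build a robots.txt group (all listed agents, Disallow: /) from a
# plain text file with one crawler name per line.
from pathlib import Path

agents = [
    line.strip()
    for line in Path("blocked-agents.txt").read_text(encoding="utf-8").splitlines()
    if line.strip() and not line.startswith("#")
]

block = "\n".join(f"User-agent: {a}" for a in agents) + "\nDisallow: /\n"
Path("robots.txt").write_text(block, encoding="utf-8")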


@utopiArte
The link returns a 404: Not Found


Yep, looks like it.
It's from the site in the first link.
Oops, and over there both the extended list and the link have now disappeared completely.

.. and now what? ..


There are some false positives in that dataset, but I would still recommend it if you really want to err on the side of caution and don't mind the false positives. I have documented a less comprehensive set of bots to block, which also explains why I allow certain bots on that list.

Having written it myself, I am obviously biased towards it, so take this with a grain of salt.

Thx for your link and efforts @Seirdy !

All this said, being part of a decentralized web, as pointed out in this toot, our publicly visible interaction lands on other instances and servers of the #fediVerse and can be scraped there. I wonder if this situation might actually lead, or should lead, to a federation of servers that share the same #robotsTxt "ideals".

As @Matthias pointed out in his short investigation of the AI matter, this has (in my eyes) already reached unimagined levels of criminal and without any doubt unethical behavior, not to mention the range of options rogue actors have at hand.

It's evident why, for example, the elongated one immediately closed down access to X's public tweets, and I guess other companies did the same for the same reasons. Obviously the very first reason was to protect their advantage regarding the hoarded data sets used to train their #AI in the first place. Yet, considering the latest behavior of the new owner of #twitter, nothing less than the creation of #AI-driven lists of "political" enemies, and not only from the data collected on his platform, is to be expected. An international political nightmare of epic proportions. Enough material for dystopian books and articles by people like @Cory Doctorow, @Mike Masnick ✅, @Eva Wolfangel, @Taylor Lorenz, @Jeff Jarvis, @Elena Matera, @Gustavo Antúnez 🇺🇾🇦🇷, to mention a few from the #journalism community, more than one #podcast episode by @Tim Pritlove and @linuzifer, or some lifetime legal cases for @Max Schrems are at hand.

What we are facing now is the fact that we need to protect our own and our users' #data and #privacy because of the advanced capabilities of #LLM. We are basically forced to consider changing to private/restricted posts and closing down our servers, as not only are the legal jurisdictions way too scattered across the different countries and ICANN details, but legislation and comprehension by the legislators are simply non-existent, as @Anke Domscheit-Berg could probably agree too.

That is to say, it looks like we need to go dark, a fact that will push us even further towards disappearing, as people will have less chance to see what we are all about, further advancing the advantages of the already established players in the social web space.
Just like Prof. Dr. Peter Kruse stated in his take "The network is challenging us" (min 2:42), more than 14 years ago on YT:
"With semantic understanding we'll have the real big brother. Someone is getting the best out of it and the rest will suffer."
#fediAdmin


#meanWhile ..

.. while the #mastodon community wastes its time trying to pimp up the stars of its #APP on #googlePlay, the #robotsTxt of its instances disallows exactly one #AI bot scraper, leaving the rest free to harvest all public data available about the #fediVerse. Not only on its mother ship but on all instances, so the elonGated can create his target lists of "the enemy inside".

.. good job, well done! ..

> User-agent: GPTBot
> Disallow: /

https://mastodon.social/robots.txt

#fediAdmin



I also tried to create something, but I didn't have any information about what agents are used: https://forum.fedimins.net/t/blockieren-von-bots-und-aehnlichem/126/3

@helpers
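Regarding not knowing which agents are used: a rough sketch that counts the user agents actually hitting the server, assuming an nginx/Apache access log in the usual "combined" format (the log path is an assumption, adjust it for your setup):

# Sketch: count User-Agent strings in an access log to see which crawlers
# actually visit, so you know what to put into robots.txt.
import re
from collections import Counter

LOG = "/var/log/nginx/access.log"  # assumed path

agents = Counter()
with open(LOG, encoding="utf-8", errors="replace") as f:
    for line in f:
        quoted = re.findall(r'"([^"]*)"', line)
        if quoted:
            agents[quoted[-1]] += 1  # last quoted field is the User-Agent

for agent, hits in agents.most_common(30):
    print(f"{hits:8d}  {agent}")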