# iocaine


The deadliest poison known to AI.

This is a tarpit, modeled after Nepenthes, intended to catch unwelcome web crawlers, but with a slightly different, more aggressive usage scenario in mind. The core idea is to configure a reverse proxy to serve content generated by iocaine to AI crawlers, while serving normal content to every other visitor. This differs from Nepenthes, where the idea is to link to the tarpit and trap crawlers that follow the link; with iocaine, the trap is laid by the reverse proxy itself.
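
As a rough sketch of what that reverse-proxy setup might look like, an nginx `map` can route requests whose user agent matches known AI crawlers to iocaine, while everyone else is proxied to the real site. The addresses (iocaine on 127.0.0.1:42069, the real backend on 127.0.0.1:8080) and the user-agent list are assumptions for illustration, not values taken from iocaine's documentation.

```nginx
# Sketch only: pick a backend based on the request's user agent.
# The addresses and the crawler list below are illustrative assumptions.
map $http_user_agent $backend {
    default                                           http://127.0.0.1:8080;   # the real site
    ~*(GPTBot|ClaudeBot|CCBot|Bytespider|Amazonbot)   http://127.0.0.1:42069;  # iocaine
}

server {
    listen 80;
    server_name example.com;

    location / {
        proxy_set_header Host $host;
        proxy_pass $backend;
    }
}
```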

iocaine does not try to slow crawlers down or waste their time that way; that is left up to the reverse proxy. iocaine is purely about generating garbage.
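
If you do want the proxy to slow crawlers down as well, that can be layered on in the same place, for example with nginx's `limit_req`. This is again a sketch under the same assumptions as above; the rate and zone size are arbitrary example values.

```nginx
# Sketch only: rate-limit requests flagged as AI crawlers, leave others alone.
# An empty key means the request is not counted against the limit.
map $http_user_agent $crawler_key {
    default                                           "";
    ~*(GPTBot|ClaudeBot|CCBot|Bytespider|Amazonbot)   $binary_remote_addr;
}

limit_req_zone $crawler_key zone=crawlers:10m rate=1r/s;

server {
    listen 80;

    location / {
        limit_req zone=crawlers burst=5;
        proxy_pass http://127.0.0.1:42069;  # iocaine, as in the sketch above
    }
}
```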

For more information about what this is, how it works, and how to deploy it, have a look at the dedicated website.

Let's make AI poisoning the norm. If we all do it, they won't have anything to crawl.