
iocaine


The deadliest poison known to AI.

This is a tarpit, modeled after Nepenthes, intended to catch unwelcome web crawlers, but with a slightly different, more aggressive usage scenario in mind. The core idea is to configure a reverse proxy to serve content generated by iocaine to AI crawlers, but normal content to every other visitor. This differs from Nepenthes, which relies on linking to it so that crawlers wander in on their own; with iocaine, the trap is laid by the reverse proxy itself.
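As a rough illustration of that reverse-proxy setup, here is a minimal nginx sketch. The addresses, port numbers, and user-agent patterns are all assumptions for the example; adjust them to your own deployment and consult the documentation for the real thing.

```nginx
# Route known AI crawlers to iocaine, everyone else to the real site.
# Assumed: the real site listens on 127.0.0.1:8080, iocaine on
# 127.0.0.1:42069 -- both are placeholders.
map $http_user_agent $backend {
    default                                 "http://127.0.0.1:8080";
    "~*(GPTBot|ClaudeBot|CCBot|Bytespider)" "http://127.0.0.1:42069";
}

server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass $backend;
        proxy_set_header Host $host;
    }
}
```

Because `proxy_pass` uses a variable here, stick to IP-literal upstreams as above (hostnames would additionally require a `resolver` directive).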

iocaine does not try to slow crawlers down, nor to waste their time that way; that is left up to the reverse proxy. iocaine is purely about generating garbage.

For more information about what this is, how it works, and how to deploy it, have a look at the dedicated website.

Let's make AI poisoning the norm. If we all do it, they won't have anything to crawl.