
iocaine

Badges: Build status · Container image · Demo · Documentation

The deadliest poison known to AI.

This is a tarpit, modeled after Nepenthes, intended to catch unwelcome web crawlers, but with a slightly different, more aggressive usage scenario. The core idea is to configure a reverse proxy to serve content generated by iocaine to AI crawlers, and normal content to every other visitor. This differs from Nepenthes, where the idea is to link to the tarpit and trap crawlers that follow the link; with iocaine, the trap is laid by the reverse proxy itself.
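As a rough illustration of that routing, an nginx sketch might look like the one below. The upstream addresses, the port iocaine listens on, and the user-agent patterns are assumptions for illustration only; see the documentation for the actual deployment guides.

```nginx
# Sketch only: route requests whose User-Agent matches known AI crawlers
# to iocaine, and everything else to the real site. The addresses, the
# port iocaine listens on, and the crawler list are placeholders.
map $http_user_agent $backend {
    default         "http://127.0.0.1:8080";   # the real site (placeholder)
    "~*GPTBot"      "http://127.0.0.1:42069";  # iocaine (placeholder port)
    "~*ClaudeBot"   "http://127.0.0.1:42069";
    "~*CCBot"       "http://127.0.0.1:42069";
}

server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass $backend;
    }
}
```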

iocaine does not try to slow crawlers down, nor to waste their time that way; that is left to the reverse proxy. iocaine is purely about generating garbage.
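If you do want to throttle the matched crawlers as well, that belongs in the reverse proxy configuration rather than in iocaine. A minimal nginx sketch, reusing the hypothetical $backend map from the example above; the zone name, rate, and burst values are arbitrary examples:

```nginx
# Sketch only: rate limit just the matched crawlers. Requests whose
# $limit_key is empty are not rate limited by nginx.
map $http_user_agent $limit_key {
    default        "";
    "~*GPTBot"     $binary_remote_addr;
    "~*ClaudeBot"  $binary_remote_addr;
    "~*CCBot"      $binary_remote_addr;
}

limit_req_zone $limit_key zone=slow_crawlers:10m rate=1r/s;

server {
    listen 80;
    server_name example.com;

    location / {
        limit_req zone=slow_crawlers burst=3;
        proxy_pass $backend;
    }
}
```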

For more information about what this is, how it works, and how to deploy it, have a look at the dedicated website.

Let's make AI poisoning the norm. If we all do it, they won't have anything to crawl.