ea9972dbfc

With the previous nginx configuration, all requests that matched `$badagent` were rewritten to `/ai`, without preserving the original URL anywhere. As a result, `iocaine` only ever saw `/ai`, and every page ended up being rendered the same. The rewrite should take the original request URI into account, which is what this patch does.

Signed-off-by: Gergely Nagy <me@gergo.csillger.hu>
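The patch itself is not shown here, but the described fix can be sketched as an nginx snippet. This is a hypothetical illustration, not the actual configuration: the `$badagent` map contents, the `/ai` location, and the backend address/port are all assumptions.

```nginx
# Hypothetical sketch of the fix described above; names and the
# backend address are assumptions, not taken from the repository.
map $http_user_agent $badagent {
    default    0;
    ~*gptbot   1;   # example crawler pattern (assumption)
}

server {
    listen 80;

    if ($badagent) {
        # Before: `rewrite ^ /ai;` discarded the original URI, so
        # iocaine saw only "/ai" and rendered every page identically.
        # After: appending $request_uri preserves the original path,
        # so each crawled URL yields different generated content.
        rewrite ^ /ai$request_uri;
    }

    location /ai {
        # Assumed iocaine backend address and port.
        proxy_pass http://127.0.0.1:42069;
    }
}
```

The key change is appending `$request_uri` to the rewrite target, so the original path survives the internal redirect to the `iocaine` backend.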
# iocaine

The deadliest poison known to AI.

This is a tarpit, modeled after Nepenthes, intended to catch unwelcome web crawlers, but with a slightly different, more aggressive intended usage scenario. The core idea is to configure a reverse proxy to serve content generated by iocaine to AI crawlers, but normal content to every other visitor. This differs from Nepenthes, where the idea is to link to it and trap crawlers that way. Not so with iocaine, where the trap is laid by the reverse proxy itself.

iocaine does not try to slow crawlers, nor to waste their time that way; that is left up to the reverse proxy. iocaine is purely about generating garbage.
For more information about what this is, how it works, and how to deploy it, have a look at the dedicated website.
Let's make AI poisoning the norm. If we all do it, they won't have anything to crawl.