fix robots.txt

currently the robots.txt file is useless because the entire
Disallow value is interpreted as a single path:
"/icons/ /fonts/ *.js *.css". the only URLs that pattern would
match (and therefore disallow for robots) look like
`https://cobalt.tools/icons/ /fonts/ bla.js .css`,
which is obviously nonsense & useless.
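
for illustration (not part of the commit): a minimal python sketch of how a
wildcard-aware crawler would match that single combined pattern. the helper
`rule_matches` and the example paths are made up for this demo, and real
parsers also handle `$` anchors, percent-encoding and longest-match precedence.

```python
import re

def rule_matches(disallow_pattern: str, url_path: str) -> bool:
    # anchor the Disallow pattern at the start of the URL path and let '*'
    # match any run of characters (simplified robots.txt wildcard matching)
    regex = "^" + ".*".join(re.escape(part) for part in disallow_pattern.split("*"))
    return re.match(regex, url_path) is not None

# the old file: everything after "Disallow:" is treated as ONE pattern
old_rule = "/icons/ /fonts/ *.js *.css"
print(rule_matches(old_rule, "/icons/favicon.png"))           # False -- not blocked
print(rule_matches(old_rule, "/fonts/font.woff2"))            # False -- not blocked
print(rule_matches(old_rule, "/icons/ /fonts/ bla.js .css"))  # True -- only nonsense like this is blocked
```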
dumbmoron 2023-11-10 18:27:18 +01:00
parent 463ece02c7
commit d936dd73fe
GPG key ID: C59997C76C6A8E5F


@@ -1,2 +1,5 @@
 User-Agent: *
-Disallow: /icons/ /fonts/ *.js *.css
+Disallow: /icons/
+Disallow: /fonts/
+Disallow: /*.js
+Disallow: /*.css
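
with the rules split onto separate lines, the same `rule_matches` sketch from
above now disallows the intended paths (the example URLs are hypothetical):

```python
# reuses rule_matches() from the sketch above
new_rules = ["/icons/", "/fonts/", "/*.js", "/*.css"]

for path in ("/icons/favicon.png", "/fonts/font.woff2", "/assets/app.js", "/"):
    blocked = any(rule_matches(rule, path) for rule in new_rules)
    print(path, "disallowed" if blocked else "allowed")

# /icons/favicon.png disallowed
# /fonts/font.woff2  disallowed
# /assets/app.js     disallowed  (matches "/*.js")
# /                  allowed
```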