commit fdf88d194874a533cf3a8de3d317d70018aa8a62
tree   f204fb8ba9e589d4e063c57da01716c487b6e891 /doc/Server-security.md
parent dc8e03bfc415d3771a45ccd324078a48b7466cbe
author    ArthurHoaro <arthur@hoa.ro>  2016-10-12 12:36:59 +0200
committer ArthurHoaro <arthur@hoa.ro>  2016-10-12 12:36:59 +0200
Bump version to v0.8.0
Signed-off-by: ArthurHoaro <arthur@hoa.ro>
Diffstat (limited to 'doc/Server-security.md')
 doc/Server-security.md | 14 ++++++++++++++
 1 file changed, 14 insertions(+), 0 deletions(-)
diff --git a/doc/Server-security.md b/doc/Server-security.md
index 0d16e284..50549a21 100644
--- a/doc/Server-security.md
+++ b/doc/Server-security.md
@@ -58,3 +58,17 @@ before = common.conf
 failregex = \s-\s<HOST>\s-\sLogin failed for user.*$
 ignoreregex =
 ```
+
+## Robots - Restricting search engines and web crawler traffic
+
+Creating a `robots.txt` with the following content at the root of your Shaarli installation will prevent _honest_ web crawlers from indexing each link and Daily page of the instance, thus reducing unsolicited network traffic.
+
+```
+User-agent: *
+Disallow: /
+```
+
+See:
+- http://www.robotstxt.org/
+- http://www.robotstxt.org/robotstxt.html
+- http://www.robotstxt.org/meta.html
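
The last link above covers the robots `<meta>` tag, a per-page alternative to `robots.txt` for crawlers that honor it. A minimal sketch of that tag, placed in a page's HTML `<head>` — this is the generic mechanism documented at robotstxt.org, not something this commit adds to Shaarli's templates:

```html
<!-- Advisory hint for compliant crawlers: do not index this page
     and do not follow any links found on it. -->
<meta name="robots" content="noindex, nofollow">
```

Like `robots.txt`, this is purely advisory; misbehaving crawlers ignore both, which is why server-side measures such as the fail2ban rules above remain useful.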