path: root/doc/Server-security.md
author    Arthur <arthur@hoa.ro>    2016-10-12 12:39:52 +0200
committer    GitHub <noreply@github.com>    2016-10-12 12:39:52 +0200
commit    24cfb960cfdd88255333bfb2a08d586916b460ae (patch)
tree    f204fb8ba9e589d4e063c57da01716c487b6e891 /doc/Server-security.md
parent    dc8e03bfc415d3771a45ccd324078a48b7466cbe (diff)
parent    fdf88d194874a533cf3a8de3d317d70018aa8a62 (diff)
Merge pull request #656 from ArthurHoaro/v0.8.0 (tag: v0.8.0)
Bump version to v0.8.0
Diffstat (limited to 'doc/Server-security.md')
-rw-r--r--    doc/Server-security.md    14
1 file changed, 14 insertions(+), 0 deletions(-)
diff --git a/doc/Server-security.md b/doc/Server-security.md
index 0d16e284..50549a21 100644
--- a/doc/Server-security.md
+++ b/doc/Server-security.md
@@ -58,3 +58,17 @@ before = common.conf
 failregex = \s-\s<HOST>\s-\sLogin failed for user.*$
 ignoreregex =
 ```
+
+## Robots - Restricting search engines and web crawler traffic
+
+Creating a `robots.txt` with the following contents at the root of your Shaarli installation will prevent _honest_ web crawlers from indexing every link and Daily page of a Shaarli instance, thus reducing unsolicited network traffic.
+
+```
+User-agent: *
+Disallow: /
+```
+
+See:
+- http://www.robotstxt.org/
+- http://www.robotstxt.org/robotstxt.html
+- http://www.robotstxt.org/meta.html
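+
+As a complement, `robots.txt` works site-wide; the `meta.html` reference above describes a per-page alternative using a robots `<meta>` tag. A minimal sketch, assuming you can edit the `<head>` of Shaarli's page template (the exact template file is not specified here):
+
+```html
+<!-- Hypothetical placement: inside the <head> of the page template. -->
+<!-- Asks compliant crawlers not to index this page or follow its links. -->
+<meta name="robots" content="noindex, nofollow">
+```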