author    ArthurHoaro <arthur@hoa.ro>  2016-10-12 12:36:59 +0200
committer ArthurHoaro <arthur@hoa.ro>  2016-10-12 12:36:59 +0200
commit    fdf88d194874a533cf3a8de3d317d70018aa8a62 (patch)
tree      f204fb8ba9e589d4e063c57da01716c487b6e891 /doc/Server-security.md
parent    dc8e03bfc415d3771a45ccd324078a48b7466cbe (diff)
Bump version to v0.8.0
Signed-off-by: ArthurHoaro <arthur@hoa.ro>
Diffstat (limited to 'doc/Server-security.md')

 doc/Server-security.md | 14 ++++++++++++++
 1 file changed, 14 insertions(+), 0 deletions(-)
diff --git a/doc/Server-security.md b/doc/Server-security.md
index 0d16e284..50549a21 100644
--- a/doc/Server-security.md
+++ b/doc/Server-security.md
@@ -58,3 +58,17 @@ before = common.conf
 failregex = \s-\s<HOST>\s-\sLogin failed for user.*$
 ignoreregex =
 ```
+
+## Robots - Restricting search engines and web crawler traffic
+
+Creating a `robots.txt` with the following contents at the root of your Shaarli installation will prevent _honest_ web crawlers from indexing each and every link and Daily page from a Shaarli instance, thus getting rid of a certain amount of unsolicited network traffic.
+
+```
+User-agent: *
+Disallow: /
+```
+
+See:
+- http://www.robotstxt.org/
+- http://www.robotstxt.org/robotstxt.html
+- http://www.robotstxt.org/meta.html
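As a quick sanity check, the blanket `User-agent: * / Disallow: /` rules added above can be exercised with Python's standard-library robots.txt parser. This is only a sketch: the Shaarli instance URL `https://example.org/shaarli/` is a hypothetical placeholder, and the check only models what a compliant crawler would do.

```python
# Sketch: verify that the robots.txt rules from the diff above block
# every path for every compliant crawler, using the stdlib parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# "Disallow: /" under "User-agent: *" applies to all user agents and all paths,
# so both a named bot and the wildcard agent are denied.
print(parser.can_fetch("Googlebot", "https://example.org/shaarli/?do=daily"))  # False
print(parser.can_fetch("*", "https://example.org/shaarli/"))  # False
```

Note that, as the paragraph above says, this only deters _honest_ crawlers that fetch and obey `robots.txt`; it is traffic reduction, not access control.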