author     Arthur <arthur@hoa.ro>          2016-10-12 12:39:52 +0200
committer  GitHub <noreply@github.com>     2016-10-12 12:39:52 +0200
commit     24cfb960cfdd88255333bfb2a08d586916b460ae (patch)
tree       f204fb8ba9e589d4e063c57da01716c487b6e891 /doc/Server-security.md
parent     dc8e03bfc415d3771a45ccd324078a48b7466cbe (diff)
parent     fdf88d194874a533cf3a8de3d317d70018aa8a62 (diff)

Merge pull request #656 from ArthurHoaro/v0.8.0
Bump version to v0.8.0

Diffstat (limited to 'doc/Server-security.md')

-rw-r--r--  doc/Server-security.md | 14
1 file changed, 14 insertions, 0 deletions

diff --git a/doc/Server-security.md b/doc/Server-security.md
index 0d16e284..50549a21 100644
--- a/doc/Server-security.md
+++ b/doc/Server-security.md
@@ -58,3 +58,17 @@ before = common.conf
failregex = \s-\s<HOST>\s-\sLogin failed for user.*$
ignoreregex =
```

## Robots - Restricting search engines and web crawler traffic

Creating a `robots.txt` file with the following contents at the root of your Shaarli installation will prevent _honest_ web crawlers from indexing every link and Daily page of the instance, thus getting rid of a certain amount of unsolicited network traffic.

```
User-agent: *
Disallow: /
```

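Note that `robots.txt` is purely advisory: it only restrains crawlers that choose to honor it. The same indexing policy can also be declared in each page's HTML `<head>` (see the meta.html link below). A minimal sketch, assuming you are willing to edit your Shaarli page template's header:

```
<!-- Hypothetical addition to the <head> of a Shaarli page template: -->
<!-- asks compliant crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```
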
See:
- http://www.robotstxt.org/
- http://www.robotstxt.org/robotstxt.html
- http://www.robotstxt.org/meta.html
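
When editing templates is not practical, or to cover non-HTML resources, the same hint can be sent as an HTTP response header instead. A sketch under the assumption that Shaarli is served by Apache with `mod_headers` enabled:

```
# Assumes Apache with mod_headers enabled (a2enmod headers).
# Sends the same noindex/nofollow hint as an HTTP header,
# which also applies to non-HTML responses such as images.
Header set X-Robots-Tag "noindex, nofollow"
```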