author    nodiscc <nodiscc@gmail.com>  2017-01-26 18:52:54 +0100
committer nodiscc <nodiscc@gmail.com>  2017-06-18 00:19:49 +0200
commit    53ed6d7d1e678d7486337ce67a2f17b30bac21ac (patch)
tree      f8bef0164a70bd03d2b9781951c01bdd018f1842 /doc/Server-security.md
parent    d5d22a6d07917865c44148ad76f43c65a929a890 (diff)
Generate HTML documentation using MkDocs (WIP)

MkDocs is a static site generator geared towards building project
documentation. Documentation source files are written in Markdown, and
configured with a single YAML file.

* http://www.mkdocs.org/
* http://www.mkdocs.org/user-guide/configuration/

Ref. #312

* remove pandoc-generated HTML documentation
* move markdown doc to doc/md/
* mkdocs.yml:
    * generate HTML doc in doc/html
    * add pages TOC/ordering
    * use index.md as index page
* Makefile: remove execute permissions from generated files
* Makefile: rewrite htmlpages GFM to markdown conversion using sed: the awk
  expression also matched '][', which causes invalid output on complex links
  with images or code blocks
* Add mkdocs.yml to .gitattributes, exclude this file from release archives
* Makefile: rename htmldoc -> doc_html target
* run make doc: pull latest markdown documentation from wiki
* run make htmlpages: update html documentation
Diffstat (limited to 'doc/Server-security.md')

 -rw-r--r--  doc/Server-security.md  74

1 file changed, 0 insertions, 74 deletions
diff --git a/doc/Server-security.md b/doc/Server-security.md
deleted file mode 100644
index 50549a21..00000000
--- a/doc/Server-security.md
+++ /dev/null
@@ -1,74 +0,0 @@
# Server security
## php.ini
PHP settings are defined in:
- a main configuration file, usually found under `/etc/php5/php.ini`; some distributions provide different configuration environments, e.g.
    - `/etc/php5/php.ini` - used when running console scripts
    - `/etc/php5/apache2/php.ini` - used when a client requests PHP resources from Apache
    - `/etc/php5/php-fpm.conf` - used when PHP requests are proxied to PHP-FPM
- additional configuration files/entries, depending on the installed/enabled extensions:
    - `/etc/php/conf.d/xdebug.ini`

### Locate .ini files
#### Console environment
```bash
$ php --ini
Configuration File (php.ini) Path: /etc/php
Loaded Configuration File: /etc/php/php.ini
Scan for additional .ini files in: /etc/php/conf.d
Additional .ini files parsed: /etc/php/conf.d/xdebug.ini
```

#### Server environment
- create a `phpinfo.php` script (contents below) located in a path served by the web server, e.g.
    - Apache (with user dirs enabled): `/home/myself/public_html/phpinfo.php`
    - `/var/www/test/phpinfo.php`
- make sure the script is readable by the web server user/group (usually `www`, `www-data` or `httpd`)
- access the script from a web browser
- look at the _Loaded Configuration File_ and _Scan this dir for additional .ini files_ entries

```php
<?php phpinfo(); ?>
```
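
Once the script is in place, it can also be queried from the command line; a minimal sketch, assuming the hypothetical `/var/www/test/phpinfo.php` location above is served at `http://localhost/test/phpinfo.php`:

```bash
# Hypothetical URL - adjust to wherever phpinfo.php is actually served
curl -s http://localhost/test/phpinfo.php | grep -i 'Loaded Configuration File'

# phpinfo() discloses detailed server information:
# remove the script once the check is done
rm /var/www/test/phpinfo.php
```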

## fail2ban
`fail2ban` is an intrusion prevention framework that reads server logs (Apache, SSH, etc.) and uses `iptables` rules to block brute-force attempts:
- [Official website](http://www.fail2ban.org/wiki/index.php/Main_Page)
- [Source code](https://github.com/fail2ban/fail2ban)

### Read Shaarli logs to ban IPs
Example configuration:
- allow 3 login attempts per IP address
- after 3 failures, permanently ban the corresponding IP address

`/etc/fail2ban/jail.local`
```ini
[shaarli-auth]
enabled = true
port = https,http
filter = shaarli-auth
logpath = /var/www/path/to/shaarli/data/log.txt
maxretry = 3
bantime = -1
```

`/etc/fail2ban/filter.d/shaarli-auth.conf`
```ini
[INCLUDES]
before = common.conf

[Definition]
failregex = \s-\s<HOST>\s-\sLogin failed for user.*$
ignoreregex =
```
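
Before reloading fail2ban, the filter can be dry-run against the actual log file with fail2ban's bundled `fail2ban-regex` tool. A quick check, assuming the log path from the jail configuration above:

```bash
# Dry-run the filter against Shaarli's log; matching lines are reported as hits.
# A matching line, reconstructed from the failregex, would look roughly like:
#   2017-01-26 18:52:54 - 1.2.3.4 - Login failed for user admin
fail2ban-regex /var/www/path/to/shaarli/data/log.txt \
    /etc/fail2ban/filter.d/shaarli-auth.conf

# Reload fail2ban and confirm the jail is active
sudo service fail2ban restart
sudo fail2ban-client status shaarli-auth
```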

## Robots - Restricting search engines and web crawler traffic

Creating a `robots.txt` file with the following contents at the root of your Shaarli installation will prevent _honest_ web crawlers from indexing each and every link and Daily page from a Shaarli instance, thus avoiding a certain amount of unsolicited network traffic.

```
User-agent: *
Disallow: /
```
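
To confirm the file is actually being served, fetch it over HTTP; a quick check, assuming the instance lives at the hypothetical address `https://shaarli.example.com`:

```bash
# Hypothetical URL - replace with your own Shaarli instance
curl -s https://shaarli.example.com/robots.txt
# Expected output:
#   User-agent: *
#   Disallow: /
```

Note that `robots.txt` is purely advisory: well-behaved crawlers honor it, while malicious bots simply ignore it.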

See:
- http://www.robotstxt.org/
- http://www.robotstxt.org/robotstxt.html
- http://www.robotstxt.org/meta.html