author    nodiscc <nodiscc@gmail.com>  2017-01-26 18:52:54 +0100
committer nodiscc <nodiscc@gmail.com>  2017-06-18 00:19:49 +0200
commit    53ed6d7d1e678d7486337ce67a2f17b30bac21ac (patch)
tree      f8bef0164a70bd03d2b9781951c01bdd018f1842 /doc/md/Server-security.md
parent    d5d22a6d07917865c44148ad76f43c65a929a890 (diff)
Generate HTML documentation using MkDocs (WIP)
MkDocs is a static site generator geared towards building project documentation. Documentation source files are written in Markdown, and configured with a single YAML file.

* http://www.mkdocs.org/
* http://www.mkdocs.org/user-guide/configuration/

Ref. #312

* remove pandoc-generated HTML documentation
* move markdown doc to doc/md/
* mkdocs.yml:
    * generate HTML doc in doc/html
    * add pages TOC/ordering
    * use index.md as index page
* Makefile: remove execute permissions from generated files
* Makefile: rewrite htmlpages GFM to markdown conversion using sed: the awk expression also matched '][', which caused invalid output on complex links with images or code blocks
* add mkdocs.yml to .gitattributes, exclude this file from release archives
* Makefile: rename htmldoc -> doc_html target
* run make doc: pull latest markdown documentation from wiki
* run make htmlpages: update html documentation
Diffstat (limited to 'doc/md/Server-security.md')
-rw-r--r--  doc/md/Server-security.md  73
1 file changed, 73 insertions, 0 deletions
diff --git a/doc/md/Server-security.md b/doc/md/Server-security.md
new file mode 100644
index 00000000..8df36f46
--- /dev/null
+++ b/doc/md/Server-security.md
@@ -0,0 +1,73 @@
## php.ini
PHP settings are defined in:
- a main configuration file, usually found under `/etc/php5/php.ini`; some distributions provide different configuration environments, e.g.
    - `/etc/php5/php.ini` - used when running console scripts
    - `/etc/php5/apache2/php.ini` - used when a client requests PHP resources from Apache
    - `/etc/php5/php-fpm.conf` - used when PHP requests are proxied to PHP-FPM
- additional configuration files/entries, depending on the installed/enabled extensions:
    - `/etc/php/conf.d/xdebug.ini`
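
An extension's configuration file usually contains little more than the directive loading the extension; the exact content is illustrative here and depends on the distribution and extension:

```ini
; /etc/php/conf.d/xdebug.ini - illustrative content, paths/names may differ
zend_extension=xdebug.so
```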

### Locate .ini files
#### Console environment
```bash
$ php --ini
Configuration File (php.ini) Path: /etc/php
Loaded Configuration File: /etc/php/php.ini
Scan for additional .ini files in: /etc/php/conf.d
Additional .ini files parsed: /etc/php/conf.d/xdebug.ini
```
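
The same information can also be retrieved programmatically using the standard `php_ini_loaded_file()` and `php_ini_scanned_files()` functions; the output below simply mirrors the example above and will reflect your own setup:

```bash
# print the path of the loaded php.ini
$ php -r 'echo php_ini_loaded_file(), "\n";'
/etc/php/php.ini

# print the additional .ini files parsed (comma-separated list)
$ php -r 'echo php_ini_scanned_files(), "\n";'
/etc/php/conf.d/xdebug.ini
```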

#### Server environment
- create a `phpinfo.php` script located in a path supported by the web server, e.g.
    - Apache (with user dirs enabled): `/home/myself/public_html/phpinfo.php`
    - `/var/www/test/phpinfo.php`
- make sure the script is readable by the web server user/group (usually `www`, `www-data` or `httpd`)
- access the script from a web browser
- look at the _Loaded Configuration File_ and _Scan this dir for additional .ini files_ entries

The script itself only needs to call `phpinfo()`:
```php
<?php phpinfo(); ?>
```
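
The script can also be queried from the command line; the URL below assumes the Apache user-dir example above. Since `phpinfo()` discloses detailed server information, remove the script once done:

```bash
# fetch the page and extract the relevant entry (URL assumes the user-dir example above)
curl -s http://localhost/~myself/phpinfo.php | grep -i 'Loaded Configuration File'

# phpinfo() leaks server details - delete the script when finished
rm /home/myself/public_html/phpinfo.php
```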

## fail2ban
`fail2ban` is an intrusion prevention framework that reads server logs (Apache, SSH, etc.) and uses `iptables` rules to block brute-force login attempts:
- [Official website](http://www.fail2ban.org/wiki/index.php/Main_Page)
- [Source code](https://github.com/fail2ban/fail2ban)

### Read Shaarli logs to ban IPs
Example configuration:
- allow 3 login attempts per IP address
- after 3 failures, permanently ban the corresponding IP address

`/etc/fail2ban/jail.local`
```ini
[shaarli-auth]
enabled = true
port = https,http
filter = shaarli-auth
logpath = /var/www/path/to/shaarli/data/log.txt
maxretry = 3
bantime = -1
```
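
Since `bantime = -1` bans offending addresses permanently, it is worth knowing how to lift a ban by hand; this uses standard `fail2ban-client` syntax, with a placeholder IP address:

```bash
# lift a permanent ban for a given IP (placeholder address, jail must be running)
sudo fail2ban-client set shaarli-auth unbanip 203.0.113.1
```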

`/etc/fail2ban/filter.d/shaarli-auth.conf`
```ini
[INCLUDES]
before = common.conf

[Definition]
failregex = \s-\s<HOST>\s-\sLogin failed for user.*$
ignoreregex =
```
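
Before enabling the jail, the filter can be tested against the actual log file with `fail2ban-regex`, which reports how many lines the `failregex` matches; once both files are in place, reload fail2ban and check the jail status:

```bash
# test the filter's failregex against Shaarli's log (adjust the path to your installation)
fail2ban-regex /var/www/path/to/shaarli/data/log.txt /etc/fail2ban/filter.d/shaarli-auth.conf

# apply the new configuration and verify the jail is active
sudo fail2ban-client reload
sudo fail2ban-client status shaarli-auth
```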

## Robots - Restricting search engines and web crawler traffic

Creating a `robots.txt` file with the following contents at the root of your Shaarli installation will prevent _honest_ web crawlers from indexing each and every link and Daily page from a Shaarli instance, thus reducing unsolicited network traffic.

```
User-agent: *
Disallow: /
```
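
For crawlers that honor per-page directives, the same effect can be achieved with a robots `<meta>` tag in each page's `<head>`, as described in the meta.html reference below; a minimal example:

```html
<!-- ask honest crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```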

See:
- http://www.robotstxt.org/
- http://www.robotstxt.org/robotstxt.html
- http://www.robotstxt.org/meta.html