May 3rd, 2011, 12:27 AM
Hi, Alex.
I can't speak on behalf of Ben, but I'll try to address this in broad terms, since I'm of the belief that security should be applied in layers; it's never just one thing.
The first thing I'd do with any Form Tools installation is obfuscate the installation directory (either under the public root or in its own subdomain). Naturally, this depends on the application you have in mind. In my case, staff and sysadmins only ever need to access the Control Panel; I never expose it to the public and set up .htaccess rules accordingly.
Something like:
Code:
# Deny everyone by default (Apache 2.2 syntax)
Order deny,allow
Deny from all

# ...or let them through with valid HTTP Basic credentials
AuthName "htaccess password prompt"
AuthUserFile /path/to/.htpasswd
AuthType Basic
Require valid-user

# Allowed IP Address(es)
Allow from 127.0.0.1

# Grant access if EITHER test passes: whitelisted IP or valid login
Satisfy Any
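(If you don't already have an .htpasswd file, Apache's htpasswd utility will generate one for you, e.g. htpasswd -c /path/to/.htpasswd someuser; keep that file outside the public root too.)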
Always place files requiring public access (e.g. forms) outside of the FT directory (this includes upload directories). Always ensure any form you set up for accepting uploads has strict client- and server-side validation (e.g. never allow PHP files to be uploaded; see the sketch below). You get the idea.
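Server-side, something like this minimal sketch (the field name "upload", the whitelist, and the uploads path are just assumptions for the example; client-side checks are a convenience only, since anyone can bypass them):

PHP Code:
<?php
// Whitelist acceptable extensions; blacklisting ".php" alone misses .php5, .phtml, etc.
$allowed = array('jpg', 'jpeg', 'png', 'gif', 'pdf');

$file = isset($_FILES['upload']) ? $_FILES['upload'] : null;
if ($file === null || $file['error'] !== UPLOAD_ERR_OK || !is_uploaded_file($file['tmp_name'])) {
    die('Invalid upload');
}

$ext = strtolower(pathinfo($file['name'], PATHINFO_EXTENSION));
if (!in_array($ext, $allowed)) {
    die('Invalid upload');
}

// Discard the user-supplied filename entirely when storing the file
move_uploaded_file($file['tmp_name'], '/path/to/uploads/' . uniqid('', true) . '.' . $ext);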
Never list the FT directory in your robots.txt file to keep webcrawlers out; robots.txt is world-readable, so all it does is advertise the directory's location. Use .htaccess for this instead (a sketch follows).
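For instance, a minimal sketch of an .htaccess file dropped inside the FT directory itself (the X-Robots-Tag part assumes mod_headers is available):

Code:
# Block all web access outright (Apache 2.2 syntax); crawlers can't index
# what they can't fetch, and nothing is advertised in robots.txt
Order deny,allow
Deny from all

# Belt-and-braces: tell any crawler that still gets a response not to index it
<IfModule mod_headers.c>
    Header always set X-Robots-Tag "noindex, nofollow"
</IfModule>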
One thing I'm curious to get Ben's take on (and this is more a feature request) is to include the following line in all include files that only ever need to be accessed locally by the host:
PHP Code:
<?php // stripos() replaces the deprecated eregi(); same case-insensitive match
if (stripos($_SERVER['PHP_SELF'], 'name_of_file.php') !== false) die('This page is not directly accessible');
Probably extraneous but, again, it adds an extra layer of protection; the FT config file (which contains your db credentials) should probably include this. It essentially stops the script from running when it's requested directly by URL; it only executes when included by another script.
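A common alternative pattern worth comparing (the constant name IN_FORMTOOLS is just a placeholder for this sketch, not something FT actually defines): the sole public entry point defines a constant, and every include refuses to run without it:

PHP Code:
<?php
// index.php -- the only script meant to be requested directly
define('IN_FORMTOOLS', true);
require 'config.php';

<?php
// config.php (and every other include): bail out unless loaded via the entry point
if (!defined('IN_FORMTOOLS')) die('This page is not directly accessible');

Unlike matching against $_SERVER['PHP_SELF'], this doesn't break if a file is renamed and doesn't depend on how the server populates PHP_SELF.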
I could go on but that's how I'd approach things in a broad sense.