
Nikto Server Auditing and Resolving Issues.

If you are security conscious and want an easy way to determine which aspects of your server setup are vulnerable to known exploits, you may want to try a server security auditor/scanner. There are lots of security scanning and auditing scripts and apps out there, including some websites that will audit your site and provide a free report.

Here we will look at the basic usage of Nikto 2, some of the common issues it points out, and how to resolve them. This guide is targeted at users running a LAMP stack (Linux, Apache, MySQL and PHP), though much of it may still apply to other setups. Hopefully the information in this article will help get you started.

If you do not have Nikto already, you can download it from the official site at cirt.net.

Usage is relatively simple; just type:

$ perl nikto.pl -h yourdomain.com

Or do a more comprehensive scan with:

$ perl nikto.pl -h yourdomain.com -C all
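
Nikto has a number of other useful options. As a sketch (the port, output file name and report format here are only examples), a scan of an HTTPS site that saves an HTML report might look like:

$ perl nikto.pl -h yourdomain.com -p 443 -ssl -o report.html -Format htm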

Here are some common results you may get, and some things to consider when remedying each issue.

+ Cookie PHPSESSID created without the httponly flag

The HttpOnly flag stops client-side JavaScript from reading the session cookie, which limits the impact of XSS attacks. Enable it in your php.ini and restart Apache:

session.cookie_httponly = True
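
To confirm the change took effect, check that the Set-Cookie response header now carries the HttpOnly flag (this assumes curl is installed and that your site starts a session on the front page):

$ curl -sI http://yourdomain.com/ | grep -i set-cookie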

+ Cookie __cfduid created without the httponly flag

The __cfduid cookie is set by CloudFlare, so you won’t see this if you do not use CloudFlare. It’s nothing to be concerned about.

+ The anti-clickjacking X-Frame-Options header is not present.

To set the X-Frame-Options response header, add the line below to your /etc/httpd/conf/httpd.conf file (it requires mod_headers to be loaded), then restart the httpd server. Note that set is preferable to append here, since X-Frame-Options should only ever carry a single value:

Header always set X-Frame-Options DENY
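
After restarting (the exact command depends on your distribution; this assumes a Red Hat style system, matching the /etc/httpd paths above), you can verify the header is being sent:

$ service httpd restart
$ curl -sI http://yourdomain.com/ | grep -i x-frame-options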

After setting this and running another scan, you might find you now get this:

+ Uncommon header ‘x-frame-options’ found, with contents: DENY

I am not sure why this is; my best guess is that Nikto expects SAMEORIGIN instead of DENY, either of which is fine. Unless you know otherwise, I would just ignore this for now.

+ Uncommon header ‘cf-cache-status’ found, with contents: MISS

This is CloudFlare’s cache status for your site’s assets. Make sure your server’s clock and the caching headers in your httpd.conf are set properly, so that the site’s assets are not interpreted as stale. You may need to log in to CloudFlare, purge the cache, and visit the site a few times to ensure the CloudFlare cache is up to date. Then this issue should be resolved.
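
If you want to see what CloudFlare reports for a particular asset (the path here is just an example), check the header directly; a HIT means the asset was served from CloudFlare’s cache:

$ curl -sI http://yourdomain.com/css/style.css | grep -i cf-cache-status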

+ Server leaks inodes via ETags, header found with file /n8YeaczG.pl, fields: 0x3c3 0x4bbd982a52140

Apache’s default ETag is derived from the file’s inode, size and modification time, which is how the inode numbers leak. Disable ETags in your /etc/httpd/conf/httpd.conf by setting the following (the Header directive again requires mod_headers), then restart Apache:

FileETag None
Header unset ETag
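
After restarting, responses should no longer include an ETag header, which you can confirm with a quick check (no output means the header is gone):

$ curl -sI http://yourdomain.com/ | grep -i etag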

+ robots.txt retrieved but it does not contain any ‘disallow’ entries (which is odd).
+ “robots.txt” contains 7 entries which should be manually viewed.

If you have a robots.txt file, you may receive one of the two messages above. Neither is especially important. A missing disallow entry is not much of a security risk (unless you are exposing sensitive data in HTML files, which is ridiculous and you shouldn’t be doing anyway), but you should still add some default disallow entries.

Search engines use this file to determine which pages on your site should be indexed and cached. By adding disallow entries for pages that present just forms, or content that is useless to your users, Google and other search engines can focus on indexing the pages on your site that really count. This matters because, since Google’s Panda update, pages with a high HTML-to-content ratio can actually hurt your site’s ranking. So for SEO purposes, block form pages and other low-content pages in your robots.txt file. The pages Google does index should then have better content-to-HTML ratios, which will improve your overall page rank.

If you are unsure how to write a robots.txt file, a rough sketch follows.
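
This example is purely illustrative; the disallowed paths are hypothetical and should be replaced with the form and utility pages on your own site:

User-agent: *
Disallow: /cgi-bin/
Disallow: /search/
Disallow: /contact-form/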

+ DEBUG HTTP verb may show server debugging information. See http://msdn.microsoft.com/en-us/library/e8z01xdh%28VS.80%29.aspx for details.

I get this despite using Apache on a Linux machine with no .NET/ASP or other Microsoft technologies on board. I can only presume it’s a cautionary warning emitted for all scans. If you know otherwise, please let me know in the comments, as I could not find anything related to this on a LAMP stack. If you are running IIS with .NET set up, though, check the link in the error message for advice on how to disable the debugging setup.
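
If you want to confirm your server does not honour the verb, you can issue a DEBUG request by hand (the Command: stop-debug header is what the ASP.NET debugging protocol uses; on a LAMP stack you should simply get a 501 or similar error back):

$ curl -i -X DEBUG -H "Command: stop-debug" http://yourdomain.com/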

+ No CGI Directories found (use ‘-C all’ to force check all possible dirs)

If you have denied all traffic to your CGI directories, this is the message you will most likely see, and it is a good thing. However, it is best to do as it says and use the ‘-C all’ option as a precaution. This can be fairly brutal on your server, since it scans every known possible location for CGI directories and more besides, and it will likely take a long time. Should it find any directories still open, make sure you add a directive to deny access to them, as in the sketch below.
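
As a sketch (assuming Apache 2.2 style access control and a /var/www/cgi-bin path, both of which may differ on your system), denying all access to a CGI directory in httpd.conf looks like this:

<Directory "/var/www/cgi-bin">
    Order allow,deny
    Deny from all
</Directory>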

The CGI directories are not the only directories you will likely need to close up. Plesk, for example, leaves a number of directories wide open. See my other post on hardening Plesk for more information on how to fix some of the remaining issues.