
Thread: Securing my web server

  1. #11
    Senior Member
    Join Date
    Jan 2002
    Posts
    1,207
    Every website you go to exchanges HTTP headers with your browser. One of these is the Server header, and it looks like this:

    Server: Apache/1.3.26 (Trustix Secure Linux/Linux) mod_perl/1.26
    This header is easily reduced in Apache; just look at the ServerTokens directive in the config file.

    http://httpd.apache.org/docs/mod/core.html#servertokens
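
    As a sketch of what that directive does (header values per the Apache 1.3 docs linked above; the exact string varies with your build and modules):

    ```apache
    # In httpd.conf -- controls how much the Server header reveals:
    #   Full -> Server: Apache/1.3.26 (Unix) mod_perl/1.26
    #   OS   -> Server: Apache/1.3.26 (Unix)
    #   Min  -> Server: Apache/1.3.26
    #   Prod -> Server: Apache
    ServerTokens Prod
    ```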

  2. #12
    Senior Member
    Join Date
    Oct 2001
    Posts
    255
    May I also butt in to say: make sure the scanner doesn't perform DoS testing (ping of death, SYN floods), as that would be the last thing you want it to do.

    I recommend going with the old computer running BSD/Linux with ipchains, and keeping your packages updated.
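
    As a rough sketch of what such a box might run (ipchains syntax for a 2.2-era kernel; the web server's address here is hypothetical):

    ```shell
    # Default-deny inbound traffic.
    ipchains -P input DENY
    # Allow HTTP to the web server behind this box (example address).
    ipchains -A input -p tcp -d 192.168.1.10 80 -j ACCEPT
    # Allow replies to TCP connections the inside initiated (non-SYN packets).
    ipchains -A input -p tcp ! -y -j ACCEPT
    ```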

    Preep

  3. #13
    Senior Member
    Join Date
    Jun 2002
    Posts
    148
    Using the suggestions, I downloaded a few vulnerability scanners: one called Stealth HTTP vulnerability scanner version 2 build 47, and a few CGI vulnerability scanners.

    I could not run Nessus to scan my computer from localhost; it gave me an error. If I remember correctly, I tried a few scanners.

    Stealth did find two possible bugs. One was my robots.txt file, which contains instructions to search engines as to which directories may be spidered and which may not. I clicked a button for more information and it searched Google for me and found a few articles. Apparently, because robots.txt is known to attackers and can be read by anyone, it can expose information about secret files and directories. To fix this problem I might delete robots.txt and use the robots meta tag to tell search engines whether or not they can index and follow a page:

    <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">

    instead of

    robots.txt:
    User-agent: *
    Disallow: /secret/
    Disallow: /passwords/secret/
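
    A toy illustration of why robots.txt leaks information: it is plain text that any client can fetch, so its Disallow lines read as a map of "interesting" paths. This sketch just parses the hypothetical example above:

    ```python
    # robots.txt is served to anyone who asks; the Disallow lines
    # below are the hypothetical paths from the example above.
    robots_txt = """\
    User-agent: *
    Disallow: /secret/
    Disallow: /passwords/secret/
    """

    def disallowed_paths(text):
        """Collect every path listed after a Disallow: rule."""
        paths = []
        for line in text.splitlines():
            line = line.strip()
            if line.lower().startswith("disallow:"):
                paths.append(line.split(":", 1)[1].strip())
        return paths

    print(disallowed_paths(robots_txt))  # an attacker sees exactly this list
    ```
    
    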

    And the other possible bug found was test.shtml. But when I looked it up on Google, it appeared that only certain web servers, such as IIS, could have an exploit involving test.shtml.

    I will try some experimenting to see if my test.shtml can be exploited.

    Then the CGI scanner found an exploit. So all together I only had maybe three exploits, combining the HTTP and CGI ones.

    Previous versions of my software allowed an attacker to get a directory listing if they typed in a URL, then a file name that exists, then %00:

    http://localhost/mydirectory/index.shtm/%00

    That issue has since been fixed.
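
    Null-byte bugs like this usually come from handlers written in C, where a string ends at the first \0: the validation code saw one file name, but the truncated string changed what the server actually served. A minimal defensive sketch (a hypothetical handler, not the poster's actual software) is to reject any decoded path containing a null byte:

    ```python
    from urllib.parse import unquote

    def safe_path(raw_url_path):
        """Decode a URL path and refuse anything containing a null byte,
        which older C-based handlers would silently truncate at."""
        decoded = unquote(raw_url_path)
        if "\x00" in decoded:
            raise ValueError("null byte in path")
        return decoded

    # safe_path("/mydirectory/index.shtm") decodes normally;
    # safe_path("/mydirectory/index.shtm/%00") raises ValueError.
    ```
    
    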

    Other than Nessus, are there any other scanners you recommend?
    In snatches, they learn something of the wisdom
    which is of good, and more of the mere knowledge which is of evil. But must I know what must not come, for I shale become those of knowledgedome. Peace~
