Hidden directories

  1. #1
    Member
    Join Date
    Mar 2004
    Posts
    94

    Hidden directories

While using Nessus to verify security on some of my servers, I noticed it checks for certain directories on web servers. This is done by a plugin written by H D Moore that contains 685 common directory names. This makes me wonder... is there a tool that will attempt to find "hidden" directories on a web server using either a dictionary or brute force?
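
A minimal sketch of the kind of tool being asked about: request each name from a wordlist against a target and report which ones the server admits to. The wordlist entries and the target URL here are made-up examples, not anything from the Nessus plugin.

```python
import urllib.request
import urllib.error

def reveals_directory(status):
    """A non-404 answer (200, 301, 401, 403...) suggests the path exists."""
    return status != 404

def probe_dirs(base_url, wordlist, timeout=5):
    """Return (name, status) pairs for wordlist entries the server answers for."""
    found = []
    for name in wordlist:
        url = f"{base_url.rstrip('/')}/{name}/"
        req = urllib.request.Request(url, method="HEAD")
        try:
            status = urllib.request.urlopen(req, timeout=timeout).status
        except urllib.error.HTTPError as e:
            status = e.code              # 403/401 still reveal the directory
        except urllib.error.URLError:
            continue                     # host unreachable; skip this entry
        if reveals_directory(status):
            found.append((name, status))
    return found

# e.g. probe_dirs("http://target.example", ["admin", "backup", "old"])
```

As The Duck points out below, a run like this is trivially visible in the server logs: hundreds of HEAD requests, mostly 404s, from one address.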

  2. #2
    AFLAAACKKK!!
    Join Date
    Apr 2004
    Posts
    1,066
You mean from an attacker's point of view? I don't know of any, but if there were one and someone used it, that person would be very dumb. It would be so easy to spot that kind of attack just by looking at the logs.
    I am the uber duck!!1
    Proxy Tools

  3. #3
    Senior Member
    Join Date
    Aug 2003
    Posts
    1,019
    is there a tool that will attempt to find "hidden" directories on a web server using either a dictionary or brute force?
I'm not exactly sure why you would want to do that, especially since there is readily available site-mapping software...?

    Or am I just not wrapping my brain around the question properly?

  4. #4
    Senior Member
    Join Date
    Jan 2003
    Posts
    3,914
    Hey Hey,

That question's not skiddieish at all. I'd suggest you be careful, because while you may be johnathan's daddy, if you stick with questions like that you'll end up calling a bunch of 350 lb men named Bubba "daddy". I don't think you want to do that.

Find a "hidden" directory on a web server, eh? Let's see... Nessus searches for common directories, and you can use a crawler to follow all linked directories. As for truly hidden directories... they could be named anything, and you have no reference point to search from. About the only thing you can do is check for a robots.txt at the root of their web server. This file contains directories which they don't want robots to crawl.

    Source: www.adventive.com/tools/SEO.html
A text file stored in the top level directory of a web site to deny access by robots to certain pages or sub-directories of the site. Only robots which comply with the Robots Exclusion Standard will read and obey the commands in this file. Robots will read this file on each visit, so that pages or areas of sites can be made public or private at any time by changing the content of robots.txt before re-submitting to the search engines. The simple example below attempts to prevent all robots from visiting the /secret directory:

    User-agent: *
    Disallow: /secret
    For a tutorial on Robots.txt visit - http://www.searchengineworld.com/rob...s_tutorial.htm
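
So an attacker's first stop could be exactly that file. A small sketch of pulling the Disallow: entries out of a robots.txt body (the parsing here is my own simplification of the Robots Exclusion Standard, ignoring User-agent grouping):

```python
def disallowed_paths(robots_txt):
    """Return the paths listed in Disallow: lines of a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split('#', 1)[0].strip()   # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:                            # an empty Disallow means "allow all"
                paths.append(path)
    return paths

# disallowed_paths("User-agent: *\nDisallow: /secret")  ->  ["/secret"]
```

Every path that comes back is a directory the site owner cared enough about to hide from crawlers, which is exactly why the file cuts both ways.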

    Peace,
    HT
    IT Blog: .:Computer Defense:.
    PnCHd (Pronounced Pinched): Acronym - Point 'n Click Hacked. As in: "That website was pinched" or "The skiddie pinched my computer because I forgot to patch".

  5. #5
    Member
    Join Date
    Mar 2004
    Posts
    94
The Duck: agreed, it would be dumb, but the world (and its prisons) is full of people who do dumb things. Nessus/Nikto/etc. scanners are also easy to pick out in a log file.

Groovicus: site-mapping software doesn't find the directories which are not linked to. I have used such directories (as I'm sure others have) and relied on security by obscurity as *part* of the security for them. If an attacker doesn't know a directory is there, it's harder to exploit the weak passwords that certain individuals use.

    HTRegz: Yes, it does sound a bit skiddieish, but I was just wondering. I have no worries about having to call Bubba daddy. Thanks for the info on robots.txt. It makes me wonder... (here I go again)... can you include robots.txt in the robots.txt file so would-be attackers can't google "inurl:robots.txt site:mysite.com"? I didn't see any mention of this in your references.

  6. #6
    AO Part Timer
    Join Date
    Feb 2003
    Posts
    332
    About the only thing you can do is check for a robots.txt on the root of their webserver. This file contains directories which you don't want robots to crawl.
It is funny you mention robots.txt. I always thought it was kind of a double-edged sword. Isn't it kind of like leaving a note for a burglar saying the jewels are hidden in the vase on the coffee table? Ironically, I have looked at a few robots.txt files, then tried to browse the directories, only to find indexing enabled. They went to the trouble of writing a robots.txt, yet overlooked something as simple as indexing.
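
That oversight is easy to check for. A crude sketch, assuming an Apache-style auto-generated listing (those pages title themselves "Index of ..."; other servers format listings differently, so this heuristic is only a guess):

```python
import urllib.request

def is_autoindex(body):
    """Apache-style auto-generated listings contain '<title>Index of ...'."""
    return b"Index of" in body

def check_listing(url, timeout=5):
    """Fetch a URL and guess whether it serves a directory listing."""
    try:
        body = urllib.request.urlopen(url, timeout=timeout).read(4096)
    except OSError:
        return False          # unreachable or refused; treat as no listing
    return is_autoindex(body)
```

Run against each path pulled from robots.txt, this would flag exactly the "wrote a robots.txt but forgot indexing" case described above.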

Back toward the topic, however. Go here
Perhaps you can read some of these scripts, then alter them and write your own.


    Be safe and stay free
    Your heart was talking, not your mind.
    -Tiger Shark
