
Thread: Need Help

  1. #1
    Senior Member
    Join Date
    May 2004
    Posts
    107

    Need Help

    Hello All:

Is there any way (I mean, any tool) by which we can view the subdirectories of a web site? (I'm not sure you can call them subdirectories.)

    What the tool should do ...

    The Input Must be : http://www.abc.com

    The Output Must be somewhat like below:
    http://www.abc.com/ contains
    index.html
    advlogo.jpg
    sub1
    sub2

    http://www.abc.com/sub1/ contains
    someth.html
    rat.html
    mod.jpg

    http://www.abc.com/sub2/ contains
    man.html
    woman.html
    figure.jpg

    ..........


In the AntiOnline FAQs, I've seen it's better to do a search on Google before posting a question. But in my case, I don't know exactly what to search for. Can you call those (in the example, sub1 and sub2) subdirectories?

    Thanks.
    XNikon
    please don't visit www.BusyTalk.com

  2. #2
    AO übergeek phishphreek
    Join Date
    Jan 2002
    Posts
    4,325
    Depends on how they have it configured... sometimes you can get the index page.

    Or search the source code; that will provide some good clues as to how the site is structured.

    Sometimes you have to mirror a site. But that will download a copy to your PC...
    (sometimes this is frowned upon?)

    For example...

    My brother was just in boot camp. They were posting pics of their progress and I wanted to save the site for him when he got back and such. So, I used httrack to mirror the site. That gave me a copy of the site they had public. If there was not a link to a file, then the file would not be copied.

    I could then go to the copy that I have on my hard drive and look at the directory structure, just as it was on their server. But, if there were files/folders not referenced in the source... I didn't get a copy. The program had no way to know that they were there.

    This was a small site, and it didn't take too much bandwidth. So, I don't think they'd mind. It would be the same thing as me sitting there and saving every page/image manually. It just saved me time. I also set it so it would only download a page at a time, and at a certain speed. I was trying to be nice to the server and others that might be on the site (so as not to degrade its performance).
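    A minimal sketch of the link-following idea a mirroring tool like httrack relies on, in Python (function names are illustrative, not httrack's API). The key point from the post holds here too: only files linked from a fetched page are discoverable; unreferenced files are invisible to the crawl.

    ```python
    import time
    import urllib.parse
    import urllib.request
    from html.parser import HTMLParser

    class LinkCollector(HTMLParser):
        """Collects href/src attributes so the crawl knows which files are linked."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            for name, value in attrs:
                if name in ("href", "src") and value:
                    self.links.append(value)

    def linked_paths(base_url, html):
        """Resolve each linked URL against the page it appeared on,
        keeping only links on the same host. An unlinked file never shows up."""
        parser = LinkCollector()
        parser.feed(html)
        base_host = urllib.parse.urlparse(base_url).netloc
        paths = set()
        for link in parser.links:
            absolute = urllib.parse.urljoin(base_url, link)
            parsed = urllib.parse.urlparse(absolute)
            if parsed.netloc == base_host:
                paths.add(parsed.path)
        return paths

    def polite_mirror(start_url, max_pages=10, delay=1.0):
        """Fetch one page at a time with a delay between requests,
        like the rate-limited httrack settings described above."""
        seen, queue = set(), [start_url]
        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            with urllib.request.urlopen(url) as resp:
                html = resp.read().decode("utf-8", errors="replace")
            for path in linked_paths(url, html):
                queue.append(urllib.parse.urljoin(url, path))
            time.sleep(delay)  # be nice to the server
        return seen
    ```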
    Quitmzilla is a firefox extension that gives you stats on how long you have quit smoking, how much money you've saved, how much you haven't smoked and recent milestones. Very helpful for people who quit smoking and used to smoke at their computers... Helps out with the urges.

  3. #3
    A Google search that would work would be something like "index of" site:www.target.com or allinurl:"index of" site:www.target.com. A lot of websites don't like to give out directory listings. I have seen forced directory listings hinted at, but have never been able to find out how to do it. Is this just a myth? There are a few tools that can strip websites, but I don't know of any tools that can look at a website and pull up a directory listing unless you could see the listing in your browser normally.
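    The "index of" dork works because auto-generated listings (Apache and IIS auto-indexes) share a predictable page title. A tiny sketch of checking for that signature in fetched HTML (a heuristic only, assuming the server uses the stock listing format):

    ```python
    import re

    def looks_like_directory_listing(html):
        """Heuristic: server-generated auto-indexes title the page
        "Index of /path", which is what the Google dork matches on."""
        return bool(re.search(r"<title>\s*Index of", html, re.IGNORECASE))
    ```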

  4. #4
    Senior Member
    Join Date
    May 2004
    Posts
    107
    So, is it almost 100% secure to store some very important data within directory structures, without making a listing of it or a link to it?
    Can't sites store client information (maybe encrypted) in such a fashion?
    How do leading web sites store their client information? Is it actually an encrypted file somewhere in the directory structure, do they store it in the server (in some other location), or is it on some other highly secure server?
    XNikon
    please don't visit www.BusyTalk.com

  5. #5
    Most web sites store customer information in a database or on part of the server that is not accessible over the internet. Your best bet is probably a database if you want all the information in one place and easily accessible with many programs/programming languages. I would suggest MySQL (www.mysql.org). What OS are you using, and what kind of very important information are you planning on keeping secure?
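    A minimal sketch of that idea in Python, using SQLite rather than the MySQL the post suggests (same principle, no server required; the path is hypothetical). The database file lives outside the webroot, so the web server can never serve it to a visitor:

    ```python
    import sqlite3

    # Hypothetical path OUTSIDE the webroot (not under /var/www),
    # so no URL can ever reach the raw file.
    DB_PATH = "/var/data/clients.db"

    def init_db(conn):
        conn.execute(
            """CREATE TABLE IF NOT EXISTS clients (
                   id INTEGER PRIMARY KEY,
                   name TEXT NOT NULL,
                   email TEXT NOT NULL UNIQUE
               )"""
        )

    def add_client(conn, name, email):
        with conn:  # commits on success, rolls back on error
            conn.execute("INSERT INTO clients (name, email) VALUES (?, ?)",
                         (name, email))  # placeholders avoid SQL injection

    def find_client(conn, email):
        row = conn.execute("SELECT name FROM clients WHERE email = ?",
                           (email,)).fetchone()
        return row[0] if row else None
    ```

    With MySQL the code would look nearly identical; only the connection call and driver change.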

  6. #6
    AO Ancient: Team Leader
    Join Date
    Oct 2002
    Posts
    5,197
    So, is it almost 100% secure to store some very important data within directory structures, without making a listing of it or a link to it ??
    No! Just because I can't get a folder list or actual index doesn't mean I can't get files served to me from the webroot that aren't indexed or linked in any way. I can guess, I can brute force, or, if I simply find an exploit on your server, the first thing that becomes available is your valuable data. A _really_ simple example: if I saw 1.htm, 2.htm, 3.htm, 5.htm referenced on your web site, I can send a request for http://www.yourdomain/4.htm. The page will be served if it is there. This is called security by obscurity - it may work for a while, but it is the easiest form of security to break.
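    The guessing step in that example is trivial to script. A toy sketch (hypothetical helper, Python) that takes the filenames already seen on a site and fills in the numeric gaps, producing the 4.htm that was never linked:

    ```python
    import re

    def sequential_guesses(base, seen):
        """From filenames observed on the site, guess the unlinked ones
        by filling gaps in the numeric sequence."""
        numbers = sorted(int(m.group(1)) for name in seen
                         if (m := re.match(r"(\d+)\.htm$", name)))
        guesses = []
        for n in range(numbers[0], numbers[-1] + 1):
            name = f"{n}.htm"
            if name not in seen:
                guesses.append(f"{base}/{name}")  # candidate to request
        return guesses
    ```

    Each candidate would then simply be requested; the server happily serves any that exist, which is exactly why obscurity is not security.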

    If it is of value, you have to secure it properly. The first question I would ask is: does the data need to be served to the public, or do you just have a storage issue and want to use free space on this server?
    Don't SYN us.... We'll SYN you.....
    "A nation that draws too broad a difference between its scholars and its warriors will have its thinking done by cowards, and its fighting done by fools." - Thucydides

  7. #7
    Senior Member
    Join Date
    May 2004
    Posts
    107
    I'd like to store something online which only my very trusted people can access
    XNikon
    please don't visit www.BusyTalk.com

  8. #8
    AO Ancient: Team Leader
    Join Date
    Oct 2002
    Posts
    5,197
    Then is the value of providing the data greater than the risk of it being made public?

    If the answer is no then don't do it.

    If the answer is yes, then I would place it in a subfolder. Remove all rights to it for the anonymous/Internet user account(s). Place a form of authentication on it, such as forcing Windows authentication and removing anonymous access. Create accounts with long and complex user names and passwords, and lock down the account lockout parameters to 3 tries/lockout for at least 24 hours. Patch the box daily, make sure there are no open ports through the firewall that aren't absolutely necessary, and maybe encrypt the data with PGP or similar with a different complex password. Then hope.....
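    The lockout policy described above (3 failed tries, locked for at least 24 hours) can be sketched as plain logic. A hypothetical in-memory version in Python; a real deployment would use the OS account policy rather than hand-rolled code, and would persist this state:

    ```python
    import time

    LOCKOUT_TRIES = 3
    LOCKOUT_SECONDS = 24 * 60 * 60  # locked for at least 24 hours

    class LockoutTracker:
        """Tracks failed logins per account and enforces 3-tries lockout."""
        def __init__(self, now=time.time):
            self._now = now            # injectable clock (eases testing)
            self._failures = {}        # account -> failure count
            self._locked_until = {}    # account -> unlock timestamp

        def is_locked(self, account):
            until = self._locked_until.get(account)
            return until is not None and self._now() < until

        def record_failure(self, account):
            count = self._failures.get(account, 0) + 1
            self._failures[account] = count
            if count >= LOCKOUT_TRIES:
                # Lock the account and reset the counter.
                self._locked_until[account] = self._now() + LOCKOUT_SECONDS
                self._failures.pop(account)

        def record_success(self, account):
            self._failures.pop(account, None)
    ```

    The login handler would check `is_locked()` before even verifying the password, so a brute-force attempt burns out after three guesses per day.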
    Don't SYN us.... We'll SYN you.....
    "A nation that draws too broad a difference between its scholars and its warriors will have its thinking done by cowards, and its fighting done by fools." - Thucydides

  9. #9
    Senior Member
    Join Date
    May 2004
    Posts
    107
    Aw. Thanx, tiger.

    Hope that it'll be pretty secure indeed!
    XNikon
    please don't visit www.BusyTalk.com
