
Thread: Gotta love Google

  1. #1

    Gotta love Google

    First off, I would like to say that Google is by far the best search engine around. It indexes everything, and it never sleeps.

    The first problem I see is that it indexes all file types, including .mdb and .db files. And when some lame new exploit comes out, where do you think people search for vulnerable sites? Now, the latter complaint is not Google's fault at all; people just need to update their programs.

    The first part is a very dangerous problem, which I believe is on Google's side. We don't need our search engines keeping databases of admin.php sites; we don't need them indexing pass.mdb files, or even orders.mdb. You can go into Google and search for only *.mdb files on a .gov host. This probably won't turn up much now, but boy, it used to. The easiest way to steal credit cards is to do a simple search and look for some .db/.mdb files.
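    The kind of query described above can be written with Google's `site:` and `filetype:` operators (the operators are real Google syntax; the specific file and page names are only illustrative):

```
site:gov filetype:mdb      Access database files indexed on .gov hosts
inurl:admin.php            pages whose URL contains admin.php
```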

    At the same time, try searching for some admin.php pages. Now why would a search engine need to know where your admin login is....

    These are not huge problems, but they are still problems.

    Google has over 100 search features, and you can always find things you shouldn't. Make sure you use Google, and don't let it use your site.
    The Hack Back Revolution

  2. #2
    Senior Member
    Join Date
    Nov 2002
    How are they indexing mdb files? Shouldn't a good administrator be able to stop that?
    Mike Reilly

  3. #3
    Yes, small web sites that do not handle tons of data use Access, but do not be stupid: place the files AWAY from the web root so Google (or crackers) will at least have a tougher time accessing or modifying them. You will be surprised how many people disregard this. Here is a link that may help you better secure your box:

    Keep Your MDB Files Out of Web Root - If you are using Microsoft Access Databases in your ASP application it is very important that you do not put them in publicly accessible folders on your web server. Doing so means that anyone who can guess the name and location of your database will be able to download it from their web browser. Moving it to another directory accessible by your web server but not by your web users is simple to do. If you use a 3rd party provider they should have a safe data directory that is out of web root for you to place your MDB files in. If not you should seriously consider switching to a host that understands this important issue.
    Hope this helps you, bluebeard96.

    As for Google, people should read the above link to better secure their stuff. It is not Google's fault that they did not take precautions.

    On a personal note, I would not use .mdb files for a database-driven site. Access is slow and limits how many users can connect to the database (if I recall correctly, it is 256), and did I mention it is slow?
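    To make the "keep it out of web root" advice concrete, here is a minimal classic ASP sketch. The Jet OLE DB provider string is the standard one for Access, but the D:\data\ path and shop.mdb file name are made up for illustration; use whatever out-of-root directory your host gives you:

```vbscript
' Hypothetical example: open an Access database stored OUTSIDE the
' web root (D:\data\ is not served by IIS, so nobody can download it
' by guessing a URL). This is a DSN-less connection string.
Dim conn
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
          "Data Source=D:\data\shop.mdb"

' ... run your queries here ...

' Always close and release the connection so you do not creep up
' toward Access's concurrent-connection limit.
conn.Close
Set conn = Nothing
```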

  4. #4
    Senior Member
    Join Date
    Nov 2002
    Thanks, alittlebitnumb. That's what I figured, but I've seen several places where (probably in the beta) the ASP coders will write the DB path in an HTML comment (for ease of debugging) but don't remove it when the site goes live. Stupid, if you ask me. I'm just not that familiar with the indexing techniques the search engines use, and I wasn't sure if they have a way of finding that info.

    BTW, you can get pretty fast access times using DSN-less connections... you just have to be sure to close them. And yes, the limit is 255 concurrent users, but as long as you're closing your connections you shouldn't have too much of an issue (depending on your site's load).
    Mike Reilly

  5. #5
    BTW, you can get pretty fast access times using DSN-less connections... you just have to be sure to close them.
    Right-o. I skip all of that and go overkill. It's free, and depending on your server and pipe you can have thousands of users connected at a time with MySQL... it's just a preference, though, and beyond the scope of this thread.

  6. #6
    Ninja Code Monkey
    Join Date
    Nov 2001
    Washington State
    Erm, web site admins should probably learn to configure robots.txt so that agents used by search engines know what to pick up, and what not to.
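    A minimal robots.txt, placed at the root of the site (the directory paths here are just examples), looks like:

```
# Applies to all crawlers
User-agent: *
# Tell crawlers not to index these directories
Disallow: /admin/
Disallow: /data/
```

    Keep in mind that robots.txt is purely advisory: well-behaved crawlers like Googlebot honor it, but it does nothing against a crawler (or attacker) that ignores it, and the file itself publicly lists the very paths you want hidden. It complements real access controls; it does not replace them.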

    A quick search of the SANS Reading Room, the OWASP project, SecurityFocus, Google, and the SCORE documentation section will bring up an assortment of documentation on better web site/application design and auditing. Depending on the document, you will find anything from what you should and should not put in comments (and where), to application structure and file placement, server hardening, authentication auditing, etc.
    "When I get a little money I buy books; and if any is left I buy food and clothes." - Erasmus
    "There is no programming language, no matter how structured, that will prevent programmers from writing bad programs." - L. Flon
    "Mischief my ass, you are an unethical moron." - chsh
    Blog of X

  7. #7
    I second the vote for robots.txt. Here's a quick link

  8. #8
    Junior Member
    Join Date
    May 2003

    OT, for the one person who hasn't seen it yet

  9. #9
    Senior since the 3 dot era
    Join Date
    Nov 2001
    Another flaw, or power, depending on how you see it, of Google is the cache and "View as HTML" function. You can search for a certain word inside an MS Word document even if the doc is password-protected against opening in Word; you only need to click the "View as HTML" link on your Google results page, and there is the document.

  10. #10
    Senior Member
    Join Date
    Oct 2001
    Erm, web site admins should probably learn to configure robots.txt so that agents used by search engines know what to pick up, and what not to.
    Exactly right. It's not Google's responsibility to babysit slack web site administrators. If you're a web admin and you've never heard of robots.txt, now you have, so there's no excuse.
    OpenBSD - The proactively secure operating system.
