Search Engines Revealing Vulnerable Files

  1. #1


    I've found this rather interesting: CNET has an article on search engines that expose sensitive files when indexing web pages. Sites like Google have recently started supporting searches within PostScript, Lotus, Word, Excel, RTF, and other similar file types. If a company is careless enough to leave such files in public web directories on its server, they will be indexed by the search engines and thus opened to a wider audience than intended. Searches for terms like 'password' within these file types can be fruitful for attackers. This is nothing new - attackers have been using search engines for years to find open password files, servers running default (and vulnerable) configurations, and similar security holes. Indexing these new file types just amplifies the issue.
    Of course, it's not really the search engines' fault, since their bots just examine whatever is public. Data should be stored in a database whenever possible, rather than in files on the web server. And if files absolutely must live in a public directory, server administrators can exclude them from the search engines using robots.txt.
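As a minimal illustration of that last point (the directory names are made up for the example), a robots.txt placed at the web root that asks well-behaved crawlers to skip sensitive directories looks like this:

```
# robots.txt - applies to all crawlers
User-agent: *
Disallow: /internal-docs/
Disallow: /private/
```

Note that robots.txt is purely advisory: it keeps honest bots out, but it does nothing against someone requesting the files directly, and it publicly lists the very paths you want hidden, so moving the files out of public directories is still the real fix.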

    latr,
    Remote_Access_

  2. #2
    Senior Member
    Join Date
    Nov 2001
    Posts
    1,255
    This point was brought up about a week and a half ago on BugTraq, IIRC. Just open up Google and do a search for "Index of /" and you'll get quite a few hits from misconfigured web servers.

    One thing this does is turn a search engine as powerful as Google into yet another tool to hack with.
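The "Index of /" trick is just an ordinary query string. As a rough sketch (Python standard library only; the query is chosen purely as an example), the search URL such a search amounts to can be built like this:

```python
# Sketch: building a search-engine query URL for a directory-listing
# "dork" such as "Index of /". Illustrative only; the query string is
# an example, not an endorsement of scanning anyone's site.
from urllib.parse import urlencode

def google_query_url(query: str) -> str:
    """Return a Google search URL for the given query string."""
    return "http://www.google.com/search?" + urlencode({"q": query})

print(google_query_url('"Index of /" passwd'))
```

The quoting matters: `urlencode` percent-escapes the double quotes and slash so the phrase search survives the trip through the URL.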
    Chris Shepherd
    The Nelson-Shepherd cutoff: The point at which you realise someone is an idiot while trying to help them.
    "Well as far as the spelling, I speak fluently both your native languages. Do you even can try spell mine ?" -- Failed Insult
    Is your whole family retarded, or did they just catch it from you?

  3. #3
    Member
    Join Date
    Oct 2001
    Posts
    88
    Excellent post, keep those "Security"-related posts coming.

  4. #4
    Senior Member
    Join Date
    Oct 2001
    Posts
    293
    Search engines like Google have always been among a hacker's best friends, just like the public whois databases (for as long as those services have existed, that is). Another funny thing about the internet today is that everyone makes their own homepage. An error I stumble upon quite often is a misconfiguration of the MS FrontPage direct-editing feature (sorry, I don't know its real name): quite a few pages on the net allow anyone to open the page in MS FrontPage and do to it whatever they want. The page doesn't even prompt for a password!... This is many script kiddies' favorite target, giving hackers a bad name once again.

    (The last page I saw with this kind of error belonged to a university! It probably wouldn't be too hard to turn such a system into a slave... they were actually warned about the error, but they still didn't fix it... what are these people thinking!?)
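The misconfiguration described above is usually spotted by requesting a handful of well-known FrontPage Server Extensions files. A minimal sketch in Python (the path list covers commonly cited FPSE files and is illustrative, not exhaustive; probe only hosts you are authorized to test):

```python
# Sketch: checking for exposed FrontPage Server Extensions files.
# The paths below are commonly cited FPSE files; a 200 response for
# service.pwd in particular suggests a badly misconfigured site.
from urllib.parse import urljoin
from urllib.request import urlopen
from urllib.error import URLError

FP_PATHS = ["/_vti_inf.html", "/_vti_pvt/service.pwd", "/_vti_bin/shtml.exe"]

def candidate_urls(base_url: str) -> list:
    """Join each well-known FrontPage path onto the site's base URL."""
    return [urljoin(base_url, p) for p in FP_PATHS]

def probe(base_url: str) -> list:
    """Return the candidate URLs that answer with HTTP 200."""
    found = []
    for url in candidate_urls(base_url):
        try:
            if urlopen(url, timeout=5).status == 200:
                found.append(url)
        except (URLError, OSError):
            pass  # host unreachable or file not present
    return found
```

An administrator can run the same check against their own server; if `service.pwd` comes back, the extensions need to be locked down or removed.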
    zion1459
    Visit: http://www.cpc-net.org
    "Software is like sex: it's better when it's free." -Linus Torvalds

  5. #5

    Search Engines

    Search engines have been useful to hackers/phreaks/etc. for as long as search engines have existed. They have been used to find vulnerabilities, the latest security news, updates, downloads, and whatever else the end user decides to use the engine for.

    Remote_Access_

  6. #6
    Old-Fogey: Addicts founder
    Join Date
    Aug 2001
    Location
    Seattle, WA
    Posts
    2,007

    Re: Search Engines Revealing Vulnerable Files

    Originally posted by Remote_Access_
    Search Engines Revealing Vulnerable Files

    I've found this rather interesting: CNET has an article on search engines that expose sensitive files when indexing web pages. Sites like Google have recently started supporting searches within PostScript, Lotus, Word, Excel, RTF, and other similar file types. If a company is careless enough to leave such files in public web directories on its server, they will be indexed by the search engines and thus opened to a wider audience than intended. Searches for terms like 'password' within these file types can be fruitful for attackers. This is nothing new - attackers have been using search engines for years to find open password files, servers running default (and vulnerable) configurations, and similar security holes. Indexing these new file types just amplifies the issue.
    Of course, it's not really the search engines' fault, since their bots just examine whatever is public. Data should be stored in a database whenever possible, rather than in files on the web server. And if files absolutely must live in a public directory, server administrators can exclude them from the search engines using robots.txt.

    latr,
    Remote_Access_
    Show The SOURCE.
    [HvC]Terr: L33T Technical Proficiency

  7. #7
    Senior Member
    Join Date
    Nov 2001
    Posts
    114
    I thought this article was interesting as well... thought I'd include it for members who want to read more about this issue.

    This article is from:

    http://www.theregister.co.uk/content/55/23069.html



    The Google attack engine
    By Thomas C Greene in Washington
    Posted: 28/11/2001 at 12:25 GMT


    Some clever empiricist appears to have been abusing Google to attack Web servers, switches and routers in a novel way, by crafting search terms to include known exploits. Such a search will occasionally yield active Web pages used by administrators. On top of that, a number of them have already been cached. It's reasonable to surmise that a hacker has been using Google not merely to search for vulnerabilities, but as a proxy to hide behind while executing attacks.

    SecurityFocus (http://www.securityfocus.com) researcher Ryan Russell discovered a wealth of such pages quite by accident, while working on improved rules for Snort (http://www.snort.org), a popular open-source IDS (Intrusion Detection System).

    "I was using Google to check how common a particular string is on the Web, to gauge how often a rule might cause a false-positive. Part of the process of deciding how often the rule might cause a false positive is deciding how common the string is that the rule searches for," Russell explains.

    So while searching Google for a vulnerability in Cisco IOS Web Server, Russell followed a link and found himself in a switch belonging to a US .gov site.

    The malicious use of search engines is nothing new, as we reported in a story back in June of 2000 (http://www.theregister.co.uk/content/archive/11174.html), but this does bring it to new levels of finesse. The significant thing here is that the cache can be used to cover one's tracks, assuming there are no graphics to be fetched.

    Cruise control?
    So how did all this stuff get indexed in the first place? Did Google's mighty spiders do it all automatically, or did someone deliberately add the URLs?

    Google offers "an advanced search feature that allows you to look for sites that link to a particular URL. When I looked for the URLs that are exploit attempts, there were no links to them. This either means they were submitted manually to Google, or possibly that the page that used to link to them has changed, and Google has already re-indexed it," Russell says.

    "The simplest explanation is that they just went to Google's submit URL page, and typed it in."

    (http://www.google.com/addurl.html)

    On Monday we invited Google to comment, but they've not yet replied. We wondered if they might be able to prevent at least some of these abuses, perhaps by filtering according to rules similar to those in Snort. After all, if they can purge 95 per cent of the porn you'd expect to encounter in their images search, they ought to be able to filter a fair number of HTTP exploits as well.

    We sent them a rather long list of potentially exploitable search terms based on Russell's queries. "As I continued to work the rule-set, I kept finding more. Some of them might be innocent. Many clearly aren't. Some succeed, some don't," he notes.


    [>] <--- I included these links in case they didn't show up in the article.
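The filtering idea the article floats, screening queries against known exploit strings much as Snort matches packet content, can be sketched as follows; the pattern list is a tiny illustrative sample, not Snort's actual rule set:

```python
# Sketch: rejecting search queries that contain known exploit substrings,
# loosely modeled on Snort-style content matching. The patterns below are
# illustrative examples from well-known HTTP attacks of the era.
EXPLOIT_PATTERNS = [
    "/cgi-bin/phf?",       # classic phf CGI exploit string
    "cmd.exe?/c+dir",      # IIS Unicode traversal payload
    "/level/16/exec/",     # Cisco IOS HTTP auth bypass path
]

def is_suspicious(query: str) -> bool:
    """True if the query contains any known exploit substring."""
    q = query.lower()
    return any(p.lower() in q for p in EXPLOIT_PATTERNS)

print(is_suspicious("scripts/..%c1%1c../winnt/system32/cmd.exe?/c+dir"))
print(is_suspicious("index of / passwd"))
```

As the article notes, simple substring filters like this would catch many exploit submissions while letting ordinary searches (including an innocent "index of" query) through, though some patterns will inevitably appear in legitimate pages too.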
