Search Engines Revealing Vulnerable Files
I found this rather interesting: CNET has an article on search engines that expose sensitive files when indexing web pages. Sites like Google have recently started supporting searches within PostScript, Lotus, Word, Excel, RTF, and other similar file types. If a company is careless enough to leave such files in public web directories on its server, they will be indexed by the search engines and thus opened to a wider audience than intended. Searches for terms like 'password' within these file types can be fruitful for attackers. This is nothing new - attackers have been using search engines for years to find open password files, servers running default (and vulnerable) configurations, and similar security holes. Indexing these new file types just amplifies the issue.
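To illustrate the kind of thing the article is talking about: Google's filetype: operator restricts results to a given document format, and site: restricts them to one domain, so queries along these lines would surface indexed office documents containing a chosen term (example.com is just a placeholder):

```
filetype:xls password
filetype:doc "internal use only"
site:example.com filetype:pdf confidential
```

None of this is secret or clever - it's just the search engine doing exactly what it's designed to do, on files that shouldn't have been public in the first place.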
Of course, it's not really the search engines' fault, since their bots just index whatever is public. Sensitive data should be stored in a database whenever possible, instead of in files on the web server. And if the files must remain on the server, administrators can ask search engines to skip them using robots.txt - though keep in mind that robots.txt is only a request honored by well-behaved crawlers; it doesn't actually protect the files from anyone who fetches them directly.
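For anyone who hasn't set one up, a minimal robots.txt along these lines (the directory names here are just placeholders) tells compliant crawlers to stay out of the listed paths:

```
# Served from the web root, e.g. http://example.com/robots.txt
# "*" applies the rules to all crawlers; each Disallow is a path prefix
User-agent: *
Disallow: /internal/
Disallow: /docs/private/
```

Again, this only stops crawlers that choose to obey it (and it advertises exactly which paths you consider sensitive), so the real fix is not leaving such files in public directories at all.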
latr,
Remote_Access_
Re: Search Engines Revealing Vulnerable Files
Quote:
Originally posted by Remote_Access_
Search Engines Revealing Vulnerable Files
Show The SOURCE.