June 17th, 2002, 12:56 AM
Using HOSTS file to take burden off of Proxy
Can anyone tell me if I'm heading in the right direction with this? First of all, approximately what percentage of a proxy server's workload/bandwidth goes to blocking access to restricted sites?
I was thinking that instead of using the proxy server to block access to restricted sites, you (as the system admin) could make a list of the sites you want to deny network access to and put them in the HOSTS file (on the W2K machines). If someone tried to reach one of them, the HOSTS file would just redirect them to a page on the intranet (or anywhere, really). And you could deny users the right to modify the HOSTS file. Any thoughts on this?
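To make the idea concrete, a HOSTS file line is just an IP address followed by a hostname, so "blocking" a site means pointing its name at your own server. Here's a minimal sketch that builds such lines; the domain names and the intranet IP are made-up placeholders, not anything from a real blocklist:

```python
# Sketch: build HOSTS-file entries that point blocked sites at an
# intranet page. All names/addresses below are hypothetical examples.
BLOCKED_SITES = ["badsite.example.com", "games.example.net"]
INTRANET_IP = "10.0.0.5"  # assumed address of the internal "blocked" page

def hosts_entries(sites, redirect_ip):
    """Return HOSTS-file lines (IP<TAB>hostname) mapping each site to redirect_ip."""
    return ["%s\t%s" % (redirect_ip, site) for site in sites]

for line in hosts_entries(BLOCKED_SITES, INTRANET_IP):
    print(line)
```

On Windows 2000 the file these lines would go into lives at %SystemRoot%\system32\drivers\etc\hosts, and NTFS permissions can keep users from editing it.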
June 17th, 2002, 12:58 AM
The overhead a proxy adds for this is next to nothing.
illv // seen the digital world from monochrome dial up to what it is today.
June 17th, 2002, 01:12 AM
Why not get a word filter progie for networks? They do a pretty decent job.
June 17th, 2002, 02:11 AM
Thanks. I wasn't sure what the overhead was, and whether it made sense to do that or not.
June 17th, 2002, 05:33 AM
I read a tutorial on this very subject at http://blacksun.box.sk in the tutorials section; I believe it was called "local DNS caching." I think it depends on the bandwidth of your internet connection: on dialup, the time it takes to query the DNS servers for an IP may be several seconds (I'm not sure of the exact figure). On a fast connection like cable, DSL, or satellite I don't think it would take long at all, maybe a second. However, if you constantly move from site to site, multiply 0.5 seconds by every lookup and it starts to add up.

I have tried this, though. I believe there is a program for Windows called FastNet99 that maintains the hosts table. It didn't make any difference for me, and when I tried placing an invalid entry in the table (for example, Google's IP where Yahoo's should go), it did not give me Google; it always gave me Yahoo. Perhaps Microsoft came up with a fix so you can't spoof IPs that way, I don't know. Overall I think it might save you a bit of surfing time, but I don't believe local DNS caching will give you much of an "improvement." That's my opinion anyway.
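The local-caching idea above boils down to: check a local name table first, and only fall back to a real DNS query on a miss — which is also exactly why a hosts table can answer instantly while a dialup DNS lookup takes seconds. A minimal sketch (the hostname and address in the table are invented for illustration):

```python
import socket

# Toy local "hosts table": names resolved without asking a DNS server.
# The entry below is an illustrative placeholder, not a real mapping.
LOCAL_HOSTS = {"intranet.example.com": "10.0.0.5"}

def resolve(name):
    """Check the local table first; fall back to an ordinary DNS lookup."""
    if name in LOCAL_HOSTS:
        return LOCAL_HOSTS[name]       # answered locally, no network round trip
    return socket.gethostbyname(name)  # normal DNS query over the network
```

This is also why a stale or wrong local entry "wins" over DNS: the fallback is never consulted for names that hit the table — unless something else (like the browser's own cache) answers first, which may explain the Yahoo result described above.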
In snatches, they learn something of the wisdom
which is of good, and more of the mere knowledge which is of evil. But must I know what must not come, for I shale become those of knowledgedome. Peace~
June 17th, 2002, 05:45 AM
It could be a good idea to go to CNET and check out their IT section; they might have some useful proggies for you. I think this is the link: http://download.com.com/2001-2027-0.html?tag=tab . You can probably find some goodies over there. GL.
June 17th, 2002, 07:36 AM
great. thanks for the tips and links (http://download.com.com/2001-2027-0.html?tag=tab and http://blacksun.box.sk).... i'll give 'em a go tomorrow when i'm a little bit more awake... damn it's late.
June 17th, 2002, 10:18 AM
Actually... some people use a "similar" technique for blocking sites they have no desire to ever talk to, though it's usually done at the router level. It's called null-routing, and it's usually a decent response to areas of the world that perform routine port scans or more brazen attempts at breaking into a box, etc.
In fact, there are a few lists out there of suggested "blackholes" (i.e. null routes) to add to your firewalls and/or routers. Here's a link to (the intro of) one of the better ones (it's auto-updated with offenders listed in many different router logs, etc):
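To show what feeding such a blackhole list to a router might look like, here's a small sketch that turns a list of offending networks into Cisco-IOS-style static null routes. The network shown is a placeholder from the reserved 192.0.2.0/24 documentation range, and the exact syntax varies between router vendors and OSes:

```python
# Sketch: emit null-route ("blackhole") commands for offending networks.
# The address below is a documentation-range placeholder, not a real offender.
OFFENDERS = ["192.0.2.0 255.255.255.0"]

def cisco_null_routes(networks):
    """Cisco-IOS-style static routes sending each network's traffic to Null0."""
    return ["ip route %s Null0" % net for net in networks]

for cmd in cisco_null_routes(OFFENDERS):
    print(cmd)
```

Traffic routed to the Null0 interface is silently discarded, which is what makes this a cheap block: the router drops the packets without keeping any state.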
"Windows has detected that a gnat has farted in the general vicinity. You must reboot for changes to take effect. Reboot now?"
June 17th, 2002, 04:02 PM
thanks for the tip. now i'm just trying to figure out how to incorporate this info into my setup. i don't have a router and i'm running zonealarm. thanks though.