
Thread: Investigating User's Web Usage

  1. #1
    Member aciscorouter's Avatar
    Join Date
    Mar 2002
    Location
    Brampton, ON, Canada
    Posts
    35

    Question Investigating User's Web Usage

    Hey folks,

    I am regularly commissioned by HR at our large enterprise to investigate users' web usage and other "unproductive" activities. It's not my favorite part of the job, since these are people I know, but all ethics aside, it's my job.

    Enterprise Security has a limited budget and I can only get one SPAN port for each major network block, as Networking runs the show. In any case, I git 'er done using some open source tools.

    Ok, here's some background before I pose the question:

    I run some hardened, homegrown NetBSD/Snort boxes that log locally, and I use SFTP through scripts to pull the logs back to a local server where I run SnortSnarf to generate the log statistics. I have created custom signatures that track HTTP, HTTPS, MSN, AIM, Yahoo Messenger, FTP and various P2P apps from these users' workstations (their IPs are DHCP with a 14-day lease). On the local Snort box at my Internet surfing edge (where I run the sigs), I am running Snort while dumping the application layer and logging in fast alert mode (makes SnortSnarf happy :-).
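The post doesn't include the actual signatures, but a per-protocol policy rule set like the one described might look roughly like this (a sketch in Snort 2.x syntax; the variables, ports, and SID range are assumptions, not the poster's actual rules):

```
# Hypothetical policy-tracking rules of the kind described above.
# $HOME_NET / $EXTERNAL_NET and sids 1000001+ are placeholders.
alert tcp $HOME_NET any -> $EXTERNAL_NET 80   (msg:"POLICY HTTP surfing";   flow:to_server,established; classtype:policy-violation; sid:1000001; rev:1;)
alert tcp $HOME_NET any -> $EXTERNAL_NET 443  (msg:"POLICY HTTPS surfing";  flow:to_server,established; classtype:policy-violation; sid:1000002; rev:1;)
alert tcp $HOME_NET any -> $EXTERNAL_NET 1863 (msg:"POLICY MSN Messenger";  flow:to_server,established; classtype:policy-violation; sid:1000003; rev:1;)
```

One rule per protocol keeps the SnortSnarf breakdown readable, since each signature shows up as its own alert category.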

    Great, I can use the entire application layer to help me figure out destinations that are virtual websites. SnortSnarf tells me the top alerts to these websites by these users. Problem is...

    HOW DO I REPORT WITH SOME GUESS OF HOW MANY TIMES THE USER VISITED???

    The problem is, 30,000 alerts from a user to Lavalife.com over a two-week period doesn't tell me how many times the user connected intentionally vs. how many of that 30k were just HTTP keep-alives.

    I grep'd for GET and POST, but what else can I use to add to or subtract from the 30,000 alerts to give me a better report to present to HR?
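Since the application layer is being dumped, one step beyond plain grep is to pair each GET/POST request line with the `Host:` header that follows it, so virtual-hosted destinations are counted correctly. A minimal sketch (the log layout here is an assumption; adjust the patterns to match the actual payload dump format):

```python
import re
from collections import Counter

# Tally likely intentional requests per (host, method) pair from a dump of
# HTTP application-layer payloads. Keep-alive traffic with no new request
# line contributes nothing to the count.
REQUEST_RE = re.compile(r"^(GET|POST) \S+ HTTP/1\.[01]")
HOST_RE = re.compile(r"^Host: (\S+)", re.IGNORECASE)

def count_requests(lines):
    counts = Counter()
    current_method = None
    for line in lines:
        if REQUEST_RE.match(line):
            # Remember the method until we see the matching Host header.
            current_method = line.split()[0]
        elif current_method:
            m = HOST_RE.match(line)
            if m:
                counts[(m.group(1).lower(), current_method)] += 1
                current_method = None
    return counts
```

Grouping by host rather than destination IP also sidesteps the virtual-website problem mentioned earlier in the post.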

    Does anyone else do this? Is it called Network Forensics?
    aCISCOrouter

    "I used up all my sick days, so I’m calling in dead."
    http://www.facebook.com/profile.php?id=554370423

  2. #2
    I do the same; however, I dislike using Snort alone, so I run Ethereal alongside it and just track using custom filters... so when they do go to p0rn goat and horse sex, I can tell whether it was a keep-alive or not when I need to verify Snort's reports.

  3. #3
    Just Another Geek
    Join Date
    Jul 2002
    Location
    Rotterdam, Netherlands
    Posts
    3,401
    Do your users surf the web through a proxy server? If you have a proxy server running, use its logs. There are numerous log analyzers that can give you really nice statistics: things like top 10 users, top 10 sites visited, etc.
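The "top 10" style of report is easy to produce even without a dedicated analyzer. A rough sketch, assuming a whitespace-separated access log with the URL in the seventh field (that layout mirrors Squid's default access.log; the field indices would need adjusting for other proxies):

```python
from collections import Counter
from urllib.parse import urlparse

# Build a top-N sites report from proxy access-log lines.
# Assumes Squid-style native format: the request URL is field 7 (index 6).
def top_sites(log_lines, n=10):
    sites = Counter()
    for line in log_lines:
        fields = line.split()
        if len(fields) > 6:
            # Reduce the full URL to its hostname; fall back to the raw
            # field for entries like "CONNECT host:443" with no scheme.
            host = urlparse(fields[6]).hostname or fields[6]
            sites[host] += 1
    return sites.most_common(n)
```

The same Counter approach works for a top-10-users report by switching to the client-address field.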

    But as a side note, there's no 'real' way to find out how much time your users spend surfing the net. The only way to find out would be to place a camera behind the user and note what s/he is doing all the time. Think about it: if I open AntiOnline, for example, but minimize my browser and do my work, you'll notice a couple of (automatic) refreshes and maybe log the amount of bandwidth I've used. But was I really spending all that time surfing, or doing my job? What if I was listening to an Internet radio station while doing my job?

    And another thing that just popped into my head: how are your privacy laws? In my country I cannot create statistics that are directly linked to a particular user. That user has some privacy (the same reason I cannot listen in on his phone calls). I can analyze the logs for that particular user if and only if there is reasonable doubt that the user is abusing our resources.

    Sheesh, I keep adding stuff. Do you have a policy? My company has strict rules and regulations for using the Internet. If someone breaks those rules, it's his/her manager's job to make sure the employees do what they are paid for. If that person keeps breaking the rules, we don't try to disconnect them from the Internet; we disconnect them from the company; we fire them.
    Oliver's Law:
    Experience is something you don't get until just after you need it.

  4. #4
    Senior Member nihil's Avatar
    Join Date
    Jul 2003
    Location
    United Kingdom: Bridlington
    Posts
    17,188
    Hmmmmmmmmmmm.....................


    I am commissioned regularly from HR for our large enterprise to investigate user’s web usage and other "unproductive" activities
    In my book, "HR" is an "unproductive activity"...............over here we consider them too ugly for prostitution and too stupid to peddle drugs...............

    I am looking at a programme to redeploy them all as figure #11 emulation modules. (Think N.A.T.O.)

    It's not the favorite part of my job since these are people I know, but all ethics aside, it's my job.
    Yes...............I can see that you understand my reservations ..............don't worry, I will ask for an extra $50k to let the hammer down on you

    Where HR dictate to IT it is time to dump the stock NOW!

    Security/IT and Finance/IT are acceptable alliances to me............

  5. #5
    Member aciscorouter's Avatar
    Join Date
    Mar 2002
    Location
    Brampton, ON, Canada
    Posts
    35
    Originally posted here by SirDice
    Do your users surf the web through a proxy server?
    No, we have ISA implemented for our call centers, but in the main data center users can access the Internet without any caching or proxying. We tried caching and sending the data to a log analyser, but it didn't work well with DHTML, and we created so many exceptions we finally turned it off.

    And another thing that just popped into my head How are your privacy laws? In my country I cannot create statistics that are directly linked to a perticular user. That user has some privacy (the same reason why I cannot listen in on his phonecalls). I can analyze the logs for that perticular user if and only if there is a reasonable doubt that user is abusing our resources.
    Ah, Legal and HR have done a great job in the employment contract in basically having the user waive their rights to privacy while using any of the corporate resources. This is reiterated in the security policy and reminders go out often that "Big Brother" is watching.
    aCISCOrouter

    "I used up all my sick days, so I’m calling in dead."
    http://www.facebook.com/profile.php?id=554370423

  6. #6
    Member aciscorouter's Avatar
    Join Date
    Mar 2002
    Location
    Brampton, ON, Canada
    Posts
    35
    Originally posted here by nihil
    In my book, "HR" is an "unproductive activity"...............over here we consider them too ugly fpr prostitution and to stupid to peddle drugs...............Security/IT and Finance/IT are acceptable alliances to me............
    Damn skippy! I agree 100%. HR are such technophobes that everything is taken literally, and they see 100 alerts as 100 visits to a website. Try explaining anything to them and their eyes glaze over; they get agitated and just want to get on with their witch-hunt.

    Unfortunately, when managers here have a "hunch" about an employee's lack of productivity, they get permission from HR to have us conduct an investigation. I've suggested content-filtering alternatives such as NetFilter or Websense that would give employees "quota-based free surfing", along with giving the managers access to the reporting component so they can rat on their own employees to HR (instead of having our team do the dirty work on anyone).

    Our organization does not have a CSO, and it's only recently (in the last year) that this large telecom has decided security is necessary (due to SOX compliance and three worms that paralysed us). We are promoting awareness up the chain of command and we're still hitting walls.
    aCISCOrouter

    "I used up all my sick days, so I’m calling in dead."
    http://www.facebook.com/profile.php?id=554370423

  7. #7
    AO Ancient: Team Leader
    Join Date
    Oct 2002
    Posts
    5,197
    Are you placing a threshold limit on the rules? See Here.

    It will make your job a lot easier....
    Don't SYN us.... We'll SYN you.....
    "A nation that draws too broad a difference between its scholars and its warriors will have its thinking done by cowards, and its fighting done by fools." - Thucydides

  8. #8
    Member aciscorouter's Avatar
    Join Date
    Mar 2002
    Location
    Brampton, ON, Canada
    Posts
    35
    Originally posted here by Tiger Shark
    Are you placing a threshold limit on the rules? See Here.

    It will make your job a lot easier....
    Nice one, mate! I have finished the investigation already and I'm looking for a way to justify my numbers, but I know the average website visit is between 6 and 12 GET requests per page (I did a dirty baseline of activity to the sites one user visited consistently), so in future I could use the threshold to only log, say, every 7th alert. It would really be fudging the numbers, though, because there would be no way of saying "only log one of every six packets per website," which may miss some of the transactions altogether.
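For the report already in hand, that 6-to-12-GETs-per-page baseline can at least turn the raw alert count into a defensible range of estimated page views. A back-of-the-envelope sketch (the baseline figures are the poster's rough measurements, not hard data):

```python
# Convert a raw alert count into an estimated range of page views,
# using the observed baseline of 6-12 GET requests per page.
def estimated_visits(alert_count, gets_per_page_low=6, gets_per_page_high=12):
    """Return a (low, high) range of estimated page views."""
    return (alert_count // gets_per_page_high, alert_count // gets_per_page_low)

low, high = estimated_visits(30000)
# 30,000 alerts over two weeks works out to roughly 2,500-5,000 page views
print(f"Estimated page views: {low}-{high}")
```

Presenting HR with a range, plus the method behind it, is more honest than a single number and heads off the "100 alerts = 100 visits" misreading.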
    aCISCOrouter

    "I used up all my sick days, so I’m calling in dead."
    http://www.facebook.com/profile.php?id=554370423

  9. #9
    Just Another Geek
    Join Date
    Jul 2002
    Location
    Rotterdam, Netherlands
    Posts
    3,401
    Originally posted here by aciscorouter
    No, we have ISA implemented for our call centers, but in the main data center users can access the Internet without any caching or proxying. We tried caching and sending the data to a log analyser, but it didn't work well with DHTML, and we created so many exceptions we finally turned it off.
    Errm. ISA is a firewall+proxy. ISA creates great log files. Analyze those...

    We use ISA too (only as a proxy) and have never had any problems with dynamic pages. Usually caching problems are caused by the site itself; it happens when their dynamic content doesn't include any headers specifying that the page shouldn't be cached.
    Oliver's Law:
    Experience is something you don't get until just after you need it.

  10. #10
    AO Ancient: Team Leader
    Join Date
    Oct 2002
    Posts
    5,197
    so in future I could use the threshold to only log, say, every 7th alert. It would really be fudging the numbers, though, because there would be no way of saying "only log one of every six packets per website," which may miss some of the transactions altogether.
    If you use the "limit" parameter then you can say "show me only the first instance (limit 1) of access to an IP address in each 15 seconds." Thus you would see only the first GET for www.yahoo.com, for example. After 15 seconds all the GETs would have completed (you could adjust the 15 seconds down or up depending on your connection speed), and this should give you a reasonable picture of use.
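In Snort 2.x that suggestion can be expressed as a standalone threshold entry in the configuration (a sketch; sig_id 1000001 is a placeholder for one of the custom policy rules, not a real signature from this thread):

```
# Log at most one alert per destination IP per 15-second window for the
# hypothetical policy rule with sid 1000001.
threshold gen_id 1, sig_id 1000001, type limit, track by_dst, count 1, seconds 15
```

Tracking `by_dst` collapses the burst of GETs behind a single page load into one alert per site visited, which is exactly the per-visit count the original question asked for.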
    Don't SYN us.... We'll SYN you.....
    "A nation that draws too broad a difference between its scholars and its warriors will have its thinking done by cowards, and its fighting done by fools." - Thucydides
