
Thread: URLScan Example/Tutorial

  1. #1
    Banned
    Join Date
    May 2003
    Posts
    1,004

    URLScan Example/Tutorial

    Did you know that with a single tool you could have skipped every single IIS patch issued in the last few years? (Depending on your exact needs, of course.) As I am sure you have already guessed, the tool is called URLScan and it can be found at:

    http://www.microsoft.com/technet/tre...ls/urlscan.asp

    What does this tool do and how do we use it?
    As you can tell from the above page, the tool is an ISAPI filter that runs pattern matches against all IIS requests. Each request is matched against an admin-defined configuration file (urlscan.ini).

    Let's have a look at my urlscan.ini file:

    Code:
    [options]
    UseAllowVerbs=1
    UseAllowExtensions=1
    NormalizeUrlBeforeScan=1
    VerifyNormalization=1
    AllowHighBitCharacters=0
    AllowDotInPath=0
    RemoveServerHeader=1
    EnableLogging=1
    PerProcessLogging=0
    PerDayLogging=0
    AllowLateScanning=0
    LogLongUrls=0
    UseFastPathReject=0
    RejectResponseUrl=/documents/urlserr.wtx
    
    [AllowVerbs]
    GET
    HEAD
    POST
    
    [DenyHeaders]
    Translate:
    If:
    Lock-Token:
    
    [AllowExtensions]
    .
    .asf
    .avi
    .exc
    .gif
    .htm
    .ico
    .jpg
    .mov
    .mp3
    .mpg
    .pdf
    .png
    .wmv
    .wtx
    
    [DenyUrlSequences]
    ..
    ./
    \
    :
    %
    &
    cmd
    
    [RequestLimits]
    MaxAllowedContentLength=350000000
    MaxUrl=200
    MaxQueryString=0

    Don't worry, all of this is much simpler than it may initially look.

    Code:
    UseAllowVerbs=1
    UseAllowExtensions=1
    These two just tell URLScan to take a default-deny stance: accept only the verbs and extensions listed and deny everything else. If these values are set to 0, URLScan takes a default-accept stance and allows everything except the items listed in [DenyVerbs] and [DenyExtensions]. I prefer default deny because it only allows exactly what I specify; this way, if I forget something, a resource is blocked and I can correct it. With default allow, if I forget something the server may be compromised.
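
    For comparison, a default-accept setup would look something like this (just a sketch to show the shape; the [DenyVerbs] and [DenyExtensions] entries are made-up examples, not a config I recommend or run):

    Code:
    [options]
    UseAllowVerbs=0
    UseAllowExtensions=0

    [DenyVerbs]
    CONNECT
    TRACE

    [DenyExtensions]
    .exe
    .bat
    Notice the problem: anything I forget to deny sails right through.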

    Code:
    NormalizeUrlBeforeScan=1
    VerifyNormalization=1
    Remember all those Unicode issues, with all of the %25%sd%23%fd%fb looking things in them? In this case they are probably garbage, as I just mashed some keys, but in real attacks they contained things like "/" and ".." encoded to slip past systems checking for exactly those sequences. NormalizeUrlBeforeScan ensures that each %xx is turned into whatever it equals, and the normalized data is then matched. What about the more sophisticated attacks that were double encoded? That is where VerifyNormalization comes in. If after one pass of normalization the request is still encoded, URLScan knows that something is amiss and the request is denied.
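
    A worked example, using the classic IIS "double decode" string (the request is real; the decode walkthrough is my annotation):

    Code:
    Raw request:     /scripts/..%255c../winnt/system32/cmd.exe
    After 1st pass:  /scripts/..%5c../winnt/system32/cmd.exe    (%25 -> %)
    After 2nd pass:  /scripts/..\../winnt/system32/cmd.exe      (%5c -> \)
    Since the URL still contains an escape after the first pass, VerifyNormalization knows the request was double encoded and denies it.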

    Code:
    AllowHighBitCharacters=0
    No resources on my system require high-bit characters, so I have denied them. Odds are good that if you don't know whether you use high-bit chars, you can safely deny them.

    Code:
    AllowDotInPath=0
    If this is set to zero, the request may contain only one dot, and everything after that dot is later matched as the extension. I felt this was safer because it may catch directory traversal attacks missed by other aspects of this filter, and because I have no resources which fail to follow the (name).(extension) scheme.
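
    A few illustrative requests (not real paths on my box):

    Code:
    /images/logo.jpg      allowed (one dot, .jpg is an allowed extension)
    /my.files/index.htm   denied  (second dot in the path portion)
    /page.htm/fake.jpg    denied  (two dots; the disguised-extension trick)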

    Code:
    RemoveServerHeader=1
    Although I do not believe this adds any real security (it is still possible to fingerprint the system by other means), no harm can come from removing the server header, and any gain at all is worth the cost.

    Code:
    EnableLogging=1
    PerProcessLogging=0
    PerDayLogging=0
    I wish to have logging, but my server does not get enough traffic to justify additional granularity.

    Code:
    AllowLateScanning=0
    My server runs no other filters that modify the request, so I had no need to scan after those filters.

    Code:
    LogLongUrls=0
    I have limited the maximum URL size to 200 bytes; the normal URL cutoff is at 1024 bytes, so there was no need to enable this.

    Code:
    UseFastPathReject=0
    RejectResponseUrl=/documents/urlserr.wtx
    Fast reject uses a predefined standard IIS error; for my purposes I wished to use my own URLScan-specific error page. The first line tells it to do this, the second points to the new error page. :)

    Code:
    [AllowVerbs]
    GET
    HEAD
    POST
    These are the only verbs I wished to allow; I have no use for things like CONNECT, so I chose not to bother with them.
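
    For instance (my illustration):

    Code:
    GET /index.htm HTTP/1.1    -> passes the verb check
    OPTIONS / HTTP/1.1         -> rejected, OPTIONS is not in [AllowVerbs]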

    Code:
    [DenyHeaders]
    Translate:
    If:
    Lock-Token:
    I do not have WebDAV enabled so I have no use for these headers.

    Code:
    [AllowExtensions]
    .
    .asf
    .avi
    .exc
    .gif
    .htm
    .ico
    .jpg
    .mov
    .mp3
    .mpg
    .pdf
    .png
    .wmv
    .wtx
    . is allowed because I have both directory browsing and default documents enabled; without . listed, http://www.address.tld/ would give an error, as it has no allowed extension. :) You will notice that I allow primarily image files (gif, ico, jpg, png) and movie/music files (asf, avi, mov, mp3, mpg, wmv). htm and wtx files are also allowed. I bet this raises two questions: no active content? And what the heck are wtx and exc files?

    This system does have active content; I have mapped asp files to the htm extension. This lets me serve all dynamic and static mark-up content under one extension, and I don't need to worry about worms that target asp files (another "just in case"). Plus, kiddies don't tend to save my server address for use with 1-day asp exploits, which lowers naive hostile traffic overall, though it is no real increase in security. The cost is a minor reduction in speed for delivering static html documents, but my system's traffic is limited enough to make this a non-issue.

    So what about those wtx files? wtx = plain text; look in your Windows extension map if you don't believe me! :P I use wtx because it lets me specify exactly what I do and do not wish to share: some of my http dirs are also network shares that may contain .txt or .doc files that I do not wish to publish via the website. This lets a user determine at a glance what is and is not web-available. The same is true for exc: I have some exe files that I do not wish to be run by web users, and although they are protected via ACLs and DACs, this is just an extra precaution in case someone makes a mistake.
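
    A few examples of how this list plays out (illustrative requests):

    Code:
    /                      allowed (the bare "." entry covers extensionless requests)
    /movies/trailer.mpg    allowed (.mpg is listed)
    /share/notes.txt       denied  (.txt is deliberately not listed)
    /share/readme.wtx      allowed (my plain-text extension)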

    Code:
    [DenyUrlSequences]
    ..
    ./
    \
    :
    %
    &
    cmd
    This should all be self-explanatory. :) I have no content on my site that includes those chars, so if a user is trying to pass them, they are stepping outside the realm of "normal use", and we can't have that. cmd is just obvious. :) Although the web server is denied access to cmd.exe via both ACLs and DACs, one extra precaution doesn't hurt, plus it makes such attempts more obvious in the logs.
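
    For fun, here is the famous Unicode traversal string from the worm era run against this list (the request is real; which rules trip it is my annotation):

    Code:
    GET /scripts/..%c0%af../winnt/system32/cmd.exe?/c+dir
    ".." is a denied sequence, "cmd" is a denied sequence, .exe is not in [AllowExtensions], high-bit characters are denied, and MaxQueryString=0 kills the ?/c+dir part. This request dies several times over.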

    Code:
    [RequestLimits]
    MaxAllowedContentLength=350000000
    MaxUrl=200
    MaxQueryString=0
    Content length is kinda silly when set so high, but I believe the default is much lower, and I have some very large files on there for people who are unable or unwilling to learn the wonders of FTP. Too large for normal practical applications, yes... but nonetheless appropriate for my uses. MaxUrl should be set to whatever the longest path on your server is; I guesstimated 200 bytes, though I am sure it is less. QueryString? I allow no queries; all data is passed via POST, and the server only accepts POST data with itself as the referer, which makes it very difficult for an attacker to send any extra data to the server at all.
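
    To put those numbers in perspective (the conversion is mine):

    Code:
    MaxAllowedContentLength=350000000  -> roughly 334 MB per request body
    MaxUrl=200                         -> any URL over 200 bytes is rejected
    MaxQueryString=0                   -> /page.htm?anything=at-all is rejected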

    Although my system has additional protections as well, they have yet to be needed; URLScan has defeated every IIS exploit released in recent memory. And to think, people ask why I never patch my server. ;) An ounce of prevention (well, I also run Argus' Pitbull Protector and a few other research security alterations, but you get the idea) = years of laziness, and let's face it, people who run NT tend to be lazy. I know I am. :D

    Let me know about any questions,

    catch

    PS. This file of course will not meet your exact needs, but I thought an example might be useful. Although... IIS6 has made much of URLScan unneeded.

  2. #2
    Junior Member
    Join Date
    Apr 2003
    Posts
    7
    Are there any other default settings which may cause problems?

  3. #3
    Banned
    Join Date
    May 2003
    Posts
    1,004
    Considering the whole point of the application is to eliminate select types of requests... it is possible for even the most liberal settings (much less the default ones) to have adverse effects with different types of service software.

    Aka... yes, the default settings may interfere with your particular system's functionality.

    I have tried to outline the major points so you can figure out a configuration to meet your own needs.

    catch

  4. #4
    Senior Member
    Join Date
    Jan 2002
    Posts
    1,207
    It's worth noting that your [DenyUrlSequences] may cause severe problems for web applications which encode data in the query string.

    The percent sign, ampersand and several other characters are entirely legitimate and can be used safely.

    I developed a web application at one point. It worked fine, until one day it stopped working. I did some digging and discovered that our oh-so-vigilant systems admins had installed URLScan with all sorts of weird rules which were breaking stuff good and proper.

    URLScan is very dangerous - breaking web apps with it is extremely easy.

    Also, don't overlook the possibility that almost any character can legitimately appear in a URL or a query string (in a search, for instance).

    The best way to avoid the attacks you're trying to guard against is to be patched and to remove the default IIS configuration. URLScan doesn't really enhance this much; it just makes it very easy to break legitimate apps.

  5. #5
    Banned
    Join Date
    May 2003
    Posts
    1,004
    Slarty, yes... it is easy to break web apps with URLScan; that is why it is important to understand exactly how it works. URLScan is a fantastic tool when used carefully.

    It is true the ampersand is frequently used with GET requests; the percent sign is rarely used literally. You are perhaps thinking of things like %20, which, if you normalize first (as mine is configured), will already read as a space by the time the rules are matched.
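
    To illustrate the order of operations (my example):

    Code:
    Request as sent:    /docs/my%20file.htm
    After normalizing:  /docs/my file.htm    <- this is what the rules match
    So the literal % in [DenyUrlSequences] only catches percent signs that survive decoding, i.e. malformed or double-encoded requests.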

    Even with very loose settings, URLScan is useful in assuring things like protection against all 0-day attacks that go after weaknesses in the IIS service to get at cmd.exe (etc.).

    Yes, it is possible for any char to be in a GET string; this may or may not be a good thing.

    Patching is fine and all, but if you'd rather not watch NTBugtraq like a hawk, URLScan is a good free solution that, as I said, would have mitigated the need for nearly every patch released for IIS.

    catch

  6. #6
    Senior Member
    Join Date
    Oct 2002
    Posts
    181
    Forget using URLScan; all that input validation can be written into your web app, and if you do it via the web app, you have control over what happens when the tests pick up something bad. URLScan will only prevent certain kinds of vulnerabilities. It will not stop someone switching between accounts within a web app by changing id=1 to id=2 in a URL (that is only an example; whether a given web app is vulnerable depends on how it is written). To prevent that kind of attack, information that determines the state of the user should not be held on the client side.
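
    For instance, the id-swapping attack looks like this (illustrative URLs, not from any real app):

    Code:
    GET /account.asp?id=1   <- my account page
    GET /account.asp?id=2   <- change one digit, view someone else's
    URLScan is perfectly happy with both requests; only the app itself can know which ids belong to which logged-in user.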

    To sum up, there is more to web app security than just testing what is in the URL.

    SittingDuck
    I'm a SittingDuck, but the question is "Is your web app a Sitting Duck?"

  7. #7
    Just a Virtualized Geek MrLinus's Avatar
    Join Date
    Sep 2001
    Location
    Redondo Beach, CA
    Posts
    7,323
    **Moved from Microsoft Security to Tutorials**
    Goodbye, Mittens (1992-2008). My pillow will be cold without your purring beside my head
    Extra! Extra! Get your FREE copy of Insight Newsletter||MsMittens' HomePage

  8. #8
    Banned
    Join Date
    May 2003
    Posts
    1,004
    "forget using URLcsan, all that input validation can be writen into your web app"

    Although I agree that input validation in the web app is important as well, how would that have protected you from things like Unicode attacks?

    I am not trying to say that URLScan is the be-all and end-all security solution. It is merely a handy tool when properly understood; hence I felt the need for a tutorial, as most people don't seem to fully understand its use.

    For example, my system uses URLScan to limit the scope of possible attack types against my system, and it lets users see at a glance whether the docs they put on the web server are shared internally or externally, without dealing with DACs/ACLs. I use Pitbull Protector to isolate the IIS process from the rest of the system and to easily remove privileges that it simply doesn't need. I use concepts of least privilege when dealing with database interaction to dramatically reduce the danger of SQL injection. I use hierarchical mandatory access controls to prevent privilege elevation within the web apps. SSL and address range restrictions are used to secure sessions.

    As you can see from this setup, URLScan doesn't actually afford any security to the web app itself; it does, however, limit the scope of available attacks against the service that I need to worry about. I hope this clears things up.


    catch

  9. #9
    Senior Member
    Join Date
    Oct 2002
    Posts
    181
    Although I agree that input validation in the web app is important as well, how would that have protected you from things like Unicode attacks?
    Simple: keep your server patched and up to date.

    Input validation within the web app is vital. You may use "concepts of least privilege when dealing with database interaction to dramatically reduce the danger of SQL injection", which is a good thing, but that does not prevent SQL injection; correctly implemented input validation will prevent any SQL injection from happening in the first place.
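
    The classic example (illustrative, not from anyone's real app): if the app pastes user input straight into its query,

    Code:
    username input:   x' OR '1'='1
    resulting query:  SELECT * FROM users WHERE name='x' OR '1'='1'
    ...every row matches. Input validation that rejects the quote character (or better, parameterized queries) stops this before it ever reaches the database, whatever privileges the db account has.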

    SittingDuck
    I'm a SittingDuck, but the question is "Is your web app a Sitting Duck?"

  10. #10
    Banned
    Join Date
    May 2003
    Posts
    1,004
    I am sorry, but relying on patches is just bad policy. It makes far more sense to reduce your exposure... you wouldn't run unneeded services and secure them merely by keeping them current, so why is this any different from running unneeded functionality within a service?

    "Which is a good thing, but that does not prevent SQL injection from, correctly implemented input validation will prevent any SQL injection from happening in the first place."

    You prefer to take the attitude that it is practical to get everything on your system right, from the services themselves to your own applications within those services. I prefer to take the attitude that failure is inevitable, so it is best to limit the possibility of failure and, within that, to limit the effect of the failure.

    You're never going to get it perfect, so why try? Design it with the understanding that it will fail, just ensure it fails securely.

    I come from a TOS background and I am guessing you are a UN*X person, which would explain the difference in philosophy.

    catch
