
Thread: The End of Buffer Overflows!

  1. #31
    Senior Member
    Join Date
    Aug 2001
    Posts
    117
    Interesting article. I would think that, yes, a chip that can prevent a buffer overflow would be limiting the chip's potential. If there's logic in the chip that blocks large numbers or chunks of data, what happens when there are legitimate large data transactions?

    As for hundreds of gigs of data... it's possible to make a program that creates it dynamically from random data and pumps it to a receiver.
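    For reference, the kind of bug the chip would be guarding against isn't really about the size of the data at all; it's an unchecked copy into a fixed-size buffer. A rough C sketch (a generic illustration, not anything taken from the article):

        #include <stdio.h>
        #include <string.h>

        /* Classic overflow: no bounds check, so input longer than the
           16-byte buffer runs past it and clobbers adjacent stack memory. */
        void vulnerable(const char *input) {
            char buf[16];
            strcpy(buf, input);               /* unsafe copy */
            printf("%s\n", buf);
        }

        /* Bounded copy: same operation, but the length is checked, so
           oversized input is truncated instead of overflowing. */
        void safer(const char *input) {
            char buf[16];
            strncpy(buf, input, sizeof buf - 1);
            buf[sizeof buf - 1] = '\0';
            printf("%s\n", buf);
        }

        int main(void) {
            safer("this string is much longer than sixteen bytes");
            return 0;
        }

    So large, legitimate transfers shouldn't be a problem in themselves; as far as I can tell, the hardware approach is about refusing to execute what ends up in data buffers, not about capping how much data moves through the chip.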
    Luck--TSM
    Atlanta, GA


  2. #32
    Junior Member
    Join Date
    Feb 2004
    Posts
    17
    There is no way this will work; it will only catch known problems, I guess, and a new way will be found to exploit faults. It would have to watch every file on the system, and how slow would that make your PC? Is it worth it?
    Who am i to question your motive?

  3. #33
    YOU THINK THIS IS THE END OF BUFFER OVERFLOWS????? MWAHAHAAHAHAHAHAAAA
    THERE WILL BE NO END TO BUFFER OVERFLOWS - I WILL PERSONALLY SEE TO THIS...
    I WILL ENSURE THERE ARE FOREVER MORE BUFFER OVERFLOWS JUST TO GET TO ANNOY ALL OF YOU BIG GAY PEADOPHILES COMMONLY COME TO BE KNOWN AS GEEKS....

    YOURS SINCERELY

    I HATE GEEKS
    (aka Boab Fae Barmulloch)

    PS - DIE GEEKS - DIE!!!

  4. #34
    Banned
    Join Date
    Feb 2004
    Posts
    93
    Negative.

    -Cheers-

  5. #35
    Senior Member
    Join Date
    Jun 2003
    Posts
    772
    Lol, this is definitely a member that wanted to have some fun. Typical...
    The above sentences are produced by the propaganda and indoctrination of people manipulating my mind since 1987, hence, I cannot be held responsible for this post's content - me

    www.elhalf.com

  6. #36
    Custom User
    Join Date
    Oct 2001
    Posts
    503
    lol, guys, sorry. That post was by one of my friends. He saw me reading this thread on AntiOnline at uni and decided to try to make fun of us.

    The amusing thing is that he doesn't know what a buffer overflow is. Ah well...

    (knew I shouldn't have checked the forums at uni)

    ac

    P.S. For some reason, he didn't get that joke at the start of the thread :P hehe

  7. #37
    Custom User
    Join Date
    Oct 2001
    Posts
    503
    I doubt there would be any compatibility problem that would make it impossible to run operating systems other than Windows. At the moment, Intel and AMD chips at least (if not others) are designed to be backwards compatible, still running instructions that were designed decades ago (handled by a microcode interpreter rather than being hard-wired).

    I don't see why there would be any problem now...
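    If the protection the article is talking about is the execute-disable (NX/XD) flag that newer AMD and Intel chips expose, then any operating system can simply ask the CPU whether it's there, which is another reason compatibility shouldn't be an issue. A rough sketch in C (GCC/Clang on x86, using the compiler-provided __get_cpuid helper; leaf 0x80000001, EDX bit 20 is the documented NX bit):

        #include <stdio.h>
        #include <cpuid.h>   /* GCC/Clang wrapper around the CPUID instruction */

        int main(void) {
            unsigned int eax, ebx, ecx, edx;

            /* Extended leaf 0x80000001: EDX bit 20 advertises NX/XD support. */
            if (!__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx)) {
                puts("extended CPUID leaf not available");
                return 1;
            }
            puts((edx & (1u << 20)) ? "NX/XD bit supported"
                                    : "NX/XD bit not supported");
            return 0;
        }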

    ac

  8. #38
    Senior Member RoadClosed
    Join Date
    Jun 2003
    Posts
    3,834
    Interesting Book

    While there are dozens of software books that deal with exploiting code, I found this one a great read. I know this thread is about designing chips to "filter" exploits in code, but perhaps it would interest a few "geeks" out there.

  9. #39
    Originally posted here by thehorse13
    LOL, wait until an exploit is discovered on how the processor makes logical decisions. The exploit will go something like this: By passing 400 gigs of specially crafted data to the processor, you can overwhelm it and cause an overflow condition...


    Since you LOVE to argue logistics, hopefully you'll know what I'm talking about. Tell me the viable logistics of passing 400 gigabytes to a processor. Even if that were a local exploit, and NOT sending 400 GB of data over the wire, you would crash the machine long before the processor "overflowed":

    1 - cache too small
    2 - page/swap file too small
    3 - volatile memory (RAM) too small

    etc, etc, etc.

    A good metaphor would be this: passing 400 GB of data to a processor is like transporting 1 million rounds of ammo to the Metal Storm weapon. *hint* *hint*
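    A rough sketch of that resource wall, assuming a 64-bit Linux box (the 1 GiB cap below is an arbitrary stand-in for a machine's finite RAM and swap): the 400 GB request is refused at the memory layer long before any of that data could reach the processor.

        #include <stdio.h>
        #include <stdlib.h>
        #include <sys/resource.h>

        int main(void) {
            /* Cap this process's address space at 1 GiB, standing in for
               the finite RAM/swap of a real machine. */
            struct rlimit lim = { 1UL << 30, 1UL << 30 };
            if (setrlimit(RLIMIT_AS, &lim) != 0) return 1;

            /* Ask for 400 GiB in one go: the allocation fails here, at the
               memory layer, long before the CPU ever sees 400 GB of data. */
            size_t want = 400ULL << 30;
            void *p = malloc(want);
            printf("malloc of %zu GiB -> %p\n", want >> 30, p);  /* (nil) */

            free(p);
            return 0;
        }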

  10. #40
    Senior Member
    Join Date
    Nov 2003
    Posts
    107
    Question! Wouldn't the chip have to store the data/code boundaries somewhere in RAM? I mean, it could keep them in onboard memory, or keep the pointer pairs marking the beginning and end of each code or data segment on-chip (I don't know if the latter is feasible unless the chip actually reorganizes memory into strictly separate data and code parts).

    This would reduce the amount of RAM available to programs, and what's to stop people from altering these settings? We could see a new era of attacks where you fool the chip into thinking that data is code. You wouldn't have to overflow anything; you could simply put code directly into the data area and then switch it to a code area.

    I don't have experience with these types of systems or how they would be implemented, so I'm not qualified to really speak about this topic. It just seems that there could be some problems if the idea is implemented poorly.
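    That kind of data-to-code switch is already an ordinary OS call today, which is exactly why a poorly implemented scheme could be talked into allowing it. A rough sketch (Linux/x86-64, using mmap/mprotect; the six-byte stub is just machine code for "return 42", my own illustration, not anything from the article):

        #define _DEFAULT_SOURCE       /* for MAP_ANONYMOUS on older glibc */
        #include <stdio.h>
        #include <string.h>
        #include <sys/mman.h>
        #include <unistd.h>

        int main(void) {
            /* x86-64 machine code for:  mov eax, 42 ; ret */
            unsigned char stub[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };
            long pagesz = sysconf(_SC_PAGESIZE);

            /* 1. Get an ordinary read/write (data) page and copy bytes into it. */
            unsigned char *page = mmap(NULL, pagesz, PROT_READ | PROT_WRITE,
                                       MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
            if (page == MAP_FAILED) return 1;
            memcpy(page, stub, sizeof stub);

            /* 2. Ask the OS to re-mark that same page as executable code. */
            if (mprotect(page, pagesz, PROT_READ | PROT_EXEC) != 0) return 1;

            /* 3. Call into what was, a moment ago, plain data. */
            int (*fn)(void) = (int (*)(void))page;
            printf("returned %d\n", fn());   /* prints 42 */

            munmap(page, pagesz);
            return 0;
        }

    Whether the new chips would let the OS keep doing this freely, or demand some extra check before a page can change from data to code, seems like the interesting design question.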
    Is there a sum of an infinite geometric series? Well, that all depends on what you consider a negligible amount.
