-
February 24th, 2004, 02:21 PM
#31
Senior Member
Interesting article. I would think that, yes, a chip that can prevent a buffer overflow would limit the chip's potential. If there's logic in the chip that blocks large numbers or chunks of data, what happens when there are legitimate large data transactions?
As for hundreds of gigs of data... it's possible to write a program that creates it dynamically from random data and pumps it to a receiver.
-
February 24th, 2004, 02:25 PM
#32
Junior Member
There is no way this will work except on known problems, I guess, and a new way will be found to exploit faults. It would have to watch every file on the system, and how slow would that make your PC? Is it worth it?
Who am I to question your motive?
-
February 24th, 2004, 02:55 PM
#33
YOU THINK THIS IS THE END OF BUFFER OVERFLOWS????? MWAHAHAAHAHAHAHAAAA
THERE WILL BE NO END TO BUFFER OVERFLOWS - I WILL PERSONALLY SEE TO THIS...
I WILL ENSURE THERE ARE FOREVER MORE BUFFER OVERFLOWS JUST TO ANNOY ALL OF YOU BIG GAY PAEDOPHILES COMMONLY KNOWN AS GEEKS....
YOURS SINCERELY
I HATE GEEKS
(aka Boab Fae Barmulloch)
PS - DIE GEEKS - DIE!!!
-
February 24th, 2004, 04:30 PM
#35
Lol, this is definitely a member that wanted to have some fun. Typical.
The above sentences are produced by the propaganda and indoctrination of people manipulating my mind since 1987; hence, I cannot be held responsible for this post's content - me
www.elhalf.com
-
February 25th, 2004, 12:05 AM
#36
lol, guys, sorry. That post was by one of my friends. He saw me reading this thread on AntiOnline at uni and decided to attempt to make fun of us.
The amusing thing is that he doesn't know what a buffer overflow is. Ah well...
(knew I shouldn't have checked the forums at uni)
ac
P.S. For some reason, he didn't get that joke at the start of the thread :P hehe
-
February 25th, 2004, 12:26 AM
#37
I doubt there would be any compatibility problem that would make it impossible for you to run operating systems other than Windows. At the moment, Intel and AMD chips at least (if not others) are designed to be backwards compatible, able to run certain instructions that were designed decades ago (via internal interpretation rather than hard-wired logic).
I don't see why there would be any problem now...
ac
-
February 25th, 2004, 04:19 PM
#38
Interesting Book
While there are dozens of software books that deal with exploiting code, I found this one a great read. I know this thread is about designing chips to "filter" exploits in code, but perhaps this would interest a few "geeks" out there.
-
February 26th, 2004, 07:51 AM
#40
Question! Wouldn't the chip have to store the data/code somewhere in RAM? I mean, it could store it in onboard memory, or keep the pointer pairs marking the beginning and end of each code or data segment onboard (I don't know if the latter is feasible unless the chip actually reorganizes memory into strictly data and code parts).
This would reduce the amount of RAM available to programs and what's to stop people from altering these settings? We could see a new era of attacks where you fool the chip into thinking that data is code. You wouldn't have to overflow anything, you could simply put code directly into the data area and then switch it to a code area.
I don't have experience with these types of systems or how they would be implemented, so I'm not qualified to really speak about this topic. It just seems that there could be some problems if the idea is implemented poorly.
Is there a sum of an infinite geometric series? Well, that all depends on what you consider a negligible amount.