March 18th, 2003, 04:24 AM
I realize that this may be of little importance to some people, but for me it raises some questions about the efficiency of random guessing versus patterned elimination, so I think it may be applicable in some way.
Playing the game Battleship, I played my first game guessing squares (as randomly as I could) just to see if I could beat the computer that way, with very little logic. I lost to the computer by a moderate margin.
The next game, however, I played by clicking on every other square, creating a checkerboard pattern, until I hit something. This time I won by a good margin.
Why? I figure the computer uses some algorithm to get as near to random as possible until it hits something (maybe someone at AO could answer this). So does that mean patterns prevail over random guessing? In many cases I would say yes. What I think it shows is how one could change a search pattern from checking every possibility to checking only every other one. So if you're looking for a large, sprawling object, is there a need to sort through every possibility, or only every other one? (The minesweeper is the smallest ship, so "every other square" is the sparsest pattern that still works in Battleship.)

What I'm trying to say is that if algorithms could be patterned to suit the needs of the user, they might be a lot faster. I don't know how this may be applied to security, but I believe it may help someone optimize their code, even if only by a little. I leave it up to the public to apply it (or toss it).
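The checkerboard idea above can be sketched in a few lines (board size and helper names here are mine, made up for illustration): on an N x N board, probe only cells where row + column is even. Any ship of length 2 or more, placed horizontally or vertically, covers two adjacent cells of opposite parity, so it must overlap at least one probed cell, and you only fire on half the board.

```python
# Sketch of the "checkerboard" targeting pattern from the post.
# Probe only cells where (row + col) is even; any horizontal or
# vertical ship of length >= 2 covers cells of both parities, so
# this pattern hits every ship using about half the probes.

N = 10  # assumed 10x10 board, for illustration

def checkerboard_cells(n):
    """All cells of one checkerboard color on an n x n board."""
    return {(r, c) for r in range(n) for c in range(n) if (r + c) % 2 == 0}

def ship_cells(row, col, length, horizontal):
    """Cells covered by a ship placed at (row, col)."""
    if horizontal:
        return {(row, col + i) for i in range(length)}
    return {(row + i, col) for i in range(length)}

probes = checkerboard_cells(N)
print(len(probes))  # 50 probes instead of 100

# Verify: every possible length-2 ship intersects the probe set.
for r in range(N):
    for c in range(N - 1):
        assert ship_cells(r, c, 2, True) & probes
for r in range(N - 1):
    for c in range(N):
        assert ship_cells(r, c, 2, False) & probes
print("every length-2 ship is hit")
```

This is why the checkerboard game went better than the random one: the pattern guarantees a first hit on every ship, while random guessing wastes shots on cells a pattern would have skipped.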
March 18th, 2003, 05:16 AM
If I am understanding you correctly, TaPnAP, you are talking about algorithms adapting over their running time? For example, in Battleship, once all the minesweepers are gone, the algorithm would begin scanning every third square, because that is the size of the smallest remaining ship. For a game such as Battleship, where the number of squares is not all that large (400 or so), reducing the algorithm from n/2 probes (n = number of squares) to n/3 is a large savings in time.
One could then argue that if we played Battleship with 40 million squares, it wouldn't be as significant a savings in time.
This conversation could then get really in-depth about big-O running times of algorithms, but I doubt everyone who reads this will have a good knowledge of the concept (maybe a good tutorial idea for yours truly).
I do think adaptive algorithms are a great idea, because they cut down the amount of time an algorithm takes to run, even if it only shaves 5 million seconds off of 40 million.
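The adaptive idea generalizes the checkerboard: once the smallest remaining ship has length k, it is enough (this is my own sketch of the argument, not anything from the game) to probe the diagonal stripes where (row + col) % k == 0. A ship of length k covers k consecutive cells in one row or column, so row + col runs through every residue mod k and always lands on a stripe, while the probe density drops from 1/2 to 1/k.

```python
# Sketch of an adaptive Battleship scan: as the smallest surviving
# ship grows from length 2 to 3 to 4, probe sparser diagonal stripes.
# A length-k ship always crosses the stripes where (row + col) % k == 0.

N = 12  # assumed board size, for illustration

def stripe_cells(n, k):
    """Diagonal stripes spaced k apart on an n x n board."""
    return {(r, c) for r in range(n) for c in range(n) if (r + c) % k == 0}

def ship_cells(row, col, length, horizontal):
    """Cells covered by a ship placed at (row, col)."""
    if horizontal:
        return {(row, col + i) for i in range(length)}
    return {(row + i, col) for i in range(length)}

for k in (2, 3, 4):
    probes = stripe_cells(N, k)
    # check that every horizontal and vertical length-k ship is hit
    hits_all = all(
        ship_cells(r, c, k, True) & probes
        for r in range(N) for c in range(N - k + 1)
    ) and all(
        ship_cells(r, c, k, False) & probes
        for r in range(N - k + 1) for c in range(N)
    )
    print(k, len(probes), hits_all)
```

So the "n/2 to n/3" savings in the post is exactly this: the stripe pattern for k = 3 probes a third of the board and still cannot miss a three-long ship.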
Hopefully this rambling makes sense to someone!
\"When you say best friends, it means friends forever\" Brand New
\"Best friends means I pulled the trigger
Best friends means you get what you deserve\" Taking Back Sunday
March 18th, 2003, 05:34 AM
Adaptive algorithms are a great application of what I called 'battleship logic', and I think their application to reducing runtime is a definite, realistic use, but I also posted in the general sense so it could apply to other things as well. Oh, and btw, another idea I had for sorting a large amount of numbers in a small range (integers 1-100, etc.), which I never got around to implementing and testing, would be to create a 'space' for each value and then just store the number of times that value pops up... but I'm rambling again. And btw, I think Big-O notation would be a great subject for a tutorial.
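The sorting idea at the end is essentially what's known as counting sort, and a minimal sketch of it (function name and the 1-100 range are just illustrative) looks like this: keep one slot per possible value, tally how often each value pops up, then read the tallies back out in order. It runs in O(n + range) time instead of the O(n log n) of comparison sorts, which is exactly why it pays off when the range is small.

```python
# Sketch of the "one space per value" sort described in the post,
# i.e. counting sort for integers in a small known range [lo, hi].

def counting_sort(values, lo, hi):
    counts = [0] * (hi - lo + 1)   # one 'space' for each possible value
    for v in values:
        counts[v - lo] += 1        # store how many times it pops up
    result = []
    for offset, n in enumerate(counts):
        result.extend([lo + offset] * n)  # emit each value count times
    return result

print(counting_sort([42, 7, 7, 100, 1, 42], 1, 100))
# [1, 7, 7, 42, 42, 100]
```

The catch is the memory and scan cost of the counts list: for integers 1-100 it is trivial, but for a range of billions the comparison sort wins again, which fits the thread's point about matching the algorithm to the shape of the problem.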