
Thread: Programming?????

  1. #1
    Junior Member
    Join Date
    Jun 2003


    Ok... I am a noob working from the ground up. I understand the hardware, how the pooter takes 0's and 1's and turns them into data... That took a while to master (my logic is NOT great).

    So, I wanna learn to program (prolly in C or Python), but one part of programming theory stunts any progress. I am just not getting how a computer takes the instructions in a program--being just arithmetical algorithms, functions, and whatnot--and turns them into, say, an antivirus program or a kickin' video game.

    It may not be important to most, but I learn from the ground up, and if I can't understand how a bunch of commands that look nothing like "if he is shot, he dies" accomplishes just that....


    [gloworange]\"Not all that is gold glitters, and not all those who wander are lost.\" ~ Lord of the Rings[/gloworange]

  2. #2
    Senior Member
    Join Date
    Jan 2003
    The computer doesn't take the instructions in your program... at least not directly. That's why you need an interpreter or a compiler. They tell the computer, in a language it understands, what you want it to do. Others can probably explain this much better than me, but I'll do my best to give you a basic idea. The computer operates on 1s and 0s (binary), like you said. Something is on or off; there's no other option. Everything a computer does is based on that. When you compile, you are converting the program to machine language (sometimes to assembly, which is then converted to machine language with an assembler).


    Machine language is the lowest-level programming language (except for computers that utilize programmable microcode). Machine languages are the only languages understood by computers. While easily understood by computers, machine languages are almost impossible for humans to use because they consist entirely of numbers. Programmers, therefore, use either a high-level programming language or an assembly language. An assembly language contains the same instructions as a machine language, but the instructions and variables have names instead of being just numbers.
    Programs written in high-level languages are translated into assembly language or machine language by a compiler. Assembly language programs are translated into machine language by a program called an assembler.

    Every CPU has its own unique machine language. Programs must therefore be rewritten or recompiled to run on different types of computers.
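    To make that concrete, here is a rough sketch of my own (the assembly in the comments is only illustrative, not the exact output of any real compiler) showing how one line of C moves down through those layers:

    #include <stdio.h>

    int main(void) {
        int a = 2, b = 3;
        int c = a + b;      /* one line of high-level C */

        /* A compiler might translate that line into assembly roughly like:
               mov eax, [a]   ; load a into a register
               add eax, [b]   ; add b to it
               mov [c], eax   ; store the result back into c
           and an assembler then turns each mnemonic into the raw numbers
           (machine code) that the CPU actually executes. */

        printf("%d\n", c);
        return 0;
    }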
    you may also wish to check out
    IT Blog: .:Computer Defense:.
    PnCHd (Pronounced Pinched): Acronym - Point 'n Click Hacked. As in: "That website was pinched" or "The skiddie pinched my computer because I forgot to patch".

  3. #3
    Senior Member n01100110
    Join Date
    Jan 2002
    Well, I'm certainly no master programmer, but like HTRegz said, you need a compiler to tell the computer exactly what we are trying to accomplish. But with programming languages like C, there are certain constructs like the if statement and the else statement. It is basically saying: if this is true, then do this. For example, this is a C program that displays output to the screen.
    #include <stdio.h>
    int main(void) {
        printf("Hello, world!\n");   /* write a line of text to the screen */
        return 0;
    }
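    And since I mentioned if and else, here is a tiny sketch of that idea too (the health value is just made up for the example):

    #include <stdio.h>

    int main(void) {
        int health = 0;              /* pretend the game set this when he got shot */

        if (health <= 0) {           /* if this is true... */
            printf("He dies.\n");    /* ...do this */
        } else {
            printf("He lives.\n");   /* otherwise do this instead */
        }
        return 0;
    }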

    I'm not here to explain how that code works, but I know just the site to get you started: download Dev C++. It's a C/C++ compiler used to make your code executable.
    Have fun and keep reading..
    "Serenity is not the absence of conflict, but the ability to cope with it."

  4. #4
    Join Date
    Jul 2002
    When I first started programming (last year, so I'm by no means a pro) I had a problem understanding how complex programs could work when all the processor can do are such simple tasks. I realized part of the answer when I thought about the speed at which a processor can do these calculations. For instance, Intel chips can do on the order of 10^9 floating-point operations per second (gigaflops). But like you, Gwyddion, I'm still hung up on how a computer can take the ones and zeros it "sees" and do anything useful with them, no matter how fast it goes. Hopefully someone else can enlighten us.
    Darwin's rollin' over in his coffin, The fittest are surviving much less often,
    Now everything seems to be reversing, And it's worsening!
    --NOFX, American Errorist

  5. #5
    AO Curmudgeon rcgreen
    Join Date
    Nov 2001
    How can an architect turn a brick into a cathedral?
    It is by building it from primitive building blocks,
    arranged in creative and complex ways.

    The computer's monitor is designed to take a stream of input
    voltages, and aim the electron beam at the screen according
    to the patterns in those voltages.

    The vid card produces the voltages, based on the state of binary signals
    sent to it by a program.

    The program is a series of binary numbers, which, when fed to
    the processor, causes the processor to flip millions of transistors
    on and off, like switches.

    Input from the keyboard, mouse, joystick, etc. modifies the order in which
    the transistors are activated.

    To understand it, you'd probably need to study early and primitive
    computers, to get basic theory, because we didn't arrive at the
    present without coming through the past.

    The most fundamental building block is the logic gate.

    It is like a switch that is turned on or off depending on the state of one or more
    input voltages. When you wire a bunch of them together, turning one on
    or off can cause a cascade of changes in the others, like a row of dominoes.
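    To see that cascade in code, here is a small sketch of my own (the gate functions are just an illustration, not from any particular textbook): two simulated gates wired together into a half adder, the little circuit that adds two one-bit numbers.

    #include <stdio.h>

    /* Two "gates" modeled as functions on 0/1 values. */
    int and_gate(int a, int b) { return a & b; }
    int xor_gate(int a, int b) { return a ^ b; }

    int main(void) {
        /* Wiring the two gates together gives a half adder:
           the XOR gate produces the sum bit, the AND gate the carry bit. */
        for (int a = 0; a <= 1; a++) {
            for (int b = 0; b <= 1; b++) {
                int sum   = xor_gate(a, b);
                int carry = and_gate(a, b);
                printf("%d + %d = carry %d, sum %d\n", a, b, carry, sum);
            }
        }
        return 0;
    }

    Stack enough of those little switches together and you get adders, then eventually a CPU.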
    I came in to the world with nothing. I still have most of it.

  6. #6
    If you start learning C (I don't know if you have), then just keep going at it and read books, lots of them; do them all over again and again.

    Eventually you'll start seeing "hey, so THAT'S why pointers are used" (that was me after reading 3 books 4 times each).

    And keep your head high: your first few programs WILL (and MUST) be really shitty/stupid/simple.

    That way you'll learn and learn. (Starting by viewing someone else's code and modifying it is NOT advised -> a lot of internet books start by picking off in the middle of a program or app.)

    And remember, YOU WON'T BE ABLE TO MAKE COOL GAMES OR APPS until you have done ADVANCED C/C++.

    So don't think you're gonna be a PROGGY in 1 day; it takes time.

    I am not a pro at PROGRAMMING, but that doesn't mean we can't be good PROGGYs; it just means we need to PRACTICE PRACTICE LEARN LEARN PRACTICE PRACTICE...........

    'Til we look like Mr. $BILL$ and are rolling in the $$; then you know you've fooled all of the world into thinking you're good, but being good is UNDERSTANDING -> nothing more.
    If you understand computers better than someone else, then you are better, but there is a
    point where you can no longer understand any more (because there's nothing more to understand);
    then ALL ARE EQUAL, so to get better then -> you need TO LEARN SOMETHING NEW, maybe another computer-related SRC.

    And then someday you'll just say: DAMN, I AM STILL NOT THE BEST!!

    SO WHEN ARE YOU THE BEST? NEVER, in my opinion.

    But keep on learning and seeking.

    Oh yeah, START WITH C and not C++ -> it's a lot easier if you struggle with CLASSES (public and private, that is).

    And learn as many languages as you can.


  7. #7
    Junior Member
    Join Date
    Jun 2003

    Thanks!!

    Thank you all very much for your input. It has proven very useful. I think I get it!!! YAY!!! All it took was one very tiny, transistor-sized light going on to get me to understand! That, and lots of practice, will aid the process too!!

    Thanks again, now I know where to go for help!

    [gloworange]\"Not all that is gold glitters, and not all those who wander are lost.\" ~ Lord of the Rings[/gloworange]

  8. #8
    Sure bro.

    Go to > a good site if you don't have $$ to buy a book.

    Search the forums for more links; usually Memory (an AO member, if you don't know him) has lots of links to these free tutorials and books, and I think (I can't remember) I got this link from him.
    There's lots more on AO, but I don't have it at the moment, because I am now learning web GRAPHICS and next will be Java,
    so GOOD LUCK.


  9. #9
    Senior Member
    Join Date
    Nov 2001
    Here's an online book:


    It has a chapter dealing with:


    This might help to enlighten you!
    Bukhari:V3B48N826 “The Prophet said, ‘Isn’t the witness of a woman equal to half of that of a man?’ The women said, ‘Yes.’ He said, ‘This is because of the deficiency of a woman’s mind.’”

  10. #10
    Senior Member Maestr0
    Join Date
    May 2003
    A computer works by performing mathematical operations on data stored in memory. 1's and 0's are all that is required for this (binary, or base-2). The fact that humans like base-10 and letters has nothing to do with the computer, so don't blame your CPU. How can 1's and 0's become programs? Take this for example:

    00011000
    00100100
    01000010
    01111110
    01000010
    01000010
    01000010
    00000000
    Congratulations, you now have a letter 'A' from an 8-bit Nintendo. Every operation performed by a CPU is hardwired into it as an instruction (this is your chip's instruction set). Each instruction, or opcode, usually accepts operands. These instructions are numbers (binary to the CPU, or hex, or octal if you're old school, it doesn't matter), as are the operands. For our sake, opcodes have been given mnemonic names to help programmers keep their sanity (I seem to remember reading that Steve Wozniak could program without a single letter). For instance, 0010.1000 (binary) is also 0x28, which some of you ASM programmers may know as the opcode "SUB", which--that's right, folks--subtracts.

    The CPU maintains a pointer to the next instruction to execute via an Instruction Pointer, or IP (or Program Counter, PC. Note: discussion of registers is beyond the scope of this post). A running program also keeps its working data on the stack, a region of memory. The stack is like a stack of dishes: the last plate placed on it is the first one taken off. This is what allows buffer-overflow attacks, aka some jackass puts more food on his plate than it can hold and it spills onto the data next to it, such as the saved address that tells the CPU where to execute next (this is of course an extremely basic explanation, as many attacks will attempt to modify the instruction pointer address to jump to their dirty plate somewhere in your stack). Of course these opcodes can not only inspect data in memory to make a decision (this is a branch, which typically means the address the IP points to will be changed based on inspection of some data) but modify this data as well. This means programs are nothing more than pushing little 1's and 0's around in memory.

    Fortunately for us, we do not have to re-invent the wheel, and much of the extremely simple plumbing, such as the Basic Input and Output System (that's right, your BIOS), has been written by poor wretches who preceded us (and were doubtlessly far smarter), enabling us to concentrate on more important things like Unreal Tournament. After a while of using all these opcodes, people decided it was terribly inconvenient to write 4 pages of assembly code to perform long division, and someone decided to just build libraries of all these basic functions and allow programmers to build programs in a way more suited to the human brain and then convert it all later: aka, a compiler was born. One of these was the BCPL compiler, a great-great-great-granddaddy of our beloved C++.
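    If you want to see that fetch-and-execute loop in something other than words, here is a little sketch of my own in C: a made-up "CPU" with four invented opcodes (the numbers mean nothing outside this example; they are not from any real instruction set), stepping through a program that is nothing but numbers sitting in memory.

    #include <stdio.h>

    /* A made-up instruction set, just for illustration:
       0 = HALT, 1 = LOAD a value into the accumulator,
       2 = ADD a value to the accumulator, 3 = PRINT the accumulator. */
    enum { HALT = 0, LOAD = 1, ADD = 2, PRINT = 3 };

    int main(void) {
        /* "Memory": a program stored as nothing but numbers. */
        int memory[] = { LOAD, 40, ADD, 2, PRINT, HALT };
        int ip  = 0;     /* instruction pointer: which number to fetch next */
        int acc = 0;     /* accumulator: the CPU's scratch value */

        for (;;) {
            int opcode = memory[ip++];            /* fetch */
            switch (opcode) {                     /* decode and execute */
                case LOAD:  acc = memory[ip++];  break;
                case ADD:   acc += memory[ip++]; break;
                case PRINT: printf("%d\n", acc); break;
                case HALT:  return 0;
            }
        }
    }

    Shuffle the numbers in memory[] around and you have a different program; at this level, that is all software is.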

    "C was originally developed in 1969-1973 at Bell Labs about the same time the UNIX operating system was being developed. Its parent was the language B and its grandparent was the language BPCL. Derived from the typeless language BCPL, it evolved a type structure; created on a tiny machine as a tool to improve a meager programming environment, it has become one of the dominant languages of today. Even today it continually evolves, C++, C#, etc are all inspired by C."

    And the rest was history. I hope someone gained some sort of insight from this piece of electronic nostalgia.

    \"If computers are to become smart enough to design their own successors, initiating a process that will lead to God-like omniscience after a number of ever swifter passages from one generation of computers to the next, someone is going to have to write the software that gets the process going, and humans have given absolutely no evidence of being able to write such software.\" -Jaron Lanier
