I had an idea for artificial intelligence, though I don't think it's original. I don't know much about AI, so I'm taking a shot in the dark. My idea comes from biology and evolution (not new in computer science).
The 'algorithm' for evolution is simple. Let's say you have a pool of cells. New cells are created only from surviving cells, and they inherit the traits of the old cells, but slightly changed. If a cell possesses a trait that allows it to survive, it will make a new cell. If a cell does not possess a trait that helps it survive, or has a bad trait, it will not make a new cell.
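That selection loop can be sketched in a few lines. This is a minimal toy, not chess yet: I'm assuming each "cell" is just a list of numbers, and the made-up fitness function rewards traits whose sum lands near a target value.

```python
import random

TARGET = 100  # arbitrary goal for this toy example

def fitness(cell):
    # Higher is better: how close the cell's traits sum to the target.
    return -abs(sum(cell) - TARGET)

def mutate(cell):
    # New cells have the traits of the old cell, but slightly different.
    return [t + random.uniform(-1, 1) for t in cell]

def evolve(pool_size=20, generations=50):
    # Start from a random pool of cells.
    pool = [[random.uniform(0, 20) for _ in range(10)]
            for _ in range(pool_size)]
    for _ in range(generations):
        # Only the surviving (fitter) half gets to reproduce.
        pool.sort(key=fitness, reverse=True)
        survivors = pool[:pool_size // 2]
        pool = survivors + [mutate(random.choice(survivors))
                            for _ in range(pool_size - len(survivors))]
    return max(pool, key=fitness)

best = evolve()
```

The only moving parts are the three functions: swap in a different fitness function and mutation rule and the same loop evolves something else entirely.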
I had an idea (though probably not original) about making an artificial intelligence that uses evolution to "make itself". Let's say you wanted to make an evolving chess AI. You would write a program that reads data from a file and interprets it as a "language". For example, 023 001 022 might mean: if there is a pawn at 2-3, move any piece to intercept it. Next, you write a next-generation program that reads the data from winning files and makes slightly random copies of them. Lastly, you would need an emulator to play different files against each other, so that only winners get to have another generation. You would start with random files and play them in the emulator until you have a good AI.
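Here is a sketch of the file-as-language part plus the next-generation step, under assumed semantics: each rule is three integers (opcode, arg1, arg2), so "023 001 022" becomes the rule (23, 1, 22), and mutation nudges the arguments by one square. The encoding and the mod-8 board coordinates are my inventions for illustration, not a worked-out design.

```python
import random

def read_genome(path):
    # Read the flat numbers from the file and group them into
    # 3-number rules: (opcode, arg1, arg2). Leftover numbers are dropped.
    with open(path) as f:
        nums = [int(tok) for tok in f.read().split()]
    return [tuple(nums[i:i + 3]) for i in range(0, len(nums) - 2, 3)]

def next_generation(winner_rules, mutation_rate=0.1):
    # Copy the winner's rules, but make them slightly random:
    # occasionally nudge an argument by one square (wrapping on an
    # assumed 8-wide board). Opcodes are left alone here.
    child = []
    for opcode, a, b in winner_rules:
        if random.random() < mutation_rate:
            a = (a + random.choice([-1, 1])) % 8
        if random.random() < mutation_rate:
            b = (b + random.choice([-1, 1])) % 8
        child.append((opcode, a, b))
    return child
```

The emulator would then just load two genomes, run a game between them with the interpreter, and hand the winner's file to `next_generation`.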
The problem I see with this is that the files might become too big! The data might eventually hard-code every possibility. The language would have to be designed carefully to abstract the data.
Another problem I see is how much space it would take to predict possible moves. Since the AI makes moves only based on experience, it would have a hard time dynamically predicting the next move. To get past this, you could make a hybrid AI by hard-coding a possible-move finder and feeding its output into the evolved part of the program.
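The hybrid split might look like this: the hand-written part enumerates candidate moves, and the evolved part only scores them. Everything below is illustrative, including the stand-in move list and the made-up feature vector per move (say, material gained, center control, king safety); the weights are what the evolution loop would actually be breeding.

```python
def legal_moves(position):
    # Stand-in for a hand-coded chess move generator. It would return
    # (move, features) pairs; these hypothetical features might be
    # [material gained, center control, king safety].
    return [(("e2", "e4"), [0, 2, 0]),
            (("g1", "f3"), [0, 1, 1]),
            (("a2", "a3"), [0, 0, 0])]

def choose_move(position, weights):
    # The evolved weights score each candidate move; the hard-coded
    # generator guarantees every candidate is at least legal.
    def score(entry):
        _, feats = entry
        return sum(w * f for w, f in zip(weights, feats))
    return max(legal_moves(position), key=score)[0]

# These weights would come out of the evolution loop, not be set by hand.
evolved_weights = [1.0, 0.5, 0.8]
best = choose_move(None, evolved_weights)
```

This way evolution never has to rediscover the rules of chess; it only has to learn which legal moves are good.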
Would anyone comment on/expand this idea?
This would make for a very interesting school project :P