I'm a computer science undergraduate at a Nigerian university, and something like google.com is the big idea I have in mind. What programming language would I need to learn to build such a structure (website)?
Every bit of detail, please.
Thanks in advance

I appreciate your thinking

Thanks bro, I'd appreciate any material or info you could help me with. I definitely can't do this alone, but I'm going to see to it that it becomes a reality.

Ok, this is one hell of a task you're thinking of undertaking.

Search engine: you will need to write a spider to scrape sites that are submitted to your search engine, plus a spider to find new sites, and a mass of servers to hold all the data those spiders bring back. I would recommend a programming language that has good support for sockets and ports if you're thinking of allowing more than HTTP requests. I have written all my spiders in Perl, but nothing on the scale of a search-engine spider. Your spider will also have to work out what each site is about, how relevant the data on that page is, and you'll probably want some ranking system like Google's PageRank.
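To give a flavour of the spidering step described above, here is a minimal sketch of link extraction, the core of any crawler. It's in Python purely for illustration (the replies in this thread recommend Perl or Java); the sample `page` HTML is made up, and a real spider would fetch pages over HTTP, respect robots.txt, and queue each discovered link for crawling.

```python
# Minimal sketch of a spider's link-extraction step (illustrative only).
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href attribute of every <a> tag seen in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A stand-in for a fetched page; a real spider would download this via HTTP.
page = '<html><body><a href="/about">About</a> <a href="http://example.com">Ext</a></body></html>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)
```

Each extracted link would go onto a crawl queue, and the loop repeats: fetch, extract, enqueue.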

Displaying results: server-side code that searches the database for the information the spiders have brought back, and one hell of an algorithm to make sure the search term brings back information in a relevant order and only displays relevant items.
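The "search the database and rank by relevance" idea above can be sketched with an inverted index, the classic search-engine data structure. This is a toy Python illustration with made-up documents, scoring simply by term counts; a real engine would add PageRank-style link scores and far better text analysis.

```python
# Toy inverted index and relevance-ordered search (illustrative sketch).
from collections import defaultdict

docs = {
    "page1": "perl spider crawls the web",
    "page2": "search engine ranks pages by relevance",
    "page3": "the web search spider finds pages",
}

# Inverted index: term -> {doc_id: how many times the term appears}
index = defaultdict(dict)
for doc_id, text in docs.items():
    for term in text.split():
        index[term][doc_id] = index[term].get(doc_id, 0) + 1

def search(query):
    """Score each document by summed term counts; return best-first."""
    scores = defaultdict(int)
    for term in query.split():
        for doc_id, count in index.get(term, {}).items():
            scores[doc_id] += count
    return sorted(scores, key=scores.get, reverse=True)

print(search("web spider"))   # only pages containing the terms come back
```

Only documents containing a query term ever appear, which is the "only displays relevant items" part; the sort order is the (very crude) relevance ranking.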

As for advertising, how do you mean: sponsored links, or a directory style of adverts?

Good luck!

Thanks omol for the response. Yeah, someone else also 'prescribed' Perl, but all I've got now is the tutorials for dummies, and even that isn't sufficient. If you could help with relevant materials, it would be greatly appreciated.

Ok, I can recommend Spidering Hacks, an O'Reilly book, ISBN-13: 9780596005771. The Perl reference material is all available online. Also The Google Story.

Just a few to start. Also look at books on server-side communication and the breakdown of TCP/IP programming. I have a great book at home; when I get back I will post up the ISBN and name of it. "Programming TCP/IP, a complete reference", off the top of my head.
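While waiting on book recommendations, the socket basics those TCP/IP books cover can be tasted in a few lines. This is a hedged Python sketch (not from any of the books above): a tiny echo server on localhost and a client that sends it one message, showing the bind/listen/accept and connect/send/recv cycle that underlies HTTP and spiders alike.

```python
# Minimal TCP client/server round trip on localhost (illustrative sketch).
import socket
import threading

def echo_server(server_sock):
    """Accept one connection and echo its bytes straight back."""
    conn, _ = server_sock.accept()
    data = conn.recv(1024)
    conn.sendall(data)
    conn.close()

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0 asks the OS for any free port
server.listen(1)
port = server.getsockname()[1]

threading.Thread(target=echo_server, args=(server,)).start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello, search engine")
reply = client.recv(1024)
client.close()
print(reply.decode())
```

The same accept/recv/send loop, scaled up and speaking the HTTP protocol, is how a crawler talks to web servers.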

Hi
Actually, we are in the BETA stages of something like that. Check out http://www.mputa.com; it actually crawls the internet and does not need user input. Right now we are crawling only African ccTLDs, because crawling international TLDs like .com and .net returns lots of junk, so we are fine-tuning our scoring and filtering.
I would recommend the Java programming language as it is versatile, and if you are not ready to sit down and code for a very long time, do it as a group project.

Good Luck
