I've tried inurl:http but it takes forever to get even a handful of sites right, and I have to think of new keywords every time to find more. Is there some kind of directory, or a script I could use, to filter the plain-HTTP sites out of all the sites on the web? Or some sort of software?


All http sites in the world? Do you have any idea how many terabytes (zettabytes?) that is?

commented: Thanks, James. One last thing: if that won't work, can I create a script which could filter out all the HTTPS results from a set of search results?
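
On the script question: once you already have a list of candidate URLs (exported from a search API, a link dump, whatever), filtering them is doable. Here is a rough Python sketch, assuming the third-party requests library is installed; the is_plain_http helper and the sample host list are just placeholders. It keeps only hosts that answer on plain HTTP and don't immediately redirect you to HTTPS:

```python
import requests

def is_plain_http(host, timeout=5):
    """Return True if the host answers on plain HTTP and does not
    immediately redirect the request over to HTTPS."""
    try:
        # Don't follow redirects; we want to see where the site points us.
        resp = requests.get(f"http://{host}", timeout=timeout,
                            allow_redirects=False)
    except requests.RequestException:
        return False          # unreachable, DNS failure, timeout, etc.
    if resp.is_redirect and resp.headers.get("Location", "").startswith("https://"):
        return False          # the site exists but pushes visitors to HTTPS
    return resp.ok

hosts = ["example.com", "neverssl.com"]   # placeholder list; use your own export
print([h for h in hosts if is_plain_http(h)])
```

Note this only checks the landing behaviour: a site can still offer HTTPS on port 443 even if its HTTP front door doesn't redirect, so tighten the check if that distinction matters to you.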

I provided you a link to a cached version of the Open Directory Project, which attempted to classify and organize every worthwhile website in existence. The project itself no longer exists.

There are also tools such as Ahrefs and Moz that attempt to index every page and track how it ranks in Google search results, for SEO research and improvement.

I have a follow-up question for Ayush_5. Beyond the issue raised by JamesCherrill, are you looking for dark web listings as well? That is the part of the Internet not accessible to traditional web search engines, also known as the deep web.

Maybe you could share what you're really trying to do. If you want to build your own index of all sites, you'd end up constructing a small warehouse to house the server farm that searches your databases; that's the only way to drive down the delay you mentioned up top.
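
To give a sense of that scale, here is a toy sketch of the fetch loop at the heart of any crawler (Python again, assuming requests is installed; the crawl function, the regex link extraction, and the seed URL are illustrative only). A real index adds robots.txt handling, politeness delays, deduplication, and persistent storage on top of this, which is where the warehouse full of servers comes in:

```python
import re
from collections import deque
from urllib.parse import urljoin

import requests

def crawl(seed, max_pages=50):
    """Tiny breadth-first crawl starting from one seed URL.
    Illustrative only: no robots.txt, no politeness delay, no storage."""
    seen, queue, pages = {seed}, deque([seed]), []
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=5)
        except requests.RequestException:
            continue
        pages.append((url, resp.status_code))
        # Crude link extraction; a real crawler would use an HTML parser.
        for href in re.findall(r'href="(http[^"]+)"', resp.text):
            link = urljoin(url, href)
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return pages

if __name__ == "__main__":
    for url, status in crawl("http://example.com", max_pages=5):
        print(status, url)
```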

Thank you guys so much for all the help. I think I got my answer.
