I’ve tried the `inurl:http` Google dork, but it takes forever to find even a handful of sites, and I have to think of new keywords every time. Is there some kind of directory, or a script I could use to filter out the HTTP-only sites from the rest of the web? Or some sort of software?
There is a tool called masscan which can scan the entire IPv4 address space, i.e. everything from “0.0.0.0” to “255.255.255.255”. Using it you can easily build a list of hosts that answer on port 80, i.e. plain HTTP. But you would need a gigabit uplink to the internet, along with some serious processing power, to pull off a scan that massive.
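To turn a scan like that into a site list, you’d parse masscan’s machine-readable output. Here’s a minimal sketch; the command in the comment and the sample records are assumptions modeled on masscan’s `-oJ` JSON output format (the sample is hand-written stand-in data, not real scan results):

```python
# Sketch: extract hosts with port 80 open from masscan-style JSON output.
# A command that could produce such a file (needs root; scan only ranges
# you are allowed to scan) might look like:
#   masscan 0.0.0.0/0 -p80 --rate 100000 -oJ scan.json
import json

# Hand-written stand-in for masscan -oJ output (documentation addresses).
sample_output = """[
{"ip": "203.0.113.10", "timestamp": "1700000000",
 "ports": [{"port": 80, "proto": "tcp", "status": "open"}]},
{"ip": "203.0.113.25", "timestamp": "1700000001",
 "ports": [{"port": 80, "proto": "tcp", "status": "open"}]}
]"""

def hosts_with_open_port(raw: str, port: int = 80) -> list[str]:
    """Return IPs whose record shows the given port open."""
    records = json.loads(raw)
    return [r["ip"] for r in records
            if any(p["port"] == port and p["status"] == "open"
                   for p in r.get("ports", []))]

print(hosts_with_open_port(sample_output))
# → ['203.0.113.10', '203.0.113.25']
```

Note that an open port 80 only means something is listening there; you’d still have to request each host to confirm it actually serves a website over plain HTTP rather than redirecting to HTTPS.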
This is a great way to get started. I would also suggest using a web scraping tool, provided it lets you filter the results down to HTTP only.
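Even if a scraper has no built-in HTTP filter, you can filter its exported URL list yourself. A short sketch, assuming the tool can dump crawled URLs as a plain list:

```python
# Sketch: keep only URLs whose scheme is plain, unencrypted HTTP.
from urllib.parse import urlparse

# Example input, as a scraper might export it (hypothetical URLs).
urls = [
    "http://example.com/page",
    "https://secure.example.org/",
    "http://test.example.net:8080/index.html",
    "ftp://files.example.com/pub",
]

def http_only(urls: list[str]) -> list[str]:
    """Return only the URLs served over the http:// scheme."""
    return [u for u in urls if urlparse(u).scheme == "http"]

print(http_only(urls))
# → ['http://example.com/page', 'http://test.example.net:8080/index.html']
```

This sidesteps the missing-filter problem entirely: let the scraper collect everything, then post-filter the list.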
I have tried a bunch of them; most of them didn’t have an HTTP filter.