I was looking for a tutorial or any example of creating a web crawler when I found this code somewhere, and I copied and pasted it to test it:
First, it is a web crawler, right? Because when I gave it the URL of a website, the output was some links printed to the terminal.
Second, if you test it yourself, you will see that the links are divided into sections with the title
Scanning depth 1 web and so on (the number changes). What is that for? What does it mean? What does the depth number mean?
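My guess is that "depth" counts how many link-hops away a page is from the starting URL. Here is a minimal sketch of what I imagine a depth-limited breadth-first crawl looks like; the link graph is a hard-coded dictionary standing in for real pages, since in the real crawler those edges would come from fetching and parsing HTML:

```python
# Hypothetical link graph (assumption: stands in for fetched pages).
LINKS = {
    "http://example.com/":  ["http://example.com/a", "http://example.com/b"],
    "http://example.com/a": ["http://example.com/c"],
    "http://example.com/b": [],
    "http://example.com/c": [],
}

def crawl(start, max_depth=2):
    """Breadth-first crawl: depth N means N link-hops from the start URL."""
    seen = {start}
    frontier = [start]
    found = []
    for depth in range(1, max_depth + 1):
        print(f"Scanning depth {depth} web")
        next_frontier = []
        for url in frontier:
            for link in LINKS.get(url, []):
                if link not in seen:
                    seen.add(link)
                    found.append((depth, link))
                    print(link)
                    next_frontier.append(link)
        frontier = next_frontier  # pages one hop further out
    return found

results = crawl("http://example.com/")
```

If that guess is right, "Scanning depth 2 web" would just mean the crawler is now following links found on the depth-1 pages.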
Third, I want to send exactly everything that gets printed to the terminal into a text file, so where should I put this code:
with open('file.txt', 'w') as f: f.write()
And what should I type inside the f.write() call?
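Instead of filling in f.write() by hand, here is a minimal sketch of one approach I imagine might work: Python's contextlib.redirect_stdout captures everything print() would send to the terminal and writes it to the file instead (the printed URL below is just a placeholder for the crawler's output):

```python
import contextlib

# Assumption: wrapping the crawler's printing code in this block
# would send its terminal output into file.txt instead.
with open('file.txt', 'w') as f:
    with contextlib.redirect_stdout(f):
        print("http://example.com/some-link")  # placeholder for crawler prints

# Read the file back to check what was captured.
with open('file.txt') as f:
    content = f.read()
```

Is something like this the right place to put it, or should the write happen inside the crawler's own loop?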
And finally, I have a request:
could you please explain each line of the code, if you are familiar with it? Even a few lines of explanation would be really helpful, because I don't understand it clearly and I want to learn it well. It's only a request, and I will be happy if you help me understand it.
Thank you in advance :)