5 Contributors · 4 Replies · 14 Views · 5-Year Discussion Span · Last Post by JeoSaurus

"Call" a web page? Do you mean render it as in a browser, or download it to your system? There are several commands for the latter, such as wget:

wget page-address

If you need a simple fetch, wget is recommended. If you need more control, use curl, which supports cookies, sending/posting data, and other magic. If you need to interact with the page as a human would, use lynx. More detail on exactly what you're trying to do from the shell script would help with a more specific recommendation.
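To illustrate the extra control curl gives you, here is a rough sketch of the cookie and POST handling mentioned above. The URL and the form field names are made-up placeholders, not anything from this thread:

```shell
# Fetch a page quietly (-s) and save any cookies the server sets (-c).
curl -s -c cookies.txt https://example.com/login

# POST form data (-d) while sending those saved cookies back (-b).
# 'user' and 'pass' are hypothetical field names for illustration only.
curl -s -b cookies.txt -d 'user=me' -d 'pass=secret' https://example.com/do-login

# Follow redirects (-L) and write the final page to a file (-o).
curl -s -L -o page.html https://example.com/
```

The cookie jar (-c to write, -b to read back) is what lets a script stay "logged in" across several requests, which plain wget fetches don't give you as directly.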

If you want to get data from a web page and do something with it, there are lots of good options.

Depending on what you want to do, one of my favorites is wget. Here's an example:

wget -qO- ipchicken.com

This gets you the raw HTML from ipchicken.com. You can easily parse this if you know what you're looking for. For instance, if you just want to get your IP address:

wget -qO- ipchicken.com | grep -Eo '([0-9]{1,3}\.){3}[0-9]{1,3}'
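The same idea works for any field you can match with a pattern. As a sketch, here's how you might pull a page title out of raw HTML with sed; the sample HTML below is made up so the example runs offline:

```shell
# Sample HTML standing in for a real page (made up for illustration).
html='<html><head><title>IP Chicken</title></head><body>...</body></html>'

# Print whatever sits between <title> and </title>.
# Handles simple single-line title tags; real-world HTML can be messier.
echo "$html" | sed -n 's/.*<title>\(.*\)<\/title>.*/\1/p'
# prints: IP Chicken
```

In a real script you'd feed it the wget output instead of the $html variable, e.g. wget -qO- some-site | sed -n '...'.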

Now, if you want output that is a little more human-readable, look into 'lynx' or 'elinks' with their 'dump' options, which give you formatted text without the HTML tags.
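If lynx or elinks isn't installed, a rough approximation is to strip the tags yourself. This sketch loses all layout and ignores entities, so treat it as a fallback, not a replacement for a real dump:

```shell
# Delete anything that looks like an HTML tag. Crude: it ignores
# entities, scripts, and tags split across lines, but is often enough.
printf '<p>Hello <b>world</b></p>\n' | sed -e 's/<[^>]*>//g'
# prints: Hello world
```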

I hope this helps!
-G
