
Hi,

I am looking to write a web scraper, and have considered the following languages:
Python
C++
Java

The scraper will need to:
- Retrieve HTML code from a page
- Select a link, name and description from a section of the page
- Ask for user confirmation (non-GUI for now, maybe a GUI later) before processing the link
- If the user confirms, pass the link to a Linux program that is run by calling:
Code:
```
linkprocessor -l http://link.goes.here/
```
- If possible, capture the output that linkprocessor would normally print to the terminal, display it in the terminal session running this application, and check whether the link was processed successfully (see the sketch after this list).
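
To make the last two points concrete, here is a rough sketch of how I imagine the confirmation and processing step, written in Python purely for illustration (only `linkprocessor -l` is the real command; the function name, prompt, and the assumption that a non-zero exit code means failure are all mine):

```python
import subprocess

def process_link(link):
    # Ask for confirmation on the terminal (no GUI for now)
    answer = input(f"Process {link}? [y/N] ")
    if answer.strip().lower() != "y":
        return False

    # Run linkprocessor and capture what it would normally print to the terminal
    result = subprocess.run(
        ["linkprocessor", "-l", link],
        capture_output=True,
        text=True,
    )
    print(result.stdout, end="")   # show the captured output in this session
    if result.returncode != 0:     # assuming linkprocessor returns non-zero on failure
        print(result.stderr, end="")
        return False
    return True
```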

I have looked at the libraries JSoup (Java), BeautifulSoup (Python), and curl (C++), and have seen no complications there. I am quite new to programming anything that interacts with the terminal.
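
For the scraping side, this is roughly what I have in mind with requests + BeautifulSoup (the URL, tags, and class names below are placeholders, since the real page structure will decide what to select):

```python
import requests
from bs4 import BeautifulSoup

page = requests.get("http://example.com/listing")   # placeholder URL
soup = BeautifulSoup(page.text, "html.parser")

for item in soup.select("div.item"):                 # hypothetical section markup
    link = item.find("a")["href"]
    name = item.find("a").get_text(strip=True)
    description = item.find("p").get_text(strip=True)
    print(name, link, description)
```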

The app will need to run on Linux and macOS (it can be compiled separately for each, I don't mind that).

Out of interest, which language and library would you guys recommend for this? Please back up your selection with a reason or two.

Thanks