On 13 Dec 2002, Kyle Faber wrote:
> All,
>
> Fooling around with a little bash script and I was wondering if it is
> possible to access web-based information utilizing the cli. For example, how
> could I get the following to actually work:
>
> grep "some string" http://www.somesite.com/textfile.txt
>
> the script is just not complicated enough to warrant anything more than
> bash.
> Anybody have any ideas?
You've gotten a couple of answers already.
Look at curl, wget, links and lynx. All will suck web pages down and toss
them to STDOUT or a file in some way. links and lynx will give you parsed
pages.
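For example, the one-liner you wanted becomes fetch-to-stdout piped into
grep. A minimal sketch (www.somesite.com is the placeholder host from your
question, so the curl line is shown in a comment and the pipeline is
demonstrated against a local stand-in for the fetched file):

```shell
# With a real URL the whole thing is one pipeline:
#   curl -s http://www.somesite.com/textfile.txt | grep "some string"
#   wget -q -O - http://www.somesite.com/textfile.txt | grep "some string"
#   lynx -dump http://www.somesite.com/textfile.txt | grep "some string"
# (-s silences curl's progress meter; -O - tells wget to write to stdout;
#  lynx -dump gives you the rendered page rather than raw HTML.)

# Same pipeline shape, fed from a local stand-in for the remote file:
printf 'first line\nsome string appears here\nlast line\n' \
  | grep "some string"
```

Any of the three fetchers works in that slot; pick curl or wget for raw
files and lynx/links when you want the HTML already rendered.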
Most of curl's functionality has been split out into libs, so there are
hooks for several languages.
At least curl and links will allow you to set things like cookies.
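With curl that's the -c/-b pair from its manual: -c writes cookies the
server sets to a "cookie jar" file, -b sends them back on the next request.
A hypothetical session sketch (host and paths are the question's
placeholders, wrapped in a function so nothing hits the network here):

```shell
# -c FILE  save cookies the server sets ("cookie jar")
# -b FILE  read cookies from FILE and send them with the request
# Hypothetical two-step session: log in, then fetch a page that needs
# the session cookie and grep it.
fetch_with_cookies() {
    curl -s -c jar.txt http://www.somesite.com/login
    curl -s -b jar.txt http://www.somesite.com/textfile.txt | grep "some string"
}
```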
ciao,
der.hans
-- 
# https://www.LuftHans.com/ http://www.TOLISGroup.com/
# ... make it clear I support "Free Software" and not "Open Source",
# and don't imply I agree that there is such a thing as a
# "Linux operating system". - rms