I have inserted the scsi_debug module into my kernel and now have SCSI
devices, but I don't think any of them are mapped to my CD burner. Not only
that, it seems to register ALL of the SCSI device nodes in /dev/ on the
system. Is there a module I'm missing that I should have loaded?
lsmod reveals:
es1371
soundcore
via-rhine
serial
memstat
ide-scsi
scsi_debug
nls_iso8859-1
Thanks
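(As a side note, scsi_debug only simulates fake test devices for exercising
the SCSI subsystem, so it will not expose the burner; with ide-scsi the drive
normally has to be handed over to the SCSI layer at boot. A rough sketch of
how that is usually wired up and checked; hdc and the lilo.conf entry are
assumptions, not taken from the message above:

# In /etc/lilo.conf, hand the burner to ide-scsi (hdc is a guess; use the
# IDE position the drive actually sits on), then rerun lilo and reboot:
#   append="hdc=ide-scsi"
# Afterwards, see what the SCSI layer reports:
cat /proc/scsi/scsi
# cdrecord can probe the emulated bus as well:
cdrecord -scanbus
)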
> -----Original Message-----
> From: J.L.Francois [mailto:frenchie@magusnet.gilbert.az.us]
> Sent: Friday, June 02, 2000 12:29 PM
> To: plug-discuss@lists.PLUG.phoenix.az.us
> Subject: Re: Automatic FTP
>
>
> It seems like on Fri, Jun 02, 2000 at 11:53:45AM -0700, Ryan
> Denke scribbled:
> Orig Msg> I'm looking for a way to set up a cron job to run a shell
> Orig Msg> script that automatically connects to another server, logs
> Orig Msg> in with a username and password, and downloads the same
> Orig Msg> file every day.
> Orig Msg>
> Orig Msg> I've read the man pages for FTP, but it just doesn't seem
> Orig Msg> that you can pass it a username, a password, and get
> Orig Msg> commands and make it all work. Anyone have any ideas or
> Orig Msg> suggestions?
> Orig Msg>
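As an aside (this is not from the thread): the stock ftp client can in fact
be scripted non-interactively with a here-document. A rough sketch, where the
host, login, password, and paths are all placeholders:

#!/bin/sh
# Daily fetch of one file over FTP; meant to be called from cron.
# The host, login, password, and paths below are placeholders.
ftp -n -i ftp.example.com <<EOF
user someuser somepassword
cd /pub/reports
get daily.dat /tmp/daily.dat
bye
EOF

The credentials can also go in ~/.netrc (mode 600) instead of sitting in the
script; in that case drop -n so ftp reads it.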
>
> GNU Wget
>
> Wget is a utility designed for retrieving binary documents
> across the Web, through the use of HTTP (Hyper Text Transfer
> Protocol) and FTP (File Transfer Protocol), and saving them
> to disk. Wget is non-interactive, which means it can work
> in the background while the user is not logged in, unlike
> most web browsers (thus you may start the program and log
> off, letting it do its work). By analyzing server responses,
> it distinguishes between correctly and incorrectly retrieved
> documents, and retries retrieving them as many times as
> necessary, or until a user-specified limit is reached. REST
> is used in FTP on hosts that support it. Proxy servers are
> supported to speed up retrieval and lighten network load.
>
> Wget supports a full-featured recursion mechanism, through
> which you can retrieve large parts of the web, creating
> local copies of remote directory hierarchies. Of course,
> the maximum level of recursion and other parameters can be
> specified. Infinite recursion loops are always avoided by
> hashing the retrieved data. All of this works for both
> HTTP and FTP.
>
> ftp://ftp.gnu.org/gnu/wget/
> ftp://gnjilux.cc.fer.hr/pub/unix/util/wget/
> ftp://prep.ai.mit.edu/pub/gnu/wget/
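
To tie that back to the original question, a single wget command in a
crontab entry is usually enough. A sketch with placeholder host, credentials,
and paths:

# Fetch the file at 03:00 every day; host, user, password, and paths
# are placeholders.
0 3 * * * wget -q -N -P /home/me/incoming ftp://someuser:somepassword@ftp.example.com/pub/reports/daily.dat

The -N flag only re-fetches the file when the remote copy is newer; leave it
off to download unconditionally.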
>
> JLF Sends...
> Behold, the Internet is the greatest sum of information at mankind's
> fingertips since the Library of Alexandria. Despite this vast
> storehouse of knowledge at our disposal, there are still those who
> will send urban legends and blatantly false information to mailing
> lists and newsgroups without making even the slightest effort to
> check their legitimacy. At every occurrence this proves to me that
> every node, wire, and server I help connect to the Internet to widen
> its expanse for the benefit of the masses is a complete waste of
> time. ( J. Francois )
>
>
> _______________________________________________
> Plug-discuss mailing list - Plug-discuss@lists.PLUG.phoenix.az.us
> http://lists.PLUG.phoenix.az.us/mailman/listinfo/plug-discuss
>