* urlget
@ 2002-10-15 10:21 jb1
2002-10-15 13:45 ` urlget Pat Gilliland
0 siblings, 1 reply; 5+ messages in thread
From: jb1 @ 2002-10-15 10:21 UTC (permalink / raw)
To: linux-8086
Several people have asked how to use urlget, but since no one answered I
chased it down on the web and found urlget-3.12.tar.gz. I don't remember
exactly where it came from, but it was one of the hits from a Google
search for:
+urlget +tar.gz
The command:
urlget http://192.168.1.100 > urlget.test
issued from my Linux installation created a local copy of the default
webpage as "urlget.text". There's a bug in my ELKS installation that
stalls the webserver if the webpage file length is larger than 99 bytes;
if the command stalls for you, try shortening the file.
Notes:
1. I found it necessary to create an entry for 192.168.1.100 (my ELKS
installation) in my Linux installation's /etc/hosts; otherwise urlget was
unable to resolve the IP address.
2. The local file is a literal copy, not a rendered version of the webpage
file.
3. The redirection is necessary to create a local file; otherwise the
output goes to the screen (presumably stdout).
4. This is for Linux, not ELKS, and has many more command line switches.
5. The HOMEPAGE URL shown below seems to be dead.
Here's a copy of the README file:
-------- cut -------- cut -------- cut -------- cut -------- cut --------
_ _
_ _ _ __| | __ _ ___| |_
| | | | '__| |/ _` |/ _ \ __|
| |_| | | | | (_| | __/ |_
\__,_|_| |_|\__, |\___|\__| - Gets your URL!
|___/
NAME
urlget - get a file from an FTP, GOPHER or HTTP server.
SYNOPSIS
urlget [options] <url>
DESCRIPTION
urlget is a client to get documents/files from servers, using any of the
supported protocols. The command is designed to work without any user
interaction.
OPTIONS
The following options may be specified at the command line:
-d <data> (HTTP ONLY)
Sends the specified data in a POST request to the HTTP server. Note
that the data is sent exactly as specified with no extra processing.
The data is expected to be "urlencoded".
-e <url> (HTTP ONLY)
Sends the "Referer Page" information to the HTTP server. Some badly
written CGIs fail if it is not set.
-f (HTTP ONLY)
Fail silently (no output at all) on server errors. This is mostly useful
for letting scripts deal cleanly with failed attempts. Normally, when an
HTTP server fails to deliver a document, it returns an HTML document
saying so (often describing why in more detail). This flag prevents
urlget from outputting that document and makes it fail silently instead.
-i (HTTP ONLY)
Include the HTTP-header in the output. The HTTP-header includes things
like server-name, date of the document, HTTP-version and more...
-I (HTTP ONLY)
Fetch the HTTP-header only! HTTP servers support the HEAD command,
which this option uses to get nothing but the header of a document.
-k (HTTP ONLY)
Use a Keep-Alive connection. There is currently no real gain in using this.
-l (FTP ONLY)
When listing an FTP directory, this switch forces a name-only view.
Especially useful if you want to machine-parse the contents of an FTP
directory since the normal directory view doesn't use a standard look
or format.
-m <seconds>
Maximum time in seconds that you allow the whole operation to take.
This is useful for preventing your batch jobs from hanging for hours
due to slow networks or links going down.
-o <file>
Write output to <file> instead of stdout.
-O
Write output to a local file named like the remote file we get. (Only
the file part of the remote file is used, the path is cut off.)
-r <range> (HTTP ONLY)
Retrieve a byte range (i.e. a partial document) from an HTTP/1.1
server. Ranges can be specified in a number of ways.
0-499 - specifies the first 500 bytes
500-999 - specifies the second 500 bytes
-500 - specifies the last 500 bytes
9500- - specifies the bytes from offset 9500 and forward
0-0,-1 - specifies the first and last byte only(*)
500-700,600-799 - specifies 300 bytes from offset 500
100-199,500-599 - specifies two separate 100 bytes ranges(*)
(*) = NOTE that this will cause the server to reply with a multipart
response!
-p <port>
Use a port other than the default for the current protocol. This is
typically used together with the proxy flag (-x).
-s
Silent mode. Don't show progress meter or error messages. Makes
Urlget mute.
-t (FTP only)
Transfer the stdin data to the specified file. Urlget will read
everything from stdin until EOF and store with the supplied name.
-T <file> (FTP only)
Like -t, but this transfers the specified local file. If there is no
file part in the specified URL, Urlget will append the local file
name. NOTE that you must use a trailing / on the last directory to
make clear to Urlget that there is no file name, or urlget will treat
your last directory name as the remote file name to use. That will
most likely cause the upload operation to fail.
-u <user:password>
Specify user and password to use when fetching. See below for detailed
examples of how to use this.
-U <user:password>
Specify user and password to use for Proxy authentication.
-v
Makes the fetching more verbose/talkative. Mostly useful for
debugging. Lines starting with '>' show data sent by urlget, lines
starting with '<' show data received by urlget that is hidden in normal
cases, and lines starting with '*' show additional info provided by urlget.
-x <host>
Use a proxy. The port number defaults to 1080 when this flag is used
and the port flag (-p) is not.
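The interplay of -p and -x above can be sketched as a small port-selection
rule. This is an illustrative Python sketch, not urlget's actual code; the
function name is made up, and the protocol defaults are the standard
well-known ports (HTTP 80, FTP 21, Gopher 70).

```python
# Sketch of the port-selection rule described by -p and -x:
# an explicit -p always wins, otherwise a proxy (-x) defaults to 1080,
# otherwise the protocol's well-known port is used.
def effective_port(scheme, port_flag=None, use_proxy=False):
    defaults = {"http": 80, "ftp": 21, "gopher": 70}
    if port_flag is not None:   # -p <port> always wins
        return port_flag
    if use_proxy:               # -x without -p falls back to 1080
        return 1080
    return defaults[scheme]     # plain fetch: protocol default
```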
SIMPLE USAGE
Get the main page from netscape's web-server:
urlget http://www.netscape.com/
Get the root README file from funet's ftp-server:
urlget ftp://ftp.funet.fi/README
Get a gopher document from funet's gopher server:
urlget gopher://gopher.funet.fi
Get a web page from a server using port 8000:
urlget http://www.weirdserver.com:8000/
Get a list of the root directory of an FTP site:
urlget ftp://ftp.fts.frontec.se/
DOWNLOAD TO A FILE
Get a web page and store in a local file:
urlget -o thatpage.html http://www.netscape.com/
Get a web page and store it in a local file, giving the local file the
name of the remote document (this fails if the URL has no file name
part):
urlget -O http://www.netscape.com/index.html
USING PASSWORDS
FTP
To ftp files using name+passwd, include them in the URL like:
urlget ftp://name:passwd@machine.domain:port/full/path/to/file
or specify them with the -u flag like
urlget -u name:passwd ftp://machine.domain:port/full/path/to/file
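The name:passwd@machine.domain form above can be sketched with a quick
parse. Python's urllib.parse is used purely for illustration (urlget
itself is a 1998 C program), and the host and port are hypothetical.

```python
# Sketch of how credentials embedded in an ftp:// URL break down into
# their components; the host and port here are hypothetical examples.
from urllib.parse import urlsplit

url = "ftp://name:passwd@ftp.example.com:2121/full/path/to/file"
parts = urlsplit(url)
print(parts.username, parts.password, parts.hostname, parts.port, parts.path)
```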
HTTP
The HTTP URL specification doesn't support user and password in the URL
string. Urlget supports them anyway, to provide an ftp-style interface,
so you can fetch a file like:
urlget http://name:passwd@machine.domain/full/path/to/file
or specify user and password separately like in
urlget -u name:passwd http://machine.domain/full/path/to/file
NOTE! Since HTTP URLs don't support user and password, you can't use that
style when using Urlget via a proxy. You _must_ use the -u style fetch
in such circumstances.
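Assuming the HTTP server asks for standard HTTP Basic authentication (the
README doesn't say which schemes urlget handles), the -u user:passwd pair
turns into an Authorization header roughly like this sketch; the helper
function name is made up.

```python
# Sketch of the Authorization header an HTTP client builds from a
# user:password pair under HTTP Basic auth: base64 of "user:password".
import base64

def basic_auth_header(user, password):
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Authorization: Basic {token}"

print(basic_auth_header("name", "passwd"))
```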
GOPHER
Urlget features no password support for gopher.
PROXY
Get a web page using a proxy named my-proxy that uses port 888:
urlget -p 888 -x my-proxy http://www.web.com/
Get an ftp file using a proxy named my-proxy that uses port 888:
urlget -p 888 -x my-proxy ftp://ftp.leachsite.com/README
Get a file from a HTTP server that requires user and password, using the
same proxy as above:
urlget -u user:passwd -p 888 -x my-proxy http://www.get.this/
RANGES
HTTP/1.1 introduced byte ranges, which let a client request only one or
more subparts of a specified document. Urlget supports this with the -r
flag.
Get the first 100 bytes of a document:
urlget -r 0-99 http://www.get.this/
Get the last 500 bytes of a document:
urlget -r -500 http://www.get.this/
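The single-range specs above follow HTTP/1.1 byte-range semantics
(inclusive offsets), which can be sketched as follows. The helper is
hypothetical and ignores multipart specs like 0-0,-1 and out-of-bounds
clamping.

```python
# Sketch of which (first, last) byte positions a single -r spec selects
# in a document of the given length, per HTTP/1.1 inclusive ranges.
def resolve_range(spec, length):
    first, last = spec.split("-")
    if first == "":                    # "-500": the last N bytes
        n = int(last)
        return (length - n, length - 1)
    if last == "":                     # "9500-": from offset to the end
        return (int(first), length - 1)
    return (int(first), int(last))     # "0-499": explicit inclusive range
```

For a 10000-byte document, "0-99" selects bytes 0 through 99 (the first
100 bytes) and "-500" selects bytes 9500 through 9999 (the last 500).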
UPLOADING
FTP
Upload all data on stdin to a specified ftp site:
urlget -t ftp://ftp.upload.com/myfile
Upload data from a specified file, login with user and password:
urlget -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile
Upload a local file to the remote site, using the local file name
remotely too:
urlget -T uploadfile -u user:passwd ftp://ftp.upload.com/
HTTP
See the POST section below.
VERBOSE / DEBUG
If urlget fails where it isn't supposed to, if the servers don't let you
in, or if you can't understand the responses: use the -v flag to get
VERBOSE fetching. Urlget will output lots of info and all data it sends
and receives, letting you see all client-server interaction.
urlget -v ftp://ftp.upload.com/
POST
It's easy to post data using urlget, via the -d <data> option. The post
data must be urlencoded.
Post a simple "name" and "phone" guestbook entry:
urlget -d "name=Rafael%20Sagula&phone=3320780" http://www.where.com/guest.cgi
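Since -d sends the data exactly as given, you must urlencode it yourself.
A sketch of producing the body used in the example above, with Python's
urllib.parse standing in for whatever tool you actually use:

```python
# Sketch of building the urlencoded body that -d expects; urlget itself
# does no encoding for you. quote_via=quote encodes spaces as %20
# (the default quote_plus would use '+').
from urllib.parse import urlencode, quote

fields = {"name": "Rafael Sagula", "phone": "3320780"}
body = urlencode(fields, quote_via=quote)
print(body)  # name=Rafael%20Sagula&phone=3320780
```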
REFERER
An HTTP request can include information about which address referred the
user to the actual page, and urlget allows you to specify that referrer
on the command line. This is especially useful for fooling servers or
CGI scripts that rely on that information being available or containing
certain data.
Example:
urlget -e www.coolsite.com http://www.showme.com/
BUGS
Nope. :-) But don't try excessive string lengths; we don't do many
boundary checks, so those will without doubt crash it.
VERSION
This document attempts to describe the usage of urlget 3.10.
Written by Daniel Stenberg 1998-02-04.
AUTHORS
Daniel Stenberg <Daniel.Stenberg@sth.frontec.se>
Rafael Sagula <sagula@inf.ufrgs.br>
Bjorn Reese <breese@imada.ou.dk>
Johan Anderson <johan@homemail.com>
Kjell Ericson <Kjell.Ericson@sth.frontec.se>
HOMEPAGE
http://www.inf.ufrgs.br/~sagula/urlget.html
-------- cut -------- cut -------- cut -------- cut -------- cut --------
* Re: urlget
2002-10-15 10:21 urlget jb1
@ 2002-10-15 13:45 ` Pat Gilliland
2002-10-15 14:04 ` urlget Paul Nasrat
2002-10-16 8:22 ` urlget jb1
0 siblings, 2 replies; 5+ messages in thread
From: Pat Gilliland @ 2002-10-15 13:45 UTC (permalink / raw)
To: Linux-8086@vger.kernel.org
I am assembling an unofficial Clueless Newbie ELKS faq at
http://www.cyberus.ca/~pgillil/elks.html I will add this information
very shortly. If anyone else has any undocumented tips/information I
would be happy to put them up.
Pat Gilliland
jb1@btstream.com wrote:
> [original message and README quoted in full]
--
This communication is intended to be received only by the individual
or entity to whom or to which it is addressed and contains information
that is privileged and confidential. Any unauthorized use, copying,
review or disclosure is prohibited. Please notify the sender immediately
if you have received this communication in error.
Contents of this message Copyright 2002 Patrick Gilliland
* Re: urlget
2002-10-15 13:45 ` urlget Pat Gilliland
@ 2002-10-15 14:04 ` Paul Nasrat
2002-10-15 14:22 ` urlget Gregg C Levine
2002-10-16 8:22 ` urlget jb1
1 sibling, 1 reply; 5+ messages in thread
From: Paul Nasrat @ 2002-10-15 14:04 UTC (permalink / raw)
To: Linux-8086@vger.kernel.org
On Tue, Oct 15, 2002 at 09:45:30AM -0400, Pat Gilliland wrote:
> I am assembling an unofficial Clueless Newbie ELKS faq at
> http://www.cyberus.ca/~pgillil/elks.html I will add this information
> very shortly. If anyone else has any undocumented tips/information I
> would be happy to put them up.
The root disk howto from my earlier email might be useful. The mkfs flag
is a big gotcha.
You might want to add the link to the flightlinux page in the why
bother? section as a real *live* use of elks.
Also you could have the running on bochs information:
Here's a sample config from Gregg (I assume from the url)
http://obiwanthejediknight.home.att.net/bochsrc.htm
Here's one from the archives:
http://rainbow.cs.unipi.gr/linux-8086-list/att-1732/01-elksdistro.txt
I'll dig up my slides from the presentation I did and put them online -
you can see if you can get any useful info out of them.
Paul
* Re: urlget
2002-10-15 14:04 ` urlget Paul Nasrat
@ 2002-10-15 14:22 ` Gregg C Levine
0 siblings, 0 replies; 5+ messages in thread
From: Gregg C Levine @ 2002-10-15 14:22 UTC (permalink / raw)
To: ELKS
Hello from Gregg C Levine
And you do. Be aware that particular page has been up for a very long while,
and hasn't been revised.
Gregg C Levine obiwanthejediknight@att.net
<This signature will be replaced, pending an approval from the Jedi Council.
>
----- Original Message -----
From: "Paul Nasrat" <pauln@truemesh.com>
To: <Linux-8086@vger.kernel.org>
Sent: Tuesday, October 15, 2002 10:04 AM
Subject: Re: urlget
> On Tue, Oct 15, 2002 at 09:45:30AM -0400, Pat Gilliland wrote:
> > I am assembling an unofficial Clueless Newbie ELKS faq at
> > http://www.cyberus.ca/~pgillil/elks.html I will add this information
> > very shortly. If anyone else has any undocumented tips/information I
> > would be happy to put them up.
>
> The root disk howto from my earlier email might be useful. The mkfs flag
> is a big gotcha.
>
> You might want to add the link to the flightlinux page in the why
> bother? section as a real *live* use of elks.
>
> Also you could have the running on bochs information:
>
> Here's a sample config from Gregg (I assume from the url)
>
> http://obiwanthejediknight.home.att.net/bochsrc.htm
>
> Here's one from the archives:
>
> http://rainbow.cs.unipi.gr/linux-8086-list/att-1732/01-elksdistro.txt
>
> I'll dig up my slides from the presentation I did and put them online -
> you can see if you can get any useful info out of them.
>
> Paul
* Re: urlget
2002-10-15 13:45 ` urlget Pat Gilliland
2002-10-15 14:04 ` urlget Paul Nasrat
@ 2002-10-16 8:22 ` jb1
1 sibling, 0 replies; 5+ messages in thread
From: jb1 @ 2002-10-16 8:22 UTC (permalink / raw)
To: Pat Gilliland; +Cc: Linux-8086@vger.kernel.org
On Tue, 15 Oct 2002, Pat Gilliland wrote:
I'm glad you found this useful, but I wish you had changed the "Subject:".
It's difficult for this newbie (at least) to find information in the
mailing list archives because things tend to go so wildly off-topic. I'm
*still* looking for something about network testing in a long,
completely-unrelated thread.
Please note my typo below. "urlget.test" and "urlget.text" should, of
course, be the same filename. Feel free to change it to anything you want.
> > The command:
> > urlget http://192.168.1.100 > urlget.test
> > issued from my Linux installation created a local copy of the default
> > webpage as "urlget.text". There's a bug in my ELKS installation that