* [PATCH 0/3] http-fetch enhancements
From: Nick Hengeveld @ 2005-09-26 17:51 UTC (permalink / raw)
  To: git

The following series contains some http-fetch enhancements, based on
our requirements for use of SSL client certificates and partial HTTP
transfers.

-- 
For a successful technology, reality must take precedence over public
relations, for nature cannot be fooled.


* Re: [PATCH 0/3] http-fetch enhancements
From: Daniel Barkalow @ 2005-09-26 22:29 UTC (permalink / raw)
  To: Nick Hengeveld; +Cc: git

On Mon, 26 Sep 2005, Nick Hengeveld wrote:

> The following series contains some http-fetch enhancements, based on
> our requirements for use of SSL client certificates and partial HTTP
> transfers.

If you happen to know how to have curl do multiple simultaneous downloads, 
that would be a big performance win, and I should be able to explain how 
to get this to work. I haven't gotten around to learning libcurl well 
enough to do the flow control.

	-Daniel
*This .sig left intentionally blank*


* Re: [PATCH 0/3] http-fetch enhancements
From: Nick Hengeveld @ 2005-09-28  2:39 UTC (permalink / raw)
  To: Daniel Barkalow; +Cc: Nick Hengeveld, git

On Mon, Sep 26, 2005 at 03:29:02PM -0700, Daniel Barkalow wrote:

>    If you happen to know how to have curl do multiple simultaneous downloads,
>    that would be a big performance win, and I should be able to explain how
>    to get this to work. I haven't gotten around to learning libcurl well
>    enough to do the flow control.

The curl multi interface looks pretty straightforward.  What did you
have in mind as to which requests would run concurrently, and how
would they need to be limited?
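
For reference, the minimal skeleton I have in mind is something like
this (the URLs and the fixed two-transfer count are just for
illustration, and with no write callback set the response bodies land
on stdout):

#include <sys/select.h>
#include <curl/curl.h>

int main(void)
{
	CURLM *multi;
	CURL *easy[2];
	int running, i;
	const char *urls[2] = {
		"http://example.com/objects/one",
		"http://example.com/objects/two",
	};

	curl_global_init(CURL_GLOBAL_ALL);
	multi = curl_multi_init();

	/* One easy handle per concurrent transfer. */
	for (i = 0; i < 2; i++) {
		easy[i] = curl_easy_init();
		curl_easy_setopt(easy[i], CURLOPT_URL, urls[i]);
		curl_multi_add_handle(multi, easy[i]);
	}

	/* Drive all transfers, sleeping in select() between rounds. */
	do {
		fd_set rd, wr, ex;
		int maxfd = -1;
		struct timeval tv = { 1, 0 };

		while (curl_multi_perform(multi, &running) ==
		       CURLM_CALL_MULTI_PERFORM)
			; /* more work is immediately available */
		if (!running)
			break;
		FD_ZERO(&rd); FD_ZERO(&wr); FD_ZERO(&ex);
		curl_multi_fdset(multi, &rd, &wr, &ex, &maxfd);
		select(maxfd + 1, &rd, &wr, &ex, &tv);
	} while (running);

	for (i = 0; i < 2; i++) {
		curl_multi_remove_handle(multi, easy[i]);
		curl_easy_cleanup(easy[i]);
	}
	curl_multi_cleanup(multi);
	curl_global_cleanup();
	return 0;
}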

-- 
For a successful technology, reality must take precedence over public
relations, for nature cannot be fooled.


* Re: [PATCH 0/3] http-fetch enhancements
From: Daniel Barkalow @ 2005-09-28  3:37 UTC (permalink / raw)
  To: Nick Hengeveld; +Cc: Nick Hengeveld, git

On Tue, 27 Sep 2005, Nick Hengeveld wrote:

> On Mon, Sep 26, 2005 at 03:29:02PM -0700, Daniel Barkalow wrote:
> 
> >    If you happen to know how to have curl do multiple simultaneous downloads,
> >    that would be a big performance win, and I should be able to explain how
> >    to get this to work. I haven't gotten around to learning libcurl well
> >    enough to do the flow control.
> 
> The curl multi interface looks pretty straightforward.  What did you
> have in mind as to which requests would run concurrently, and how
> would they need to be limited?

The way fetch.c calls the functions, there's a prefetch() that indicates 
that a given object is needed, and a fetch() that is responsible for 
making sure the object is available when it returns (or returning an 
error). It is arranged such that the same list of hashes is given to each 
of the functions in the same order.
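
Schematically, the contract is (the hook names are as above; the
comments and the simplification are mine, not the actual fetch.c
declarations):

/* Hint from the walker that this object will be needed soon;
 * a backend may start or queue a download here. */
void prefetch(unsigned char *sha1);

/* Make the object available locally, blocking if necessary;
 * returns 0 on success, negative on error. */
int fetch(unsigned char *sha1);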

One method is to send requests in prefetch() and accept responses in
fetch(); this is what git-ssh-pull does now. In practice this seems
to lead to ~100 outstanding requests at the high point, which is
great for throughput, but I'm not sure how polite it is. IIRC,
browsers tend to open ~4 simultaneous connections, or at least used
to.
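
In outline, that first method is (a schematic of the pattern, not the
actual git-ssh-pull code; the command byte and read_object_reply()
are stand-ins):

#include <unistd.h>

static int fd_in, fd_out;	/* pipes from and to the remote end */

int read_object_reply(int fd, unsigned char *sha1);	/* stand-in */

void prefetch(unsigned char *sha1)
{
	char cmd = 'o';		/* stand-in "send me this object" byte */
	write(fd_out, &cmd, 1);	/* the request goes out immediately... */
	write(fd_out, sha1, 20);
}

int fetch(unsigned char *sha1)
{
	/* ...and since requests were issued in order, the next reply
	 * on fd_in belongs to this sha1; read it and store the object. */
	return read_object_reply(fd_in, sha1);
}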

The other method is to keep track of what you're fetching and block:
in prefetch(), if too many connections are already in use, until one
frees up; or in fetch(), if it is called before that object's
download is complete.
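
With the multi interface that might look like (MAX_IN_FLIGHT and all
the helpers here are invented names; step_transfers() would run one
curl_multi_perform() round and reap finished handles):

#define MAX_IN_FLIGHT 4			/* roughly what browsers use */

int active_transfers(void);		/* handles still running */
void step_transfers(void);		/* one curl_multi_perform() round */
void start_request(unsigned char *sha1); /* adds an easy handle */
int object_complete(unsigned char *sha1);
int object_ok(unsigned char *sha1);

void prefetch(unsigned char *sha1)
{
	/* Block here until a connection slot frees up... */
	while (active_transfers() >= MAX_IN_FLIGHT)
		step_transfers();
	start_request(sha1);
}

int fetch(unsigned char *sha1)
{
	/* ...or block here if this object is still on the wire. */
	while (!object_complete(sha1))
		step_transfers();
	return object_ok(sha1) ? 0 : -1;
}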

Note that it should theoretically be possible to make additional
requests on the same connection (provided it is kept alive) even
before reading the response, so long as the code can figure out what
happened if the server actually closes the connection after the first
request instead of serving the later ones.
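
The tricky part is just that recovery step; roughly (every helper
here is invented, the point is only the retry-on-early-close logic):

struct connection;			/* wraps one keep-alive socket */
int send_request(struct connection *c, const unsigned char *sha1);
int read_response(struct connection *c);	/* < 0 on EOF/close */
int reconnect(struct connection *c);

int pipelined_fetch(struct connection *c, const unsigned char *sha1)
{
	/* The request was written before earlier replies were read. */
	send_request(c, sha1);
	if (read_response(c) < 0) {
		/* The server closed the connection after an earlier
		 * request instead of serving this one: open a fresh
		 * connection and resend. */
		if (reconnect(c) < 0)
			return -1;
		send_request(c, sha1);
		return read_response(c);
	}
	return 0;
}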

	-Daniel
*This .sig left intentionally blank*

