* resume downloads
From: Thiago Farina @ 2015-05-10 21:55 UTC
To: Git Mailing List
Hi,
Are there links to any discussion on this? I mean, are resumable
downloads a feature that is still being considered?
Being able to download huge repos like WebKit, Linux, and LibreOffice
in small parts seems like a good feature to me.
--
Thiago Farina
* Re: resume downloads
From: Junio C Hamano @ 2015-05-10 22:19 UTC
To: Thiago Farina; +Cc: Git Mailing List
The current thinking is to model this after the "repo" tool.
Prepare a reasonably up-to-date bundle file on the server side,
add a protocol capability so that upload-pack can advertise the URL
to download that bundle from, and have "git clone" pay attention to it.
Then, a "git clone" could become (roughly sketched in commands after the list):
- If the capability advertises such a prebuilt bundle, spawn "curl"
or "wget" internally to fetch it. This can be resumed when the
connection goes down and will grab the majority of the data necessary.
- Extract the bundle, storing its refs in a temporary area inside
.git/refs/, to help the next step.
- Internally do a "git fetch" from the original server. Thanks to the
bundle transfer that has already happened, this step becomes a
small incremental update.
- Then prune away the temporary .git/refs/ refs that were in the
bundle, as these are not the up-to-date refs that exist on the
server side.
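None of this exists yet, so purely as a strawman (the bundle URL, the
temporary ref hierarchy, and the repository names below are all made
up for illustration), the client side could boil down to something
like:

    # strawman: URL that the (not yet existing) capability would advertise
    bundle_url=https://cdn.example.com/project.bundle

    mkdir project && cd project && git init

    # resumable bulk transfer; "-C -" continues an interrupted download
    curl -C - -o .git/clone.bundle "$bundle_url"

    # stash the bundle's refs under a temporary hierarchy so the
    # follow-up fetch can use the transferred objects as a basis
    git bundle verify .git/clone.bundle
    git fetch .git/clone.bundle 'refs/heads/*:refs/tmp-bundle/*'

    # small incremental fetch from the real server
    git remote add origin https://git.example.com/project.git
    git fetch origin

    # drop the temporary refs; they are not the server's current tips
    git for-each-ref --format='%(refname)' refs/tmp-bundle/ |
        xargs -n 1 git update-ref -d

(A real "git clone" would of course also set up HEAD and check out a
branch at the end.)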
A few points that need to be considered by whoever is doing this
are:
- Where to download the bundle to, so that after killing a "git clone"
that is still in the bundle-download phase, the next invocation
of "git clone" can notice and resume the bundle download (see the
sketch after this list);
- What kind of transfer protocols do we want to support? Are http
and https from a CDN sufficient? In other words, what exactly
should the new capability say to point at the prebuilt bundle?
These (and probably several others) are not something that "repo"
has to worry about, but they would become issues when we try to fold
this into "git clone".
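For the first point, one simple convention (again just an
illustration, not a design) would be a fixed, well-known path inside
the repository being created:

    # a previous, interrupted run leaves a partial bundle behind at a
    # predictable path; "curl -C -" (or "wget -c") picks up where it stopped
    partial=.git/clone.bundle.part
    curl -C - -o "$partial" "$bundle_url" &&
        mv "$partial" .git/clone.bundle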
* Re: resume downloads
From: Sitaram Chamarty @ 2015-05-12 9:54 UTC
To: Junio C Hamano, Thiago Farina; +Cc: Git Mailing List
On 05/11/2015 03:49 AM, Junio C Hamano wrote:
> The current thinking is to model this after the "repo" tool.
> Prepare a reasonably up-to-date bundle file on the server side,
<shameless plug (but not "commercial")>
For people using gitolite, the server-side issues of generating a
reasonably up-to-date bundle *and* enabling it for resumable download
using rsync (with the same ssh key used to gain gitolite access) can
all be handled by gitolite.
</shameless plug>
Of course the client-side issues still remain; gitolite can't help
there.
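Purely to illustrate the client side of that (the host name and
bundle path below are made up, and the exact invocation depends on
how the server is set up):

    # --partial keeps an interrupted transfer on disk, so re-running
    # the same command resumes instead of starting from scratch
    rsync --partial --progress \
        git@git.example.com:project.bundle ./clone.bundle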