* git update-ref --stdin : too many open files
From: Loic Dachary @ 2014-12-20 10:24 UTC
To: git
Hi,
Steps to reproduce:
$ git --version
git version 1.9.1
$ wc -l /tmp/1
9090 /tmp/1
$ head /tmp/1
delete refs/pull/1/head
create refs/heads/pull/1 86b715f346e52920ca7c9dfe65424eb9946ebd61
delete refs/pull/1/merge
create refs/merges/1 c0633abdc5311354c9729374e0ba25c97a89f69e
...
$ ulimit -n
1024
$ git update-ref --stdin < /tmp/1
fatal: Unable to create '/home/gitmirror/repositories/Ceph/ceph/refs/heads/pull/1917.lock': Too many open files
$ head -20 /tmp/1 | git update-ref --stdin
$ echo $?
0
The workaround is to increase ulimit -n.
git update-ref --stdin should probably close some files.
Cheers
--
Loïc Dachary, Artisan Logiciel Libre
* Re: git update-ref --stdin : too many open files
From: Junio C Hamano @ 2014-12-22 21:22 UTC
To: Stefan Beller; +Cc: git, Loic Dachary
Loic Dachary <loic@dachary.org> writes:
> [...]
> $ git update-ref --stdin < /tmp/1
> fatal: Unable to create '/home/gitmirror/repositories/Ceph/ceph/refs/heads/pull/1917.lock': Too many open files
> [...]
> git update-ref --stdin should probably close some files.
Sounds like the recent "ref update in a transaction" issue to me.
Stefan, want to take a look? I think we do need to keep the .lock
files without renaming while in transaction, but we do not have to
keep them open, so I suspect that a fix may be to split the commit
function into two (one to close but not rename, the other to
finalize by renaming) or something.
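In outline, such a split might look something like this (a rough
sketch only; the struct and function names are hypothetical, not
git's actual lockfile API):

#include <stdio.h>
#include <unistd.h>

struct ref_lock_sketch {
	char lock_path[4096];	/* "$GIT_DIR/<refname>.lock" */
	char ref_path[4096];	/* the final ref path */
	int fd;			/* -1 once closed */
};

/*
 * Phase 1: release the file descriptor but keep the .lock file on
 * disk, so a transaction holding thousands of locks needs no open fds.
 */
static int close_ref_lock(struct ref_lock_sketch *lock)
{
	if (lock->fd >= 0 && close(lock->fd) < 0)
		return -1;
	lock->fd = -1;
	return 0;	/* the .lock file still exists: the lock is held */
}

/*
 * Phase 2: finalize the update by renaming the .lock file into place.
 */
static int commit_ref_lock(struct ref_lock_sketch *lock)
{
	if (lock->fd >= 0 && close_ref_lock(lock) < 0)
		return -1;
	return rename(lock->lock_path, lock->ref_path);
}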
Also, the version of the transaction series we have queued seems to
lock these refs very late in the process, but as we discussed privately
a few weeks ago, we would want to move the lock much earlier, when
the first update is attempted.
* Re: git update-ref --stdin : too many open files
From: Stefan Beller @ 2014-12-23 2:41 UTC
To: Junio C Hamano, Stefan Beller; +Cc: git, Loic Dachary
On 22.12.2014 13:22, Junio C Hamano wrote:
> Loic Dachary <loic@dachary.org> writes:
>> [...]
>> git update-ref --stdin should probably close some files.
>
> Sounds like the recent "ref update in a transaction" issue to me.
>
> Stefan, want to take a look? I think we do need to keep the .lock
> files without renaming while in transaction, but we do not have to
> keep them open, so I suspect that a fix may be to split the commit
> function into two (one to close but not rename, the other to
> finalize by renaming) or something.
Sounds reasonable. Though by closing the file we're giving up a bit of
safety again: if we close the file, anyone could tamper with the lock
file. (Sure, they are not supposed to touch it, but they could.)
>
> Also, the version of the transaction series we have queued seems to
> lock these refs very late in the process, but as we discussed privately
> a few weeks ago, we would want to move the lock much earlier, when
> the first update is attempted.
I'll look into that tomorrow.
* Re: git update-ref --stdin : too many open files
From: Jonathan Nieder @ 2014-12-23 3:22 UTC
To: Stefan Beller; +Cc: Junio C Hamano, Stefan Beller, git, Loic Dachary
Hi Stefan,
Stefan Beller wrote:
> On 22.12.2014 13:22, Junio C Hamano wrote:
>> Loic Dachary <loic@dachary.org> writes:
>>> fatal: Unable to create '/home/gitmirror/repositories/Ceph/ceph/refs/heads/pull/1917.lock': Too many open files
[...]
>> Stefan, want to take a look? I think we do need to keep the .lock
>> files without renaming while in transaction, but we do not have to
>> keep them open, so I suspect that a fix may be to split the commit
>> function into two (one to close but not rename, the other to
>> finalize by renaming) or something.
Makes sense.
> Sounds reasonable. Though by closing the file we're giving up a bit of
> safety again: if we close the file, anyone could tamper with the lock
> file. (Sure, they are not supposed to touch it, but they could.)
At least on Linux, keeping a file open doesn't offer any protection
against someone else deleting it. (It also doesn't offer any
protection against someone updating the ref directly. ;-) Opening the
corresponding .lock file with O_EXCL is part of the contract for
updating refs.)
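For illustration, the contract amounts to something like this (a
sketch with a hypothetical helper, not git's actual code):

#include <errno.h>
#include <fcntl.h>
#include <stdio.h>

/*
 * Whoever creates "<ref>.lock" with O_CREAT|O_EXCL first owns the
 * lock, whether or not the descriptor stays open afterwards.
 */
static int acquire_ref_lock(const char *lock_path)
{
	int fd = open(lock_path, O_WRONLY | O_CREAT | O_EXCL, 0666);
	if (fd < 0 && errno == EEXIST)
		fprintf(stderr, "'%s' exists: someone else holds the lock\n",
			lock_path);
	return fd;	/* negative on failure */
}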
Thanks,
Jonathan
* Re: git update-ref --stdin : too many open files
From: Junio C Hamano @ 2014-12-23 15:57 UTC
To: Stefan Beller; +Cc: Stefan Beller, git, Loic Dachary
Stefan Beller <stefanbeller@gmail.com> writes:
> Sounds reasonable. Though by closing the file we're giving up a bit of
> safety again: if we close the file, anyone could tamper with the lock
> file. (Sure, they are not supposed to touch it, but they could.)
There are locking primitives (e.g. SysV mandatory locking) that require
you to keep the file you have a lock on open in order to retain
ownership of the lock, and that kind of lock does prevent random
other processes from simultaneously accessing the locked file.

But that is not what our locks are designed around; our locks rely
only on "open(O_EXCL|O_CREAT) fails if it already exists". And
between keeping a lockfile open and closing but not removing it,
there is no difference in how the still-existing lockfile prevents
open(O_EXCL|O_CREAT) by other processes from succeeding.
So we are not giving up any safety at all, which is a good thing ;-).
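A standalone sketch (not git code; "demo.lock" is a made-up path)
makes this concrete: acquire the lock, close the descriptor, and a
competing O_CREAT|O_EXCL open still fails:

#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
	const char *lock = "demo.lock";

	int fd = open(lock, O_WRONLY | O_CREAT | O_EXCL, 0666);
	if (fd < 0)
		return 1;
	close(fd);	/* descriptor released; the lock file stays on disk */

	/* A second O_CREAT|O_EXCL open still fails with EEXIST ... */
	if (open(lock, O_WRONLY | O_CREAT | O_EXCL, 0666) < 0 &&
	    errno == EEXIST)
		puts("lock still held after close()");

	unlink(lock);	/* ... until the lockfile is actually removed */
	return 0;
}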
* Re: git update-ref --stdin : too many open files
From: Stefan Beller @ 2014-12-24 2:11 UTC
To: Junio C Hamano, Stefan Beller; +Cc: git, Loic Dachary
On 22.12.2014 13:22, Junio C Hamano wrote:
> Loic Dachary <loic@dachary.org> writes:
>> [...]
>> git update-ref --stdin should probably close some files.
>
> Sounds like the recent "ref update in a transaction" issue to me.
>
> Stefan, want to take a look? I think we do need to keep the .lock
> files without renaming while in transaction, but we do not have to
> keep them open, so I suspect that a fix may be to split the commit
> function into two (one to close but not rename, the other to
> finalize by renaming) or something.
>
> Also, the version of the transaction series we have queued seems to
> lock these refs very late in the process, but as we discussed privately
> a few weeks ago, we would want to move the lock much earlier, when
> the first update is attempted.
So I decided the first thing to do was to put this case into the test
suite, and I typed in good faith:
test_expect_success 'but does it scale?' '
	for i in $(seq 1 1234567) ; do
		git branch branch_${i}
		echo "delete refs/heads/branch_${i}" >>large_input
	done
	git update-ref --stdin <large_input
'
And there I ran into problems with my hard disk holding more than a
million files in one directory. So once I get rid of those, I'll come
up with a better way to test and fix this problem.
Thanks,
Stefan
* Re: git update-ref --stdin : too many open files
From: Michael Haggerty @ 2014-12-29 1:28 UTC
To: Stefan Beller, Junio C Hamano, Stefan Beller; +Cc: git, Loic Dachary
On 12/24/2014 03:11 AM, Stefan Beller wrote:
[...]
> So I decided the first thing to do was to put this case into the test
> suite, and I typed in good faith:
>
> test_expect_success 'but does it scale?' '
> 	for i in $(seq 1 1234567) ; do
> 		git branch branch_${i}
> 		echo "delete refs/heads/branch_${i}" >>large_input
> 	done
> 	git update-ref --stdin <large_input
> '
>
> And there I ran into problems with my hard disk holding more than a
> million files in one directory. So once I get rid of those, I'll come
> up with a better way to test and fix this problem.
I suggest something like the following to demonstrate the failure. Note
the attempt to set "ulimit -n" to a lower value to make the test more
tractable. (Given that change, sharding the references, which is also
demonstrated here, is perhaps superfluous.)
test_expect_failure 'big transaction does not burst open file limit' '
	(
		# Temporarily consider it a failure if we cannot reduce
		# the allowed number of open files:
		test "$(ulimit -n)" -le 1024 || ulimit -n 1024 || exit 1
		for i in $(seq 33)
		do
			for j in $(seq 32)
			do
				echo "create refs/heads/$i/$j HEAD"
			done
		done >large_input &&
		git update-ref --stdin <large_input &&
		git rev-parse --verify -q refs/heads/33/32
	)
'
After the bug is fixed, this can be changed to
test_expect_success 'big transaction does not burst open file limit' '
	(
		# Try to reduce the allowed number of open files, but even if
		# that is not possible, run the test anyway:
		test "$(ulimit -n)" -le 1024 || ulimit -n 1024
		for i in $(seq 33)
		do
			for j in $(seq 32)
			do
				echo "create refs/heads/$i/$j HEAD"
			done
		done >large_input &&
		git update-ref --stdin <large_input &&
		git rev-parse --verify -q refs/heads/33/32
	)
'
It might make sense to test both the "create" and the "delete" code
paths, as their locking sequence differs.
I'm doing some work in this area, so I should be able to work on the
bugfix in the not-too-distant future. My feeling is that the bug is
unlikely to affect many current users, though it definitely should be
fixed before sb/atomic-push is merged.
Michael
--
Michael Haggerty
mhagger@alum.mit.edu
* Re: git update-ref --stdin : too many open files
From: Stefan Beller @ 2014-12-29 22:56 UTC
To: Michael Haggerty
Cc: Stefan Beller, Junio C Hamano, git@vger.kernel.org, Loic Dachary
On Sun, Dec 28, 2014 at 5:28 PM, Michael Haggerty <mhagger@alum.mit.edu> wrote:
>
> I'm doing some work in this area, so I should be able to work on the
> bugfix in the not-too-distant future. My feeling is that the bug is
> unlikely to affect many current users, though it definitely should be
> fixed before sb/atomic-push is merged.
>
So are you going to take on the bug fix?
I'd then drop it off my todo list.
Thanks,
Stefan