From: Francis Moreau <francis.moro@gmail.com>
To: Michael J Gruber <git@drmicha.warpmail.net>
Cc: Git Mailing List <git@vger.kernel.org>
Subject: Re: Can't build doc anymore (v1.7.3.2)
Date: Tue, 23 Nov 2010 12:57:46 +0100
Message-ID: <m2r5ecjlqd.fsf@gmail.com>
In-Reply-To: <4CEBA872.2020001@drmicha.warpmail.net> (Michael J. Gruber's message of "Tue, 23 Nov 2010 12:41:38 +0100")
Michael J Gruber <git@drmicha.warpmail.net> writes:
> Francis Moreau venit, vidit, dixit 23.11.2010 12:24:
>> Michael J Gruber <git@drmicha.warpmail.net> writes:
>>
>>> Francis Moreau venit, vidit, dixit 23.11.2010 10:32:
>
>>>>
>>>> $ make prefix=/usr/local NO_CURL=1 ASCIIDOC8=y DOCBOOK2X_TEXI=db2x_docbook2texi ASCIIDOC_NO_ROFF=y XMLTO_EXTRA="--skip-validation" V=1 doc
>>>> make -C Documentation all
>>>> make[1]: Entering directory `/home/fmoreau/git/Documentation'
>>>> make -C ../ GIT-VERSION-FILE
>>>> make[2]: Entering directory `/home/fmoreau/git'
>>>> make[2]: `GIT-VERSION-FILE' is up to date.
>>>> make[2]: Leaving directory `/home/fmoreau/git'
>>>> rm -f git-fetch.1 && \
>>>> xmlto -m manpage-normal.xsl --skip-validation man git-fetch.xml
>>>> I/O error : Attempt to load network entity http://docbook.sourceforge.net/release/xsl/current/manpages/docbook.xsl
>>>> warning: failed to load external entity "http://docbook.sourceforge.net/release/xsl/current/manpages/docbook.xsl"
>>>> compilation error: file /tmp/xmlto-xsl.A7kzn5 line 4 element import
>>>> xsl:import : unable to load http://docbook.sourceforge.net/release/xsl/current/manpages/docbook.xsl
>>>> make[1]: *** [git-fetch.1] Error 1
>>>> make[1]: Leaving directory `/home/fmoreau/git/Documentation'
>>>> make: *** [doc] Error 2
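
A quick sanity check at this point, just to rule it out, and assuming the
stylesheets come from Fedora's docbook-style-xsl package (the package name
is my assumption), is whether a local copy of manpages/docbook.xsl is
installed at all:

$ rpm -q docbook-style-xsl                       # assumed Fedora package name
$ rpm -ql docbook-style-xsl | grep 'manpages/docbook.xsl'

If no local copy exists, the xsl:import in the temporary stylesheet has
nothing local to fall back to.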
>>>
>>> This is weird for several reasons.
>>>
>>> Can you wget or curl these files?
>>
>> Yes I can.
>>
>>>
>>> Besides, I can build the doc even without network access, even though my
>>> /tmp/xmlto... has the same import statement.
>>>
>>> Can you check which options your xmlto passes to your xsltproc? Mine
>>> has "--nonet".
>>
>> I can see the following ones:
>>
>> --nonet
>> --xinclude
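
One way to double-check the exact xsltproc command line that xmlto builds,
assuming xmlto is a plain shell script (which it usually is), is to trace it
and grep for the invocation, using the same options as the failing make rule
above:

$ sh -x "$(command -v xmlto)" -m manpage-normal.xsl --skip-validation \
      man git-fetch.xml 2>&1 | grep xsltproc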
>>
>>> Do you have libxslt-1.1.26-3.fc14.x86_64, and is your xsltproc the one
>>> from that package?
>>
>> $ rpm -qa | grep libxslt
>> libxslt-devel-1.1.26-3.fc14.x86_64
>> libxslt-1.1.26-3.fc14.x86_64
>>
>> $ which xsltproc
>> /usr/bin/xsltproc
>>
>> $ rpm -qf /usr/bin/xsltproc
>> libxslt-1.1.26-3.fc14.x86_64
>>
>
> I'm pretty stumped then. The only remaining suggestions:
>
> - remove xml-commons-resolver and try again
> - try as a different user
I tried both; it still fails.
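
I suppose a catalog-related environment variable could also explain a
per-user difference (just a guess), since libxml2 uses XML_CATALOG_FILES
instead of /etc/xml/catalog when it is set. That is easy to rule out:

$ env | grep -E 'XML_CATALOG|SGML_CATALOG'
$ echo "$XML_CATALOG_FILES"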
>
> Otherwise, an strace of xsltproc might give some hints...
Here it is:
stat("/tmp/xmlto-xsl.aSCQgY", {st_mode=S_IFREG|0600, st_size=346, ...}) = 0
stat("/tmp/xmlto-xsl.aSCQgY", {st_mode=S_IFREG|0600, st_size=346, ...}) = 0
stat("/tmp/xmlto-xsl.aSCQgY", {st_mode=S_IFREG|0600, st_size=346, ...}) = 0
stat("/tmp/xmlto-xsl.aSCQgY", {st_mode=S_IFREG|0600, st_size=346, ...}) = 0
open("/tmp/xmlto-xsl.aSCQgY", O_RDONLY) = 3
lseek(3, 0, SEEK_CUR) = 0
read(3, "<?xml version='1.0'?>\n<xsl:style"..., 8192) = 346
read(3, "", 7846) = 0
close(3) = 0
stat("http://docbook.sourceforge.net/release/xsl/current/manpages/docbook.xsl", 0x7fffa8779e50) = -1 ENOENT (No such file or directory)
stat("http://docbook.sourceforge.net/release/xsl/current/manpages/docbook.xsl", 0x7fffa8779d90) = -1 ENOENT (No such file or directory)
stat("/etc/xml/catalog", {st_mode=S_IFREG|0644, st_size=819, ...}) = 0
open("/etc/xml/catalog", O_RDONLY) = 3
lseek(3, 0, SEEK_CUR) = 0
read(3, "<?xml version=\"1.0\"?>\n<!DOCTYPE "..., 8192) = 819
read(3, "", 7373) = 0
close(3) = 0
stat("http://docbook.sourceforge.net/release/xsl/current/manpages/docbook.xsl", 0x7fffa8779d90) = -1 ENOENT (No such file or directory)
write(2, "I/O ", 4) = 4
write(2, "error : ", 8) = 8
write(2, "Attempt to load network entity h"..., 103) = 103
write(2, "warning: ", 9) = 9
write(2, "failed to load external entity \""..., 105) = 105
write(2, "compilation error: file /tmp/xml"..., 68) = 68
write(2, "xsl:import : unable to load http"..., 100) = 100
This looks weird: it does a stat(2) on a URL...
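
Although, as far as I can tell, that stat() may be normal: libxml2 seems to
try the string as a local file name before consulting the catalog. What
matters with --nonet is whether /etc/xml/catalog rewrites the
docbook.sourceforge.net URI to a local stylesheet, and the catalog read in
the trace is only 819 bytes, so I wonder whether it contains the DocBook
entries at all (just a guess). xmlcatalog from libxml2 can answer that:

$ xmlcatalog /etc/xml/catalog \
    "http://docbook.sourceforge.net/release/xsl/current/manpages/docbook.xsl"

If that does not print a local file:// path, the URI is not being resolved
locally, and with --nonet the build has to fail.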
--
Francis