Mail Archives: djgpp/1997/09/14/17:06:25
At 05:34 PM 9/14/97 +0300, you wrote:
<Snipped>
>> Agreed. Or better yet, automated with macros for configurations (like
>> non-LFN DOS) that need it.
>
>I don't see how the latter can be done. The problems with filenames
>happen when you unzip the distribution, and you can't easily run a
>conversion script at that point. Fixing the names should be done by
>the package maintainer. If Emacs can do that, so can the rest.
Hm-m-m. Good point. Without the ability to rename files while they are still
inside the zip or tar.gz archive, the damage is already done by the time the
distribution is unpacked. I will have to re-think that one, and examine what
Emacs does before I speak further on the subject.
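Fixing the names on the maintainer's side, before the archive is built, is the kind of check that could be automated. Here is a minimal sketch, assuming only a Bourne shell: it tests names against the 8.3 length rule only (it ignores DOS's character-set restrictions), and the sample file names are purely illustrative.

```shell
# Sketch: flag archive member names that would break on non-LFN DOS.
# is_83_name checks one name against the 8.3 length rule: at most one
# dot, at most 8 characters before it, at most 3 after it.
is_83_name () {
  case "$1" in
    *.*.*) return 1 ;;                     # more than one dot
    *.*)   base=${1%.*}; ext=${1##*.}
           [ ${#base} -le 8 ] && [ ${#ext} -le 3 ] ;;
    *)     [ ${#1} -le 8 ] ;;
  esac
}

# A maintainer could feed this the archive's file list (e.g. from
# `unzip -l` or `tar tzf`) before shipping.  Illustrative names:
for f in Makefile.in config.guess README install-sh; do
  is_83_name "$f" && echo "ok   $f" || echo "BAD  $f"
done
```

Here `config.guess` and `install-sh` would be flagged (five-character extension, ten-character base), which is exactly the renaming decision the package maintainer, not the unzipping user, is in a position to make.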
>> Or tests that make sure the available tools support the functions
>> needed. (See the perl configure script for examples of this.) Then
>> the configuration just can't be rebuilt without the minimum versions
>> and/or capabilities of the needed tools.
>
>Right, but this is a lot of work, and I sincerely doubt that anyone
>who ports a package will invest such an effort.
Well, the perl originators and porters did it. There is nothing that says
we can't build upon their work. We don't always have to re-invent every
wheel. In fact, it is probably more in the spirit of GNU software to
continue to build (with appropriate attribution, of course) on the work
already done by others.
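To make the idea concrete, a capability test of the kind mentioned above exercises the feature the build needs and checks the result, instead of trusting a version string. This is only a sketch; the probed feature (`sed -n ...p`) is an illustrative choice of mine, not necessarily what perl's Configure actually tests.

```shell
# Sketch of a configure-style capability probe: run the tool on known
# input and verify the output, rather than checking a version number.
if test "`echo hello | sed -n 's/hello/world/p'`" = world; then
  echo "sed: usable"
else
  echo "sed: too limited; install a more capable sed before rebuilding" >&2
  exit 1
fi
```

A configure script full of probes like this fails early, with a clear message, instead of producing a mysterious breakage halfway through the build.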
>> All GNU configure scripts *assume* a port of /bin/sh is available.
>
>/bin/sh is standard on Unix systems. Assuming its availability is
>like assuming there is COMMAND.COM on DOS.
Agreed, which is why we need to align ourselves with that Unix assumption if
we're going to continue to port their software.
>> >Look how religiously do GNU people stick to tools that are available on
>> >all platforms. Using your argument, they would require people to install
>> >GNU tools before building other packages.
>>
>> And using my argument above, perhaps they should. Or at least to
>> insist on a minimum set of functionality from native versions of
>> tools.
>
>Such a requirement would be impractical and will make a lot of angry users
>out there.
Well, maybe it *will* anger a few users out there. But why do you say
"impractical"? Because of the number of tools one needs to obtain? How are
10 (or 20 or 30) binary downloads different from one binary download? Once
they are done, they are done, and one can proceed to unzip and use the tools
without further ado (assuming the binary porters have done their job, of
course).
OK, more tools mean more configuration of those tools, and more importantly,
more potential for *mis*-configuration, leading to problems with re-builds.
But in the final analysis, a re-builder needs to understand those tools
anyway, at least well enough to set them up and use them correctly.
Look, a lot of knowledge is needed to attempt a re-build. I've said before
that I do not consider it a job to be taken lightly, and I mean that. The
learning curve is steep and daunting, and we are all at different points on
it. Your minimalist approach makes things easier on the first-time or less
experienced re-builder. My suggested approach makes things easier on the more
experienced re-builder. There probably need to be packages built with your
approach, to give novices something to cut their teeth on. I'm just saying
there also need to be packages (particularly the most complicated ones, like
gcc or perl) aimed at the more experienced (and tool-aware) re-builders, not
least because maintaining a minimalist re-building package for a complex
program like gcc or perl is such a huge job.
With respect,
Peter