Re: New package management
Ingo Ruhnke wrote:
> > BAD (but typical) STORY:
> >
> > I want to run pingus (say)
> > ...I download...
> > ./configure ; make ; make install
> > ...it says I need 'clanlib'...
> > ...where is the clanlib home page?
> > ...search using Google/Yahoo/whatever...
(BTW: I didn't mean to pick on Pingus as a particularly
bad or particularly good example. It's just the thing I
happen to have grabbed most recently.)
> That's no longer needed - a current version will print out the URL
> and so make the download simpler, but you'll still be required to
> download all the stuff yourself.
Yes - that *helps* some. But not every package does that.
> > BETTER STORY:
> >
> > You go to the Pingus site and download a *tiny* script:
> >
> > pingus.autoweb
> >
> > ...which checks to see if clanlib is installed and if
> > not - knows a good place to download it from - so it
> > downloads a script from the clanlib site:
>
> The idea is great. An addition could be to download the tarball
> instead of the script, and then have a script, let's call it
> 'autobuild', which adds a GUI to the complete download/build
> process.
I wanted it to download the teeny-tiny script first because
if (for example) Clanlib needs Hermes - but Hermes won't
build on my machine (as was the case under Linux 2.0.0 on
an AMD K6-2 machine when I tried all this a few months ago) -
then there is no point in waiting for all of Pingus and all of
Clanlib to download. You need to grab the bottommost
layer first, make sure it builds and installs OK, then pop
back up one layer and build that - and ONLY download each
layer's full source once you know the layers beneath it work.
Some people have REALLY slow net connections and others
pay big $$$ for bandwidth.
It's inconceivable that such a script could be more than a
couple of Kbytes - so downloading it up-front is a
worthwhile investment.
Another nice thing is that the script could be sent
without tar or compression - which takes out another
step in the installation process.
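To make the shape of it concrete, here's roughly what a
pingus.autoweb could look like. This is just a sketch - the
URLs, the 'installed' test and the file names are all made up,
purely to show the recursion:

  #!/bin/sh
  # pingus.autoweb - hypothetical sketch of the scheme, NOT a
  # real file. Each package's .autoweb script knows its own
  # dependencies and where THEIR tiny scripts live.

  # Made-up "name URL" pairs for each dependency's script:
  DEPS="clanlib http://clanlib.example.org/clanlib.autoweb"

  installed () {
    # Made-up test - a real one might look for a header or lib:
    [ -f /usr/lib/lib$1.so ] || [ -f /usr/local/lib/lib$1.so ]
  }

  # Recurse into each dependency FIRST, so the bottommost layer
  # gets fetched, built and installed before we spend any
  # bandwidth on this one:
  set -- $DEPS
  while [ $# -ge 2 ] ; do
    name=$1 ; url=$2 ; shift 2
    installed $name && continue
    wget -q $url || exit 1          # grab only the tiny script...
    sh ./`basename $url` || exit 1  # ...and let IT sort out ITS layer.
  done

  # Only now - when every layer below us built OK - fetch the
  # big tarball for THIS package (URL made up too):
  wget http://pingus.example.org/pingus.tar.gz || exit 1
  tar xzf pingus.tar.gz
  cd pingus && ./configure && make && make install

Each .autoweb script is self-similar - clanlib.autoweb would do
exactly the same dance for Hermes before fetching Clanlib itself.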
> Hm, I have to think some more about autobuild, but it could be a neat
> thing in combination with autoweb. But as you said, the biggest problem
> at the moment is the download process, so an autoweb would be much
> more important.
It's not hard - if you presume the existence of wget - which I
think is a standard part of Linux these days.
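The transport end of it really is a one-liner - something like
this (URL made up, of course):

  wget -q http://clanlib.example.org/clanlib.autoweb && sh ./clanlib.autoweb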
> > If such a scheme were to become more widespread, it would do GREAT
> > things for source-based packages. People simply **HATE** following
> > the paper trail to get all the libraries that a complex modern game
> > needs.
>
> Yep, the thing is that the process itself is trivial, but it takes
> time to search through all the READMEs and INSTALLs to collect all
> the libs which are needed for the build, and an automatic mechanism
> would be really nice.
The thing that has pissed me off most recently is how hard it is
to find the authoritative home page for a package.
I was trying to find a description of the Zlib API yesterday,
so I went to Google/Linux and typed ZLIB (and subsequently
ZLIB HOME PAGE)...and got back THOUSANDS of hits - almost
all of which were mirrors of the RedHat or SuSE RPMs. Totally
useless. I finally got there by remembering that it was linked
from the GIMP home page...whose URL I remember.
> Hm, I don't like the idea of the Netscape plugin, since autoweb would
> IMHO require user interaction (Really download 100MB? [y/n]), so it
> would need to be run at the shell.
True. I don't know how much of that a plugin could do - I've
never gotten into that. It would have to work without a plugin
too, though, because there are other browsers out there that
don't support Netscape's plugin architecture.
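The [y/n] interaction itself is trivial at the shell - a sketch,
reusing your 100MB example:

  printf "Really download 100MB? [y/n] "
  read answer
  case $answer in
    y|Y) ;;                          # carry on with the download
    *)   echo "OK - skipping." ; exit 1 ;;
  esac

...and that's exactly the sort of thing that would be awkward
from inside a browser.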
--
Steve Baker (817)619-2657 (Vox/Vox-Mail)
Raytheon Systems Inc. (817)619-2466 (Fax)
Work: sjbaker@hti.com http://www.hti.com
Home: sjbaker1@airmail.net http://web2.airmail.net/sjbaker1