Re: New package management (fbsd ports)
Erik wrote:
>
> okie, I got to play with fbsd's "ports" all today trying to get a system up and
> running. It's very very similar to this 'autoweb' idea, so I guess I'll try to
> summarize what it does and what I perceive as its deficiencies, as it may be
> more logical to merge the fbsd ports collection into linux instead of
> implementing a whole new scheme.
If there is something out there that does the right thing - then I'm all
in favor of it.
> With the distro, there's a /usr/ports hierarchy built. There's a Makefile, a
> directory called distfiles where the downloaded source is stored, and several
> directories broken down by category (graphics, devel, archivers, x11,
> x11-servers, x11-wm, etc. - a whole slew). Inside each directory is a directory
> for each package. Inside of that is a makefile, an md5 file, and some other
> negligible stuff which I'm not too terribly concerned with yet :)
>
> I went to install windowmaker, which depends on several packages. So I did:
> cd /usr/ports/x11-wm/windowmaker
> make
> The makefile in the windowmaker directory contains info on where the main
> source distro is, which version, what it depends upon, and where the
> appropriate patches are.
But this requires that I have already made a "windowmaker" directory and
downloaded the Makefile for it - right? Either that, or every distro has
to have directories and Makefiles for ALL the packages there will ever be.
I suppose we could make the autoload script create a directory and a
Makefile in the /usr/ports-approved way:
e.g. windowmaker.al contains:
mkdir -p /usr/ports/x11-wm/windowmaker
cd /usr/ports/x11-wm/windowmaker
cat >Makefile <<HERE
...stuff...
HERE
make
> The makefile includes some other makefile that does
> the magic (I haven't looked at it yet). okie, I run "make".
> It says it can't find windowmaker-0.60.0.tar.gz on the system, so it proceeds
> to download it, unpack it, and then check the dependencies.
> It tells me I don't have libtiff installed, so it goes to
> /usr/ports/graphics/libtiff and runs 'make' there...
But in our case, that directory/Makefile pair might not exist either, so
we still need something (preferably something downloaded from the libtiff
web site) to create that directory and *its* Makefile.
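One way to close that gap might be a tiny bootstrap that fetches a package's autoload script from wherever its maintainer publishes it and runs it in the right /usr/ports slot. A sketch only - the `.al` naming, the `autoload_pkg` function, and the `PORTS_ROOT`/`FETCH` conventions are all assumptions here, not anything that exists yet:

```shell
# Hypothetical bootstrap: fetch a package's autoload script (".al")
# from a base URL and run it in the matching /usr/ports directory.
# PORTS_ROOT and FETCH are overridable so nothing is hard-wired.
PORTS_ROOT="${PORTS_ROOT:-/usr/ports}"
FETCH="${FETCH:-wget -q -O -}"      # wget handles both http and ftp

autoload_pkg() {
    pkg="$1"        # e.g. libtiff
    category="$2"   # e.g. graphics
    url="$3"        # base URL where $pkg.al is published
    dir="$PORTS_ROOT/$category/$pkg"
    mkdir -p "$dir" || return 1
    # The .al script is expected to write the Makefile and run make,
    # like the windowmaker.al example above.
    ( cd "$dir" && $FETCH "$url/$pkg.al" > "$pkg.al" && sh "$pkg.al" )
}

# Usage (would hit the network):
#   autoload_pkg libtiff graphics http://www.libtiff.org/autoload
```

Because the URL is a parameter, the same bootstrap works whether the script lives on the library's own site or somewhere else entirely.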
> ...which proceeds to download
> libtiff, patch it, and check the dependencies of libtiff.
> It tells me I don't have libjpeg installed, so it does it again.
> Then it builds libtiff, then it builds windowmaker (after other dependencies).
> This was all done automagically, and if an md5sum fails, it stops the entire
> process right there. Everything was installed into the /usr/local directory and
> the permissions looked pretty tight. It's possible to override md5sum checks
> with a parameter to make. Running "make clean" in /usr/ports proceeds to walk
> all the directories and clean things up.
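That stop-on-mismatch gate is easy to picture in isolation. A minimal sketch, assuming GNU md5sum rather than the BSD md5 tool - the file names and layout are illustrative, not the real ports internals:

```shell
# Sketch of the md5 gate: compare the checksum recorded for a distfile
# against what was actually downloaded, and abort on any mismatch.
check_distfile() {
    distfile="$1"   # e.g. distfiles/windowmaker-0.60.0.tar.gz
    md5file="$2"    # file holding the recorded 32-hex-digit checksum
    expected=$(grep -o '[0-9a-f]\{32\}' "$md5file" | head -n 1)
    actual=$(md5sum "$distfile" | cut -d' ' -f1)
    if [ "$actual" != "$expected" ]; then
        echo "md5 mismatch for $distfile - stopping here" >&2
        return 1
    fi
}
```

The override that ports allows would just be a flag turning that `return 1` into a warning.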
>
> The deficiencies that I perceived are
>
> 1. it uses a program called 'fetch' which craps out on some servers. I ended up
> installing ncftp3 so I could get the files into /usr/ports/distfiles by hand.
wget seems a pretty solid tool for this kind of thing. It beats any kind
of FTP-like tool because it knows how to get things via http as well as ftp.
> 2. Some packages were outdated. It wanted libgif 3.0, but libgif 4.1.0 is the
> freshest, and 3.0 was a little difficult to find (esr doesn't seem to have an
> account where the fbsd ports thought he should). Trying to kludge in 4.1.0 with
> some makefile editing didn't work; the patch files were a bit sensitive.
That's the reason I'd like the script for actually fetching a particular
version of a library to be stored on that library's own web site.
> 3. it downloaded the file before recursing into dependencies. This didn't seem
> like a problem as I was doing it, but as was mentioned, if a low-level
> dependency cannot be met, then that's a whole lot of download for nothing.
Yes - it seems pretty dumb to download things top-down; the bottom-up
approach works better since you don't end up downloading a game if for
some reason you can't run it.
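The bottom-up order Erik is asking for is just a depth-first, post-order walk of the dependency graph. A sketch - the deps_* table here is made up for illustration; real dependency data would come from the Makefiles or autoload scripts:

```shell
#!/bin/bash
# Bottom-up dependency walk: recurse into dependencies first, so a
# failure deep in the tree aborts before anything large is fetched.
deps_windowmaker="libtiff libjpeg"
deps_libtiff="libjpeg"
deps_libjpeg=""

DONE=""

resolve() {
    local pkg="$1" deps d
    case " $DONE " in *" $pkg "*) return 0 ;; esac  # already handled
    eval "deps=\$deps_$pkg"
    for d in $deps; do
        resolve "$d" || return 1   # leaf failure: stop before downloading more
    done
    echo "fetch+build $pkg"        # reached only after all deps succeeded
    DONE="$DONE $pkg"
}

resolve windowmaker
```

Run it and libjpeg is handled first, then libtiff, then windowmaker - and a `return 1` anywhere in the recursion stops the walk before the top-level package is ever fetched.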
Another thing I can see as a problem is that my proposal somewhat depends
on all the library managers adding an autoload script to their web sites.
There is obviously going to be a period when that won't happen (especially
if the scheme is slow to take off).
Hence, the scheme has to allow the autoload script to be stored somewhere
different from the library it refers to. Assuming we can manage that,
there would be the option for the game writer to create his own autoload
scripts for libraries that don't have such scripts maintained on their
own sites.
Another possibility would be for some kind person to provide autoload
scripts for a LARGE number of libraries and other programs that don't
have autoload files of their own.
Ideally though, those files should be distributed across the web so that
each library maintainer can maintain his or her own autoload files.
--
Steve Baker http://web2.airmail.net/sjbaker1
sjbaker1@airmail.net (home) http://www.woodsoup.org/~sbaker
sjbaker@hti.com (work)