Re: New package management (fbsd ports)
Erik <br0ke@math.smsu.edu> writes:
> this would differ from debs and rpms how? :) And how about dependencies? if the
> pingus autoweb isn't updated with the newest version of clanlib referenced,
> then it will get an old version of clanlib... RPMs have been criticized
> because they have no real central repository, and debian packages aren't
> exactly cutting edge... I think it's acceptable, common, de-facto, and
> implementing it as a common cvs repository would be a step in the right
> direction.
CVS would be *far* too much overkill. On FreeBSD the ports system may
work well and it might be a good idea there. But it is simply a
package system, which is much, much more than a download helper.
The thing is that with Debian I can do nearly exactly the same as with
FreeBSD ports, maybe more, I don't know. I can automatically download
binary and source packages, get their dependencies resolved, get the
dependencies downloaded, etc. On Debian I can install nearly every
software package with just:
$ apt-get install my_favorite_package
That's wonderful, but it wouldn't help here. What we need is a tool
that resolves library dependencies automatically, but in the simplest
possible way, and that is a shell script.
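Just to sketch what I mean (nothing of this exists yet; the file
names, the library names and the URLs below are all made up):

    #!/bin/sh
    # autoload.sh -- hypothetical minimal download helper.
    # Reads "name url" pairs from a file called DEPS and fetches
    # whatever the dynamic linker doesn't already know about.
    while read name url; do
        if ldconfig -p | grep -q "$name"; then
            echo "$name is already installed, skipping"
        else
            echo "fetching $name from $url"
            wget "$url"
        fi
    done < DEPS

The DEPS file for a game could then just contain lines like:

    libclanlib http://example.org/clanlib-0.4.tar.gz
    libhermes  http://example.org/hermes-1.2.tar.gz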
> it's not incredibly scalable. There will be a point when any method will fail
> due to the sheer number of packages.
Autoload shell scripts would still work, since they distribute all the
work over the other autoload scripts.
> Suppose the debian maintainers decided not to use your package? suppose the
> redhat ppl decided not to? the suse ppl? We put this kind of trust in people
> with ulterior motives already. If some non-profit committee was formed, they
> could provide some form of quality control.
No, you are getting something wrong; the idea of autoload is to have a
download helper (maybe also a build helper) which works recursively
over the autoload scripts referenced by the main package's autoload
script.
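To make the recursive part a bit more concrete, a rough sketch (again,
the file layout, the variable names and the URL are invented for
illustration):

    #!/bin/sh
    # Walk an autoload script and the autoload scripts it points to.
    # Each autoload script is expected to set DEPENDS (a list of URLs
    # of further autoload scripts) and PACKAGE_URL (the tarball).
    resolve() {
        # $1 = URL of an autoload script
        DEPENDS=""
        PACKAGE_URL=""
        script=`basename "$1"`
        wget -q -O "$script" "$1"
        . ./"$script"            # expected to set DEPENDS, PACKAGE_URL
        deps=$DEPENDS
        url=$PACKAGE_URL
        for dep in $deps; do
            ( resolve "$dep" )   # subshell keeps the parent's variables
        done
        test -n "$url" && wget "$url"
    }

    resolve "http://example.org/pingus/autoload"

It doesn't handle cycles or libraries that get listed twice, but that
is the kind of detail a real script could grow later.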
The ports system would be something comparable to the distributions,
but that wouldn't solve any problems for people who just want to try
the newest software package that was just announced on freshmeat or
happypenguin.org, since it wouldn't be found in the ports hierarchy
yet.
>> However, the more I think about it, the more I think the scheme I
>> outlined yesterday is superior.
> you only think that cuz it's yours :)
No, since it is simply much better suited to solving the problems than
ports is.
autoload would just be one shell script that downloads the other
autoload scripts and finally downloads the required libraries; simple,
and it could work. For ports we would need a CVS server, a maintainer,
and people would need to set up a ports hierarchy on their hard
drive... just too much. Downloading the package from a webpage and
finding the required libraries myself is easier than that.
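From the user's point of view it could look like this (the version
number and the URL are made up):
$ wget http://example.org/pingus-0.5.tar.gz
$ tar xzf pingus-0.5.tar.gz
$ cd pingus-0.5
$ sh ./autoload        # fetches clanlib and whatever else is listed
No CVS checkout, no ports tree, nothing to keep in sync.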
> If the autoweb way gets implemented instead of the ports way. And a program
> calls for libblah, and automatically downloads and installs version x of
> libblah. Then another program needs libblah of version y, what happens? does it
> know that a different version was installed? does it upgrade, or attempt dual
> residence? Does it install the new one partially over the old one, breaking the
> first program? How do you enforce sane dependency checking?
It would just download the library that is needed for the software
package; if another program needs another version of that library,
then it would download the other version. After installation that is
no problem, since Linux can handle different library versions. The
problem would only be in the build stage.
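(Different versions can live side by side because the version is part
of the library file name, libfoo.so.1 next to libfoo.so.2, and every
program gets linked against the one it needs.) An autoload script
could do a crude check before downloading anything; the library name,
version and URL here are invented:

    # only fetch clanlib if no 1.x version is in the linker cache
    if ldconfig -p | grep -q 'libclanlib\.so\.1'; then
        echo "a suitable clanlib is already installed"
    else
        wget http://example.org/clanlib-1.0.tar.gz
    fi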
But autoload can still give a good solution here, since it would only
fail where a human would also fail; it can't do any miracles.
--
http://dark.x.dtu.dk/~grumbel/pingus/ |
Ingo Ruhnke <grumbel@gmx.de> http://home.pages.de/~grumbel/ |
------------------------------------------------------------------------+