
ZephyrXero wrote: I beg you developers, please don't act like the Debian guys. Give Autopackage a fair shot!

Autopackage is completely broken. On that issue, the Debian people are entirely right.

ciaranm wrote: Autopackage is completely broken. On that issue, the Debian people are entirely right.

Hmm... that's funny, I just installed Inkscape with Autopackage and it's running perfectly fine. Strange... doesn't feel broken.
ZephyrXero wrote: Hmm... that's funny, I just installed Inkscape with Autopackage and it's running perfectly fine. Strange... doesn't feel broken.

It didn't install into /usr/local (broken), it didn't correctly handle dynamic linking (broken), it didn't correctly handle dependencies (broken), you ran arbitrary shell code just to be able to view the contents of the package (broken), it didn't correctly handle optional nth-level deps (broken), it didn't correctly handle filesystem layout (broken), it only runs on x86 (broken), it can clobber arbitrary files (broken), it can remove arbitrary files on uninstall (broken) and it installs straight to the live fs (broken). That you can get away with installing things with it sometimes doesn't mean the technology is any good.

ciaranm wrote: It didn't install into /usr/local (broken), it didn't correctly handle dynamic linking (broken), it didn't correctly handle dependencies (broken), it didn't correctly handle filesystem layout (broken), it can clobber arbitrary files (broken), it can remove arbitrary files on uninstall (broken)

Well, if portage and autopackage were made to work together, this stuff wouldn't be a problem anymore... [See this article's comments]
ZephyrXero wrote: Well, if portage and autopackage were made to work together, this stuff wouldn't be a problem anymore... [See this article's comments]

Eh? The whole 'design' is totally broken. If they were to reimplement the entire thing from scratch and do it properly then we'd reconsider. As it stands, it's so flawed there's no way we can get it fixed short of scrapping the whole thing.
If you guys would just give it a fair chance, it might succeed. Hence the begging and pleading in my earlier post...

ZephyrXero wrote: And the design of relying on your Operating System developers for every program you ever want to install isn't broken? The "repository" system is holding Linux back from ever making it into the mainstream.

Huh? You don't rely upon Gentoo at all. Quit the FUD; that's not how to get autopackage accepted.

ZephyrXero wrote: If it's not in the portage tree, we can't emerge it... where's the FUD in that? You want me to go and find some obscure "overlay" and add it into portage? How is that any more "safe" than autopackage? I'd rather just download my programs from their original developers. I like my OS and my programs separate; that's why I quit using Winblows...

You shouldn't be installing autopackage-built binaries on Gentoo systems, since they're broken. Instead, you should get the source from upstream, build it correctly yourself, and install it into /usr/local. Alternatively, make an ebuild.

ZephyrXero wrote: Do you just not get it? lol... There are thousands of distros besides Gentoo. So when my buddy running an Ubuntu system wants a program, I can just make him an ebuild, right? Until portage works with every single distro there is, it's just a proprietary package manager and worthless to anyone who doesn't use Gentoo. This mindset and elitist attitude has to end now if we ever want to make Linux something more than a hobbyist's operating system. I really hope there are more open-minded developers out there... use a little foresight. The world is bigger than you are.

Eh? No, you give your Ubuntu friend a source tarball, the same way we've been doing for zillions of years. You can't sanely compile apps that'll run on an Ubuntu system on a Gentoo box, nor apps that'll run on a Gentoo system from an Ubuntu box, because we use different core libraries and gcc versions. Are you going to static link everything?
Hypnos wrote: * No sandboxing -- an unacceptable vulnerability. (OTOH, does Portage do collision detection? Blocks must be specified in *DEPEND.)

Portage has optional collision detection as a FEATURES setting.
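For readers unfamiliar with the setting being referred to, this is a one-line sketch of how that feature is enabled; the file path is the standard Gentoo location, and the surrounding values are illustrative:

```shell
# /etc/make.conf (illustrative fragment): "collision-protect" makes
# Portage abort a merge if a package would overwrite files that are
# already owned by another installed package.
FEATURES="collision-protect"
```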
Hypnos wrote: * Hackish solutions to binary incompatibilities. The Autopackage devs do have a point, that ABI stability is necessary for a smooth user experience, and something to which FOSS should aspire.

Naah, that's a duff argument. Per-setup builds are far cleaner.
ciaranm wrote: It didn't install into /usr/local (broken), it didn't correctly handle dynamic linking (broken), it didn't correctly handle dependencies (broken) [...] That you can get away with installing things with it sometimes doesn't mean the technology is any good.

Very interesting. And these problems are quite important...
ciaranm wrote: Portage has optional collision detection as a FEATURES setting.

So it does -- thank you.
ciaranm wrote: Naah, that's a duff argument. Per-setup builds are far cleaner.

It's easier for package devs and distribution integrators, but not for users. The more often and transparently things "magically work" for the user, the better -- this involves interacting with the user to know his wishes, and then implementing them transparently. So far, the only models I know of that have attained this are the monolithic platform (e.g., Mac) and user-as-developer source-based (e.g., traditional Unix and Gentoo).
Hypnos wrote: How are higher level conditional deps coming along? E.g., for package X to have some feature, its dep Y must be emerged with a specific USE flag or autodetected library (one example: gnome-db, and the DB glue libs it depends on). Right now there are nice eclass tools for detecting the lack of features, but you can only spit a warning or die.

Oh come on, that bug was only opened about three years ago; you don't expect our portage guys to be that fast, do you?
Hypnos wrote: The more often and transparently things "magically work" for the user, the better [...]

Magically work: 'emerge foo'. The only reason it works on the Mac is because they do a shitload of static linkage, which *really* eats your disc space.
ciaranm wrote: Oh come on, that bug was only opened about three years ago; you don't expect our portage guys to be that fast, do you?

Perish the thought!
ciaranm wrote: Magically work: 'emerge foo'. The only reason it works on the Mac is because they do a shitload of static linkage, which *really* eats your disc space.

I do use Gentoo because things mostly magically work, and when I want to do something novel the extra work is incremental and easy to share. Unfortunately, compiling stuff, esp. C++ code, just takes too damn long.
ciaranm wrote: It didn't install into /usr/local (broken), it didn't correctly handle dynamic linking (broken), it didn't correctly handle dependencies (broken) [...] That you can get away with installing things with it sometimes doesn't mean the technology is any good.

However, these are (a) all opinions and (b) all misguided and based on incomplete knowledge of the facts. Let's see.
Quote: When the news came up on Autopackage, I read the info on the site, and most of it was trying to justify it... instead of saying what it actually does. The first thing I thought is that when you have to justify its role a lot, maybe it does not have much place in the world.

This is necessary because some people are wedded to the centralised distribution model. Autopackage isn't just a program; it's pushing for a change to the model where every distro tries to include every program - something that mathematically just cannot scale. So a significant part of our website is about justifying why the status quo is flawed and why distributed packaging is better.
Quote: In my opinion, it seems to be for people who want to install stuff à la Windows on their x86... Also, I didn't quite get why it is x86-only if the goal they mention is to have upstream build for everyone?

There is some preliminary support for multiple architectures, but so far all the developers are using x86. Also, we're moving away from the idea of having RPM-style .i386.package, .x86-64.package, .ppc.package files etc. - this has poor usability, and as soon as new architectures are introduced you are back to having small numbers of people trying to recompile huge amounts of software. So an LLVM-based model seems better, especially as it would result in fully optimised binaries for everyone, as the final native ELF images are produced "just in time". Anybody could help implement this.
Quote: Installing in /usr is stupid. I am pretty sure commercial Unix apps not distributed by the distributor go in /usr/local or /opt. In the case of OS X, apps are self-contained in their own directory and they don't get installed in /usr...

However, it's less stupid than not having working menu entries, file associations, icons or, in some cases, binaries. If distros actually supported /usr/local as a first-class citizen it would not be needed. See above for my previous reply.
Quote: Gentoo has most things available in portage... it should not be too difficult installing other apps, since all the deps are probably already there...

Most things? Compared to the universe of software that has been written throughout history, it has nearly nothing. I also have grave concerns about the quality of downstream packaging. Several times now I've had to correct seriously wrong ebuilds even though I do not use Gentoo, simply because end users were coming to projects I am involved in (not autopackage) with "bugs" that turned out to be caused entirely by broken ebuilds. This is not unique to Gentoo; Debian has caused us similar grief in the past.
mikehearn wrote: (1) We install to /usr/local because so many distros are broken with respect to this prefix. When a Mandrake developer, apparently more enlightened than you, asked how to fix it, I provided a list and he set about doing so. That means that once Mandrake is fixed, autopackage will install to /usr/local on that distribution. This is a problem entirely of your own making: if distros weren't so universally buggy, it would not have been necessary.

Gentoo is not at all buggy in that respect. We handle /usr vs /usr/local entirely correctly. Autopackage does not, because it attempts to work around screwups in other distributions that are not a problem in Gentoo.
mikehearn wrote: (2) It not only correctly handles dynamic linking, it actually extends it so ELF can support DSO-level weak linkage, using relaytool. This is a feature ELF would not normally have, as it only supports symbolic weak linkage (instead of at the dyntags level).

Solving one dynamic linking issue is not solving every dynamic linking issue.
mikehearn wrote: (3) It handles dependencies just fine: it checks for their presence, and if a dep check fails it will attempt to resolve it. At the moment it can't use portage to resolve dependencies. That's not some fundamental unarguable happening; simply put, nobody has written the code yet.

It tries to check for dependencies, may or may not get them right (and probably won't, since autopackage knows sod all about Gentoo), and any resolution it does leads to even more broken stuff being misinstalled.
mikehearn wrote: (4) You ran arbitrary shell code - unlike, say, portage, which consists of running even larger amounts of shell code. Have you personally reviewed every line of the code in emerge? If you have, do you think many others have? If you want to view the contents of the package, you can read the stub, which is only a very small amount of script, to verify that it's not been hax0red first. As the contents of the package aren't meaningful to anything but the package scripts, not being able to do this automatically is a non-issue.

Content from the tree is trusted. Content from upstream package distributors is not. You expect me to trust every monkey with a website who says they provide Linux software?
mikehearn wrote: (5) It only runs on x86 because x86 users are by far the majority, so they got support first. Adding support for other architectures has been partially completed, but before we can do it fully we need somebody to step up and support it on that architecture. Better ways of solving the problem have been researched: see the notes in the TODO list and mail archives on EiC and LLVM. The use of LLVM would mean that autopackages would become CPU-architecture "neutral" as well as multi-distro, and the resulting binaries would be fully optimised for the target hardware, whether that's x86 or AMD64 or PowerPC[64]. As LLVM bytecode images are smaller than most native code images, and the LLVM optimisation engine can do optimisations that GCC currently cannot, such as auto-vectorisation, this will have many other benefits for even x86 users as well.

No, it only runs on x86 because the design is so hideously bad that it needs arch-specific code.
mikehearn wrote: (6) It can clobber arbitrary files - wrong. If you are overwriting an existing package, any files will be backed up into the database and restored on uninstall. Upcoming versions check for pre-installed packages and will offer to uninstall them if they were installed via your package manager. If you are afraid of evil shell scripts, see below.

Exactly. So it clobbers arbitrary files, such as, say, overwriting your webserver configuration file with something completely inaccurate. So what if you get a backup? It's still crapping all over your live filesystem.
mikehearn wrote: (7) It can remove arbitrary files on uninstall - so what? The software you're running comes from upstream; if you don't trust them to write your packages, why are you running their code at all? Remember that you can view uninstall scripts and logs using only "less".

Autopackage has no concept of shared ownership, so uninstalling can break other packages. Bad.
mikehearn wrote: (8) It installs straight to the live FS... like nearly every other package manager. This is only "broken" from your very unusual perspective. We've had zero requests to change this.

Eh? No. Not like nearly every other package manager at all.
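The alternative being alluded to is a staged install: the build installs into a scratch image directory first (the DESTDIR convention), and the package manager records every file before merging anything to the real filesystem. A minimal sketch, in which the staging root and the toy "hello" program are invented for illustration:

```shell
# Staged install sketch: nothing here touches the live filesystem.
stage=$(mktemp -d)

# A build system honouring the DESTDIR convention would populate the
# image directory like this:
mkdir -p "$stage/usr/local/bin"
printf '#!/bin/sh\necho hello\n' > "$stage/usr/local/bin/hello"
chmod +x "$stage/usr/local/bin/hello"

# The package manager can now inspect and record the full file list
# *before* merging, enabling collision checks and clean uninstalls:
find "$stage" -type f | sed "s|^$stage||"   # prints: /usr/local/bin/hello
```

Because the complete file manifest is known up front, overwrites of files owned by other packages can be detected before they happen, which is precisely the clobbering concern raised in points (6) and (7).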
mikehearn wrote: Basically you're looking for reasons to dislike it, having done apparently no research at all. Well done!

No, I've done the research, and I know a heck of a lot of reasons why autopackage is utterly broken.
mikehearn wrote: I don't find any of these reasons convincing, and I certainly don't find it more broken than attempting to package every program in the known universe into portage. That is not only broken by implementation but broken by design. Your fallback of "just send the source tarballs" is pathetic and weak - some of us want Linux to be easy to use so our friends and family can run it. Anything that involves compiling code on end-user machines is therefore verboten.

Eh? It's not even remotely broken, considering overlays. And anything that doesn't involve compiling code means everyone has to use exactly the same versions of everything, built exactly the same way.