Gentoo Forums
speeding up the build process

 
Gentoo Forums Forum Index :: Portage & Programming
grey_dot
Tux's lil' helper


Joined: 15 Jul 2012
Posts: 142

Posted: Wed Oct 09, 2013 1:44 pm    Post subject: speeding up the build process

After spending some time drooling and watching the world emerging on an arm board, I noticed that the slowest thing isn't the compilation itself (I use distcc), but the buttload of configure scripts, each checking for the presence of an enormous amount of stuff that might have been absent on some dead unix system 30 years ago.

Any way to speed things up? I've tried switching the /bin/sh symlink to dash, but no luck because portage stubbornly runs configure with bash, so a way to force dash usage is welcome, as is any other advice. Thanks.

P.S. Cross-compilation is not an answer because of libtool and some other things that make life so painful.
gringo
Advocate


Joined: 27 Apr 2003
Posts: 3706

Posted: Wed Oct 09, 2013 1:57 pm

I'm very interested in some input too. I use distcc heavily and the whole configure crap just takes ages on some of my old stuff.
Maybe someone wants to pick up confcache again? :-)

cheers
666threesixes666
Veteran


Joined: 31 May 2011
Posts: 1223
Location: 42.68n 85.41w

Posted: Wed Oct 09, 2013 3:37 pm

email lennart poettering @ redhat and tell him to parallelize configure scripts. i hate that configure can't be run across all of my cores.
_________________
cat /etc/*-release
Funtoo Linux - baselayout 2.2.0
consider this warning no. 1
http://ecx.images-amazon.com/images/I/81Ku-vxIb3L._SL1500_.jpg
http://wiki.gentoo.org/wiki/Special:Contributions/666threesixes666
mv
Advocate


Joined: 20 Apr 2005
Posts: 3789

Posted: Wed Oct 09, 2013 4:21 pm

gringo wrote:
maybe someone wants to pick up confcache again? :-)

You can manage confcache manually: export CONFIG_SITE=/path/to/some/file and set some results in that file, e.g.
Code:
: ${gl_cv_func_getcwd_path_max=yes}

However, almost anything nontrivial breaks some package or another (even without cross-compiling!), so you will get the same breakages that confcache already gave...
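To make this concrete, here is a minimal sketch of the CONFIG_SITE mechanism described above. The file path is just an example; the cache variable names are real autoconf/gnulib ones, and the `: ${var=value}` idiom only sets a variable if it is still unset, so a value configure already determined wins:

```shell
# Hypothetical site file (the path is an example, not a default):
cat > /tmp/config.site <<'EOF'
: ${ac_cv_header_sys_types_h=yes}
: ${gl_cv_func_getcwd_path_max=yes}
EOF

# Point autoconf-generated configure scripts at it:
export CONFIG_SITE=/tmp/config.site

# The idiom itself, demonstrated outside configure:
unset demo
: ${demo=cached}
echo "$demo"    # prints "cached" because demo was unset
demo=fresh
: ${demo=cached}
echo "$demo"    # prints "fresh" because demo was already set
```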
mv
Advocate


Joined: 20 Apr 2005
Posts: 3789

Posted: Wed Oct 09, 2013 4:28 pm    Post subject: Re: speeding up the build process

grey_dot wrote:
I've tried switching the /bin/sh symlink to dash, but no luck because portage stubbornly runs configure with bash

No, portage uses /bin/sh. However, configure scripts generated by current versions of autoconf restart themselves with bash if possible. You can export CONFIG_SHELL=/bin/dash if you want some badly written configure scripts to break with dash. :wink:
The time is lost in compiling and running test programs, not in the shell code.
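A one-line sketch of that workaround, assuming dash is installed at /bin/dash (expect some packages with bashisms in their configure scripts to break):

```shell
# Ask autoconf-generated configure scripts to re-exec themselves under
# dash instead of bash:
export CONFIG_SHELL=/bin/dash
```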
geki
Advocate


Joined: 13 May 2004
Posts: 2113
Location: Germania

Posted: Wed Oct 09, 2013 8:12 pm

remembering the days of configuring courier packages. :roll:

just a stupid idea ...

afaict from a quick search, confcache had a global cache; surely not good.
what about a package-based cache, invalidated when the configure script/setup changes?

ah well, I do not want to know where that fails. :lol:
_________________
boost|trans-follow xcb|instruction set analyzer
___
the self obscures the truth. I am no one, and so my words alone come from the truth of this world. amen.
mv
Advocate


Joined: 20 Apr 2005
Posts: 3789

Posted: Wed Oct 09, 2013 9:55 pm

geki wrote:
what about a package-based cache, invalidated when the configure script/setup changes?

One could follow the ccache strategy: setting up a cache based on the checksum of ./configure, of the passed parameters, and of the environment (to catch CFLAGS etc.) has a good chance of working rather reliably. On the other hand, I am not sure how good the hit/miss rate will be (especially since I guess that checksumming the whole environment is too sensitive, but I do not know a good strategy for what to exclude).
I guess on average the hit rate is worse than ccache's, and moreover, in case of a hit the time saved is less than in the ccache case. So a lot of work for a rather mild effect. However, if somebody is willing to do the work... :wink: ...the disk space requirements are probably relatively low: one just has to save a compressed form of the output files after the configure run...
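As a rough sketch of how such a cache key could be computed (the function name and the whitelist of environment variables are purely illustrative; a real tool would need a much more careful choice of what to include):

```python
import hashlib

def configure_cache_key(configure_path, args, env,
                        keep=("CC", "CXX", "CFLAGS", "CXXFLAGS", "LDFLAGS")):
    """ccache-style key: hash the configure script's contents, the passed
    arguments, and only a whitelist of environment variables. Hashing the
    whole environment would be far too sensitive and kill the hit rate."""
    h = hashlib.sha256()
    with open(configure_path, "rb") as f:
        h.update(f.read())
    for a in args:
        h.update(a.encode() + b"\0")
    for var in keep:
        h.update(var.encode() + b"=" + env.get(var, "").encode() + b"\0")
    return h.hexdigest()
```

A change to CFLAGS or to the script itself yields a new key (a miss), while an irrelevant variable such as HOME does not.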
Yamakuzure
Veteran


Joined: 21 Jun 2006
Posts: 1238
Location: Bardowick, Germany

Posted: Thu Oct 10, 2013 3:07 pm

Everything that gets compiled would end up in ccache already, but of course package-based, due to the paths ending up in the hash.

However, the only way to speed up autoconf configure scripts is to teach people not to add every stupid test autoscan suggests.

However, we developers are lazy, so we'll just use the first suggestion anyway. ;)
_________________
I *do* know that I easily aggravate people due to my condensed writing. Rule of thumb: If I wrote anything that can be understood in two different ways, and one way offends you, then I meant the other! ;)
mv
Advocate


Joined: 20 Apr 2005
Posts: 3789

Posted: Thu Oct 10, 2013 8:53 pm

Yamakuzure wrote:
Everything that gets compiled would end up in ccache already, but of course package-based, due to the paths ending up in the hash.

Not sure whether there was a misunderstanding: I was speaking about storing the result of the ./configure run in the cache (i.e. things like the cache file which ./configure stores for itself; I forget the name at the moment); one might even go further: for projects with easily parsable AC_CONFIG_FILES, one might store the files listed there and skip the configure run completely in case of a cache hit.
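For reference, the per-run cache file autoconf-generated scripts use is config.cache, and they can already be asked to keep and reuse it (the shared path below is just an example):

```shell
# -C is shorthand for --cache-file=config.cache in autoconf-generated
# configure scripts; results of checks are saved and reused on re-runs.
./configure -C

# A cache file shared between related runs trades safety for speed:
./configure --cache-file=/var/tmp/myproject.cache
```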
Quote:
However, the only way to speed up autoconf configure scripts is to teach people to not add any stupid test autoscan suggests.

Unfortunately, many of the tests are required because they set things in e.g. config.h. Moreover, many other tests run as dependencies of tests which are required. For many projects it would be really hard work to simplify configure.ac or to parallelize it. Moreover, things like xorg already have quite minimal configure scripts, cut down to tests for systems which are really supported.
grey_dot
Tux's lil' helper


Joined: 15 Jul 2012
Posts: 142

Posted: Thu Oct 10, 2013 9:39 pm

mv wrote:
Setting up a cache based on the checksum of ./configure, of the passed parameters, and of the environment (to catch CFLAGS etc.) has a good chance of working rather reliably.


This is a very bad idea, because a configure script might change very little (e.g. a comment line) and we get a cache miss. OTOH lots of configure scripts check for the same things, like the sys/types.h header. Eliminating that single check alone could save a lot of time.

A much more practical approach would be storing the results of the configure script (config.h header sections, paths, etc.) and providing those when building the package, as well as other packages too (the tests are mostly the same, remember).
grey_dot
Tux's lil' helper


Joined: 15 Jul 2012
Posts: 142

Posted: Thu Oct 10, 2013 9:47 pm    Post subject: Re: speeding up the build process

mv wrote:
some badly written configure scripts break with dash


My personal experience says that almost every fifth configure script is broken. Most of them ignore portage's ROOT setting (or is it a portage bug?) and try to link test programs with the libraries from /lib or /usr/lib instead of /target/lib, which breaks cross-compiling. Others try to actually run the test binaries, which breaks cross-compiling even more. The GnuPG configure script fetches the pth path from /usr/bin/pth-config and not /target/usr/bin/pth-config (again, a portage bug?), and refuses to go further if you build gnupg for a 32-bit platform while on 64-bit and vice versa. Some other packages' configure scripts are broken in even more stupid ways.

I wish there was a way of eliminating the autoconf madness entirely.
mv
Advocate


Joined: 20 Apr 2005
Posts: 3789

Posted: Fri Oct 11, 2013 2:59 am

grey_dot wrote:
mv wrote:
Setting up a cache based on the checksum of ./configure, of the passed parameters, and of the environment (to catch CFLAGS etc.) has a good chance of working rather reliably.


This is a very bad idea, because a configure script might change very little (e.g. a comment line) and we get a cache miss.

As I have written, the hit/miss rate will be about the same as for ccache.
Quote:
A much more practical approach would be storing the results of the configure script (config.h header sections, paths, etc.) and providing those when building the package, as well as other packages too (the tests are mostly the same, remember).

And if you have ever tried to do this, e.g. with the CONFIG_SITE method which I described (or if you have tried to use confcache), you will see that this simply does not work: for practically every such test there are some projects in which the defaults fail for one reason or another. Sometimes this is badly written autoconf code; sometimes it lies in the nature of things, because some project needs something not-so-standard.
Quote:
Others try to actually run the test binaries, which breaks cross-compiling even more.

The purpose of a configure script is to check whether certain features can be used. Sometimes this is compatible with cross-compiling, but often it is not. For instance, if you want to check whether a library has a known feature or a known failure, or whether a compiler exhibits a known bug, you often have to run them. If you know which system you are cross-compiling for and whether that system exhibits the tested feature or bug, you can easily inject the result into configure (the configure script would be broken if this were not possible), but you cannot expect those checks to magically guess the correct results when cross-compiling.
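A sketch of such an injection: the ac_cv_* names below are real autoconf cache variables, but which ones matter depends on the package, and the target triplet is just an example:

```shell
# Pre-seed the answers for checks that would otherwise try to *run* a
# test binary, which is impossible on the build host when cross-compiling.
export ac_cv_func_malloc_0_nonnull=yes
export ac_cv_func_realloc_0_nonnull=yes
./configure --host=armv7a-unknown-linux-gnueabi
```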
Quote:
I wish there was a way of eliminating the autoconf madness entirely.

No problem, if you live in an ideal world in which bugs never have to be worked around, all features are standardized, and every tiny tool and header file follows the standard.
grey_dot
Tux's lil' helper


Joined: 15 Jul 2012
Posts: 142

Posted: Tue Oct 15, 2013 9:43 pm

mv wrote:
Quote:
I wish there was a way of eliminating the autoconf madness entirely.

No problem, if you live in an ideal world in which bugs never have to be worked around, all features are standardized, and every tiny tool and header file follows the standard.


In an ideal world, people fix bugs instead of creating a pile of dirty, fragile kludges. I wish Linux people would someday do that too.
Hu
Watchman


Joined: 06 Mar 2007
Posts: 8594

Posted: Tue Oct 15, 2013 10:05 pm

Setting aside the needless provocation, you ignored the situation where end users run old buggy libraries that upstream fixed long ago. I would prefer to have my programs fail early with a request to upgrade the supporting library, rather than assume that since upstream fixed it, all users will be running the fix and no one will send me bogus bug reports arising from outdated copies of the supporting library.
mv
Advocate


Joined: 20 Apr 2005
Posts: 3789

Posted: Wed Oct 16, 2013 6:31 am

grey_dot wrote:
In an ideal world, people fix bugs instead of creating a pile of dirty, fragile kludges.

Yes, in an ideal world. In the real world there are many systems with broken shells, compilers, and libraries which people for some reason or another do not or cannot upgrade.
Quote:
I wish someday linux people will do that too.

You find such things more often on traditional unix systems (old Suns, just to name one example) than on Linux.
grey_dot
Tux's lil' helper


Joined: 15 Jul 2012
Posts: 142

Posted: Fri Oct 18, 2013 6:57 am

mv wrote:
grey_dot wrote:
In an ideal world, people fix bugs instead of creating a pile of dirty, fragile kludges.

Yes, in an ideal world. In the real world there are many systems with broken shells, compilers, and libraries which people for some reason or another do not or cannot upgrade.


And the autoconf solution, namely bloating every other package with an infinite number of kludges, most of which don't work, is just ridiculous.

mv wrote:
Quote:
I wish someday linux people will do that too.

You find such things more often on traditional unix systems (e.g. old suns, just to name one example) than on linux.


Sun is dead. So are most other unix systems, except for a bunch of BSDs. Speaking of which, I can't find a single configure script, except for some gcc/gnu software, in my openbsd source tree.

Code:
~/devel/openbsd/src> find . -type f -name "configure" | wc -l
61


Almost all of them are in the gnu directory (gcc, gdb, GNU binutils, several not-yet-replaced libraries). Same with FreeBSD. Do those count as traditional unices? And I can't really remember having a problem with building/cross-compiling those with pure make.

So it is mostly the GNU world that is infected with this disease called autotools, which was for some reason mistaken for a decent build system though it clearly is not.