TGL Bodhisattva
Joined: 02 Jun 2002 Posts: 1978 Location: Rennes, France
Posted: Mon Sep 01, 2003 1:54 pm Post subject: Tip for 56K users: fetch somewhere, emerge at home |
Following a discussion on the gentoo-users-fr mailing list, I've updated some old scripts that I thought I might share. They are intended for people who have a slow internet connection at home, but access (at work or anywhere else) to a fast one, plus some kind of removable media device.
The first script, emerge-uri.sh, is to be run at home; it outputs a list of URL sets (including your favorite mirrors) for the files a given emerge command has to fetch. Code: | #!/bin/bash
GOOD_FILES="`emerge -pf --nospinner $* 2>/dev/null | sed -n 's:>>> md5 src_uri \;-) ::p'`"
URI_LIST="`(emerge -pf --nospinner $* 1>/dev/null ; echo >&2) 2>&1 \
| sed 's:[^[:print:]]:\n:' \
| sed -n '/^http:/p ; /^ftp:/p'`"
for file in ${GOOD_FILES}; do
  # -F: match the file name literally, not as a regex
  URI_LIST="`echo "$URI_LIST" | grep -vF "$file"`"
done
echo "$URI_LIST" |
To save the list, use standard output redirection, as in this example: Code: | ./emerge-uri.sh -u world > world-update-list |
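The filtering step of emerge-uri.sh (dropping URL sets whose distfile emerge already reports as present) can be sketched in Python. This is an illustrative helper, not part of the script itself; the function and variable names are made up for the example.

```python
def filter_fetched(uri_sets, fetched_names):
    """Drop URL sets whose distfile was already fetched.

    uri_sets: one whitespace-separated set of mirror URLs per entry,
    all mirrors in a set pointing at the same distfile.
    fetched_names: distfile names emerge reported as already present.
    """
    kept = []
    for uri_set in uri_sets:
        # The distfile name is the last path component of any mirror URL.
        name = uri_set.split()[0].rsplit('/', 1)[-1]
        if name not in fetched_names:
            kept.append(uri_set)
    return kept

sets = [
    "http://mirror-a.example/distfiles/foo-1.0.tar.gz "
    "ftp://mirror-b.example/distfiles/foo-1.0.tar.gz",
    "http://mirror-a.example/distfiles/bar-2.1.tar.bz2",
]
# foo-1.0.tar.gz is already in distfiles, so only the bar set survives.
todo = filter_fetched(sets, {"foo-1.0.tar.gz"})
```

The shell version does the same thing with grep -v, one already-fetched file name at a time.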
The second script, download-uri.sh, is to be run wherever the fast connection is available. It takes a URL-set list file as argument and, for each file, tries each URL in the set until one can be downloaded (using wget). Code: | #!/bin/bash
if [ ! -f "$1" ] ; then
  echo "Can't read uri file \"$1\"" >&2
  exit 1
fi
cat "$1" \
| ( read uri
    errors=""
    while [ -n "$uri" ]; do
      found=""
      for u in $uri; do
        wget $u && found="yes" && break
      done
      [ -z "$found" ] && errors="$errors!!! ${uri##*/}\n"
      read uri
    done
    if [ -n "${errors}" ] ; then
      echo -e "\n!!! Here are the files I couldn't download:\n${errors}" >&2
      exit 1
    fi ) |
Here is an example of how to download the files for your world update: Code: | ./download-uri.sh world-update-list |
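The mirror-fallback loop at the heart of download-uri.sh (try each URL of a set in order until one works) can be sketched like this in Python. The `fetch` callable stands in for wget and is injected so the loop can be exercised without network access; all names here are illustrative.

```python
def fetch_with_fallback(uri_set, fetch):
    """Try each mirror of a whitespace-separated URL set in order;
    return the first URL for which fetch() succeeds, else None."""
    for url in uri_set.split():
        if fetch(url):
            return url
    return None

# Stand-in fetcher: pretend only mirror-b answers.
result = fetch_with_fallback(
    "http://mirror-a.example/f.tar.gz http://mirror-b.example/f.tar.gz",
    lambda url: "mirror-b" in url)
```

In the shell script the success test is simply wget's exit status, and a set whose every mirror fails is collected into the error report printed at the end.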
Now all you have to do is put these files on some removable media, take them back home, drop them into /usr/portage/distfiles, and you are ready to emerge.
Last edited by TGL on Mon Sep 01, 2003 2:29 pm; edited 1 time in total |
Lovechild Advocate
Joined: 17 May 2002 Posts: 2858 Location: Århus, Denmark
Posted: Mon Sep 01, 2003 2:00 pm Post subject: |
It might be added here that asking for permission before using company bandwidth like this is a good idea, or at least to run the job with "at" or sleep so it downloads during a period when other people's work is unlikely to be affected. Misusing company resources can be a very quick way to get oneself fired. |
TGL Bodhisattva
Joined: 02 Jun 2002 Posts: 1978 Location: Rennes, France
Posted: Mon Sep 01, 2003 2:28 pm Post subject: |
You're right, of course. I'll edit the title to change "at work" to "somewhere". |
asph l33t
Joined: 25 Aug 2003 Posts: 741 Location: Barcelona, Spain
Posted: Tue Sep 02, 2003 3:38 pm Post subject: great |
thanks for this script, will save a lot of time
btw, i'm not using it at work, lol (but at the university ) _________________ gentoo sex is updatedb; locate; talk; date; cd; strip; look; touch; finger; unzip; uptime; gawk; head; emerge --oneshot condom; mount; fsck; gasp; more; yes; yes; yes; more; umount; emerge -C condom; make clean; sleep |
Wechner n00b
Joined: 16 Sep 2003 Posts: 19 Location: Earth -> Germany -> Einbeck
Posted: Wed Dec 10, 2003 6:17 pm Post subject: |
hey guys
I had to solve the same problem myself, so I wrote a script in Python. Start it like
Code: | $ ./fetchscript |
or
Code: | $ ./fetchscript [package-name] |
It queries the files to download, removes already fetched files from the list, and finally creates two shell scripts named [package-name].1st.sh and [package-name].2nd.sh. The script ending in .1st.sh contains the wget commands for the primary download locations; the script ending in .2nd.sh, those for the secondary download locations. All you have to do now is take them to school / university / ..., execute them, and bring the fetched files back home.
The package name may also be "world" or "system".
And here comes the script:
Code: | #!/usr/bin/env python

import sys

def abort(msg):
    print 'ERROR:', msg
    print 'Aborting ...'
    sys.exit(1)

try:
    import commands
except ImportError:
    abort('Could not import module "commands"!')
try:
    import string
except ImportError:
    abort('Could not import module "string"!')

def promptPackage():
    while True:
        package = raw_input('Please enter a valid package name: ')
        if validatePkg(package):
            return package
        else:
            print 'Package does not exist! Try again.'

def validatePkg(package):
    print 'Validating package\'s existence ...',
    cmd = 'emerge -p ' + package
    outp = commands.getoutput(cmd)
    if outp.find('!!! Error calculating dependencies.') == -1:
        print 'ok'
        return True
    else:
        print 'failed'
        return False

print '### fetchscript.py v. 0.5 ###'
print 'Copyright (C) 2003, Martin Wegner'
print 'Released under the terms of the GNU General Public License (GPL).'

try:
    package = ''
    if len(sys.argv) == 2:
        package = sys.argv[1]
        if not validatePkg(package):
            package = promptPackage()
    else:
        package = promptPackage()
    print 'Querying file list for package', package, '...',
    cmd = 'emerge -pf ' + package
    outp = commands.getoutput(cmd)
    print 'ok'
    print 'Creating lists for primary and secondary download locations ...'
    lines = outp.splitlines()
    primary = []
    secondary = []
    for i in range(len(lines)):
        if lines[i].find('>>> md5 src_uri') != -1:
            length = len(primary)
            j = 0
            while j < length:
                if primary[j].find(lines[i][20:]) != -1:
                    del primary[j]
                    del secondary[j]
                    length = len(primary)
                    print lines[i][20:], 'already fetched'
                else:
                    j += 1  # only advance when nothing was deleted
            # primary = primary[:-1]
            # secondary = secondary[:-1]
            continue
        if lines[i].find('://') == -1:  # omit lines that contain no links
            continue
        uris = lines[i].split(' ')
        primary += [ uris[0] ]
        if len(uris) >= 2:
            secondary += [ uris[1] ]
        else:
            secondary += [ uris[0] ]
    print primary
    print secondary
    if len(primary) == 0 and len(secondary) == 0:
        print '-- Nothing to be downloaded. Done.'
        sys.exit(0)
    print '-- Creating shell scripts', package + '.1st.sh', 'and', package + '.2nd.sh', '...',
    script = file(package + '.1st.sh', 'w')
    script.write('#!/bin/sh\n\n')
    for uri in primary:
        script.write('wget -c ' + uri + '\n')
    script.close()
    script = file(package + '.2nd.sh', 'w')
    script.write('#!/bin/sh\n\n')
    for uri in secondary:
        script.write('wget -c ' + uri + '\n')
    script.close()
    print 'ok'
    print '-- All jobs done.'
except KeyboardInterrupt:
    abort('Interrupted!')
|
hope it will be helpful.
If you discover any bugs, feel free to send feedback to wechner@users.sourceforge.net
For the latest release, visit http://www.mwegner.de.ms/index.php?page=codes&sub=compi&code=3 _________________ > HUMAN KNOWLEDGE BELONGS TO THE WORLD!
> (Programming|Script|Markup)-Languages: C, C++, PHP, Python, (X)?HTML, CSS, JavaScript, XML, XSL(T)?
Including: SDL, wxWidgets, MySQL
> Visit our project The Craft |
jago25_98 Apprentice
Joined: 23 Aug 2002 Posts: 180
RudyG n00b
Joined: 25 Feb 2004 Posts: 32
Posted: Thu Apr 01, 2004 7:01 am Post subject: |
i used these scripts and they work great!!! portage at home, download at school, it's great |
_Vinz_ n00b
Joined: 22 Jan 2004 Posts: 20 Location: Vannes, France
Posted: Fri Jul 23, 2004 12:31 pm Post subject: |
Hiya,
I use the easier method of:
(at home) Code: | emerge -puf world 2>&1 > emerge_list |
(at work) Code: | cat emerge_list | xargs wget -nc -nd |
The -nd wget option prevents wget from recreating the directory structure of the mirror, and the -nc option makes sure it does not redownload a previously fetched file.
IIRC this is the method advised by the Gentoo Handbook (along with the use of the emerge -pf options)
Vinz
EDIT : indeed my method does not check for md5 validity of downloaded packages |
skiman1979 n00b
Joined: 11 May 2004 Posts: 37 Location: PA, USA
Posted: Wed Nov 17, 2004 3:12 pm Post subject: emerge problems on dialup |
First off, I have a syntax question. What is '2>&1' in the command 'emerge -puf world 2>&1 > emerge_list'? Wouldn't you just redirect the output from emerge to the emerge_list file? I've never seen the 2>&1 syntax before.
I recently tried to follow the instructions from http://gentoo-wiki.com/TIP_Gentoo_for_dialup_users to update my gentoo system. I downloaded the latest portage snapshot on a broadband connection, took it home, and extracted it to /usr/. Then I executed 'emerge regen' which took forever. I saw some red text fly off the screen too quick to read.
Once that was done, I tried to 'emerge -fpu world 2> links.txt' and it created the file for me. However, emerge also complained that my profile needed to be updated. /etc/make.profile was a symlink to a 2004.0 profile, but emerge said to update the link to point to a 2004.0 profile but with a different path. (I'm not at my system right now so I don't have the exact paths. Both under /usr/portage I believe.) So I updated the profile link.
My problem is that now when I execute 'emerge -puv world' (to see what will be downloaded, file sizes, etc. it returns a long list something to the effect of 'QA Note: awk in global space: sys-apps/devel/gcc...' (Sorry if my quotes are a little off. I'm working from memory.)
Do I still take the created links.txt file to a broadband connection, download with wget, dump the files in /usr/portage/distfiles, and proceed with emerge -u world? Or did something go wrong? If anyone could help me I'd appreciate it. Thanks. |
skiman1979 n00b
Joined: 11 May 2004 Posts: 37 Location: PA, USA
Posted: Thu Nov 18, 2004 1:28 pm Post subject: QA Notice: problem fixed |
Accidentally posted this as a new topic earlier... (the new topic and reply buttons are so close together... )
I found a page on another forum that told me the 'QA Notice:' messages are aimed at the developers and that users should just ignore them. So I'll just continue with the instructions found at the link above. _________________ Having a smoking section in a public restaurant is much like having a peeing section in a public swimming pool. |
flybynite l33t
Joined: 06 Dec 2002 Posts: 620
Posted: Fri Nov 19, 2004 2:36 am Post subject: |
This is a topic that just keeps coming back. Here is a simple fix.
Without network / slow network (must be run as root):
Code: |
for url in $(http_proxy=null ftp_proxy=null emerge -uDf world 2>&1); do echo "$url" | egrep 'http://|ftp://'; done > download_list
|
With network:
Code: |
wget -nc -nd -i download_list
|
This will really only download the files not present on the box without the network, and will try all mirrors portage knows about till it finds each file....
Is this still a simple script that meets all your needs? |
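The shell loop above keeps only the tokens that look like download URLs. A minimal Python equivalent of that filter (the helper name is made up for illustration):

```python
def extract_urls(emerge_output):
    """Keep only whitespace-separated tokens that are http:// or
    ftp:// URLs, like the egrep in the shell loop."""
    return [tok for tok in emerge_output.split()
            if tok.startswith(("http://", "ftp://"))]

# emerge -f output mixes status noise with the URLs we want.
sample = (">>> md5 src_uri ;-) foo.tar.gz\n"
          "http://m1.example/foo.tar.gz ftp://m2.example/foo.tar.gz")
urls = extract_urls(sample)
```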
skiman1979 n00b
Joined: 11 May 2004 Posts: 37 Location: PA, USA
Posted: Fri Nov 19, 2004 1:48 pm Post subject: emerge scripts |
You say your script *really* just downloads the files that haven't been fetched yet. What more would 'emerge -ufp world 2> emerge_list' download that your script doesn't? At any rate, I'll give it a shot when I get a chance. Thanks. _________________ Having a smoking section in a public restaurant is much like having a peeing section in a public swimming pool. |
flybynite l33t
Joined: 06 Dec 2002 Posts: 620
Posted: Fri Nov 19, 2004 6:46 pm Post subject: Re: emerge scripts |
skiman1979 wrote: | You say those scripts *really* just download the URLs for the files that haven't been updated yet. What more would 'emerge -ufp world 2> emerge_list' download that your script doesn't? |
The problem is that 'emerge -ufp world' will download EVERY file needed to update world, even if the file is already downloaded!!!
Here is an example, I'll fetch every file to install/update netcat:
(Note the transfer speeds (17.56 MB/s)!!! )
Code: |
# emerge -f netcat
Calculating dependencies ...done!
>>> emerge (1 of 3) net-libs/libnet-1.1.2.1 to /
>>> Downloading http://gentoo.osuosl.org/distfiles/libnet-1.1.2.1.tar.gz
--12:32:41-- http://gentoo.osuosl.org/distfiles/libnet-1.1.2.1.tar.gz
=> `/usr/portage/distfiles/libnet-1.1.2.1.tar.gz'
Resolving gate1.homenet.com... 127.0.0.1
Connecting to gate1.homenet.com[127.0.0.1]:8080... connected.
Proxy request sent, awaiting response... 200 OK
Length: 1,021,236
100%[================================================================================>] 1,021,236 --.--K/s
12:32:41 (8.07 MB/s) - `/usr/portage/distfiles/libnet-1.1.2.1.tar.gz' saved [1021236/1021236]
>>> libnet-1.1.2.1.tar.gz size ;-)
>>> libnet-1.1.2.1.tar.gz MD5 ;-)
>>> md5 src_uri ;-) libnet-1.1.2.1.tar.gz
>>> emerge (2 of 3) dev-libs/libmix-2.05 to /
>>> Downloading http://gentoo.osuosl.org/distfiles/libmix-205.tgz
--12:32:41-- http://gentoo.osuosl.org/distfiles/libmix-205.tgz
=> `/usr/portage/distfiles/libmix-205.tgz'
Resolving gate1.homenet.com... 127.0.0.1
Connecting to gate1.homenet.com[127.0.0.1]:8080... connected.
Proxy request sent, awaiting response... 200 OK
Length: 79,860
100%[================================================================================>] 79,860 --.--K/s
12:32:41 (17.48 MB/s) - `/usr/portage/distfiles/libmix-205.tgz' saved [79860/79860]
>>> libmix-205.tgz size ;-)
>>> libmix-205.tgz MD5 ;-)
>>> md5 src_uri ;-) libmix-205.tgz
>>> emerge (3 of 3) net-analyzer/netcat-110-r6 to /
>>> Downloading http://gentoo.osuosl.org/distfiles/nc-v6-20000918.patch.gz
--12:32:42-- http://gentoo.osuosl.org/distfiles/nc-v6-20000918.patch.gz
=> `/usr/portage/distfiles/nc-v6-20000918.patch.gz'
Resolving gate1.homenet.com... 127.0.0.1
Connecting to gate1.homenet.com[127.0.0.1]:8080... connected.
Proxy request sent, awaiting response... 200 OK
Length: 8,740
100%[================================================================================>] 8,740 --.--K/s
12:32:42 (17.12 MB/s) - `/usr/portage/distfiles/nc-v6-20000918.patch.gz' saved [8740/8740]
>>> nc-v6-20000918.patch.gz size ;-)
>>> nc-v6-20000918.patch.gz MD5 ;-)
>>> Downloading http://gentoo.osuosl.org/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2
--12:32:42-- http://gentoo.osuosl.org/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2
=> `/usr/portage/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2'
Resolving gate1.homenet.com... 127.0.0.1
Connecting to gate1.homenet.com[127.0.0.1]:8080... connected.
Proxy request sent, awaiting response... 200 OK
Length: 13,176
100%[================================================================================>] 13,176 --.--K/s
12:32:42 (8.76 MB/s) - `/usr/portage/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2' saved [13176/13176]
>>> netcat-110-r6-gentoo-deb-patches.tbz2 size ;-)
>>> netcat-110-r6-gentoo-deb-patches.tbz2 MD5 ;-)
>>> Downloading http://gentoo.osuosl.org/distfiles/nc110.tgz
--12:32:42-- http://gentoo.osuosl.org/distfiles/nc110.tgz
=> `/usr/portage/distfiles/nc110.tgz'
Resolving gate1.homenet.com... 127.0.0.1
Connecting to gate1.homenet.com[127.0.0.1]:8080... connected.
Proxy request sent, awaiting response... 200 OK
Length: 75,267
100%[================================================================================>] 75,267 --.--K/s
12:32:42 (17.56 MB/s) - `/usr/portage/distfiles/nc110.tgz' saved [75267/75267]
>>> nc110.tgz size ;-)
>>> nc110.tgz MD5 ;-)
>>> md5 src_uri ;-) nc110.tgz
>>> md5 src_uri ;-) nc-v6-20000918.patch.gz
>>> md5 src_uri ;-) netcat-110-r6-gentoo-deb-patches.tbz2
|
So every file has been downloaded, is present, and has been md5 checked. Now watch as 'emerge -ufp' lists duplicate files!! Every one of these files is a duplicate download. You could waste your time downloading duplicate files if you use 'emerge -ufp'!!
Code: |
gate1 root # emerge -ufp netcat
Calculating dependencies ...done!
http://gentoo.osuosl.org/distfiles/libnet-1.1.2.1.tar.gz http://gentoo.osuosl.org/distfiles/libnet-1.1.2.1.tar.gz http://www.gtlib.cc.gatech.edu/pub/gentoo/distfiles/libnet-1.1.2.1.tar.gz http://www.packetfactory.net/libnet/dist/libnet-1.1.2.1.tar.gz
http://gentoo.osuosl.org/distfiles/libmix-205.tgz http://gentoo.osuosl.org/distfiles/libmix-205.tgz http://www.gtlib.cc.gatech.edu/pub/gentoo/distfiles/libmix-205.tgz http://mixter.void.ru/libmix-205.tgz
http://gentoo.osuosl.org/distfiles/nc-v6-20000918.patch.gz http://gentoo.osuosl.org/distfiles/nc-v6-20000918.patch.gz http://www.gtlib.cc.gatech.edu/pub/gentoo/distfiles/nc-v6-20000918.patch.gz ftp://sith.mimuw.edu.pl/pub/users/baggins/IPv6/nc-v6-20000918.patch.gz
http://gentoo.osuosl.org/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 http://gentoo.osuosl.org/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 http://www.gtlib.cc.gatech.edu/pub/gentoo/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 http://ftp.gentoo.or.kr/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 ftp://gentoo.mirrored.ca/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 ftp://ftp.snt.utwente.nl/pub/os/linux/gentoo/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 ftp://ftp.rez-gif.supelec.fr/pub/Linux/distrib/gentoo/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 http://gentoo.oregonstate.edu/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 ftp://ftp.oregonstate.edu/pub/gentoo/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 ftp://ftp.tu-clausthal.de/pub/linux/gentoo/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 http://open-systems.ufl.edu/mirrors/gentoo/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 http://gentoo.mirrored.ca/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 http://cudlug.cudenver.edu/gentoo/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 ftp://lug.mtu.edu/gentoo/source/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 http://mirrors.tds.net/gentoo/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 ftp://ftp.gtlib.cc.gatech.edu/pub/gentoo/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 http://gentoo.mirrors.pair.com/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 http://ftp.snt.utwente.nl/pub/os/linux/gentoo/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 ftp://cudlug.cudenver.edu/pub/mirrors/distributions/gentoo/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2 http://distro.ibiblio.org/pub/linux/distributions/gentoo/distfiles/netcat-110-r6-gentoo-deb-patches.tbz2
http://gentoo.osuosl.org/distfiles/nc110.tgz http://gentoo.osuosl.org/distfiles/nc110.tgz http://www.gtlib.cc.gatech.edu/pub/gentoo/distfiles/nc110.tgz http://www.atstake.com/research/tools/network_utilities/nc110.tgz
|
Why would you want to download all those packages again and transfer them to the slow-network box when many or all of the files are already on it!!
My script will only download packages that are really needed for the update, and will not download duplicate packages the way 'emerge -ufp world' does |
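The effect of wget's -nc on a list with duplicate URLs can be modelled simply: once a file name has been fetched (or already exists locally), every later URL for that name is skipped. A small sketch with the downloader injected so it runs without network access; names are illustrative.

```python
def fetch_nc(urls, present, fetch):
    """Mimic wget -nc over a URL list that may repeat file names:
    a URL is skipped when its file is already present locally."""
    fetched = []
    for url in urls:
        name = url.rsplit('/', 1)[-1]
        if name in present:
            continue          # duplicate or pre-existing file: never refetch
        fetch(url)            # injected downloader (wget stand-in)
        present.add(name)
        fetched.append(url)
    return fetched

# Two mirrors list the same distfile: only the first URL is fetched.
got = fetch_nc(
    ["http://a.example/nc110.tgz", "http://b.example/nc110.tgz"],
    set(), lambda url: None)
```

Seeding `present` with the names already in distfiles is what makes the second pass on the fast box skip files the slow box already has.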
skiman1979 n00b
Joined: 11 May 2004 Posts: 37 Location: PA, USA
Posted: Mon Nov 22, 2004 1:15 pm Post subject: emerge -ufp world |
If 'emerge -ufp world' lists files that are already in /usr/portage/distfiles, then wouldn't that be a bug? If your distfiles folder already has the most recent package, it shouldn't be downloaded, I'd think. I'll give those scripts a try once I finish installing my system. I had to reinstall because the upgrade wasn't working right. Now the install isn't cooperating with me either, but that's another story. It seems some of the files needed for the install are not on the CD, even though it is supposed to be a networkless install. :-/ _________________ Having a smoking section in a public restaurant is much like having a peeing section in a public swimming pool. |
jfindlay n00b
Joined: 12 Feb 2005 Posts: 2
Posted: Sat Feb 12, 2005 7:37 pm Post subject: |
Quote: | This will really only download the files not present on the box without the network, and will try all mirrors portage knows about till it finds the file....
Is this still a simple script and meets all your needs?
|
I've run your script, and it still seems to populate the output file with duplicate URLs, even after commenting out the GENTOO_MIRRORS definition in /etc/make.conf. |
spOOwn Apprentice
Joined: 02 Nov 2002 Posts: 259 Location: Belgium
Posted: Fri Mar 04, 2005 7:56 pm Post subject: |
Wechner's Python script works great for me!!! Thanks to him |
Nitro_146 Apprentice
Joined: 02 Mar 2005 Posts: 221 Location: Digne les bains, France
Posted: Wed Mar 16, 2005 10:45 pm Post subject: |
Is this supposed to work with most linux distros or only gentoo for the "somewhere" machine ? _________________ Linux, cause booting is for adding new hardware |
flybynite l33t
Joined: 06 Dec 2002 Posts: 620
Posted: Sat Mar 19, 2005 10:18 am Post subject: |
jfindlay wrote: | Quote: | This will really only download the files not present on the box without the network, and will try all mirrors portage knows about till it finds the file....
|
I've run your script, and it stil seems to populate the output file with duplicate URLs, even after commenting out the GENTOO_MIRRORS definition in /etc/make.conf. |
There are many more mirrors in Gentoo than the ones listed in GENTOO_MIRRORS. Yes, the output file has DUPLICATE URLS, but only the first one that succeeds is actually downloaded. This is a feature. See man wget, option -nc; the manpage is difficult to understand, but trust me, it works and only one copy is downloaded....
Nitro_146 wrote: | Is this supposed to work with most linux distros or only gentoo for the "somewhere" machine ? |
My simple solution doesn't require Gentoo, only wget. |
regeya Apprentice
Joined: 28 Jul 2002 Posts: 270 Location: Desoto, IL, USA
Posted: Fri May 20, 2005 4:11 pm Post subject: Re: emerge scripts |
flybynite wrote: | skiman1979 wrote: | You say those scripts *really* just download the URLs for the files that haven't been updated yet. What more would 'emerge -ufp world 2> emerge_list' download that your script doesn't? |
The problem is 'emerge -ufp world' will download EVERY file needed to update world even if the file is already downloaded!!!
8< - - - snip 8< - - - snip - - - 8<
Why would you want to download all those packages again and transfer them to the slow network box when many/all of the files are already on the slow network box!! |
Code: | emerge -ufpD world 2>&1 | awk '{ print $1}' | sort | uniq | tee distlist.txt |
I'll check that later, but I think that removes the duplicates from the list. It should also only update the things that aren't installed yet. I ran this last night to make sure it didn't download absolutely everything, and as far as I could tell it didn't. You make me wonder if I should double-check my results when I get home, though.
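The awk/sort/uniq stage in that one-liner keeps only the first mirror of each line and removes duplicates. The same reduction in Python (helper name invented for the example):

```python
def first_mirrors(lines):
    """Python equivalent of awk '{print $1}' | sort | uniq: keep the
    first URL of each non-empty line, then sort and deduplicate."""
    return sorted({line.split()[0] for line in lines if line.strip()})

lines = [
    "http://a.example/foo.tgz http://b.example/foo.tgz",
    "http://a.example/foo.tgz http://c.example/foo.tgz",
    "http://a.example/bar.tgz",
]
unique = first_mirrors(lines)
```

The trade-off is the one flybynite's fallback approach avoids: keeping only the first mirror means there is no alternative URL to try if that mirror is down.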
Geez, cut back on the caffeine. _________________ Why, yes, I am a bore. |
A.S. Pushkin Guru
Joined: 09 Nov 2002 Posts: 418 Location: dx/dt, dy/dt, dz/dt, t
Posted: Fri Jun 03, 2005 4:34 am Post subject: And take advantage of deltup |
I have a terribly slow dialup connection.
Thanks to deltup and related scripts I have managed to update several very large sources: xorg, the kernel, and others. I did have to impose on a friend with a cable modem for OO, but otherwise deltup has been great!
BTW, do not use a CFD for big files like OO. Too slow.
My $.02. _________________ ASPushkin
"In a time of universal deceit - telling the truth is a revolutionary act." -- George Orwell |
transcat n00b
Joined: 10 Feb 2006 Posts: 3
Posted: Sat Apr 01, 2006 10:22 pm Post subject: many of these techniques don't really work! |
I just wanted to post my experiences with the download methods described in this thread.
1. There are circumstances under which the technique of using "emerge -f $packages" will not work on a disconnected machine. In particular, if an ebuild requires more than one source file, emerge will attempt to download the first one (which will fail because the machine is not on a network), will skip fetching the other source files for that ebuild, and will move on to the next merge. The result is that, using this method on a disconnected machine, only URLs for the first source file for each ebuild/merge will appear in the output.
2. Using Code: | FETCHCOMMAND='echo "${URI}" >> /tmp/fetchlist' RESUMECOMMAND='echo "${URI}" >> /tmp/fetchlist' emerge -f $packages | will not work, either, for the same reason as for #1 above.
3. Using the "emerge -fp $packages 2> foo" method as described above in this thread will not work with the latest version of portage. This is because portage outputs multiple URLs on a single line. When wget is given this input (-i), it will url-encode the spaces (to %20), and (obviously) any file with multiple sources will return 404 (HTTP Not Found).
A post in this thread says
Quote: |
Yes, the output file has DUPLICATE URLS, but only the first one that succeeds is actually downloaded. This is a feature. See man wget, option -nc, although the manpage is difficult to understand, but trust me, it works and only one copy is downloaded....
|
But that's only partially true. wget cannot handle multiple URLs on a single line of (-i) input, but it can ignore extra URLs for a file if they are listed on separate lines.
4. The "emerge -fp $packages" method can...with some modification...be made to work.
Code: |
# store a newline in a shell variable (needed by the cut command)
nl='
'
# make sure to substitute $packages, $DISTDIR, and tempfile* as appropriate
# (this example uses temporary files named tempfile1, tempfile2, and tempfile3)
emerge -fp $packages 2> tempfile1
# ...and you still get to see all the stdout from emerge
cat tempfile1 | grep -v '^!!!' | cut -d " " -f 1- --output-delimiter="$nl" | grep -v '^$' | awk -F / '{print $NF" "$0}' | sort -r | uniq > tempfile2
# this strips out fetch-restrict messages, turns spaces into newlines,
# removes all the blank lines (feel free to translate these steps to sed),
# sorts the list by source file name, and removes lines with identical URLs
# (the awk, sort, and uniq are not necessary, but help to optimize retrieval)
# Option #1: Use THIS if you want to get all the URLs,
# REGARDLESS of what you have in your DISTFILES directory.
cat tempfile2 | cut -d " " -f 2 > tempfile3
# Option #2: Use THIS to skip files already in your DISTFILES directory.
last=""
echo=0
cat tempfile2 | while read file url; do test "$file" != "$last" && echo=1 && test -f "$DISTDIR/$file" && echo=0; test "$echo" -eq 1 && echo "$url"; last="$file"; done > tempfile3
# then, on the connected machine, run:
wget -T 20 -t 2 -c -nc -nd -i tempfile3
# using, of course, whatever values for -T and -t you think appropriate
|
I've tested this code (with appropriate settings for the variables), and it seems to function correctly on a disconnected machine. Good luck to you! |
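The idea behind Option #2 above (skip every URL for a file already in DISTFILES, but keep all mirror URLs, one per line, for files still needed) can be sketched in Python. The `have` set stands in for a listing of $DISTDIR; the function name is invented for the example.

```python
def urls_to_fetch(uri_lines, have):
    """Emit one URL per entry, skipping every URL whose distfile is
    already present; `have` models the file names in $DISTDIR."""
    out = []
    for line in uri_lines:
        for url in line.split():
            name = url.rsplit('/', 1)[-1]
            if name not in have:
                out.append(url)
    return out

uri_lines = [
    "http://a.example/foo.tgz http://b.example/foo.tgz",
    "http://a.example/bar.tgz",
]
# bar.tgz is already in DISTDIR, so only foo's mirrors are emitted.
needed = urls_to_fetch(uri_lines, {"bar.tgz"})
```

Because every kept URL lands on its own line, the resulting file is safe for wget -nc -i: duplicates for the same file are retried only until one copy succeeds.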
spOOwn Apprentice
Joined: 02 Nov 2002 Posts: 259 Location: Belgium
Posted: Fri May 12, 2006 10:09 pm Post subject: |
Wechner wrote: | hey guys
hope it will be helpful.
|
Sure, very useful, great tools for me
Good work |
yngwin Retired Dev
Joined: 19 Dec 2002 Posts: 4572 Location: Suzhou, China