HOWTO: Central Gentoo Mirror for your Internal Network

Unofficial documentation for various parts of Gentoo Linux. Note: This is not a support forum.
134 posts
rickj
Guru
Posts: 432
Joined: Thu Feb 06, 2003 8:30 pm
Location: Calgary, Alberta, Canada

Post by rickj » Mon Jan 05, 2004 7:07 pm

The original post is a great method, and works well for me. Saves time and net bandwidth.

Just a minor buglet:
FETCHCOMMAND="rsync rsync://<your Portage gateway's IP or DNS>/gentoo-packages/\${FILE} {DISTDIR}"
seems to be missing a $; mine works as:

Code:

FETCHCOMMAND="rsync rsync://<your Portage gateway's IP or DNS>/gentoo-packages/\${FILE} ${DISTDIR}"
Thanks for a truly useful HOWTO.
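For anyone puzzled by the backslash: make.conf is parsed like a shell file, so an unescaped variable would be expanded the moment the file is read, while Portage needs the literal ${FILE} and ${DISTDIR} so it can substitute them itself at fetch time. A rough simulation of that substitution (the gateway IP and filename are made up for the example):

```shell
# Simulate Portage substituting its placeholders into FETCHCOMMAND.
# The template is what Portage sees after make.conf is parsed: the \$
# in make.conf kept ${FILE} and ${DISTDIR} literal until this point.
template='rsync rsync://192.168.0.1/gentoo-packages/${FILE} ${DISTDIR}'
FILE='zip-2.3.tar.gz'
DISTDIR='/usr/portage/distfiles'
cmd=$(printf '%s' "$template" | sed -e "s|\${FILE}|$FILE|" -e "s|\${DISTDIR}|$DISTDIR|")
echo "$cmd"
# prints: rsync rsync://192.168.0.1/gentoo-packages/zip-2.3.tar.gz /usr/portage/distfiles
```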
megalomax
n00b
Posts: 52
Joined: Thu Feb 06, 2003 3:23 am
Location: germany

what to do when the package is not on the server?


Post by megalomax » Fri Jan 16, 2004 3:13 pm

Hi there!

I really LOVE your setup. But all the other suggestions are really confusing the hell out of me 8O

I'm still trying to figure out what to do if a package is not on the server and thus needs to be downloaded by the server, not by the client...

Is there a way to do it with the original method? Do I need some bash IF...THEN routines (I only know very basic bash stuff, sorry)?
Or maybe send the package back to the server once the client has downloaded it (by scp or something)?

Or would I have to use a different approach???

thanks for your input...

man, I love these forums
:wink:
megalomax
n00b
Posts: 52
Joined: Thu Feb 06, 2003 3:23 am
Location: germany

@savage and all the other helpful wizards around here...


Post by megalomax » Fri Jan 16, 2004 4:10 pm

Hi again ...

I don't want to spam here, but I just saw the nice work of savage...
I followed his instructions, but I'm not sure what I get on my machine.

I did some testing, and this is what happened...

machineServer: no distfile of a certain package (<fam> in my case) present
machineClient: old version of <fam> is present, needs updating...

1) client: emerge -U fam

2) client tries to get the package from the server... not present...

3) client downloads the package and installs it

4) package still not present on the server...


Is there something wrong with my make.conf?
Should I see an error when the php solution fails for some reason?
I just recently upgraded to apache2, but I don't know if this has something to do with all this...
Is the location of the apache htdocs dir different in this case?

:roll:

cheers
Ateo
Advocate
Posts: 2022
Joined: Mon Jun 02, 2003 11:47 pm
Location: Vegas Baby!


Post by Ateo » Sun Jan 25, 2004 3:58 am

This works excellently. I've just cut my stage 1 install times on workstations by a third. Thanks for the howto!!!

Gentoo Rocks!
savage
Apprentice
Posts: 161
Joined: Wed Jan 01, 2003 4:44 pm

an update is coming!


Post by savage » Mon Jan 26, 2004 5:06 pm

megalomax - just saw your message;

I am looking into what has to be done in a current Gentoo setup and will update the original post and let you know when it works.
savage
Apprentice
Posts: 161
Joined: Wed Jan 01, 2003 4:44 pm

A way to get files that actually works in a network environment


Post by savage » Mon Jan 26, 2004 7:21 pm

ok folks,
holler if I am missing something or making no sense. I seem to do that sometimes :-)

This is a way that was posted earlier by me, and (now that I am back in Gentoo) it doesn't seem to work. Here is what is working for me now. Please give me feedback if you want / need changes.

Code:

<?php
//put in /var/www/localhost/htdocs/getFile.php on server
$packageSrc = trim($_GET["packageSrc"]);
$packageName = strrchr($packageSrc,"/");
$packageName = trim($packageName, "/");

if($packageName != "")
{
  @$fileHandle = fopen("/usr/portage/distfiles/" . $packageName, "r");
  if(!$fileHandle)
  {
    exec(escapeshellcmd("/usr/local/sbin/getPackageFromMirror $packageSrc"));
    @$fileHandle = fopen("/usr/portage/distfiles/" . $packageName, "r");
    if(!$fileHandle)
    {
      print "Unable to get File from remote Server\n";
      exit;
    }
    else
    {
      fpassthru($fileHandle);
      exit;
    }
  }
  else
  {
    fpassthru($fileHandle);
    exit;
  }
}
?>
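As a quick sanity check of the filename handling above (the strrchr()/trim() pair is just "take the basename of the URL"), here is the same logic in shell; the example URL is illustrative:

```shell
# Same derivation as the PHP strrchr()+trim() pair: keep everything
# after the last '/' of the requested package URL.
packageSrc='http://distfiles.gentoo.org/distfiles/zip-2.3.tar.gz'
packageName=${packageSrc##*/}
echo "$packageName"
# prints: zip-2.3.tar.gz
```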
and the C program:

Code:

/*
put in /usr/local/src
cd to /usr/local/src
gcc -s -o getPackageFromMirror getPackageFromMirror.c
cp -v getPackageFromMirror /usr/local/sbin/
chown -v root.root /usr/local/sbin/getPackageFromMirror
chmod -v 4775 /usr/local/sbin/getPackageFromMirror
*/
#include <stdio.h>
#include <unistd.h>
#include <string.h>
#include <strings.h>
#include <errno.h>

int main(int argc, char* argv[])
{
  extern int errno;
  char wgetCommand[1024];
  char fileTarget[1024];
  char* tok;

  if ((argc < 2) || (argc > 3))
  {
    printf("Usage: %s <packageToRetrieve> [packageName]\n", argv[0]);
    return(-1);
  }

  memset(wgetCommand,0,1024);
  memset(fileTarget,0,1024);

  snprintf(wgetCommand,1024,"%s",argv[1]); /* "%s" keeps the URL from being treated as a format string */
  if(argc == 3)
  {
    snprintf(fileTarget,1024,"/usr/portage/distfiles/%s",argv[2]);
  }
  else
  {
    tok = strrchr(wgetCommand,'/');
    if(tok != NULL)
    {
    	snprintf(fileTarget,1024,"/usr/portage/distfiles/%s",tok+1);
    }
  }
  execl("/usr/bin/wget","wget","-q","-N","-O",fileTarget,wgetCommand, NULL);
  //printf("%s\n",strerror(errno));
} 
put this on your "proxy box"
and update your "FETCHCOMMAND" in "/etc/make.conf" to be:

Code:

FETCHCOMMAND="/usr/bin/wget -t 5 -O \${DISTDIR}/\${FILE} http://[proxyBoxHere]/getFile.php?packageSrc=\${URI}"
Let me know.

Edit: added exit; statements after the output is sent - thanks to Aneurysm9
Last edited by savage on Sun Feb 08, 2004 10:31 pm, edited 2 times in total.
Aneurysm9
n00b
Posts: 21
Joined: Sun Feb 08, 2004 2:00 am


Post by Aneurysm9 » Sun Feb 08, 2004 2:07 am

I'm not sure if it's related to my PHP setup or to your script, but it was adding a newline to the end of the files it was sending me, resulting in failed MD5 checks. I eventually figured out that adding "exit;" at the end of the main if block fixed the problem.
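For anyone hitting the same thing: any byte the PHP page emits outside fpassthru() (for example a newline after the closing ?> tag) gets appended to the served tarball, and a single extra byte is enough to change the checksum. A minimal demonstration:

```shell
# Two files differing only by one trailing newline produce different
# MD5 sums; this is exactly why the fetched distfiles failed digest checks.
printf 'tarball bytes' > /tmp/clean.bin
printf 'tarball bytes\n' > /tmp/padded.bin
md5sum /tmp/clean.bin /tmp/padded.bin
```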
not_registered
Tux's lil' helper
Posts: 148
Joined: Tue Feb 04, 2003 10:26 am


Post by not_registered » Sat Mar 06, 2004 5:02 am

I don't know what I'm talking about, but can't you use SQUID to do this somehow?
It's Floam, it's Floam. It's flying foam!
savage
Apprentice
Posts: 161
Joined: Wed Jan 01, 2003 4:44 pm

Squid question


Post by savage » Mon Mar 08, 2004 1:06 pm

Yes!

You can use squid to do this, but all of the files are stored in a human-unreadable format in its caching directories (names like ac3x5rvfdaiwldk instead of reiserfsprogs-xxxx, etc.). Also, when you are burning hard disk space to store both that cache and your distfiles on one computer, you are shelling out twice as much disk space as if you do it the way above.

savage
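If anyone does want to try the squid route despite those trade-offs, the moving parts would look roughly like this; the hostname, port, and sizes are illustrative guesses, not a tested configuration:

```shell
# Client /etc/make.conf: route Portage's wget fetches through the proxy
# ("proxybox" is a placeholder hostname; 3128 is squid's default port).
http_proxy="http://proxybox:3128"
ftp_proxy="http://proxybox:3128"

# Server squid.conf: raise the cached-object size limit, since the default
# is far smaller than a typical distfile (values illustrative):
#   maximum_object_size 102400 KB
#   cache_dir ufs /var/cache/squid 8192 16 256
```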
linkfromthepast
n00b
Posts: 23
Joined: Thu Mar 18, 2004 3:04 pm

Another way


Post by linkfromthepast » Thu Mar 18, 2004 3:43 pm

Here's a little Perl script I wrote to take care of the package download and serving. We use this on our internal network of approximately 100 nodes. So far everything seems to work correctly, although I'm sure there is a lot of room for improvement. As far as security is concerned, there is none; I'm sure a lot can be built in, but we filter by address with ipfilter, so I didn't feel it was necessary. For some this may be complete garbage, others may use it, but if you find anything in it useful then I think it was worth posting.

File: dist.pl

Code:


#!/usr/bin/perl
########################################################
#       GENTOO LOCAL MIRROR
#This script provides local-mirror functionality for Gentoo.
#Simply point the mirror setting on the client machines at the web server
#that is serving this script.
########################################################

########################################################
#       OUTLINE
# 1.) A request comes in for a file; if the file is not present, a 404 error occurs and, via .htaccess, this script is called
# 2.) The script then downloads the file from one of the mirrors listed below, saves it in a web-accessible directory, and sends
#       a Location header to the wget that emerge is using, redirecting it to the file
# 3.) If the file DOES exist, the script simply redirects the browser
#
#
########################################################

########################################################
#       INSTALL
# 1.) Put the .htaccess file in the directory which you want your Gentoo mirror in make.conf to point to
# 2.) Edit .htaccess to point to this script
# 3.) Make sure in apache.conf there is an entry allowing the script to run in that directory
#       EXAMPLE: ScriptAlias /dist /var/www/localhost/htdocs/
########################################################



use CGI;

#used to redirect client to new location of file
$address="http://your.address/dir";
#location of wget used to get files
$wgetlocation = "/usr/bin/wget";
#switches passed to wget
$wgetswitches = "-nc -c -t 5 --passive-ftp ";
#directory to put the new dist files in
$wgetputdir = "/var/www/localhost/htdocs/distfiles";
$distdir = "distfiles/";
#mirrors to use to get the gentoo files from
@mirrors = ("ftp://ibiblio.org/pub/Linux/distributions/gentoo/","ftp://mirror.iawnet.sandia.gov/pub/gentoo/","ftp://gentoo.ccccom.com","http://128.213.5.34/gentoo/");



#this is the ENV var which holds the address that was attempting to be accessed
$url = $ENV{"REQUEST_URI"};
#split the input
@parts = split(/\//, $url);
#count the parts
$count = $#parts;
#get the last part, which is the filename
$filename = $parts[$count];

#do we need distdir?
#$url = $mirrors[0].$distdir.$url;

if(!(-e $wgetputdir."/".$filename))
{
  #create the url for the file to get with wget
  $url = $mirrors[0].$url;
  $command = $wgetlocation." ".$wgetswitches." ".$url." -P ".$wgetputdir;
  #run the command
  open(FILE, "$command |");
  $output = <FILE>;
  close(FILE);
}

#create a new CGI object for redirect
$query = new CGI;
#redirect (with a separating slash between directory and filename)
print $query->redirect($address."/".$filename);


File: .htaccess

Code:


ErrorDocument 404 /dist/dist.pl
La`res
Arch/Herd Tester
Posts: 79
Joined: Mon Aug 11, 2003 11:46 pm
Location: ::0


Post by La`res » Mon Mar 29, 2004 12:03 am

linkfromthepast - Could you be more detailed in the install instructions? They seem a little vague to me. Mind you, I'm relatively new to Apache.
Lares Moreau <lares.moreau@gmail.com>
LRU: 400755 http://counter.li.org
lares/irc.freenode.net
linkfromthepast
n00b
Posts: 23
Joined: Thu Mar 18, 2004 3:04 pm


Post by linkfromthepast » Wed Apr 07, 2004 4:35 pm

Which part are you having problems with?

1.) Perl script goes in a directory on your server which you have set to execute scripts in the apache.conf.
2.) Change the variables in the Perl script based on your setup.
3.) Put the .htaccess in the folder where all your dist files will live.

So when a client requests a file in that directory and the file does not exist, the .htaccess is used, which calls the script to download the file and then redirects the client to the file to download. Sorry for the run-on :)
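To make those three steps concrete, here is one possible layout. Every path and alias below is an example from a hypothetical setup, not the only valid arrangement; adjust to your own DocumentRoot:

```shell
# Step 1: dist.pl in a directory Apache is allowed to run CGI from:
#   httpd.conf:  ScriptAlias /dist/ "/var/www/localhost/cgi-bin/"
#   filesystem:  /var/www/localhost/cgi-bin/dist.pl   (chmod 755)
# Step 2: edit $address, $wgetputdir and @mirrors inside dist.pl.
# Step 3: the distfiles directory the clients request from holds .htaccess:
#   filesystem:  /var/www/localhost/htdocs/distfiles/.htaccess
#   .htaccess:   ErrorDocument 404 /dist/dist.pl
# Clients then point at it in /etc/make.conf:
#   GENTOO_MIRRORS="http://your.address/distfiles/"
```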

Also, one thing I've noticed is that if you don't have much bandwidth, either Perl or Apache stalls the wget process. I'm leaning towards Apache, because it actually owns the wget process, but I'm not quite sure. So as long as you can download your dist file in under ~30 secs you'll be OK, although larger dist files like those for KDE should probably be fetched manually until the problem is fixed.

If that is still too broad an explanation, please post some specific questions. Good luck :)
modnemo
n00b
Posts: 18
Joined: Sun Aug 10, 2003 10:45 pm

Emerge problem....


Post by modnemo » Thu Apr 08, 2004 4:04 pm

I can fetch files manually with rsync, no problem...

Code:

rsync rsync://192.168.0.1/gentoo-packages/zip23.tar.gz
But when I emerge anything I get this error...

Code:

>>> emerge (1 of 20) net-fs/samba-2.2.8a to /
!!! File system problem. (Bad Symlink?)
!!! Fetching may fail: [Errno 2] No such file or directory: ''
any ideas?
modnemo
n00b
Posts: 18
Joined: Sun Aug 10, 2003 10:45 pm

In make.conf type #USER=ID10T


Post by modnemo » Thu Apr 08, 2004 6:05 pm

OK... so never mind my previous post... really dumb error.

Make sure (absolutely sure) that if you are typing the variables into your make.conf file, you use ${FILE} and not $(FILE), because the latter doesn't work.

Let me reiterate: if you are having weird errors while doing an emerge, but emerge sync works just fine, make sure you wrote the shell variables with { and not (.

Thanks for the HOWTO, it was awesome... I was looking for a solution to emerge packages on my Fujitsu Stylistic 1200 tablet without having to connect it to the internet (wireless support sucks for ADM8211-based cards). I looked into NFS, but it's a lot of setup and requires kernel options which I didn't install. I was able to set up the portage/rsync server in about 10 steps (and an hour of frustration trying to figure out what I typed wrong :D ).
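The difference modnemo ran into, spelled out: ${FILE} is parameter expansion, while $(FILE) is command substitution, i.e. "run a program called FILE and take its output", which yields an empty string when no such program exists. A quick demonstration (variable value is illustrative):

```shell
# ${FILE} expands the variable; $(FILE) tries to execute a command named
# FILE, fails with "command not found", and substitutes an empty string,
# which is how the fetch paths in make.conf silently went blank.
FILE='zip-2.3.tar.gz'
good="${FILE}"
bad=$(FILE 2>/dev/null) || true
echo "good=[$good] bad=[$bad]"
# prints: good=[zip-2.3.tar.gz] bad=[]
```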
linkfromthepast
n00b
Posts: 23
Joined: Thu Mar 18, 2004 3:04 pm


Post by linkfromthepast » Thu Apr 08, 2004 7:04 pm

modnemo : Which method/setup did you use?
Merlin-TC
l33t
Posts: 603
Joined: Fri May 16, 2003 3:29 pm
Location: Germany


Post by Merlin-TC » Sat Apr 10, 2004 5:21 am

First of all, thanks a lot for that guide.
I am using a local rsync server now to sync the tree and an NFS share for the distfiles.

What I'd like to know is whether there is an option in the rsync server that lets me cache the portage tree somehow.
The machine it's on is a K6-3 450 with 256MB of RAM, but the hard disk is kinda slow, so the bottleneck is the drive.
Is there any way to cache at least some parts of the portage tree in RAM, or to pre-read it?
razamatan
Apprentice
Posts: 160
Joined: Fri Feb 28, 2003 8:51 am

Post by razamatan » Fri Apr 23, 2004 2:17 am

what if i want to synchronize /usr/local/portage (a portage overlay)? i tend to roll my own ebuilds, so i'd like to sync this directory (and *just* this directory) between two machines.
a razamatan doth speaketh,
"Never attribute to malice, that which can be adequately explained by stupidity"
freshy98
Apprentice
Posts: 274
Joined: Thu Jul 11, 2002 2:05 pm
Location: The Netherlands


Post by freshy98 » Sun Apr 25, 2004 8:16 pm

linkfromthepast wrote: Which part are you having problems with?

1.) Perl script goes in a directory on your server which you have set to execute scripts in the apache.conf.
2.) Change the variables in the Perl script based on your setup.
3.) Put the .htaccess in the folder where all your dist files will live.

So when a client requests a file in that directory, and the file does not exist, the .htaccess is used, which calls the script to download the file, then redirects the client to the file to download. Sorry for the run on :)

Also, one thing I've noticed is that if you don't have much bandwidth, either Perl or Apache stalls the wget process. I'm leaning towards Apache, because it actually owns the wget process, but I'm not quite sure. So as long as you can download your dist file in under ~30 secs you'll be OK, although larger dist files like those for KDE should probably be fetched manually until the problem is fixed.

If that is still too broad an explanation, please post some specific questions. Good luck :)
Could you please explain where to put the files? What is /dist? Does it need to be named that way, or is it just an example?

In the Perl script you talk about

Code:

#used to redirect client to new location of file
$address="http://your.address/dir";
while a little bit further you talk about

Code:

#directory to put the new dist files in
$wgetputdir = "/var/www/localhost/htdocs/distfiles";
Isn't the /dir from the address line the same as the /distfiles from the wgetputdir line?
It is very confusing.

Please try to explain a bit more thoroughly and use examples from your own system(s).

Thnx
Mac Pro single quad 2.8GHz, 6GB RAM, 8800GT. MacBook. Plus way too many SUN/Cobalt/SGI and a lonely Alpha.
arkane
l33t
Posts: 918
Joined: Tue Apr 30, 2002 9:00 pm
Location: Phoenix, AZ


Post by arkane » Mon Apr 26, 2004 5:02 am

razamatan wrote:what if i want to synchronize /usr/local/portage (a portage overlay)? i tend to roll my own ebuilds, so i'd like to sync this directory (and *just* this directory) between two machines.
rsync --rsh="ssh -C" -plarvz username@servermachine:/usr/local/portage/ /usr/local/portage

That'd do it... set up the SSH keys between the machines if you want to automate it. When you say *just* this directory, do you mean not the subdirectories of it? Because IMHO that'd be pointless....
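If you want to automate that, here is a sketch of the usual key-plus-cron arrangement; the schedule and key path are illustrative. Note the trailing slash on the rsync source, which makes rsync copy the directory's contents instead of creating a nested portage/ directory:

```shell
# One-time setup on the machine that pulls the overlay (run interactively):
#   ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa    # passwordless key
#   ssh-copy-id username@servermachine          # install the public key
#
# Then pull the overlay nightly via crontab -e:
#   0 3 * * * rsync --rsh="ssh -C" -plarvz username@servermachine:/usr/local/portage/ /usr/local/portage
```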
razamatan
Apprentice
Posts: 160
Joined: Fri Feb 28, 2003 8:51 am

Post by razamatan » Mon Apr 26, 2004 5:11 am

arkane wrote:
razamatan wrote:what if i want to synchronize /usr/local/portage (a portage overlay)? i tend to roll my own ebuilds, so i'd like to sync this directory (and *just* this directory) between two machines.
rsync --rsh="ssh -C" -plarvz username@servermachine:/usr/local/portage /usr/local/portage

That'd do it... set-up the SSH keys between the machines if you want to automate it. When you say *just* this directory, you mean not the subdirectories of it? Because IMHO that'd be pointless....
yes, recursive, cus it'd be pointless otherwise... :wink:

i tried this method, but it doesn't work..

Code:

rsync --rsh="ssh -C" -uavz username@servermachine:/usr/local/portage/ /usr/local/portage
it complains about permissions (writing locally), but i have write perms via group membership..
a razamatan doth speaketh,
"Never attribute to malice, that which can be adequately explained by stupidity"
linkfromthepast
n00b
Posts: 23
Joined: Thu Mar 18, 2004 3:04 pm


Post by linkfromthepast » Mon Apr 26, 2004 5:34 pm

You are correct: $address is the actual web address of the directory, and $wgetputdir is the absolute filesystem path for the directory represented by $address. For example, /var/www/localhost/htdocs/distfiles would be the absolute path, but Apache is configured with /var/www/localhost/htdocs as the root, so http://your.address/dir would be /var/www/localhost/htdocs/dir. You can change the name to whatever you like.

The overall purpose of this is so that when a client requests a file in http://your.address/dir and the file does not exist, the Perl script downloads the file and tells the client to try again now that the file has been downloaded. This is what the .htaccess file is for.
linkfromthepast
n00b
Posts: 23
Joined: Thu Mar 18, 2004 3:04 pm


Post by linkfromthepast » Mon Apr 26, 2004 5:48 pm

Also keep in mind that all web addresses are relative to the root of the web server.

#used to redirect client to new location of file
$address="http://your.address/dir";
#location of wget used to get files
$wgetlocation = "/usr/bin/wget";
#switches passed to wget
$wgetswitches = "-nc -c -t 5 --passive-ftp ";
#directory to put the new dist files in
$wgetputdir = "/var/www/localhost/htdocs/distfiles";
$distdir = "distfiles/";
#mirrors to use to get the gentoo files from
@mirrors = ("ftp://ibiblio.org/pub/Linux/distributions/gentoo/","ftp://mirror.iawnet.sandia.gov/pub/gentoo/","ftp://gentoo.ccccom.com","http://128.213.5.34/gentoo/");

$address, $distdir are relative paths
$wgetlocation, $wgetputdir are absolute paths

You might notice that $distdir isn't needed, so you don't need to configure it. I'm not sure why I left it in the script.

$address should be configured with the web address the client will use for trying to download the file. You can test this w/ a web browser.

$wgetputdir is the location wget puts the files it downloads. So if a client requests http://127.0.0.1/gentoo-files/x.y.z.tar.gz then wget will download the file into the $wgetputdir directory.

Example:

$address = http://127.0.0.1/distfiles
$wgetputdir = /var/www/localhost/htdocs/distfiles

One last note: you need to change the @mirrors servers to the servers that are fastest for you. These may not be the quickest; they're just default servers I plucked from /etc/make.conf.
freshy98
Apprentice
Posts: 274
Joined: Thu Jul 11, 2002 2:05 pm
Location: The Netherlands


Post by freshy98 » Wed Apr 28, 2004 8:54 am

ok, let me see if I get this right.
instead of /usr/portage/distfiles I now need a /var/www/localhost/htdocs/distfiles which holds the .htaccess.

I think I will make /var/www/localhost/htdocs/distfiles a symlink to /usr/portage/distfiles.
Mac Pro single quad 2.8GHz, 6GB RAM, 8800GT. MacBook. Plus way too many SUN/Cobalt/SGI and a lonely Alpha.
freshy98
Apprentice
Posts: 274
Joined: Thu Jul 11, 2002 2:05 pm
Location: The Netherlands


Post by freshy98 » Wed Apr 28, 2004 12:35 pm

linkfromthepast, it does not seem to work for me. I edited the perl script to my liking, but when I do an emerge -f package, for example, it just freezes.

I have this on my portage gateway (192.168.1.20):

Code:

GENTOO_MIRROS="http://192.168.1.20/distfiles/"
SYNC="rsync://192.168.1.20/gentoo-portage"
PORTDIR=/usr/portage
DISTDIR=/usr/portage/distfiles
PKGDIR=/usr/portage/packages
plus I have a symlink so that /var/www/localhost/htdocs/distfiles actually points to /usr/portage/distfiles. Here I also have the .htaccess file.

The perl script is in /dist/dist.pl.

Code:

$address="http:///192.168.1.20/distfiles";
$wgetputdir = "/var/www/localhost/htdocs/distfiles";
@mirrors = ("ftp.easynet.nl/mirror/gentoo/","ftp.snt.utwente.nl/pub/os/linux/gentoo/","etc,etc");

On the client machine I have this in /etc/make.conf:

Code:

GENTOO_MIRRORS="http://192.168.1.20:8080/distfiles/"
SYNC="rsync://192.168.1.20/gentoo-portage"
PORTDIR=/usr/portage
DISTDIR=${PORTDIR}/distfiles
/usr/portage/distfiles is shared via nfs on the portage gateway and mounted on the client in /usr/portage/distfiles.

Can you help me out please?
Mac Pro single quad 2.8GHz, 6GB RAM, 8800GT. MacBook. Plus way too many SUN/Cobalt/SGI and a lonely Alpha.
linkfromthepast
n00b
Posts: 23
Joined: Thu Mar 18, 2004 3:04 pm


Post by linkfromthepast » Wed Apr 28, 2004 10:06 pm

Have you tried going to the link with a regular web browser to see what the response is (http://192.168.1.20/distfiles)? Is wget running on the gateway? Are you sure apache has write permission to the /usr/portage/distfiles directory?

BTW, I'm not sure if you noticed, but $address="http:///192.168.1.20/distfiles"; should only have 2 //; it's my mistake. And GENTOO_MIRROS="http://192.168.1.20/distfiles/" is missing an R. :)

Let me know if/when you've tried all that and the responses.
© 2001–2026 Gentoo Foundation, Inc.
