Gentoo Forums

Mozilla/Netscape and Large file downloads
pavera
Tux's lil' helper


Joined: 27 May 2002
Posts: 84

Posted: Wed Jul 24, 2002 8:27 am    Post subject: Mozilla/Netscape and Large file downloads

I have a problem. I run Mozilla and am trying desperately to eradicate Windows from my network, but I've hit a snag: when I want to download an updated distribution (Red Hat Limbo at this point, but really any large file, say a 650MB ISO), Mozilla slurps the whole file into RAM while it waits for the download to finish. This crashes my machine; it only has 128MB of RAM and 256MB of swap, so 650MB fills the memory and the computer dies. Is there any way to get Mozilla to save a file directly to the hard drive? The lack of this feature in *any* OSS browser forces me to keep one machine running Windows, just so I can download new ISOs to update my other machines running Linux.
dioxmat
Bodhisattva


Joined: 04 May 2002
Posts: 709
Location: /home/mat

Posted: Wed Jul 24, 2002 8:49 am

Known Mozilla bug.
Use wget for large files :)
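For example, something like this (hypothetical URL); wget streams straight to disk instead of buffering in RAM, and -c lets it resume a partial download:
Code:
wget -c http://ftp.example.com/isos/limbo-i386-disc1.iso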
_________________
mat
lotas
Tux's lil' helper


Joined: 23 Jul 2002
Posts: 121
Location: Tallaght, Dublin, Ireland

Posted: Wed Jul 24, 2002 11:12 am

Try prozilla.
Code:
emerge prozilla
to get it, then run
Code:
proz http://www.site.org/file/i/want.iso
to download it. It can be used in conjunction with emerge too; check the docs on emerge for that, I think.
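The emerge hookup goes through the FETCHCOMMAND variable in /etc/make.conf. A sketch of what that might look like; the -P prefix flag here is my assumption (borrowed from the Galeon handler mentioned below), so verify the switches against the prozilla docs:
Code:
# /etc/make.conf -- a sketch, not gospel; check 'proz --help' for the real flags
FETCHCOMMAND="/usr/bin/proz -P \${DISTDIR} \${URI}"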
_________________
Lotas T Smartman
www.lotas-smartman.net
www.the-hairy-one.com
www.lsn-blog.tk
Dual Athlon 2GHz, 1GB RAM, 120GB HDD, GeForce FX5200, DVD+R/-R/+RW/-RW, CD-R/RW
Pigeon
Guru


Joined: 21 Jun 2002
Posts: 307

Posted: Thu Jul 25, 2002 6:15 am

Prozilla's good, and you can integrate it into Galeon fairly easily (or wget, or whatever else). Just set Settings -> Prefs -> Handlers -> Downloading -> Command to "xterm -e proz -s -P %f %s".
pavera
Tux's lil' helper


Joined: 27 May 2002
Posts: 84

Posted: Thu Jul 25, 2002 6:42 am

I am trying to download from Red Hat Network (they are faster than any of the mirrors I have found; I always get my full DSL speed of 70KB/s from them, whereas from most mirrors I get around 10-15KB/s). Unfortunately this throws a wrench in the works, as I have to authenticate with my username and password on their server. I see wget allows you to pass a username and password, but it doesn't work; I still get a 403 Forbidden error from the server.
Does prozilla let you pass a username/password? Does it actually work?
Pigeon
Guru


Joined: 21 Jun 2002
Posts: 307

Posted: Thu Jul 25, 2002 9:22 am

The man page mentions it: hosts/usernames/passwords are kept in ~/.netrc, but it doesn't give any format info. I'd *guess* the format would be hostname:username:pw (or hashed pw), but whatever. I'd check their web page, but it's puking at the moment (prozilla.delrom.ro).
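For what it's worth, the classic ~/.netrc layout that ftp and wget understand is keyword-based rather than colon-separated, along these lines:
Code:
machine rhn.redhat.com
login myusername
password mypassword
(myusername/mypassword are placeholders, obviously.) No idea whether prozilla parses it the same way, though.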

BTW, prozilla does multi-part downloads, so you can do the download in 8 parts if you want to and get plenty of speed even if you're limited to slow mirrors.
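Something like the following, though the connection-count flag here is from memory and may be wrong, so double-check the man page:
Code:
# split the download into 8 simultaneous connections (flag name unverified)
proz -k=8 http://some.slow.mirror/limbo-i386-disc1.iso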
dioxmat
Bodhisattva


Joined: 04 May 2002
Posts: 709
Location: /home/mat

Posted: Thu Jul 25, 2002 10:10 am

wget works fine with a user/password;
just put them in the URI as you would with any browser.
See ftp://ftp.isi.edu/in-notes/rfc2396.txt
_________________
mat
pavera
Tux's lil' helper


Joined: 27 May 2002
Posts: 84

Posted: Thu Jul 25, 2002 10:38 am

An English example of that would be nice :)
The command I used was:
Code:
wget --http-user=******* --http-passwd=****** https://rhn.redhat.com/path/to/limbo1.iso

I read in the man page that you can put it all in the URL, but it didn't give an example of how. And as nice and beautiful as RFCs are, I don't read them, nor do I understand them, nor do I have any desire to start now.
dioxmat
Bodhisattva


Joined: 04 May 2002
Posts: 709
Location: /home/mat

Posted: Thu Jul 25, 2002 11:23 am

pavera wrote:
An English example of that would be nice :)
The command I used was:
Code:
wget --http-user=******* --http-passwd=****** https://rhn.redhat.com/path/to/limbo1.iso


wget http://user:password@www.site.com/blah/

I never tried it with https; that might be the problem.
_________________
mat
pavera
Tux's lil' helper


Joined: 27 May 2002
Posts: 84

Posted: Thu Jul 25, 2002 11:52 am

I think it's just Red Hat Network...
The link that actually downloads the ISO looks like this:
Code:
https://rhn.redhat.com/network/channel/download_isos.pxt/limbo-i386-disc1.iso?filename=redhat/linux/beta/limbo/en/iso/i386/limbo-i386-disc1.iso&pxt_trap=rhn:ftp_download_cb&token=e36156f903deba45c454be55b62f9f15x39fffa0fadc9230083671500cd0ed5d6&pxt_session_id=11325999x93d03dd07021794f38108d22097c3510&save_as=foo/limbo-i386-disc1.iso

So it does all sorts of stuff on the server, I think, and wget probably can't navigate that (at any rate, I can't get it to).
Thanks for the help anyway. I think I'll just keep one machine running Windows until the OSS browsers get a little better...
bazik
Retired Dev


Joined: 22 Jul 2002
Posts: 277
Location: Behind you.

Posted: Thu Jul 25, 2002 12:23 pm    Post subject: Re: Mozilla/Netscape and Large file downloads

pavera wrote:
Mozilla slurps the whole file into RAM while it waits for the download to finish. This crashes my machine; it only has 128MB of RAM and 256MB of swap, so 650MB fills the memory and the computer dies.


<useless comment>
Get more RAM! 1536MB here... too bad my board doesn't support more, because RAM is so cheap nowadays. I could get an additional 1536MB for $100 :)
</useless comment>
_________________
Gentoo Linux/Sparc Developer
http://dev.gentoo.org/~bazik/
Kiff
n00b


Joined: 22 Jul 2002
Posts: 73

Posted: Thu Jul 25, 2002 12:36 pm

<useless comment on useless comment>
You're one lucky b*tch there! For $100 I'd get only 512MB or so here in Belgium :(
</useless comment on useless comment>

Is that bug also present in the 1.1 version of Mozilla? If so, don't they have a solution yet?
Malakin
Veteran


Joined: 14 Apr 2002
Posts: 1692
Location: Victoria BC Canada

Posted: Thu Jul 25, 2002 1:38 pm

I've always used gftp for large files, so that I can resume them if the download gets disconnected for some reason, which a web browser can't do.
pavera
Tux's lil' helper


Joined: 27 May 2002
Posts: 84

Posted: Thu Jul 25, 2002 2:25 pm

I have seen the error in all versions of Mozilla and Netscape...
I finally did get wget to work; the secret is single quotes around the URL:
so wget 'https://rhn.redhat.com/all_that_f***ing_crap_I_copied_earlier'
It's the ampersands in the URL that mess it up. So, just so everyone knows: if you want to wget a URL that has ampersands in it (in other words, one that receives or passes variables in the query string), enclose the URL in single quotes.
And then it just works, which is nice, but it took me 3 days on the rhn-users mailing list to get that out of them...
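To make the ampersand problem concrete (hypothetical URL): an unquoted & tells the shell to run the command in the background and treat the rest of the URL as further commands, while single quotes deliver the whole thing to wget as one argument:
Code:
# broken: the shell cuts the URL off at the first '&'
wget https://example.com/dl.iso?token=abc&session=123
# works: quoted, so wget sees the full URL
wget 'https://example.com/dl.iso?token=abc&session=123'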
Anyway, the problem I have with using gftp is that I have a paid subscription to Red Hat Network to keep a few critical systems up to date. (I use Gentoo on my desktop, because it's fast and beautiful, but I've had too many problems with emerge --update system and emerge --update world to trust Gentoo yet on a critical server that needs to be up 24/7.) So my servers run Red Hat: my favorite server distro, though Gentoo takes the cake on the desktop. I use RHN to get the latest versions of Red Hat and test them on a test box before they go into production, and I figure that if I'm paying, I should get to use their servers to fetch the ISOs at full speed rather than hunting for the fastest FTP mirror every time I download. But RHN has issues with all OSS browsers, and it took me 3 days to get someone to answer my question about how exactly to use wget with RHN. So there's the whole story... :)
Enjoy!
rizzo
Retired Dev


Joined: 30 Apr 2002
Posts: 1067
Location: Manitowoc, WI, USA

Posted: Thu Jul 25, 2002 2:37 pm    Post subject: Re: Mozilla/Netscape and Large file downloads

bAZiK wrote:
I could get an additional 1536MB for $100 :)


Without even looking ... WHERE?