vexatious (Tux's lil' helper)
Joined: 24 Aug 2010  Posts: 85
Posted: Sun Aug 25, 2013 6:56 pm    Post subject: Impossible to use custom resolutions with FGLRX?
Having a really tough time getting custom resolutions (modelines) working with FGLRX.
Whenever I run 'startx', /var/log/Xorg.0.log gets spammed with:
Code:
fglrx(0): Not using mode "1600x1016@109i" (unknown reason)
and likewise for all my other modelines (I have quite a few).
I tried putting the following in /etc/X11/xorg.conf under the 'Device' section:
Code:
Option "IgnoreEDID" "true"
Option "EnableRandR12" "false"
Option "NoDDC" "true"
I also edited /etc/ati/amdpcsdb (randr12=sfalse), all to no avail.
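For reference, this is the kind of Monitor section I mean for overriding EDID limits on a plain (non-fglrx) Xorg setup. The identifier, sync ranges, and modeline below are illustrative values, not from my actual config; real sync limits come from the monitor's manual and real modelines from `gtf` or `cvt`:

```
Section "Monitor"
    Identifier   "CRT0"
    # Replace with the CRT's actual limits from its manual
    HorizSync    30.0 - 110.0
    VertRefresh  50.0 - 160.0
    # Standard VESA 1024x768@60 as an example; generate others with `cvt`/`gtf`
    Modeline "1024x768_60.00"  65.00  1024 1048 1184 1344  768 771 777 806  -hsync -vsync
EndSection
```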
A few of my modelines do work (only two out of about 30), but even for those FGLRX misbehaves, and for every resolution at that: each resolution gets a 65 Hz mode, and the GPU scales it up within the largest resolution the monitor supports. That goes even for the few custom resolutions that do show up, despite them being specified at 60 Hz or lower. It's really annoying, because before running a fullscreen application I have to change the desktop resolution manually to get proper settings at a 60 Hz or lower refresh rate (the driver tries the highest supported refresh rate first).
I tried adding custom modelines via xrandr, but every time I get this error:
Code:
X Error of failed request: BadMatch (invalid parameter attributes)
Major opcode of failed request: 155 (RANDR)
Minor opcode of failed request: 18 (RRAddOutputMode)
Serial number of failed request: 47
Current serial number in output stream: 48
This happens even with standard VESA resolutions like 640x400@70 Hz. What the heck? This stuff used to work when I had an onboard HD 4250.
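For reference, the sequence I'm attempting is the standard xrandr one, sketched below. The output name "DVI-0" is an assumption (check `xrandr -q` for the real name), and the timing numbers are the classic VGA 640x400@70 values; generate your own with `cvt 640 400 70`:

```shell
#!/bin/sh
# Add a custom mode via xrandr; guarded so it only touches X when a display exists.
MODE="640x400_70.00"
if command -v xrandr >/dev/null 2>&1 && [ -n "$DISPLAY" ]; then
  # Pixel clock in MHz, then horizontal and vertical timings.
  xrandr --newmode "$MODE" 25.175  640 656 752 800  400 412 414 449
  xrandr --addmode DVI-0 "$MODE"
  xrandr --output DVI-0 --mode "$MODE"
else
  echo "no X display available; skipping xrandr calls"
fi
echo "mode name: $MODE"
```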
My guess is that an option in /etc/ati/amdpcsdb prevents the driver from ignoring monitor limitations, or something else is buggered. I want it to use strictly the monitor limits I specify, but I don't know how.
Everything was perfect with my old trusty 9800 GTX and the NVIDIA binary drivers, and I just upgraded to the HD 7750. I was really happy, since the new card uses almost a third of the power the 9800 GTX did (55 W vs. 140 W); it's faster, much quieter (the 9800 sounded like a hair dryer), has new OpenGL and DirectX features, and runs cooler (I got an XFX model with a dual-slot heatsink). But after two days this modeline business is really killing it, and I'm about ready to throw the 7750 like a frisbee. Maybe I should get my money back from AMD instead?
Would truly be grateful for any help.
Regards
_________________
Gentoo / Slackware
DONAHUE (Watchman)
Joined: 09 Dec 2006  Posts: 7651  Location: Goose Creek SC
Posted: Sun Aug 25, 2013 9:49 pm
Using the AMD Catalyst Control Center after running `aticonfig --initial` is supposed to provide all the resolutions the card/monitor supports. On the face of it, the custom modelines aren't supported.
_________________
Defund the FCC.
DONAHUE (Watchman)
Joined: 09 Dec 2006  Posts: 7651  Location: Goose Creek SC
Posted: Sun Aug 25, 2013 9:50 pm
duped
Last edited by DONAHUE on Sun Aug 25, 2013 11:25 pm; edited 1 time in total
vexatious (Tux's lil' helper)
Joined: 24 Aug 2010  Posts: 85
Posted: Sun Aug 25, 2013 11:18 pm
You're right, but that's not quite the whole story either.
It gets all that info from the monitor's EDID. But I don't want to rely on that, since I know my monitors are capable of a lot more than what's reported. I believe FGLRX pulls info from DDC too.
For example, I can do 256x240 @ 120 Hz, 640x200 @ 120 Hz, 800x256 @ 164 Hz, etc. I also have many interlaced modes that I can't use because of the way FGLRX and xrandr conflict. None of these modes are normally reported, yet CRTs can do virtually any resolution within their horizontal and vertical sync ranges (EDID only reports very standard stuff).
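To make the "within the sync ranges" point concrete: a mode's horizontal frequency is its pixel clock divided by the horizontal total, and the refresh rate is that divided by the vertical total. A quick sketch of the arithmetic (the sync limits below are made-up example values, not my monitor's):

```python
def modeline_freqs(dotclock_mhz, htotal, vtotal):
    """Return (horizontal frequency in kHz, vertical refresh in Hz) for a modeline."""
    hfreq_khz = dotclock_mhz * 1000.0 / htotal
    vfreq_hz = hfreq_khz * 1000.0 / vtotal
    return hfreq_khz, vfreq_hz

def mode_fits(dotclock_mhz, htotal, vtotal, hsync_khz, vrefresh_hz):
    """Check a modeline against (min, max) HorizSync and VertRefresh ranges."""
    h, v = modeline_freqs(dotclock_mhz, htotal, vtotal)
    return hsync_khz[0] <= h <= hsync_khz[1] and vrefresh_hz[0] <= v <= vrefresh_hz[1]

# Standard VESA 1024x768@60: 65.00 MHz clock, 1344x806 total.
h, v = modeline_freqs(65.00, 1344, 806)
print(f"{h:.2f} kHz, {v:.2f} Hz")  # → 48.36 kHz, 60.00 Hz
print(mode_fits(65.00, 1344, 806, (30.0, 110.0), (50.0, 160.0)))  # → True
```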
Perhaps I should've mentioned I'm using a CRT...
DONAHUE (Watchman)
Joined: 09 Dec 2006  Posts: 7651  Location: Goose Creek SC
Posted: Sun Aug 25, 2013 11:29 pm
Doesn't help you, but it tells me why you're adventuring. Good luck.
vexatious (Tux's lil' helper)
Joined: 24 Aug 2010  Posts: 85
Posted: Wed Aug 28, 2013 7:41 am
Well, I adventured with Mesa-9.2.0-rc2 and did manage to get custom resolutions working. Unfortunately, only software rendering worked, no matter how many times I rebuilt Mesa and xorg (I kept getting "missing glx" errors). After spending over four days on this, I find it totally unacceptable. Before buying the card I'd hoped to contribute to and improve the open-source drivers for it, but I'm left without any acceleration or any motivation. Four days spent troubleshooting driver installations, and I'm left with nowhere to start development. Even when fglrx did work without custom modelines (I managed to get two out of about 20 working), there seemed to be some IRQ conflict or similar causing random lock-ups. I think it had something to do with the HDMI audio.
I'm going back to NVIDIA, since I've used their cards for almost six years on Linux without problems (over a decade on Windows). I'd rather reverse engineer something that lets me use the hardware than spend full time troubleshooting driver installations and possible IRQ conflicts (if that's even what it is).
Perhaps a moderator can delete this thread.
UPDATE-SOLVED!:
Wow, I finally got hardware acceleration working! I'm using Mesa-9.2.0-rc2 without radeonsi; with the plain radeon driver I can now use custom modelines and have hardware acceleration. Thanks for the help and suggestions, DONAHUE; I really appreciate it when someone tries to help.
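For anyone else on Gentoo hitting this: the switch to the open stack boils down to dropping fglrx from VIDEO_CARDS and rebuilding. A rough sketch (the exact package set depends on your profile; treat this as illustrative, not a recipe):

```
# /etc/portage/make.conf
VIDEO_CARDS="radeon"

# then rebuild the driver stack, roughly:
#   emerge --ask --changed-use --deep @world
```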
Cheers!