Here's the problem: I've been playing around with getting GLX and DRI working on my integrated, piece-of-junk Intel i810 graphics chip (16 MB *max* shared memory). So far I've learned that it only likes 16-bit color depth, and that I should skip the kernel's built-in DRI and install x11-drm instead.
Here's my xorg.conf:
Code:
Section "ServerLayout"
# my server layout, it works (xorg.conf comments use "#", not /* */)
EndSection
Section "Files"
# paths to fonts
EndSection
Section "Module"
Load "extmod"
Load "dri"
Load "dbe"
Load "record"
Load "xtrap"
Load "glx"
Load "type1"
Load "freetype"
Load "GLcore"
Load "bitmap"
Load "ddc"
Load "vbe"
EndSection
Section "Monitor"
Identifier "Monitor0"
VendorName "Monitor Vendor"
ModelName "Monitor Model"
HorizSync 30-70
VertRefresh 50-160
EndSection
Section "Device"
Option "DRI" "True"
Identifier "Card0"
Driver "i810"
VendorName "Intel Corp."
BoardName "82810E DC-133 CGC [Chipset Graphics Controller]"
BusID "PCI:0:1:0"
VideoRAM 16384
EndSection
Section "Screen"
Identifier "Screen0"
Device "Card0"
Monitor "Monitor0"
DefaultDepth 16
Subsection "Display"
Depth 16
Modes "1024x768" "800x600" "640x480"
ViewPort 0 0
EndSubsection
EndSection
Section "DRI"
Mode 0666
EndSection
I have *nothing* enabled under Character devices. I'm sure something should probably be going in there, but I don't know what; I was just thrashing before, so some direction is needed.
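From what I've gathered, the external x11-drm module still wants the kernel's agpgart support underneath it, with the in-kernel DRM left *off* since x11-drm supplies its own module. So my guess is the Character devices section should end up looking roughly like this (labels are from a 2.6-era menuconfig and may differ slightly on other kernels; this is speculation, not a verified setup):

```
<*> /dev/agpgart (AGP Support)
<*>   Intel 440LX/BX/GX, I8xx and E7x05 chipset support
< > Direct Rendering Manager (XFree86 4.1.0 and higher DRI support)
```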
In framebuffer support:
Code:
<*> Support for frame buffer devices
<*> Intel 810/815 support (EXPERIMENTAL)
[*] use VESA Generalized Timing Formula
[*] Enable DDC Support
If anyone has been able to do this themselves and get things like tuxracer running, I would love to know your methodology. Likewise, if anyone sees where I should be going with this, please let me know.
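In case it helps anyone diagnosing along with me, here's how I've been checking whether direct rendering actually kicked in after restarting X (this assumes glxinfo is installed and the server log lives at the usual /var/log/Xorg.0.log):

```shell
# See whether the X server managed to initialize DRI at startup
grep -i "direct rendering" /var/log/Xorg.0.log

# Ask the GLX client side directly; "Yes" here should mean
# tuxracer gets hardware acceleration
glxinfo | grep "direct rendering"
```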
Thanx in advance - Bill

