nharris
Posted: Tue Sep 11, 2007 8:35 am
Joined: Thu Sep 07, 2006 11:20 am
Posts: 389
I can't get the 9755 drivers in R5F27 to recognize the native resolution of my Vizio HDTV. The 8776 drivers in R5F1 did this with no problems. So, I was trying out Cecil's handy install-8776.sh script to get the older drivers back. Unfortunately, it does not seem to work and leaves my system with a dead X11. Output of the script is below. Do I need to uninstall the 9755 drivers first? If so, how do I do this?
root@mythtv:~# /usr/local/bin/install-8776.sh
(Reading database ... 85506 files and directories currently installed.)
Preparing to replace nvidia-kernel-common 20051028+1 (using .../nvidia-kernel-common_20051028+1_all.deb) ...
Unpacking replacement nvidia-kernel-common ...
dpkg: regarding .../nvidia-glx-dev_1.0.8776-1_i386.deb containing nvidia-glx-dev:
nvidia-glx-dev conflicts with libgl-dev
libgl1-mesa-dev provides libgl-dev and is installed.
dpkg: error processing /usr/src/debs/nvidia/1.0.8776/nvidia-glx-dev_1.0.8776-1_i386.deb (--install):
conflicting packages - not installing nvidia-glx-dev
Preparing to replace nvidia-glx 1.0.8776-1 (using .../nvidia-glx_1.0.8776-1_i386.deb) ...
Unpacking replacement nvidia-glx ...
Preparing to replace nvidia-kernel-2.6.18-chw-13 1.0.8776-1+2.6.18-chw-13-10.00.Custom (using .../nvidia-kernel-2.6.18-chw-13_1.0.8776-1+2.6.18-chw-13-10.00.Custom_i386.deb) ...
Unpacking replacement nvidia-kernel-2.6.18-chw-13 ...
Setting up nvidia-kernel-common (20051028+1) ...
Setting up nvidia-kernel-2.6.18-chw-13 (1.0.8776-1+2.6.18-chw-13-10.00.Custom) ...
Setting up nvidia-glx (1.0.8776-1) ...
Errors were encountered while processing:
/usr/src/debs/nvidia/1.0.8776/nvidia-glx-dev_1.0.8776-1_i386.deb
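[Editor's note: the dpkg error above is a package conflict rather than a driver problem: nvidia-glx-dev declares a conflict with libgl-dev, which the installed libgl1-mesa-dev provides. A minimal recovery sketch using standard dpkg/apt commands, written as a dry run that only prints what it would execute, since removing packages is destructive:]

```shell
#!/bin/sh
# Sketch of one way out of the nvidia-glx-dev / libgl-dev conflict.
# DRY_RUN=1 only prints the commands; set DRY_RUN=0 to run them as root.
DRY_RUN=1
run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}
# 1. Drop the -dev package that conflicts with libgl1-mesa-dev's libgl-dev.
run dpkg --remove nvidia-glx-dev
# 2. Let apt repair anything the failed install left half-configured.
run apt-get -f install
```

As it turns out later in this thread, `apt-get -f install` was indeed the step that straightened out the package database.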
mk500
Posted: Tue Sep 11, 2007 7:41 pm
Joined: Sun Apr 18, 2004 2:58 pm
Posts: 79
Location: San Francisco, CA
Maybe try instead:
Code: /usr/local/bin/install-nvidia-debian.sh 1.0-8776 -force
_________________ Mobile Athlon XP 2600 on A7N8X-E, pcHD2000, pcHD3000, FX5200 128MB (production) R5.5
EPIA Via EDEN 800Mhz Mini ITX, HD2000 (testing)
AMD Geode 1700 CPU, NVidia 6600GT, (testing)
HDHomerun (testing)
cecil
Site Admin
Posted: Tue Sep 11, 2007 8:08 pm
Joined: Fri Sep 19, 2003 6:37 pm
Posts: 2659
Location: Whittier, Ca
Code: dpkg -i /usr/src/debs/nvidia/1.0.8776/nvidia-glx_1.0.8776-1_i386.deb /usr/src/debs/nvidia/1.0.8776/nvidia-kernel-2.6.18-chw-13_1.0.8776-1+2.6.18-chw-13-10.00.Custom_i386.deb
nharris
Posted: Wed Sep 12, 2007 6:49 am
Joined: Thu Sep 07, 2006 11:20 am
Posts: 389
Thanks Cecil and mk500. It turns out that my package database was all messed up. I finally did an "apt-get -f install" and it corrected things enough to run Cecil's script. Everything is working perfectly now.
Both versions of nVidia drivers work, but the 8776 version is the only one that will validate my "1366x768" native resolution.
On a side note, I love the new screenshooter and comm_pause scripts. They are extremely useful.
mogator88
Posted: Wed Sep 12, 2007 12:09 pm
Joined: Tue Jan 30, 2007 1:27 am
Posts: 299
nharris wrote: Both versions of nVidia drivers work, but the 8776 version is the only one that will validate my "1366x768" native resolution.
Do you get any overscan at the 1366x768 setting, for either the desktop, video playback, or both?
nharris
Posted: Wed Sep 12, 2007 12:12 pm
Joined: Thu Sep 07, 2006 11:20 am
Posts: 389
mogator88 wrote: Do you get any overscan at the 1366x768 setting, for either the desktop, video playback, or both?
Nope. It's just like an LCD computer monitor. In fact, I have added 1% overscan in both X & Y to the mythfrontend setup for video playback to get rid of the ugly stuff on the over-the-air HD broadcasts.
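[Editor's note: for anyone curious what that 1% setting works out to in pixels, here is a quick back-of-the-envelope check (plain shell plus awk; the frontend's exact cropping behavior may differ slightly, so treat this as an estimate):]

```shell
# Rough size of a 1% overscan adjustment at 1366x768:
# 1% of each dimension is the amount of picture pushed past the edges.
x_crop=$(awk 'BEGIN { printf "%.2f", 1366 * 0.01 }')
y_crop=$(awk 'BEGIN { printf "%.2f", 768 * 0.01 }')
echo "1% overscan at 1366x768 hides roughly ${x_crop} x ${y_crop} pixels"
```

Roughly a dozen pixels horizontally, which is usually enough to hide the edge noise on over-the-air HD broadcasts.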
opel70
Posted: Wed Sep 12, 2007 8:15 pm
Joined: Tue Apr 11, 2006 7:44 am
Posts: 287
Location: Los Angeles, CA
I am having a similar problem with my projector and the 9755 drivers: they don't recognize my 1360x768 modeline. I have tried using install-nvidia-debian.sh 1.0.XXXX -force. The script successfully downloads the appropriate driver package, but it doesn't do anything after that; right after it finishes downloading, X restarts, still using the 9755 driver.
Do I need to do anything prior to using the install-nvidia-debian.sh script? Install kernel sources, etc?
Thanks,
_________________ Tim
LinHES 8.4 HDHR3 BioStar A770, AMD X2 4050e, 2GB RAM GigaByte GeForce 8400, Chaintech AV710 USB-UIRT
jacobsta
Posted: Sun Sep 30, 2007 4:19 pm
Joined: Wed Apr 19, 2006 10:50 am
Posts: 2
I was having stuttering issues with high-def XvMC (still not fixed, btw). When I reverted my drivers back to 9755, I inadvertently found the solution: change your xorg.conf to match these settings, and it will work at 1366x768.
Code:
Section "Device"
    Identifier "Generic Video Card"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"
    Option     "AddARGBVisuals"    "True"
    Option     "AddARGBGLXVisuals" "True"
    Option     "NoLogo"            "True"
    Option     "ModeValidation"    "NoWidthAlignmentCheck" # Important!!! need this option to use nvidia card at 1366 x 768
EndSection

Section "Monitor"
    Identifier  "Generic Monitor"
    VendorName  "VIZIO"
    ModelName   "VIZIO VX37L"
    HorizSync   31-70
    VertRefresh 50-85
    Mode "1366x768" # vfreq 59.815Hz, hfreq 47.553kHz
        DotClock 85.500000
        HTimings 1366 1494 1624 1798
        VTimings 768 770 776 795
        Flags    "-HSync" "+VSync"
    EndMode
EndSection

Section "Screen"
    Identifier   "Default Screen"
    Device       "Generic Video Card"
    Monitor      "Generic Monitor"
    Option       "DPI" "100 x 100"
    DefaultDepth 24
    SubSection "Display"
        Viewport 0 0
        Depth    24
        Modes    "1366x768"
    EndSubSection
    SubSection "Display"
        Viewport 0 0
        Depth    16
        Modes    "1366x768"
    EndSubSection
    SubSection "Display"
        Viewport 0 0
        Depth    15
        Modes    "1366x768"
    EndSubSection
EndSection
(from here: http://www.mythtv.org/wiki/index.php/Vizio_VX37L)
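[Editor's note: the vfreq/hfreq comment on that Mode line can be sanity-checked from the timings themselves. Horizontal frequency is the dot clock divided by HTotal (1798), and vertical refresh is the horizontal frequency divided by VTotal (795):]

```shell
# Recompute "vfreq 59.815Hz, hfreq 47.553kHz" from the modeline numbers:
#   hfreq = DotClock / HTotal,  vfreq = hfreq / VTotal
hfreq_khz=$(awk 'BEGIN { printf "%.3f", 85500000 / 1798 / 1000 }')
vfreq_hz=$(awk 'BEGIN { printf "%.3f", 85500000 / 1798 / 795 }')
echo "hfreq ${hfreq_khz} kHz, vfreq ${vfreq_hz} Hz"
```

Both values land inside the HorizSync 31-70 and VertRefresh 50-85 ranges declared in the Monitor section, which is one reason the driver accepts this mode once the width-alignment check is disabled.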
opel70
Posted: Sun Sep 30, 2007 8:45 pm
Joined: Tue Apr 11, 2006 7:44 am
Posts: 287
Location: Los Angeles, CA
I just had to completely reinstall again, so I will give this a shot soon and see how it goes.
_________________ Tim
LinHES 8.4 HDHR3 BioStar A770, AMD X2 4050e, 2GB RAM GigaByte GeForce 8400, Chaintech AV710 USB-UIRT
opel70
Posted: Mon Oct 01, 2007 3:59 pm
Joined: Tue Apr 11, 2006 7:44 am
Posts: 287
Location: Los Angeles, CA
OK, so having tried this I'm still not having any luck. My Xorg.0.log shows:
Code:
(**) NVIDIA(0): Option "UseEDID" "FALSE"
(**) NVIDIA(0): Option "MetaModes" "1360x768"
(**) NVIDIA(0): Option "ExactModeTimingsDVI" "TRUE"
(**) NVIDIA(0): Option "DPI" "100 x 100"
(**) NVIDIA(0): Option "ModeValidation" "NoEDIDModes"
(**) NVIDIA(0): Enabling RENDER acceleration
(**) NVIDIA(0): Ignoring EDIDs
(II) NVIDIA(GPU-0): Not probing EDID on DFP-0.
(II) NVIDIA(0): NVIDIA GPU GeForce 6600 GT at PCI:1:0:0 (GPU-0)
(--) NVIDIA(0): Memory: 131072 kBytes
(--) NVIDIA(0): VideoBIOS: 05.43.02.66.51
(II) NVIDIA(0): Detected AGP rate: 8X
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(--) NVIDIA(0): Connected display device(s) on GeForce 6600 GT at PCI:1:0:0:
(--) NVIDIA(0): DFP-0
(--) NVIDIA(0): DFP-0: 310.0 MHz maximum pixel clock
(--) NVIDIA(0): DFP-0: Internal Dual Link TMDS
(II) NVIDIA(0): Mode Validation Overrides for DFP-0:
(II) NVIDIA(0): NoEdidModes
(II) NVIDIA(0): Assigned Display Device: DFP-0
(WW) NVIDIA(0): No valid modes for "1360x768"; removing.
(WW) NVIDIA(0):
(WW) NVIDIA(0): Unable to validate any modes; falling back to the default mode
(WW) NVIDIA(0): "nvidia-auto-select".
As you can see, I am trying to force the 1360x768 mode using any method available. I need to ignore any EDID values and the methods for ignoring those values have changed from driver to driver. Right now I think I have about every method in there.
I have also tried about a dozen different modelines for 1360x768, none of which appear to be "valid" in the 9755 drivers, though some of those same lines were valid in previous driver versions.
I'm hoping not to have to revert to an older driver again, but if I have to I will.
Thanks for any help you can offer.
_________________ Tim
LinHES 8.4 HDHR3 BioStar A770, AMD X2 4050e, 2GB RAM GigaByte GeForce 8400, Chaintech AV710 USB-UIRT
opel70
Posted: Mon Oct 01, 2007 4:04 pm
Joined: Tue Apr 11, 2006 7:44 am
Posts: 287
Location: Los Angeles, CA
OK, I just increased the verbosity of the log and now see:
Code:
(II) NVIDIA(0): Mode Validation Overrides for DFP-0:
(II) NVIDIA(0): NoEdidModes
(II) NVIDIA(0): Frequency information for DFP-0:
(II) NVIDIA(0): HorizSync : 30.000-65.000 kHz
(II) NVIDIA(0): VertRefresh : 59.000-61.000 Hz
(II) NVIDIA(0): (HorizSync from HorizSync in X Config Monitor section)
(II) NVIDIA(0): (VertRefresh from VertRefresh in X Config Monitor section)
(II) NVIDIA(0):
(II) NVIDIA(0): Native backend timings for DFP-0:
(II) NVIDIA(0): 640 x 480 @ 60 Hz
(II) NVIDIA(0): Pixel Clock : 25.18 MHz
(II) NVIDIA(0): HRes, HSyncStart : 640, 656
(II) NVIDIA(0): HSyncEnd, HTotal : 752, 800
(II) NVIDIA(0): VRes, VSyncStart : 480, 490
(II) NVIDIA(0): VSyncEnd, VTotal : 492, 525
(II) NVIDIA(0): H/V Polarity : +/+
(II) NVIDIA(0):
(II) NVIDIA(0): --- Modes in ModePool for DFP-0 ---
(II) NVIDIA(0): "nvidia-auto-select" : 640 x 480 @ 60.0 Hz (from: VESA)
(II) NVIDIA(0): "640x480" : 640 x 480 @ 60.0 Hz (from: VESA)
(II) NVIDIA(0): "640x480_60" : 640 x 480 @ 60.0 Hz (from: VESA)
(II) NVIDIA(0): "640x400" : 640 x 400 @ 60.0 Hz DoubleScan (from: X Server)
(II) NVIDIA(0): "640x400d60" : 640 x 400 @ 60.0 Hz DoubleScan (from: X Server)
(II) NVIDIA(0): "640x384" : 640 x 384 @ 60.1 Hz DoubleScan (from: X Server)
(II) NVIDIA(0): "640x384d60" : 640 x 384 @ 60.1 Hz DoubleScan (from: X Server)
(II) NVIDIA(0): "512x384" : 512 x 384 @ 60.0 Hz DoubleScan (from: X Server)
(II) NVIDIA(0): "512x384d60" : 512 x 384 @ 60.0 Hz DoubleScan (from: X Server)
(II) NVIDIA(0): "400x300" : 400 x 300 @ 60.3 Hz DoubleScan (from: X Server)
(II) NVIDIA(0): "400x300d60" : 400 x 300 @ 60.3 Hz DoubleScan (from: X Server)
(II) NVIDIA(0): "320x240" : 320 x 240 @ 60.1 Hz DoubleScan (from: X Server)
(II) NVIDIA(0): "320x240d60" : 320 x 240 @ 60.1 Hz DoubleScan (from: X Server)
(II) NVIDIA(0): --- End of ModePool for DFP-0: ---
(II) NVIDIA(0):
(II) NVIDIA(0): Assigned Display Device: DFP-0
(II) NVIDIA(0): Using MetaMode string: "1360x768"
(II) NVIDIA(0): Requested modes:
(II) NVIDIA(0): "1360x768"
(WW) NVIDIA(0): No valid modes for "1360x768"; removing.
Why is it not seeing any of the 1360x768 modelines in my xorg.conf file? And what is with those "Native backend timings"?
_________________ Tim
LinHES 8.4 HDHR3 BioStar A770, AMD X2 4050e, 2GB RAM GigaByte GeForce 8400, Chaintech AV710 USB-UIRT
opel70
Posted: Mon Oct 01, 2007 4:28 pm
Joined: Tue Apr 11, 2006 7:44 am
Posts: 287
Location: Los Angeles, CA
Sorry for all of the quick posts, but I finally stumbled across the necessary setting. I now need the:
Option "ModeValidation" "NoDFPNativeResolutionCheck"
instead of:
Option "ModeValidation" "NoEDIDModes"
With this option set, everything is working perfectly at 1360x768. Finally.
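[Editor's note: it may be worth spelling out why 1366 and 1360 behave differently here. As I understand it, the nvidia driver's mode validation expects mode widths aligned to 8 pixels, which is why jacobsta's 1366x768 setup needs the NoWidthAlignmentCheck override while opel70's 1360x768 does not. A quick check:]

```shell
# 1366 is not a multiple of 8, 1360 is -- hence 1366x768 needs the
# NoWidthAlignmentCheck override while 1360x768 validates without it.
echo "1366 % 8 = $((1366 % 8))"
echo "1360 % 8 = $((1360 % 8))"
```

This alignment quirk is also why many "1366x768" panels are driven at 1360x768 in practice.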