tv - Monitor Resolution says one thing, but displays another

2014-07
  • Stacey

    I have purchased a large TV to use as a monitor. It claims it is 1080p and clearly states that its native resolution is 1920x1080, and when I hook it up using HDMI, everything does read that. But the image I am getting is very, very clearly not 1920x1080. I can tell this because the exact same resolution on my second monitor (a 19 inch LG) displays things at a very different size and ratio. For example, things that span the entire screen easily on the smaller screen do not fit at all on the larger one.

    When I say "smaller", I do not just mean that the things on the large TV look bigger; I fully expect that, given that it is a larger screen. I mean that the actual number of pixels it displays across the screen is inaccurate. For example, if I open up a website that is sized to fit a 1024x768 resolution on my LG screen, there is a massive amount of padding on either side as the content floats correctly in the center; but on this new TV, it fills up the entire screen as if that were the actual resolution.

    Is this even possible? I am at a loss as to what to do. I have looked over the specifications time and time again and I see absolutely no reason why this should be displaying at a lower resolution.

    The exact TV I am using is here: Sceptre 32'' LCD HDTV

    This is on Windows 8.1 (x64), using an nVidia GeForce GT 540M display adapter.

    To illustrate what I mean by "it reads that it is 1920x1080", I have taken a screenshot of the same page (this one) across both screens, with the resolution display from Windows visible as well.

    Demonstration

  • Answers
  • Psycogeek

    What is the resolution the computer is using?:
    To find out the exact resolution that the computer itself is "sending" to the TV, take a screenshot (hit the PrintScreen button) and paste the image off the clipboard into a picture editing program, then read the dimensions of the captured picture.
    That will confirm the resolution the computer itself is rendering. Screen captures work with a single monitor or with dual monitors, so you can find out what the computer sees; the small script below this section can double-check it as well.
    If the computer's own resolution is showing low, check out the features of your video card using the video card's software. HDMI is not always pixel-perfect with video cards; sometimes (depending on the version and manufacturer) it requires a bit of tweaking. ATI, for example, treated HDMI as if it were going to drive an analog TV, and some early digital LCD televisions did act that way (overscanning).
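    If you would rather query Windows than measure a screenshot, here is a minimal sketch of that same check, using Python's ctypes and the standard Win32 GetSystemMetrics call. Nothing in it is specific to this particular TV or card, so treat it as a diagnostic aid:

    import ctypes

    user32 = ctypes.windll.user32
    user32.SetProcessDPIAware()   # otherwise a non-DPI-aware process may be shown virtualized sizes

    SM_CXSCREEN, SM_CYSCREEN = 0, 1                   # primary monitor, in pixels
    SM_CXVIRTUALSCREEN, SM_CYVIRTUALSCREEN = 78, 79   # bounding box of all monitors combined

    print("Primary monitor:", user32.GetSystemMetrics(SM_CXSCREEN), "x",
          user32.GetSystemMetrics(SM_CYSCREEN))
    print("Virtual desktop:", user32.GetSystemMetrics(SM_CXVIRTUALSCREEN), "x",
          user32.GetSystemMetrics(SM_CYVIRTUALSCREEN))

    If the numbers Windows reports match 1920x1080 for the TV, the computer side is sending what you think it is, and the softness is most likely being introduced on the TV or in the HDMI handling.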

    DPI (not resolution) is system-wide, not monitor-independent:
    From what you showed (love the pics) the computer is indeed in the desired resolution, and for some reason your TV is doing an interpolation (it also looks interpolated). That still does not explain the size difference, since DPI settings are system-wide, not per-monitor. While it looks like the usual 720-style interpolation, that doesn't explain what looks like a DPI change.
    Also, why are the icons the same clarity but the browser not? Have you tested what you're observing with many programs, or could it be something the browser is doing? (I don't know of any browsers that would scale their DPI based on a monitor, but you never know what features they will put in next :-) The sketch below reads back the DPI Windows is actually using.
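    As a hedged illustration (standard Win32 calls through Python's ctypes; the constants are the documented GetDeviceCaps indices, nothing taken from the original post), this reads the system DPI so you can confirm whether anything other than 100% scaling (96 DPI) is in effect:

    import ctypes
    import ctypes.wintypes as wt

    user32 = ctypes.windll.user32
    gdi32 = ctypes.windll.gdi32
    user32.GetDC.restype = wt.HDC
    user32.ReleaseDC.argtypes = [wt.HWND, wt.HDC]
    gdi32.GetDeviceCaps.argtypes = [wt.HDC, ctypes.c_int]

    LOGPIXELSX, LOGPIXELSY = 88, 90   # GetDeviceCaps indices for horizontal/vertical DPI

    user32.SetProcessDPIAware()       # a non-DPI-aware process would simply be told 96
    hdc = user32.GetDC(None)          # device context for the whole screen
    dpi_x = gdi32.GetDeviceCaps(hdc, LOGPIXELSX)
    dpi_y = gdi32.GetDeviceCaps(hdc, LOGPIXELSY)
    user32.ReleaseDC(None, hdc)

    print("System DPI:", dpi_x, "x", dpi_y, "(96 = 100% scaling)")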

    Resolution can be different on different monitors:
    Windows is completely capable of running 2 monitors at 2 different resolutions, so each monitor can have its own resolution set for it. Off topic: color profiles, ClearType, and refresh rates can also differ per monitor. The sketch below lists the resolution Windows is actually using for each attached screen.
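    Here is a minimal sketch, again plain Win32 through Python's ctypes and specific to nothing in this post, that prints the desktop rectangle Windows assigns to each attached monitor:

    import ctypes
    import ctypes.wintypes as wt

    user32 = ctypes.windll.user32

    class MONITORINFO(ctypes.Structure):
        _fields_ = [("cbSize", wt.DWORD),
                    ("rcMonitor", wt.RECT),   # full monitor rectangle in desktop coordinates
                    ("rcWork", wt.RECT),      # the same rectangle minus the taskbar
                    ("dwFlags", wt.DWORD)]    # bit 0 set = primary monitor

    MonitorEnumProc = ctypes.WINFUNCTYPE(wt.BOOL, wt.HMONITOR, wt.HDC,
                                         ctypes.POINTER(wt.RECT), wt.LPARAM)
    user32.GetMonitorInfoW.argtypes = [wt.HMONITOR, ctypes.POINTER(MONITORINFO)]
    user32.EnumDisplayMonitors.argtypes = [wt.HDC, ctypes.POINTER(wt.RECT),
                                           MonitorEnumProc, wt.LPARAM]

    def report(hmon, hdc, lprect, lparam):
        info = MONITORINFO()
        info.cbSize = ctypes.sizeof(MONITORINFO)
        user32.GetMonitorInfoW(hmon, ctypes.byref(info))
        r = info.rcMonitor
        tag = " (primary)" if info.dwFlags & 1 else ""
        print(f"{r.right - r.left} x {r.bottom - r.top} at ({r.left}, {r.top}){tag}")
        return True   # keep enumerating

    user32.SetProcessDPIAware()   # report physical pixels, not DPI-virtualized ones
    user32.EnumDisplayMonitors(None, None, MonitorEnumProc(report), 0)

    On a setup like the asker's this should print two lines, one of which ought to read 1920 x 1080 for the TV if Windows really is driving it at the native resolution.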

    TVs as computer monitors, what fun:
    If the screen capture shows the proper resolution, it can be assumed (but not guaranteed) that the changes you need to make are on the TV itself. First verify the actual model number, to make sure you got what you think you got (look on the back of the actual TV). Then try to find the format, aspect, and other settings on the TV, although nothing seems cropped or zoomed, and the TV will always deal with the picture as a whole, so again it does not make a lot of sense that only part of the image would be affected.

    Settings everywhere, find them all, learn their purpose:
    TVs are TVs, not monitors, so they have a lot of settings designed to work with broadcast signals. Most TVs can be set to modes other than pixel-by-pixel, and some will even still overscan (a throwback from CRTs).
    Hooking up a TV often requires tweaking both the GPU software and the TV's own settings to keep everything pixel = pixel, so some tweaking may well be in order (overscan, underscan, aspect, format, possibly even affected by refresh rates).

    The pictures you posted are very useful, but I could not determine for a fact what is going on; maybe some of the above will help, or help you to provide more information.

    Added:
    Windows Display Resolution:
    In Control Panel\All Control Panel Items\Display\Screen Resolution, first select the monitor/TV picture that you are having trouble with, then check the resolution setting there. If you have 2 monitors, make sure the right monitor (picture) is selected before you make changes.

    Monitor drivers (profiles):
    Usually it does not really matter if the TV/monitor name is not shown (i.e. the profile driver for the monitor is not installed); a "Generic" connection should still be capable of getting the correct resolutions without the added profile. That said, it does not hurt to check whether the monitor came with a CD, or has a download on the manufacturer's website, containing a "monitor driver", which is just a profile for that monitor. Monitor profiles also hold color calibration information.

    What Windows won't do, GPU software often will:
    The software that comes with the GPU card (or chip) often has many more advanced adjustment features than Windows' own simple, to-the-point display settings. Because you're using Nvidia, I don't know exactly where the options are; ATI's software, for example, has a special section for HDMI-connected TVs that can resolve some of the issues and differences that come with hooking up an actual TV (instead of a monitor).


  • Related Question

    display - How to change resolution on Vista when it keeps booting to an unsupported resolution?
  • kscott

    I have a Vista machine that used to be hooked up to a widescreen monitor. I moved and no longer have the monitor, and planned to just hook it up to my widescreen TV's VGA input jack. I can see the initial DOS-type boot screen, and the "Windows is about to boot" screen, but then everything goes black and my TV displays an "unsupported video signal" message right before it would normally show me the Vista circle logo and have me select a user to log in with.

    If I boot to safe mode I can get into Vista, and can set the resolution from safe mode's 800 x 600 to a higher resolution and see everything fine, but no matter what settings I change in safe mode I cannot get the regular boot to honor them.

    I don't have the old monitor, or another monitor, to hook up to and change the resolution that way, but if I absolutely have to I can probably manage to get to another monitor. It seems that is the obvious fix.

    But does anyone know how to get a safe mode change to stick? Or know the keystrokes I could enter blind to get from Vista user log in to resolution change screen? Or any other back door way to change this setting?


  • Related Answers
  • MDMarra

    Instead of safe mode, boot to "Low Resolution" mode. It should be a few menu options below Safe Mode in that list.

    Alternatively, from safe mode you can open msconfig from a command line, navigate to the Boot tab, put a check in the Base Video box, and reboot.

    Edit: To answer your second question about detection. A monitor identifies itself to a computer, along with its specifications and native resolution, over something called EDID. The EDID resides on a ROM chip in the monitor and is a standard.

    However, OS vendors (such as Microsoft), video card manufacturers, and monitor manufacturers have typically made a mess of implementing it. This leads to shoddy advertisement of the native resolution and the like, and when that happens it causes problems like the one you are having.
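    For what it's worth, Windows caches the EDID it has read from each display in the registry, so you can inspect what the screen actually reports. The sketch below uses only the Python standard library; the registry path and the EDID byte offsets are the documented standard ones, but treat it as a rough diagnostic rather than a polished tool:

    import winreg

    def native_mode(edid):
        # The first detailed timing descriptor (bytes 54-71 of the base block)
        # carries the preferred/native mode on any EDID 1.3+ display.
        d = edid[54:72]
        if (d[0] | (d[1] << 8)) == 0:
            return None               # not a timing descriptor
        width = d[2] | ((d[4] & 0xF0) << 4)
        height = d[5] | ((d[7] & 0xF0) << 4)
        return width, height

    base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):
            model = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, model) as model_key:
                for j in range(winreg.QueryInfoKey(model_key)[0]):
                    instance = winreg.EnumKey(model_key, j)
                    try:
                        with winreg.OpenKey(model_key, instance + r"\Device Parameters") as params:
                            edid, _ = winreg.QueryValueEx(params, "EDID")
                            print(model, instance, "native mode:", native_mode(edid))
                    except OSError:
                        pass          # no EDID cached for this device instance

    Note that this lists every display Windows has ever seen on the machine, not just the ones attached right now.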

  • A Dwarf

    Your OS is trying to force the TV to use a refresh rate/resolution it can't support. You have to check your TV manual; it will list the supported combinations. Unlike with a monitor, depending on your TV set, you may not have many choices here.

    The refresh rate will almost certainly be 60 Hz. The resolution is entirely dependent on the TV itself. Meanwhile, if your video card has settings for TV display, you should use those instead of setting anything from within Windows' normal resolution dialog box.
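    If the manual is not handy, you can also ask the display driver which modes it is prepared to offer. Here is a hedged sketch using Python's ctypes against the standard EnumDisplaySettings call, assuming the usual DEVMODEW layout; it queries the primary display, so adapt it if the TV is a secondary screen:

    import ctypes
    import ctypes.wintypes as wt

    class DEVMODEW(ctypes.Structure):
        # Standard DEVMODEW layout; only the dmPels*/dmDisplayFrequency fields are
        # read here, the rest exist so that dmSize matches what Windows expects.
        _fields_ = [("dmDeviceName", wt.WCHAR * 32),
                    ("dmSpecVersion", wt.WORD), ("dmDriverVersion", wt.WORD),
                    ("dmSize", wt.WORD), ("dmDriverExtra", wt.WORD),
                    ("dmFields", wt.DWORD),
                    ("dmPositionX", wt.LONG), ("dmPositionY", wt.LONG),
                    ("dmDisplayOrientation", wt.DWORD), ("dmDisplayFixedOutput", wt.DWORD),
                    ("dmColor", ctypes.c_short), ("dmDuplex", ctypes.c_short),
                    ("dmYResolution", ctypes.c_short), ("dmTTOption", ctypes.c_short),
                    ("dmCollate", ctypes.c_short),
                    ("dmFormName", wt.WCHAR * 32), ("dmLogPixels", wt.WORD),
                    ("dmBitsPerPel", wt.DWORD),
                    ("dmPelsWidth", wt.DWORD), ("dmPelsHeight", wt.DWORD),
                    ("dmDisplayFlags", wt.DWORD), ("dmDisplayFrequency", wt.DWORD),
                    ("dmICMMethod", wt.DWORD), ("dmICMIntent", wt.DWORD),
                    ("dmMediaType", wt.DWORD), ("dmDitherType", wt.DWORD),
                    ("dmReserved1", wt.DWORD), ("dmReserved2", wt.DWORD),
                    ("dmPanningWidth", wt.DWORD), ("dmPanningHeight", wt.DWORD)]

    user32 = ctypes.windll.user32

    def supported_modes(device=None):
        # device=None queries the display the calling thread runs on (the primary).
        modes = set()
        i = 0
        dm = DEVMODEW()
        dm.dmSize = ctypes.sizeof(DEVMODEW)
        while user32.EnumDisplaySettingsW(device, i, ctypes.byref(dm)):
            modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
            i += 1
        return sorted(modes)

    for w, h, hz in supported_modes():
        print(w, "x", h, "@", hz, "Hz")

    Whatever resolution you end up picking should appear in that list, otherwise the driver will refuse it or the TV will reject the signal.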

    If you provide your TV make and model, we can probably help you figure out the exact settings.

  • CarlF

    You can change the settings by blind typing if your video driver is behaving sanely.

    I don't have a Vista box here to test on, but with XP you could type:

    Windows key, c, d, d, enter, ctrl-tab, ctrl-tab, ctrl-tab, ctrl-tab, tab, shift-tab, down, down, enter.
    

    This would reduce the screen resolution two steps. Use more "down" steps at the end to reduce it all the way to the minimum. Again, this is for XP; test it on Vista (in Safe Mode) before trying it blind.

    You could also boot into Safe Mode and change the video driver to "VGA" instead of whatever accelerated driver you're using. After you reboot into normal mode, reinstall the real driver and set whatever resolution you like.

  • John T

    Since the display settings won't stick in safe mode, download NirCmd in safe mode and add a batch startup script to change the resolution:

    nircmd.exe setdisplay 1680 1050 32

    where 1680 is the width, 1050 is the height, and 32 is the color depth in bits.

  • Journeyman Geek