
LCD monitor vs LCD TV


Comments

  • Aori · Carbondale, IL · Posts: 1,886 · Member Uncommon
    Originally posted by grndzro
    5ms refresh rate is quite a difference when quickly scrolling through forums and websites. I sure as hell noticed a huge difference when I switched from my Sony FW900 CRT to LCD.

    Uh... depending on the LCD, the CRT should have been better, ignoring power draw, massive size, and space-heating ability of course.

    That was a good CRT and people still use them for gaming if they don't mind its drawbacks.
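    To put the 5ms figure in perspective, here is a quick back-of-the-envelope comparison (plain Python, using only numbers from this thread) of pixel response time against frame time at a few refresh rates; it is illustrative arithmetic, not a benchmark:

```python
# Compare a quoted 5 ms pixel response time against the frame time
# at a few common refresh rates (rough illustration, not a benchmark).
response_ms = 5.0

for hz in (60, 120, 160):
    frame_ms = 1000.0 / hz          # how long one frame stays on screen
    share = response_ms / frame_ms  # fraction of the frame spent transitioning
    print(f"{hz} Hz: frame time {frame_ms:.2f} ms, response is {share:.0%} of it")
```

    At 160 Hz a 5 ms transition eats most of the frame, which is why fast-refresh CRT users tend to notice the switch to LCD.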

  • Ridelynn · Fresno, CA · Posts: 4,173 · Member Uncommon


    Originally posted by ianicus
    Originally posted by GrayGhost79
    Originally posted by ianicus
    Originally posted by GrayGhost79

    Whoa this got crazy fast... as others have pointed out, 720p and 1080i are not in any way, shape, or form the same thing lol...

    Response time variances between TVs and monitors have become negligible. There once was a noticeable difference, but with any TVs from the past few years, with the exception of some really cheap odd off-brand models, this is no longer the case.

    TVs can be just as sharp and crisp as a monitor if tweaked correctly.

    If your TV's native resolution is 720p then you want to set it to that; if you don't believe me, then set it to 1366x768 and launch Steam.

    If the icons are too big at 720p, go to Control Panel, display settings, then "Make text and items larger or smaller", and simply select smaller. If the issue is in game, then reduce UI size; pretty much every game has the option.

    The main issue with using non-native resolutions is that text is problematic on TV screens. This is a general rule, but it doesn't always hold true: if text and everything else is fine inside and outside of games at 1366x768, then by all means use it, because there are exceptions to every rule. Every TV is different; it really depends on the brand and the specific model. If your TV lets you get away with 1080p then again use it, because again, not all TVs are created equal.

    I have a 55 inch TV I use when I feel like kicking back and using my Razer Onza, and it's got a much better picture than my monitor, and no, I'm not using a cheap generic monitor lol. I simply have a very nice TV and know what I'm doing when setting these things up :)

    And again... response time differences nowadays are negligible.
    The point HERE is his TV... sounds like a piece of shit... so yeah, monitor is probably his best bet... UPDATE - I relent, do whatever you think is best, OP, because this forum has turned into a big opinion piece with no actual factual relevance LOL. Do what feels RIGHT mate! :)
    Odd you make that comment... I posted factual information and you posted "His TV sounds like a piece of crap so monitor is probably better". But yes, in the end it is about what you want: if you want to use the monitor, use the monitor. If you want to use the TV, then use the TV. Another option is to simply hook both up and swap around when you feel like it. Game on one while watching a movie on the other, or whatever. Find the option that suits you best.
    I also presented factual information but people didn't seem to care, which is why I made the comment... You wanna say response times are negligible? Then so is every other specification; you can't have it both ways...

    For a monitor - I agree with you to some extent; it's not that the specifications are negligible: that implies that they don't matter as characteristics. The characteristics matter very much. It's just that, as printed specifications, they are meaningless because they are arbitrary.

    Take something as simple as the color Black. That can be very different on two different monitors, due to differences in LCD manufacturing, backlighting, pre-set contrast and brightness levels, etc.

    It's not that black is negligible. It's very relevant. It's just that you can't get any feel for which monitor is more "blackier" by looking at anything on the box.

    Sure, they print specifications: contrast levels (meaningless), pixel response time (no enforced standard), even the size (often measured at different spots or rounded; some 24" monitors are actually 23.7" diagonal).

    The only things you can really take away are the supported resolutions, which are pretty hard to fake.

    And who's to say which black may look better - some people may prefer a brighter monitor, others darker - each eye is different, and monitors are used in different locations for different purposes. Someone in a bright office reading Microsoft Word all day with black text on a white background may have a very different opinion than someone trying to play Quake in a bedroom with the lights off.

  • grndzro · Reno, NV · Posts: 1,150 · Member
    Originally posted by treysmooth
    Originally posted by grndzro
    Originally posted by treysmooth
    Originally posted by ianicus
    Originally posted by treysmooth
    Originally posted by cichy1012

    I have a 23 inch HP2310m monitor that runs a native resolution of 1920x1080.

    I also have, mounted on the wall 3 feet from me, a 32 inch Panasonic LCD 720p TV.

    I can't figure out which is better.

    What's weird is the TV can be set to all the HD resolutions, 1080p or 720 etc... or just basic 1366x768, even though the specs on the TV say 1366x768 max.

    Do I set it to basic or HD resolution? And if I use the TV, will I see a major difference in gaming from the monitor?

    720p is in fact 1080i so that is 1920x1080

    I use a 32 inch TV as my monitor and have it sitting around 3 feet from me; looks great in games.

    Err, not quite. The p and the i stand for progressive scan and interlaced, which are two VERY different image rendering methods, progressive being the clear winner over interlaced. It's often better to display in 720p rather than 1080i, as the interlacing creates digital artifacts on LCD displays.

     

    I'm aware, but if you buy a 32 inch 720p TV and hook up HDMI, it defaults to 1080i. In 1080i I have no artifacts and everything looks great. If he hooks the HDMI up on that TV, I guarantee it will show up as 1080i and will look great; that's my experience across 3 different displays from 3 brands, all 720p. To say it will have artifacts is factually incorrect; I have not seen a single artifact among any of them.

    There is no possible way the LCD crystals in a 720p TV can self replicate to 1080p resolution.

    Pictures or it didn't happen.

    I didn't say 1080p, I said, yet again, 1080i. If you go to your video card info on the connection, it will register as 1080i. I never said it's as good as 1080p; I even stated my monitor sits 3 feet from me as well.

    The artifacts are per-pixel based. When you take a 1920x1080 picture and downscale it to 1280x720, you lose pixels in the conversion. To call it artifacting is probably a misnomer. You get a better picture from an actual 720p stream because it was filmed in 720p.
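    The pixel loss in that downscale is easy to put a number on (simple arithmetic, nothing display-specific):

```python
# How much of a 1920x1080 frame survives when scaled to a 1280x720 panel.
src_px = 1920 * 1080   # 2,073,600 source pixels
dst_px = 1280 * 720    #   921,600 panel pixels
kept = dst_px / src_px
print(f"kept {kept:.1%}, discarded {1 - kept:.1%}")  # kept 44.4%, discarded 55.6%
```

    Over half the rendered pixels have nowhere to go on a 720p panel, which is the real source of the softness being argued about here.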

  • GrayGhost79 · Webster, MA · Posts: 4,813 · Member
    Originally posted by ianicus
    Originally posted by treysmooth
    Originally posted by ianicus
    Originally posted by treysmooth
    Originally posted by cichy1012

    I have a 23 inch HP2310m monitor that runs a native resolution of 1920x1080.

    I also have, mounted on the wall 3 feet from me, a 32 inch Panasonic LCD 720p TV.

    I can't figure out which is better.

    What's weird is the TV can be set to all the HD resolutions, 1080p or 720 etc... or just basic 1366x768, even though the specs on the TV say 1366x768 max.

    Do I set it to basic or HD resolution? And if I use the TV, will I see a major difference in gaming from the monitor?

    720p is in fact 1080i so that is 1920x1080

    I use a 32 inch TV as my monitor and have it sitting around 3 feet from me; looks great in games.

    Err, not quite. The p and the i stand for progressive scan and interlaced, which are two VERY different image rendering methods, progressive being the clear winner over interlaced. It's often better to display in 720p rather than 1080i, as the interlacing creates digital artifacts on LCD displays.

     

    I'm aware, but if you buy a 32 inch 720p TV and hook up HDMI, it defaults to 1080i. In 1080i I have no artifacts and everything looks great. If he hooks the HDMI up on that TV, I guarantee it will show up as 1080i and will look great; that's my experience across 3 different displays from 3 brands, all 720p. To say it will have artifacts is factually incorrect; I have not seen a single artifact among any of them.

    Well, in my personal experience as an installer for a television service provider, I have seen artifacts and/or pixelization when the TV is set to 1080i... this is why my company advises us to always set TVs to 720p; it provides a better viewing experience. Do yourself a favour and google interlaced versus progressive scan if you don't believe me...

    lol, I was wondering when the "I'm a professional..." comment was coming. 

    1080i will not display any artifacts or blurring if the TV has at least a semi decent processor lol. 
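    Setting aside whose processor is decent, the raw line-rate arithmetic behind the 720p-vs-1080i argument can be sketched like this (assuming 60 Hz signals: 60 full frames per second for progressive, 60 half-frame fields per second for interlaced):

```python
# Lines of picture delivered per second by 720p vs 1080i at 60 Hz:
# progressive sends every line each frame, interlaced sends half
# the lines each field, alternating odd and even rows.
def lines_per_second(lines, rate_hz, interlaced=False):
    per_refresh = lines // 2 if interlaced else lines
    return per_refresh * rate_hz

print(lines_per_second(720, 60))         # 720p  -> 43200 lines/s
print(lines_per_second(1080, 60, True))  # 1080i -> 32400 lines/s
```

    720p actually delivers more fresh lines per second, which is the usual reason installers prefer it for fast motion; 1080i only wins on still detail, and then only if the set deinterlaces well.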

  • ianicus · Calgary, AB · Posts: 472 · Member
    Originally posted by Ridelynn

     


    Originally posted by ianicus

    Originally posted by GrayGhost79

    Originally posted by ianicus

    Originally posted by GrayGhost79

    Whoa this got crazy fast... as others have pointed out, 720p and 1080i are not in any way, shape, or form the same thing lol...

    Response time variances between TVs and monitors have become negligible. There once was a noticeable difference, but with any TVs from the past few years, with the exception of some really cheap odd off-brand models, this is no longer the case.

    TVs can be just as sharp and crisp as a monitor if tweaked correctly.

    If your TV's native resolution is 720p then you want to set it to that; if you don't believe me, then set it to 1366x768 and launch Steam.

    If the icons are too big at 720p, go to Control Panel, display settings, then "Make text and items larger or smaller", and simply select smaller. If the issue is in game, then reduce UI size; pretty much every game has the option.

    The main issue with using non-native resolutions is that text is problematic on TV screens. This is a general rule, but it doesn't always hold true: if text and everything else is fine inside and outside of games at 1366x768, then by all means use it, because there are exceptions to every rule. Every TV is different; it really depends on the brand and the specific model. If your TV lets you get away with 1080p then again use it, because again, not all TVs are created equal.

    I have a 55 inch TV I use when I feel like kicking back and using my Razer Onza, and it's got a much better picture than my monitor, and no, I'm not using a cheap generic monitor lol. I simply have a very nice TV and know what I'm doing when setting these things up :)

    And again... response time differences nowadays are negligible.
    The point HERE is his TV... sounds like a piece of shit... so yeah, monitor is probably his best bet... UPDATE - I relent, do whatever you think is best, OP, because this forum has turned into a big opinion piece with no actual factual relevance LOL. Do what feels RIGHT mate! :)
    Odd you make that comment... I posted factual information and you posted "His TV sounds like a piece of crap so monitor is probably better". But yes, in the end it is about what you want: if you want to use the monitor, use the monitor. If you want to use the TV, then use the TV. Another option is to simply hook both up and swap around when you feel like it. Game on one while watching a movie on the other, or whatever. Find the option that suits you best.
    I also presented factual information but people didn't seem to care, which is why I made the comment... You wanna say response times are negligible? Then so is every other specification; you can't have it both ways...

     

    For a monitor - I agree with you to some extent; it's not that the specifications are negligible: that implies that they don't matter as characteristics. The characteristics matter very much. It's just that, as printed specifications, they are meaningless because they are arbitrary.

    Take something as simple as the color Black. That can be very different on two different monitors, due to differences in LCD manufacturing, backlighting, pre-set contrast and brightness levels, etc.

    It's not that black is negligible. It's very relevant. It's just that you can't get any feel for which monitor is more "blackier" by looking at anything on the box.

    Sure, they print specifications: contrast levels (meaningless), pixel response time (no enforced standard), even the size (often measured at different spots or rounded; some 24" monitors are actually 23.7" diagonal).

    The only things you can really take away are the supported resolutions, which are pretty hard to fake.

    And who's to say which black may look better - some people may prefer a brighter monitor, others darker - each eye is different, and monitors are used in different locations for different purposes. Someone in a bright office reading Microsoft Word all day with black text on a white background may have a very different opinion than someone trying to play Quake in a bedroom with the lights off.

    That's kinda why they started measuring it gray-to-gray, not black, if I'm not mistaken; due to those differences, they set their own baseline for measurement.


  • ianicus · Calgary, AB · Posts: 472 · Member
    Originally posted by GrayGhost79
    Originally posted by ianicus
    Originally posted by treysmooth
    Originally posted by ianicus
    Originally posted by treysmooth
    Originally posted by cichy1012

    I have a 23 inch HP2310m monitor that runs a native resolution of 1920x1080.

    I also have, mounted on the wall 3 feet from me, a 32 inch Panasonic LCD 720p TV.

    I can't figure out which is better.

    What's weird is the TV can be set to all the HD resolutions, 1080p or 720 etc... or just basic 1366x768, even though the specs on the TV say 1366x768 max.

    Do I set it to basic or HD resolution? And if I use the TV, will I see a major difference in gaming from the monitor?

    720p is in fact 1080i so that is 1920x1080

    I use a 32 inch TV as my monitor and have it sitting around 3 feet from me; looks great in games.

    Err, not quite. The p and the i stand for progressive scan and interlaced, which are two VERY different image rendering methods, progressive being the clear winner over interlaced. It's often better to display in 720p rather than 1080i, as the interlacing creates digital artifacts on LCD displays.

     

    I'm aware, but if you buy a 32 inch 720p TV and hook up HDMI, it defaults to 1080i. In 1080i I have no artifacts and everything looks great. If he hooks the HDMI up on that TV, I guarantee it will show up as 1080i and will look great; that's my experience across 3 different displays from 3 brands, all 720p. To say it will have artifacts is factually incorrect; I have not seen a single artifact among any of them.

    Well, in my personal experience as an installer for a television service provider, I have seen artifacts and/or pixelization when the TV is set to 1080i... this is why my company advises us to always set TVs to 720p; it provides a better viewing experience. Do yourself a favour and google interlaced versus progressive scan if you don't believe me...

    lol, I was wondering when the "I'm a professional..." comment was coming. 

    1080i will not display any artifacts or blurring if the TV has at least a decent processor lol. 

    I said personal, mister reads-good, and I'm pretty sure my line of work gives me A LOT more experience than you. Done.


  • grndzro · Reno, NV · Posts: 1,150 · Member
    Originally posted by Aori
    Originally posted by grndzro
    5ms refresh rate is quite a difference when quickly scrolling through forums and websites. I sure as hell noticed a huge difference when I switched from my Sony FW900 CRT to LCD.

    Uh... depending on the LCD, the CRT should have been better, ignoring power draw, massive size, and space-heating ability of course.

    That was a good CRT and people still use them for gaming if they don't mind its drawbacks.

    I swear some day I will buy another, refill the CRT, do a complete recap with custom heatsinks, and install a couple of fans to run it at 160Hz @ 2304x1440.

    Oh, I meant the CRT was way better :)

  • GrayGhost79 · Webster, MA · Posts: 4,813 · Member
    Originally posted by ianicus
    Originally posted by GrayGhost79
    Originally posted by ianicus
    Originally posted by treysmooth
    Originally posted by ianicus
    Originally posted by treysmooth
    Originally posted by cichy1012

    I have a 23 inch HP2310m monitor that runs a native resolution of 1920x1080.

    I also have, mounted on the wall 3 feet from me, a 32 inch Panasonic LCD 720p TV.

    I can't figure out which is better.

    What's weird is the TV can be set to all the HD resolutions, 1080p or 720 etc... or just basic 1366x768, even though the specs on the TV say 1366x768 max.

    Do I set it to basic or HD resolution? And if I use the TV, will I see a major difference in gaming from the monitor?

    720p is in fact 1080i so that is 1920x1080

    I use a 32 inch TV as my monitor and have it sitting around 3 feet from me; looks great in games.

    Err, not quite. The p and the i stand for progressive scan and interlaced, which are two VERY different image rendering methods, progressive being the clear winner over interlaced. It's often better to display in 720p rather than 1080i, as the interlacing creates digital artifacts on LCD displays.

     

    I'm aware, but if you buy a 32 inch 720p TV and hook up HDMI, it defaults to 1080i. In 1080i I have no artifacts and everything looks great. If he hooks the HDMI up on that TV, I guarantee it will show up as 1080i and will look great; that's my experience across 3 different displays from 3 brands, all 720p. To say it will have artifacts is factually incorrect; I have not seen a single artifact among any of them.

    Well, in my personal experience as an installer for a television service provider, I have seen artifacts and/or pixelization when the TV is set to 1080i... this is why my company advises us to always set TVs to 720p; it provides a better viewing experience. Do yourself a favour and google interlaced versus progressive scan if you don't believe me...

    lol, I was wondering when the "I'm a professional..." comment was coming. 

    1080i will not display any artifacts or blurring if the TV has at least a decent processor lol. 

    I said personal, mister reads-good, and I'm pretty sure my line of work gives me A LOT more experience than you. Done.

    Doubtful, but not knowing what your fields of expertise are keeps me from knowing with certainty, just as not knowing my fields of expertise keeps you from knowing with any certainty.

  • ianicus · Calgary, AB · Posts: 472 · Member
    Originally posted by GrayGhost79
    Originally posted by ianicus
    Originally posted by GrayGhost79
    Originally posted by ianicus
    Originally posted by treysmooth
    Originally posted by ianicus
    Originally posted by treysmooth
    Originally posted by cichy1012

    I have a 23 inch HP2310m monitor that runs a native resolution of 1920x1080.

    I also have, mounted on the wall 3 feet from me, a 32 inch Panasonic LCD 720p TV.

    I can't figure out which is better.

    What's weird is the TV can be set to all the HD resolutions, 1080p or 720 etc... or just basic 1366x768, even though the specs on the TV say 1366x768 max.

    Do I set it to basic or HD resolution? And if I use the TV, will I see a major difference in gaming from the monitor?

    720p is in fact 1080i so that is 1920x1080

    I use a 32 inch TV as my monitor and have it sitting around 3 feet from me; looks great in games.

    Err, not quite. The p and the i stand for progressive scan and interlaced, which are two VERY different image rendering methods, progressive being the clear winner over interlaced. It's often better to display in 720p rather than 1080i, as the interlacing creates digital artifacts on LCD displays.

     

    I'm aware, but if you buy a 32 inch 720p TV and hook up HDMI, it defaults to 1080i. In 1080i I have no artifacts and everything looks great. If he hooks the HDMI up on that TV, I guarantee it will show up as 1080i and will look great; that's my experience across 3 different displays from 3 brands, all 720p. To say it will have artifacts is factually incorrect; I have not seen a single artifact among any of them.

    Well, in my personal experience as an installer for a television service provider, I have seen artifacts and/or pixelization when the TV is set to 1080i... this is why my company advises us to always set TVs to 720p; it provides a better viewing experience. Do yourself a favour and google interlaced versus progressive scan if you don't believe me...

    lol, I was wondering when the "I'm a professional..." comment was coming. 

    1080i will not display any artifacts or blurring if the TV has at least a decent processor lol. 

    I said personal, mister reads-good, and I'm pretty sure my line of work gives me A LOT more experience than you. Done.

    Correct! Knowing that you're awesome and clearly the all-knowing Oz, I concede to your greatness in the field of digital display technology!

    Corrected, lawl. Just joking around, by the way; I've stopped taking this thread seriously, considering the OP posted his thanks and reply a page back. This should really be locked.


  • Adamai · Derby · Posts: 469 · Member
    The monitor has a faster colour change than a TV, and it squishes the resolution into a smaller area, giving a far better quality image. The monitor screen also has more PPI; a TV is DPI. Or is it the other way round? Anyway, the TV's dots, or pixels, are larger than a monitor's. Your monitor is also designed for a PC; a TV is designed for TV. As for resolution, your HDMI cable on your PC won't work like your VGA cable... stick with monitors if you care about image quality.
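    The pixel-density point is the one that actually favours the monitor here. A rough comparison for the two screens in this thread (nominal diagonals from the posts, so treat the numbers as approximate):

```python
import math

# Approximate pixels per inch for the two displays discussed:
# the 23" 1920x1080 monitor vs the 32" TV driven at 1366x768.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 23), 1))  # monitor: ~95.8 PPI
print(round(ppi(1366, 768, 32), 1))   # TV: ~49.0 PPI
```

    Roughly twice the pixel density on the monitor, which is why text in particular looks crisper there at a 3-foot viewing distance.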