
1080i please!

drbaltazar Member UncommonPosts: 7,856
Question: most broadcasters worldwide are 1080i. A big part of the world watches their content in 1080i; it looks great and is close to 720p in bandwidth. How come game devs fight tooth and nail against this? Let's face it: 1080i is here to stay. Ryse, FF14, etc. are showing this with their 900p and 720p. How come console makers don't stick to what broadcasters use (1080i)?

Comments

  • Ridelynn Member EpicPosts: 7,383

    Umm, because the i stands for Interlaced, which saves on the bandwidth required to transmit the image - by only transmitting every other scan line in each pass - but not on the number of pixels required to generate the image: you still have the same number of pixels to render.

    This is opposed to p, for progressive, which means the entire image is refreshed each frame.

    DisplayPort, DVI and HDMI cables generally aren't starved for bandwidth, so there's no reason for a computer to output interlaced graphics. Satellites and cable companies are, because they have a fixed number of channels they can deliver on a service feed (or rather, a fixed number of bytes/second they can deliver, and they chop that up across all the channels).

    So if the computer has to generate 1920x1080 pixels, regardless of if it's interlaced or progressive, why would you even consider interlaced if you had the choice? The GPU has to work just as hard either way.

    Most AV enthusiasts will agree that 720p content generally looks better than 1080i content, because you have higher bandwidth and get more fluid motion. You know that thing most gamers complain about when their frame rates drop below 60fps? Well, with interlaced, you'd need 120 fields per second in order to get the same effect as 60fps progressive.

    So if you think about it, to deliver the same quality at the same resolution, the GPU would actually have to work twice as hard for an interlaced picture as for a progressive one.
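
    To put rough numbers on that, here is a minimal back-of-the-envelope sketch; the 60 Hz rate and the raw pixel counts are illustrative assumptions, ignoring compression and chroma subsampling entirely:

```python
# Rough comparison: pixels the GPU must render per frame vs. pixels that
# actually go over the wire per second, for 1080p/1080i/720p at a nominal
# 60 Hz. Illustrative only; real broadcasts are compressed.

FORMATS = {
    # name: (width, height, interlaced?)
    "1080p60": (1920, 1080, False),
    "1080i60": (1920, 1080, True),   # 60 fields/s, i.e. 30 full frames/s
    "720p60":  (1280, 720,  False),
}

REFRESH_HZ = 60

for name, (width, height, interlaced) in FORMATS.items():
    pixels_rendered = width * height                     # full image either way
    lines_sent = height // 2 if interlaced else height   # interlacing halves lines per pass
    pixels_sent_per_sec = width * lines_sent * REFRESH_HZ
    print(f"{name}: render {pixels_rendered:>9,} px/frame, "
          f"send ~{pixels_sent_per_sec:>11,} px/s")
```

    Which is the point above: 1080i roughly halves what goes down the wire, but not what the GPU has to draw.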

    I am generally responding not because I think you really get this; I'm pretty sure either my explanation is way underneath your hidden genius, or you can't really read/understand English anyway - but rather for the benefit of anyone else who comes across this thread and wonders about it themselves.

  • drbaltazar Member UncommonPosts: 7,856
    I'll test when I'm home, but I found my issue for everything, so I'm hoping mine might be fixed also.

    So p is p from computer to screen, and is never manipulated into i or changed. It stays the same.

    What happens with i from disc? Is it generated as p, transformed into i, sent to the TV, and then converted back to p by the TV? How come we can't get i all the way through? Is it too complex to create a game in i? How come they don't make interlaced-native TVs anymore? I ask because the majority can't afford the cost of p entertainment. Is there such a thing as an LED RGB CRT monitor, 1920x1080i native, 23 inch?
  • drbaltazar Member UncommonPosts: 7,856
    It looks like only CRT and DLP do native 1080i.
  • Ridelynn Member EpicPosts: 7,383

    If your monitor/tv can do 1080p, it can do 1080i.

    The resolution is what is native; the i or p just determines your scan type/frequency.

    Most TVs that are only 1080i are really 720 native, but they are able to take a 1080i signal and down-convert it to 720 resolution and call it 1080i because the bandwidth is similar to 720p. It accepts the 1080 signal, but you aren't getting a 1080 picture.

    Broadcasters tend to use 1080i because it takes half as much bandwidth, so you can show twice as many channels in HD in the same broadcasting constraints.

    As a consumer, if you're given the choice of just HBO (in 1080p), or HBO East and HBO West (but in 1080i) - only the technophiles would realize that the single 1080p channel is better because it will be a higher quality picture, and that you can DVR whatever you want if you miss the show. The mass consumer population will say "two channels is better than one", and the marketing will spin it to say you can catch up on a show on the West channel 3 hours later if you missed it on the East (or watch it early on the East if you live in the West). The cable companies will spin it by saying they have more channels in HD than the competition (because 1080i is technically HD), even if they are lower quality.

    So why again would you possibly want a monitor that does not do 1080p? Any flatscreen (LCD, DLP-based projection, plasma, OLED, etc.) is going to display the image progressively anyway (because they don't have scan lines or electron beams); you just get a crappier image because you effectively cut your frame rate in half.
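
    As a rough illustration of what a progressive panel has to do with an interlaced signal, here is a minimal "weave" deinterlacing sketch; real TVs use fancier motion-adaptive methods, so treat this as the simplest possible case:

```python
# Minimal "weave" deinterlace: merge two fields (each carrying every other
# scan line) back into one full progressive frame. Because the two fields
# are captured at different moments, fast motion produces the "combing"
# artifacts that make interlaced content look worse in motion.

def weave(top_field, bottom_field):
    """top_field holds lines 0, 2, 4, ...; bottom_field holds lines 1, 3, 5, ..."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

# Toy 4-line "image" where each line is just a string of pixels.
top = ["AAAA", "CCCC"]       # lines 0 and 2
bottom = ["BBBB", "DDDD"]    # lines 1 and 3
print(weave(top, bottom))    # ['AAAA', 'BBBB', 'CCCC', 'DDDD']
```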

    A few years before the whole "HD" thing hit big in the consumer market, there was just "Progressive Scan". Everything in the US was NTSC 480i, because that was how the electron scan guns worked on a CRT and that's how they broadcast the analog signal.

    Then DVD companies, looking for an edge on picture quality, realized they had enough storage space on a DVD, and enough bandwidth on an analog channel, to show the image progressively - to update the entire frame at once, rather than only half of the frame at a time. The picture quality was a lot sharper - mainly because high contrast and fast motion images translated more accurately and with less motion blur. This was just marketed as "Progressive Scan", but amounted to the difference between the traditional 480i versus the newer 480p.


    http://en.wikipedia.org/wiki/Interlaced_video
    http://en.wikipedia.org/wiki/Progressive_scan

  • Dreamo84 Member UncommonPosts: 3,713

    I'm sorry, I think the OP is a troll. The fact that his reply had even worse English than his original post seals it for me.

    Pretty good one though.

  • Kyllien Member UncommonPosts: 315
    Originally posted by Ridelynn

    Broadcasters tend to use 1080i because it takes half as much bandwidth, so you can show twice as many channels in HD in the same broadcasting constraints.

    Over time the same amount of data is sent for either 1080i or 1080p; 1080i just does it in 2 passes while 1080p is one pass. For the most part humans can't detect the difference between i and p, but technologically it always makes sense to complete everything in 1 pass, which is why the majority of TVs and monitors are 1080p. Also, a majority of TVs and monitors are LCD-based now; the LED stuff is still LCD screens.

  • Stuka1000 Member UncommonPosts: 955
    And of course you are assuming that English is his first language.
  • GwapoJosh Member UncommonPosts: 1,030
    To the OP.. Is your space bar broken?  Might be time to get a new keyboard..

    "You are all going to poop yourselves." BillMurphy

    "Laugh and the world laughs with you. Weep and you weep alone."

  • Bigmamajama Member Posts: 198
    Originally posted by GwapoJosh
    To the OP.. Is your space bar broken?  Might be time to get a new keyboard..

    Smartphone poster.

  • GwapoJosh Member UncommonPosts: 1,030
    Originally posted by Bigmamajama
    Originally posted by GwapoJosh
    To the OP.. Is your space bar broken?  Might be time to get a new keyboard..

    Smartphone poster.

    Ah I see :)

    "You are all going to poop yourselves." BillMurphy

    "Laugh and the world laughs with you. Weep and you weep alone."

  • g0m0rrah Member UncommonPosts: 325

     

     Let's drop interlaced and progressive, and we still have 720 and 900 content on 1080 screens. With my PC I find that my video card is placed under much more stress by AA and such than by simply having the game set to a proper resolution. You would think that the entire purpose of 4K screens is to end the jaggies, since we all know 4K screens will only be beneficial when you are close to the screen and it is a decent-sized screen.

     I guess what I am getting at is maybe the OP is saying "stop releasing content that can't scale to 1080p". I understand 720p is nice when streaming videos over the internet, but if you are selling a game, it had better be 1080 or higher...

  • Ridelynn Member EpicPosts: 7,383

    The reason you see a lot of PS3/XB360 content in 720 is because that's all they can do - they don't have enough processing power to go up to 1080 for many of the titles.

    There are a few 1080 titles out there, but 720 far and away is the most common resolution on the devices.

    Seeing as how they originally released in 2005 (360) and 2006 (PS3), I'd say the 7-8 year old consoles are doing well to hold up running modern game engines at 720. The hardware is locked, and has been locked since their introduction date all those years ago, and we won't see any improvements in consoles until we get the PS4/XBOne.

    How many people would figure on playing BF4 or GTA V at 1080 (i or p) on a standard computer/video card released in 2005?

  • Melecon Member UncommonPosts: 74

    I am not sure you understand what bandwidth is... yes, you will eventually transmit the same amount of data, but only half at a time. So therefore you save bandwidth on a single interlaced broadcast.

    Example (simple numbers):

    A show in "p" will take up 100% of the bandwidth per second.

    A show in "i" takes up 50% of the bandwidth per second.

    So to max out at 100% bandwidth per second you can have 1 show in "p" or 2 in "i".

    If you read the posts from Ridelynn you should have picked this up, as he already stated it there.
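
    The same arithmetic in a tiny sketch, with made-up numbers standing in for a fixed broadcast feed:

```python
# Toy channel-packing arithmetic (made-up numbers): one fixed-capacity feed,
# and per-channel costs for progressive vs. interlaced at the same resolution.

FEED_CAPACITY_MBPS = 38.0   # pretend capacity of one broadcast feed
COST_1080P_MBPS = 16.0      # pretend bitrate of a 1080p channel
COST_1080I_MBPS = 8.0       # interlaced sends half the lines per pass

print("1080p channels that fit:", int(FEED_CAPACITY_MBPS // COST_1080P_MBPS))  # 2
print("1080i channels that fit:", int(FEED_CAPACITY_MBPS // COST_1080I_MBPS))  # 4
```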

  • Deivos Member EpicPosts: 3,692

    My 2560x1600 LCD screen is ~ six years old. I've been using it on my computers consistently since then, and it plays any game that supports it rather well.

    I would very much appreciate more developers on the PC creating their game content to properly scale between more disparate resolutions, as the most frequent issue is that the UI elements are too low-res and end up either chunky or tiny. The performance itself has never been an issue.

    However, how many people own and use such monitors? Part of why I have this and other high-res screens is for work rather than for gaming, and unless you're doing professional graphic arts, professional film editing, or some other activity that needs a large screen/resolution, you're not likely to be using one.

    Save for if yer just a really big nerd and love having giant monitors set up in an array in front of you like yer plugged into the Matrix.

    Ignoring these extreme cases, standard hardware is definitely capable of running at resolutions higher than 720 and 1080, and people on the PC could benefit greatly if developers on that platform would push to improve the supported resolutions so we can see quicker growth.

    That consoles and TV dictated PC gaming resolutions so strongly that some of the larger screen sizes we used to have available died out when HD became standard still irks me to this day, as it felt to me like the TV market stunted the quality of the PC market.

    But what ye gonna do. Complain on the internet and continue to play games.

    "The knowledge of the theory of logic has no tendency whatever to make men good reasoners." - Thomas B. Macaulay

    "The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge." - Daniel J. Boorstin

  • Ridelynn Member EpicPosts: 7,383


    Originally posted by Deivos
    My 2560x1600 LCD screen is ~ six years old. I've been using it on my computers consistently since then, and it plays any game that supports it rather well. I would very much appreciate more developers on the PC creating their game content to properly scale between more disparate resolutions, as the most frequent issue is that the UI elements are too low-res and end up either chunky or tiny. The performance itself has never been an issue.

    What you are talking about is the difference between dots per inch (DPI) and resolution. You are right, it is a big problem.

    You are also right - many video cards can display high resolutions. Most can go much higher than even your monitor.

    The problem is, for most games, as you increase the resolution, you increase the number of pixels that must be rendered, which usually means more textures, more polygons, more everything - and that's where the slowdown comes in. They may be able to produce an image at 5000x9000 (just some random big numbers I picked), but the trick is whether it can do that 60 times a second for whatever engine it happens to be running.

    Now, if games scaled their DPI, rather than just scaling up the resolution, you wouldn't necessarily have that problem. That's exactly what Apple does with their Retina displays (on both the iOS and OSX devices). You still have more pixels to drive, but at least you don't necessarily have more polygons or textures - so the additional load is larger, but not linearly larger like it would be if you just straight up increased the resolution.

    And if you scale up DPI rather than just resolution, you also fix the problem of UI elements getting very small and spread out, text getting very tiny, and icons shrinking. But you also give up the "extra desktop space" that increasing resolution typically brings.
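
    A minimal sketch of that idea; the 96 dpi baseline and the element sizes are assumptions picked for illustration, not anything a particular engine does:

```python
# Sketch of DPI-aware UI scaling: the layout is authored in "points" against
# a 96 dpi baseline (an assumption here), and each element is scaled by the
# display's DPI factor so it keeps the same physical size on screen.

BASELINE_DPI = 96

def scale_ui(elements_pts, display_dpi):
    """elements_pts: {name: (width_pts, height_pts)} -> sizes in pixels."""
    factor = display_dpi / BASELINE_DPI
    return {name: (round(w * factor), round(h * factor))
            for name, (w, h) in elements_pts.items()}

ui = {"button": (120, 32), "minimap": (200, 200)}
print(scale_ui(ui, 96))    # 1:1 on a standard-density display
print(scale_ui(ui, 192))   # doubled on a Retina-style display
```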

  • Quizzical Member LegendaryPosts: 25,351
    Originally posted by Deivos

    I would very much appreciate more developers on the PC creating their game content to properly scale between more disparate resolutions, as the most frequent issue is that the UI elements are too low-res and end up either chunky or tiny. The performance itself has never been an issue.

    If you want to scale 3D graphics sensibly to different resolutions, it's fundamentally about doing some computations in real projective space RP^3.  They're not hard computations; indeed, it's quite possible that some programmers could even get it exactly right without really understanding what they're doing.  But for a programmer who has never seen quotient spaces, they'll likely seem weird.

    Trying to scale 2D graphics to different resolutions is perhaps harder.  Sprites and UI elements are built to be a fixed number of pixels, and if you stretch or compress them, they look terrible.  Redoing all of your 2D artwork for several different monitor resolutions would be expensive.
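
    For anyone curious what the 3D side of that looks like in practice, here is a minimal sketch of a perspective projection in homogeneous coordinates (points of RP^3 written as 4-vectors). Only the aspect ratio depends on the output resolution, which is what lets the same scene scale to any window size; the 60 degree field of view and the test point are arbitrary choices for illustration:

```python
# Minimal perspective projection in homogeneous coordinates: a 3D point
# (x, y, z) is lifted to (x, y, z, 1), multiplied by a projection matrix,
# then divided by w. Only the aspect ratio depends on resolution.
import math

def perspective(fov_y_deg, width, height, near=0.1, far=100.0):
    """OpenGL-style perspective matrix; resolution only affects the aspect ratio."""
    aspect = width / height
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0,                          0.0],
        [0.0,        f,   0.0,                          0.0],
        [0.0,        0.0, (far + near) / (near - far),  2 * far * near / (near - far)],
        [0.0,        0.0, -1.0,                         0.0],
    ]

def project(m, point):
    """Lift (x, y, z) to homogeneous coordinates, transform, then divide by w."""
    x, y, z = point
    v = [x, y, z, 1.0]
    clip = [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]
    return (clip[0] / clip[3], clip[1] / clip[3])   # normalized device coordinates

p = (1.0, 1.0, -5.0)
print(project(perspective(60, 1920, 1080), p))  # 16:9 window
print(project(perspective(60, 2560, 1600), p))  # 16:10 window, same scene
```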

  • drbaltazar Member UncommonPosts: 7,856

    As far as resources go, if you've got, say, a CRT or some DLP unit, a GPU trying to max it out will have to send twice the data for the same quality - or so close that I don't believe anyone will notice. But adopting 1080i causes other issues: games are made at 60.1 Hz or so, while 1080i movies and the like are made at 59.94 Hz, from what I gather. This is why you almost always need two screens: one to view properly 1080i-encoded stuff and one for gaming. As far as I know, only DLP and CRT do native 1080i; I don't even know if DLPs are still being made in 2013. 1080i is great - if it wasn't, cable would have moved on to 1080p. Sadly, computer games have gone a different route!

    And the chasm was never mended! I found my issue of why the 1080p looked like 1069 or whatever - yeah, I found the artifact-causing thing. I didn't like this, because fixing it 100% is way more involved than I can do!

    But let me say it is caused by hardware and the Windows OS - nothing else!

    I'll look into the 1080i tomorrow to see if this fixed the issue I had (it probably will be fixed), so I won't be able to game in 1080i, but at least I'll view non-online series the way they're meant to be seen, without bugs in my hardware, and I'll game on the other screen in 1080p. Thanks guys for the help. I never thought chip makers and OS makers had such a hard time with image quality! It's like they do this on purpose when hybrids are around!
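
    For what it's worth, a rough sketch of the 60 vs 59.94 Hz mismatch mentioned above: the rates differ by 0.06 frames per second, so roughly once every 17 seconds a frame has to be dropped or repeated, which is one source of the judder people notice.

```python
# Back-of-the-envelope arithmetic: when source and display rates differ,
# a frame must be dropped or repeated every 1 / |difference| seconds.

def seconds_between_slips(source_hz, display_hz):
    return 1.0 / abs(source_hz - display_hz)

print(round(seconds_between_slips(60.0, 59.94), 1), "s between dropped/repeated frames")
# ~16.7 s: roughly one visible hitch every 17 seconds.
```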

  • Ridelynn Member EpicPosts: 7,383

    I blame the OS for not using the right event interrupt counter and your hard drive not supporting MSI-X.

  • Deivos Member EpicPosts: 3,692
    Originally posted by Quizzical

    If you want to scale 3D graphics sensibly to different resolutions, it's fundamentally about doing some computations in real projective space RP^3.  They're not hard computations; indeed, it's quite possible that some programmers could even get it exactly right without really understanding what they're doing.  But for a programmer who has never seen quotient spaces, they'll likely seem weird.

    Trying to scale 2D graphics to different resolutions is perhaps harder.  Sprites and UI elements are built to be a fixed number of pixels, and if you stretch or compress them, they look terrible.  Redoing all of your 2D artwork for several different monitor resolutions would be expensive.

    That's why you build UI elements initially in vector graphics and convert them into a set of resolution-based assets for release. That means you only have to build the assets once, and only have some touching up to do on conversion.
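
    A minimal sketch of that export step; the 1080p design baseline and the asset size are made-up numbers, and a real pipeline would hand each size to an SVG rasterizer rather than just printing it:

```python
# Sketch of exporting one vector-authored UI asset at several target
# resolutions, instead of stretching a single bitmap after the fact.

DESIGN_HEIGHT = 1080  # the vertical resolution the UI was laid out against

def export_sizes(asset_px_at_design, target_heights):
    """Return {target_height: (width, height)} raster sizes for one asset."""
    w, h = asset_px_at_design
    return {t: (round(w * t / DESIGN_HEIGHT), round(h * t / DESIGN_HEIGHT))
            for t in target_heights}

minimap = (256, 256)  # size at the 1080p baseline
print(export_sizes(minimap, [720, 1080, 1440, 2160]))
# {720: (171, 171), 1080: (256, 256), 1440: (341, 341), 2160: (512, 512)}
```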

     

    And I'm aware, Ridelynn. My complaint in general is that graphics hardware on the PC is generally less stunted than the TV and console market, and that PC titles adopting TV resolution standards hampers that platform's capability to produce higher quality graphics.

     

    Both of you skipped over the part I stated about not only the UI size being off, but the scaling of it being chunky. Increasing DPI can work to a degree, but it runs the risk of looking fuzzy or chunky if the resolution is notably different.

    The reason it won't affect certain fonts on a computer when they scale DPI is that they're vector based (outline or stroke based; there are bitmap fonts too, and they scale poorly).

    That's honestly the only way for graphics not to get borked when tweaking scaling, but it takes more processing power. With text that tends to be fine, as it's usually a negligible impact, but the moment you start tossing in a lot of complex graphics, you're going to add a lot of things to process.

     

    Pretty sure we're at least close to the same page and not really in disagreement. Just noting, as I don't really need to see corrections or clarifications unnecessarily. A conversation on solutions or the like might be nice, though.

    "The knowledge of the theory of logic has no tendency whatever to make men good reasoners." - Thomas B. Macaulay

    "The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge." - Daniel J. Boorstin

  • drbaltazar Member UncommonPosts: 7,856
    @Ridelynn ya, you are right now that you mention it. It's an MS contr
  • drbaltazar Member UncommonPosts: 7,856
    Sorry, I'm on a tablet and having issues (it wants to autocorrect in French).
  • drbaltazar Member UncommonPosts: 7,856
    http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects

    OK! This is my problem with MS: they ditched the 1080i idea, which is the number one format used on cable. I mean, there is no better bang for your bandwidth! I'll assume they love eye candy, so they go ballistic: they're 1080p, everything looks better than on computer, and then BOOM, THEY ENABLE THEIR DYNAMIC WHATCHAMACALLIT! OK, so now I ask: what did they gain over 1080i? Only one thing my brain can come up with: their own standard! It isn't gonna be better than 1080i and will not take less bandwidth, since 1080i takes about the same. What about cable shows at 1080i? They will stick to 1080i, since every kink is about to be fixed (ya, the 1080i issue was bad hardware timing and interrupts). Anyone know if the PS4 will support 1080i and keep it simple, or won't they? 1080i is here to stay; it is the best compromise till Google and Shaw bring gigabit to America! (in 20 years)