A new study published in Nature by University of Cambridge researchers just dropped a pixelated bomb on the entire Ultra-HD market, but as anyone with myopia can tell you, if you take your glasses off, even SD still looks pretty good :)

  • imetators@lemmy.dbzer0.com

    I have a 65" 4K TV that runs in tandem with a Beelink S12 Pro mini PC. I run the mini PC in FHD mode to ease up on resources and usually just watch streams/online content on it, which is 99% 1080p@60. Unless the compression is bad, I don't feel much of a difference. In fact, my digitized DVDs look good even at their native resolution.

    For me, 4K is a nice-to-have but not a necessity when consuming media. 1080p still looks crisp with enough bitrate.

    I'd add that maybe this 4K-8K race is sort of like mp3@320kbps vs FLAC/WAV. Both sound good when played on a decent system. FLAC is nicer, but only on specific hardware that a typical consumer wouldn't buy. Almost none of us own studio-grade 7.1 systems at home. A JBL speaker is what we have, and I doubt FLAC sounds noticeably better on it than mp3@192kbps.

    • thatKamGuy@sh.itjust.works

      Interestingly enough, I was casually window-shopping TVs and was surprised to find that LG killed off their OLED 8K TVs a couple of years ago!

      Until/if we get to a point where more people want, and can fit, 110-inch+ TVs in their living rooms, 8K will likely remain a niche for the wealthy to show off, more than anything.

  • Pringles@sopuli.xyz

    I don't like large 4K displays because the resolution is so good it breaks the immersion when you watch a movie. You can sometimes see that the actors are on a set, or details of the clothing in medieval movies that give away that it was made with modern sewing equipment.

    It's a bit of a stupid reason, I guess, but that's why I don't want to go above 1080p for TVs.

  • IronpigsWizard@lemmy.world

    After years of saying that a good 1080p TV, playing a good-quality media file, looks just as good as any 4K TV I have seen, I now feel justified… and ancient.

  • Blackmist@feddit.uk

    The main advantages of 4K TVs “looking better” are…

    1. HDR support. Dolby Vision especially gives a noticeably better picture in bright scenes.

    2. Support for higher framerates. This is only really useful for gaming, at least until they broadcast sports at higher framerates.

    3. The higher resolution is mostly wasted on video content where for the most part the low shutter speed blurs any moving detail anyway. For gaming it does look better, even if you have to cheat with upscaling and DLSS.

    4. The motion smoothing. This is a controversial one, because it makes movies look like swirly home videos. But the kinds of clips used in shop demos (splashing slo-mo paint, slow shots of jungles with lots of leaves, dripping honey, etc.) do look nice with motion interpolation switched on. They certainly don't show clips of the latest blockbuster like that, because it would become rapidly apparent just how jarring it looks.

    The higher resolution is just one part of it, and it’s not the most important one. You could have the other features on a lower resolution screen, but there’s no real commercial reason to do that, because large 4K panels are already cheaper than the 1080p ones ever were. The only real reason to go higher than 4K would be for things where the picture wraps around you, and you’re only supposed to be looking at a part of it. e.g. 180 degree VR videos and special screens like the Las Vegas Sphere.

  • gandalf_der_12te@discuss.tchncs.de

    I can confirm 4K and up add nothing for me compared to 1080p, and even 720p. As long as I can recognize the images, who cares. Higher resolution just means you see more sweat, pimples, and the like.

    edit: wait, correction. 4K does add something to my viewing experience: a lot of lag, due to the GPU not being able to keep up.

  • melsaskca@lemmy.ca

    Black-and-white antenna TVs from the 1950s were clearer than a lot of TVs today, but they weighed 600 kilograms. Nowadays I buy cheap, small TVs and let my brain fill in the empty spaces like it's supposed to. /s

  • kossa@feddit.org

    I just love how all the articles and everything about this study go “Do you need another TV or monitor?” instead of “here's a chart on how to optimize your current setup and make it work without buying shit”. 😅

    • vacuumflower@lemmy.sdf.org

      Selling TVs and monitors is an established business with common interest, while optimizing people’s setups isn’t.

      It's a bit like building a house, but in reverse: a cubic meter or two of cut wood doesn't cost very much, even combined with the other necessary materials, yet to get a usable end result people still hire someone beyond the workers doing the physical labor.

      There are those “computer help” people running around helping grannies clean viruses off Windows (I mean the ones who aren't scammers); they probably need to incorporate. Except then such corporate entities would likely be sued without end by companies that want to sell new shit. Balance of power.

  • arthurpizza@lemmy.world

    An overly compressed 4K stream will look far worse than good-quality 1080p. We keep upping the resolution without adopting newer codecs or adjusting the bitrate.

    • Psythik@lemmy.world

      This is true. That said, if you can't tell the difference between 1080p and 4K from the pixels alone, then either your TV is too small or you're sitting too far away, in which case there's no point in going with 4K.

      At the right seating distance, there is a benefit to be had even by going with an 8K TV. However, very few people sit close enough/have a large enough screen to benefit from going any higher than 4K.

      Source: https://www.rtings.com/tv/learn/what-is-the-resolution

    • Squizzy@lemmy.world

      I went looking for a quick explainer on this, and that side of YouTube goes so in-depth that I'm even more confused.

      • Redex@lemmy.world

        I'll add another explanation for bitrate that I find understandable: you can think of resolution as basically the max quality of a display; no matter the bitrate, you can't display more information/pixels than the screen possesses. Bitrate, on the other hand, represents how much information you are receiving from e.g. Netflix. Without any compression, each HDR pixel would require 30 bits, or 3.75 bytes, of data. A 4K screen has about 8 million pixels, so an HDR stream running at 60 fps would require about 1.7 GB/s of download without any compression. Bitrate is basically the measure of that: how much we've managed to compress that data flow. There are many ways to achieve this compression, and a lot of it depends on how the individual codec works, but put simply, one of the many methods effectively involves grouping pixels into larger blocks (e.g. 32x32 pixels) and saying they all have the same colour. As a result, at low bitrates you'll start to see blocking and other visual artifacts that significantly degrade the viewing experience.
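
        A quick back-of-the-envelope version of that math, as a minimal Python sketch (assuming a 3840×2160 panel, 10 bits per colour channel, and 60 fps; the ~1.7 figure comes out in GiB/s):

        ```python
        # Uncompressed data rate of a 4K, 10-bit HDR, 60 fps video stream.
        width, height = 3840, 2160      # UHD "4K" panel
        bits_per_pixel = 3 * 10         # three colour channels at 10 bits each
        fps = 60

        pixels_per_frame = width * height                         # ~8.3 million
        bytes_per_second = pixels_per_frame * bits_per_pixel / 8 * fps

        print(f"{pixels_per_frame / 1e6:.1f} million pixels per frame")   # 8.3
        print(f"{bytes_per_second / 2**30:.2f} GiB/s uncompressed")       # ~1.74
        # A real 4K stream arrives at only a few MB/s; the difference is what the
        # codec's compression (i.e. the bitrate) buys you.
        ```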

        As a side note, one cool thing that codecs do (not sure if literally all of them do it, but I think most by far) is that not every frame is encoded in its entirety. You have I, P and B frames. I frames (also known as keyframes) are full frames: they're fully defined and are basically like a picture. P frames don't define every pixel; instead they define the difference between their frame and the previous one, e.g. that the pixel at x: 210, y: 925 changed from red to orange. B frames do the same, but they use both previous and future frames for reference. That's why you might sometimes notice that in a stream, even when the quality setting isn't changing, every couple of seconds the picture becomes really clear, then gradually degrades, and then suddenly jumps up in quality again.
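
        To make the P-frame idea concrete, here's a toy Python sketch (not how a real codec works internally, just the delta-encoding principle: store a full keyframe, then only the pixels that changed):

        ```python
        # Toy illustration of the P-frame idea: store a keyframe in full, then
        # store only (index, new_value) pairs for pixels that changed.

        def encode_delta(prev_frame, frame):
            """Return only the pixels that differ from the previous frame."""
            return [(i, new) for i, (old, new) in enumerate(zip(prev_frame, frame)) if old != new]

        def apply_delta(prev_frame, delta):
            """Rebuild a frame from the previous frame plus a stored delta."""
            frame = list(prev_frame)
            for i, value in delta:
                frame[i] = value
            return frame

        keyframe   = [10, 10, 10, 10, 10, 10]   # "I-frame": fully defined
        next_frame = [10, 10, 12, 10, 10, 11]   # only two "pixels" changed

        delta = encode_delta(keyframe, next_frame)
        print(delta)                             # [(2, 12), (5, 11)]
        assert apply_delta(keyframe, delta) == next_frame
        ```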

      • HereIAm@lemmy.world

        For an ELI5 explanation, this is what happens when you lower the bit rate: https://youtu.be/QEzhxP-pdos

        No matter the resolution of the video, if the amount of information per frame is so low that the encoder has to lump differently coloured pixels together, it will look like crap.

      • starelfsc2@sh.itjust.works

        On codecs and bitrate? A codec is basically how the video is compressed (H.264, HEVC, etc.; the .avi or .mp4 file is just the container), and bitrate is how much data is sent per second for the video. Videos mostly only track what changed between frames, so a video of a still image can be 4K with a really low bitrate, but if things are moving it'll get really blurry at a low bitrate, even in 4K.

      • null_dot@lemmy.dbzer0.com

        The resolution (4k in this case) defines the number of pixels to be shown to the user. The bitrate defines how much data is provided in the file or stream. A codec is the method for converting data to pixels.

        Suppose you’ve recorded something in 1080p (low resolution). You could convert it to 4k, but the codec has to make up the pixels that can’t be computed from the data.

        In summary, the TV in my living room might be more capable, but my streaming provider probably isn’t sending enough data to really use it.

  • 0ndead@infosec.pub

    I think the real problem is that anything less than 4K looks like shit on a 4K TV.

  • michaelmrose@lemmy.world

    The study doesn't actually claim that. The actual article title is “Study Boldly Claims 4K And 8K TVs Aren't Much Better Than HD To Your Eyes, But Is It True?” As with all headlines that ask a question, the answer is either NO or it's complicated.

    It says that we can distinguish up to 94 pixels per degree (PPD), or about 1080p on a 50" screen 10 feet away.

    In pixels per degree, that works out to:

    A 27" monitor 18" away: 1080p = 29, 4K = 58, 8K = 116

    A 40" TV 8 feet away / 50" TV 10 feet away: 1080p = 93

    A 70" TV 8 feet away: 1080p = 54, 4K = 109, 8K = 218

    A 90" TV 10 feet away: 1080p = 53, 4K = 106, 8K = 212

    Conclusion: 1080p is good for small TVs relatively far away, 4K makes sense for a reasonably large or close TV, and up to 8K makes sense for monitors.

    https://qasimk.io/screen-ppd/
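
    If you'd rather not trust the linked calculator, the pixels-per-degree figures above are easy to reproduce; a minimal Python sketch, assuming flat 16:9 screens:

    ```python
    import math

    def pixels_per_degree(diagonal_in, distance_in, horizontal_px, aspect=(16, 9)):
        """Horizontal pixels per degree of viewing angle for a flat screen."""
        w, h = aspect
        width_in = diagonal_in * w / math.hypot(w, h)     # physical screen width
        angle = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
        return horizontal_px / angle

    print(round(pixels_per_degree(50, 120, 1920)))   # 50" 1080p at 10 ft -> 93 PPD
    print(round(pixels_per_degree(70, 96, 1920)))    # 70" 1080p at 8 ft  -> 54 PPD
    print(round(pixels_per_degree(70, 96, 3840)))    # 70" 4K at 8 ft     -> 109 PPD
    ```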

      • michaelmrose@lemmy.world

        The article title is basically a lie intended to generate clicks, written by pretentious people far stupider than the people who did the actual research, which is why the non-morons who did the research called it “Resolution limit of the eye — how many pixels can we see?”

        • faythofdragons@slrpnk.net

          You appeared to be complaining that OP’s title didn’t match the article title, and I was only pointing out the article’s title has changed since OP posted.

          My apologies if I misread.

  • ManosTheHandsOfFate@lemmy.world

    This finding is becoming less important by the year. It’s been quite a while since you could easily buy an HD TV - they’re all 4K, even the small ones.

  • OR3X@lemmy.world

    ITT: people defending their 4K/8K display purchases as if this study was a personal attack on their financial decision making.

    • treesquid@lemmy.world

      My 50" 4K TV was $250. That TV is now $200; nobody is flexing the resolution of their 4K TV, that's just a regular cheap-ass TV now. When I got home and set my new TV up right next to my old 1080p TV just to compare, the difference in resolution was instantly apparent. It's not people trying to defend their purchase, it's people questioning the methodology of the study, because the difference between 1080p and 4K is stark unless your TV is small or you're far away from it. If you play video games, it's especially obvious.

      • michaelmrose@lemmy.world

        Old people with bad eyesight watching their 50" TV from 12 feet away in their big-ass living room vs young people with good eyesight sitting 5 feet from their 65-70" playing a game might have inherently differing opinions.

        12' from a 50" FHD screen = 112 PPD

        5' from a 70" FHD screen = 36 PPD

        The study basically says that FHD is about as good as you can get 10 feet away from a 50" screen, all other things being equal. That doesn't seem that unreasonable.

    • Nalivai@lemmy.world

      Right? “Yeah, there's a scientific study about it, but what if I don't read it and just go by feelings? Then I'll be right and won't have to reexamine shit about my life. Isn't that convenient?”

    • michaelmrose@lemmy.world

      They don't need to; this study does it for them. 94 pixels per degree is the top end of what's perceptible. On a 50" screen 10 feet away, 1080p = 93 PPD. Closer than 10 feet, or larger than 50", or some combination of both, and it's better to have a higher resolution.

      For millennials, home ownership has crashed but TVs keep getting cheaper. For the half of motherfuckers rocking a 70" TV that cost $600 in their shitty apartment, where they sit 8 feet from the TV, it's pretty obvious 4K is better, at 109 vs 54 PPD.

      Also, although the article points out that there are other features that matter as much as resolution, these aren't uncorrelated factors. 1080p TVs of any size in 2025 are normally bargain-basement garbage that sucks on all fronts.

    • nek0d3r@lemmy.dbzer0.com

      Even 4K is noticeable on monitors (though probably not much beyond that), but this is referring to TVs that you're watching from the couch across the room.

    • Jarix@lemmy.world

      Going down from a 24" 2048x1152 to a 27" 1920x1080 was an extremely noticeable change. Good god, I loved that monitor; things looked so crisp on it.

        • ShinkanTrain@lemmy.ml

          If 4K is 4K because the horizontal resolution is around 4000, you'd think 1080p, with its 1920-pixel-wide lines, would be 2K. It's fucked that it isn't.

          • _g_be@lemmy.world

            “4K” is supposed to be a term for cinema widescreen resolution, but it got taken over because it's short and marketable, and because “4K is 4x as many pixels as 1080p”.

            What makes it worse is that 1440p then becomes “2K” because “it's 2x as many pixels”.

            The flip-flop irks me.

            • ShinkanTrain@lemmy.ml

              They shouldn't use numbers at all, tbh. QQVGA, QVGA, VGA, q(uarter)HD, HD, Full HD, QHD, UHD and so on work for all aspect ratios, and you can even specify by adding prefixes: FW (full wide) VGA, for example, would be 480p at 16:9. It gets a little confusing because the acronyms are sometimes inconsistent (and PAL throws a wrench into everything), but the system works.

              PS: I also don't like that 540p is called qHD because it's a quarter of Full HD.
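
              For anyone keeping score, here's a small lookup of the common 16:9 names and the pixel counts they usually refer to (a sketch; marketing sometimes deviates, and the 4:3 VGA family is left out):

              ```python
              # Common 16:9 resolution names and the pixel counts they usually mean.
              NAMES = {
                  "qHD":      (960, 540),    # quarter of Full HD
                  "HD":       (1280, 720),
                  "Full HD":  (1920, 1080),
                  "QHD":      (2560, 1440),  # quad HD = 4x 720p
                  "UHD (4K)": (3840, 2160),  # 4x Full HD
                  "8K UHD":   (7680, 4320),
              }

              for name, (w, h) in NAMES.items():
                  print(f"{name:>9}: {w}x{h} ({w * h / 1e6:.1f} Mpx)")
              ```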

      • rmuk@feddit.uk

        Yeah. They went from counting pixels by rows to columns. A 16:9 widescreen 1080 display is 1920×1080, and most manufacturers are happy to call 1920 “2K”.

  • ShinkanTrain@lemmy.ml

    Here’s the gut-punch for the typical living room, however. If you’re sitting the average 2.5 meters away from a 44-inch set, a simple Quad HD (QHD) display already packs more detail than your eye can possibly distinguish.

    That seems in line with common knowledge? Say you want to keep your viewing angle at ~40° for a home cinema: at 2.5 m of distance, that means your TV needs a horizontal width of ~180 cm, which corresponds to roughly an 80-inch diagonal, give or take a few inches depending on the aspect ratio.

    For a more conservative 30° viewing angle at the same distance, you'd need roughly a 60-inch TV. So 4K is perceivable at that distance regardless, and 8K is a waste of everyone's time and money.
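
    A quick sketch of that geometry in Python, in case anyone wants to plug in their own room (this assumes the viewing angle is measured across the screen width; 16:9 and 2.39:1 are shown since the result shifts a few inches with aspect ratio):

    ```python
    import math

    def diagonal_inches(view_angle_deg, distance_m, aspect=(16, 9)):
        """Screen diagonal (inches) that fills a given horizontal viewing angle."""
        width_m = 2 * distance_m * math.tan(math.radians(view_angle_deg) / 2)
        w, h = aspect
        return width_m * math.hypot(w, h) / w / 0.0254   # width -> diagonal -> inches

    for angle in (40, 30):
        for aspect in ((16, 9), (2.39, 1)):
            print(angle, aspect, round(diagonal_inches(angle, 2.5, aspect)))
    # 40 deg at 2.5 m -> ~82" (16:9) / ~78" (2.39:1); 30 deg -> ~61" / ~57"
    ```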