It’s 2021, and watching TV properly is much harder than it was 40 years ago
At the beginning of the 80s, watching TV held no mystery: you connected the set to the antenna and, at most, to a VHS player, turned it on, and that was it. Technology has evolved dramatically since then, and we now have more channels and content than ever, at a quality that would have been unthinkable back then. And yet it is increasingly difficult to watch TV properly.
The problem lies with an industry that floods us with devices to connect to the TV, but also with standards, parameters and options that are supposed to help us achieve the best picture and sound quality, yet end up being an enigma for many users.
How do I get the best picture for football? For video games? For movies?
Options are great, but when there are too many they end up causing stress and confusion. That is exactly what is happening with smart TVs, which present the user with all kinds of standards and settings so they can customize the viewing experience to their liking.
Nevertheless, adjusting those settings is tricky for most users, who (ourselves included) rarely touch factory defaults for things like calibration, and who get a bit lost when told about all those display technologies (MicroLED, MiniLED, QNED and Crystal LED are the current cream of the crop), which plunge us into a sea of doubts when choosing which set to buy.
Of course, it does not end there. HDMI cables can also cause confusion, both because of the versions of the standard and because of their supposed quality differences. Here the rule is almost always the same: with rare exceptions, very expensive cables do not perform noticeably better than normal ones (though beware of the very cheap ones).
Then there are other issues, such as the number of devices connected to our TV, which may also include sound bars, A/V receivers, or HDMI hubs and splitters that in turn make it harder to find what we want to watch and hear at any given moment.
Smart TV platforms (Android TV, Tizen or webOS, above all) try to put content from the different services at our fingertips, of course, but the problem is watching that content, including television channels, in the best possible way.
Cinema modes (with Filmmaker Mode as an extra option) or sports modes do provide a preset for each situation, but switching between them can become tedious. And the habit some manufacturers have of enabling motion smoothing by default, something filmmakers hate even more than users do, does not help either.
Dear smart TVs: you are not as smart as you claim
Something similar happens with other fully customizable preferences that give the user plenty of room for maneuver. The problem is that the user does not want that much room: if TVs are now supposed to be smart, why don't they adjust their own settings according to the type of content?
It is true that, thanks to the HDMI 2.1 standard, we have gained something in the gaming arena with ALLM (Auto Low Latency Mode), but the truth is that auto-detecting the ideal mode for each kind of content should be the TV's job (with the user able to customize it afterwards), not so much the user's.
My colleague Juan Carlos López, an expert in these matters, told me that even he, who knows them well, rarely touches those preferences except perhaps when playing video games.
For this “self-identification,” he explained, there are several possible options: the artificial intelligence systems manufacturers boast about so much could identify (or at least try to identify) the type of content and adjust to it. Alternatively, broadcasters could tag content with a data packet, a kind of “broadcast identifier” that would let the television adjust all sorts of settings accordingly.
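The "broadcast identifier" idea could be as simple as a lookup from a content-type tag to a picture preset, with the user's manual choice always winning. A minimal sketch, assuming a hypothetical tag format and preset names (none of this is a real standard):

```python
# Hypothetical sketch: a TV maps a broadcast "content type" tag to a
# picture preset. Tag names and preset names are illustrative only.
from typing import Optional

# Assumed mapping from broadcast tags to picture modes.
PRESETS = {
    "film": "Filmmaker Mode",
    "sport": "Sports",
    "game": "Game (low latency)",
    "news": "Standard",
}

def pick_preset(content_type: str, user_override: Optional[str] = None) -> str:
    """Auto-select a picture mode from the broadcast tag,
    but let an explicit user choice take precedence."""
    if user_override:
        return user_override
    return PRESETS.get(content_type, "Standard")

print(pick_preset("film"))                           # auto: Filmmaker Mode
print(pick_preset("game"))                           # auto: low-latency mode
print(pick_preset("sport", user_override="Vivid"))   # the user still wins
```

The point of the sketch is the precedence rule: the set adjusts itself by default, and customization remains possible on top, which is exactly what the article argues smart TVs should do.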
Issues like black levels, which matter a great deal for avoiding contrast problems, are joined by others, such as the variety of competing HDR standards. If my TV lacks HDR10+ but supports Dolby Vision, does it matter? Is what I'm watching really taking advantage of these famous high dynamic range systems?
Sometimes, less is more
The questions also multiply when it comes to remote controls. HDMI CEC technology is supposed to make it easy to unify several remotes into one, but things do not always work as we would like, and in the end we usually have several remotes scattered around the table (or the couch).
I, for example, recently discovered that wonder called the Chromecast with Google TV. After buying a new TV at home (a great one, with a ton of options I will probably never use), I ended up connecting this little HDMI dongle, both for the fantastic Google TV interface and for its remote, which is almost the pinnacle of remote controls.
The TV's own remote (in the image) is almost frightening compared to the minimalism Google proposes, and it makes me wonder whether manufacturers should not realize that perhaps users would like to have fewer options, not more.
Including a basic remote like the Chromecast's (or the Fire TV's, another great example) alongside a "complete" one like the one in the image would be a good solution, but no: in this, as in many other things, the manufacturers' approach seems to be to always offer many options, even if you never end up using them. Many buttons, many cable formats, many HDR standards, too many of everything.
The manufacturers of smart TVs, and of this entire ecosystem, should realize that another revolution is possible for their products: one that embraces "less is more," lets those theoretically intelligent TVs actually be intelligent, and ensures that watching TV properly does not require an engineering degree.
Image | Unsplash