I get why the TV manufacturers don't like the concept but TVs should really be dumb displays. Using something like a ChromeCast to get content to the TV makes so much more sense than embedding "intelligence" in a display.
It's the same mismatch of lifecycles that you get with entertainment (and navigation) electronics in cars.
It's interesting that Smart TVs seem to have two lives: the first as a Smart TV, and the second, later on, as a dumb display.
At least in some cars, you can still replace the ICE (in-car entertainment) if it accommodates single or double DIN units. The newer cars with hidden "brains" and a proprietary display, though - will that be the equivalent of having an 8-track player mounted under the dash in a few years?
It reminds me of a very good Mitch Hedberg quote: "An escalator can never break: it can only become stairs. You should never see an Escalator Temporarily Out Of Order sign, just Escalator Temporarily Stairs. Sorry for the convenience."
I've certainly seen escalators undergoing maintenance where the floor at the base was opened up and workers were standing inside. Wouldn't want to take those stairs in that case :)
And I've taken some escalators which might as well be broken when they "become stairs". I wouldn't walk up or down the St Petersburg metro escalators (most of the stations are >50m below ground).
Best case: "I don't seem to be making any progress going up."
Worst case: "This is moving down way too fast! Aaaahhhhh!!!" -screamed by hundreds of people, many of whom will die as they pile up at the bottom of the escalator.
Now imagine the brakes failing on an escalator that leads down into a subway with an electrified rail.
The weight of your body will set the escalator in motion and it will keep accelerating. As you approach the bottom end at ever-increasing speed, you will have a few moments to contemplate what a mistake it was not to take physics class in high school. Once you reach the end of the line, your journey will end in a rather unpleasant manner due to the high velocity: you could be injured very seriously or even killed.
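For a rough sense of scale: treating the runaway escalator as a frictionless slide, so that all potential energy becomes kinetic energy, gives an upper bound (real friction and machinery drag would slow this considerably). Taking the ~50m station depth mentioned above:

    v = \sqrt{2gh} = \sqrt{2 \cdot 9.8\,\mathrm{m/s^2} \cdot 50\,\mathrm{m}} \approx 31\,\mathrm{m/s} \approx 113\,\mathrm{km/h}

Even if friction ate half of that energy, you would still arrive at the bottom at highway speed.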
For me, a big part of the problem has been that smart TVs were anything but smart. They were slow, threatened to overlay ads, assaulted your privacy, had terrible UIs, and were difficult to use.
The usability issue is starting to be fixed. My daughter bought a cheap TV from Amazon that has the Roku software and it works incredibly well. The TV was something like $165 for a 32" display.
I don't know what the long-term prospects are for this. What happens if our LAN goes IPv6? Or Roku's agreement with Netflix changes? etc... I'm not terribly concerned, because in a few years a replacement will be $100.
The premium for the "smart" version of TV "x" is getting lower. For the 55" Roku TV I bought last month, it was basically $0. If it becomes a dumb TV in 4 years I won't be too bothered. The 2020 FooNexusCastWebFireBar will probably be better anyway.
> The premium for the "smart" version of TV "x" is getting lower.
That's great, as long as it isn't because they are making money on showing ads, replacing ads, or selling your viewing data (without proper disclosure, of course).
If we value our privacy? Some of us do, some of us don't.
I value my privacy, but I don't put a terribly high value on it. If Google or Apple wanted to track my TV habits with the promise of serving me better in some way, I would probably sign up for that in a heartbeat. But when my off-brand TV wants to, I would decline.
That may mean some products and services aren't easily available to those that put a high value on their privacy.
> That may mean some products and services aren't easily available to those that put a high value on their privacy.
Or we could decide that some things are too high a price to pay and not allow the businesses to include mandatory surveillance technologies. Are TV makers really going to stop making TVs if they aren't allowed to spy on their customers as well? Are TV prices really going to be lower with that spying going on if everyone in the industry is doing it anyway?
Regulation is what you do when competition no longer provides an acceptable level of choice in the market. What constitutes an acceptable level of choice may well depend on how important or even essential a certain type of product or service is, but in any case preventing a lack of real competition from causing this kind of customer-hostile race to the bottom is a valid use of statutory regulation IMHO.
Everything you say is true. I just think you value your privacy far more than the general public does. People seem to be willing to surrender an astonishing amount of data to get very little in return.
I personally have changed from being that guy who always used cash, who responded to every privacy notice that was mailed to me, who never signed up for loyalty programs, to really not caring about it anymore. For me, I see tangible benefits and only theoretical harm.
I certainly do value my privacy much more than the average person. I don't even have accounts on the likes of Facebook, for example. I am not even close to an average level of concern, and I fully acknowledge that.
On the other hand, I am also far more technically knowledgeable about the potential implications of privacy intrusions than the average person, and unfortunately I have all too often seen what can happen to friends and colleagues who aren't as careful as me and wind up victims.
The harm may only be theoretical in your experience, and I've no reason to doubt you. However, just thinking about my personal contacts over the years, I have heard of: lost jobs (multiple times); credit denied for a mortgage; a gay friend living in my country, whose society back home is still hostile to homosexuality, being publicly outed by mistake; identity thefts (multiple), including a huge financial loss to a company I sometimes work with, where someone authorised to make large payments was very carefully impersonated using a range of information gathered over a considerable period of time; leaks of potentially embarrassing healthcare records (multiple times, from multiple sources); cyberstalking (really creepy remote access to unsecured in-house devices like baby monitors and cameras); and a couple of big surprises (on the scale of a marriage proposal) spoiled by careless disclosures. I'm not aware of any personal contact of mine who has been a victim of something like burglary, or having their children taken, as a result of a criminal discovering and exploiting their household routines, but obviously these aren't out of the question either.
Now, obviously I'm not suggesting that someone spying on your TV habits is necessarily going to lead to harm on that scale (though I note in passing that if you hold a public office and watch the "wrong" TV programmes then it's not entirely inconceivable, hence laws like the Video Privacy Protection Act in the US). But the risks from excessive surveillance, including the collection and use of incomplete or inaccurate data, are not theoretical at all. If you've never known anyone suffer the kinds of problems I mentioned above then I'd guess you and your friends have been extremely lucky; it's not like this stuff happens in my social group all the time, but I'd estimate that I've heard about at least one or two serious problems a year over the past decade or so.
All of this affects my opinions, naturally. However, perhaps the thing that sways me more than anything else is that I've been noticing recently how much some people are put off by privacy intrusions if they are better informed about what is really happening and the implications. I think a lot of people simply don't understand what they are allegedly consenting to or what might happen as a result, and sometimes you can't even see the results. After all, how do you know whether your insurance premium has been increased, your credit limit affected, or your job application denied, on the basis of information you never knew someone else had and never had any opportunity to check or refute? Realistically, I don't think it's possible to educate everyone sufficiently on these issues for them to protect themselves as much as they might choose to if they were fully informed, any more than you can make everyone an expert on how financial services work or the shady practices of used car dealers. This is why we delegate some amount of protection to people who do have the time and resources to become experts, and who do have a mandate to balance allowing businesses to do business against looking out for the interests of society as a whole.
You mean like those Samsung TVs that scanned your network shares even if you told it not to and sent the names of all the files it saw to a server somewhere?
It's not the TV manufacturers that don't like the concept; it's the consumers who don't like it.
Every TV is a "dumb display"; consumers wanted more features.
Sony, LG, Samsung and whoever else would love nothing more than to sell you only a panel with video input ports, but that is not what people want to buy.
People are constantly buying the most feature-loaded TVs, even at the expense of panel quality, screen size and various other things.
Do you have any evidence of this? I've never ever talked to anyone that wanted a feature like this in their TV. The manufacturers have incentive, but the consumers aren't asking for it.
And -all- of the interfaces built into TVs for such things that I've used are terrible. Hell, most TVs still can't make a remote/on-screen UI work reasonably. People can talk to their voice agents ("hello assistant, who is the president of the USA") on the toilet and generally expect a reasonable answer. And yet we still have to point some wand at an exact spot on our TVs, press buttons, and hope they register and change channels.
I actually have a 2012 Bravia. It has a webcam and a Skype app. When we get a Skype call, a little popup appears in the bottom left corner, and I can accept/decline with the remote. I can then carry out a video call in either full 50 inches or composited with the TV signal. Coolest thing ever.
Digital recording is a great feature too: plug in a USB stick for storage, then control and manage recordings with the remote. I actually like to have everything integrated, rather than having to fiddle with my OSMC, which, in the case of recording, would need its own signal input.
My only (yet pretty big) issue with the Sony software is that it is slow to start up. From turning on the TV until it responds to the remote is easily 40-50 seconds, long after the picture appears. You can't even change the channel or volume.
And, oh yeah, the YouTube app has been broken for years, but I use the OSMC for that.
> I can then carry out a video call in either full 50 inches or composited with the TV signal. Coolest thing ever.
So can my XBox One :). I agree that this is a nice feature, but there is no reason why this can't be part of an external replaceable/upgradable peripheral. I'm also kind of surprised that Apple and Google haven't created an add-on for the Apple TV/Chromecast yet; then we could give one to our daughters' grandparents as well ;).
What happens when you're on a Skype call on the TV and want to move to another room? I guess it's like with the old wired phones, you don't move. A 50" display is cool but a tablet or a phone are usually better suited to answer calls.
The only smart feature I'm using of my LG TV is playing videos from USB sticks. It doesn't have some audio codecs common in MKV containers, but when it works 40" is better than a tablet screen. BTW, the LG is quick enough at starting up. I usually keep it completely off, not in standby. Switch on the power, push the remote, get the image: 9 seconds. Maybe on par with old analog CRTs.
I never enabled WiFi because of the terms of service. Even if they are not as invasive as ToS I've seen from other manufacturers, they are way more invasive than they should be. I should use the TV to watch things, not to be watched. I already have the PC, tablet and phone for that :-) and at least there I can fight back with adblockers and the like.
I have a Raspberry Pi 3 connected to the TV and I could do YouTube with that (it's quite OK). Still, the tablet is much better. I could Chromecast to the TV but it's not worth the trouble.
If I'm sitting on the sofa and answer the call from there, it usually means I am fine staying put. We usually use it for family abroad, and here the format has the obvious advantage that the whole family can gather around the TV and see/be seen in the call.
I agree that a lot of features are generally pretty useless... however, I love that I can download the Netflix app to my TV directly. It saves me having to use a second device, and I can switch to another VOD provider if my TV outlives Netflix...
My "smart" LG TV has a Netflix app. Still have a Chromecast hooked up for Netflix. Why? Because it is several orders of magnitude faster for me to find the video I want to play on my phone than it is with a TV remote.
This same wisdom holds true for TV remotes that have full QWERTY keyboards! TV interfaces just plain suck. They are so slow as to be painful and they are perpetually out of date and always user-unfriendly.
I really hoped Apple would come out with a TV that set a baseline for what a TV interface should be just like the original iPhone did for smart phones (which weren't all that smart at the time).
Maybe so, but I never personally asked the TV manufacturers for a smart TV.
Who I think came up with the idea: TV manufacturers.
Selling hardware with software/firmware that will become obsolete in 4-8 years will result in more TVs sold in 4-8 years than would have been sold otherwise.
We just got rid of a TV that we'd had since the late 1990s, about 18 years. It was not smart. We just bought a smart TV and I'm betting we won't get more than 8 years out of it.
Exactly. My TV was connected to my network for initial setup then promptly disconnected and used as a dumb display. I don't trust other people to keep devices on my network patched and up to date even during their supported lifetime.
The Roku is built into this TV such that I must use the Roku OS and remote when the TV comes on. It is much slicker than the other crap smart TV interfaces I've used, but when it goes out of style, that means I'll have to use one Roku remote to turn the TV on, browse to the port of the other one, click ok, and now I'm in the second Roku OS. That won't be as nice. Still- for now, it's fine, which is why I bought it.
If I could have gotten the same TV without any smart features and an external Roku, that would have been better long-term, but almost all newer TVs now are smart.
> If I could have gotten the same TV without any smart features and an external Roku, that would have been better long-term, but almost all newer TVs now are smart.
It's important to remember that "TVs" are just "displays whose bundled additional features include at least a TV tuner".
Unless you absolutely need a TV tuner, you can get displays that include neither a TV tuner nor smart features, but they aren't sold as "TVs" because they aren't.
If you were OK with a TV for 18 years you would be OK with another TV for 18 years.
This is simply a fallacy. The fact that the software becomes obsolete doesn't make the TV obsolete, as long as it can still receive a video input and display it. This has nothing to do with the "smart" features of the TV.
Here in the UK, the BBC has something they call the "red button" feature - most of the time, it just brings up "teletext 2.0" with some news headlines etc., but when there's something like the Olympics on, you can use it to switch between coverage of different events.
On my "dumb" TV that is not and never will be connected to the 'net, this feature works fine and has done so for years. On my friend's "smart" TV, the red button feature has been replaced by an "app" that we had to update and fiddle with to get it working, and it won't do anything at all without an internet connection despite the key feature just being a kind of switch between two sub-channels, both of which are served over the usual aerial/cable. You can't even turn this off to get the normal red button back.
That works on Freeview by broadcasting extra channels. They have something like 2 to 4 slots, I think. It's nothing more than smoke and mirrors really, and there is still an element of software getting in there - it just switches you to the appropriate channel in the background.
There's not much smoke even. The red button channels are normal channels at 600 and 601. If the red button is broken you can switch to them directly. And switching directly is usually quicker than navigating the red button nonsense.
I have one of the "newer generation" smart TVs driven completely by Roku's interface. In other words, you turn the TV on with a Roku remote, and when the TV comes on you see the Roku interface, which you must use to choose which HDMI or game port to use, etc.
I like the interface a lot, but I bought it knowing that at some point, it will not be as cool as it is now. It will probably get in the way, as I'll have to use the Roku remote to browse via the old Roku OS to the port with the box that is the new browseable streaming OS. Maybe the Roku one will get borked by a system update because my hardware is unsupported- I don't know; I hope that doesn't happen, and have had a good experience with my old Roku external box, but it is a possibility several years from now.
Years ago when I bought a TV, never did it cross my mind that "one day the user interface is going to be obsolete".
>If you were OK with a TV for 18 years you would be OK with another TV for 18 years.
A TV manufactured today is very unlikely to last 18 years without some kind of hardware failure. There is a big difference in complexity/points of failure and cost-cutting between 1998 and 2016.
This is wrong. Modern TVs follow the standard 'bathtub' curve for failures.
If the TV hasn't had any faults when new, it's not very likely to develop new ones, as it hasn't got any moving parts or things that wear out.
The exceptions are capacitor quality and pixel burn-in, but even after 18 years they might work; you just won't like the picture quality.
Anything manufactured with some process control should follow a bathtub curve for failures. It's the width of the tub that matters. I don't know if LCDs have been on the market long enough to have much data for end-of-life failures (probably not 18 years?) and that data probably wouldn't be publicly available.
It's hard to believe that TV manufacturers are shooting for an 18-year mean lifetime. They might work after 18 years, but it is unlikely.
Maybe. I'll accept that TV manufacturers have convinced many consumers to pay attention to the arms race over features. But were consumers really clamoring for 3D, for example?
I haven't been in the market for a TV for a few years, but I'm not convinced that a lot of people see more apps on a TV as a critical selling point.
> But were consumers really clamoring for 3D, for example?
No they weren't. IMO it wasn't worth the extra money. After all, just how many times can someone re-watch Avatar? Most other 3D movies weren't actually shot in 3D, they were just faked in post production.
Here's a headline that shows just how dead 3D is:
> With a bullet to the head from Samsung, 3D TV is now deader than ever
> It may shamble forward zombie-style for a few more years, but without the world's No. 1 TV maker on board, 3D TV is doomed.
3D isn't really that much extra money though. If you've got a quality panel, you already have a high refresh rate. So all you need to add 3D, beyond a bit of software, is polarization or active glasses. It adds so little to the cost that it doesn't make much sense for manufacturers not to include it on high end TVs.
Saying that 3D is "doomed" because one manufacturer is not bringing out new 3D models this year is, IMO, ridiculous. Eventually, of course, it will get much better, with the no-glasses, head-tracking setups that are already in development. So 3D as we know it now won't last forever. But it's not like it's going away. Look at the percentage of movies released in 3D today compared to 5 years ago.
Do you have to stay in front of the TV, or can you move around the room and do whatever you have to do? I almost never stay put in front of the TV, even when I watch movies. It's the advantage of being at home. I think 3D is really dying. Maybe VR headsets will revive it, but without TV screens. Or, in a shorter time frame, the head-tracking technology you're writing about. But the problem is that having to stay in front of a screen in a fixed location of the house to watch something is becoming an alien concept, the same way it happened with having to be there at exactly the right hour to watch a show.
I actually have a 3D TV that I got on a pre-Christmas sale. My story is that I didn't really pay a premium for the 3D aspects of the TV (which included glasses) and I'm sticking to that story :-) I think I've maybe watched a half dozen 3D movies on it.
I've watched a few films that did a good job with it. But they're few and far between.
The reason most people avoided them, especially the "passive" ones, is that the "filter" they used on the panels made the normal TV experience slightly worse if you are pedantic about image quality.
The image tends to be a bit fuzzier and dimmer than on 2D displays, but again, it's a compromise. Some of the latest 3D TVs are pretty much indistinguishable from 2D-only panels. Overall, though, it was the lack of content that killed it rather than the performance of the TV itself.
Sure, you still have some viewing-angle issues, which might make some living/TV room setups less than optimal for 3D. Most TVs also initially came with only 1-2 pairs of glasses, and if it was an active 3D setup (the glasses had an active switching component inside them), they could be somewhat expensive. But if the content was there, it wouldn't matter.
If there were more 3D Blu-rays and DVDs out there, and especially if Netflix or any other streaming service had supported 3D content (3D streaming is still not quite a solved problem), the outcome would've been different.
Overall, once 3D content becomes easily streamable (and that's being worked on) and glasses-free 3D/stereoscopic panels become slightly cheaper, it might make a comeback.
Avatar was a nice experience, but even in the cinema many people avoid 3D at the moment simply because it's still not there. These issues will be solved eventually, and when all content goes through stereoscopic processing, it will look pretty nice :)
Mine is a Panasonic Viera that I got a year or so before they stopped making Plasmas. It's a nice panel even if, in retrospect, I might have gotten something less reflective for the room it's in. (Still, I use it very little when it's light out.)
If there's a 3D Blu-ray of a movie I genuinely want to watch, I'll pick it up, but there are very few. Funnily enough, I've never bought Avatar, as I saw it in IMAX 3D in the theatre and was pretty much "That was an impressive experience, but I don't really need to watch an echo of a not-so-great movie again at home."
There always seems to be a scene in which a random piece of shrapnel or some other object flies directly towards the viewer's face. I thoroughly dislike those; you don't have to remind me that we are watching a 3D movie.
There are always the gimmicky scenes that are so clearly added in for the IMAX and 3D "experience." My conclusion is that 3D must only be effective for chaotic and nonsensical chase sequences.
I've always been very meh towards 3D, but found that it works quite well in a VR cinema. The 3D effect is more pronounced, more natural, and there are none of the downsides (blurrier/dimmer image). On the other hand, the 2 hours with a DK2 on, watching the Jurassic Park remaster, wasn't very comfortable ergonomically. A great deal of potential if they reduce the weight, however.
They can still do it, but most people don't use a Chromecast; most people want the YouTube, Netflix, whatever logo on the box.
And they sell a TV with 5 HDMI ports because their competitors sell 4; this is what happens when you have a feature arms race. There are only, what, 2-3 panel makers, so most TVs are pretty identical (like cars, where there are only a few chassis/wheelbases for each price range), so to gain an advantage they load the TVs with other features: more ports, better WiFi, more this, more that (just like cars, which sell you BS things you don't need).
Chromecast doesn't have to be a literal Chromecast. It just has to be some kind of dongle that uses a standard interface (e.g. HDMI, and maybe USB for remote command handling), and can be upgraded on its own.
Of course, the TV manufacturers don't want to do that, because they profit when you have to upgrade the entire TV.
They don't want to do it because consumers would not accept it as a solution: the minute a competitor ditches the dongle and puts those features inside the TV, that competitor has the "advantage".
"Sunny" sells you dongles that take up a port; buy "Symsyng" and have all your ports free for other devices.
If you are an advanced enough user that you can use a streamer/dongle/set-top box or whatever, it doesn't matter what the TV has, as long as it comes with display input ports.
For the majority of the shopping channel, BIG BOX mart consumers, this isn't really an option; they want one box that does everything, even if it's not the most optimal solution.
What you are asking for is simply something that the TV makers have no reason or interest in doing; not because they are greedy, but because they don't want to deal with it.
They aren't stopping you from buying whatever dongle you want, so what exactly is the complaint? The majority of the cost of the TV is still the tuner, panel, and LED and optical assembly; the $11 SoC they use for the smart features won't make the TVs cheaper.
If removing those features somehow allowed them to focus on more important things I would understand, but in reality it doesn't add up to much, if anything.
> the $11 SoC they use for the smart features won't make the TVs cheaper
And that's the good news in all this.
I can buy and use the TVs of my choice and the reality is that I don't really need to pay a material premium because they have YouTube which may or may not continue to work after some day in the future. So I can choose to ignore those features if I want. It's not really a choice between buying SmartTV for $X and DumbTV (which is otherwise equivalent) for $X/2.
My concern about the "smart" TVs isn't things I can ignore, it's things I can't.
For example, I don't want a TV with its own network connection, a built-in camera and microphone, and hideously insecure firmware right in my living room.
Literally the only controls I ever use on my TV are power/standby, volume, and input selector. Everything else already comes from external sources like a set top box or PVR or Chromecast, and it's not like only A/V nerds have that arrangement.
I just want good video, good audio, and decent manufacturing quality and longevity. How on earth did we lose the plot as badly as we seem to have with the "smart" marketing?
Unfortunately, being able to see and hear the actual model you're thinking of buying is rather important if you care about audio and visual quality, and certainly in my city, even the relatively high-end stores have been full of "smart" models lately.
Also, the next logical step for "smart" devices is to establish their own wireless connections via alternative channels if they aren't connected to a household network. Privacy is fast becoming a thing of the past if you let anything with sensors onto the premises, and short of wrapping their home or office in a Faraday cage, there is disturbingly little anyone can do about it as more manufacturers reach agreements with network providers to handle their phone-home data needs.
So do what I did: don't connect the TV to your network. I bought the cheapest smart TV with a screen that I liked. That was some 50 Euros more than the most expensive dumb TV of the same size. It works well, and I watch YouTube on my tablet, which has more pixels and at arm's length is as big as my TV.
What advantage? Guy A gets TV with dongle pre-connected, turns it on, things Just Work. Guy B gets TV with the same functionality embedded directly, turns it on, things Just Work. What's the difference?
If you want to make it really foolproof, make a mounting bracket for the dongle on the TV itself, such that it would require mucking around with screws to remove.
The company integrating it doesn't have to pay for the connector, for soldering it into the system, or for routing the signals from wherever they are on the main board to the connector. He just solders the chip that would be inside the dongle onto the motherboard, in a location that is convenient for him.
A few generations in, he doesn't even have to do that soldering; the functionality will be integrated with something else on a single chip.
That makes the integrated TV set cheaper to produce. Guy A pays more.
Margins are paper-thin for televisions, so that takes most of the market. Eventually, guy A can't even buy that set.
Going from (what would now be considered "old school") an original Xbox running XBMC to a friend's PlayStation 3 with DLNA felt like a monumental step back.
With the Xbox, I simply mapped a Samba share (something like \\NAS\XBMC\) full of content, and it brought everything up without issue and simply played it.
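For anyone who never used it, that mapping was just an entry in XBMC's sources.xml; a minimal sketch, reusing the share name from the example above (the exact file location and tags varied between XBMC versions):

    <!-- userdata/sources.xml: one video source pointing at the SMB share -->
    <sources>
      <video>
        <source>
          <name>NAS</name>
          <!-- the smb:// form of \\NAS\XBMC\ -->
          <path>smb://NAS/XBMC/</path>
        </source>
      </video>
    </sources>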
With DLNA, meanwhile, every time something changed I'd have to restart the whole thing to regenerate the "index". It also played maybe half the file formats XBMC did, despite being newer and an order of magnitude more powerful.
Things have finally more or less come full circle again; these days I can run "Plex" on my NAS to just spew out media to everything, as long as it runs the Plex app anyway (which covers my Xbox One and phone, at least). The only downside is having to pay for it, of course.
Because there is absolutely nothing that confuses many people more than having more than one remote! My mother still cannot figure out how to change the volume on my dad's home theater setup.
What devices do you have where HDMI CEC works remarkably well? (And what year are they from?)
My experience was that cross-brand interop is not quite there, but beyond that, the experience is so inconsistent it doesn't matter. If configured to do it, turning on the Blu-ray player would consistently turn on the TV, but only sometimes set the right input. That kind of thing.
I mean, everyone I've asked just wants their Microwave to have two dials, Power and Time, yet no such thing exists on the market. Instead it's a sea of bullshit Microwaves with bullshit "value add" features.
I'd wager the same is true of TVs, and myriad other consumer products. The end-users just want simple tech that works for years on end, while Product Managers and Designers want to boost their own status by shepherding these mindless "improvements" into production.
Not entirely true. At work we have a microwave oven with a single dial for time (I haven't bothered to check if it has power settings). It also doesn't rotate.
It's branded as a "commercial microwave oven" or something.
People would flip their shit if their microwaves didn't rotate anymore, for example. But what "Value Add" features are you talking about that are non-trivial to implement?
It's not a microwave, but... I've lost my fridge manual and have no idea how to set its clock anymore - so it's just there, blinking, for eternity.
Yes, I bought this fridge model because of "features". I even did a feature comparison while searching for a model. The caveat is that all the features I looked for were related to chilling things, or easy maintenance.
I would pay extra to get rid of the clock - and more still for different software that displayed temperature instead of time.
This is pretty much the best argument I've heard on this issue. I always hear "people want this", but there are "features", like "overscan on HDMI" that nobody wants, and yet these features exist. It's the manufacturers who are pushing shit down the throats of people.
I think it is more a matter of manufacturers wanting to sell TVs sooner than a 5-10 year life cycle would allow. To do that, you have to come up with new features to try to entice buyers.
But if the average consumer compares the Sony and LG 'smart' TVs, with YouTube and Netflix and 47 other apps, to a (by extension) 'dumb' Samsung display, they won't pick the Samsung, even if it has better panel quality that can't be readily quantified or seen on the hypersaturated demo.
Well, you and I might buy the Samsung, but most consumers aren't as informed or as savvy with a Chromecast.
If I'm buying a TV for my mom neither would I.
If you want a "dumb" display, then buy a display; it's a product category of its own, and virtually every company that makes TVs/panels sells them, e.g. http://www.necdisplay.com/category/large-screen-displays. These are the ones they sell for display purposes, for advertisement, boardrooms, etc., but they are not "TVs". Some of them have slightly better panels than some TVs; some of them don't.
They're also much more expensive, have fewer input options than my current (dumb) tv, and have some remote configuration options that would more conveniently be put into menus, in a consumer device. So...what's the point, besides pigheadedness over not buying a "smart" TV? It's not like there's a way to even go see what I'm buying before I do it.
I'm not a big fan of smart devices, but I'm not going to pay more for a display with "meh" connectivity that doesn't really seem to be optimized as a home entertainment device. Although annoying, it makes more sense to buy the smart TV and ignore the extra features.
Again, if manufacturers engage in the most basic 'mine is bigger than yours' playground games, then it's not surprising that the users, who love shiny things, are not making a decision based on fact. If, instead of releasing a new-shaped box every year, they really focused on a consistent product line, then they could deliver incremental improvements and show users over time why their product is the better 'viewing experience', which is what it's all about, really!
1. High quality LCD panel
2. 'good enough' speakers (optional)
3. 4+ HDMI ports
4. no TV tuner, composite, component, etc.
5. Proper working HDMI CEC support
6. Make it as 'always on' as my MacBook or Apple Cinema Display – hit a button or activate a display and the screen comes on instantly.
For the premium model, add:
1. Ambient light detection to adjust brightness
2. Embedded RPi-like (or actually RPi) on a fifth 'overlay' input (i.e. the HUD/guide/etc). Make it open source, let people hack it and do fun stuff with it, build widgets, whatever.
Don't sell me on 'Your TV can do Netflix now!' Every device in my house with an LCD can play Netflix. My iPad from 2010 can play Netflix, and it can do it better than most smart TVs can. The only thing I own that uses 2.4 GHz that doesn't play Netflix is my microwave, and if it did I might actually use it.
If I buy a $70 AppleTV and it becomes obsolete, outdated, or can't support what I want, I can buy another $70 AppleTV. If I buy an $1800 SmartTV and the 'Smart' features become obsolete, outdated, or can't do what I want, then 50% of my value is gone.
I totally agree; they have no plans to maintain 5-year-old TV apps...
I love my 4-year-old Sony TV. Sure, none of the pre-built apps work anymore... but our Blu-ray player does all of it, and better... the screen works great, so I'll keep it till that part dies.
I agree with this. There was a time when smart TVs were useful, but that is dying out. The other big issue with smart TVs is that they try to put in the lowest-powered SoC that they can, and tiny amounts of RAM.
They should all just be monitors now. The cost of a Chromecast, Raspberry Pi, or Apple TV is so low that it makes smart TVs redundant. I actually don't understand why these manufacturers haven't moved to making their own versions of casting/streaming devices. I remember when I bought my Bravia a long time ago it didn't come with WiFi, but you could buy their wireless USB device for AUD$80 (they blocked all others). Why don't they do that again, but with a Chromecast-like device (without blocking out 3rd parties)?
I don't disagree, but I've gone the opposite way.
Just last week I replaced a 46" Samsung 1080p set from 9 years ago with a brand-new top-of-the-line Sony 4k set that has all this fancy AndroidTV stuff I neither knew I wanted nor really understood.
Thing is, it's the ONLY way I can get 4k content.
I have an HTPC, WDTV, AppleTV, FireStick, PS3, and TiVo all plugged into my receiver (which was a very high-end unit just 1 year ago). However, while the receiver supports 4k, it does NOT support HDCP 2.2, which either is required for 4k content, or WILL be once they start using HDCP 2.2. (I'm not quite clear on this point... I don't know if today's UHD Blu-ray discs use HDCP 2.2.)
And I'd argue that the SmartTV stuff is not really a reason many people buy a TV; regardless, having it does not remove your ability to use external devices in the future. If Sony drops my AndroidTV apps in 4 years, it probably won't much matter because the devices of the future will do far more for $35 than the software on the TV.
It's a frustrating situation, but, IMO, the Smart TV stuff only adds value (unless you're talking about the microphone in my remote and the possibility of said devices being used to backdoor my network, of course).
I forgot to add, new things pop up all the time that you don't know you want until they arrive. I noticed Google Photos being advertised on TV. I gave it a go on my iPhone and notice a Google Cast button. I don't have any of Google's streaming devices yet, nor do I have an HDMI port free on my receiver to support another device. Well, I hit the button, and what do you know, it automagically knew all about my TV and displayed my pictures at the touch of a button. Sometimes having fancy new stuff is cool, and you get added features you never expected. I sure wouldn't buy a Google streaming device just for this one novelty, but I'm not going to complain about having it.
Regarding the latter part, I've noticed the same thing on my Sony Bravia TV as well. I believe the TVs may have a sort of "inbuilt Chromecast": I can cast YouTube and Netflix to my TV, and it will play the content via the TV-native app. I found this a LOT more convenient than having to deal with the Chromecast per se, since I can't really control the TV much with the remote control once the Chromecast is in play.
Yes and no. Support varies by the "app". Currently Netflix/PlayMusic works great with pause/stop, but no support for other buttons (next track, fast forward, rewind, etc...). I mostly just want pause though, so it works for me.
> I actually don't understand why these manufacturers haven't moved to making their own versions of casting/streaming devices.
Well, because streaming as a service to a customer is largely about the content and about customer support. This is an area that Internet providers control exclusively (over here, anyway). Every triple-play offer has a TV part which comes with a set-top box, and a random streaming device simply cannot compete with "state-supported" traffic (CDNs in close proximity, multicast channels).
Why do the ISPs provide their own set-top box? 1) They are trying to reduce the volume of calls to the hotline, and this area of tech can generate many; 2) they use the customer-premises device as a shop front for selling even more VOD content. A streaming stick would require lots of customization per ISP in order to address these points.
>They should all just be monitors now. The cost of a Chromecast, Raspberry Pi, or Apple TV is so low that it makes Smart TVs redundant. I actually don't understand why these manufacturers haven't moved to making their own versions of casting/streaming devices.
The vast majority of TV viewers do not use and do not even understand how to use these devices.
Chromecast/Firestick is at least somewhat usable for the masses, and some of the other streamers are more complicated, but do you really expect some random person to buy and build a RasPi streaming set-top box?
The TV makers can't take out features, because people would not buy that TV. People want features, and these features are what the guy at the store can sell TVs by.
And for Samsung/Sony and the like, getting into the streamer business is pointless; it would not change the issue, it would just move it to another component, albeit a considerably cheaper one.
Streaming devices also need to be constantly updated; many of them need to be updated more often than the TV. Their SoCs become obsolete just as fast, if not faster, and YouTube can make changes tomorrow that would make the current version of the Chromecast incompatible as well.
If you are an advanced user you don't care about all that crap. My best TV by far is still a, what, 7-8 year old Toshiba 42" with a 10-bit panel, which was actually made in Germany. It's as dumb as you can get (it can show images, play MP3s, maybe some MPEG), it has a PCMCIA slot which I used to hack into it, and the image quality is still better than that of virtually every TV out there that doesn't cost 1500 GBP.
But I can use it because I'm an "advanced" user: I run my own streaming and transcoding server and have a media center PC. For someone who wants to turn on the TV and get Netflix with no hassle, this isn't really an option. Even the Chromecast isn't ideal; the quality isn't that great, and it can lag pretty badly on congested networks.
The more expensive streamers give you quality that matches the TV-native app, but they also tend to cost considerably more than the cheap sticks, unless you buy the unbranded Chinese Android SoC sticks.
The point is that TV manufacturers can still sell all of those features in the box, but provide them as a dumb screen plus a separate device that's replaceable when it becomes obsolete before the screen does.
I think a key competitive feature would be to have their own device integrate better than third-party devices. A USB connection to share commands from the TV's remote, for example. Maybe integrated menus of some sort.
If the add-on device is fairly cheap, and major upgrades require replacement rather than just software, the TV manufacturers could get more upgrade sales than they can get for whole-TV upgrades.
But this isn't mutually exclusive. You are complicating a design for no reason; 99% of people would never upgrade either, and if it's too complicated they wouldn't want to use it.
You can buy any add-on yourself; they work with every TV, and you don't have to worry about brand compatibility or lockouts. Why ask the TV makers to make the market worse?
The minute that Sony or whoever starts selling dongles is the minute they block all other dongles from working on their TVs, and everyone else would do the same.
As for the TV remote thing, this already works. HDMI-CEC is a protocol that allows you to send commands from the TV to the device over HDMI, and virtually every TV with HDMI that I've had for the past almost 10 years has supported it. I have several streamers at home that all work with it without any issues.
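To give a feel for how simple CEC is to drive, here's a minimal sketch that pipes commands into libcec's cec-client tool from Python; it assumes a Linux box (e.g. a Raspberry Pi) on a CEC-capable HDMI port with the cec-utils package installed:

    import subprocess

    def cec(command: str) -> None:
        # '-s' is single-command mode (send one command, then exit),
        # '-d 1' keeps cec-client's log output quiet.
        subprocess.run(["cec-client", "-s", "-d", "1"],
                       input=(command + "\n").encode(), check=True)

    cec("on 0")        # logical address 0 is always the TV: power it on
    cec("as")          # declare ourselves the active source (switches the TV's input)
    # cec("standby 0") # puts the TV back into standby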
Every TV is a dumb panel. Removing the "smart" features from it won't make it any cheaper; those SoCs are cheap as dirt. You gain nothing by removing them, and you lose the simple crowd that doesn't want to add other devices and worry about setting them up or figuring out how they work; they want to turn on the TV, click on Netflix, and that's it.
When it gets to the point where it doesn't work anymore, they have the choice to buy a dongle or buy a new TV. They do not have to buy the TV; there is no commercial trickery or anything that forces them to buy a new TV over a Chromecast or a Roku or a Fire Stick or whatever.
Selling a high-end "gimmick" that forces people to pay separately for features a TV needs to work, like speakers, because you can sell a few rich yuppies on the idea of a minimalist design, is a nice anecdote but nothing more :)
How many people can afford to spend that much every couple of years on a new TV? I'd bet most TVs in most US homes are 5-15 years old. My TV, which happens to be a Bravia, is 13 years old and still works perfectly.
EDIT: for what it's worth, my TV predates smart TVs by a year or two. I use an old TiVo with a lifetime service plan and a Roku (3rd gen I think, upgraded once) for the smarts.
This. No matter how you spin it, shipping and handling large TVs is costly. Even a small % of failed units is going to cost you plenty for shipping returns alone.
And the margins on TVs are as slim as the TVs themselves. I think the likes of Sony could get the same margin on a small 'smart' box packing something like a PS3 equivalent, mounted in a tiny enclosure behind the TV.
There shouldn't be any lifetime mismatch by now. A first-release Raspberry Pi is powerful enough to drive a full HD TV nowadays, and while it takes a more powerful computer for a 4K TV, this is not something that changes with time while you are not looking.
The only "lifetime mismatch" that still exist on smart TVs is manufacturer DRM that is fulfilling its design goal of not letting you use your device.
TV Manufacturers actually subsidize the cost of the TV by adding user tracking features into their smart TVs. The on by default "features" gather valuable marketing data that companies can then sell to the highest bidder. That's why all TVs these days are smart, even the super cheap ones.
The best thing you can do with a Smart TV is never connect it to the internet.
I have a NUC at home, but its power as a computer seems wasted on being the smart part of a TV. So I use a Nexus Player for the interface, and the NUC runs the DLNA server and lives its life as a full-fledged desktop PC.
There's no interface for the television. It just has Chromecast built in.
A downside (of the M-Series at least) is that they don't come with a tuner installed. If you want cable or an antenna, you'll have to dish out $100 more for a tuner.
Exactly this. I would absolutely love one of the new high-end LG OLED panels with none of the smart TV features and no network connectivity (privacy & security). I'm really curious how big a market this would be.
No, you wouldn't. There are many niche 'simple high-end phone' products, the majority of which fail, despite forums full of people claiming 'take my money', 'I'd be first in line', etc.
The reality is that these products are as expensive as or more expensive than regular products (because of small batches), and everybody wants a different 20 percent of the full functionality.
My recollection is that TV manufacturers didn't even want to include digital tuners until the FCC required them to. Of course, that tuner isn't doing me any good now that the FCC allows Comcast to encrypt broadcast channels.
Unless of course you want to watch local television, in which case you would need a digital tuner. That is why the FCC requires the tuner. There were subsidized coupons for tuner boxes when the switch was made, so that everyone and their grandparents could continue watching television.
Unfortunately, YouTube doesn't have the same kind of support.
Most monitors can't or won't do a 23.976Hz or 24Hz refresh rate, so they require resampling. Some will do it, but those are expensive. Large, good monitors are very expensive, much more expensive than TVs.
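To make the resampling problem concrete, a quick back-of-envelope in Python: film runs at 24000/1001 fps, so on a 59.94 Hz panel each frame cannot get an equal number of refreshes, which is exactly the 3:2 pulldown judder that a native 23.976Hz mode avoids:

    from fractions import Fraction

    film = Fraction(24000, 1001)    # "23.976" fps, the real film-on-NTSC rate
    panel = Fraction(60000, 1001)   # a 59.94 Hz panel

    print(panel / film)             # 5/2: frames must alternate 3 refreshes, then 2
    print(float(3 / panel) * 1000)  # ~50.05 ms on screen
    print(float(2 / panel) * 1000)  # ~33.37 ms on screen: visible judder in slow pans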
When Rec. 2020 becomes a thing, TVs and monitors will have a vastly different gamut and will use different color primaries.
My personal opinion is that if you need a TV, or a display to watch movies on, buy a 1080p plasma (not LCD) that nobody wants anymore. There are models (search the Internet) which have almost reference-grade color reproduction, something you will not find in any LCD that is cheaper than your car, and the contrast is much better than LCD too.
I had the same problem with my 2011 Samsung Smart TV. It had an app for Amazon's Prime Video. Amazon eventually abandoned it and sent me part-credit for a Fire TV stick.
I also have a Chromecast and am generally positive about it; however, it's very laggy in responding to controller (i.e. app) input. For example, here's a feature I see as pretty basic for a "smart" display: select a bunch of photos on my phone, beam them to my TV, and flick through them using the phone. All the Chromecast-enabled slideshow apps I've tried for this have been laggy and disappointing, unfortunately.
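To be fair to the apps, some of that lag is baked into the protocol round-trips. A rough sketch with the community pychromecast library shows where the time goes; the URL is a made-up example, the photo must already be reachable over HTTP from the Chromecast, and the API details vary by library version:

    # pip install pychromecast
    import pychromecast

    chromecasts, browser = pychromecast.get_chromecasts()  # mDNS discovery: seconds, not ms
    cast = chromecasts[0]
    cast.wait()  # block until the device connection is up

    mc = cast.media_controller
    mc.play_media("http://192.168.1.10:8000/photo1.jpg", "image/jpeg")
    mc.block_until_active()  # another round-trip before the photo actually shows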
The companies only add the software as a way to make more money. If a smart TV costs more and offers free YouTube videos, then people see it as added value. But technology changes so fast that sometimes the smart TV sets can't keep up.
My wife got a smart TV from her employer, out of a catalog, for being a loyal employee for a long time. It has Roku built in. One day that will be obsolete.
My son is thinking of getting an Amazon Fire TV stick so we can cancel our cable TV subscription. It is only supposed to be $40 and plugs into an HDMI port. So if you need to buy a new one, it should be only $40 later on.
Totally agree. I currently have a smart TV, a Blu-ray player with very similar apps, and an Xbox with similar apps. They all have so much in common: streaming media from a network-attached PC, YouTube videos, Amazon Prime Video, Netflix, etc. It seems like a complete and total waste of developer time to reinvent the wheel for what is essentially the same thing on different devices.
Exactly right. The TV itself is a durable product; mine is 8 years old. This is what I like about the modular approach of "pluggable" smarts: upgrade the smarts by upgrading the cheaper smart module, in this case a Chromecast or Fire or Apple TV.
Show me those cheap 55" 4k monitors that you speak of. I know Vizio TVs that are cheap, but to my knowledge no 'monitors' exist at that size or price range.
You are right, good monitors are more expensive than TVs. Especially monitors that can do 23.976Hz and 24Hz refresh without resampling.
But why do you need a 4k monitor to use as a TV, since all current material is 1080p? Better to buy 4k when there is a need for it; by then the price will be lower.
Sigh. I kind of miss the old, heavy, dumb-as-rocks TV I grew up with. It turned on instantly. Switching channels was instantaneous - none of this five second Comcast delay - although you did have to get up and punch the physical buttons on the cable tuner[1]. I don't think it even had coaxial input; I remember there were two leads[2] that were held on with a couple screws, so if the picture got snowy, you'd jiggle the wires around to get a better connection.
I really don't understand how TVs are so bad. You can spend thousands of pounds on one and the interface will be incredibly slow, buggy, and take ages to change channels. A £5 Raspberry Pi is capable of running a faster interface, and my old DVB-T cards for PCs were able to switch muxes almost instantly. What is so difficult about putting this capability inside a television?
Because it's rare for hardware manufacturers to be good at software, and also because those manufacturers don't want to spend money licensing something better that 95% of their customers wouldn't really care about.
To be fair, in the current situation it's not worth buying a TV for its smart capabilities, because they'll rapidly fall out of date. It makes more sense to buy something like the Nvidia Shield, which can be swapped out without replacing the panel should the need arise.
Even a cheap external digital tuner has more features and a more usable interface than (dumb) TVs with a built-in tuner.
Samsung TV (2010): takes longer than a CRT to "warm up"; remote died within 4 years.
Sony TV (2008): half the screen gets covered by a program-info popup for some seconds every time a decoding error happens; remote died within 4 years.
Telefunken TV (early 90s) + external tuner: the original remote still works, the screen turns on automatically when the decoder is turned on, and it also has a built-in DVR and Tetris.
This is a market I've daydreamed about trying to enter. Build a high quality, dumb-as-shit TV and couple it with a small line of high quality receivers with beefy ARM processors and well written software. How hard can this possibly be?
You would be surprised. How many languages do you plan to localize for? How many date formats for the built-in clock?
What about virtual keyboards and input methods?
Do you support EPG? HbbTV? A rudimentary browser and everything that goes along with that?
So you want to support Netflix: are you ready to be certified by them, including for system security, and are you ready to give up everything LGPLv3/GPLv3 because of the anti-Tivoization clause?
Will you support multitasking? Wayland, X11, or something entirely different? You will want to make it usable from some kind of remote.
Do you have QA to test all of that, and are they knowledgeable about all the internationalization issues?
You will need a specialist in tuning and decoding for ATSC, DVB, and HDMI inputs; upsampling (motion interpolation) to 100Hz, 120Hz and above; color enhancements; dynamic brightness; the list goes on and on.
Well, the receiver would also be pretty dumb. Definitely no networking. You'd hook a Chromecast (PS4, Xbox, etc.) into it to actually stream stuff. The receiver would provide UI for TV tuner switching, arbitrary audio/video routing, an equalizer, that kind of stuff.
It did take a few seconds to warm up, but you'd get the nice crackly static charge on the screen, warm glow of the tube heater, and barely audible coil whine.
My mother in law has a classic floor console tube (wood cabinet) hooked to google fiber.
I got a new LG smart TV about a week ago. Before it, I was actually on the same side of being annoyed by smart TVs, but this one is actually a pretty smooth experience. I tossed out my Fire stick. I think TV manufacturers are, necessarily, going to keep working out the kinks. I don't think smart TVs are an inherently bad idea; they just needed polish.
An issue like OP posted shows the problem with SmartTVs - they need to be updated. If your SmartTV manufacturer wants to stop supporting an 'app', or can't handle the new specs for said app, you need to buy a new TV to keep your expected functionality. Whereas, if your FireTV stopped supporting an app, you could upgrade your FireTV for a fraction of the cost of upgrading your entire TV.
It would be better for the environment if consumer devices had open source software, so end users could fix software issues. The reason being that end users care about their devices longer than device manufacturers do.
In the meantime a Chromecast dongle will fix the TVs.
Imagine your future TV booting Debian/Ubuntu from an open boot loader via standard micro SD-card.
Yeah I can imagine it being exactly like every linux machine I've ever used (stable but vulnerable or needing some techno-fu to persuade it to boot back up again after I ran apt-get upgrade and it didn't quite do what it meant to for my hardware).
It would be nice, but in the end you gain more flexibility by building a HTPC with the hardware you want and treat the TV as a dumb screen that can receive input from HDMI, DVB-T and DVB-C.
The "Smart TV" is like those "all-in-one desktop PCs" -- unless you really prioritize the appearance of the thing, like in an architect's office where clients are visiting, etc. -- it almost never makes sense. The display stays good for a decade and the compute is often obsolete in 12 months -- it makes no sense to bundle them. I have a overheating iMac whose compute I wish I could switch out.
The difference is, you sacrifice almost nothing (save a few dollars) to have the Smart TV gubbins inside. The TV still works great as a display -- in fact, my new Smart TV has more inputs and other such non-Smart-TV features than my old Dumb TV did.
Also, I call BS on being obsolete in 12 months. My mom wanted to buy a 20" iMac back in ~2009, and I was hesitant to go along with it, as I felt the same way as you did. Then I realized computers last longer and longer these days, and are approaching display level lifespan.
She's still using that iMac today, and it works great. And guess what, I bought a 5k 27" iMac to replace my 10 year old Mac Pro (which has a high-end-for-its-time 24" Dell 2407WFP connected to it).
The reality is, by the time the computer was too old to be particularly useful, a 1080p 24" display was old hat as well.
I don't disagree it makes the disposable culture a bit worse; I'm not particularly likely to try to repair the iMac after a couple of years (which is one reason I sprung for an extended warranty on it which I typically avoid like the plague), but it's not so bad as you suggest. And, of course, the machine can still be used as an external display (or output to another display if it's the display that fails instead of the computer).
Perhaps your strongest argument should be that Smart TVs may accelerate the decline of standalone streamers. If that comes to pass, TVs may come to look a bit more like the all-in-one computer world.
But for now I'll enjoy my 4k streaming on my Smart TV apps, and replace those functions with standalone devices that do it better, as they come on the market. They're not there yet, though, and I'm not about to throw out all of my other devices and replace them just because I bought a new TV.
No, it doesn't. Every Smart TV I've seen has horrible ergonomics. It takes a while to boot, the remote has lag, it takes a long time to change from one input to another, and it takes a huge amount of time to start the app responsible for changing inputs.
They just suck.
Yeah, they have inputs and everything, but the Android-based UI is hands down the worst UI I have ever had to use, and that includes typing magic commands into the mainframe of an airline global distribution system.
Oh yeah. You want perfectly synchronised audio/video streams, just like you had before? Not possible with Smart TVs.
I never thought of my TV in terms of ergonomics. Yes, my TV takes perhaps 20-30 seconds to boot if it has been unplugged, but from its normal standby/off mode it's as fast as any other TV -- perhaps 2 seconds?
I don't have input switching problems, but then again, I typically use my receiver as the switch; the TV stays on HDMI4 (ARC).
I've never noticed non-perfectly-synchronized audio/video so I can't really comment on that.
You don't just sacrifice a few dollars to have a Smart TV inside... It has security vulnerabilities, features stop working, etc... Especially with the "stick" format of Roku etc., I just see no point. I understand why the manufacturers want you in their heinous "LG ecosystem" with an "LG Live Account" etc. but I'd pay extra to have the smart features on a removable stick.
If the features stop working, the only thing you have lost compared to the TV that never had those features is... a few dollars. If you are worried about security issues, don't connect it to a network; then you will just be sacrificing a few dollars.
I haven't owned a laptop with a VGA output since about 2008, so it's not relevant to me. I don't think it's a Smart TV drawback; it's just a "newfangled stuff doesn't support old stuff" problem.
I was using VGA to drive a 1920x1200 monitor at work for months, and the image was more than sufficient to write code on. Most people looking at VGA on a TV at TV viewing distances wouldn't be able to tell the difference; you get more artifacting from the video codec than from the VGA output.
There are so many consumers out there who buy new TVs, laptops, phones, and sometimes even cars every few years; just read the comments on any tech news site to find them.
I agree that it's silly to bundle the two, but these types of consumers won't care that their 2012 TV's YouTube app stopped working in 2016 because they already bought a new TV years ago.
People always say this about the iMac but the fact is that the iMac is cheaper than the competing displays. You literally get the computer inside for free. If you don't like the computer inside you can ignore it and use the iMac as a display.
Anyway these days a computer should last you five years easily.
I still use my iMac, but I can't ignore the massive fan noise that the computer creates if I use it as a target display. I get that the computer inside is "free" but I'd gladly pay extra to have the components separated.
"The symptoms being experienced are not a failure of the TV, but are as a result of specification changes made by YouTube that exceed the capability of the TV’s hardware."
Which changes? YouTube still streams H.264, and I cannot believe that those TVs can't properly run the glue-code scripting, e.g. doing a check of "if old TV, then use less fancy things". So, as others comment here, it looks like an excuse.
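For illustration, here's a minimal sketch (in TypeScript) of the kind of check I mean -- every name here is hypothetical, nothing is taken from YouTube's actual client code:

    // Hypothetical capability probe inside a TV web app.
    interface TvCapabilities {
      supportsVp9: boolean;      // newer codec that 2012-era decoders lack
      maxVerticalPixels: number; // tallest resolution the hardware decodes
    }

    function pickPlayerProfile(caps: TvCapabilities): "full" | "lite" {
      // Old sets fall back to plain H.264 and a stripped-down UI
      // instead of losing the app entirely.
      if (!caps.supportsVp9 || caps.maxVerticalPixels < 1080) {
        return "lite";
      }
      return "full";
    }

Whether maintaining that fallback path is economical for YouTube is another question, but it's technically straightforward.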
There are trade-offs all the way along the convenience spectrum. If you're reading HN, you're probably not in the target market for YouTube apps embedded in Smart TVs. But these customers do exist.
I imagine the main selling point is that they don't have to connect or configure any external set-top box or media PC, and they don't have to install or maintain the app. If you don't have the skills (or the time) to do so, this may be appealing.
The polar opposite would be a Linux expert running their own media center - absolute control but at the cost of said time and skills.
I am skeptical of smart TVs and smart fridges and the like, but I like having an embedded Netflix/Youtube app because it's not harming me any -- provided the UI is snappy, the UX is good, the TV isn't spying on my pixels -cough- Vizio -cough-, and it doesn't go away like it did here.
I know these are generous assumptions, but if it saves me from having to Bring My Own Streaming Stick, and I have no particular brand loyalty and/or ecosystem investment like AppleTV, Amazon, Google Play Store, or PlayStation Store, then fine. It's only harmful when it doesn't work as intended, or when my 'smart' TV is so obsolete that it's no longer functional as a 'dumb' TV.
The problem is every TV manufacturer suddenly pulled a Windows 10 and thought to themselves: "Everyone already has our products in their homes, why don't we try to monetize this further? Otherwise we'll never beat Google and Apple!" So they're all trying to build 'platforms' and 'ecosystems' and the like, which is terrible value for the person who doesn't want to re-buy the same DRM'ed movie on four different platforms.
Luckily, more and more of these homegrown smart TV platforms are dying out in favor of Android/Google TV (whatever they keep renaming it to), Roku's licensed software, and Sony's PlayStation ecosystem.
> I like having an embedded Netflix/Youtube app because it's not harming me any -- provided the UI is snappy, the UX is good
People have been putting up with digital cable's crappy on-demand menus for a decade, so I imagine they'll tolerate the clunkiness that smart TV apps usually have.
I bought a HiSense Smart TV because it was a good price for the picture quality. And I thought the Smart bit would be a nice plus.
But it's quite annoying. Takes a loong time to turn on and the UI is glacial and inconsistent. I use Netflix through the Xbox, and use Chromecast for other things.
Very much this. My grandparents can grab the regular old TV remote like they've been doing for 25 years and browse Netflix on the TV like regular channels.
Convincing them to use a smartphone, search on the tiny screen, hit play, and then cast to their TV is impossible. I don't blame 'em.
I use Xbox One as a media center, works very well.
Oh god. Don't get me started on how horrendously complicated my Harmony remote was to setup.
Plus, once in a while it goes haywire, which makes the system 10x as difficult for my wife to use as it would be without the universal remote in play. Because I set the system up, I understand its quirks. But when it works 99.9% of the time, the other 0.1% of the time is too difficult to train a novice on.
Whereas, in the old days, we'd both be using 7 remotes to accomplish any task, and be more or less equally proficient at it. Practice makes perfect.
My mum bought a (Sony) Blu-ray player that, at the time, was great. It had support for Lovefilm (now Amazon), Spotify, Netflix, and all sorts. It was quite expensive, but we thought it was worth it for the extras, and it was highly rated as a standalone Blu-ray player. She even paid for the overpriced USB wifi dongle.
Now it's just a Blu-ray player. There's no support for any of the apps because they all changed their protocols (or whatever happens). Thankfully it's still useful, and it's not the TV that's defunct.
It damages consumer relations a lot, because now she's very cynical about buying any kind of 'smart' device. Good for her for being savvy, but bad for Sony. The people in the target market are getting screwed. Unless you buy an Android TV with a decent chip in it, you're setting yourself up for EOL in a couple of years.
I bought her a Roku stick for Christmas and she couldn't be happier.
I have a newer (2015-ish) Sony Bravia TV with YouTube integration, along with Amazon, Netflix, and other apps like Plex. When it works, it actually works surprisingly well. But it's a very buggy system (Android OS) -- I get constant hard crashes of the entire TV, app crashes, etc. And in general, the interface is not as snappy as it should be. I sometimes have to completely pull the plug to restart it.
That being said, it is definitely convenient to have all these apps accessible from a main screen. The alternative would likely be switching to a different input and having to grab another remote (oh, the horror! ;) ). But I am worried about how long it will be until my "smart" tv is obsolete because all the apps are no longer compatible with their services, but I'm hoping that because it's Android-based, it will be more likely to receive regular updates.
Those people have tablets and smartphones that they can use very well. Why should watching YouTube on a TV be better than on a tablet? I believe it's not. It takes more time to operate, and you can't walk around the house. Furthermore, everybody must watch the same video, which is seldom welcome even with standard TV programs.
I dunno, I use the Netflix and Amazon Video clients on my Ethernet-connected Panasonic Viera plasma all the time, multiple times per week.
Adding titles using my laptop, they just appear in the client and I select and play. The Netflix playlist interface is pretty good; I can browse series and titles quite easily.
I've had it for 5 years now, and it still works fine. I'm going to use it until it dies, probably in the next 2-3 years.
It seems like a dream come true when you're in the business of people buying new iterations of the same thing, which seems to be every SoC and embedded systems provider these days.
Potentially related: Youtube updated their TOS [1][2] and posted additional documents that developers need to adhere to [3][4]. These take effect February 10, 2017, but were posted in advance on August 11, 2016.
As far as failure modes go, so far so good -- the 'smart' TV will still work, it'll just have one less app. Eventually, as more and more apps raise their minimum requirements (either by necessity or just because), the older hardware will no longer cut it; but the TV will keep working as a display.
The problem will come when the smart TV's OS itself is no longer getting patches.
Is there a chance to get a full refund and return the device when something like this happens? Some countries have pretty hefty consumer protection laws.
I have one of the affected TVs and I don't think that I have ever used the YouTube app on it. This is primarily because it has always been hooked up to something smarter: a computer, game console, or Apple TV. The interfaces on smart TVs of that era were abysmal. Now that all of the modern boxes and TVs have voice search and everything, I can't imagine there are too many people who are extremely heartbroken over this. Especially when you look at the price of a Roku or FireStick.
I realize it's quite fashionable to bash smart TVs in this thread, but I've got a Sony 2016 Bravia and love it. It runs Android TV and has the usual stuff via the Android app store (Netflix, Plex, etc.). It has built-in Chromecast support, too. Everything is nice and responsive; no complaints. The HDMI CEC support works well, so I can do everything from one remote.
> Everything is nice and responsive; no complaints.
My parents-in-law have the same TV. The UI is the most frustrating UI I have ever had to use. Please give me back the dumb, 1-level-deep UI.
Apart from how hard it is to actually do anything (because the Android crap just gets in the way), everything is so slow and takes forever. There's so much lag in everything. So much time until the damn app that I need starts. So much time until the damn thing boots.
With the previous TV, it took less than half a second to change inputs. Everything was smooth and responsive. Now it takes about 10 seconds, and it's 10 very, very frustrating seconds.
Plus, worst of all, audio and video are not synchronised on this TV. How it can have such a basic problem, I can't explain. By turning image processing and all that crap off (as much as possible), A/V sync has gotten better, but it's still noticeable to me and very, very annoying.
So I split the A/V signal and route the digital audio through a box that delays it by as much as the TV delays the image, so I can get them back in sync. But depending on the load on the TV, the delay is not constant; it changes during the movie. It's the most frustrating piece of equipment I have ever laid hands on.
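If the source is an HTPC rather than a hardware box, the same trick can be done in software. Here's a rough sketch using the Web Audio API -- the 120 ms figure is just a placeholder; you'd measure your own TV's video latency:

    // Delay the audio path to compensate for the TV's video processing latency.
    const ctx = new AudioContext();
    const delay = ctx.createDelay(1.0);  // allow up to 1 s of delay
    delay.delayTime.value = 0.120;       // measured video latency, in seconds

    function routeThroughDelay(media: HTMLMediaElement): void {
      const source = ctx.createMediaElementSource(media);
      source.connect(delay);
      delay.connect(ctx.destination);
    }

    // e.g.: routeThroughDelay(document.querySelector("video")!);

Of course, a fixed delay only helps if the TV's video latency is constant, which, as noted above, it isn't on this set.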
Are you serious? Seconds??? Turning on a TV and waiting seconds is considered acceptable today? My displays start in milliseconds.
> No sync issues for me
Ah, yes, the "not for me" argument.
> as it's all pass through to sound bar.
This is not quite true, as my measurements showed, but if it were strictly true it would be bad, because video takes time to process and the audio would arrive too early.
But in the 3D configuration, the sound bar adds a delay of its own. How do you think 3D sound works? There's a DSP with a buffer, and the sound is delayed so that, taking into account reflections in the room and so on, you get the impression of 3D sound. To work properly, the delay has to be on the order of second-order reflections in the room; at 343 m/s, a 10 m reflection path alone takes about 30 ms.
But even in the 2D configuration the sound bar adds a delay, as I measured with an oscilloscope. I guess they just turn the DSP program off but still pass the signal through all the buffers. This is trivial to hear without an oscilloscope: just turn on both the sound bar and the integrated TV speakers at the same time.
However, the culprit is not the audio chain, which seems to have a deterministic latency; it's the video chain, which has a nondeterministic latency. In the default configuration, the TV comes with a lot of stupid and absolutely awful video processing filters in the chain, for example 60Hz interpolation, all kinds of noise reduction filters, a compressor, etc. All of these add variable video latency, and there doesn't seem to be any kind of feedback between the audio and video chains.
By turning off as much of the video processing as the dreaded UI will let you, you can minimise the audio/video delay, but as I observed with an oscilloscope, you can never bring it down to acceptable levels. I have an extra hardware device in the audio chain that I can use to sync them up, but it's a PITA to use, because each time I change input sources, or do anything really, the video delay changes and I have to reset it.
Plus, one day I got a software update, and all my careful disabling of the vomit-inducing video processing was reset to the defaults.
I guess it works all fine if you have no standards.
Ehh, I think you're being a bit unreasonable. Even LCD displays take a few seconds to turn on: power on any recent LCD from cold and it takes a while to boot, and even resuming from standby (or whatever the LCD equivalent of S3 is) takes a second or two.
As for the sync issue, yeah I don't have it. Sorry that your gear is crud. Cannot reproduce. Most soundbars can adjust the audio delay if you're having sync issues.
I have a X930D and the picture is great (albeit bright in standard mode). In Cinema Pro it's pretty accurate. I couldn't tell much difference between the "tuned settings" from AVS and such vs Cinema Pro.
As for my lack of standards, let's leave the ad hominem attacks out please.
The first TV I remember at my parents' place was thrown out after twenty years.
The first TV I bought for my place (admittedly for $999AUD) was thrown out after 5 years - ironically because I wanted a "smart" feature, HDMI CEC. I'm now using a second hand Sony TV that just barely supports the HDMI CEC features I need - enough for it to be driven by a Raspberry Pi, at least.
Our electronics don't last as long but we ask a lot more of them.
Considering that it replaced a TV from 2005, I'd wager yes.
4K will probably be the last big consumer TV tech push until we get cybernetic implants for eyes. You can't see a difference from 4K to 8K at under 100" screen sizes.
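A quick back-of-the-envelope check supports this, assuming 20/20 vision resolves roughly 1 arcminute and a 16:9 panel:

    // Distance (in inches) at which one pixel subtends 1 arcminute;
    // any farther away and the pixel grid is no longer resolvable.
    function minDistanceInches(diagonalIn: number, horizontalPx: number): number {
      const widthIn = diagonalIn * 16 / Math.hypot(16, 9); // 16:9 geometry
      const pixelIn = widthIn / horizontalPx;
      return pixelIn / Math.tan((1 / 60) * (Math.PI / 180));
    }

    console.log(minDistanceInches(100, 3840)); // ~78" -- past ~6.5 ft, 4K pixels vanish
    console.log(minDistanceInches(100, 7680)); // ~39" -- 8K pixels vanish past ~3 ft

So even on a 100" screen, 8K only buys you anything over 4K if you sit closer than about six and a half feet, which almost nobody does.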
I have a company that has built a lot of smart TV apps, and I can say that Sony TVs have consistently been the worst-performing of the bunch. We don't support most 2012 TVs anymore, and that's even on more solid brands like Samsung and LG.
This is a pretty natural evolution: it's hugely painful to support legacy TVs right now, and 2012 is about where they become too old and underpowered.
As an owner of a Sony Bravia "smart TV" I'm not surprised to see this. I tried using the features but they were very slow and never seemed to work properly. It was hard enough to keep it connected to my wifi with the $80 wifi dongle I bought. My new Apple TV is so worth the money.
For everyone saying "just use Chromecast", note that Vizio now sells that: https://www.vizio.com/m-series It's flaky, but I don't know if it's more or less flaky than smart TVs.