This article and the earlier Samsung "smart" TV article [1] confirm my own observation that 99% of software running on consumer electronics devices (particularly set-top media players and wireless routers) is utter crap, thrown together carelessly, probably at the last minute, by hardware people who have no clue about software. Again and again, you can see the same low/broken functionality and lack of polish (watch any media player as it changes video modes, for example: flash flash flash jump flash). The only redeeming quality of these software distributions is that their appalling security makes them easy to root, at least allowing those who know better to go in and either fix things or install third-party software written by people who give a damn.
It's like these companies spend a year getting the hardware ready, then at the last minute realize it needs software. So they find some monkey to barely manage to get ARM Linux slapped on to it so they can ship something.
It's gotten to the point where I really appreciate the attitude of some of the new up-and-coming set-top box manufacturers who basically say on their web site, "Screw it, we're a hardware company; here's how to get root and do it properly" (e.g. [2]).
Until hardware manufacturers find financial incentive to change, this will continue. As it stands today, hw vendors do not derive any (significant) revenue from updating software and so it's small wonder that these devices are not updated.
Personally, I'm perfectly happy with a "dumb" TV with lots of HDMI inputs and a Raspberry Pi with XBMC installed.
I expected Apple TV to come in and crush this market, rather like Apple did to the smartphone industry. It wasn't so long ago that I had a Motorola flip phone that could play MP3s. One. At. A. Time.
Apple TV is one of the better set-tops, but still far short of its potential. I attribute this to a "first we build the locks, then if there's anything left, we'll build the house" mentality forced on them by media companies. I wouldn't be surprised if every manufacturer more or less suffers from this, and if leaving in easy rootability is a form of subtle rebellion.
By much the same reasoning, I was genuinely expecting Google to come up with something manufacturers would adopt, and, apart from an eventual high-end actual Apple television, for it to be pretty much everywhere. Especially because there's a big crossover (Samsung, Sony, LG) between mobile and TV manufacturers. I really don't see myself buying a smart TV in the current market; it seems like such a missed opportunity.
Many are still waiting for that "solution" that Steve Jobs had before his death to come about. The disappointment I have with Apple TV is that it wasn't as revolutionary as the iPod or iPhone were. It was apparently only created to be minimally sufficient to support delivering the limited iTunes content that was available.
> The disappointment I have with Apple TV is that it wasn't as revolutionary as the iPod or iPhone were.
This is because there are already many TiVo/DVR/HTPC/console media software applications that do a pretty good job at this. Apple is having a hard time breaking into an existing market where the offerings have already been fairly well polished by the community.
There isn't a working jailbreak for the Apple TV 3. People have injected themselves into the existing apps, changing where they look for video, but that's different to having local root.
Ok, but the GP said "Good luck unlocking/hacking an Apple device" not "... an Apple TV 3", implying that Apple devices are impenetrable fortresses of doom or something.
Full-blown iOS devices get jailbroken pretty quickly. I assume the difference here is that no one cares about the Apple TV, so there's little incentive to hack it (exaggeration, but the difference in Apple TV owners vs. iPhone owners is probably several orders of magnitude). Even if you did hack it, what's the point? There's no hard drive to store your pirated videos, and IIRC these things can stream videos from your PC out of the box anyway. If you're gonna install XBMC and bypass the Apple TV interface completely, why get an Apple TV in the first place?
Good points. I should've been clearer. I meant the scene is not such that there is enough work done that a relatively non-techie person can sift through the jailbreaking community work and apply it him/herself.
Many "hackers" seem to magically stop caring about hacking devices if they're made by Apple. Clearly they must have been perfected prior to release, so there's no need.
EDIT: Whoa, took a while to post this and then parent was gone. I should've quoted them, ah well. Their comment was along the lines of, "Why are companies hiring people for $40k to write this stuff (and getting bad products as a result) instead of hiring someone for $140k that'll turn out better products."
Management doesn't look at it that way at these companies. The problem is that hardware companies don't grok software (and it's not just companies: the US DoD has such a heavy background in hardware that it often doesn't have a clue how to handle software; see the F-35 and the attempts to "modernize" old systems. Wonder why budgets on new systems get out of control? It's probably the software.) What follows is an anecdote, but one that I'm told by others was representative of their experience with hardware companies.
The company began by making sensors. It got into software because they wanted a way to sell the complete package to their customers (buy our sensors and this control panel, your building/bus/train/plane will be safer and you don't have to write the control software yourself!). Before my time there they had 4 or 5 people in software. 1 coder. 1-2 testers. 1-2 QA. Their test suites were ludicrously incomplete. No one thought that this was odd, except for those of us in software who came to the show later. The higher ups considered software to just be a necessary inconvenience, it wasn't until our shop started delaying deliveries (because it was understaffed, underpaid, and inexperienced) that they started hiring more people (all underpaid and inexperienced as well) in an effort to throw more bodies at the problem (seriously, see the F-35 for a repeat of this exact behavior).
Short version: Hardware organizations don't get software. It's an inconvenience to them, and they'll screw over the poor sap who gets stuck managing the project with a pitiful budget and absurd deadlines and requirements.
> by hardware people who have no clue about software.
and
> Short version: Hardware organizations don't get software.
I concur with these two commenters.
Going back to the 1970s, starting with DEC, it's been my observation that hardware companies don't get software. (I worked at two separate companies that OEMed PDP-11s and developed operating systems instead of using DEC's.)
Rather than being viewed as critical to a product's success, in far too many cases it seems like the software is merely a component on the bill of materials for a device without which the unit can't be shipped.
Companies that achieve excellence in both hardware and the accompanying software are so rare that they might as well be thought of as unicorns.
I recently bought a Panasonic camera in the compact travel zoom class. One feature is wifi support; for example, it can upload pictures to your computer. There are several places where the camera lets you enter text, such as names for face recognition and SSIDs. But for some unfathomable reason a Panasonic engineer added extra code so that spaces are disabled for wifi passwords. This was actual extra effort, and completely and utterly wrong. Spaces are allowed, every domestic access point I use has them, and everything else is perfectly happy with them, including iOS, Android, Windows, Linux, Mac, PlayStation 3, Nintendo Wii, and Roku. Oh, and Panasonic TVs and Blu-ray players (the "smart" ones).
It is impossible for Panasonic to find out about this or do anything about it, because like many companies they value interaction with their customers so little that it is outsourced to the lowest bidder. Those companies' business model is to give out the same answers to the same questions over and over again; they cannot cope with a question they don't already have an answer for. The one Panasonic uses in the US is even worse, refusing to pass the issue on to Panasonic at all. They also have a cunning system where links to surveys after chats don't work, and the phone survey won't recognise giving them low scores!
I enjoyed the blog post mostly because 'misery loves company' and I've been through this same nonsense before.
Bottom line: when a support system like Panasonic's isn't working for you, quickly abandon it and resort to a proper business letter sent through the mail. Format it properly, keep it very short, clearly state what you want, and include at least one compliment, as this works wonders. It takes less time than all the bullshit on the phone, plus I have seen a 100% success rate after resorting to this myself. I do 3-4 a year, easy.
Look up the company's address and CEO, or head of that particular product division. Spend a tiny bit more than normal postage and get signature proof of delivery, it really is worth it.
Include your email address on the printed letter and you will hear back from someone at the company who can actually help you. And be quicker to abandon the traditional support channels when they frustrate you. I'll set a hard limit of 10 minutes to get a human on the phone who sounds competent before I hang up and send a letter. I'll hang up right in the middle of their script, I just don't care anymore. The letters work.
This is absolutely true. During the housing crisis, I couldn't get the time of day from Wells Fargo. I must've spent half my days playing phone tag with their useless reps. I finally got fed up and looked up the names of every executive member of their home mortgage group from the vp of service up. I included them all on an email (including the CEO) explaining my situation. No more than an hour passed before the VP of service had called me directly.
Five buttons, all for spaces? That would already be a sign to me that whoever designed that UI wasn't really thinking straight...
And I'm almost willing to bet that not letting you enter spaces might be the result of a "security fix" because they pass the contents of that field, unescaped, directly through to a wifi configuration shell script, and someone doing some testing discovered that spaces would "cause things to break", so the solution was to disallow spaces instead of performing escaping on the password string.
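To make that hypothesis concrete, here's a minimal sketch of the pattern (pure speculation on my part: the script path, the function names, and the idea that a shell script is involved at all are assumptions, not Panasonic's actual code). The broken version splices the password into a shell command, so a space splits it into two arguments; the robust fix is to bypass the shell rather than to ban spaces:

    /* Hypothetical illustration only -- not Panasonic's code. The path
     * /usr/bin/wifi_connect.sh is made up for the example. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    /* Broken: the password is interpolated unquoted into a shell command,
     * so "pass word" reaches the script as two separate arguments, and a
     * malicious value could inject arbitrary shell commands. */
    static void connect_unsafe(const char *ssid, const char *password)
    {
        char cmd[512];
        snprintf(cmd, sizeof cmd, "/usr/bin/wifi_connect.sh %s %s", ssid, password);
        system(cmd);
    }

    /* Better: skip the shell and pass each string as its own argv entry,
     * so spaces and metacharacters arrive intact and harmless. */
    static void connect_safe(const char *ssid, const char *password)
    {
        if (fork() == 0) {
            execl("/usr/bin/wifi_connect.sh", "wifi_connect.sh",
                  ssid, password, (char *)NULL);
            _exit(127); /* only reached if exec failed */
        }
    }

    int main(void)
    {
        /* Same password with spaces; only the second call would deliver it
         * intact. (The script doesn't exist on a normal system, so both
         * calls fail harmlessly -- this only demonstrates the pattern.) */
        connect_unsafe("HomeAP", "correct horse battery staple");
        connect_safe("HomeAP", "correct horse battery staple");
        return 0;
    }

If that's what happened, disallowing spaces in the UI papers over the symptom while leaving the underlying injection risk in place.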
"Now that we're translating this UI to English, what do we do with all these buttons that aren't needed?" "Too much work trapping into the UI drawing code to hide them, let's just remap them to something else... how about space! That's harmless!"
I see what looks like the Japanese equivalents of single and double quote, full stop and comma, so it's still a bit puzzling why the English version doesn't have those there.
They are all regular ASCII spaces (identical). The same number and positioning of buttons is shown no matter what language is selected on the camera. English doesn't need as many as Japanese, so they made the surplus buttons be space.
Spaces are fine in the SSID so they already had to cope there. I agree with your hypothesis that something probably went wrong in their dev/testing, and then they ran with it making things worse and worse.
Compounding things by keeping customers as far away as possible ensures that lessons are not learned.
The 5 different spaces on the alphabetic screen all work as spaces in other contexts, including manually entering an SSID. When switched to numeric there are 3 grayed out spaces, and punctuation has 1 grayed out space. The developer who decided to disable spaces in wifi passwords was very thorough.
The problems extant in home entertainment products are only reflected throughout the rest of the "Internet of Things" ecosystem. Or as I prefer to think of it: security threat vector channel.
Consider: you've got a slew of products for which rapid iteration and obsolescence is a key attribute; profit margins are slim; cost pressures are immense; talent is limited (and not attracted by the time and cost pressures); and parts availability is arbitrary, capricious, and liable to change with little notice or reason (we've seen in the GM ignition switch recall, in a slightly different space, what the implications of even a minor parts change can be).
A few years back I had the distinct pleasure of attempting to configure and deploy a major enterprise vendor's storage product in a configuration that the sales team promised was supported under our OS, but which the support team, with no small amount of abuse heaped on my firm, denied was supported at all. The company shall remain nameless, but rhymes with "hell".
What I eventually established was:
• The vendor itself had little or no real understanding of the equipment or its configuration.
• The documentation for the underlying technologies (all, as it happens, open source free software) ranged from quite good to abysmal. One README file consisted entirely of the text "Good things to read here." We were less than gruntled.
• True support was actually a pass-through to the OS vendor, who, it turned out, wasn't in fact the source of our OS. We'd been sold something which didn't in fact exist (real support).
And that was for high-end enterprise-grade hardware.
The situation for the consumer-grade stuff, especially the cheap consumer-grade stuff, is far, far worse.
Which is why I want the products I buy to have the absolute minimum amount of technology possible or necessary. That means fewer opportunities to foul up.
I hate myself for saying this, but perhaps the only way to wake up these manufacturers is to make them liable for leaving the "security door" wide open. If they've followed good practices and at least tried to get things right, that should be a strong defense. But if they've completely ignored security, then they should be at risk too.
That will likely be a part of it, though finding pockets to go after may be difficult as well: hardware and software components are already modular, and there is the developer, the manufacturer, and often a "brand" who effectively purchases the tech and tosses a logo on it (Apple is an exception to this rule). To say nothing of transnational jurisdictions.
The complexity problem extends to the legal side as well. Much as I think liability could help, it's going to be hard to apply, particularly with lay attorneys, judges, and juries.
Apparently, in order for a home network with modern appliances to be secure, you need to be at least a geek these days.
You probably need a router that runs OpenWRT, Linux, or BSD, like the Carambola2 [1], configured manually so you can monitor incoming and outgoing connections and block suspicious traffic.
Then you probably need something that can run XBMC [2] as a home entertainment system (an Atom HTPC works like a charm; I have one), plus a flashed Dreambox PVR, which runs Linux too and lets you do a lot of things, but most importantly lets you monitor everything.
So basically, anything that runs proprietary software is a security concern. Strangely, the more corporations try to build walled gardens the bigger the security risk is.
This software has been written without any concern for security, and it listens on the network by default.
This dawned on me when I intended to switch over to FIOS due to problems with my cable internet connection. I moved all of my media and gaming devices first, then abruptly stopped, wondering why I would want to share a network with a bunch of devices I had no reason to trust. I know I can isolate them on my home network, and may eventually get around to it, but in the meantime, I'm using separate connections to different ISPs (which provides other benefits, as well).
> ...wondering why I would want to share a network with a bunch of devices I had no reason to trust. I know I can isolate them on my home network...
Unfortunately, it's actually quite difficult to do that properly with a typical broadband/WiFi box as supplied by most ISPs (at least here in the UK) by default. You need to step up to business-grade gear in most cases, which is both more expensive and beyond the technical knowledge of most consumers, and even then some otherwise respectable devices are regrettably lacking in flexibility when it comes to VLAN configuration and the like. I'm looking forward to the day when standard home user boxes allow you to do things like isolating certain physical ports or wireless networks by default, so hopefully running untrusted devices on independent networks becomes routine.
Even then, there's still the problem that if you want to watch some sort of streaming content off the Internet, whichever device(s) have that external connectivity could also pose a privacy risk if they have access to whichever devices are acting as home media servers. DLNA seems like a step in the right direction for connecting up home media devices, but I haven't seen much evidence of a robust security model being implemented so far.
You'll want to get interested in the advanced configuration capabilities on that FIOS router. There's quite a bit under the hood there that will let you, with an annoying level of difficulty, get things into good shape.
I previously worked at a company whose main product was an IP camera. Most of the team was building a Windows client to interface with the camera, and I was a one-man army handling the software on board. I'd say that most software people don't know what is hard with hardware (no pun intended, seriously).
Most of your time is spent finding libraries that can be compiled for your board, or fixing device drivers because of that slight board modification the hardware guys made. You also have to figure out how the proprietary hardware encoders work and make them behave properly. The documentation doesn't match the implementation? Well, you have this contact under NDA who will eventually say "I don't know either, have you asked in the forums?" after two weeks of back and forth. Also, sometimes your device would just reboot, and you'd spend time arguing with the hardware guys that Linux doesn't just reboot for no reason. Did I mention having to worry about how your 64 MB of RAM gets allocated, and in which DSP pool?
Eventually, you'll find time to write code that will cause your device to do something meaningful.
Writing software for hardware is much harder than it seems, because you have to worry about all those things you take for granted when working with an x86 desktop. You can't upgrade often after shipping, so testing focuses on critical bugs instead of usability. I do agree that spending more resources would make sense, but consider this: hiring a DSP specialist to get an H.264 encoder working cost $80k in Montreal in 2007. That is something you now get for free with newer chips, but just getting the hardware to work used to drain a lot of resources. Larger companies could have more money for that, but I don't know how they work internally, so I can't speak for them.
Spot on. I'm currently arguing with a retailer over a Toshiba television which can't play MP4 files reliably. It crashes and hangs regularly.
They want to send it for repair because, to be honest, they are fucking morons, and however much explaining I do they don't realise that the manufacturer doesn't support any kind of firmware update. I want my money back, so I'm having to take them to small claims court in the UK to force a refund under the Sale of Goods Act, because the item is not fit for purpose.
Between this and a Bravia EX which is buggy as hell I'm tempted to just buy something that will run a Linux based media center with a PC TFT and say fuck it to consumer appliances.
All converged appliances suck, and probably always will, and the consumers are figuring it out, which will be the doom of the "convergence" meme.
Non-converged appliances usually work pretty well. My dumb TV displays whatever the HDMI flings at it, the separate optically connected surround sound system works beautifully, the computer hooked up to them just does its thing.
The manufacturer could've made it more secure, but then you wouldn't be able to change the region code like that or do anything else to be in control of your device... security cuts both ways.
Not really. FLOSS is about being in control of your software (being able to legally modify and distribute things), not making any guarantees about software behavior (including security).
Indeed, source code availability (much weaker than FLOSS) is very useful in software analysis. Nonetheless, while in most cases this is just tedious and boring work, one can still validate proprietary binary blobs. Obviously, that only applies to countries where reverse engineering is legal.
That's a likely outcome, but it doesn't have to be. The settings UI could allow me to install new keys off USB, and then set a password to prevent someone else from doing the same thing. Security doesn't have to mean giving up control of your own system (Microsoft's requirements for UEFI Secure Boot on x86 systems get this precisely right).
It's difficult enough to get a correct implementation of secure boot as it is, and while adding your own keys is nice in theory, it also means that the manufacturer would have to do more work to add that functionality. In the case of secure boot it was MS's requirements that persuaded them to (probably reluctantly) do it, but there's nothing of that sort for smart TVs and the like. On the other hand, there's plenty of pressure from media corporations to lock things down for DRM. Thus, even if there were functionality that enabled you to truly "own" your device by making it trust you, it would undoubtedly also come with its own restrictions (e.g. disabling access to DRM'd content, future updates, etc.), so in some ways exercising that control would segregate and stigmatise you.
So while I agree that "secure, full control" > "insecure, full control" > "secure, no control", since the first option is highly unlikely and efforts toward more security are probably going to result in the last one, I think the middle option isn't that bad after all...
The only way to implement network-connected appliances safely is with open-source. It doesn't have to be under a permissive license, but it does have to allow the end user to change the code. Anything that is network-connected -HAS- to be update-able and independently verifiable. Anything less is a risk to all of us, like biologically unstable organisms released into a city would be.
No, open source is not, in and of itself, sufficient.
I do believe increasingly, however, that it is necessary.
But so are other attributes: a healthy development culture, an approach which preemptively seeks out and eliminates security threats and holes, and most importantly, operates with the end-user as a primary focus. Projects with a strong record of this include OpenBSD (which is now conducting a focused effort to clean up the OpenSSL code: http://opensslrampage.org/) and the Debian project (with formal social contract, constitution, and policy all of which put the interests of the end-user front and center). While Debian's had its security snafus, particularly relative to OpenBSD, among Linux distros it's tended to be among the more secure and security-conscious distros, in my experience.
But projects with a long track record of disdain for such concerns also tend to show themselves wanting when it comes to strong security: PHP, awstats, GNOME, and more recently, systemd, all come to mind. I have grave concerns over the damage the latter is now in a position to do, and the fact that one dev has already been denied privileges to submit kernel patches by Linus Torvalds doesn't do much to increase my confidence.
Eventually, it was disclosed publicly and fixed, yes. But the standard set of arguments for open source and 'all bugs being shallow' would have expected it to be caught and fixed almost immediately.
I'm not arguing that open source isn't a better system, just that it's not a guarantee of safety when compared to closed source in this regard - both systems have to have competent people actually caring enough to pore over the code and provide fixes. Consumers cannot be expected to manually review and audit every line of code in their networked devices, nor (apparently) can they reliably depend on someone else to do it for them.
You can grab a firmware image from any closed-source appliance and immediately find dozens or hundreds of security problems that you know will never be fixed except possibly in your copy. They probably won't get fixed even if you talk to the manufacturer about them; in fact you run the risk of the manufacturer convincing the police to raid your home, or of the manufacturer suing you.
Now try the same with open-source software. There will be far fewer problems to find, because all of the obvious ones have been fixed already, and most of the difficult ones too. Even if you do find a problem, as soon as you put some information out there about it it'll be fixed. Upgrades invariably happen more quickly with open-source software, even in firmware roles, so the population is covered more quickly as well.
That's what is meant by 'given enough eyes'; not that all bugs are found and fixed immediately, but that all bugs are found and fixed eventually. OpenSSL simply didn't have enough eyes, for a variety of reasons. Developers in the US were discouraged from contributing, since our legal system has in the past been a risk to that kind of project. Many developers avoided it because they didn't know anything about old systems like VMS, or about cryptography. Most of the rest of us avoided it because it wasn't obviously broken.
Edit: We also generally assume that the NSA has 'enough eyes'. If they decide that they have to attack product X, then they get a bunch of people to look at product X and find its bugs. It doesn't matter whether it's open source or not, it's just a question of the number of eyes you apply to the problem.
If you seriously think that all bugs in open source are found and magically fixed immediately, I suspect you should try working as a community member / contributor of an open source software project.
>If you seriously think that all bugs in open source are found and magically fixed immediately
I don't think that at all. But, the heartbleed bug is an example of one which should have been, by all rights. It wasn't due to some complex crypto implementation, or an arcane syntactical edge case in C. It was a pointer bug. It was a simple, critical fault in a bit of software which had a lot of very qualified eyes on it, which a lot of people here would ridicule as a matter of course.
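For reference, the shape of it was roughly this (a heavily simplified illustrative sketch, not the literal OpenSSL code): the request states how many bytes to echo back, and the vulnerable code copied that many bytes without ever checking the claim against the size of the record it actually received.

    /* Heavily simplified sketch of the Heartbleed pattern -- illustrative
     * only, not the real OpenSSL source. */
    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    static unsigned char *build_heartbeat_response(const unsigned char *record,
                                                   size_t record_len)
    {
        if (record_len < 2)
            return NULL;

        /* Attacker-controlled: "echo this many bytes back to me". */
        uint16_t payload_len = (uint16_t)((record[0] << 8) | record[1]);
        const unsigned char *payload = record + 2;

        /* The missing bounds check. Without this, the memcpy below reads up
         * to ~64KB of adjacent heap memory (keys, cookies, whatever happens
         * to be there) into the response. */
        if ((size_t)payload_len + 2 > record_len)
            return NULL;

        unsigned char *resp = malloc(2 + (size_t)payload_len);
        if (resp == NULL)
            return NULL;
        resp[0] = (unsigned char)(payload_len >> 8);
        resp[1] = (unsigned char)(payload_len & 0xff);
        memcpy(resp + 2, payload, payload_len); /* echo the payload back */
        return resp;
    }

    int main(void)
    {
        /* Malicious request: claims a 16000-byte payload but sends 4 bytes. */
        unsigned char evil[6] = { 0x3e, 0x80, 'p', 'i', 'n', 'g' };
        unsigned char *resp = build_heartbeat_response(evil, sizeof evil);
        printf("malicious request %s\n", resp ? "echoed (bug!)" : "rejected");
        free(resp);
        return 0;
    }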
Now why, if something like that can happen with code most hackers and open source proponents care passionately about, should it be taken for granted that enough people are going to be validating DVD player and smart TV firmware for the end result to be better for the average end user than expecting whoever the manufacturer hires to do it?
> It was a simple, critical fault in a bit of software which had a lot of very qualified eyes on it
Do we know this particular bit had had a lot of very qualified eyes on it? This is the problem with the eyeball theory. That a piece of code is open for anyone to read does not mean people who care actually read it. Everyone thinks: it's open source, there are many eyeballs on it, therefore I don't need to read it!
I'm trying to think of ways to make people wake up and routinely review the code and changes they use. You really don't need to be an expert programmer to spot a duplicated goto fail or the total lack of bounds checking.
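For anyone who hasn't seen it, the "duplicated goto fail" refers to the shape of Apple's 2014 SSL signature-verification bug. A condensed illustration (the function names are stand-ins, not Apple's actual identifiers):

    /* Condensed sketch of the "goto fail" bug class; the crypto calls are
     * stubbed out because only the control flow matters here. */
    #include <stdio.h>

    static int hash_update(void)        { return 0; }  /* stub: succeeds */
    static int verify_signed_hash(void) { return -1; } /* stub: forged signature */

    static int verify_signature(void)
    {
        int err;

        if ((err = hash_update()) != 0)
            goto fail;
            goto fail;                          /* duplicated line: jumps unconditionally */
        if ((err = verify_signed_hash()) != 0)  /* never reached */
            goto fail;

    fail:
        /* err is still 0 from the last call that actually ran, so the
         * forged signature is reported as valid. */
        return err;
    }

    int main(void)
    {
        printf("verify_signature() = %d (0 means \"valid\")\n", verify_signature());
        return 0;
    }

It's exactly the kind of thing a careful but non-expert reviewer could have caught just by reading the diff (and newer compilers with misleading-indentation warnings will now flag the pattern), which is the point.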
From Robin Seggelmann, the developer who "accidentally" added the heartbleed bug:
"""
...
OpenSSL is definitely under-resourced for its wide distribution. It has millions of users but only very few actually contribute to the project."
"""
I really think you should try joining an open source community and contributing some code. Then I suspect you'll understand that it is just as hard as any proprietary code (like the code I work on for $day_job). The difference is that open source code, even critical stuff that runs the internet, is often woefully understaffed and yet expected to be perfect.
1: https://news.ycombinator.com/item?id=7616420
2: http://www.pivosgroup.com/