Hacker News

Now imagine the exploit being used by a blackhat. The hackers aren't the problem here. The fact that somebody can even control cars over the Internet at all is.



You seem to be confused.

The fact that a dangerous threat exists does not give a researcher license to endanger the public to prove it. This is especially the case when a safer alternative to demonstrate this exploit easily exists.

Robbers could enter your home and hold your family at gunpoint AT ANY TIME. That does not give me the right to prove to you how easy it is by entering your home and scaring the crap out of your family.

1) This is a dangerous exploit

2) This was a dumb way to demonstrate it

Those are not mutually exclusive.


> This is especially the case when a safer alternative to demonstrate this exploit easily exists.

And when said safer demonstrations are ignored by manufacturers, as was the case here?


> This is especially the case when a safer alternative to demonstrate this exploit easily exists.

If you read the article, you'd know that said safer alternative was already attempted and presented to auto manufacturers, only to be met with dismissal.


Did you read the article? Here are two quotes:

"Second, Miller and Valasek have been sharing their research with Chrysler for nearly nine months, enabling the company to quietly release a patch ahead of the Black Hat conference."

"WIRED has learned that senators Ed Markey and Richard Blumenthal plan to introduce an automotive security bill today to set new digital security standards for cars and trucks, first sparked when Markey took note of Miller and Valasek’s work in 2013."


> "Second, Miller and Valasek have been sharing their research with Chrysler for nearly nine months, enabling the company to quietly release a patch ahead of the Black Hat conference."

I did admittedly miss the "nine months" portion of that, but that's still only one company out of many.

> "WIRED has learned that senators Ed Markey and Richard Blumenthal plan to introduce an automotive security bill today to set new digital security standards for cars and trucks, first sparked when Markey took note of Miller and Valasek’s work in 2013."

If you read further, you'll see the paragraphs on Markey's letters to auto makers regarding the 2013 findings; Markey's own findings only reinforce my point further.

Also, note that my point - that auto makers mostly ignored Miller and Valasek, according to the article - would not include senators (unless said senators build cars, of course).


> I did admittedly miss the "nine months" portion of that, but that's still only one company out of many.

Yes, it's the company that owns Jeep, the company with the demonstrated security flaw. How different automakers responded to different security issues isn't related to this article or discussion.

> Also, note that my point - that auto makers mostly ignored Miller and Valasek, according to the article - would not include senators (unless said senators build cars, of course).

Senators may not build cars, but they can (and are trying to) force auto makers to take security seriously.

The argument in this comment chain has been whether this problem could get the attention it needed without such a dangerous publicity stunt. The fact that automakers and lawmakers were convinced to take action by less dangerous demonstrations shows that this stunt was not necessary.


> How different automakers responded to different security issues isn't related to this article or discussion.

It is related to the article when the article discusses those responses.

> The fact that automakers and lawmakers were convinced to take action by less dangerous demonstrations shows that this stunt was not necessary.

One automaker and two senators. Even the automaker is dubious: Chrysler seriously expects people to believe that the only way to patch a bug that allows total control over a car's transmission and brakes - let alone the rest of the car - is via a USB stick, and that over-the-air patching isn't an option? Please. There are dozens more automakers and 98 more senators to convince. Hopefully the demo helps make that a better situation.

Meanwhile, a bunch of Dodges and Chryslers are driving around America totally susceptible to UConnect bugs, and a very large number of new cars on the road don't even have the most basic safety precautions (like, you know, not connecting the brakes and transmission to the Internet willy-nilly).

The convincing so far has been negligible. Hopefully that'll change soon, before someone with less-benevolent motives follows in Miller's and Valasek's footsteps.


And yet somehow the sins of the auto manufacturers in no way excuse the reckless behavior demonstrated in this video. It is possible for more than one person or entity in this story to be in the wrong. It's clear you've already made up your mind and have posted more than a dozen comments here defending the researchers, so I'm not sure what more can be said. The ends simply do not justify the means.


I agree that both sides are in the wrong, actually. I've even stated my disagreement with the methodology in several of those "dozen" comments.

Most of those comments, however, are only clearing up a specific misconception: the belief that the researchers jumped straight to a "live" test before trying tests in closed conditions. My idea of "right" v. "wrong" does not factor into doing my part to ensure that discussions on this matter are based on accurate information.

My "defense" of the researchers is more just identifying a lesser evil. Between the evil of a couple of nerds hacking one car and the evil of auto manufacturers willingly putting hudreds of thousands - if not millions - of innocent people in mortal danger, I'd sooner take the former (assuming that it actually makes a difference re: security priorities of auto manufacturers; in reality, even this particular demonstration is probably insufficient, though perhaps I'm just jaded).


First off, a "dangerous thing you can do" and "exploit" are not synonyms. So examples like anthrax attacks or home invasion are stupid and massively miss the point.

Secondly, nobody would give a fuck about this exploit if it had been performed in a controlled environment. The researchers knew it because they had done this kind of stuff before. Guess what, the cars did not become any safer!

This much should be obvious to anyone with a hacking mindset. The comments in this thread read more like "Moms Against Drunk Driving Bulletin Board" than "Hacker News".


Actually, home invasion is an "exploit" of home security (i.e. locked doors and windows).

Just because they do it from behind a screen doesn't make it a less culpable crime. Computers don't insulate you from ethics...

We put locks on our doors to prevent people from entering. They have always been exploitable, but we use the threat of laws to prevent it. Now we put locks in our software to prevent people from entering it (encryption). Somehow this generation believes that these locks are exempt from decency and law. It's sad that people think exposing vulnerabilities at any cost is righteous. There are plenty of people researching security in responsible ways. These two are not in that camp.

Have fun... it's no different than kicking your neighbor's door down and telling him to pay you for exposing his security flaw. It still makes you a jerk.


Using your logic, the government conducting covert biological tests on an unaware population (e.g. to test the spread patterns and inform their response to an actual event) is OK because terrorists/foreign governments could do much worse with (e.g.) weaponized anthrax. I somehow doubt that you would be OK with such actions by the government (though I could be wrong).


It's not either/or. You're presenting a false dilemma. You can demonstrate the problem without doing it where you put real lives in danger. The researchers acted recklessly.


> You can demonstrate the problem without doing it where you put real lives in danger.

Indeed. And according to the article, they already did. The manufacturers ignored them.


Well, no, the manufacturers didn't ignore them. They responded with a patch, but the researchers didn't like their response.

Still doesn't matter though. There are a million shades between quiet disclosure and outright stupidity that would still make headlines.

1) They could have let the "test dummy" in on what was going to happen, so they could give feedback as to when it was safe to do so.

2) They could have ensured constant two-way communication.

3) They could have done it when nobody was on the road.


> They responded with a patch, but the researchers didn't like their response.

It was my understanding that the patch was released in response to the live highway test, not the prior tests in controlled environments.

> They could have let the "test dummy" in on what was going to happen, so they could give feedback as to when it was safe to do so.

The article makes it sound like they did.

> They could have ensured constant two-way communication.

Indeed they could've. I agree with you about the recklessness of this particular element of the test.

> They could have done it when nobody was on the road.

Perhaps, and I agree that maybe they should've coordinated with local authorities (if they didn't already). However, between "do the test with vehicles on the road" and "don't do the test at all", I'd certainly pick the former.

Not to mention that the urgency involved with other vehicles on the road factors into the effectiveness of the demonstration.


Regarding the patch timeline, the article makes it clear they had been working on the patch for months before this went public.

> Miller and Valasek have been sharing their research with Chrysler for nearly nine months, enabling the company to quietly release a patch ahead of the Black Hat conference.

With respect to letting the driver in on it, it's pretty clear they withheld most information:

> Miller and Valasek refused to tell me ahead of time what kinds of attacks they planned to launch

And with respect to this:

> However, between "do the test with vehicles on the road" and "don't do the test at all", I'd certainly pick the former.

Oh look, another false dilemma. Between those two, I'd pick neither, and do the test responsibly.


> With respect to letting the driver in on it, it's pretty clear they withheld most information:

The reporter knew there were going to be attacks in the first place. There was also plenty of reason to believe said attacks could severely impair safety.

> Oh look, another false dilemma.

It's a trilemma; the option of "do the test responsibly" was already implied, so I merely provided the other outcomes. There's "perfect execution of the demonstration" and "no demonstration"; between them is a spectrum of perfection, on which this demonstration happens to lie somewhere near the lower-middle.

I don't disagree that the demo could've been done with more safety precautions, but the desire to do a "live" demonstration like this seems pretty reasonable, and even a demonstration lower on the perfection spectrum is preferable to the bottom end of "nothing at all".



