
> Does it though? Does it really?

Of course not. It's just protectionism and rent-seeking.

> I don't understand this move from HDMI Forum. They're handing a win to DisplayPort.

I don't think so, at least at this point. Most people don't have hardware that requires HDMI 2.1 in order to get full use out of them, and of those who do, not all of them use Linux and/or care about open source drivers.

Sure, that situation may change, and the HDMI Forum may walk back these requirements.

At any rate, for some reason DisplayPort has just not caught on all that much. You very rarely see them on TVs, and a good number of mid-/lower-end monitors don't have them either.

It's bizarre, really.




> At any rate, for some reason DisplayPort has just not caught on all that much.

DisplayPort won everything except becoming the physical connector for home cinema. Heck, even within those HDMI-exposing devices, DP won.

The vast majority of display drivers speak eDP. Few things actually implement HDMI, and instead rely on DisplayPort to HDMI converters - that's true whether you're looking at a Nintendo Switch or your laptop. Heck, there is no support for HDMI over USB-C - every USB-C to HDMI cable/adapter embeds an HDMI converter chip, as HDMI altmode was abandoned early on.

The only devices I know of with "native" HDMI are the specialized TV and AV receiver SoCs. The rest is DP because no one cares about HDMI.

However, seeing that home cinema is pretty much purely an enthusiast thing these days (the casual user won't plug anything into their smart TV), I wonder if there's a chance of salvation here. The only real thing holding back DisplayPort is eARC and some minor CEC features for AV receiver/soundbar use. Introducing some dedicated audio port would not only be a huge upgrade (some successor to TOSLINK with more bandwidth and remote control support), but would also remove the pressure to use HDMI.

With that out of the way, the strongest market force there is - profitability - would automatically drive DisplayPort adoption in home cinema, as manufacturers could save not only converter chips, but HDMI royalties too.


>home cinema is pretty much purely an enthusiast thing these days (the casual user won't plug anything into their smart TV)

Except a gaming console, a laptop, a roku, apple TV...

Every single person I know has some external media source plugged into their TV, even my tech illiterate mother.


You’d be surprised by the number of users who are satisfied with the built-in media experience.

I’d say it’s most likely a large majority. Google TV devices are common, but people whose TV already runs Android are not the main target for those until the TV gets old and out of date. Apple users on Samsung TVs might also get far with the built-in AirPlay support.

Heck, even among enthusiasts there is a strong push to use the built-in media features, as they often handle content better (avoiding mode changes, better frame pacing). Even I only use an external box because I was forced to by issues when relying on eARC.

Very few people plug their laptop into a TV, and laptops don't normally output HDMI natively. Some laptops have a dedicated port with a built-in converter, but modern laptops are otherwise USB-C, which only exposes DisplayPort.


I'm in this crowd. The TV apps work well enough and it's one less remote. The only thing I use the attached Chromecast for is to (rarely) mirror my phone screen.


> Introducing some dedicated audio port would not only be a huge upgrade

I'm not sure about that - suddenly there's a cost in board space and BOM, and they're not automatically linked together. Or do you just mean for audio output from TV to soundbar? I feel like USB would suffice for that if anyone could be bothered. Personally I use regular TOSLINK to a stereo amplifier and accept having another remote.


Heh, good point, USB 2.0 would absolutely suffice. You'd hardly need more than a standard audio profile either. Some TVs even support this already - recent Samsung models at least.

A specialized port could theoretically have a lower BOM cost through simpler silicon or port design, but USB 2.0 is free at this point so why bother.

> Personally I use regular TOSLINK to a stereo amplifier and accept having another remote.

The problem with TOSLINK is not only the remote scenario (which I do think is absolutely a necessary feature for any kind of adoption), but also the lack of bandwidth for uncompressed surround sound.

Large surround setups at home are uncommon these days, but soundbars with virtual surround are common, and some of us still manage to squeeze in a simple 5.1 setup.
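
For a rough sense of scale, some back-of-envelope numbers (sample rates and bit depths are my own assumptions, not something from this thread) for why TOSLINK tops out at compressed surround while even plain USB 2.0 has headroom to spare:

    # Back-of-envelope audio bandwidth math (illustrative assumptions):
    # consumer S/PDIF/TOSLINK framing only carries two LPCM channels, so
    # anything beyond stereo goes over it compressed (Dolby Digital/DTS),
    # while USB 2.0 could trivially fit it uncompressed.

    def lpcm_mbits(channels, sample_rate_hz, bits_per_sample):
        """Raw (uncompressed) LPCM payload in Mbit/s."""
        return channels * sample_rate_hz * bits_per_sample / 1e6

    stereo       = lpcm_mbits(2, 48_000, 24)    # ~2.3 Mbit/s, fits S/PDIF
    surround_5_1 = lpcm_mbits(6, 48_000, 24)    # ~6.9 Mbit/s
    surround_7_1 = lpcm_mbits(8, 192_000, 24)   # ~36.9 Mbit/s

    USB2_HIGH_SPEED = 480.0                     # Mbit/s raw signalling rate

    print(f"2.0 @  48 kHz/24-bit: {stereo:5.1f} Mbit/s")
    print(f"5.1 @  48 kHz/24-bit: {surround_5_1:5.1f} Mbit/s")
    print(f"7.1 @ 192 kHz/24-bit: {surround_7_1:5.1f} Mbit/s")
    print(f"USB 2.0 high speed  : {USB2_HIGH_SPEED:5.1f} Mbit/s")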


> The only real thing holding back DisplayPort is eARC and some minor CEC features for AV receiver/soundbar use. Introducing some dedicated audio port would not only be a huge upgrade (some successor to TOSLINK with more bandwidth and remote control support), but would also remove the pressure to use HDMI.

USB-C

I mean think about it

USB-C/DP alternate mode is good enough as upstream for most use cases (including consoles), has some additional future feature potential, and still has some USB bandwidth left over for various things, including CEC.

For eARC-like use cases (i.e. sometimes audio+video upstream, sometimes audio downstream) you have a few choices (one would need to be standardized):

- always create a DP alt mode channel upstream and use audio over USB for downstream; technically that can already work today, but getting audio latency synchronization and similar right might require some more work

- switch the DP alt mode connection direction, or have some audio-only alt mode, which either requires an extension of the DP alt mode standard or a reconnect. But I think the first solution is just fine.

As an added benefit, stuff like sharing input devices becomes easier, and things like Roku TV sticks could save on some royalties ... which is part of where the issue is: there is a huge overlap between big TV makers and HDMI shareholders. I mean, have you ever wondered why most TVs don't have a single DP port even though that would be trivial to add?

Which is also why I think there is no eARC-like standard for USB-C/DP alt mode: it only matters for TVs, and TVs don't have DP support.

Honestly, I believe the only reason TVs haven't (very slowly) started to migrate to USB-C/DP alt mode is that most of their producers make money with HDMI.

And lastly, there is some trend to PCIe-everything in both consumer and server hardware. In the consumer segment it has been somewhat limited to the "luxury" segment, i.e. Thunderbolt, but with USB4 it is slowly ending up in more and more places. So who knows, PCIe-based video might just replace both of them (and go over USB-C).


> And lastly, there is some trend to PCIe-everything in both consumer and server hardware. In the consumer segment it has been somewhat limited to the "luxury" segment, i.e. Thunderbolt, but with USB4 it is slowly ending up in more and more places. So who knows, PCIe-based video might just replace both of them (and go over USB-C).

Thunderbolt/USB4 is not PCIe. It's a transport layer that can run multiple applications at once, sharing bandwidth based on use. This is opposed to USB-C Alternate Mode, where pins are physically reassigned to a specific application, which uses the pins regardless of whether it needs the bandwidth.

PCIe is then one of the supported applications running on top of the transport.
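
A toy model of that difference (the figures and the 4K60 estimate are my own assumptions, not from the comment above): alt mode statically hands wire pairs to one application, while USB4 tunnels DP/PCIe/USB over one link and re-divides whatever is left.

    # Toy comparison (illustrative numbers): static pin assignment vs tunnelling.

    USB4_LINK_GBPS  = 40.0       # USB4 Gen3x2 link, shared dynamically
    DP_HBR3_4_LANES = 4 * 8.1    # alt mode: all four high-speed pairs become DP
    UHD_60_STREAM   = 12.5       # rough 4K60 8-bit RGB rate incl. blanking

    def usb4_leftover(video_gbps):
        """Capacity left for PCIe/USB tunnels while a DP stream is tunnelled."""
        return USB4_LINK_GBPS - video_gbps

    # Alt mode (4-lane): DP owns 32.4 Gbit/s whether the display needs it or
    # not, and only USB 2.0 remains for data.
    # USB4: the same 4K60 display leaves ~27.5 Gbit/s for other tunnels, and
    # the split moves as the display's needs change.
    print("alt mode: DP pins =", DP_HBR3_4_LANES, "Gbit/s, fixed")
    print("USB4    : spare   =", usb4_leftover(UHD_60_STREAM), "Gbit/s, dynamic")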


I know, but this isn't relevant to the argument; if anything, it's in favor of some future protocol replacing HDMI/DP/USB-C+DP alt mode while using the USB-C connector.


I was just pointing out specifically that there is no such thing as PCIe-based video - nor is there any need for that.

Support for USB4/Thunderbolt DP will proliferate, but there is still benefit to a DP altmode as it's free to implement (the host controller just wires its existing DP input lanes directly to the USB-C connector) and allows for super cheap passive adapters.

If USB-C ends up becoming the standard video connector as well, it will most likely be DP altmode as you then only need a cheap USB-C controller to negotiate the mode.

There isn't really any pressure to invent a new protocol. https://xkcd.com/927/


> Most people don't have hardware that requires HDMI 2.1 in order to get full use out of them, and of those who do, not all of them use Linux and/or care about open source drivers.

Arguably true, but I think that is changing all the time: there is a push towards open-source drivers regardless of whether the average user knows/cares what that is, along with resolutions and refresh rates increasing.

I was affected by the HDMI Forum's decision after buying an off-the-shelf 4K 120Hz monitor which refused to work at that resolution/refresh rate on an HDMI cable.

I was not expecting an arbitrary decision affecting software to be the cause instead of a hardware problem - which took me a while to figure out.

Now I know that if I want to use my hardware to its full capacity, I need DisplayPort in the future.


> off-the-shelf 4K 120Hz monitor which refused to work at that resolution/refresh rate on an HDMI cable.

I run a 4K 144Hz monitor over HDMI. Are you sure you don't just need a better cable?


My HDMI cables work at 4K 120Hz with the same monitor on an NVidia card using closed-source drivers, but not with AMD's open-source drivers, because of the issue in the article.
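
For context on why this is an "HDMI 2.1 problem" at all, some rough bandwidth math (blanking overhead and bit depth are my assumptions, and this ignores 4:2:0/DSC tricks): 4K120 simply doesn't fit in HDMI 2.0's TMDS link, so it needs the 2.1 FRL signalling that the open driver isn't allowed to implement.

    # Rough math for why 4K@120 requires HDMI 2.1 (assumed ~10% blanking,
    # 8-bit RGB, no chroma subsampling or DSC).

    def link_rate_gbps(h, v, hz, bits_per_pixel, blanking=1.10):
        """Approximate required link rate in Gbit/s, including blanking."""
        return h * v * hz * bits_per_pixel * blanking / 1e9

    HDMI_2_0_TMDS_MAX = 18.0   # Gbit/s
    HDMI_2_1_FRL_MAX  = 48.0   # Gbit/s

    needed = link_rate_gbps(3840, 2160, 120, 24)   # ~26 Gbit/s
    print(f"4K120 8-bit RGB needs ~{needed:.1f} Gbit/s "
          f"(HDMI 2.0 tops out at {HDMI_2_0_TMDS_MAX}, 2.1 FRL at {HDMI_2_1_FRL_MAX})")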


> I don't think so, at least at this point. Most people don't have hardware that requires HDMI 2.1 in order to get full use out of them, and of those who do, not all of them use Linux and/or care about open source drivers.

I do, but this hardware doesn't have DisplayPort. I switched from Nvidia to AMD specifically for the open source Linux drivers, so I'm quite mad at the HDMI forum for this.

On the other hand, my next TV likely won't have DisplayPort, either, because almost none of them do, so it is indeed questionable whether this is going to lose them any mind share.


>Of course not. It's just protectionism and rent-seeking.

Don't know why you're being downvoted, but it's true. Especially when you see that the HDMI standard was developed by the cartel of TV manufacturers and major movie studios[1] when DVI and DisplayPort already existed but didn't generate royalties or have DRM.

Despicable standard. There wasn't even a standards "war" like VHS vs Betamax, SD vs Memory Stick, or USB vs FireWire to say that HDMI won over DisplayPort; it was simply shoved down consumers' throats, since every TV, media player and games console shipped with that port alone, as they were manufactured by the same cartel that developed the HDMI standard.

So much for the so-called "free market".

[1] https://en.wikipedia.org/wiki/HDMI#History


To be fair (and note that I think of the HDMI Forum as the bad guys here):

HDMI was not an alternative to DisplayPort; DisplayPort did not exist yet. It was an alternative to DVI. Really, HDMI is DVI with a sound channel and DRM. And as much as I dislike the HDMI Forum, I can see the benefit here.

As to HDMI vs DisplayPort... I have no idea why you don't see more DisplayPort. VESA has a proven track record as the nicer standards body, and DisplayPort is a better system. Probably just inertia at this point.


As a media tech guy (running the media tech department of a university, which includes a DCI-compliant cinema): absolutely everybody hates HDMI. It is unreliable as hell, both physically and as a protocol. It tries to be too much to too many people, and most devices, including expensive "pro" gear, include unchangeable random weirdness like ignoring EDIDs or forcing them onto you, that is documented nowhere and you can only find these things out when you buy it.

Add to that the fact that consumers/users can break the picture/sound in 100 different ways on their devices and you get a veritable support nightmare.

I wish it was just DVI+ but it does so much more.


Isn't this why VGA is still widely used everywhere? It always just works, no matter what, even when the connector or pins are damaged, since there's no digital handshake or error correction, just a basic analog pipeline.


I don't know, at least here (Europe) VGA has pretty much died out in all but legacy applications. The true pro format would be SDI using BNC connectors.

But I guess HDMI is going to be replaced by USB-C in the long run. Especially since the "everything-connector" also doing Video makes more sense than the video-connector also doing everything.


> unchangeable random weirdness like ignoring EDIDs or forcing them onto you, that is documented nowhere and you can only find these things out when you buy it.

FWIW: https://www.store.level1techs.com/products/p/5megt2xqmlryafj...

Sadly this is not entirely an HDMI-specific problem either; he has a DisplayPort feeder too. DisplayPort also had many problems with disconnects/sleep states for many years, especially surrounding EuP compliance/EuP deep sleep mode. I would say DisplayPort monitors weren't relatively bulletproof until the G-Sync Compatible generation finally rolled around in 2019-2020.
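
If you're fighting that kind of weirdness, the EDID itself is easy enough to dump and sanity-check. A minimal sketch (assuming a Linux box, where connectors expose it under /sys/class/drm; edid-decode is the proper tool, this just shows the idea):

    # Minimal EDID sanity check (Linux-specific assumption: connectors under
    # /sys/class/drm expose a raw "edid" file; names differ per machine).
    from pathlib import Path

    EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

    def check_edid(path):
        data = Path(path).read_bytes()
        if len(data) < 128 or data[:8] != EDID_HEADER:
            return f"{path}: no/garbled EDID ({len(data)} bytes)"
        # Manufacturer ID: three 5-bit letters packed big-endian in bytes 8-9.
        raw = int.from_bytes(data[8:10], "big")
        vendor = "".join(chr(((raw >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
        checksum_ok = sum(data[:128]) % 256 == 0
        return f"{path}: vendor={vendor}, checksum {'OK' if checksum_ok else 'BAD'}"

    for edid in sorted(Path("/sys/class/drm").glob("card*-*/edid")):
        print(check_edid(edid))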


I think interface change fatigue is real. DisplayPort has been around, but there wasn't a compelling reason to use it when displays had HDMI ports.

People are also looking to USB-C as the next iteration in display connectivity because it does "all of the things" from a consumer's perspective.


Most people will use the path of least resistance.

Many people/organizations still use VGA and ketchup-mustard-onion cables to this day if they still do the job, let alone HDMI.


Shouldn't it be mayo if you are going with a condiment theme?


Very fair point and duly noted!


At least video over USB-C is DisplayPort, so there's hope.


I used a plasma panel, vintage 2004 (retired in 2016 with no noticeable burnin), that had a DVI connector with HDCP support. If it had not supported HDCP, I could not have connected my cable box to this panel.


Works exactly as a free market is designed to. The strong and coercive win. That's what market dynamics are really about. Monopolies form easily and naturally unless regulation stops them.


Actually these monopolies are enforced by the state via IP laws. Without IP laws any upstart could reverse-engineer the protocols and provide an implementation with fewer limitations. But of course free market enthusiasts like to ignore that part and only rant against the government when it protects consumers from companies.


There are a huge number of free market types who are against IP laws and they're a big part of computing culture. Names like the FSF [0] spring to mind. A market can't expose a fraction of its potential if people are banned from competing because someone else got there first. The only reason the software world did so well was because the FSF managed that inspired hack of the copyright system known as the GPL that freed up the market, in fact.

[0] https://www.fsf.org/


Yes, if there were no IP anyone could cheaply make a single-digit-nanometer-node custom ASIC to provide the alternative 4K-capable video hardware implementation. /s


Anyone? No, probably not. Some enterprising company in Shenzhen, who would sell the thing for $.25 a piece due to fierce competition driving prices down to cost of materials? Now that's more likely.


Single-digit-nanometer-node custom ASICs aren't really required to achieve this. Although there is higher latency, this can be (and has been) done on FPGAs; a company I worked for designed and built custom AVOD systems for private jets and helicopters that way.


One could argue that at least this specific tactic would not be possible without the state granting a monopoly on "intellectual property". Without that, nothing would hinder AMD from just shipping their already existing implementation.


> One could argue that at least this specific tactic would not be possible without the state granting a monopoly on "intellectual property".

Microsoft ? RIAA ? MPAA ? Google (AI, books)


The irreducible state role in a free market is to enforce property rights.

Almost all free market fans I have seen think that this should extend to some notion of intellectual property.


I think the standard answer to your point is that you can recognise "intellectual property" without granting a (limited) monopoly. There are plenty of proposals floating around for copyright and patent reform that curtail or replace the ability of the creator/owner to unilaterally set the price and decide who can license the material and how they can use it.


Thing is, HDMI forum is not a monopoly. It's a literal cartel of a few corporations and other cartels. Other cartels pushing for it include MPAA.


The monopoly here is HDMI LA, who provides the required licenses.

> HDMI® Licensing Administrator, Inc. (HDMI LA) is the agent appointed by the HDMI Forum to license Version 2.1b of the HDMI Specification and is the agent appointed by the HDMI Founders to license earlier HDMI Specifications.

https://hdmi.org/adopter/index

See also: https://www.symmetryelectronics.com/blog/what-are-the-licens...


Mostly I think the person who mentioned that this is free markets working as free markets is largely right. You can't defend free markets by arguing that property rights being enforced by the state somehow changes the free-market outcome.

I also think criticizing intellectual property on the grounds of granting a monopoly is muddling the language. If I write a novel I have exclusive rights to the novel. But I am not the only supplier of mediocre novels. I don't have a monopoly in a relevant market.

None of this contradicts the point that IP and patent rights are in desperate need of reform, or that they can play a central role in abusing a monopoly position (e.g. https://en.wikipedia.org/wiki/Orange-Book-Standard).

Edit: Maybe my post was unclear: I would agree that IP should be abolished. But this is not a position I have seen classical market liberals and other free market advocates take. Instead, they tend to favor strengthening all forms of property rights. If I am wrong on this point, I'd be happy to read some examples.


I don't think we disagree; I would just like to add that this subtlety about "monopoly" depends on the (subjective) existence of substitute goods. Maybe as a consumer I just want any old book to read, and so an individual author has no market control. On the other hand you can imagine, say, a technology that's practically or actually unavoidable as an input for a particular business (suppose HDMI had no viable alternative); then the IP holder could extract super-normal profit and make the economy less efficient.


Monopolies form easily? That's funny, you should try and start one, seems quite profitable.

Seriously though, this is an oft repeated fallacy, and frankly irrelevant to the discussion.

IP laws are the actual culprit in facilitating the apparatus of the state for the creation of monopolies. Most people seem to embrace this double-think that IP laws are good while monopolies are bad. You simply don't get monopolies without IP laws. IP laws are the ultimate king maker and exclusively exist to perpetuate profits of the IP owner.

If your proposition of regulation is to disband the patent offices and repeal the copyright act, my sincere apologies.


Getting rich is easy. You just need rich parents.

Two things can be true at the same time.

The truth is, if you are in the position to make the step towards becoming a monopolist, especially in a new market, it is not impossible to do so (and by the rules it should be).

Getting to that position isn't easy, though.

But from a consumer standpoint the only thing that matters is whether you have monopolists or not — we don't care how hard it was for them to become one, other than that it might change the number of monopolists that force their crop down our throats.


Without imaginary property, AMD would have signed a similar contract - they would rather focus on their own products rather than reverse engineering the HDMI standards to create their own implementation. At which point AMD would be in the same position, unable to reverse engineer HDMI or adopt solutions from other companies who did.

Imaginary property laws most certainly encourage and facilitate monopolies and collusion, but they are not necessary to the dynamic. Such laws are essentially just the norms of business that companies would be insisting on from other businesses anyway, for which it's much more lucrative to assent and go along with rather than attempt to defect and go against them.

Another example of this effect is the DMCA - the tech giants aren't merely following its process verbatim, but rather have used it as basis for their own takedown processes with electively expanded scope - eg why we see takedown notices pertaining to "circumvention" code, or the complete unaccountability of Content ID. Google and Microsoft aren't significantly hurting themselves by extralegally shutting down a tiny contingent of their customers, meanwhile the goodwill they garner from other corporations (and possible legal expenses they save) is immense. The loser is of course individual freedom.


If only the free market was even more free, all our problems would be solved!


The invisible hand of the free market will come and fix all the things! /s

If you talk to people who still subscribe to that notion, it quickly becomes clear that they value their minuscule chance to win the capitalist lottery more than the wellbeing of the many — the idea that markets balance everything to the advantage of everybody then seems to be just an excuse to be egoistic and without any care for others.

Don't get me wrong, nobody has to care for others and I am not going to be the person to force you, but if you don't care about others please stop pretending you are doing it for the greater good.


You're conflating several schools of thought. Utilitarianism, which appears to be your basis for defining ethical behavior, underlies this reasoning behind compulsory government action.

This line of thinking is often repeated in election cycles and mindless online discussions, with mantras like "We justify doing something heinous because it serves 'American Interests'" or "We'll coercively tax one group and redistribute funds to another because they'll do something dubiously for the 'greater good'".

However, Utilitarianism is not a foundational principle of libertarian ideology. In fact, libertarianism often refutes and rejects it as applied to governments. It doesn't prioritize egalitarianism or rely on public opinion when defining citizens' rights.

The argument for a free market unencumbered by protectionist policies isn't about the greater good; rather, it's an argument for an ethical government grounded in first principles.

The "greater good" argument tends to crumble under close examination and logical scrutiny. Its claims on reason collapse as soon as you scrutinize them more deeply.

Notably, Utilitarianism has been the basis for nearly all modern-day dictatorships, which rely on a monopoly of violence to enforce the "greater good".

It's possible to support free markets while still caring for others – this is called altruism. It's similar to utilitarianism but without coercion and fallacies.


I studied philosophy and ethics so you can safely assume I know my definitions. But that does not matter, as you apparently failed to read what I wrote.

Could you please paraphrase my "greater good argument" that crumbles under close examination? An examination you somehow failed to provide? Maybe you hoped people would be too impressed by your use of words to notice that you failed even to provide an argument against the strawman you created?

No offense, but the way you write makes you sound like a 15-year-old who figured out that using smart words makes you sound smart, without any deeper understanding of or regard for the concepts at hand or the arguments made. If you want to show some argument is wrong you can't simply claim it is; you need to demonstrate it, ideally using the very logic and examination you seem to value so highly.


My original post was intended to clarify why I believe Libertarian ideology is distinct from and incompatible with Utilitarianism, particularly since in your response you presented the concept of the greater good as a core principle of Libertarian ideology. This is quite surprising given your claim to have "studied philosophy and ethics".

To address this misunderstanding, let me break down the logical fallacies I alluded to earlier:

- The "tyranny of the majority" problem: Since happiness is determined by the number of individuals, a simple majority can impose its will on the minority, potentially denying them their rights or freedoms.

- The "moral arithmetic" fallacy: This assumes that individual well-being can be measured and added up like numbers in an equation, ignoring the complexities of human experience and the difficulties of making such calculations.

- The "majority rules" fallacy: This implies that whatever the majority wants is automatically just or right, without considering the potential for mob rule, manipulation, or coercion.

- The "ignore individual rights" fallacy: By prioritizing the greater good over individual interests, Utilitarianism may lead to the trampling of human rights and dignity.

No offense, but it's worth noting that a more nuanced understanding of philosophy and ethics might be beneficial for more accurate representations of complex concepts.


I will defend utilitarianism, since I like it a lot and all your arguments against it are bad.

- The "tyranny of the majority" problem is a problem of direct democracy, not utilitarianism. Happiness in utilitarianism is determined not by a number of individuals, but by all individuals and perfect utility function must take into account both majorities and minorities and create consensus. This will only fail if majority and minority have directly opposed interests, but in this case overall good is still better this way (you don't want to deny majority people their rights too in favor for minorities).

- The "majority rules" fallacy is a problem of democracy overall. Every democracy system is vulnerable to this, not only utilitarianism. But then again, perfect utility function should take into account people's desire to not be fooled, so there's that.

- The "ignore individual rights" fallacy is the same as "tyranny of the majority". Utility function takes into account interests of all individuals and tries to create the best possible consensus.

- The "moral arithmetic" fallacy is the best one here, since it's actually close to the truth. You can't really create a perfect utility function, but you don't need to. You can create imperfect one and improve it later with feedback and democracy mechanisms. With time imperfect utility function will get closer and closer to perfect one. Profit maximizing utility function can't be calculated too, but corporations handle it just fine. But if you're not blind, you can see that profit maximizing utility function leads to a lot of real people suffering (climate change, wars, hunger, poverty and many many more) while leading to profit maximization (alignment problem).


Again: explain which argument about the greater good I supposedly made.

Ideally before you go off on a totally unrelated tangent again. Not trying to be mean here, but if you want others to understand why I am wrong a good start is to explain what my argument was.

Because it certainly wasn't: "conflating the concept of the greater good as a core principle of Libertarian ideology". But maybe to the reader your amount of projection onto my very simple statement is in itself telling.


"the idea that markets balance everything to the advantage of everybody then seems to be just an excuse to be egoistic and without any care for others."

There are two problems here: 1. You misstate and mischaracterize free-market ideology as having the pretense of being to the "advantage of everybody". It's potentially a byproduct but definitely not a first principle. 2. You cast a judgment of value on egotism and selfishness as being the true motivators behind free market proponents. Selfishness and egotism are human characteristics expressed across all ideological spectrums.

"Don't get me wrong, nobody has to care for others and I am not going to be the person to force you, but if you don't care about others please stop pretending you are doing it for the greater good." - Here is where you conflate utilitarian with libertarian ideology, especially as you label those who disagree with your view as pretenders and posers for the greater good, again misstating the position of your ideological opponent and then proceeding to cast a judgment of value on the positions they don't actually hold.

Not trying to be mean here, but have you thought about getting some reading comprehension lessons? It could really help you understand the things that you read, as well as give you a more well-rounded view of things.


Haha. By the nonexistent gods.

Have you ever considered that I was talking about specific individuals who muttered those things at me, instead of reading everything I wrote as a paragraph from a political reader? I have no close relationship with Libertarianism, as where I come from it is not very widespread as a political ideology and more of a curiosity that gets mentioned at the fringes.

So what I criticized here are the things people told me in online discussions as a defense of why the system we have is okay. I did not ask them which ideology they subscribe to, but I am pretty sure it was not some pure textbook form of Libertarian ideology. So I am still curious how my criticism of an observed phenomenon made you jump directly to the defense of Libertarian ideology, which I neither thought about nor mentioned.

Additionally: I can start to understand what you're talking about once you start at the beginning instead of diving straight into some sort of convoluted US-internal political debate. Rephrasing what you thought the other person said and why precisely it is wrong is a good habit to keep before writing hundreds of lines attacking them on what you think they said.


Ok. This is even worse. You shouldn't take your misunderstandings from previous discussions with other people and make generalizations about everybody else you meet in new discussions, especially if you are using an incendiary tone.


Linux gamers with modern TVs wanting VRR.

Maybe that's still a tiny amount, but it's likely the most common 'need'.


> Most people don't have hardware that requires HDMI 2.1 in order to get full use out of them.

Most people maybe not, but a simple 4K TV that can do >60 FPS fits that criterion. Those aren't that rare anymore.


> At any rate, for some reason DisplayPort has just not caught on all that much. You very rarely see them on TVs, and a good number of mid-/lower-end monitors don't have them either.

I suspect all the nice features that make DisplayPort a better standard are harder to implement cheaply, e.g. chaining.


I have fewer DisplayPort devices now than 8 years ago.


Don't forget about USB-C. Video over USB-C is almost always DisplayPort in disguise.


What about latency? Is it on par or at least in the same league compared to direct connection? Not an issue for most people, but gamers could disagree if it is too high.


Performance is identical. DisplayPort Alternate Mode (which is what most displays use) isn't transmitting video data over USB; it's agreeing to use some of the high-speed wire pairs in the cable to transmit DisplayPort data instead of USB.


This is very interesting to know, thanks!


It's still a direct connection, so there's nothing to compare there.


Maybe that can change if USB4 sneaks in and supplants HDMI in those devices, since it can route both HDMI and DP.


Is HDMI over USB even a thing that any real devices support? But yeah, demand for mobile phone support might force TV manufacturers to adopt DP over USB.


> Is HDMI over USB even a thing that any real devices support?

No. A spec for HDMI Alternate Mode was written, but almost nobody (possibly nobody at all?) implemented it, and it was eventually withdrawn.

https://arstechnica.com/gadgets/2023/01/hdmi-to-usb-c-spec-a...


It cannot route HDMI, partly because HDMI is built upon antiquated principles and doesn't really fit alongside more modern protocol designs. USB4 would need to be entirely redesigned to tunnel native HDMI.

Having a DP to HDMI converter on one end though, that's easy.


> HDMI is built upon antiquated principles

I'm interested in learning more; in what way are they antiquated?


HDMI uses a digitized form of the traditional TV signals. The format of the transmitted data still depends on the parameters that defined traditional TV signals, like video frame frequency, video line frequency, vertical and horizontal retrace intervals and so on. Such parameters are no longer essential for digital television, and there is no longer any need to constrain the transmission of video signals with them.

DisplayPort uses a typical communication protocol that can carry arbitrary data packets, not much different from the protocols used on USB or Ethernet.
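
To make that concrete, here's the standard 1080p60 raster worked out (the timing figures are the published CTA-861 numbers; the percentage is just arithmetic): over HDMI the full raster is still clocked out, blanking and all.

    # The "digitised analogue raster" point in numbers: standard 1080p60 timing.
    H_ACTIVE, H_TOTAL = 1920, 2200
    V_ACTIVE, V_TOTAL = 1080, 1125
    REFRESH_HZ = 60

    pixel_clock = H_TOTAL * V_TOTAL * REFRESH_HZ              # 148,500,000 Hz
    active_share = (H_ACTIVE * V_ACTIVE) / (H_TOTAL * V_TOTAL)

    print(f"pixel clock : {pixel_clock / 1e6:.1f} MHz")
    print(f"active video: {active_share:.1%} of the transmitted raster")
    # DisplayPort instead packetises the active video and pads the link with
    # idle/secondary packets, so the raster timing isn't baked into the wire.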



