The title (and the first comment on the link) makes it sound like it's a perpetual motion machine, but as far as I can tell it's more like a tiny heat-pump: energy is conserved, entropy increases. Still kind of neat though.
Of course they didn't beat thermodynamics, but still, assuming they can scale the tech up to a useful power, it's a light that's also a cooling element. That'd be more than kind of neat, wouldn't it?
Hmm, are you getting this from the paper? It's not obvious to me from the abstract. [0]
From the description given, the diode uses current to exchange ambient heat for emitted light. It cools the device, and heats whatever the light falls on, essentially acting as a remote heat pump. We can convert waste heat into a somewhat larger amount of "waste light", effectively increasing black-body radiation, which is great if light is more useful to us than heat.
Of course you don't get a free ride, and you can't make something cool without making something else hot (eventually), but conventional air conditioning is bound by the same limitation and is still plenty good enough for our purposes.
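To make the bookkeeping concrete, here's a toy sketch (numbers made up, only the structure matters): the electrical work and the heat drawn from the diode's own lattice both leave as light, so the diode ends up colder and whatever absorbs the light ends up warmer by the full amount.

    # Toy bookkeeping for the "remote heat pump" picture (illustrative numbers only).
    def led_heat_pump(electrical_in, light_out):
        """Return (heat pumped out of the diode, heat dumped wherever the light lands)."""
        heat_from_lattice = light_out - electrical_in  # drawn from the diode's own lattice
        heat_to_absorber = light_out                   # all of it arrives at the absorber
        return heat_from_lattice, heat_to_absorber

    # Hypothetical figures: 1 unit of electrical work in, 2 units of light out.
    print(led_heat_pump(1.0, 2.0))  # (1.0, 2.0): diode loses 1 unit, absorber gains 2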
In principle, what prevents me from covering my roof with these things, radiating my waste heat (less atmospheric absorption) into deep space, and thus cooling my house?
Again, this is supposing that the device could be scaled up to produce usable amounts of visible light while retaining the local cooling effect at room temperature, none of which has yet been demonstrated. Apparently their technique is to run the diode at as low a voltage as possible, which presents some obvious problems.
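For a rough sense of scale of the roof idea (my own back-of-envelope, nothing to do with this particular device; it assumes an ideal emitter and a clear-sky effective temperature of around 255 K):

    # Back-of-envelope for the roof idea, ignoring the LED entirely.
    # Assumptions (mine): ideal emitter, roof at 300 K, clear-sky effective
    # temperature of ~255 K, and a perfectly transparent atmospheric window.
    SIGMA = 5.67e-8                     # Stefan-Boltzmann constant, W/(m^2 K^4)
    T_roof, T_sky = 300.0, 255.0        # kelvin

    net_cooling = SIGMA * (T_roof**4 - T_sky**4)
    print(f"{net_cooling:.0f} W per square metre of roof")   # roughly 220 W/m^2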
Actually the abstract states it acts as a "thermodynamic heat engine", which would support what you're saying-- but given that it requires an input of electrical energy, I don't see how that could be right. They also liken it to "thermoelectric coolers". Possibly that was just mistyping or sloppy terminology, or possibly I'm missing something.
I think where the light is being sent is open space, so the relevant temperature is whatever temperature the local background radiation is at; a common local minimum being the cosmic background radiation.
> it's not where the photons end up that matters, it's the total of all the photons that are being sent in your (the lamp's) direction that counts.
This is what I was referencing when I said "the local background radiation." "A common local minimum" was supposed to convey how frequently you'd find that value when choosing a random spot in the known universe.
> On earth it's the temperature of the air, or the walls.
The temperature of the air only comes into play if it is emitting radiation at the frequency of light produced by the LED and at greater intensity. I do not see why this would be required.
> The temperature of the air only comes into play if it is emitting radiation at the frequency of light produced by the LED and at greater intensity.
Blackbody radiation. Air (everything) always emits light.
> I do not see why this would be required.
It's a fundamental law of thermodynamics that requires a heat source and a cold source in order to convert heat into some other form of energy.
The air around you is the cold source, and the LED lamp is the hot source. If the LED is colder than the air it will not work - the LED must be hotter (in this particular case they put it in an oven).
I think I understand what you're saying now. In vacuum, an object will radiate until it reaches absolute zero; in air, however, you're constantly being heated to the ambient temperature, meaning you can't radiate your way down until the air has lost its heat too.
The thing I think you're not taking into account is that this device has the effect of slightly increasing your black-body radiation. So you're at 1300K, radiating at 1300K, but we apply a little electricity and you start radiating a little more-- maybe you now look like you're at 1301K. Your heat hasn't actually increased, just apparent radiation, and as a consequence you cool down a little faster than your black-body profile would suggest.
So say you're at room temperature, radiating at room temperature, being warmed by the air, and not losing any heat. We apply current again, and you start radiating at a little over ambient temperature. The air is still warming you, but you're cooling a little faster than before, with the extra heat being deposited wherever the radiation is absorbed. To the air, you now feel cool.
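To put a rough number on the 1300K vs 1301K picture, treating the object as an ideal black body (Stefan-Boltzmann, purely illustrative):

    # Extra power an ideal emitter radiates at 1301 K compared with 1300 K.
    SIGMA = 5.67e-8                      # Stefan-Boltzmann constant, W/(m^2 K^4)
    p_1300 = SIGMA * 1300**4             # ~162 kW/m^2
    p_1301 = SIGMA * 1301**4
    print(f"extra {p_1301 - p_1300:.0f} W/m^2")   # ~500 W/m^2 of additional cooling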
Why should the absolute temperature of the ambient air have an effect on the process?
If you put a piece of metal in a very hot oven it will emit light without any electric power input whatsoever. That's one way to see that this is indeed not magic. The accomplishment here appears to be to get the same result at lower temperatures.
Lighting is hot. I'd love to have lights that weren't also little heaters. Or, if the cooling effect is big enough - imagine solid state cooling on devices. Who needs a fan when you can have a little cooling light?
I doubt that they'll be able to scale this up too much though, since it probably relies on an effect that only occurs below a certain threshold.
I am sorry that I cannot elaborate more on that for now (4am), but the cooling effects of light have been known for quite some time already. Please check laser cooling as a key phrase.
And this would make it worse. It cools the lamp by sending the heat outward in the form of light. It does not cool the recipient in any way, in fact it requires that the recipient already be cooler than where the lamp is.
The LED itself should cool down if it is producing more power in light than is being put in electrically. The energy is coming from the lattice phonons, and as a result the lattice cools down.
The paper [1], behind a paywall, has a graph showing its cooling power vs. current.
I think where the light is shining is open space, so the source temperature has to exceed whatever temperature the local background radiation is at; a common local minimum being the cosmic background radiation.
Yes, the LED is cooling down, but it will not cool down any lower than the ambient air. i.e. you can not use this as an A/C - you must already have a cold source in order to use it.
The cosmic background radiation is not exactly common. You'd have to be in space.
I can still see uses for it only cooling down to ambient air temperature.
I'm thinking of high-frequency transistors / ICs with heatsinks and a bank of these lights being attached.
It wouldn't be a primary cooling method, but if you needed a light on your device anyways then you could make use of this mechanism as well. Imagine backlights for LCDs in mobile devices. An iPad or Android tablet can easily reach 100 F when playing games, and for most cases that's warmer than ambient air temperature.
I don't have access to the original paper (haven't checked arxiv yet). Do the lights need to be at a certain temperature for this to work at all? I know that in similar experiments (condensates) the lattice vibrations only propagate at temperatures much lower than 70 F. At room temperature there is too much chaotic motion in the structure. I'd see that as a major limitation on scale and real-world application.
Yes, the LED is cooling down, but it will not cool down any lower than the ambient air. i.e. you can not use this as an A/C - you must already have a cold source in order to use it.
Where are you getting that?
A Peltier-effect refrigerator can certainly cool things lower than the ambient temperature. It does that by moving the heat from one place to another.
I'm not aware of anything fundamental that says that the heat energy has to be radiated as heat - it could just as easily be radiated as light. So why couldn't this cool things?
(Your argument that it could somehow work if you shone it on something that was cooler than the device implies time travel - the photons would need to know the destination is cooler before they are emitted.)
I don't understand. Why can't it cool down more than ambient air? You put in X amount of power, and you get out 2X amount of power as light -- therefore additional energy is being taken in as heat and converted to light. Say you direct that light AWAY from the area through mirrors or something so it doesn't heat anything nearby up. Then, by simple conservation of energy, you can show that X amount of energy IS being taken away from this local system and deposited elsewhere as light.
Unless you claim that the LED stops producing the >100% efficiency levels at room temperature, I don't see why you're claiming it can't cool anything down.
Second law of Thermodynamics. It takes additional energy to move heat up a temperature gradient, and you're not accounting for that.
You can't move heat from a cold source to a hot sink without paying for it somewhere.
edit: I don't mind a downvote now and then, but I would really prefer a discussion or explanation to a fire-and-forget downvote.
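For what it's worth, the standard statement of "paying for it" is the Carnot bound for any heat pump, W >= Q_cold * (T_hot - T_cold) / T_cold. A quick generic illustration, nothing specific to this LED:

    # Minimum work needed to move heat "uphill", from a cold spot to warmer
    # surroundings. Standard Carnot bound, nothing specific to this LED.
    def min_work(q_cold, t_cold, t_hot):
        return q_cold * (t_hot - t_cold) / t_cold

    # e.g. pumping 1 J out of a 280 K region into 300 K surroundings:
    print(f"{min_work(1.0, 280.0, 300.0):.3f} J")   # ~0.071 J of work, at minimum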
> Why can't it cool down more than ambient air? You put in X amount of power, and you get out 2X amount of power as light -- therefore additional energy is being taken in as heat and converted to light.
Light is a more highly ordered form of energy than thermal noise; it can be used to do work. If you could convert thermal noise into light and beam it away, you could use that as the cool side for a heat engine and have a perpetual motion machine.
> If you could convert thermal noise into light and beam it away, you could use that as the cool side for a heat engine and have a perpetual motion machine.
By "beam it away", I assume you mean into space or somewhere else you don't care about heat being. In this case, the cold side of the device isn't the real cold sink, it's just a proxy for the several million cubic lightyears of background-temperature near vacuum which will eventually wind up absorbing the light. Aliens on a distant planet may be able to extract power from the light; bully for them, but I already had to pay extra for shipping, so there's no funny business going on.
What makes you think that violates thermodynamics?
(For what it's worth, "perpetual motion machines" are not outlawed by classical thermodynamics, only those which do additional work. There's nothing in theory which prevents you from hooking a 100% efficient emitter and a 100% efficient receptor up with superconducting wire and letting it run forever, so long as you don't extract any energy from the cycle, for example by observing that it is taking place. There may be something in quantum physics which states that such a system would have to leak information about itself somehow, but I don't know enough to say.)
ars: can you back up that claim? What about the description makes you think it is not a heat pump? Such a thing is possible and exists: energy in results in energy (i.e. heat) transfer from one ___location to another, against an energy gradient.
From the description given in the article, it seems that the energy input is simply being used to move energy from one ___location to another – it just so happens that the energy is sourced as heat, and is deposited as light. This is really not much different from a heat pump, so I see no reason it can't move heat up a gradient.
A heated semiconductor light-emitting diode at low forward bias voltage V<kBT/q is shown to use electrical work to pump heat from the lattice to the photon field.
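For scale (my numbers, not from the abstract): at room temperature kBT/q is only about 26 mV, far below the energy of any photon the diode emits, so at such a bias almost all of each photon's energy has to come from something other than the electrical work, namely the lattice heat.

    # Scale of the "low forward bias" condition, at an assumed room temperature.
    k_B = 1.381e-23        # Boltzmann constant, J/K
    q = 1.602e-19          # elementary charge, C
    T = 300.0              # K

    print(f"kT/q at 300 K ~ {1000 * k_B * T / q:.1f} mV")   # ~25.9 mV

    # A hypothetical 0.5 eV photon carries ~20x the electrical energy qV delivered
    # per carrier at a 25 mV bias; the shortfall is drawn from the lattice heat.
    print(f"{0.5 / 0.025:.0f}x")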
When an electrical power input as low as 30 pico-watts is applied, the researchers consistently measured a 69 pico-watt optical power output, seemingly defying the law of conservation of energy. However, upon further investigation, the researchers showed that the system was able to derive extra energy from the thermal energy present in the environment. More specifically, the LED was able to convert the heat generated by the vibrations in the LED’s silicon lattice structure into additional emitted photons, increasing the LED efficiency to over twice unity.
Achieves greater than unity efficiency by simultaneously converting thermal and electrical energy into photons....Cools the surrounding environment as it draws thermal energy from it.
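Just to spell out the arithmetic behind "over twice unity", using the figures quoted above:

    # The arithmetic behind the quoted figures: 30 pW of electrical input,
    # 69 pW of measured optical output.
    electrical_in_pW = 30.0
    light_out_pW = 69.0

    print(f"efficiency: {light_out_pW / electrical_in_pW:.1f}x")        # ~2.3x
    print(f"heat drawn in: {light_out_pW - electrical_in_pW:.0f} pW")   # ~39 pW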
Energy input plus the heat already in the "thermodynamic system" stays in balance with energy output plus the heat remaining. If energy input (as electric current) is higher than energy output, then heat is added to the system. If energy input is lower than energy output, then heat is lost from the system.
All it takes is the ability to channel some of the light (energy output) away from the reservoir of heat to show that the reservoir of heat will decrease.
Ultimately you'd expect a temperature gradient at the system boundary to start transferring heat back into the system, per the first law of thermodynamics.