NHacker Next
The Failure of the Thermodynamics of Computation (2010) (sites.pitt.edu)
svnt 1 day ago [-]
This was published right before people started experimentally validating the Landauer limit. I am not sure why it hasn’t been taken down at some point as the evidence has accumulated:

2012 — Bérut et al. (Nature) — They used a single colloidal silica bead (2 μm) trapped in a double-well potential created by a focused laser. By modulating the potential to erase the bit, they showed that mean dissipated heat saturates at the Landauer bound (k_B T ln 2) in the limit of long erasure cycles.

https://www.physics.rutgers.edu/~morozov/677_f2017/Physics_6...

2014 — Jun et al. (PRL) — A higher-precision follow-up using 200 nm fluorescent particles in an electrokinetic feedback trap. Same basic physics, tighter error bars.

https://pmc.ncbi.nlm.nih.gov/articles/PMC4795654/

2016 — Hong et al. (Science Advances) — First test on actual digital memory hardware. Used arrays of sub-100 nm single-domain Permalloy nanomagnets and measured energy dissipation during adiabatic bit erasure using magneto-optic Kerr effect magnetometry. The measured dissipation was consistent with the Landauer limit within 2 standard deviations, on the very technology that forms the basis of magnetic storage.

https://www.science.org/doi/10.1126/sciadv.1501492

2018 — Gaudenzi et al. (Nature Physics) — Opens with:

The erasure of a bit of information is an irreversible operation whose minimal entropy production of k_B ln 2 is set by the Landauer limit. This limit has been verified in a variety of classical systems, including particles in traps and nanomagnets. Here, we extend it to the quantum realm by using a crystal of molecular nanomagnets as a quantum spin memory and showing that its erasure is still governed by the Landauer principle.

https://www.nature.com/articles/s41567-018-0070-7

The Landauer limit is not conjecture.
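For scale (my own back-of-envelope, not from any of the papers above): the bound is minuscule at room temperature. A quick sketch in Python, using only the exact SI value of the Boltzmann constant:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI)

def landauer_energy(temperature_kelvin: float) -> float:
    """Minimum heat dissipated to erase one bit: k_B * T * ln 2."""
    return K_B * temperature_kelvin * math.log(2)

e_joules = landauer_energy(300.0)        # room temperature
e_ev = e_joules / 1.602176634e-19        # convert J -> eV

print(f"{e_joules:.3e} J  (~{e_ev * 1000:.1f} meV) per bit at 300 K")
```

At 300 K this comes out to roughly 2.9e-21 J (about 18 meV) per erased bit, many orders of magnitude below what present-day logic dissipates per switching event, which is part of why these experiments are so delicate.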

qnleigh 13 hours ago [-]
I haven't finished reading this yet, but I don't think the author is saying that the Landauer limit for erasure is wrong. They're saying that there are other limits in computing beyond erasure. I think this makes sense; although reversible computing should be possible at zero temperature and infinite precision, realistic computers need some way to remove entropy that accumulates during the computation.

So I don't think their claim is in tension with any of the papers that you cite.

griffzhowl 22 hours ago [-]
Is the focus on the erasure of a bit, rather than writing a bit, just conventional or is there a significant difference between the processes?
svnt 20 hours ago [-]
Erasure is logically irreversible; writing a bit is not. When you erase a bit you compress the logical phase space of the closed system, so the missing information has to go somewhere: in this case, into a couple of very low-energy phonons shed into the larger environment.
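The logical-irreversibility point can be made concrete without any physics. A toy sketch (mine, not from the thread): erasure maps two distinct input states to one output, so no inverse function can exist, while NOT is a bijection and loses nothing:

```python
def erase(bit: int) -> int:
    """Reset-to-zero: both 0 and 1 map to 0 (not injective)."""
    return 0

def not_(bit: int) -> int:
    """Logical NOT: a bijection on {0, 1}, hence invertible."""
    return 1 - bit

# Count distinct outputs over all inputs: erasure compresses the
# state space (2 states -> 1); NOT preserves it (2 -> 2).
assert len({erase(b) for b in (0, 1)}) == 1   # information destroyed
assert len({not_(b) for b in (0, 1)}) == 2    # information preserved
```

Because erase has no inverse, the lost bit cannot stay in the logical state, and Landauer's argument is that it must show up as at least k_B T ln 2 of heat in the environment.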
griffzhowl 8 hours ago [-]
Ah, I thought writing a bit was irreversible, because after writing say 1, the previous state could have been a 0 or a 1. But in fact writing a bit should be thought of as the whole process "0 to 1" or "1 to 1", including the initial bit, so that the process is logically reversible. Is that right? Then what I had in mind as an irreversible process of writing would be equivalent to first erasing the bit and then writing the new one.
ogogmad 1 day ago [-]
I'm not sure, but isn't 2 standard deviations a bit low? Especially so for something that can be done in a lab. It seems that 2 SD is the minimum threshold for getting published. Can we be sure that these are properly reviewed?
spocchio 1 day ago [-]
Could it be that you've confused this with the number of standard deviations needed to falsify something? To show two things are different, we want them to be as many SD apart as possible. Here, on the other hand, the data agree _within_ 2 SD.
svnt 23 hours ago [-]
That was the limit of just one experimental approach that was peer reviewed and published in a major journal. As you can see there are many experiments validating the limit and none invalidating it.

The reality is that the Landauer limit is vanishingly small. I would encourage you to review the experiment methodology and see if you can come up with better, fundable methods.

svantana 1 day ago [-]
It's an interesting article but I fail to see the point they are trying to make. I always thought of reversible computing as a sort of platonic ideal that cannot truly exist in real life, but the principle can still be used to reduce waste heat and energy use. For example, it will be interesting to see if the chips from Vaire ever become practically useful:

https://vaire.co/

https://spectrum.ieee.org/reversible-computing
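The "platonic ideal" is at least mathematically well defined: reversible logic replaces gates like AND (which destroy information) with bijections such as the Toffoli gate, which computes AND onto a spare wire. A minimal sketch of the idea (illustrative only, nothing to do with Vaire's actual design):

```python
from itertools import product

def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
    """Controlled-controlled-NOT: flips c iff a and b are both 1."""
    return a, b, c ^ (a & b)

states = list(product((0, 1), repeat=3))

# Bijection check: 8 distinct inputs -> 8 distinct outputs,
# so the gate is invertible and erases nothing.
assert len({toffoli(*s) for s in states}) == 8

# It is also its own inverse: applying it twice restores the input.
assert all(toffoli(*toffoli(*s)) == s for s in states)

# With the ancilla wire set to 0, the third output is a AND b.
assert all(toffoli(a, b, 0)[2] == (a & b)
           for a, b in product((0, 1), repeat=2))
```

Since the gate is self-inverse, a circuit built from it can in principle be run backwards to uncompute intermediate results instead of erasing them, which is where the hoped-for energy savings come from.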

cwillu 24 hours ago [-]
> I always thought of reversible computing as a sort of platonic ideal that cannot truly exist in real life

It's been experimentally demonstrated. Practical or not, the effect is real.

qnleigh 13 hours ago [-]
Where? If you mean that a computation was performed without creating any entropy, I'm skeptical. Surely some energy input was needed, and some energy was dissipated.
smitty1e 1 day ago [-]
From the abstract, the idea is that we can continue to shrink: "...in a manner in which no thermodynamic entropy is created or passed to the surroundings."

The objection seems to be the "free lunch" assumptions being made about shrinkability.

"What Is TANSTAAFL?" https://youtu.be/ZrZUe7R44eA?si=oK2H1L9ha1zQhDOh

debatem1 1 day ago [-]
The author should write a followup article about how theory of computation has failed because nobody makes a Turing machine with enough tape.
oh_my_goodness 1 day ago [-]
Very clear intro to this notoriously slippery area.
ogogmad 1 day ago [-]
I think I arrived at the same suspicion independently. It came when I was trying to understand thermodynamic entropy as an instance of Shannon entropy (where the latter is defined abstractly as a property of probability distributions), which left me wondering where the thermodynamic probabilities came from: were they supposed to be subjective probabilities, or derived from ensembles? Then I recalled that entropy was originally defined non-probabilistically as dS = (1/T)δQ. Then I started reading about Boltzmann distributions as a bridge between Shannon entropy and entropy in the earlier sense (Clausius entropy). I concluded that instead of thinking about bits and bytes, it was much easier to think about gases and machines doing work, like a 19th-century engineer building, er, engines.

I am pretty ignorant of this field.
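The bridge being described here can actually be checked numerically. For a Boltzmann distribution, the Gibbs/Shannon entropy S = -k_B Σ p_i ln p_i agrees exactly with the thermodynamic identity S = (U - F)/T, where F = -k_B T ln Z. A sketch for a made-up two-level system (the energies and function names are mine, purely illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(energies, temperature):
    """Return (S_gibbs, S_thermo) for a Boltzmann distribution."""
    beta = 1.0 / (K_B * temperature)
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)                                  # partition function
    probs = [w / z for w in weights]
    u = sum(p * e for p, e in zip(probs, energies))   # internal energy
    f = -K_B * temperature * math.log(z)              # Helmholtz free energy
    s_gibbs = -K_B * sum(p * math.log(p) for p in probs)
    s_thermo = (u - f) / temperature
    return s_gibbs, s_thermo

s1, s2 = boltzmann_entropy([0.0, 1e-21], 300.0)
assert math.isclose(s1, s2, rel_tol=1e-9)   # the two definitions coincide
```

The agreement is no accident: substituting p_i = e^(-E_i/k_B T)/Z into -k_B Σ p_i ln p_i gives U/T + k_B ln Z = (U - F)/T identically, which is the formal link between Shannon's formula and Clausius-style entropy.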

cwillu 24 hours ago [-]
The effect has since been experimentally demonstrated.