\"Writing.Com
*Magnify*
    December    
SMTWTFS
19
20
21
22
23
24
25
26
27
28
29
30
31
Archive RSS
SPONSORED LINKS
Printed from https://shop.writing.com/main/books/entry_id/1081275-Disorder-Up
Image Protector
Rated: 18+ · Book · Personal · #1196512
Not for the faint of art.
<<< Previous · Entry List · Next >>>
December 18, 2024
Disorder Up!
Getting back to science today, here's one from Quanta for all the opponents of nihilism out there.

What Is Entropy? A Measure of Just How Little We Really Know.
Exactly 200 years ago, a French engineer introduced an idea that would quantify the universe’s inexorable slide into decay. But entropy, as it’s currently understood, is less a fact about the world than a reflection of our growing ignorance. Embracing that truth is leading to a rethink of everything from rational decision-making to the limits of machines.


It makes all kinds of sense that it took a French person to figure this out.

Life is an anthology of destruction. Everything you build eventually breaks. Everyone you love will die. Any sense of order or stability inevitably crumbles. The entire universe follows a dismal trek toward a dull state of ultimate turmoil.

That sounds more like a French (or possibly Russian) philosophy book than science, but I assure you, it's science (just without the math). As I've said before, philosophy guides science, while science informs philosophy.

To keep track of this cosmic decay, physicists employ a concept called entropy.

Keeping track of decay may sound like a paradox, and, in a way, it is.

Entropy is a measure of disorderliness, and the declaration that entropy is always on the rise — known as the second law of thermodynamics — is among nature’s most inescapable commandments.

That's slightly simplified. The Second Law of Thermodynamics states that in an isolated system, entropy can never decrease. It can remain constant, just never decrease. And it specifies an isolated system, which the Earth most definitely is not; we have a massive energy source close by (in cosmic terms), at least for now.
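Not from the article, just me playing with numbers: if you want to see that statistical one-way street for yourself, here's a toy sketch. A hundred particles hop at random between the two halves of an isolated box, and entropy is counted as the log of the number of ways to realize the current left/right split.

```python
import math
import random

# Toy "isolated system": N particles, each in the left or right half of a box.
N = 100
left = N  # start perfectly ordered: every particle on the left

def entropy(n_left: int) -> float:
    # Log of the number of arrangements that produce this left/right split.
    return math.log(math.comb(N, n_left))

random.seed(1)
for step in range(2001):
    # Pick a particle at random and let it hop to the other side.
    if random.randrange(N) < left:
        left -= 1
    else:
        left += 1
    if step % 400 == 0:
        print(f"step {step:4d}  left={left:3d}  S={entropy(left):6.2f}")
```

The split drifts toward fifty-fifty and then hovers there, not because un-mixing is forbidden, but because mixed arrangements outnumber ordered ones so overwhelmingly that you never see it happen.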

Order is fragile. It takes months of careful planning and artistry to craft a vase but an instant to demolish it with a soccer ball.

I've also noted before that creation and destruction are actually the same thing. What we call it depends on our perspective at the time. Did you create a sheet of paper, or did you destroy a tree? Well, both, really, but maybe you needed the paper more than you needed the tree, so you lean toward the "creation" angle.

We spend our lives struggling to make sense of a chaotic and unpredictable world, where any attempt to establish control seems only to backfire.

Who's this "we"?

We are, despite our best intentions, agents of entropy.

At the risk of repeating myself once more, it could well be that the purpose of life, if such a thing exists at all, is to accelerate entropy.

But despite its fundamental importance, entropy is perhaps the most divisive concept in physics. “Entropy has always been a problem,” Lloyd told me. The confusion stems in part from the way the term gets tossed and twisted between disciplines — it has similar but distinct meanings in everything from physics to information theory to ecology. But it’s also because truly wrapping one’s head around entropy requires taking some deeply uncomfortable philosophical leaps.

Uncomfortable for some, maybe.

As physicists have worked to unite seemingly disparate fields over the past century, they have cast entropy in a new light — turning the microscope back on the seer and shifting the notion of disorder to one of ignorance.

What he's basically saying here, if I understand correctly (always in question), is that they're trying to fit entropy into information theory. Remember a few days ago when I said information theory is a big deal in physics? It was here: "Life Is."

The notion of entropy grew out of an attempt at perfecting machinery during the industrial revolution. A 28-year-old French military engineer named Sadi Carnot set out to calculate the ultimate efficiency of the steam-powered engine.

It's important, I think, to remember that the steam engine was the cutting edge of technology at the time.

Reading through Carnot’s book a few decades later, in 1865, the German physicist Rudolf Clausius coined a term for the proportion of energy that’s locked up in futility. He called it “entropy,” after the Greek word for transformation.

I find that satisfying, as well, given my philosophical inclination concerning creation and destruction. If they're the same thing, then "transformation" is a better word.

Physicists of the era erroneously believed that heat was a fluid (called “caloric”).

Yes, science is sometimes wrong, and later corrects itself. This should, however, not be justification to assume that the Second Law will somehow also be overturned (though, you know, if you want to do that in a science fiction story, just make it a good story).

This shift in perspective allowed the Austrian physicist Ludwig Boltzmann to reframe and sharpen the idea of entropy using probabilities.

So far, they've talked about a French person, a German, and an Austrian. This doesn't mean thermodynamics is inherently Eurocentric.

The second law becomes an intuitive probabilistic statement: There are more ways for something to look messy than clean, so, as the parts of a system randomly shuffle through different possible configurations, they tend to take on arrangements that appear messier and messier.

The article uses a checkerboard as an example, but as a gambler, I prefer thinking of a deck of cards. The cards come in from the factory all nice and clean and ordered by rank and suit. The chance of that same order being recreated after shuffling is infinitesimal.
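To put a number on "infinitesimal" (my back-of-the-envelope, not the article's): a 52-card deck has 52 factorial distinct orderings, so the chance of a fair shuffle landing back on factory order is one in 52!.

```python
import math

orderings = math.factorial(52)                    # distinct orderings of a 52-card deck
print(f"52! ≈ {orderings:.3e}")                   # ≈ 8.066e+67
print(f"P(factory order) ≈ {1 / orderings:.3e}")  # ≈ 1.240e-68
```

Shuffle once a second for the entire age of the universe and you'd still have made only about 4 × 10^17 attempts, nowhere near enough to expect it.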

Entropy experienced a rebirth during World War II.

Now, there's a great double entendre. I wonder if it was intentional.

Claude Shannon, an American mathematician, was working to encrypt communication channels... Shannon sought to measure the amount of information contained in a message. He did so in a roundabout way, by treating knowledge as a reduction in uncertainty.

Sometimes, it really does take a shift in perspective to move things along.
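Shannon's measure is simple enough to compute yourself. Here's a small sketch of mine (not from the article) of the average information per symbol of a message, in bits:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits: H = sum of p * log2(1/p)."""
    counts = Counter(message)
    total = len(message)
    return sum((c / total) * math.log2(total / c) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))        # 0.0: a message you can predict tells you nothing
print(shannon_entropy("abababab"))        # 1.0: one coin flip's worth of surprise per character
print(shannon_entropy("entropy is fun"))  # ~3.5: more varied, more uncertain, more informative
```

Uncertainty about what comes next is exactly what the message resolves; the more uncertain you were, the more information you gained by reading it.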

In two landmark papers in 1957, the American physicist E.T. Jaynes cemented this connection by viewing thermodynamics through the lens of information theory.

Okay, so the connection between entropy and information isn't exactly new.

However, this unified understanding of entropy raises a troubling concern: Whose ignorance are we talking about?

And that's where I stop today. There is, of course, a lot more at the link. Just remember that by increasing your own knowledge, you're accelerating the entropy of the universe by an infinitesimal amount. You're going to do that whether you read the article or not, so you might as well read the article. As it notes, "Knowledge begets power, but acquiring and remembering that knowledge consumes power."
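One last aside of my own, not spelled out in the excerpts above: there's a hard physical floor under that last sentence, usually credited to Rolf Landauer. Erasing a single bit of information at temperature T dissipates at least kT·ln(2) of energy. At room temperature, that comes to:

```python
import math

k_B = 1.380649e-23             # Boltzmann constant, J/K
T = 300.0                      # roughly room temperature, K

e_bit = k_B * T * math.log(2)  # Landauer's bound: minimum energy to erase one bit
print(f"One bit:      {e_bit:.2e} J")        # ≈ 2.87e-21 J
print(f"One gigabyte: {e_bit * 8e9:.2e} J")  # ≈ 2.30e-11 J
```

Tiny, but not zero; every bit you record and eventually overwrite pays at least that much.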
