
Items to fit into your overhead compartment


Carrion Luggage


Native to the Americas, the turkey vulture (Cathartes aura) travels widely in search of sustenance. While usually foraging alone, it relies on other individuals of its species for companionship and mutual protection. Sometimes misunderstood, sometimes feared, sometimes shunned, it nevertheless performs an important role in the ecosystem.

This scavenger bird is a marvel of efficiency. Rather than expend energy flapping its wings, it instead locates uplifting columns of air and spirals within them to glide to greater heights. This behavior has been mistaken for opportunism, interpreted as if it were circling doomed terrestrial animals destined to be its next meal. In truth, the vulture takes advantage of these thermals to gain the altitude needed to glide longer distances, flying not out of necessity, but for the joy of it.

It also avoids the exertion necessary to capture live prey, preferring instead to feast upon that which is already dead. In this behavior, it resembles many humans.

It is not what most of us would consider to be a pretty bird. While its habits are often off-putting, or even disgusting, to members of more fastidious species, the turkey vulture helps to keep the environment from being clogged with detritus. Hence its Latin binomial, which translates to English as "golden purifier."

I rarely know where the winds will take me next, or what I might find there. The journey is the destination.
September 30, 2025 at 11:38am
#1098323
I had to think about it before adding this one to the stack. It's from Big Think, which is okay, and it's about searching for extraterrestrial life, which I might have talked about too much already.

But the article propagates what I feel are misconceptions, so here I am, shaking my fist and yelling at the Cloud.

David Kipping on how the search for alien life is gaining credibility
Big Think spoke with astronomer David Kipping about technosignatures, “extragalactic SETI,” and being a popular science communicator in the YouTube age.


This is the first I've heard of this guy, and I watch YouTube videos about space from time to time. For whatever reason, the Unholy Algorithm hasn't pointed me in his direction. Probably a good thing, because if his videos are anything like this interview, I'd break my streak of "forever" of not writing comments on YouTube.

Astronomer David Kipping has built a career not just at the cutting edge of exoplanet research but also at the forefront of science communication.

Don't get me wrong, though, we need science communicators and I'm glad he's got a following.

I first met Kipping at the famous 2018 NASA technosignature meeting in Houston, where the space agency first indicated they would be open to funding work on intelligent life in the Universe.

Sigh. Could we please not call it "intelligent?" All that results in is a bunch of misanthropes making tired old jokes about not even being able to find intelligence on Earth. Which is a self-contradictory statement, because just being able to say (or type) it is an indication of what we're calling intelligence. At least at a very, very minimal level. Being able to broadcast the "joke" to most of the world at a speed close to that of light is definitely what we're talking about.

Such a joke was funny exactly once, when Eric Idle made it in Monty Python's The Meaning of Life. Unlike other Monty Python quotes, it doesn't improve upon repetition. Ni!

So every time the article says "intelligence" in some form, in your mind, substitute "technology." As with "technosignature" in that quote.

The bulk of the article is in interview format.

I think I was five or six when my parents gave me this massive book with a black cover and pictures of the planets...

It felt different from my love of Star Trek. That was fiction, but this was real.

I'm absolutely pleased that Star Trek has inspired many people. I love the show, in all its incarnations, through all its great and terrible episodes, and everything of in-between quality.

But I fear that it has given us a false impression of what's actually out there.

If you’re studying exoplanets, you’re not doing it just to know their rock composition. The ultimate question is: Does it have life? Could we communicate with it?

And I want to be clear: that's an important field of study, and I think it's a good thing that it's finally gaining some credibility. But those questions, posed in that way, can be misleading. "Does it have life?" refers to, well, life. I feel like I'm shouting this into the void, but "life" doesn't imply technology.

As for "could we communicate with it," we can barely communicate with our closest evolutionary relatives here on Earth, let alone fungi, trees, tardigrades, fish, or frogs—which, for the vast, vast majority of Earth's existence, made up the entirety of Earth's biosphere (alongside our own nontechnological ancestors).

And it's only within the past 200 years or so, a mere blink of an eye compared to the roughly four billion years that life has existed on Earth, that we produced any kind of technosignature.

Things have shifted. NASA used to effectively ban the word “SETI” in proposals. Now there are grants funding it. Private money from people like Yuri Milner has energized the field. Students are excited to take risks and write SETI papers. That’s new and encouraging.

And this is, in my view, a good thing. I'm all for looking for it, especially if we're also looking for signs of non-technological life—which, as other articles I've shared here have noted, we are.

Thing is, though, it's hard to prove the absence of something. If we keep looking and don't find anything, that doesn't mean it's not out there, just that it's either farther away or successfully hiding its signs. In an earlier entry, I compared this to the sea monsters and dragons in unexplored areas of old maps of Earth.

But the field is still small and vulnerable. One flashy claim can dominate the conversation, and in the social media era, sensationalism is amplified. That worries me. One bad actor could undo years of careful progress.

Like, for instance, Avi Loeb. Who is not only misleading the public with claims of technological origin for various objects passing through our solar system, but also destroying whatever's left of Harvard's reputation in the process.

It frustrates me when colleagues say, “When we detect life…” That assumes the answer. As scientists, we need to stay agnostic. We don’t know yet.

This, now. This, I agree with. I'm not a scientist, so I can say with some confidence that we will find signs of life, possibly even in the near future (provided we don't destroy ourselves or our own technological capabilities first). What I don't think we'll find anytime soon, if at all, though I wouldn't mind being proven wrong, is tech.

That means we have to concede the possibility that we are alone. I’m not advocating for that view; I’m just trying to remain objective. People sometimes misinterpret that as me wanting us to be alone, or even link it to religion. But it’s nothing like that — it’s just intellectual honesty.

Unfortunately, my idea that "we might be alone" does echo some religious views. But I don't approach the question from a religious point of view. I'd go so far as to argue that the assumption that we're not alone is also a religious view, because some people believe it with all the fervor of religion, without a single shred of meaningful evidence.

Science, however, requires that kind of objectivity and evidence-seeking.

The alien hypothesis is dangerously flexible — it can explain anything. That’s why we need extraordinary rigor. Carl Sagan said extraordinary claims require extraordinary evidence, but I’d add: The flexibility of the alien hypothesis makes it especially treacherous.

I agree with this point, too.

The hard facts are that Earth shows no evidence of outside tampering — we evolved naturally — and the Universe doesn’t appear engineered at cosmic scales. That suggests limits on how far technology tends to go.

Or maybe it suggests there's nobody home.

The rest of the article is about communication efforts—not with aliens, but with his human audience. It's interesting enough, but not really what I wanted to talk about.

I'll just end with this little thought experiment:

Imagine a solar system similar to ours. Similar sun, similar planets, one of them in the right place with the right composition to initiate and sustain life, like ours indisputably did. And one that formed at about the same time our own solar system did.

Evolution would not take the exact same path on that planet. Even assuming that it starts with single-celled organisms and, at some point, mixes two of them to produce a superorganism that, like our eukaryotes, allows the development of macroscopic life. Further assume that, against all odds, one of the species thus produced develops the ability to not only use tools (which lots of animals do), but use tools to create other tools. As yet another assumption, let's have this species build tools upon tools upon tools, eventually leading to space exploration and colonization.

That's a lot of assumptions, but I'm saying this to illustrate an evolutionary process similar to our own.

Now, in Star Trek, almost every tech-using species encountered has roughly the same level of technology as humans do. There are exceptions, of course, like the Organians, or the Metrons, or the Q Continuum, who are mostly used as god stand-ins. On the other side of the range, there are cultures who are just behind us on the tech ladder, and we can't contact them without violating the Prime Directive.

Here's the thing, though: in our own experience, technology accelerates fast. It took billions of years for humans to appear with their stone axes and fire; a few hundred thousand years for industrialization to happen; and then, in the span of little more than 100 years, we went from first powered flight to seriously considering a permanent human presence on the Moon and Mars.

My point is that, in my hypothetical almost-Earth above, that 100-year window could happen later. Or earlier. The chance of noticing someone with the same, or slightly lower, or slightly higher, level of technology is, pun intended, astronomically small. Add to that the idea that some stars are older and some are younger, and many of them are too unstable to sustain life for the requisite billions of years, and the chance shrinks even further.
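
To put a rough number on that, here's a quick back-of-the-envelope simulation. The 100-year window and the 4.5-billion-year span are illustrative guesses of mine, not figures from the article:

```python
import random

# Back-of-the-envelope Monte Carlo: if two civilizations each have a
# ~100-year "detectable at our level" window, placed independently and
# uniformly at random within a ~4.5-billion-year planetary history,
# how often do the two windows overlap? (Numbers are illustrative only.)
SPAN_YEARS = 4.5e9
WINDOW_YEARS = 100.0
TRIALS = 1_000_000

overlaps = 0
for _ in range(TRIALS):
    a = random.uniform(0, SPAN_YEARS)  # when our window starts
    b = random.uniform(0, SPAN_YEARS)  # when their window starts
    if abs(a - b) < WINDOW_YEARS:      # the windows overlap
        overlaps += 1

print(f"Estimated overlap probability: {overlaps / TRIALS:.2e}")
# Analytically it's roughly 2 * WINDOW_YEARS / SPAN_YEARS, about 4e-8,
# so most runs of a million trials will see zero overlaps at all.
```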

This is not even including the possibility that someone way ahead of us doesn't have a Prime Directive, and in fact desires to be the only tech-using species in the universe, and has the firepower to make that happen. Laugh all you like; I can see humans becoming that species. Or it could go in a more ethical direction, like in Star Trek. This is why science fiction isn't really about science or technology, but about humans.

With a large enough sample size, even the most improbable events become likely to happen somewhere. The galaxy might not be a large enough sample size. The entire universe might be a large enough sample size, but then you get into lightspeed issues where the further out you look, the further back in time you see.
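
Or, to put the sample-size point in formula form: if each of N star systems independently has some tiny probability p of hosting detectable technology right now, the chance of at least one is 1 - (1 - p)^N. The value of p below is pure guesswork on my part; the star counts are the commonly quoted ballparks:

```python
import math

# P(at least one) = 1 - (1 - p)**N, which for tiny p is about 1 - exp(-p*N).
# p here is a made-up number purely for illustration.
p = 1e-13
for label, N in (("one galaxy (~4e11 stars)", 4e11),
                 ("observable universe (~2e23 stars)", 2e23)):
    prob = 1 - math.exp(-p * N)
    print(f"{label}: P(at least one) ~ {prob:.3g}")
# -> roughly 0.04 for a single galaxy, essentially 1.0 for the whole universe:
#    the same "improbable" event goes from unlikely to near-certain as the
#    sample size grows.
```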

So, yes, let's look.

But let's not get ahead of ourselves.
September 29, 2025 at 10:06am
#1098242
The RNG is messing with me again, this time pointing me to another movie link. This one's from cnet.

This is the Ultimate '90s Cyberpunk Movie (No, It's Not 'The Matrix')
Strange Days showed off a gritty, realistic VR dystopia that feels surprisingly relevant today.


Remember yesterday, I mentioned that The Italian Job is one of my favorite movies? Well, Strange Days is another, for different reasons. That's why, when I saw this article, I knew I had to talk about it.

The cyberpunk movement has given us some of the best science fiction movies: Blade Runner, Ghost in the Shell and, yes, The Matrix.

I would argue that Blade Runner (the Director's Cut of which is my absolute favorite movie, if anyone cares) predated and influenced cyberpunk, as the subgenre didn't really coalesce until the novel Neuromancer (William Gibson) two years after Blade Runner. Its roots stretch back into the 1960s, though, including the short story that eventually became Blade Runner.

But I'm not here to argue about the history of cinema and literature. What's generally meant by "cyberpunk" is a dystopian vision of "the future" (relative to when it was written) that puts technology and corporations into primacy over people.

Sound familiar? It should. We're living in one.

But there's one great tech noir flick that came out at the height of the cyberpunk craze -- and then all but disappeared. Maybe that's partly because of its title.

So, part of the problem with public reception of near-future SF is that it appears to obsolete itself very quickly. Blade Runner, for example, takes place in the unimaginably far future of 2019. Low-imagination viewers (I've known a few personally) see that and dismiss it, because "it didn't happen that way."

That's not the point. As I know I've said before, SF isn't about predicting the future, any more than Fantasy is about being true to the past. So part of the problem with SD, as I'll call it from now on because I'm lazy, was that it was released in 1995 with a setting of 1999. Most of the film takes place in the last days of the year/century/millennium (yes, I'm aware the millennium actually started in 2001, but I'm not being pedantic today).

This isn't unusual for SF. One of the most famous SF movies has the year right in the title: 2001: A Space Odyssey. But it was a cinematic and literary masterpiece that had several years to become lodged into the collective consciousness. Similarly, Blade Runner was set nearly 40 years after its release. Strange Days, however, imagined something only four or five years away.

Though Strange Days was released back in 1995, it looks and feels like it could've come out yesterday. It's one of those rare old movies that imagined the technology of virtual reality without turning it into a gimmick.

I wouldn't call the tech macguffin in SD "virtual reality," but that's a semantic argument.

The movie wasted no time dropping me into its jarring setting: The opening scene is an armed robbery filmed in first-person perspective, with the robber running from cops and jumping from one rooftop to another.

What the author here doesn't mention, or perhaps fails to realize, is that this was brand-new technology in the real world. That is, pro-grade video cameras had finally become light enough to be handheld. This isn't remarkable today, when almost everyone in the developed world carries one in their pocket, but, at the time, it was a Big Deal. Basically, the movie practically invented the "shakycam" style that was destined to annoy me for the next 30 years, but absolutely worked for this movie.

Director Kathryn Bigelow was influenced by the 1992 LA riots and incorporated those elements of racial tension and police violence into her work.

Which leads me to another speculation as to why the movie was all but forgotten, even among SF fans: sexism. A woman? Directing a science fiction flick? Horror! And don't try to tell me that's not a reason, because we still see it happening today.

As I alluded to yesterday, I don't give much of a shit about the personal lives of movie stars or directors, and I absolutely don't care what gender they identify as (though I guess I do care, at some level, because I notice that sort of thing). But it might be relevant to note that she was briefly married to the far more famous James Cameron. Not that relevant, though; she's a brilliant director in her own right.

So yeah, there's more at the link, though with possible spoilers. Yes, there are anachronisms, and the movie isn't what I'd call perfect, but it's still one of my all-time favorites. So I was pleased to find it still has other fans, though the lack thereof never stopped me from enjoying a movie before.

One final word of warning, though, if you haven't seen it before and want to: it's dark. As the article alludes to, it's Black Mirror-level dark. Today, it would generate numerous trigger warnings, and there's one scene in particular that stands out in my mind as really abyssally fucked-up (not the cinematography, but the subject matter). It's a scene among several that, if a dude directed something like it today, he'd probably never work in Hollywood again. And yes, I'm saying this as someone with a very high tolerance for horror scenes. Unlike most of Black Mirror, though, and unlike many other dystopian stories (some of which I also enjoy), the final message is one of hope.

I don't know about you, but I could sure use some of that right now.
September 28, 2025 at 10:40am
#1098185
I don't believe in the concept of "guilty pleasures." Enjoy something, or don't; talk about it, or don't. If you like something, revel in it.

This is about as close as I get. From GQ:



I'm an educated man, an intellectual, a reader, and a writer. I started reading (or at least trying to read) Shakespeare at a very early age. I appreciate nuance and subtlety in books, movies, and shows, and I don't engage in macho, "he-man" behavior.

And yet, cinematic car chases are like candy to me.

Actually, that's not a bad analogy. Candy is simple, generally unsubtle, and not really good for you. But it's tasty and makes you feel good in the moment.

Sometimes, even I get tired of thinking, so I search my ever-contracting list of streaming services for movie or show candy: titles that promise fight scenes, good guys vs. bad guys (they don't have to be "guys"), and, of course, car chases.

Hence this article, which I only found yesterday, but the random numbers pulled it right up.

Paul Thomas Anderson's latest movie, One Battle After Another, features one of the best car chases in recent movie memory.

Dammit, now I'm going to have to go see that.

But which are the others?

As with most "best of" lists, this is very subjective. So I'm not going to highlight all of them.

The chase scene has been a staple of action movies for as long as they have existed, so there are plenty of options to pick from.

And, really, movies (and to a lesser extent, TV) are a perfect vehicle (pun absolutely intended) for the chase scene. You can write about them, sure, but even the most well-crafted prose is two-dimensional compared to the raw, visceral spectacle of watching cars hurtling across the screen, leaving chaos and destruction in their wake.

It's also almost exclusively a "car" thing—I'll include other highway vehicles in the category, though. You can involve horses, as many Westerns have, but, in general, horse chases don't involve one of the quadrupeds spinning off a cliff and exploding into a very satisfying fireball (besides, that would be cruel). There are foot chases, of course, but again: explosions are rare, and there's only so much damage a runner can do.

You can even go the SF route and do a starship chase scene, but the vast emptiness of space doesn't provide that sense of immediacy or relatability. The few really good ones usually involve very unrealistic dense asteroid fields, which... well, we don't know everything about outer space, but our own asteroid belt is so diffuse that you won't even see another asteroid from the one you're standing on. Besides, you can count the number of people who've been in space on your fingers (assuming you know the binary finger-counting code), but almost all of us have seen cars and streets and highways.
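
For anyone who hasn't run into binary finger counting: ten fingers, each either raised or folded, encode ten bits, which gets you from 0 to 1023. That comfortably covers the number of people who've flown in space so far (somewhere in the several hundreds, as a ballpark). A throwaway sketch:

```python
def finger_binary(n: int) -> str:
    """Show n (0-1023) as a ten-finger binary pattern, thumbs to pinkies.
    '1' = finger raised, '0' = finger down. Purely illustrative."""
    if not 0 <= n <= 1023:
        raise ValueError("Ten fingers only get you 0 through 1023.")
    return format(n, "010b")

# A rough count of spacefarers to date fits easily under 2**10 - 1 = 1023.
print(finger_binary(700))  # -> '1010111100'
```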

Real car chases, though, are most often anticlimactic, like when a bunch of cops chased a low-speed white Ford Bronco on live TV way back in the 90s. The cinematic version is, like most of cinema, exaggerated and choreographed for effect. It's art. Not high art, mind you. But art.

9. The tiny Fiat chase in Mission: Impossible — Dead Reckoning Part One

Okay, so, one of my other almost-guilty-pleasures? I really like the M:I movies, and I don't much care what anyone else thinks about them. Nor do I give a shit about Tom Cruise's personal life, extravehicular activities, or religious shenanigans. I will say that I was disappointed with Final Reckoning (aka Dead Reckoning Part Two), mostly because, spoiler, about 1/3 of the movie was a solo underwater fetch quest that, while cinematically impressive, dragged on way too long in my view.

In other words, they should have stuck to car chases.

Don't get me wrong; I enjoyed the mentioned chase scene, though mostly for the interaction between Cruise and Atwell. But there are better ones in that franchise than the Fiat one. Especially if one includes "motorcycles" in the list of chase vehicles.

It does have the bonus of including Hayley Atwell (name misspelled in the link article), whose roles are always awesome, and listen very, very closely, Paramount: you need to make her the lead in future M:I installments.

8. The cop chase (and pile ups) in The Blues Brothers

This is an important scene in movie history, not least because it's in a low-budget comedy film about music, not a high-dollar action flick like M:I.

4. The tank chase in GoldenEye

Like I said, we can't limit these things to "cars."

There was one particularly memorable chase scene in a Jackie Chan movie that involved a hovercraft (yes, a fucking hovercraft, on land). So memorable, in fact, that all I can remember about the movie was Jackie Chan and the hovercraft, and I'm not 100% sure about the Jackie Chan part.

There are, as you might tell by the Cracked-style countdown numbers, several more at the link. But, like I said, these things are subjective, so I'm just going to add a couple of my personal favorites that didn't make the list:

• The Mini Cooper scenes in The Italian Job (2003). I don't know if it truly qualifies as a "chase" scene, but it has many of the same tropes and, well, that's one of my favorite movies, period.

• In Captain America: Civil War, there's a relatively short but impressive chase scene that, while starting out as a foot pursuit, evolves into an action-packed car chase. Mostly in a tunnel. Unlike these other movies, super-powered individuals are involved, but to me that makes it all the more fascinating. It's here, if you missed it or want to be reminded.

The amount of work, skill, art, and planning that goes into some of these scenes is truly staggering, and the best ones have the ability to make me forget, for a moment, that it's all meticulous choreography and stunt work.
September 27, 2025 at 9:19am
#1098133
This Inverse piece is a couple of years old, but that's not the problem.

Staying Up All Night Rewires Our Brains — This Could Be Key To Solving Mental Health Conditions
What happens in the brains of mice when they stay up all night could help us better understand mood and other psychological conditions.


The problem, or one of the problems anyway, at least in my view, is that the headline could easily be interpreted as "Stay up all night to fix your mental health problem!"

But we're all too smart to believe that. Right?

Everyone remembers their first all-nighter.

You know... I really don't. I know I pulled a few in college, just like many college students. I even did some at work, until I got too old for that shit (all-nighters, not work). I just have no recollection of when the first one happened.

I also regularly did what we called "tweeters," where you stay up studying until the sky just begins to lighten and the birds start tweeting.

What’s probably more memorable, though, is the slap-happy, zombie-like mode the next day brings.

The best thing about all-nighters to me, back then, was finally being able to collapse into bed and get some decent sleep.

Scientists have long thought that there is likely a neurological reason for this sensation, and one group of researchers thinks they might have cracked it.

Um, how could there not be a neurological reason for the sensation? Also note the more restrained language here: "one group," "thinks," "might have."

A study published last week in the journal Neuron tracked what happened in the brains of mice when they stayed up all night. Their results, surprisingly, may even help us develop a better way to treat mood and other psychological conditions.

"Last week," as I noted, means about two years ago.

The researchers found that one all-nighter roughly had the same effects on the brain as taking the anesthetic ketamine.

"So I can just pop some ketamine instead?"

This isn’t an endorsement of acute sleep deprivation. “I definitely don't want the takeaway from the story to be, ‘Let's not sleep tonight,’” Kozorovitskiy says.

So she probably didn't write the headline.

Insufficient sleep brings risk for myriad conditions and events, such as heart attacks, high blood pressure, diabetes, and stroke, way up.

Worse, it can turn you into a grumpy asshole.

Going a night without shut-eye isn’t the latest craze that will cure your depression, but rather, this insight could shake up our approach to targeting different areas in the brain when developing antidepressant medications.

So, you know, just to be clear, this isn't a "something you can do about it" article, but a "look what scientists are working on now" article.

There's a lot more at the link, delving into some of the science behind it. I don't need to share most of that; it's there if you're interested. I just have one more quote to highlight:

The secret might lie in the neurotransmitter dopamine. Known commonly as the reward hormone, dopamine abounds when we eat and have sex.

At the same time? Kinky!
September 26, 2025 at 10:50am
#1098089
Just in case anyone can still afford to go out to eat, here's a "helpful" article from bon appétit:

All Your Restaurant Etiquette Questions, Answered
Is it okay to ask for a different table? How do you split the bill with friends? Our industry pro weighs in.


On any given night at your local watering hole or restaurant, bartenders are doing double time, dispensing drinks and life lessons from behind the bar.

Which is why we always tip bartenders.

Do I have to wait for everyone else’s food to arrive before cutting into my own plate?

Yes?


No.

Well, depending on what you mean by "have to." No one's going to fine you for it. The Food Police aren't going to swoop in and drag you off in handcuffs to Kale Jail.

It's rude, though.

Do you tip on the total price of the bill if you order a bottle of wine?

Yes. Next question.

Just kidding. There is a lot more nuance here.


No, there really isn't that much nuance. The article makes an argument based on server expertise. I'd prefer to see tipping go away entirely, as I've talked about many times, but as long as it's a thing, yes, if you order a $10 hamburger and a $190 bottle of wine (hey, don't judge me), you tip based on $200.

There is one exception I can think of, but it's kind of an edge case. My favorite local brewpub will put any to-go beers on your tab, for the convenience of only having to pay once. So, I might get, say, a $10 burger and a $6 beer to eat there, and then a couple of 4-packs at, say, $20 each. I tip on the "service" portion, $16. Basically, if you consume it at the table, it's tipped.
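
If it helps, the whole "rule" fits in a few lines of code. This is just my rule of thumb as a sketch; the 20% rate is an example number, not gospel:

```python
def settle_tab(dine_in_total: float, to_go_total: float, tip_rate: float = 0.20) -> tuple[float, float]:
    """Tip on what's consumed at the table; to-go items ride along untipped.
    The 20% default is just an example rate, not a rule."""
    tip = round(dine_in_total * tip_rate, 2)
    grand_total = round(dine_in_total + to_go_total + tip, 2)
    return tip, grand_total

# The example above: $10 burger + $6 beer at the table, two $20 4-packs to go.
print(settle_tab(16.00, 40.00))  # -> (3.2, 59.2)
```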

If I don’t like a table, is it okay to ask for a different location?

Yes. In my nearly two decades of experience working in hospitality, I’ve never observed a conspiracy to give people the worst tables possible just for fun.


You work in NYC. Every table is the worst possible table.

If there’s a social media influencer disrupting the meal with lights, cameras, and ruckus, who should speak up, the staff or the diner?

Wrong question. The correct question is, "Where's the next nearest restaurant?"

There's more at the link, but I feel like they left out the most important questions and answers, to wit:

Q: What's the best way to get the restaurant to let my dog in?

A: Go fuck yourself. (Exception: legitimate service animals)

Q: There are kids running around. Do I tell the staff or the parents?

A: Neither. Surreptitiously pass the kids some chocolate-covered espresso beans.

Q: What's the best way to make my date pay for everything?

A: Put out at the table.

Q: I'm at one of those weird hipster beer places and they won't serve me a Michelob Ultra! How can this be?

A: Good for them.

Q: How can I get my meal comped?

A: Spend a few hours washing dishes in the kitchen.

Q: Could you turn up the volume on the Lakers game?

A: No.

I should write an advice column.
September 25, 2025 at 11:05am
#1098033
The random number gods have once again trolled me. Here's another one from Mental Floss:



Well, now, that depends. Snow? Cocaine? Talcum powder? Flour? Bread crumbs? I can think of at least one other possibility, but although I love cheese, I don't love it that much.

If your taste in cheese has evolved beyond the individually-wrapped processed American slices and into blocks of the hard stuff—like cheddar—you’ve probably noticed that some cheeses can develop a chalky white substance on their surface.

"Individually-wrapped processed American slices" are, unless we twist its definition beyond recognition, not cheese.

At first glance, it looks like mold. It’s usually not a good idea to eat mold.

Godsdammit, MF. This is why I don't trust you. A lot of cheese owes its entire existence to mold. What we call "mold" is a particular form of fungus, and, just as with more macroscopic fungi such as mushrooms, some are beneficial, some are neutral, and some are poisonous. Hell, penicillin comes from a mold.

I can understand being put off by it, much like I can (just barely) understand not liking cheese in general, but that there is either misinformation or anti-mold propaganda, or both.

Is it fungus? Or is your cheese harmless and just making you paranoid?

And, again, poorly phrased. Yes, some fungi are inedible (or, more precisely, edible only once). And if you reach into the back of your dairy drawer and pull out a biology project, I can't blame you for throwing that shit away. I'd do it.

The real question, though, the one I think they intended but maybe worded less than ideally, is simply "is it safe or not?" And that's absolutely legitimate.

It’s probably fine.

"Probably" doesn't fill me with a lot of confidence here, as, in the event that it's not fine, the result could be death or, worse, three days on the toilet.

The white stuff seen on cheddar is typically calcium lactate, which is the result of lactic acid interacting with calcium.

Both of which are rather famously found in milk, which, it might surprise you to discover, is what cheese is made from.

Someone once said something like "cheese is milk's attempt at immortality," and I laughed. Poetic, but there's some truth in it.

Anything "lactic" is milk-related. And yes, this includes "galactic." The general name for galaxies was derived from the name of our Milky Way. Not kidding here.

But I digress. There's no actual milk in the sky (as far as we know), though we know there's ethanol out there. Nor is there cheese (again, as far as we know). No, not even the Moon.

As cheese ages, some of the moisture moves to the surface, and the lactate moves with it. When that water ebbs, the lactate remains behind and can appear as powdery, crystal-like particles on the surface of the cheese.

"Crystal-like?" Okay, fine. I won't be pedantic about this one.

Calcium lactate is completely harmless. It might even be a sign the cheese has matured and is therefore tastier. But it’s also remarkably similar in appearance to mold. So how can you tell the difference?

Given the fast-and-loose definitional problems so far, I'd highly recommend double-checking any of the advice here with another source.

So why am I sharing this article if I don't fully trust it? Mostly just so I could quote this sentence:

Fondling the cheese should give you some indication of which is which.

I'm just going to pause for a moment here.

Okay.

Italian, Swiss, and Dutch cheeses may have visible Lactobacillus helveticus, which is added to help create amino acids for flavor.

Okay, so, this is where years of learning about chemistry, biology, and Latin pay off. Low-information consumers might see that and freak out over the number of syllables. "Don't eat anything you can't pronounce" is one of the most ignorant and misleading pieces of advice floating around out there. Let's break it down:

Lacto - like I said, milk.
bacillus - bacteria (bacteria are not fungi, but, as with fungi or any other group of organisms, there's the good, the bad, and the ugly)
helveticus - the Romans called the area now known as Switzerland "Helvetia." Yes, that's also the root of the name of one of the more popular typefaces.

In the end, I'm not going to rag on anyone for discarding something that's grown an ugly fuzz, unless we're talking about your teenage son. We don't generally have laboratories in our homes (if you do, I'll back into the hedge here), and even if you're an expert in biology and/or chemistry (which, I should emphasize, I am not), the term "better safe than sorry" applies.

Still, knowledge is power. And cheese is delicious.
September 24, 2025 at 9:04am
#1097968
Some purported historical research from Mental Floss today.



I don't know, but shaking heads is a standard way of approaching a Mental Floss article.

Shaking hands seems like a gesture that has been around forever. Indeed, a throne base from the reign of ancient Assyria's Shalmaneser III in the 9th century BCE clearly shows two figures clasping hands.

Well, that certainly seems asserious.

It might seem like shaking hands is an ancient custom, the roots of which are lost to the sands of time.

The story I heard as a kid was that its origin probably came from demonstrating that you're not carrying weapons. That never quite sat right with me. One or both could be concealing weapons in their non-shaking hand. Or behind their backs. And the gesture could be meant by one as "See? No weapons!" and by the other as "I could kick your ass bare-handed."

Back in the days when I was learning karate (along with five or six other Japanese words), one of the standard beginner lessons involved using a handshake to draw the opponent off-balance.

Still, okay. As untrustworthy as MF can be, it does seem to be true that a handshake can be cross-cultural, and therefore might have shrouded origins.

Historians who have pored over old etiquette books have noticed that handshaking in the modern sense of a greeting doesn’t appear until the mid-19th century, when it was considered a slightly improper gesture that should only be used with friends [PDF].

I took the time to follow and recreate the link to the cited PDF. It's a bit long and I admit I skimmed a lot of it, but it does seem to go into detail about the "negative history" of the non-handshake. Fair warning, though: it contains misspellings that seem to indicate that it's a textified image, such as "niouthpiece," which most likely started off as "mouthpiece." The original text was apparently in Dutch, and translated to English, so, in short, I wouldn't trust it completely through all those different translations.

But one bit caught my eye, and latched on to my sense of humor. I present it without edits:

When speaking, the Dutch merely used tlie eye and 'a moderate
movement of the hand to support [their] words'. Because of their
lively gesticulation the Italians were put on a par with peasants or,
even worse, the 'moffcn' o r Westphalians, the immigrants the
Dutch loved to ridicule.' "


It's not the transcription error that sent me, though; it's how the perceived roles of "civilized" and "barbarian" had switched entirely.

The early handshakes mentioned above were part of making deals or burying the hatchet...

Ironically (or whatever), the term "to bury the hatchet" can be traced. It originated as an idiom for making peace during North American colonization. That worked out great for the American natives.

The modern handshake as a form of greeting is harder to trace.

The article makes a few references to handshaking in history and literature.

As for why shaking hands was deemed a good method of greeting, rather than some other gesture, the most popular explanation is that it incapacitates the right hand, making it useless for weapon holding.

Like I said, that's the one I always heard, but a few moments' thought cast doubt. Especially since a disproportionate number of the people I knew were left-handed.

Sadly, in a world where obscure Rabelais translations provide critical evidence, the true reason may remain forever elusive.

It's good to accept that maybe we'll never know everything. Lots of customs of etiquette seem arbitrary, like "how to set the table properly" or the American institution of not wearing white after Labor Day. Some are invented to deliberately distinguish your culture from the "barbarians;" then, you can say "those barbarians don't even [whatever arbitrary etiquette rule]."

But, I suppose, at least handshaking is marginally more hygienic than kissing each other on the cheek, as per the French custom. It's less so, though, than the mutual bow common in some Eastern cultures, where a handshake might just as easily end with one party on the floor, staring up at a smug blackbelt.
September 23, 2025 at 7:30am
#1097917
"Hey, Waltz: the world is on fucking fire right now. Why are you banging on about shit that doesn't matter?"

Okay, no one actually asked me that question. But sometimes, I ask it of myself. My answer, at least for now? It does matter. Truth matters. Science matters. Philosophy matters. Humor matters. I don't think we can protest our way out of this mess. There's no god nor benevolent aliens to save us from ourselves, no monsters except our own fears. When people believe anything without evidence, it chips away at our collective reality. And I'm here to try to approach the truth. I may dwell in darkness, but my words are light. They may not illuminate much, but maybe they help someone to see some obstacle. Doesn't matter if it's a big thing or a small thing.

Here's what might be considered a small thing, from Atlas Obscura, two years ago:

Why Halloween’s ‘Poison Candy’ Myth Endures
Even though we know better.


In the fall of 1982, an unfounded fear haunted almost every house in Chicago. As area children prepared to “trick” their neighbors with their impressions of werewolves, vampires, and zombies, their parents were much more terrified of the “treats” their kids were eager to devour.

And this was before people started bleating about "sugar is poison." They were worried about actual poison. Though I'm not sure it was entirely unfounded; it was, as the article finally gets around to noting later, right after the Tylenol poisoning incidents.

What is it with you people and Tylenol, anyway? Never mind; I'm not getting into specifics of current events right now.

However, according to sociologist Joel Best, this ghastly threat was about as real as the candy-seeking, child-sized ghosts and witches roaming around with pillow cases.

“All I can say is I don’t know of a single case of a child killed by a Halloween poisoner,” says Best. “I’ve seen five news stories that attributed deaths to Halloween poisoning. In one case, it was the child’s own father, and the other four were all retracted.”


Here's the thing, though: Lone Asshole Theory, as I call it. All it takes is one bad actor to ruin something. A million people might pass by a precious piece of artwork, harmlessly, and the one million and first decides to throw paint on it. This is not an indictment of people in general. Far from it; it demonstrates that most of us are good or, at least, unwilling to face negative consequences.

What it does show is that the one asshole ruins it for everyone else.

Worse, once the idea is out there, some sociopath who might not have otherwise considered it might decide to slip a razor blade into an apple, or whatever. The chances are very low, but the consequences of hitting those odds are horrible: even one dead child would be a tragedy.

Halloween sadism is defined as the act of passing out poisoned treats to children during trick-or-treating. But even before the term was coined in 1974, parents already feared a mysterious, mentally unhinged candy killer for decades, despite a lack of supporting evidence.

So, why am I arguing in favor of being concerned about tampering? Well, what I'm trying to say here is that, considering risk management, even if it's never happened, there needs to be some vigilance. The problem is people get hyper about this stuff and go overboard with imaginary scenarios, while ignoring, or at least downplaying, more plausible ones.

Best, who maintains that children are much more likely to be harmed on Halloween by cars than contaminated treats, has continued his never-ending task of quietly and efficiently unmasking the fraudulent claims that darken his door.

Good. It's important to keep these things in perspective.

Many of the needles in apples and poison-laced treats turned out to be hoaxes. In some cases, the children themselves perpetrated the hoax, perhaps to get attention.

Kids can be sociopaths, too. They're not done being built.

So why does this fear continue to endure and flourish even in the absence of evidence to support it?

“This is, first and foremost, a worry about protecting children,” says Best, who categorizes Halloween sadism as a contemporary urban legend.


Sure. But, unlike myths such as Slenderman or werewolves, this sort of thing is, at least, possible.

While Best has been tracking the phenomenon since 1958, folklorist Elizabeth Tucker noted similar themes in other myths, like Blood Libel, a myth dripping in antisemitism that blamed Jewish people for kidnapping Christian children to use in illicit rituals.

Oh, come on. Kids aren't even kosher.

The fear of drug-laced Halloween candy was further intensified in 2022 with news reports of “rainbow fentanyl”—a form of the highly addictive narcotic produced in bright colors, allegedly to appeal to children.

A moment's thought should be all it takes to dismiss this nonsense. Unlike razor blades and rat poison, drugs (so I've heard) are expensive; why waste them on kids when you can do them yourself?

Doing the drugs, dammit, not the kids. "Doing" kids is also very wrong. Unlike poisoned candy, that happens all too often here in reality. But it's rarely a stranger. Usually, it's a pastor, coach, cop, friend, or family member.

Nonetheless, respected outlets like The New York Times, as well as trusted advice columnists Dear Abby and Ann Landers, all weighed in with alarmist articles warning parents of Halloween night dangers.

Humans. Just can't put their fear in the right places, can they? Terrified of sharks; step into the bathtub like there's no chance of slipping and dying. Anxious about flying; think nothing of taking an Uber to the airport. In either of those cases, what they're afraid of has a much lower probability than what they're not afraid of.

In summary: maybe we, collectively, would be better off putting our energy into addressing real dangers than freaking out over imaginary ones. What we fear the most is the unknown, so maybe address that by fighting against ignorance.
September 22, 2025 at 8:35am
#1097850
I saved this Noema article a while back, but I can't remember why, only that I found it interesting (even though it's not even two months old now). Which doesn't mean that I agree with all of it.

What Searching For Aliens Reveals About Ourselves
Looking for life beyond Earth changes the way we perceive life right here at home.


You know how, on old maps, you often see art in the unexplored places, usually things like sea monsters or fire-breathing dragons? That's what our various conceptions of alien life remind me of. We don't know, so we make shit up. This is okay; it's part of what makes us us. Without this kind of imagination, fiction writing would be a lot more boring. The trick is, we need to sometimes step back and separate imagination from reality.

I'm as big a fan of Star Trek as anyone, and more than most, but aliens aren't going to be humans with forehead makeup or rubber suits. Hell, I expect the vast majority of them won't even be what we call sentient, just like the vast majority of life on Earth isn't.

As an astrobiologist, I am often teased about my profession.

Hey, at least you don't get "Oh, you must be a virgin wearing a pocket protector and horn-rimmed glasses, and can't write good." Which is what we engineers have to put up with.

I'll have you know that I don't wear glasses at all.

The moment we realized our entire biosphere existed on the skin of a rocky planet hurtling through the void around a very ordinary star — one of some 100 to 400 billion stars in our galaxy, which is one of perhaps 2 trillion galaxies in the universe — we discovered life in space.

This argument is like when I talk about "artificial" vs. "natural." It's a point of view. Kind of like how today's southward equinox can be described as "the Sun crosses the equator" or "the Earth's orbit and axial tilt move the equatorial plane to intersect with the Sun." Both are true, depending on your vantage point.

Everything on, in, or gravitationally attached to Earth is, ultimately, from space, but it can still be useful to categorize "terrestrial" as opposed to "extraterrestrial."

Astrobiology seeks to uncover generalizations about life: how it comes about, where to find it, and what it is in the first place. Because we are part of the set of all living things in space, astrobiological progress reflexively reveals new truths about us. Even if we never find other life out there, the search itself shapes how we understand our own stories right here on Earth.

How people describe, define, and defend their own professions is also interesting.

Astrobiologists, however, are most interested in the at least two dozen worlds that we know of, so far, that are just the right size and distance from their host stars to potentially support life as we know it.

Which makes perfect sense; you home in on what's most likely to have what you're looking for. If you're in a strange-to-you city and want a beer, you go to a taproom, not an art museum, even though you might find beer in an art museum.

Could life "not as we know it" exist elsewhere? Sure. But they're playing the odds.

What we need to watch out for, as ordinary people reading stuff like this, is making the unsubstantiated jump from reports of "this planet could potentially support life" to "they've found an alien civilization!"

The latter is the drawing of sea monsters in unexplored corners of the map.

The article goes into some of the tools (mostly telescopes and computers) they use, then:

While many of our simulations will be mere fictions, what makes them scientific is that these thought experiments are constrained by the known laws of physics, chemistry and biology. In the end, we produce scores of imaginary worlds that give us clues about what we need to look for to find another Earth-like planet using future observatories like HWO.

Look, all the simulations are fictions. If they weren't, they wouldn't be simulations. It's like making computer models of next week's weather forecast: you might get close, and it's better than not making the prediction, but you won't be spot on.

And as we know, the "known laws" are subject to tweaks. Especially in biology. People keep finding biology here on Earth that doesn't follow the Rules. Again, though, you have to start with what's known.

Although many exoplanet scientists describe their work as a search for “Earth 2.0,” I find this phrase extremely misleading. “Earth 2.0” conjures images of a literal copy of the Earth. But we’re not looking for an escape hatch after we’ve trashed version 1.0.

Yeah, I'm pretty sure some people are. Else there wouldn't be so much fiction about it.

The article continues with some stuff I've discussed in here (and in the previous blog) numerous times, so I'm going to skip it this time—except to say that it seems like the author falls into the common fallacy of thinking that evolution (which is arguably a prerequisite for considering something "life") must necessarily produce tool-users who go on to shoot rockets into space and look for evidence of aliens doing similar things. We know it's possible because that describes us. What we don't know, and can't know yet, is how common that is in the universe.

In other words, don't expect Vulcans and Klingons. But, as I've also said numerous times, even unequivocal evidence of microbes or their equivalent would be a paradigm shift for us.
September 21, 2025 at 9:22am
#1097783
I see stuff from Mental Floss from time to time. As I'm sure I've noted before, I don't really trust their fact-checking (if, indeed, any takes place). And I'm entirely too lazy to fact-check things point by point. But sometimes, something catches my eye enough to share; I just feel like I have to throw in the disclaimer.



Okay, well, my mind remains firmly un-boggled after reading this, but that may be because I'm familiar with the ideas. Or it could be because I've become jaded, cynical, and almost completely out of shits to give.

In any case, as the headline notes, there are 20 of them here. The URL says 10, but apparently, that's because they updated the article at some point. I'm not going to bore you with all 20, but I have comments on some of them.

A paradox is a statement or problem that either appears to produce two entirely contradictory (yet possible) outcomes, or provides proof for something that goes against what we intuitively expect.

A fair enough definition, I think. Delving into the details of what makes a paradox paradoxical can give us new insights, and often shows that "common sense" is bullshit.

But most paradoxes that I've encountered aren't paradoxes after all, or have a very simple solution, or result from the imprecision of language. There might be exceptions. Take, for example, the sentence: "This sentence is false." It's a pretty famous example of a paradox (and shows up on the linked list at #7), because if the sentence is false, then it's true; while if it's true, it's false. Hell, Spock used it in the original Star Trek to make an AI run around in mental circles until it exploded. But here in reality, it's merely a reminder that things aren't always either heads or tails, one or zero, true or false, no matter how much we might want things to be that simple.

So, some of the other examples:

1. The Paradox of Achilles and the Tortoise

The Paradox of Achilles and the Tortoise is one of a number of theoretical discussions of movement put forward by the Greek philosopher Zeno of Elea in the 5th century BCE.


As far as I know, all of Zeno's paradoxes are essentially the same idea. What's interesting about them now isn't the paradoxes themselves, but that it took 2000 years for someone to come up with a compelling reason (other than "common sense," which is not compelling at all) why they're not paradoxes.

The trick here is not to think of Zeno’s Achilles Paradox in terms of distances and races, but rather as an example of how any finite value can always be divided an infinite number of times, no matter how small its divisions might become.

Well, kind of. What it took was Newton and Leibniz developing (mostly independently) a new system of math that addressed the infinitesimal.
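
For the curious, here's what that resolution looks like with toy numbers. The speeds and head start below are the classic illustrative values, nothing from the article: each of Zeno's infinitely many catch-up stages takes less and less time, and the whole infinite pile adds up to a finite number.

```python
# Zeno's setup with illustrative numbers: Achilles runs 10 m/s, the tortoise
# 1 m/s, and the tortoise starts 100 m ahead. Each "stage" is the time Achilles
# needs to reach the tortoise's previous position. The stage times form a
# geometric series whose infinite sum is finite, which is the calculus-era fix.
head_start = 100.0   # meters
achilles_speed = 10.0
tortoise_speed = 1.0

gap = head_start
total_time = 0.0
for _ in range(30):              # 30 stages is plenty to see the convergence
    t = gap / achilles_speed     # time to cover the current gap
    total_time += t
    gap = tortoise_speed * t     # the tortoise's new (smaller) lead

print(total_time)                                      # ~11.111... seconds
print(head_start / (achilles_speed - tortoise_speed))  # exact answer: 11.111...
```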

2. The Grandfather Paradox

This, along with a few other time-travel paradoxes (another is noted in the article), has a simple way out: perhaps time-travel, at least as it's presented in popular fiction, is flat-out impossible.

I've often said that imagining the impossible is one of our great superpowers, but like all superpowers, it can be used for good or evil. Or, in the spirit of what I said above, something else entirely.

4. The Ship of Theseus Paradox

One of the more famous paradoxes, thanks in part to the Marvel show WandaVision, is the Ship of Theseus Paradox. Here’s a brief summary.

In the interest of time and space (specifically, my time and space in this blog), if you're not familiar with this one, go to the article or, better yet, find it on Wikipedia.

I'm including this one because I don't really consider it a paradox. Though I think of myself as a materialist, I can think of an analogy: our own bodies. It's been bandied about that all of our cells replace themselves every seven years. This isn't really true; some cells replace faster, some slower, and some do, indeed, hang in there for life (though their individual sub-components may be replaced). And yet, I remember things that happened to "me" as a kid, and I feel continuity with Kid Me, through memory and experience.

The point being that, in my view, the key to identity isn't physical components, but pattern.

As kind of an aside, I sometimes imagine a band whose members individually swap out, until none of the original band members remain, but the band plays on, with the same name. I'm sure this has happened, and yet no one has named themselves "Band of Theseus."

5. and 6. The Sorites Paradox and The Horn Paradox

These being not nearly as famous as the previous paradoxes, I was tempted to skip them. But then I realized that the list item includes something I've been saying all along:

The Sorites Paradox is all about the vagueness of language. Because the word heap doesn’t have a specific quantity assigned to it, the nature of a heap is subjective. It also leads to false premises.

Sometimes, paradoxes are easily resolved once we remember that language is imprecise.

7 through 10 are really the same paradox. I referred to it above: "This sentence is false."

11. Newcomb’s Paradox

We can safely dismiss any "paradox" that hinges on the existence of an omniscient entity. Not that we shouldn't think about it, but don't expect to find a definitive answer.

12. The Dichotomy Paradox

Lazy bastards. This is just a restated Zeno paradox, and has the same solution: calculus. Same with #14 (well, the first #14; they flubbed and presented two #14s). I'm skipping #13 entirely because it relies on people not understanding probability theory, which, I suspect, even fewer people grasp than the basics of calculus, if that tells you anything.

(The second) 14. Galileo’s Paradox of the Infinite

I don't pretend to understand everything about mathematics, myself, but I do know that this one was made obsolete by set theory in, like, the 19th century or something.
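
The set-theory fix, in miniature: for infinite sets, "just as many" simply means the elements can be paired off one-to-one, and the naturals pair off with their squares without anything left over. A trivial illustration:

```python
# Galileo's puzzle: there seem to be "fewer" perfect squares than naturals,
# yet every natural n pairs off with exactly one square n**2, and vice versa.
# The later set-theoretic resolution (Cantor): for infinite sets, "same size"
# just means such a one-to-one pairing exists. The first few terms:
for n in range(1, 11):
    print(n, "<->", n ** 2)
```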

15. The Potato Paradox

Also fixed by understanding mathematics.
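
For anyone who hasn't seen this one, the arithmetic goes like this, using the usual formulation of the puzzle (the article's exact numbers may differ):

```python
# The usual formulation: 100 lb of potatoes that are 99% water are left to dry
# until they're only 98% water. The pound of solids never changes; it just has
# to go from 1% of the total to 2% of the total.
start_weight = 100.0          # pounds of potatoes
start_water = 0.99            # 99% water by weight
end_water = 0.98              # dried down to 98% water

solids = start_weight * (1 - start_water)   # about 1 lb of non-water stuff, constant
end_weight = solids / (1 - end_water)       # that 1 lb must now be 2% of the total

print(round(end_weight, 2))  # -> 50.0, not the ~99 that intuition suggests
```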

16. The Raven Paradox

Also known as Hempel’s Paradox, for the German logician who proposed it in the mid-1940s, the Raven Paradox begins with the apparently straightforward and entirely true statement that “all ravens are black.”

Yes, yes, I know. Albino or leucistic ravens exist, and they're not black. This is not the way out of the paradox, though. Assume, for the sake of argument here, that all ravens are black.

This is matched by a “logically contrapositive” (i.e. negative and contradictory) statement that “everything that is not black is not a raven”—which, despite seeming like a fairly unnecessary point to make, is also true given that we know “all ravens are black.”

By "fairly unnecessary," I think they mean "tautological."

17. The Penrose Triangle

This is more of an optical illusion than a "paradox," leading me to believe that maybe these word-logic paradoxes should be considered "cranial illusions" or something, to emphasize that they take advantage of our limited mental capacity the way optical illusions take advantage of our limited optical capacity.

You know the famous duck-rabbit illusion? No one calls that a paradox, as far as I know. And yet, it shares traits with some of the more wordy paradoxes.

Apparently, to make up for repeating #14, they skip #18, so there are still 20 here. The last two are just reminders that math can't always be described using plain language, which we should already know, because if it could, no one would have needed to come up with precise mathematical notations.

So, yeah, congratulations if you made it this far, or skipped down to here. To summarize: most paradoxes aren't paradoxes at all, but if they make us think, well, mission accomplished.
September 20, 2025 at 9:50am
#1097732
While this Quartz article is a few years old now, I've been saying stuff like this for way longer.

    The case for puns as the most elevated display of wit  Open in new Window.
Humor me please, and consider the pun. Though some may quibble over the claim, the oft-maligned wordplay is clever and creative, writer James Geary tells Quartz. His upcoming book Wit’s End robustly defends puns and tells the distinguished history of these disrespected witticisms.


And yes, this is basically a book ad. I've repeatedly stated my position on book ads.

Humor me please, and consider the pun.

You want me to dump a bucket of bile on you?

Though some may quibble over the claim, the oft-maligned wordplay is clever and creative, writer James Geary tells Quartz.

My father (who, being a dad, made plenty of what are now called "dad jokes," which involve puns) tried to teach me that the pun is the lowest form of wit. No, Dad; respectfully, it's the highest. The lowest is fart jokes, which is why they're the oldest known form of comedy.

“Despite its bad reputation, punning is, in fact, among the highest displays of wit. Indeed, puns point to the essence of all true wit—the ability to hold in the mind two different ideas about the same thing at the same time,” Geary writes.

Yeah, well, I've written things like that, too, but I have yet to be interviewed by a magazine.

The bible, the Indian epic the Ramayana, and the classic Chinese philosophical text the Tao Te Ching all avail themselves of puns, he notes, though we may not recognize these ancient jokes.

That's because puns are heavily language-dependent (unlike farts, which are recognizable in any language).

I've probably said this before, but when I was in Tours, I asked a tour guide if the city was named for towers (because "tower" in French is "tour.") No, apparently, it's named for an ancient Celtic tribe who lived along the Loire, similar to how we here in the US kicked the Indians out and named some towns after them. But the city's flag features three towers. Because it's a pun, albeit a visual one.

I could also make a pun about "Tours guide," but that would be beneath me, wouldn't it?

There is some truth to the rumor that I started learning French so I could pun in other languages.

Geary also points out that William Shakespeare, the greatest English language playwright of all time and an acknowledged master of rhetorical jousting, loved puns.

Anyone with a passing familiarity with Shakespeare would know this.

Indeed, many a great mind has been inclined to pun. The 18th-century English poet and philosopher Samuel Taylor Coleridge thought it was practically a prerequisite to intelligence, declaring, “All men who possess at once active fancy, imagination, and philosophical spirit, are prone to punning.”

This should also come as no surprise. I've often wondered about the title of his second-most-famous poem: does "rime" play a dual role, referring to ice and verse? I haven't seen anything definitive on that, yet. But the story starts in the Antarctic, so I'm going with "yes."

Geary admits that he often makes puns in his head—but he mostly keeps them to himself. He can’t explain why the wordplay’s not appreciated.

Because they break a listener's mind and are painful. They are only funny to the punster. Which is why I'm fond of my puns, but not your puns. And especially not puns related to the planet whose orbit lies between those of Saturn and Neptune.

In short, puns are the highest form of humor, demonstrating the punster's high levels of intelligence, mental alacrity, and good looks. They prove you're a wit, and not just half of one.
September 19, 2025 at 7:45am
#1097658
This might not be helpful to anyone else here. It's barely helpful to me. But, despite it being a Popular Science article, I might have learned something.

    Learning a new language? Here’s how to perfect your pronunciation.  Open in new Window.
Tips to sound better in the top 5 languages Americans are learning right now.


I'm still working on French. The thing I struggle most with now is pronunciation, so the article caught my eye. I'm also not very good at following spoken French, but, to be fair to myself, I need subtitles on my screen to follow the English.

In your language-learner dreams, you may be asking a local what time the train is coming in a perfect Parisian accent, or ordering scialatelli as if you’ve spent your entire life vacationing on the Amalfi Coast.

I'll settle for just not being mocked for my accent.

You know the brand Lululemon? I've never bought anything from them, but even as someone who goes out of his way to avoid ads, it pinged my radar a while back. I was curious why it was named that, so I looked it up, and the claim is that the founder called it something with three Ls so he could laugh at Japanese people trying to say it.

Dick.

If you’re learning a language that doesn’t share roots with your mother tongue, pronunciation can be hard. So hard in fact, that it may hinder the learning process altogether.

It's certainly an obstacle, especially at my advanced age. Not letting that stop me, though.

Mother tongues can also make picking up new lingo simpler or more difficult; it’s easier for an English speaker to learn the similarly-rooted German, and harder for them to learn Italian.

I have my doubts about this claim, but I haven't really tried to learn either. What I do know is that so many of our words come from French (or Latin via French) that English might be more properly classified as a German-French creole. It's considered a Germanic language, as I understand things, because of the grammatical structure more than the words themselves.

Consider the words “this” and “these.” For Spanish speakers, these words are tricky because the “I” sound in “this” doesn’t exist in their native language. They tend to pronounce it as “these.”

And, apparently, so it is in certain British accents. I always used to rag on the Eurythmics for rhyming "this" with "disagree," but that was before I found out about this concept.

The song still sucks, though.

The same happens with the French “R”—it sounds lovely, but if you have a hard time pronouncing it, you’ll still probably be able to communicate with locals during your trip to the Pyrenees mountains.

No, I won't, because the Pyrenees are *shudder* outdoors.

The article dives into specific tips for certain languages, including the bold claim that "Japanese is very similar to Spanish," which I guess is true from the point of view of what sounds are in the language.

The thing that learning another language has really done for me, though? Besides being able to ask a French person what color their cat is, it's that I've become more patient with people for whom English isn't their native language and who speak it with a heavy accent. Though it's embarrassing to me that I was ever not patient with that.
September 18, 2025 at 9:54am
#1097606
I've talked about the possibility of Mars colonization before, but I'm pretty sure it wasn't based on this Vox article, because this one's from just last month.

    Living on Mars would suck  Open in new Window.
The billionaire space race is a dangerous fantasy.


Elon Musk wants a self-sustaining settlement on Mars as a backup for humanity in case the Earth gets destroyed. Jeff Bezos wants us to move heavy industry and all polluting industries to space to save Earth’s climate, and envisions a trillion humans living in space.

Thus showing once again that you don't have to be smart to accumulate wealth. Just lucky.

Mars, for all its flaws — and there are many, including radiation, dust storms, and unbreathable air — is the only planet in our solar system that’s a candidate for settlement.

I shared an article here recently that proposed Titan. Even so, point stands; Titan is a moon.

The rest of today's link is a transcription of a podcast discussion with Adam Becker, who wrote a book, and so this is actually an ad for that book.

Incidentally, "podcast" is going into my list of anachronyms, along with "filming" for making a video, and "footage" for the resulting video.

Mars is a horrible idea. Mars is a terrible place; it’s awful. There’s nothing to breathe. You’ll die of cancer if you hang out there for too long because it’s covered in radiation. The dirt is poisoned. The gravity’s too low. It gets hit with asteroids more often than Earth does. There’s no biosphere. There’s nothing to eat. There’s nothing to breathe. If you hung out on the surface of Mars without a spacesuit, you would asphyxiate while the saliva boils off your tongue.

Also, it ain't the kind of place to raise your kids. In fact, it's cold as hell. (Okay, no, Elton John was wrong there. Sure, a lot of it is cold as hell. But its surface can occasionally reach around 60F, which is cold to me, but not cold as hell.)

When I was a kid, I thought that the future was in space. I watched a lot of Star Trek because I’m a huge nerd, and a young growing nerd needs to consume healthy amounts of Star Trek in order to grow up to be a big, strong nerd.

Now that's poetry right there.

Mars is awful, and there is nothing that could happen to Earth that would make it a worse place than Mars. You could have an asteroid hit as bad as the one that killed off the dinosaurs 66 million years ago. And the day that that happened, which is the worst day in the history of complex life on Earth, was a nicer day than any day on Mars in the last few billion years.

That's a long way to say "A bad day on Earth is still better than a good day on Mars."

And this part, I also agree with, and I've written some variation of it before:

But science fiction is fiction. It is a set of stories that we tell not to predict the future, but as a setting to explore some questions about being human.

There's quite a bit more at the link. I still think a lot of the issues with Mars are engineering problems, and engineering problems can, eventually, be solved. But it's not going to happen anytime soon.

I'd be remiss if I didn't mention something that's been in the news: a rock found by one of our Mars robots that, possibly but not definitely, has features that could be fossilized former life from way back in the planet's history. (Note that this does not mean, or imply, little green Martians. Just microbes or the equivalent.) To have any greater degree of certainty, it'll need to be brought back to Earth. Possibly by humans.

That doesn't mean we can set up shop there permanently. Yet.
September 17, 2025 at 10:52am
#1097537
This article from Big Think is a few years old now, but I only found it recently:

    Geopsychology: Your personality depends on where you live  Open in new Window.
Research suggests there’s truth to regional stereotypes in the U.S. — with some caveats.


First issue: the headline. Suppose they take a poll of 100 people. 99 of them say they're extroverts. You're the one self-described introvert. Headline: You're an extrovert. Reality: you're an outlier.

Further, judging by the headline alone, in which direction does the arrow of causality point, if any? Are you the way you are because you live in a certain place? Or do people with a given personality type tend to prefer the place? Or is it just correlation?

You will, of course, have to go to the article to see the maps and graphs and charts and whatnot.

Does where you live have any bearing on the kind of personality you have? Science says yes, and these maps show how.

"Science" says no such thing. One particular branch of science is trying hard to make "yes" happen.

“Psychogeography” is already taken — basically, it’s a fancy term for “walking while moody.” “Geopsychology,” however, is still available. And it sounds just about right to describe the systematic study of regional differences in the distribution of personality traits, especially since those differences do indeed seem to be “robust.”

Already, I can see a problem: You live in, say, a place associated with a high degree of emotional stability. But you're in the minority, not very stable at all. Someone who's well-versed in this new "science" (which actually appears to be more akin to actuarial work in insurance) might assume that you're emotionally stable. You can easily prove them wrong. You might end up in prison for it, though.

The usual caveat applies: None of these traits should be taken in isolation, neither for cause nor effect. Studies — of twins, for instance — show that these characteristics are about equally influenced by nature and nurture.

This sounds like astrology. "Your sun sign should never be taken in isolation. Your personality is also affected by your moon sign, rising sign, the position of the planets relative to each other, etc."

At least they admit that each trait is on a spectrum, not flipped like a coin; that's one of my major issues with INTJ tests.

Also interesting is the finding that while four out of five traits remain stable into old age, “agreeableness” does show variation as subjects get older, showing that people tend to become more compassionate, cooperative, and trusting as they age.

Are you fucking kidding me? Us oldies are cantankerous and grumpy.

On these maps, orange means higher than average, blue means lower. Darker means greater distance from the average.

Oh, good, an explanation of the colors. I thought I was going to have to get grumpy about that, too. Still, would it have killed them to put a legend on the graphics themselves?

The article goes into more details of the geographic distribution. It is, I must emphasize, mostly limited to the contiguous US. There's also a quick overview of the UK at the end. I'd be curious to see data for other countries.

Do I trust it? No. Does it show promise? Maybe. Is it useful? Eh, I don't know. It's like calling anyone from Gen-X a slacker. Sure, many of us are, and I, for one, have embraced that description. But how much of that is actually me (I was a damn hard worker when I was younger), and how much is a self-fulfilling prophecy?
September 16, 2025 at 9:03am
#1097470
An amusing "news" story from Boston:

    Who put this obscene, somewhat official-looking sign up near the Zakim Bridge?  Open in new Window.
The sign’s appearance prompted one Reddit user to remark, “Stay classy, Boston.”


You'll have to visit the site to see the actual "somewhat official-looking sign," but I'll describe it here:

With the bridge towers in the background, the sign, in standard white-on-brown historical marker colors, features an image of a side view of one of the towers with the quote "This bridge looks like a ***** that's being held up by wire" -Conan O'Brien

So, yes, I find it highly amusing that someone did this. I'm a little disappointed that John Oliver didn't notice it first, because his commentary would have been funnier and racier.

But let's get this clear: there's nothing "obscene" about the sign. The one word that might be objectionable is self-censored. As there are five asterisks in the censorship, I have to assume that the original word wasn't dick, or todger, or tallywhacker, or cock, or schlong or ding-dong or weenie or pecker or tool or knob or... (you get the idea), but "penis."

Sure, it might have been "prick" in O'Brien's original quote. I don't know. But "penis" isn't obscene; it's the dry, official, medical name for the wang.

A guerilla street sign recently popped up near Boston’s Leonard P. Zakim Bunker Hill Memorial Bridge, immortalizing Brookline native Conan O’Brien’s rather poetic description of the iconic infrastructure.

You know what is obscene? Comparing a little street sign to guerilla warfare. Also calling a harmless joke that could bring joy and laughter to millions of people "obscene." And also, not least of all, saddling a poor bridge with the name "Leonard P. Zakim Bunker Hill Memorial Bridge." Even in Boston traffic, by the time you said the name of the bridge, you'd have crossed it.

The above isn't a slight on the late Mr. Zakim, just to be clear. He seemed like a good guy. Just that it's an obvious committee compromise name.

Exactly who erected the sign — and when — remains a mystery...

Okay. "Erected." Now that's funny.

“MassDOT dispatched resources to the location for the sign’s removal, and the sign was removed yesterday.”

But not until after a picture of the sign hit the internet, to be stored there until the heat death of the universe, or until we stop producing electricity, whichever comes first.

I get the need to remove unauthorized signs, of course. Allow even one, and you get more, which leads to transportation chaos—though in Boston, I wonder how anyone could tell the difference.

But look, I don't know how many dicks O'Brien has seen, but those towers don't look anything like human hydraulics. Sure, from some angles, they resemble obelisks (like the Washington Monument), but the whole point (pun intended) of an obelisk is that it's a stylized depiction of the male member. That is, it's not intended to be realistic, but a metaphorical, artistic, deliberately formalized representation of masculine potency. Also, I challenge anyone to come up with a suspension bridge design that doesn't involve tall towers, and tall towers will always look phallic if that's what you're looking for.

Maybe the Brooklyn Bridge in that other major East Coast city isn't phallic, but it's an older, less materially efficient design.

The sign, though? Not obscene. "But, Waltz, what about the children?" And? Half of them have johnsons, and all of them owe their existence, in part, to one. By the time they'd be able to actually read the sign, they'd have to understand what the asterisks might be censoring, and by then it's too late: their little innocence is already gone. As for adults who find it objectionable, fucking get over yourselves.

I'd be remiss if I didn't acknowledge that even the word "penis" is indeed non-E on Writing.Com. But, to put it in WDC content rating terms, that sign is ASR at worst, especially with the self-censorship.

So, in summary, yes, signs like that have to be removed. Not due to obscenity, but to maintain some semblance of order on the streets. And I disagree with the comedian's assessment.

But that doesn't mean I didn't laugh.
September 15, 2025 at 8:36am
#1097412
Okay. Fine. I give up.

I ran the RNG as usual this morning, but when I went to the corresponding saved link, I found it had been paywalled. Too bad; it was an interesting story about the origins of a cryptid. But I couldn't find a non-paywalled version, so, ploink, into the trash.

Then I pulled another number out of the metaphorical hat and, behold: another paywall. This one would have been about the conception of the universe in the Middle Ages. But no, we can't have nice things.

Just to be clear, I'm not expecting everything for free. If I'm interested enough in a website, I'll subscribe. But what it does is put a damper on sharing articles. Now they've missed out on not only my subscription, but that of my legions of fans. I'm sure they're wailing and moaning about having three fewer subscribers, right now.

People who do the work deserve to be paid for it. Even writers. Traditionally, the main source of income for article writers is, ultimately, advertising. But no one seems to be able to make non-annoying ads; when I find something to read on my phone (which doesn't have ad-block), the ads render the site essentially unreadable, what with hidden click-through points and constantly shifting floating ads.

Hence my use of ad-block, which, look, I know that makes me part of the problem, but there's a war on and I need my defenses.

Anyway, from what I've been hearing, ad revenue is drying up (possibly due to the proliferation of ad-blockers). Likely, this has prompted more sites to go to a paid subscription model. That can work if the site has decent content on a regular basis, but again, I can't share those articles here.

The existential problem with the advertising model, though, is that the advertiser gets to control your content, at least to some extent. Publish something controversial? Ads get pulled. Show a political bias different from the advertiser's? Ads get pulled. Dare to show a bare breast? Ads (and maybe other things) get pulled.

It's the golden rule once again: they who have the gold make the rules.

I'm not saying I'm changing the way I do blog entries. Just taking a break today to express my annoyance at the festering pile of bantha fodder most of the internet has become.
September 14, 2025 at 10:47am
#1097342
Phun with philosophy today, from aeon:

    Reality is evil  Open in new Window.
Everything eats and is eaten. Everything destroys and is destroyed. It is our moral duty to strike back at the Universe


You know, every so often, Gnosticism pops up with different clothing and a fake nose. One of its basic beliefs is that the world is evil and one must work to find whatever "real" reality there is.

So, just based on the headlines, I figured here was another Gnostic in disguise.

Reality is not what you think it is.

One way to control you is to convince you that what is obviously reality is not reality, but the real reality is hidden away. This is, I believe, a form of gaslighting.

Like everything else that exists – stars, microbes, oil, dolphins, shadows, dust and cities – we are nothing more than cups destined to shatter endlessly through time until there is nothing left to break. This, according to the conclusions of scientists over the past two centuries, is the quiet horror that structures existence itself.

I suppose, from some point of view, it's horror. From another point of view, perhaps there's some comfort to be had in knowing that everything faces the same fate. No one is privileged.

Reality, as we now understand, does not tend towards existential flourishing and eternal becoming. Instead, systems collapse, things break down, and time tends irreversibly towards disorder and eventual annihilation.

And? It's not going to happen for billions of years. As I've always said: there's no such thing as a happy ending; there are only stories that end too soon.

We must start by admitting that the Universe is finite and will eventually end. Moreover, we must accept that the function of the Universe is to hasten this extinction.

I've said before that it very well may be that if life has a purpose, it's to accelerate entropy, so I've already done that admitting and accepting.

A metaphysics that responds to the full scope of the thermodynamic revolution needs to acknowledge the dissipative and destructive function lying behind the ‘generative’ force seemingly at work within reality. To do so requires moving from the classical optimistic metaphysics of becoming to a much more pessimistic metaphysics of absolute finitude and inescapable unbecoming: a metaphysics that reconceives of beings as nothing more than dissipative cogs in an annihilative machine.

Wow, I bet this guy's fun at parties. I should know; I'm fun at parties.

There's a lot more, of course. And just to summarize my own thoughts: I start with the same facts this author does, but I come to a somewhat different conclusion. My conclusion is this: Sure, the universe is trying to kill us. All the more reason to laugh in its face.
September 13, 2025 at 9:19am
#1097273
After yesterday's gaze into the abyss, it's only fair that I share today's article, from Slate, which talks about something everyone knows: zucchini.

    The Vegetable That Wants to Die  Open in new Window.
Zucchini doesn’t even like itself, yet every summer, we pretend it’s worth growing and cooking. Is there any way to actually make it taste good?


Except not everyone knows it, do they? A good chunk of English speakers, as well as most Francophones, call it courgette. Just to add to the confusion, "squash" refers to a British drink that isn't beer or tea.

Why we refer to it by its Italian name is interesting, too. All squash is native to the Americas. But the particular variety with the green skin was bred in Milan less than two centuries ago. I can only assume that the French called it something else because they're French and needed to distinguish their cuisine from that of Italy. ("Zucchini" is plural; the singular form is zucchino, which I played with in the title today.)

A few summers ago, while I was visiting family in western Pennsylvania, my parents’ neighbor sauntered over and “gifted” us some garden zucchini... I was annoyed. Our neighbor hadn’t gifted us anything, he’d encumbered us with tough, water-logged, flavorless vegetable mass.

I'll admit, I'm not a big fan of zucchini. Partly, this is because, when I was a kid, we were that neighbor; there was entirely too much of it in our garden for us to eat. Partly, it's because my mom couldn't cook it worth a damn, and that sort of thing sticks with you. But to call it "flavorless" is a bit of a stretch, in my opinion. It has a flavor; just one I don't particularly like.

To be clear, I still consider it food. This is not the case with, say, eggplant (aubergine), which I consider not-food.

Tastes, however, vary, and I know from experience on both sides how hard it is to get someone to like something they just don't.

Because zucchini is a culinary pain in the ass, and people are running out of ways to cook it.

It can also be a literal pain in the ass, if you know what I mean.

It’s a scourge of a plant that grows fast and is prolific.

Can't argue with that. While it's been decades since I grew zucchini (or anything else; ever since I left the farm, every useful plant in my vicinity has committed suicide), I still have nightmares about the bushels of the stuff I'd have to pick.

The hard fact is that among summer produce, zucchini just isn’t desirable, and it’s not an ingredient that I particularly like to cook with, either.

I don't think of it as an ingredient. I think of it as a side dish.

The traditional cooking methods that most recipes bring to bear don’t really do zucchini many favors (or flavors). The internet is overgrown with zucchini bread recipes (which, guys, is just spice cake with weird, wet fiber smuggled inside).

A friend of mine once made a zucchini pie. Like, a sweet pie, not a savory one, more like apple pie. Well, she might have done it more than once, but I was there for it once. Best I can say about it is that it was edible.

Sautéed zucchini and yellow squash is a classic side dish, but generally just reminds me of the boring meat/potatoes/vegetable plates you’d encounter at a middle-of-the-road steakhouse.

I think the problem with being a food writer is that you get jaded pretty quickly, and always seek out the new and shiny over the tried and true classics. Me? I'm not a food writer; I just sometimes write about food. So rather than go seeking out a new way to cook something, I prefer to master the old standbys.

Could it be that zucchini is a good or, dare I say, even great vegetable that’s been the victim of passionless flavors and ideas? Maybe zucchini isn’t the problem. Maybe I am.

No comment.

More flavorful solutions for zucchini abound. A proper Indian or Thai zucchini curry is wonderful, and the squash swells with deliciously flavorful aromatics like ginger and cumin.

And this is where I started to get interested. Not in zucchini, mind you: I'm of the considered opinion that if you have to do too much to a food to make it palatable, it becomes not worth it unless you're in survival mode. But, as I noted above, zucchini is a squash bred in Italy from stock that's native to the Americas. And now, here we are, with south Asians incorporating it into their dishes, on the entire other side of the planet from its origins.

That's the thing about food: once it's out there, it's out there, and it's fascinating to me how this works on a global scale.

Which is to say, zucchini is definitely work. It needs to be cared for, both in the garden and in the kitchen. Understandably, that may be more effort than the average person is willing to put in.

And way more effort than I, a distinctly below-average person, am willing to put in.

But I'm perfectly willing to try the product of others' creativity and hard work. I might even compliment it. As long as no eggplant is involved.
September 12, 2025 at 7:59am
#1097212
This Nautilus article is fairly old, and long, and honestly may not be of interest to anyone but me.

    The Man Who Tried to Redeem the World with Logic  Open in new Window.
Walter Pitts rose from the streets to MIT, but couldn’t escape himself.


And no, he wasn't a Vulcan.

Not wanting to risk another run-in that night, Pitts stayed hidden until the library closed for the evening. Alone, he wandered through the stacks of books until he came across Principia Mathematica, a three-volume tome written by Bertrand Russell and Alfred Whitehead between 1910 and 1913, which attempted to reduce all of mathematics to pure logic.

While I'd never heard of Pitts before this, the Principia Mathematica is something I'd known about. A lot of work went into that ponderous set of tomes before Russell apparently came to the conclusion that it can't be done.

But this is more than a story about a fruitful research collaboration. It is also about the bonds of friendship, the fragility of the mind, and the limits of logic’s ability to redeem a messy and imperfect world.

One of the lessons of the story of PM itself is that there are some things that can never be proven within a logical framework. Another is that it certainly seems weird even to arithmophobes that it took over 350 pages to show that 1+1=2.

McCulloch, 42 years old when he met Pitts, was a confident, gray-eyed, wild-bearded, chain-smoking philosopher-poet who lived on whiskey and ice cream and never went to bed before 4 a.m. Pitts, 18, was small and shy...

And yet, the two died within half a year of each other.

The moment they spoke, they realized they shared a hero in common: Gottfried Leibniz. The 17th-century philosopher had attempted to create an alphabet of human thought, each letter of which represented a concept and could be combined and manipulated according to a set of logical rules to compute all knowledge—a vision that promised to transform the imperfect outside world into the rational sanctuary of a library.

Leibniz is better known as Newton's rival in mathematics. The two developed calculus independently.

McCulloch, Pitts, and Lettvin were all poets at heart and in practice, and McCulloch and Lettvin regularly published their verse.

I'm just including this bit to show how you don't have to be a Writer to be a writer.

There is, of course, a lot more at the article, and it delves into the development of neuroscience and computer science, among other tough subjects. So, like I said, I understand if it's a little overwhelming. But personally, I like reading about the people doing the science, because they are, after all, people, with all the illogical and poetic crud that implies.
September 11, 2025 at 9:19am
#1097110
Discussions like this one in Big Think have been going on for a long time, and will continue.

    What exactly is “life”? Astrobiologists still have more questions than answers  Open in new Window.
Our Earthbound definitions of life could leave us blind to the Universe’s strangest forms.


Okay, but I'd think that's inevitable, since we don't and can't know everything.

Defining exactly what we mean by “life” — in all its varied forms — has long been a formidable challenge.

Some things resist easy categorization.

Physicist Erwin Schrödinger wrote a book titled What is Life? in 1944.

I haven't read the book, so I wonder: is the title question asked in a tone like a little kid asking their parents the Big Questions?

By the traditional dictionary definition, “life” requires metabolism, growth, replication, and adaptation to the environment. Most scientists, therefore, don’t consider viruses alive because they can’t reproduce and grow by themselves and do not metabolize. Yet they possess a genetic mechanism that enables them to reproduce, with the help of a living cell.

So the problem here, as I see it, is one of categorization, not of science. Not to mention "traditional dictionary definitions" aren't the same thing as scientific definitions. But mostly, we like to label things and put them in clearly-divided cubbies. Unfortunately for us, things are rarely that neat. For another, simpler example, consider the uproar when the definition of "planet" was set up, and the definition excluded Pluto. But people can make up their own set boundaries (they just won't be accepted by science). If you want to expand the definition of "life" to include any reproducing system, you can do that. It may not comport with what we colloquially know as "life," but you can do it.

Because astrobiologists think not only about life as we know it but also life as we might find it, some of them gravitated toward the broad definition of life proposed by NASA: “A self-sustaining chemical system capable of Darwinian evolution.”

I'm sure a great deal of thought went into that definition, but it seems pretty broad from my outside point of view. The article questions it, too.

As usual, when considering such questions, we’re hampered by our limited state of knowledge.

Well, yeah. If we didn't lack the knowledge, we wouldn't be seeking it, would we? You can use "hampered," but I might have picked "challenged."

There is also the N=1 problem. How can we expect to arrive at a good definition of life when we have only one example: life on Earth?

Well, see, we can't even arrive at a good definition for life on Earth, as the article shows. If we find stuff on another planet that quacks like life and waddles like life, we'll call it "life," and modify the definition accordingly.

That's not how science works, I know, but again, the categorization question isn't really a science question, but a philosophy question that needs to be informed by science.

Maybe it’s partly a linguistic question. Grammatically speaking, “life” is a noun. But in biological terms, it’s more like a verb — more of a process than a thing. Defining life is something like defining wind, which describes air in motion — a state of being rather than a specific object. Wind molecules are the same as those of air, but their dynamic state is what defines them.

Okay, but there are processes that we don't define as life: the hydrologic cycle, for instance; or fire.

Maybe we should be consulting philosophers.

Finally.

I'm not sure if I was really clear above, so I'll reiterate this: Sure, we don't know. As I noted a few days ago, the universe is large and, practically, no one can explore it completely; consequently, there will always be things we don't know. And to me, that's a good thing; it means we can still learn.


© Copyright 2025 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
