Not for the faint of art.
Complex Numbers

A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number.

The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi.

Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary.

Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.
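
As a quick illustration of that last point, here's a minimal sketch (my own; not part of the description above) in Python: iterate z -> z^2 + c for each point c of the plane, and the points whose orbits stay bounded trace out the Mandelbrot set, probably the most famous of those fractal structures.

    # A minimal sketch (mine, not part of the blog description): iterating
    # z -> z^2 + c over points c of the complex plane traces out the
    # Mandelbrot set, one of those fractal structures of enormous intricacy.
    def escape_time(c: complex, max_iter: int = 50) -> int:
        """Count iterations of z -> z^2 + c before |z| exceeds 2."""
        z = 0j
        for n in range(max_iter):
            z = z * z + c
            if abs(z) > 2:
                return n
        return max_iter

    # Crude ASCII rendering: '#' marks points whose orbits never escape.
    for im in range(12, -13, -2):
        row = ""
        for re in range(-40, 21):
            row += "#" if escape_time(complex(re / 20, im / 10)) == 50 else " "
        print(row)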




June 10, 2024 at 9:56am
#1072422
The person (P.Z. Myers) who writes the blog I'm linking today is a scientist—but he's a biologist, not an astronomer. So his take on astronomy probably isn't any more reliable than mine. This doesn't stop either of us from writing about it.



I'm sharing this because 1) I would love to see a supernova before I kick it (though hopefully not one that's too close) and 2) I'll take any opportunity to call out misinformation.

Something funny is going on 650 light years away…or should I use the past tense? Something funny was going on 650 years ago.

Yes, we know light has a speed, and we don't have good grammatical tenses for it. The way I look at it is: what matters is when we see the event; the distance (or time) when it happened is immaterial.

The star Betelgeuse is/was acting up, dimming and then brightening (well, it’s always been flickering a bit, but this was a greater reduction in brightness than usual.)

This is cool by itself, because it contributes to astronomers' understanding of how stars work.

And now some people are saying it’s about to go supernova!

Well, it is. For cosmological definitions of "about to."

There is a real-time deathwatch on YouTube. “LIVE Betelgeuse Supernova Explosion Is Finally HAPPENING NOW!” it says.

And that's where the misinformation, or at least misunderstanding, comes in.

I'll let PZ explain there at the link, or at least quote actual star-gazer type scientists.

Awww, but it sounds like it will be spectacular when we do get the Giant Space Kablooiee, and not spectacularly dangerous, the best kind of spectacular there is.

The Giant Space Kablooiee is not to be confused with the Horrendous Space Kablooie, which was, to the best of our knowledge, what kick-started this enormous thing we call the Universe. It's not even in the same ballpark. To quote Jules from Pulp Fiction, it's not even the same sport.

But yeah, the science-talkers who should know stuff about these things don't think it'll do much, if any, harm to Earth when it happens. But "spectacular" is an appropriate adjective.

I do wonder if that guy running the live video feed is prepared to keep it going for 10,000 years.

No, but he's probably prepared to collect ad revenue from YouTube, taking advantage of the misinformed. Which is why I'm not going to click on that video.

Perhaps I am more cynical than PZ.

To be clear: we don't know enough about stellar lifecycles to predict just when Betelgeuse, or any other star of appropriate mass and age, will become a supernova. When it happens, next year or 10 millennia from now or somewhere in between, if there are still scientists around at that point, we'll learn more.

Hell, even the "650 light years" thing might not be correct. For whatever reason (it's discussed at the link I'm about to provide, though with an overwhelming amount of math), it's incredibly hard to estimate that star's distance with high certainty. Wikipedia puts it at 400-600 ly, which, even on the low end, is still supposed to be comfortably far for a supernova (though close enough to look awesome).

Now, I've written about the probably-not-a-supernova-anytime-soon bit before, notably here: "Stardust". Today's update is more about the "live feed" nonsense. And maybe a bit of a distance correction.

If you want a live feed that's actually interesting, I usually remember to check this one out occasionally around late spring/early summer. Bears!
June 9, 2024 at 8:44am
#1072376
Delving into the depths of the past once again, the random numbers landed me on this origin story of sorts, from all the way back in 2008: "Housekeeping".

On my computer at work, I have a Favorites folder called "Blog Fodder."

Fortunately, I no longer waste time at work, and my Blog Fodder list resides on my laptop. Nowadays, of course, these things are portable between devices, just when that feature became nearly useless to me.

Into it I drop the random links people send me, some of which end up here.

Every once in a while, someone will still send me a link, and I usually appreciate it. But I find most of my material from other sources. No, not social media, which I generally shun.

But there's usually more links than I want to blog about, or maybe some of them fit a theme while others don't.

Obviously, I ended up resolving this non-issue by picking just one at random when it's blogging time.

The rest of that entry was me cleaning house (hence the entry title) by dumping three links with brief quotes and commentary.

I won't bother rehashing that bit. The links are, remarkably, all still active as of right now, but being older and dumber now, I don't think I find them as amusing as I did 16 years ago.

And it's not really an origin story; apparently, I'd been commenting on links for a while even then. But that might have been the first time I explained anything about my process.
June 8, 2024 at 9:18am
#1072332
This piece, a recent one from The Guardian, is about communication, so it may be of interest to writers here.

    The big idea: the simple trick that can sabotage your critical thinking  
Influencers and politicians use snappy cliches to get you on side – but you can fight fire with fire


Ha! I see what you did there, headline writer.

Since the moment I learned about the concept of the “thought-terminating cliche” I’ve been seeing them everywhere I look: in televised political debates, in flouncily stencilled motivational posters, in the hashtag wisdom that clogs my social media feeds.

No, you've always been seeing them. Now, you recognize them for what they are.

Coined in 1961 by psychiatrist Robert Jay Lifton, the phrase describes a catchy platitude aimed at shutting down or bypassing independent thinking and questioning.

These days, when I see or hear "platitude," I picture an angry platypus (one with an attitude). Look, it helps me, okay? Apparently, the word derives from the French plat, meaning flat or dull (I guess in the same sense that we use "flat" for non-glossy paints). But in that same language, plat also means dish—in the same way that, in English, "dish" can refer to both a plate (a word obviously also derived from plat) and the food you put on it to eat. This makes a platitude feel like something easily prepared and consumed, which is really damned appropriate.

Also, I'm pretty sure it was Ogden Nash who came up with the phrase "duck-billed platitude."

But I digress, as per usual.

In his book Thought Reform and the Psychology of Totalism, Lifton wrote that these semantic stop signs compress “the most far-reaching and complex of human problems … into brief, highly selective, definitive-sounding phrases, easily memorized and easily expressed. They become the start and finish of any ideological analysis.”

That's a whole lot of words to express what I call "bumper-sticker philosophy," which I suppose is my own thought-terminating cliché.

As the article points out, though, TTCs (do not expect me to type the phrase every damn time) aren't necessarily bad. Like any word or phrase, they can be used for good or evil.

Unfortunately, mere awareness of such tricks is not always enough to help us resist their influence. For this, we can blame the “illusory truth effect” – a cognitive bias defined by the unconscious yet pervasive tendency to trust a statement simply because we have heard it multiple times.

This gets weaponized a lot, too. Keep repeating things like "greed is good," and people start to believe it. Or, like someone we all know of, continue to lie and your minions lap it up as divine Truth, even though it's objectively a lie.

But what if we could do that with the actual truth?

To compete in the marketplace of thought-terminating cliches, then, our best bet might be to take what we know about illusory truth and harness it to spread accurate information.

Like that.

Beyond repetition, studies show that people perceive statements as more believable when presented in easy-to-read fonts or easy-to-understand speech styles, such as rhyme.

Or, you know, alliteration like in my example above.

This, of course, is where writers come in.

And a 2021 study showed that humour is among the qualities that make information more memorable and shareable. A titbit is “just more likely to spread if it’s funny”, says Scheirer.

Did you know that we here in America call small morsels of food or information tidbits instead of titbits because our Puritanical sensibilities required the change? Yeah? Then you were subjected to misinformation.

Probably. I mean, it felt right to me, too, because I'm not without bias, myself. But apparently, "tit" has at least two entirely separate etymologies.

It doesn’t only have to be shameless disinformers who exploit the power of repetition, rhyme, pleasing graphics and funny memes. “Remember, it’s OK to repeat true information,” says Fazio.

The trick, of course, is to know what's true, and in a postmodern world, that's becoming increasingly difficult. Even facts that should be universally accepted, such as the approximate shape or age of the Earth, are subject to strong opposition.

As the article implies, though, this doesn't mean we shouldn't make the attempt.
June 7, 2024 at 10:40am
#1072278
Well, this is one way to get attention. And possibly draw more countries into WW3.

    Croissants aren’t French and pizza sauce isn’t Italian – the national dishes that aren’t from where you think  
A food historian has kicked up controversy after claiming that there is ‘no such thing’ as Italian cuisine, sparking debate over the origins and ownership of food. But perhaps we should reconsider our ideas about so-called ‘national dishes’, suggests Hannah Twiggs


In a new book called La Cucina Italiana Non Esiste (literally “Italian Cuisine Does Not Exist”), food historian Alberto Grandi claims, among other things, that Italians only discovered tomato sauce when they emigrated to the Americas, where tomatoes are native, in the 19th century.

It's trivially true that Europe in general had no idea about tomatoes until they started sailing back and forth across the Atlantic. That's hardly news. Reports of the fruit/vegetable's suspected toxicity may have been exaggerated, but clearly, at some point, they started eating them over there. The "tomato sauce" thing, though, I hadn't heard that before.

In an interview with the FT last year, he said that everything from parmesan and panettone to carbonara and tiramisu weren’t fundamentally Italian.

Okay, for various definitions of "fundamentally," I suppose. Are we going to take what they ate in ancient Rome, or Pompeii, and call that the only true Italian cuisine? Do we reject the idea of pasta as Italian because Marco Polo brought the idea for noodles back from Cathay? I mean, I don't know for sure if that happened, but my point remains.

Perhaps most controversially, he claimed that parmesan produced in Wisconsin was more authentic than Italy’s because it was closer to the original cheese produced in Parma-Reggio a millennium ago.

Ah, yes, the old "it's only authentic if it's done the old way" argument. Which is, generally, nonsense. Beer, for example, is a much more palatable product now than it would have been "a millennium ago," thanks to hops and science. Advances in the culinary arts, from technology, experimentation, imports, or just plain accident, contribute to the cultural landscape, not detract from it.

He was quick to point out, though, that he’s never questioned the quality of Italian food or products. “The point is that we confuse identity with the roots, which we are crossbreeding,” he told La Repubblica. “We wrongly talk about identity: cuisine changes continuously.” For example, the Italians and French are the biggest consumers of sushi in Europe.

As far as I can tell, everyone borrows or steals stuff they like from other cultures' cooking. Sometimes, it becomes so popular that it's identified with the new culture more than the old.

Case in point: Apple pie wasn't invented in America.  

Michele Pascarella, owner of Chiswick restaurant Napoli on the Road – who has won countless awards for his pizza (eighth best in Europe) including being declared best pizzaiolo in the world last year – says it’s not about who does it first, but who does it best. “Italy is a country with an enormous food culture, passed down through generations, that doesn’t need to win any contest for who did it first in the world,” he tells me. “Our cuisine is envied all over the world and we even continue to make a difference today. Alberto Grandi is the flat-earther of gastronomy.”

I'm including this quote because "Alberto Grandi is the flat-earther of gastronomy" is one of the sickest burns ever ignited.

Nor do you hear protestations from the Japanese about tempura or the Indians about vindaloo. Both have Portuguese origins. Catholic missionaries brought the Western-style cooking method of deep frying to Japan in the 16th century, while vindaloo is derived from the Portuguese “vinha de alhos”, referring to the dish’s two main ingredients, wine and garlic.

I've written about this sort of thing before, when another source claimed there's no such thing as English food: "Food, Glorious Food

With all that in mind, I have to wonder: does it really matter who invented tomato sauce on pizza? Or where your croissant is from? To acknowledge that one country might have had an impact on the food of another isn’t to be complicit in cultural appropriation. Point to any dish on a menu and you’ll have a hard time finding one that hasn’t got war, politics, economics, emigration or poverty to thank for its place there.

Not just the dishes, either. Here in the US, it's standard practice to keep salt and pepper shakers on the table. It's ubiquitous. Look up the origin of black pepper sometime. I've done that before, but I can't be arsed to find it again right now.

I'll just close today with this: one of the greatest benefits of living in a technoglobal society is that we get to pull from everywhere. All due respect to the well-meaning folks who urge "eat local," but there's no way I'm willingly giving up the benefits of being able to order tea from China, spices from India, hot sauce from Belize, beer from Belgium, wine from France, etc. (Yes, I know we have perfectly good beer and wine right here; it's the principle of the thing.) And those are just the tangibles; more importantly, we get ideas from all over the world. Some of those ideas suck, but some of them (like beer) make life worth living.
June 6, 2024 at 9:51am
#1072234
From Wired, some panic about a thing that might not even happen.

    What Happens If a Space Elevator Breaks  
These structures are a sci-fi solution to the problem of getting objects into orbit without a rocket—but you don’t want to be under one if the cable snaps.


Yes, they're a staple of science fiction, but early concepts for them were dreamed up way back in the 19th century, before even controlled, powered atmospheric flight, and well before space travel was achieved.

They're also called Tsiolkovsky towers after the Russian who apparently first conceptualized them. According to Wiki, other names include "space bridge, star ladder, and orbital lift." Personally, I find all those names descriptive but boring. No, I think it should be called a space lift. Because it rhymes with "face lift," and I find that amusing. Everything should be named in accordance with what I find amusing.

In the first episode of the Foundation series on Apple TV, we see a terrorist try to destroy the space elevator used by the Galactic Empire. This seems like a great chance to talk about the physics of space elevators and to consider what would happen if one exploded.

Not even the prospect of a Foundation series is enough to get me to pay any attention to Apple TV. But the idea, as I said, isn't new, nor is thinking about the security issues surrounding them. If you think about it, the idea of a tower connecting Earth to space will inherently offend a large portion of the population, who will inevitably call it the Tower of Babel, decide that it's against God's will, and actively try to blow it up.

This is why we can't have nice things.

Well, actually, we can't have a space lift because, so far, no one's been able to come up with a material with the appropriate properties, or to address numerous other issues, including powering the thing or shielding any passengers from space radiation. As far as I know.
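
Just to put a number on why this is so hard, here's a back-of-the-envelope sketch (mine, using standard orbital mechanics and round-number constants, nothing from the article): the top station has to sit at or beyond geostationary altitude, which Kepler's third law puts around 36,000 km up. That's the minimum cable length.

    # A back-of-the-envelope sketch (mine, standard orbital mechanics; the
    # constants are round numbers, not anything from the article): the top
    # station of a space lift has to sit at or beyond geostationary altitude.
    import math

    GM = 3.986e14   # Earth's gravitational parameter, m^3/s^2
    T = 86164.0     # one sidereal day, in seconds

    # Kepler's third law: r^3 = GM * T^2 / (4 * pi^2)
    r = (GM * T ** 2 / (4 * math.pi ** 2)) ** (1 / 3)
    altitude_km = (r - 6.371e6) / 1000  # subtract Earth's mean radius

    print(f"Cable length to geostationary: ~{altitude_km:,.0f} km")  # ~35,800 km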

Anyway, the article mentions that, sort of, but starts out by explaining, in great detail, how a space lift would be an improvement over rockets. This takes up most of the article, and I'm not quoting that part. Only after that does it get into what the headline promised.

In the first episode of Foundation, some people decide to set off explosives that separate the space elevator’s top station from the rest of the cable. The cable falls to the surface of the planet and does some real damage down there.

Again, not the first time this has been considered. I don't remember a scene like that from the Foundation books (by Asimov, whose entire output I devoured as a teen), but it's been a very long time. I do remember Kim Stanley Robinson addressing the issue in his Mars series.

The (over-simplistic, which the author acknowledges) video model in the article aligns with what I recall of Robinson's description.

So not only is building a space elevator very difficult, but you really don’t want the cable to snap and fall.

At this point, though, it's firmly in the realm of speculation, both in science fiction and actual science. The problem may be so tough that we'll actually develop antigravity first, which would negate the need for a space lift. Which is why you don't see many space lifts in Star Trek (budget considerations notwithstanding).

Of course, all of that assumes that a small remnant of humanity doesn't get plunged back into subsistence living before either happens, which I'm not prepared to rule out.
June 5, 2024 at 9:47am
#1072179
Part of the whole point of experiments is to falsify a hypothesis. But sometimes, if your hypothesis is precious to you, and your experiment fails to support it, well, really, your only choices are to lie about it or suppress the results.

Or you could, you know... own up to being wrong, like some of these folks highlighted in Cracked.

    5 Experiments That Proved the Exact Opposite of What They Wanted  
Turns out Flat Earthers and anti-Semites are no match for science


Did you know that visitors to this site are statistically likely to be sexier than people in general?

Well, obviously I always suspected it.

At least, that’s the hypothesis behind the new study we’re doing.

For some reason, I wasn't contacted about the study.

Sometimes, a study doesn’t prove what it sets out to.

Seriously, though, that's usually a good thing. For starters, it's job security for researchers.

5 The Wallet Experiment

You find a wallet someone dropped. Do you seek to return it, or do you keep the cash and toss the rest?


Toss a perfectly good wallet containing perfectly good identity theft material? No way!

Some wallets contained cash, while others did not. Researchers predicted that people would more likely return wallets that held no money. The researchers also held surveys before the experiment, asking either the general population or hundreds of economists, and all agreed that the cashless wallets would more likely be returned.

Well, sure. That's just common sense.

The experiment ended up offering all kinds of data about how people vary in honesty, but it showed one consistent trend everywhere: People were always more likely to return a wallet that held money.

And this supports my opposition to "common sense."

One possible explanation, that people hoped for a larger reward from the money wallets, did not hold water.

They seem to have overlooked the most likely explanation: Contrary to common sense, people in general don't suck. Most of them are good, or at least neutral. This ties in with my Lone Asshole Theory, that while the majority of people are decent, all it takes is one bad one to ruin your day.

4 The Attempt to Make Cuddlier Hamsters

There's science, and then there's mad science.

In 2022, scientists used CRISPR tech to totally remove a type of receptor from the brains of hamsters. These receptors are called Avpr1as, and they respond to a chemical called arginine-vasopressin, which makes males aggressive. By removing the receptors, the scientists figured we’d render the males passive and cuddly.

Spoiler: it had the opposite effect.

And this is why science fiction should be required reading/viewing for scientists. In this case, specifically, Serenity, from 2005.

3 The German Census of Jewish Shirkers

Today's equivalent might be "the fundamentalist Christian census of drag queen pedophiles."

So, the government had expected to find evidence of Jewish soldiers shirking duty, but they found the opposite.

Bet they didn't like that.

Instead, the census really did debunk those theories, the way the government claimed they wanted but really did not. So, authorities responded by refusing to release the results.

No, they did not like that at all.

Apparently, the other option (flat-out lying about it) didn't occur to them, so... point for prewar Germany?

Nah.

2 Zeeman and His Math Sphere

And we both just lost half our readers.

Not going to quote this one; the article makes it complex enough.

1 The Flat Earthers’ $20,000 Gyroscope Investigation

Speaking of dimensional spheres, if you’re seeking evidence that the world is round, you’ll manage that easily. Just watch the sunrise over the horizon, and boom, you’ve seen the curvature of the Earth.

Except that flat-earthers find ways to convince themselves that this doesn't mean Earth is roughly spherical.

If you’re seeking evidence that the world is flat, however, well, that’s a bit tougher.

Yeah, that would require a trip to North Dakota.

(People think Kansas is flat, but it's got nothing on ND.)

A second experiment used more advanced technology. Host Bob Knodel brought out a $20,000 laser gyroscope, which should keep to the same vertical orientation no matter what goes on with the ground beneath it.

At least they came up with an experiment. Most flat-earthers just spout their nonsense on YouTube or whatever.

What I find ironic about this is that the idea that Earth is round, and the technology behind $20,000 laser gyroscopes, are both products of science. How a person can accept one without the other is questionable.

This, of course, reminded me of "Mad" Mike Hughes, a flat-earther who built a fucking rocket to "prove" his hypothesis. Spoiler: He died in the process. I've written about him before, mostly here: "Bad Advice".

Troublesome data must be rejected. When experiments challenge your world view, you must switch to a different experiment you can trust.

Well, no. But that's certainly how some people get through life.
June 4, 2024 at 9:27am
#1072131
There are topics I've never discussed before in here. This isn't one of them. From Gizmodo:

    Updated Formula on Alien Intelligence Suggests We Really Are Alone in the Galaxy  
An adjustment to the famous Drake Equation could radically refine estimates of intelligent civilizations in our galaxy.


Sigh.

Using the adjective "intelligent" there just begs people to make tiresome "no intelligent life here, either" jokes. In an effort to forestall this, I'll use "tech-capable," as that's really what we're looking for, and there's little debate over whether we fit that description (any such debate that takes place on the internet would be the dictionary definition of irony).

Astronomer Frank Drake...

Unclear if he's related to Sir Francis Drake.

...formulated his influential equation in 1961 to estimate the number of civilizations in the Milky Way capable of communicating with us.

Eh... not really. It's more subtle than that. Look, this article fails on many levels, but the very top level is that it doesn't explain the Drake Equation, which isn't really an equation but a way to think about these things systematically.

I'm not going to copy it here, either, because I can't be arsed to do the formatting. But at least I'm providing a link to the Wiki page on it.  

I also want to emphasize that the search for tech-capable life (or, by extension, any autonomous machines they might have created) isn't the same thing as the search for extraterrestrial life. The latter doesn't necessarily result in the former; as I've noted before, there's nothing inherent to evolution that requires the development of tool-using species who go on to build spacecraft. All we know for sure is that it can happen, as it did happen here. On the other hand, I think it's a safe assumption that tech-capable life requires non-tech-capable predecessors.

Fortunately, the Drake Equation explicitly separates those two.

Since Drake thought up this way of looking at things, we've gotten a better handle on a lot of the parameters. For one thing, back in 1961, we had absolutely no evidence that other stars had planets orbiting them. Now, based on the sample set of our relatively close stellar neighborhood, it seems likely that the vast majority do.

Anyway, back to the article.

Our understanding of planetary science has changed a lot since then, leading a team of scientists to propose a pair of important adjustments that produce an answer that could explain the Great Silence.

This is, as the article notes in the next paragraph, related to the Fermi Paradox, which I mentioned last week.

However, as we'll see, these "adjustments" don't need to be incorporated into the Drake Equation, as they're inherent to one of its factors.

Planetary scientists Robert Stern from the University of Texas at Dallas and Taras Gerya from ETH-Zurich, the two co-authors on the study, suggest that the presence of both continents and oceans, along with long-term plate tectonics, is critical for the emergence of advanced civilizations.

With all due respect to the credentials of the scientists in question, I, though admittedly an amateur, feel like it really just puts a lower limit on the f_i term in the Equation, the one that represents the fraction of life-bearing planets that go on to produce tech-capable species.

In other words, as we have absolutely no idea what f_i actually is, having only a sample set of 1 to work with, it's just as valid to incorporate the ideas of continents and continental drift into that term, which probably makes it lower. That fraction might be 1/1. But it also might be 1/10^100. Or lower. Or anything in between. As I've noted before, we might have won the planetary lottery, and once you've won the lottery, the prior chance of having won it is irrelevant.

They consequently propose the addition of two factors into the equation: the fraction of habitable planets with significant continents and oceans and the fraction of those planets with plate tectonics operating for at least 500 million years. This adjustment, however, significantly reduces the value of N in the Drake Equation.

So does plugging in arbitrarily low values. I've seen online Drake Equation calculators, and none of them that I've found allow for what I consider to be realistically low guesses. And "guesses" is what they would be.

“Our work suggests that both our planet Earth with continents, oceans, plate tectonics, and life and our active, communicative, technological human civilization are extremely rare and unique in the entire galaxy,” Gerya told Gizmodo.

To be clear, I have little doubt that they're right about the continental drift thing, and a slightly higher but not significant doubt that they're right about this guess. What I object to is further muddying the waters by adding terms to the DrEq (I'm calling it that from now on), which is already confusing to a large number of humans, thus adding fuel to the "no intelligent life down here, either" crowd.

The researchers argue that the presence of large oceans, plus Earth’s shift from single-lid tectonics (a stable surface layer) to modern plate tectonics about 1 billion years ago, were critical to the rapid development of complex life.

Again, not arguing that point. But it's just part of f_i.

According to the new study, plate tectonics are crucial for developing complex life and advanced civilizations. Earth’s plate movements create diverse habitats, recycle nutrients, and regulate climate—all vital for life. It’s important for plate tectonics to last for 500 million years, Gerya explained, because biological evolution of complex multicellular life is extremely slow. “On Earth, it took more than 500 million years to develop humans from the first animals, which appeared around 800 million years ago,” he said.

Something about that math doesn't work out, as humans branched off from other apes something like 6 million years ago, admittedly with a large error bar.

I should take this opportunity to point out that the other reason I use "tech-capable" instead of "intelligent" is that there's more and more evidence that "intelligence" is not unique to humans. Nor is tool use. But the idea of making a tool to make other tools, and so on recursively until we get spaceships, well, that seems to be unique to humans on this planet. Dolphins don't seem to show any interest in creating computers. Crows (see yesterday's entry) just seem to use their intelligence to fuck with us.

The modified Drake Equation suggests that advanced civilizations are extremely rare, with the chance of planets having the right conditions being between 0.0034% and 0.17%. This means there could be anywhere from as few as 0.006 to as many as 100,000 active, communicative civilizations in our galaxy, with the actual number likely being on the lower end, considering the limited time these civilizations might communicate due to potential societal collapse or extinction.

Oh, Thor's balls. 0.006 is obviously wrong, because we'd make it 1. And that's still a damn wide range. Not to mention the "limited time" bit is already in the DrEq; it's represented by L.

Because the low estimate is really close to zero, it means there’s a good chance there might not be any other civilizations in our galaxy. This would help explain why we haven’t detected any signals from other civilizations yet.

Well, DUH. That's what I've been trying to say.

In the past, the Drake Equation gave a much higher low-end estimate, suggesting that it was almost certain we weren’t alone and that there should be at least 200 civilizations trying to communicate with us.

That's not a failure of the equation. That's a failure of humans to comprehend the possibility of really large denominators.

Stern and Gerya aren’t the first to propose the idea that suitable planets for advanced life are few and far between. This suggestion, known as the Rare Earth Hypothesis, was first articulated in the 2003 book Rare Earth: Why Complex Life is Uncommon in the Universe, written by scientists Peter Ward and Donald Brownlee. Interestingly, Ward and Brownlee were likewise fixated on plate tectonics as a factor.

I don't think this is an ad for that book. But, full disclosure here, I've read it. Some of its hypotheses, such as the requirement to have a relatively big moon to help stabilize polar shifts, have been pretty much ruled out, from what I understand. Not surprising; a lot has been discovered in the last 21 years. But that doesn't negate the whole book.

The Rare Earth Hypothesis, while seductive, fails to account for the adaptability of life and the potential diversity of habitable environments.

Neither of which guarantees the development of tech-capable species. It's been a while since I've read that book, but as I recall, their guess was that life is pretty common, but tech-capable life, not so much.

Another limitation of this study, and this is no fault of the researchers, is that we’re still far from knowing which values to plug into the equation.

Exactly. Though, as I've said, we continue to refine some of the estimates.

Part of the problem with adding more terms is that each term magnifies the possible range of the result. Say you have a=b*c. Each term on the right of the equation has a range of possible values. By multiplying them together, you're also multiplying the range. Now incorporate d and e into the equation, both ranged, and those ranges multiply, too. All of which is to say that the "we just don't know" factor increases substantially with each term. And that's pretty much the reason Occam shaved with his Razor.
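
A toy sketch of that point (mine; the ranges are made up, not anyone's actual DrEq estimates):

    # A toy sketch (mine; the ranges are made up, not anyone's actual
    # estimates): multiplying uncertain factors multiplies their ranges,
    # so every term added to the product widens the spread of the result.
    def range_product(ranges):
        """Given (low, high) bounds for positive factors, return the
        (low, high) bounds of their product."""
        lo, hi = 1.0, 1.0
        for a, b in ranges:
            lo *= a
            hi *= b
        return lo, hi

    two_terms = [(0.1, 1.0)] * 2
    four_terms = [(0.1, 1.0)] * 4

    print(range_product(two_terms))   # roughly (0.01, 1.0): 2 orders of magnitude
    print(range_product(four_terms))  # roughly (0.0001, 1.0): 4 orders of magnitude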

None of this gets past the "we don't know" stage. I know some people are dead certain, one way or the other. What's important, to me anyway, is that we're prepared to adjust our prior beliefs in the face of evidence. If aliens landed in my street, for example, I'd suddenly believe in them (provided, of course, someone hasn't slipped me some psychedelics).

What I really object to is lazy science reporting. Especially when the article, at base, does make some good points.
June 3, 2024 at 8:51am
#1072084
It's fairly well known that corvids are hardly birdbrains, but this recent article from SciAm delves into further evidence, while also punning on one of my favorite bands.

    Crows Rival Human Toddlers in Counting Skills  
Counting crows proclaim “caw, caw, caw, caw” when staring at the number four


The rock group Counting Crows were onto something when they chose their band name. Crows can indeed count, according to research published this week in Science.

That's not why they picked the name, but by Waltz's Law, never let facts get in the way of a good joke. Or a bad one. Especially a bad one.

But, you know, that's just, like, a pinion, man.

Okay, okay, puns out of the way for now, there's some interesting stuff here.

The results show that crows have counting capacities near those of human toddlers who are beginning to develop a knack for numbers, says lead study author Diana Liao, a postdoctoral researcher in neurobiology at the University of Tübingen in Germany.

So, no, they're not about to take over the world or build rockets to Mars.

“We think this is the first time this has been shown for any animal species,” she adds.

I know there's been research done in this area, but I can't be arsed to search for examples countering what the researcher says here. Yes, there have been claims that horses, e.g., can count, but those usually turned out to be hoaxes or misunderstandings.

They presented the birds with randomly ordered cues, four of which were visual—colored Arabic numbers that appeared on a touch screen—and four of which were auditory, including a short guitar chord and a drumroll.

Please, please tell me they sampled the latter from Counting Crows albums. Well, in my headcanon, they absolutely did.

Onur Güntürkün, a biopsychologist at Ruhr University Bochum in Germany, who was not involved in the research, says the new paper is “excellent”—even if the findings are “not unexpected” given all that scientists already know about crows and many other species’ intelligence.

Like I said: we know they're smart. They also apparently have a long memory. There are plenty of stories about people who pissed crows off, only to find themselves harassed by black-winged birds for the rest of their lives, indicating that not only does the pissed-off crow take revenge, but it also gets all its corvid buddies and offspring to punish the offender. People associate elephants with never forgetting, but elephants ain't got nothin' on crows.

On the flip side, do something nice for a crow, and they could become your army of darkness. I've thought about doing this by leaving shiny things out for them to collect, but I'm entirely too lazy, and what would I do with bird minions, anyway? Get them to attack surveillance drones? No, they might get hurt. Besides, my cats might get jealous.

Obviously, I can't leave this entry without an appropriate video:



If you think you need to go
If you wanted to be free
There's one thing you need to know
And that's that you can't count on me
June 2, 2024 at 10:06am
#1072011
As regulars know, I do these retrospectives, in part, to see what's changed about the world and in me since the original post. Usually, I don't find my attitude much different. That's not really the case with this entry from February of 2020: "Nonbinary".

Today's thing is a bit different from usual: a gender-issues retrospective from a male homo sapiens.

For starters, I cringe at "different from."

The article in question, from 2018, is still up at this time, but I didn't re-read most of it.

Not only is the weight issue a pressing thing for many men, but I avoid online dating specifically because most of the women on there specify 6' or taller.

That's actually not the only reason I avoid online dating. But I wonder what some of these women's reaction would be to a man who requires "natural D-cup or larger only." Not great, I'd imagine.

I can do something about my weight, and I am.

Turned out I couldn't, so I stopped trying.

Can't do anything about my height.

Since then, I've seen pieces about some kind of surgery that puts extenders in your leg bones to make you taller. The surgery isn't risk-free, and from all accounts, it's painful with a fairly long recovery time. Not worth it, in my book.

We desperately need "accept yourself as you are" messages, but there's no money in that, so instead we're stuck with "you're just not good enough, but if you buy my crap, you'll finally be satisfied." (Spoiler alert: you'll never be satisfied.)

What's height got to do with anything besides blocking the people behind you in the theater?

I recently read an article in which some vapid lady expressed a desire to be "swept off her feet" both literally and figuratively, and apparently that could only be done by a 6'+ man? Okay, well, I wouldn't be interested in her, either.

I have to admit, as an aside, that I find it hard to see the whole "gender is a social construct" thing. I mean, yes, certain aspects of it are, in my view, but they're mostly superficial things: a particular style of haircut, wearing bigger watches (if one wears a watch at all these days), a lack of makeup.

Since then, I've come around more to that point of view.

Basically, if something is a social construct, the social construct can be changed.

I stand by the "can be," but I recognize that those things take time, often longer than a human lifespan.

One of the greatest failings of modern society, I think, is this "binary" myth.

Standing by this statement, too. I've railed on binary thinking many times since then. And no, this isn't limited to the male/female binary; I think Netflix jumped the shark, for example, when it removed star ratings in favor of Siskel and Ebert style thumbs. But, lest anyone be confused by the entry's title, this sort of thing is what it referred to.

But still, my views have evolved somewhat. One thing, though, that I hope never changes about me:

And everyone deserves basic human dignity and rights (at least until they prove themselves unworthy of such by their actions), regardless of what pigeonhole you or society says they belong in.

I trust that's clear enough.
June 1, 2024 at 9:13am
#1071970
Wars have been fought over less than this.

    The Man Who Went to War With Canada  
For centuries, the United States and Canada’s only remaining land border dispute has been kept alive by a single family.


The article is from Atlas Obscura, dating to 2019, but the Wikipedia article suggests that the issue hasn't been resolved in the five years since.

After probably a few hours at sea, they reached the island’s rocky shore, and managed to land among the island’s main residents: puffins, razorbills, murres, and Arctic terns.

If I were that guy, I'd have pointed at one of the latter birds and proclaimed: "It's my tern now!"

Depending on whom you ask, Machias Seal Island is either off the coast of Maine or of Grand Manan. It’s also either American or Canadian. It is the only place with this particular unsettled identity that you can actually stand on top of. Although the ownership of some stretches of water is still contested, this island—and neighboring North Rock, which is even smaller and barer—are the last crumbs of their land the two countries don’t agree on.

Just wait a few years, and climate change will settle the debate for us.

I feel obliged to point out that while I've never been to this island, it sits within easy distance of the easternmost point of the continental US: Quoddy Head Lighthouse. I visited that spit of land when I decided to travel from it to the westernmost point, in the state of Washington. Quoddy Head was the easier of the two to reach, requiring significantly less hiking and way fewer bears.

The article delves into the source of the land dispute, then:

If the world were more just, this would all be moot: The people of the Passamaquoddy Nation likely used the island long before anyone else even knew it existed. (“Machias,” also the name of a precipitous local river, is a Passamaquoddy word that means “bad little falls.”) Instead, even as their identifications and affiliations have shifted, the neighbors have kept squabbling over it, like a pair of growing siblings in a shared bedroom.

Look, let's hope it stays on the "siblings" level of quarrel. We don't want to go to war with Canada. We have a history of losing said wars.

The rest of the (rather long) article delves into the American side of things, focusing, as the headline implies (look at that, a headline matching up with the text), on one person's, or more properly one family's, quest to keep the US claim to the island.

I won't quote further, but it's a way more interesting read than the Wiki page.
May 31, 2024 at 7:14am
#1071930
If you're not conscious, this article isn't for you.

    The nature of consciousness, and how to enjoy it while you can  
In his new book, Christof Koch views consciousness as a theorist and an aficionado.


Well, I guess he's solved the Hard Problem, then.

Now, with AI systems behaving in strikingly conscious-looking ways, it is more important than ever to get a handle on who and what is capable of experiencing life on a conscious level.

Oh, that's an easy one. Only I experience life on a conscious level. The rest of you, including the AIs, only mimic actual consciousness.

Solipsism makes everything simpler.

Koch, a physicist, neuroscientist, and former president of the Allen Institute for Brain Science, has spent his career hunting for the seat of consciousness, scouring the brain for physical footprints of subjective experience.

So, okay, someone with credentials, and not just some guy (like me). Doesn't mean he's right, mind you (pun intended), but it does mean it catches my interest.

It turns out that the posterior hot zone...

Posterior hot zone? Seriously? That's what you geeks are going with? You're just begging for it, aren't you? Okay, I'll bite: "Scarlett Johansson has a gorgeous posterior hot zone."

(In reality, I don't find asses to be attractive. But that's never stopped me from making jokes.)

Seriously, though, shouldn't they have looked up those words in their handy Latin dictionaries and called it that, like they do with most chunks of anatomy? Google tells me it's "calidum zona," because I haven't had an actual Latin course in 40 years.

Moving on...

...a region in the back of the neocortex, is intricately connected to self-awareness and experiences of sound, sight, and touch.

This ties in with what I believed to be the reason for consciousness: the nervous system had to evolve in such a way as to integrate sensory experiences, and those mechanisms got hijacked into "awareness." But I'm just some guy, so you shouldn't take that seriously.

Dense networks of neocortical neurons in this area connect in a looped configuration; output signals feed back into input neurons, allowing the posterior hot zone...

Snort.

...to influence its own behavior. And herein, Koch claims, lies the key to consciousness.

Makes sense, sure, but has he, or anyone else, done the science to back that up?

This declaration matches the experimental evidence Koch presents in Chapter 6: Injuries to the cerebellum do not eliminate a person’s awareness of themselves in relation to the outside world.

Okay, so there is some support.

His impeccably written descriptions are peppered with references to philosophers, writers, musicians, and psychologists—Albert Camus, Viktor Frankl, Richard Wagner, and Lewis Carroll all make appearances, adding richness and relatability to the narrative.

I mean, that's probably good writing; I'm not sure it's good science. As this is basically a book ad, though, I can cope.

The takeaway from the first half of Then I Am Myself the World is that IIT might offer the best-fit explanation for consciousness—a view, it’s worth mentioning, that is highly contested by many other neuroscientists.

Good. That's how science gets done.

Koch discusses transformative states of consciousness in the second half of his book, including near-death, psychedelic, and mystical experiences.

Aaaaaand that's not.

He also discusses the expansive benefits of sustained exercise—drawing upon his personal experiences as a bicyclist and rock climber—through which a person can enter “the zone.”

The zone? You're telling us how to enter the posterior hot zone?

Koch suggests that exercise, meditation, and the occasional guided psychedelic might be beneficial to many people.

Jokes aside, he's not the first scientist to come up with that nugget. Timothy Leary comes to mind, though it's arguable whether psychologists are real scientists.

Oh, and no, he didn't solve the Hard Problem of anything except "how to market your book." Nevertheless, I found this review/ad interesting enough to share. Even if he is talking out of his posterior hot zone.
May 30, 2024 at 8:09am
#1071890
From Cracked, an example of how we can understand and not understand science at the same time. Kind of a superposition of states, like an unobserved particle. See? I can science metaphor. It's wrong, but I can do it.

    4 Scientists Who Only Added More Mystery to the World  
Their problem-solving skills are only rivaled by their problem-creation skills


The misunderstanding is that scientists solve problems. I mean, sure, you get some answers, but those answers always lead to more questions. This is good, though; it's job security.

You want actual problems solved? That's what engineers are for.

There’s a bit of an erroneous belief that in order to become a famous scientist, you have to solve problems.

At least the author admits that it's a misunderstanding.

However, there’s another, arguably vastly more annoying way to get your name on an enduring thought. That’s to come up with a brand new intellectual mess for all the other scientists to have to try to figure out.

Like I said. Job security.

4 Thomas Young

I will admit that I've either never heard of this individual, or forgot that I did.

In 1801, Thomas Young disagreed with the popular belief that light was made up of particles. He believed that light was, in fact, a wave, and so he cooked up an experiment to prove it. He cut two slits in a sheet of metal and shone light through them.

Oh, yeah, the double-slit experiment. I've certainly heard of that. Hell, in physics lab in college, we performed it, only we used lasers (which hadn't been invented yet in 1801). It's clearly more famous than its creator.

Young set out to perform a simple experiment with light, and ended up creating what would be called “the central mystery of quantum mechanics.”

Sometimes, complex questions have simple answers. This is a case of the opposite. The article goes on to explain exactly why, and, miraculously, it conforms with my prior knowledge (that is, I'm sure an actual physicist could pick it apart, but for a comedy site, it's remarkably accurate).
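
Since I brought up that college lab: here's a minimal sketch (my own; textbook small-angle optics, nothing from the article) of the fringe pattern a laser makes through two slits. Bright fringes fall where the path difference between the slits is a whole number of wavelengths.

    # A minimal sketch (mine; textbook small-angle optics, not from the
    # article) of the two-slit intensity pattern a laser would produce.
    # Bright fringes fall where the path difference d*x/L is a whole
    # number of wavelengths.
    import math

    d = 0.25e-3    # slit separation, meters (illustrative value)
    lam = 633e-9   # wavelength, meters (a red laser, like the lab one)
    L = 1.0        # slit-to-screen distance, meters

    for i in range(-10, 11):
        x = i * 0.5e-3  # screen position, in 0.5 mm steps
        intensity = math.cos(math.pi * d * x / (lam * L)) ** 2
        print(f"{x * 1000:6.2f} mm |{'#' * int(round(intensity * 20))}")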

3 Fritz Zwicky

This one, I'd heard of.

You might not know the delightfully named Fritz Zwicky, but you have heard the two words he coined in combination: dark matter.

Thus also providing job security for science fiction writers who are free to give it all kinds of magical properties.

Dark matter, which is — keep in mind that as an art major, I am fighting for my life here — matter that contains mass but emits no light and therefore cannot be observed, was his best, confident attempt at making some very nonsensical measurements make sense.

I gotta say, I'm impressed that an art major got so many things right. No disrespect intended to art majors, but they're not known for understanding physics. On the flip side, I have some small understanding of and education in physics, but I can't do art to save my life, so it all balances out.

Anyway, in my own amateurish way, I tend to see the concept of "dark matter" as a placeholder, a concept expected to have certain properties. Kind of like the luminiferous ether proposed before we more fully understood the nature of light (as per #4 above). When we finally figure it out, I'd predict that "dark matter" will be an historical relic, like luminiferous ether or the humoral theory of medicine.

2 Enrico Fermi

You know this guy developed other stuff, right? Like, his famous "paradox" (which I've insisted in the past is not an actual paradox) was kind of a footnote to an incredibly productive career? And that he has a whole class of subatomic particles named in his honor?

Honestly, I think if he hadn't been so prolific, no one would have paid attention to the "paradox."

Nobody was arguing that the question “does extraterrestrial life exist” was too easy to answer. Yet, a man named Enrico Fermi decided to add another layer of unsettling confusion to that little gray layer cake, just in case anyone was feeling they had a good handle on it. Even worse, he reportedly rattled off his new addition casually at lunch, and every scientist since has been dealing with his bullshit. Bullshit that’s most commonly referred to as the “Fermi Paradox.”

It's nice to see my opinion on the matter confirmed, even if it is by an art major writing on a dick joke site.

The article, again, does a good job explaining the questionable conundrum, so I won't repeat it here.

1 The Guy Who Invented Mystery Flavor Airheads

That wasn't a scientist. That was a marketer who probably got ordered to find something to do with the excess Airhead slurry left over after batches had been produced.

And we don't even know if it was a guy.
May 29, 2024 at 7:25am
#1071843
Well, if it's from MIT Press Reader, surely it has the weight of scientific research backing it up, right?

    The Time Hack Everyone Should Know
Much like Dorothy discovers at the end of “The Wizard of Oz,” the key to hacking time is a tool we’ve had all along: Choice.


But then you won't get to meet friends, have adventures, and defeat angry old women. Unless you spend your time playing D&D.

I’m in a complicated relationship with my phone.

Oh, dear gods, this is going to be a screen rant, isn't it? FFS. Another thing we're all Doing Wrong.

So much so that I’ve never used the screen time function, choosing to live in denial rather than dealing with the hard truths of our relationship.

Or you could, and I know this is a novel concept here... just not worry about it.

Imagine my horror then, when my 14-year-old son surreptitiously turned it on and started reading off my statistics from the previous week.

It's your own damn fault for a) reproducing and b) not securing your phone.

We all know that time is our most precious resource: It’s the one thing money cannot buy.

Glib, but demonstrably false on two counts. One, the Beatles weren't lying about "money can't buy me love" (though it can buy beer, which is close enough). Two, if you manage to accumulate enough money, invested wisely, to live off the interest... well, then, you can quit working (also known as retirement) and thus use it to buy time.
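
For the pedants, here's the arithmetic on that second count, sketched in Python with entirely made-up numbers (the 4 percent withdrawal rule is a much-argued rule of thumb, not a promise):

    # Back-of-the-envelope "buying time" math. Every number here is assumed.
    annual_expenses = 50_000    # hypothetical yearly spending, in dollars
    withdrawal_rate = 0.04      # the oft-cited 4% rule of thumb

    nest_egg = annual_expenses / withdrawal_rate
    print(f"${nest_egg:,.0f}")  # $1,250,000 -- the going rate for your time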

And with smartphones in everyone’s pocket these days, we’ve never been more able to track how we use every minute of it.

Wrong pronoun. Should have been "they've," not "we've."

By pressing a button or downloading an app, we can track the time we spend exercising, sleeping, and even scrolling through our social media feeds.

That information is not there for your benefit or neurotic attention; it's there so that companies can track trends.

All of this reads like the "TV will rot your brain" rants from a few decades ago, or the "comic books will rot your brain" rants from before then, or the "radio will rot your brain" rants from before then. Hell, I'm willing to bet that as soon as someone invented fire, someone else was like "That stuff makes life too easy. Kids these days, so lazy!"

Take, for example, the American Time Use Survey.

Taking surveys is one way to waste time, I suppose.

The Bureau of Labor Statistics has been collecting data on a variety of time use markers for almost 20 years.

So they can say things like "Clearly, Americans have too much free time. Let's find ways to work them harder!"

According to their 2020 findings, the average American has enough leisure time to fit in lots of healthy and life-enriching activities: 5.5 hours per day to be exact.

Okaaaaay. Here we go.

The "average American" possesses slightly fewer than 2 legs. The "average American" sports fewer than 10 toes, and fewer than 10 fingers. I can't be arsed to find the precise mean, but considering that amputees exist, it's indisputable that the average (the mean) is lower than the most common number (2 in the case of legs, 10 in the case of fingers or toes). Probably around 1.98 legs and 9.5 fingers/toes. Maybe not those exact numbers, so don't quote me on that, but it's very likely some decimal number close to, but less than, 2 or 10. What matters in that case isn't the mean, but the mode. (It also matters that people who don't conform to the mode are accommodated in public, but that's not relevant to this discussion.)
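
If you'd like to see the mean-versus-mode thing for yourself, here's a toy sketch in Python with invented numbers (emphatically not real amputation statistics):

    from statistics import mean, mode

    # 1,000 hypothetical Americans: most have 2 legs; a handful have fewer.
    legs = [2] * 990 + [1] * 8 + [0] * 2

    print(mean(legs))  # 1.988 -- the "average American" figure
    print(mode(legs))  # 2 -- what a typical American actually has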

My point is that the "average" here is probably misleading, and meaningless. If two people are working full-time for every one who is not, that 5.5 hours may be low for the latter and high for the former. Plus, even full-time workers usually get weekends and holidays; I'm not sure if that's included in the average, but you might have a good 32 hours of leisure time (not working or sleeping) on the weekends and 0 on weekdays.

I've gone through the calculations before of a typical (not "average") full-time worker, and deduced that the actual number, when removing times for things like sleep, work, getting ready for work, commuting, dealing with household chores, etc., is closer to 1 hour on a work day. And that's assuming one job, which we just can't.
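
For the curious, that calculation goes something like this; every number below is my assumption, not survey data:

    # Minutes in a typical (not "average") day for a full-time worker.
    day       = 24 * 60
    sleep     = 8 * 60
    work      = 9 * 60    # eight paid hours plus the lunch break
    get_ready = 60
    commute   = 2 * 45    # 45 minutes each way
    chores    = 120       # cooking, cleaning, errands
    meals     = 90

    leisure = day - (sleep + work + get_ready + commute + chores + meals)
    print(leisure / 60)   # 1.0 -- about an hour, nowhere near 5.5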

And that's not even getting into the bigger, less pedantic point, which is that as soon as you add "healthy and life-enriching activities" to your day, that time no longer counts as leisure time, because you're doing stuff you feel like you have to do instead of stuff you want to do.

Or that for some of us, screen time is "life-enriching." I've been using some variant of a computer every day since February of 1979. Yes, even before the internet. I played games, sure, but I also learned programming basics and other stuff. And then the internet came along and, well, here I am.

Pant. Pant. Back to the article.

But the survey also showed that we often eschew healthy, happiness-driving activities for passive, screen-based ones.

Newsflash: how we achieve happiness is very personal. I get that some people who use computers all day at work may feel the need to disconnect in their downtime (I was never one of those people; I used computers for work and play). I also get that some of those who do manual labor might want to crack open a beer and watch spurts or whatever. It's different for all of us.

The average American spends 22 minutes a day participating in sports, exercise, and recreation; 32 minutes per day socializing or communicating; and 26 minutes per day relaxing or thinking. In contrast, they spend 211 minutes per day watching TV. That’s 2.6 times more time watching TV than exercising, relaxing, and socializing combined.

Without reiterating my "average" rant from above, I'll just note that this is presented as a Bad Thing and Something That You're Doing Wrong.

Studies have shown that heavy TV watching is related to lower life satisfaction, and that people who use social media the most are also the most socially isolated.

Assuming those studies are even valid, how much of that is because you're watching shows or spurts, versus how the inevitable barrage of ads designed to make us feel worthless and incomplete is making us feel worthless and incomplete?

The article goes on to play more funny games with averages, and ends with some not-so-bad psychology that, while I call it not-so-bad, still won't work for everyone because, just like not everyone has 2 legs, not everyone responds in the same way to the same psychological mind-games.

For instance, the article seems to make a distinction between socializing in person and socializing over the internet. I've been at this for long enough that I don't make that distinction.

In the end, if you feel like you're too focused on something, whether it's exercise or screen time, change it if you want. If you don't, then you didn't really want to; you just thought you wanted to.

Relax. Have a beer (or whatever equivalent you use to say "fuck it" to the world). And don't let other people dictate how you should be spending your time... unless they're paying you.


May 28, 2024 at 1:00pm
#1071808
Short entry today. I had a dental checkup this morning, which took up way too much of the morning, and I just don't feel like doing a longer entry.

I will say this, though:

Afterwards, afflicted with a powerful thirst, I stopped at the nearby 7-11 to get some ice-cold Crack Zero for the purposes of slaking said thirst while at the same time getting the cloying taste of tooth polish out of my mouth.

The convenience store was, inconveniently, infested with an outbreak of teen. Kids running around, pushing each other, being loud, discussing their latest trivial but earth-shattering teen drama, flirting, and generally both ignoring and getting in the way of oldies like me who just wanted to do some commerce.

I started to prepare a "kids these days" rant in my head, but then, just as my fist closed over the smooth plastic of an ice-cold Crack Zero, the realization hit me just like some kid had hit the door as I held it open:

Given the way my friends and I acted in 7-11s when we were that age, I was experiencing no more and no less than long-delayed karmic retribution.

Okay, Universe. Okay. You win this time.
May 27, 2024 at 10:22am
#1071760
One reason I think science confuses people is that nutrition science, in particular, is a hot mess that can't seem to make up its mind about anything. Case in point, from Inverse (reprinted from The Conversation):

    The Ultraprocessed Food Hype Is Masking This Other Major Health Predictor, 30 Years of Data Reveal  
Much of the recent evidence related to ultra-processed foods tell us what we already knew.


Even I gave up on trusting its results after several iterations of eggs being good, then bad, then good, then bad, then good, then bad, then good, etc. Not to mention all the questionable studies funded by people who push for a particular result. Nowadays, I mostly just eat for enjoyment; let other people get neurotic about what's healthy or not this week.

I recognize that part of the problem here is that biology is fiendishly complex, and has all sorts of mechanisms to return to equilibrium after a push away from it, some of which we don't fully understand. Another part of the problem is that it can be extraordinarily difficult to remove confounding variables from a nutrition experiment. And a big part of the problem is breathless reporting, eager to announce the Next Big Bad Thing or explain why the Last Big Bad Thing wasn't so bad after all—the latter of which seems to be the case for today's article.

In recent years, there’s been increasing hype about the potential health risks associated with so-called “ultra-processed” foods.

There's always a bogeyman, because we can't just let people enjoy their meals in peace. First it was fats (turns out we need some), then carbs (turns out we need some of those too), then gluten (still a source of grave misunderstandings) or whatever. Now, apparently it's ultra-processing, which I'm sure is defined somewhere.

But new evidence published this week found not all “ultra-processed” foods are linked to poor health. That includes the mass-produced wholegrain bread you buy from the supermarket.

Like I said, carbs went from good to bad to maybe good. The "maybe" is probably a matter of high vs. low glycemic index. Meaning, for example, whole wheat bread is probably better for you than white bread, which I did a whole entry about a couple of weeks ago.

Ultra-processed foods are industrially produced using a variety of processing techniques. They typically include ingredients that can’t be found in a home kitchen, such as preservatives, emulsifiers, sweeteners, and/or artificial colors.

I told you it was defined somewhere. Well, sort of. I'm pretty sure we all have things in those categories in our home kitchens. Salt, for example, is a preservative. Egg protein is an emulsifier (a thing that keeps oil and water from separating as nature intended). And so on.

Common examples of ultra-processed foods include packaged chips, flavored yogurts, soft drinks, sausages, and mass-produced packaged wholegrain bread.

In other words, anything that actually tastes good.

The new paper just published used 30 years of data from two large US cohort studies to evaluate the relationship between ultra-processed food consumption and long-term health. The study tried to disentangle the effects of the manufacturing process itself from the nutrient profile of foods.

I'm not reading that, so I'm not going to comment on the validity of the study. I'll just point out that this is just throwing another osis into our collective neuroses.

The study found a small increase in the risk of early death with higher ultra-processed food consumption.

The obvious question here is: correlation or causation?

Existing national dietary guidelines have been developed and refined based on decades of nutrition evidence.

But mostly based on pressure from lobbyists who work for manufacturers of ultra-processed foods, so that sentence makes me laugh. Nutrition science is convoluted enough as it is; throw in government bullshit, and you can see why I've given up.

Which doesn't mean I'm going to snack on Doritos all day. Just that I'm done worrying about every little thing that goes into my food and drink.
May 26, 2024 at 8:30am
#1071700
Way back in May of 2008, I did an early version of my "comment on a link" thing, combined with a mini-rant: "Big Bang".

As that was 16 years ago (I can math), it's not surprising that the link is no longer functional, but I think we can all get the general idea: someone found a stash of expired fireworks, and authorities decided to blow them up using plastic explosive.

Looking back on this now, I'd still take the attitude that they only did it because they could. From what I've heard since then, the proper and safe way to dispose of stale firebangers is to immerse them in water and let them soak for a good long time. But where's the fun in being proper and safe? I may be older and (arguably) wiser now than I was when I found that link, and I may have turned into the neighbor who moans about other people in the neighborhood illegally making booms and whistles around every Fourth of July, but that doesn't mean there isn't, somewhere inside me, Kid Me, who would definitely have wanted to see a bunch of old fireworks get blasted by 30 pounds of C-4.

The only difference is nowadays, I'd be more careful about where I did so.

As for the rant, it was about Mother's Day, which apparently was the day of that entry:

I thought about writing something about Mothers' Day here, but what's the point?

This year, I completely ignored it.

My mom died nine years ago this summer...

Obviously, that's 25 years ago now. That's a longer span of time than I spent living with her.

...and why the hell is there a special day reserved for people who managed to reproduce?

In fairness, there's a special day reserved for just about anything. Today, for example, is Sally Ride Day, celebrating the birthday and legacy of the first female US astronaut (who also had one of the most awesome names in the history of names, thanks to a certain Wilson Pickett song). Now, there's a woman who accomplished something.

Hell, hamsters can do that. How about reserving a day for those who care enough about the planet and its other life forms that we did not breed like rabbits?

Note to self: stop mixing rodent and lagomorph metaphors.

And that goes double for Father's Day.

Lest anyone labor under the misconception that my rant was reserved for females alone.

Of course, at the end of the entry, I clarified that the whole rant was satire:

Oh, about the first paragraph? I'm just kidding. Mothers - and fathers - should definitely be acknowledged, not for breeding, but for bringing you up right.

If, that is, they did so.

About the only thing I truly regret about that long-ago entry was not being more explicit in tying the "Big Bang" of the fireworks disposal to the "Big Bang" usually associated with the conception of offspring. It's even possible that I didn't make the connection back then, but now that I'm older and (arguably) more foolish, it jumped right out at me like a rabbit. Or a hamster.
May 25, 2024 at 10:31am
#1071672
Fair warning: this article, from Mental Floss, never really answers the headline's question. I'm still posting it, because it may be of interest to the readers and writers here.

    What Was the First Banned Book in History?  
Book bans are hardly a new practice.


For a country claiming to abide by freedom of speech, we sure do love our book bans and censorship. I suppose it's just human nature: it's not enough to get your own ideas out there; sometimes, you also have to suppress those of others.

Obviously, the US isn't the only place to ban books, but in our case, it's the hypocrisy that leaves me shaking my head.

There’s no more potent evidence of the power of the written word than the fact people have historically looked to ban them.

In my opinion, if your ideas can't hold up to scrutiny and argument, then they're not great ideas to begin with. Also see: blasphemy laws.

Cultural norms, politics, personal beliefs, school policy, and other factors can all conspire to deem a book too incendiary to circulate in America.

This article is from October, so I don't remember which specific book-banning event spurred its writing. There have been so many. As for school policies, though, there's a difference between outright censorship and desiring age-appropriate materials. Some "censorship" arguments actually boil down to differences of opinion over what's age-appropriate and what isn't. Obviously, we make those distinctions right here in this community.

I've seen (and participated in) plenty of discussions about age-appropriateness, in the context of content ratings here. There are legitimate differences of opinion there. So when I talk about book bans, it's usually about people who try to keep grown adults from making their own decisions about what to read or not.

But just how far back does this policy of thinly-veiled thought control go?

If I had to guess, I'd postulate that book-banning is as old as books. Some Sumerian probably cuneiformed a dirty joke into clay, and other Sumerians got offended and tried to burn the clay, which obviously would have had the opposite of the intended effect, leaving the joke literally etched in stone.

Shattering works much better on clay tablets.

As is often the case when you look back into history, there’s more than one possible answer. But one of the leading contenders has a fairly predictable culprit: the Puritans.

Ah, yes, that marginalized group who fled religious persecution in England so they could practice it themselves in America.

In 1637, a man named Thomas Morton published a book titled New English Canaan.

So, potentially the first banned book in the Americas, but that could hardly be the first banned book, period. Incidentally, I did a whole blog entry on that author a few years ago: "Vile, Virile Vices".

His book was perceived as an all-out attack on Puritan morality, so they banned it—and effectively banned Morton, too.

The real miracle here is that their descendants ended up signing on to the whole "freedom of speech and religion" thing when they grew up.

You can go further back to find more startling examples of banned books, though the definition would have to expand to include the execution of authors.

Yeah, writing may be a fairly safe activity now, free of the occupational hazards of, say, firefighting, but it hasn't always been the case.

In 35 CE, Roman emperor Caligula—certainly a man of strong moral stuff if ever there was one—discouraged people from reading Homer’s The Odyssey because it could give them a taste of what it meant to be free.

A lot of the stuff you've heard about Caligula might have been political bickering. It would be like if the only surviving history of Kennedy's presidency was written by Nixon. And the infamous movie with his name on it wasn't exactly a documentary.

Most telling, though, there's a huge difference between being "discouraged" from reading something, and having that something banned or burned.

What book bans and censors attempt to do in the curtailing of reading is often futile.

Here is, in my view anyway, the most insidious thing about censorship: the censor has either read the book, or has not read the book. (Reading enough of the book to know you don't like it counts as reading it, in this argument.) In the latter case, it's ridiculous to ban something you haven't even read. In the former, you're setting yourself up as an arbiter, someone more qualified to make that decision than, say, me. That's also ridiculous.

If your ideas are sound, they'll withstand argument. If not, and you try to do an end-run around public discourse by banning opposing viewpoints, well, that might just make you a tyrant. And at least here in the US, whenever someone bans a book, well... that's publicity for the book, isn't it?
May 24, 2024 at 7:04am
#1071617
As I normally travel alone, today's article seemed relevant to my interests. (Source is Condé Nast Traveller, if it matters.)

    The golden rules of solo travel  
We ask our editors and favourite solo travellers for their savviest tips and tricks


Apparently, there's a US and UK version of that outlet. The spelling gives away that it's from the latter. But as it's about travel, I don't think the country of origin matters much.

The joys of travelling solo are endless.

I wouldn't say "endless." Just "ending later than if you have someone with you to eventually argue with."

There is something truly freeing about exploring new places alone – you can go where you please, eat when you want, and have uninterrupted quality time with yourself.

No sleep-shaming, no pressure to fit too many things into one day, no bargaining about "If we do X today, I want to do Y tomorrow," etc.

The interest in solo travel has been slowly rising for a while, but new data from UK-based travel organisation ABTA shows that 16 per cent of travellers went on holiday by themselves in 2023, a five per cent increase from the previous 12-month period.

I wonder if there had been a global problem that made people tired of always seeing the same other people all the time.

But, if you’re not a seasoned solo traveller, it can be a daunting prospect. In an age of constant connectivity, the idea of being alone for an extended period of time is a convoluted one.

The only way I'd be "alone for an extended period of time" would be if I went hiking by myself in the wilderness, which is not only a bad idea to begin with (though I've done it), but it would involve being *shudder* outdoors.

Below, we spoke to travellers who frequently book solo trips about their golden rules for travelling alone.

"Rules?" Hey, I travel alone so no one gives me "rules."

Dining alone isn’t weird
For most people, the thought of dining alone is one of the biggest barriers to travelling solo.


Yeah, I just can't relate to that. If I'm alone, I can focus on the things that really matter: the dining and drinking experience. Besides, no one is there to tell me they just can't stand whatever cuisine I've decided to try.

Fake it til you make it
Most people feel nervous about meeting new people, and introverts especially can struggle to make the first move when arriving in a new place.


I'm more introverted than extroverted, but my only apprehension involves language and cultural barriers.

Book counter dining at restaurants
If you are someone who does feel uncomfortable about dining alone, opt for a bar or counter seat.


While I don't travel internationally as much as I'd like, here in the US at least, the bar usually doesn't require reservations or other planning ahead, apart from maybe figuring out a time to go when it's not too crowded.

Plan around cultural events
Arriving at a destination just as the locals are gearing up for an important cultural event can be an incredible way to immerse yourself straight away.


It's also an incredible way to have everything crowded and sold out. Hell no. Give me off-peak travel any day.

Exception: my desire to visit Scotland during the Islay Festival for the best whisky in the world.

Build in group activities
Booking tours and group events is a great way to meet other travellers. Most hostels have a list of activities available for guests to sign up for, and if not, then there are walking tours or live music events at local bars.


Honestly, I'm torn about this bit. First of all, I'm not interested in hostels, but let's leave that aside for now. And while I love music and bars, I despise music events at bars, because I can't hear the bartender.

My passport expires in 2026. I've never used this incarnation of it, because, well, you know. I want to use it at least once, and I don't mean crossing the border into Canada. As the person I was planning to go to Belgium with has other priorities now, I'll be going alone, which is fine. Maybe France, first, though... but not until after the Olympics (see above re: crowds).

Now I just have to get on my ass and make the plans.
May 23, 2024 at 10:01am
#1071563
Appropriately from Atlas Obscura, the tale of a great world explorer:

    The First Viking Woman to Sail to America Was a Legendary Traveler  
Back when the Icelanders called a part of Canada the “land of grapes.”


Now, it's possible that we shouldn't be using the word "Viking" like that, as I mentioned long ago here: "The Once and Future Viking". But I'm just quoting the article here.

Her full name, in modern Icelandic, is Guðríður víðförla Þorbjarnardóttir—Gudrid the Far-Traveled, daughter of Thorbjorn.

I'll just note here that some of those weird-to-us letters, like thorn (þ) and eth (ð), used to be in English, too.

She was born around 985 AD on the Snæfellsnes peninsula in western Iceland and died around 1050 AD at Glaumbær in northern Iceland.

Just looking at that, one might conclude that she didn't travel very far at all.

What little we know of her comes from the Saga of Erik the Red and the Saga of the Greenlanders. These are collectively known as the Vinland Sagas, as they describe the Viking exploration and attempted settlement of North America—part of which the explorers called “Vinland,” after the wild grapes that grew there.

A few entries ago, some article said that meant "Land of Wine," which may be inaccurate, but I like it better anyway.

Also, they freely mix fact with fiction. Their pages crawl with dragons, trolls, and other things supernatural.

How else are you going to scare the kiddies into behaving?

But the central tenet of the sagas has been proven by archaeology: In the 1960s, the remains of a Viking outpost were dug up at L’Anse aux Meadows, on the northern tip of Newfoundland.

I wouldn't say "proven." "Supported," maybe. It's not news anymore that Scandinavians made it to North America long before Italians did.

Among the rubble was found a spindle, used for spinning yarn, which was typical women’s work and thus possibly handled by Gudrid herself.

Right, because Gudrid sounds like the kind of chick who would do "typical women's work."

And in the Saga of the Greenlanders, Gudrid is called “a woman of striking appearance and wise as well, who knew how to behave among strangers.” That’s a trait that may have come in handy when dealing with the Native tribes of North America, whom the other Vikings dismissively called skrælings (“weaklings,” “barbarians”).

As I've noted before, who's the "barbarian" depends on who you're asking.

The article continues with a summary of the sagas involving Gudrid, and while I'm sure the originals (well, the original written-down versions, I mean) would be fascinating, the Cliff's Notes here seem to provide the pertinent details.

Another story from the sagas that has mystified readers for centuries because it mentions two “Gudrids” and has traditionally been dismissed as a ghost story could in fact be the earliest recorded conversation between a European and an American.

And no, they didn't discuss trade agreements or war. Or much of anything, considering they apparently didn't have time to learn each other's languages.

There's a lot more at the article, but as it notes, her relatives Erik and Leif got all the PR, but they didn't travel alone.
May 22, 2024 at 9:26am
#1071517
I'm usually a law-abiding citizen. Or, well, I try to be; sometimes it seems laws are designed so that if They want to get you for something, They can find a reason.

But some laws are fundamentally unjust, and need to be broken. Cracked provides some examples in:

    6 Loopholes People Used to Break the Law and Get Drunk  
Technically, if you’re on a train, everything is legal


6. Pay to See the Blind Pig

During Prohibition, a bar could not legally operate and sell alcohol. No one could legally sell alcohol (without receiving special exemptions, such as for medicinal use), or manufacture alcohol, or transport alcohol. The law didn’t ban drinking alcohol, however, or handing the stuff to someone else without charging them any money.


It could, of course, be argued that it's hard to drink alcohol if one is prohibited from buying, making, or moving it. But, at least here in the US, lawyers thrive on technicalities.

But suppose an establishment were to hand out a drink for free and charge customers for something else? Say, they charge a customer for some entertainment — for instance, the chance to look at a marvelous animal. As for the drink the barman serves the customer, well, no one’s purchasing that.

Not mentioned: how the pig got blind. Look, I'm all for eating the tasty critters, but mutilating them doesn't fly. This was nearly a hundred years ago, though, and people didn't generally think that way.

This idea is why one alternate name for a speakeasy is a “blind pig.” And if you’re wondering why the police would ever be fooled by this, know that plenty of police didn’t really care about Prohibition laws and were possibly drinking right along with everyone else.

Which may explain why some cops love to do drug busts: it provides them with free product.

5. Instructions on How to Absolutely Not Make Wine

They had this sort of thing for beer, too.

Individual families were still allowed to make a limited amount of wine, but if you were to sell people the raw materials for making wine, along with instructions, you might find yourself in trouble.

This would be like selling fertilizer with instructions on how to ensure it never becomes a bomb. Except there's good reason to keep people from making bombs.

“After dissolving the brick in a gallon of water, do not place the liquid in a jug away in the cupboard for twenty days, because then it would turn to wine.”

These days, of course, the surest way to get a certain group of people to do something is to tell them not to do it.

4. Stick Everyone on a Train

Even when alcohol is legal in the country, you need a license to sell it. One British gin maker, Tapling & Meegan Distilling, dutifully applied for this license, but it was taking too long to get approved. So they did the only reasonable thing and turned to steampunk.


Now, there's a story idea.

Within all the country’s many alcohol regulations is a line of law saying the usual license requirements do not apply to trains in motion.

The downside of this is obvious: motion sickness exacerbated by drunkenness.

3. Let’s Call Beer Something Else

Complex regulations define exactly what beverages are, which is why you generally cannot put lemonade in a bottle and sell it as tomato juice. In Texas, they had a rule about beer: It could not contain more than 4 percent alcohol by volume.


Ah, yes, the old "name a thing something else to get around regulations" trick. No wonder fermented beverage categories can be misleading.

A lager that appears to be beer by most conventional definitions would be labeled, in fine print, “In Texas, malt liquor.”

I'm pretty sure some states still have ABV maximums for beer. As the article notes, Texas isn't one of them. But it's not a good look for a state that prides itself on limiting government interference in people's lives. Which, incidentally, they clearly don't do.

2. Turning Nightclubs into Pop-up Restaurants

This next law lasted from 1935 to 2000, in Ireland, a place not entirely unfamiliar with alcohol.


Now that's a cheap shot. Pun intended.

Establishments were not allowed to serve alcohol at night unless they also served a meal.

This sort of thing has been the law here in the US, off and on, depending on where you are. Hell, even my state has a version of the rule, which is why you'll technically never find a bar in Virginia; only restaurants that happen to serve booze.

The dish of choice at these clubs? Curry.

Yeah, that couldn't have ended well.

1. The Inedible Sandwich

That Irish policy hearkens to an older and famous law from New York. Way back in 1896, the same time that they made the controversial decision to raise the drinking age from 16 to 18, the state passed a law saying bars couldn’t serve alcohol on Sundays.


Like I said.

Bars, which served no actual food, qualified for the exemption by offering a sandwich. Not sandwiches, but a singular sandwich that a bar would pass from customer to customer without anyone eating it. This was named the “Raines sandwich,” after John Raines, the senator who wrote the law.

I'd heard about this loophole before, of course. It still amuses me.

The takeaway here is that, if you're clever, you can find a way around unjust laws without flagrantly breaking them. And I approve.


© Copyright 2024 Robert Waltz (UN: cathartes02 at Writing.Com). All rights reserved.
Robert Waltz has granted Writing.Com, its affiliates and its syndicates non-exclusive rights to display this work.
