Items to fit into your overhead compartment |
| The Random Number Gods have once again conspired against me by continuing some of yesterday's theme. I normally avoid HuffPo, but I was unable to resist this article from two months ago: More And More People Suffer From 'Chemophobia' — And MAHA Is Partly To Blame. The fear tactic strikes a nerve with both conservatives and liberals alike. Here’s what you need to know. If you’ve ever muttered to yourself, “I should really get the organic peaches,” or “I need to replace my old makeup with ‘clean’ beauty products” or “I really want to buy the ‘non-toxic’ laundry detergent,” you may have fallen into the chemophobia trap, an almost inescapable phobia that’s infiltrating lots of homes. I have never, not once, muttered or even thought anything resembling those phrases. The closest I ever came was "You know, I really ought to stop eating Twinkies." Chemophobia is complicated, but, in short, it’s a distrust or fear of chemicals and appears in many aspects of life from “chemical-free” soaps and “natural” deodorants to vaccine distrust and fear-mongering about seed oils. Everything is chemicals. Every soap, every drink, every morsel of food, every article of clothing, every folksy treatment for whatever ails you; it's all chemicals. And natural, however you might define it, isn't always good. I'm hardly the first to push back on this. This article: Next time you find yourself thinking, "natural is good; artificial is bad," please remember that tobacco is natural. Appealing to the left, it was seen as counter-culture and opposed the “evil market forces,” said Timothy Caulfield... On the right-leaning side, chemophobia appears as a distrust and demonization of things like studied vaccines and medications and the pushing of “natural” interventions, “when those have no regulatory oversight compared to regulated medicines,” Love noted. So they're against what doesn't have regulatory oversight, but vaccines have regulatory oversight, but *head explodes* Chemophobia leads people to believe that synthetic, lab-made substances are inherently bad while “natural substances” — things found in nature — are inherently good, and that is just not true, Love said. I've said it before, as recently as yesterday and so far back that I can't even remember. But it's nice to see it from someone else, even in HuffPo. But I'm not the only one trying to shine a light into the darkness here. There's even a name for it: Appeal to Nature. The article eventually goes into this. Then: At the core of chemophobia and appeal to nature fallacy is also a “romanticization of ancestral living, when, in reality, we lived very poorly, we died very young and often suffering and in pain,” Love said. The past, with no exception, sucked. The future might suck, too; we can't know, and that's a good thing, because it enables us to keep up some semblance of hope. What I know for sure is that I wouldn't be here to gripe about it if it weren't for "artificial" chemicals and modern medical science. Some people might consider that a good thing; I do not. If someone on social media says that a certain ingredient is harming your kids, you’ll be scared and want to make lifestyle changes. If someone claims your makeup is bad for you, you’ll also be scared and want to make changes. This is, therefore, a very effective advertising technique: Brand X contains chemical Y. So buy my Brand Z, which is completely Y-free! “The reality is, our brains want simple. They want black and white,” said Hardy. That statement is, ironically, simplistic. 
We make choices all day long, which makes categorizing things, like food, as “good or bad” appealing to our minds, Hardy said. Here in reality, everything's got its benefits and risks. But like I said yesterday, humans are utter shit at risk assessment. If you’ve ever fallen into the chemophobia trap without knowing, you aren’t alone. It’s complicated and nuanced, and the science is, at times, messy. This is also true. I've pushed back before on questionable studies and reports on science. I expect to continue to do so. “So, it all just becomes slogans and wellness nonsense,” along with the peddling of unregulated, unproven supplements (that are basically just untested chemicals), Caulfield added. I don't have an inherent issue with so-called "natural" products. That is, while I object to the marketing, some of them may very well be effective. We don't know, though, because many supplements don't have to go through the rigorous testing that, say, vaccines do. And, many of the people who claim to be so concerned about chemicals then profit from the sale of unregulated supplements, Caulfield said. Which rephrases what I said up there. Whether someone has conservative or liberal views that fuel their chemophobia, the fear of chemicals is dangerous. And, it’s, sadly, more prevalent than ever, Caulfield said. I deliberately avoid getting political in here, and I really don't want to get into the "both sides are bad" argument. Hell, even the idea that there are only two sides is oversimplifying things, which, well, didn't we just rail against that? The side I'm on is science and knowledge. I'm not perfect in that regard. Neither is science. But it's the best tool we have right now, and the benefits outweigh the risks. |
| This one from RealClear Science is over six months old, and touches on the US political sphere, so some of it's already outdated. It's really the basic message that's important to me here. Indeed, there is ample evidence that the case for truly scientific policymaking must be made no matter who is in charge. Difficulty: people like to pick what science they accept and what they don't, and that tendency is different across the political spectrum. Consider a recent example: the FDA’s ban of red dye no. 3, which was based on anecdotal evidence, motivated by fear and, unfortunately, required by much older law grounded in bad science. And just one of the many things that have fallen off the radar from early this year. Advocacy groups used a 49-word law from 1958 called the Delaney Clause to force the Food and Drug Administration (FDA) to ban the product, a synthetic food dye known as “FD&C Red No. 3.” The clause states that FDA cannot approve food additives that cause cancer in humans or animals, regardless of whether the cancer is relevant to humans. I just can't get worked up over the dye removal. Yes, the issue was based on bad science. But it's not like when they banned alcohol: I don't know of anyone who ate stuff with this red dye because it was tasty. Moreover, science is never fixed or “settled.” Therefore, laws or regulations based on science are also never settled. Rather than a collection of facts, science is an ongoing voyage of discovery. This is really the part I wanted to emphasize. Again, though, there's a complication: there's a significant number of people who will not change their minds when the facts change. And a lot of them are in politics. The “zero-risk” concept used by the Delaney Clause was always problematic for its flawed science, and it remains an ongoing source of controversy. There is no such thing as "zero-risk." As I like to say, your chance of getting killed by a shark while inland is astonishingly low, but never zero. And there's this, which supports things I've written in the past: Tom Standage, deputy editor of The Economist, wrote in his book, “An Edible History of Humanity,” that “The simple truth is that farming is profoundly unnatural … and all domesticated plants and animals are man-made technologies.” The upshot is that every single item in supermarkets has received some amount of manipulation by humans and finding any boundary between natural and unnatural becomes impossible. In general, people are terrible at risk management. I'm seeing this now in the ongoing development of autonomous vehicles: it seems like many people wouldn't be satisfied with reducing the current human-driver accident rate; they would only accept AVs if there were zero risk. And that doesn't make sense to me: If 40,000 people a year have been killed in automobile accidents, and AVs could get that number down by a decent amount (say 20%, down to 32,000), then the choice is clear to me. But no, ooga-booga robots. So, yes, I'd like to see laws based more on science than fear or greed. But even in that utopia, what science are we going to emphasize? And in the end, laws are more about philosophy than about science. |
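A quick aside on that arithmetic, since the parenthetical math is easy to misread: here's a minimal sketch of the trade-off, using the 40,000-deaths-per-year figure from above. The baseline number comes from the entry; the reduction percentages are made up for illustration, and this is not a real safety model.

```python
# Toy illustration only: made-up reduction rates applied to the rough
# 40,000-deaths-per-year US traffic fatality figure mentioned above.
BASELINE_DEATHS_PER_YEAR = 40_000

def deaths_with_avs(reduction_fraction: float) -> int:
    """Annual deaths if autonomous vehicles cut the fatality rate by reduction_fraction."""
    return round(BASELINE_DEATHS_PER_YEAR * (1 - reduction_fraction))

for cut in (0.0, 0.20, 0.50, 0.90):
    remaining = deaths_with_avs(cut)
    saved = BASELINE_DEATHS_PER_YEAR - remaining
    print(f"{cut:.0%} reduction -> {remaining:,} deaths/year ({saved:,} lives saved)")
```

Even the modest 20% case works out to 8,000 fewer deaths a year. "Not zero risk" can still beat the status quo by a wide margin, which is the whole point.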
| Bread is food. Everything else is a condiment. From Vox: Sorry, it’s true: The US really does have crappy bread The difference between a baguette here and in France is a matter of law. And don't get me (re)started on the differences in cheese. Have you ever gone on a trip to another country and thought, “Why does the food here taste so much better than the food in America?” No. I mean, yes, I've gone on trips to other countries. But I just take the "why" as a given. Yes, the fact that you’re in a new and exciting environment is a factor. I'm assuming the article is about vacationers, not soldiers. But you also aren’t imagining things: other countries have different ways of preparing and producing food that factor into what you’re tasting as well. And yet, it's not always "better," which is largely a matter of personal taste, anyway. Take the French baguette: that iconic bread that brings to mind berets and bicycling along the famous Champs-Élysées avenue as accordion music plays in the background. Arrêtez avec les stereotypes. I know I've mentioned this before, but there's a bakery near me that makes excellent baguettes (and other breads, and pastries). The baguettes there are, in my opinion, superior to any I had in France. Before you start dusting off the guillotine, France, I freely admit that most of the bread I had there was in hotels, and they likely sourced cheaper loaves. And don't get me wrong; they were delicious. I'm just lucky to live in a town with a really good artisanal bakery. According to Eric Pallant, the author of Sourdough Culture: A History of Bread Making from Ancient to Modern Bakers, that image is no accident; France is so invested in its bread that the country made a law protecting it from the encroaching mass-produced bread market. If only we could have done that in the US with beer. The rest of the article is a podcast transcript with more from Pallant. Just one more quote from him: For 6,000 years, nobody knew what made bread rise. It was just magic. You put this glop called sourdough starter into a dough, and like magic, it rises. By the 1870s, 1880s, Louis Pasteur discovered that yeast are living things. Beer was magic until then, too, and for much the same reason. (Oh, who am I kidding? It's still magic.) And, let's see, what nationality was Pasteur, again? Ironically (maybe), French cheese is so good because much of it is not made from pasteurized milk. And no, I'm not agreeing with a certain public official here; raw milk is dangerous. But the cheesemaking process all but eliminates that danger in cheese. There's still a very tiny chance of bad things happening, but a) people in the US have died from eating fuckin' lettuce and b) good cheese is worth the infinitesimal risk. As usual, there's more at the link if you're interested. The one downside of the local bakery I mentioned? Because they don't use preservatives, you gotta eat the bread pretty much the same day you buy it. Or, well, I seem to have cracked the code: Freeze it and eat it within a week or so. One more interesting thing: in the French versions of fairy tales, what we call a magic wand, they call une baguette. I find this highly appropriate. |
| The Guardian tackles the burning question of our time: Why is that even a question? It's blindingly, mind-numbingly obvious that night owls are superior in every way. We're also more attractive and intelligent, and wealthier. We all know that early birds get the worm. But who wants a worm? My favorite riff on that is forever "The second mouse gets the cheese." The article, however, saves that bit of wisdom for (spoiler!) the end. US work culture is not really optimal for night owls. Rather, it favours CEOs who get up at 4am and run a marathon while the rest of us hit the snooze button. Funny how the author cites US work culture and then uses the UK spelling of "favours." Anyway, whatever, I don't believe one bit of the mythology surrounding CEO "success." I think in most cases, they've got cause and effect reversed. Still, I always consoled myself with the idea that night owls are actually more intelligent and creative than their early bird counterparts. And prettier, don't forget. Franz Kafka and Thomas Wolfe wrote at bedtime; Bob Dylan recorded at night. Even scientific studies indicated it was true. Okay, well, to be serious for a moment, those are probably more cases of causal reversal. Musicians, in particular, tend to be night owls; that doesn't mean all night owls are musicians. Me, for example. Couldn't carry a tune in a bucket with robotic assistance. No, seriously, even autotune would take one listen to me, then spark and smoke. Also, professional writers generally aren't forced to follow anyone else's schedule. However, something weird has happened. Due to a combination of existential dread, cutting out alcohol and having a small child who wakes me up at an ungodly hour, I’ve started to go to bed earlier and earlier. This is why I don't cut out alcohol or have kids. The existential dread thing is, however, inescapable. This shift has me questioning everything. Is it actually possible for your body clock to change? Am I really turning into an early bird or have I just been forced into a child-dictated schedule? And if I am an early bird, does that make me a lesser being? My guesses: Maybe, the latter, and definitely. My first discovery: you can’t help who you are; your optimal bedtime is hardwired into you. I really dislike "hardwired" when applied to biological entities. It's a metaphor, yes, I know. I've shifted my sleep hours many times over the course of my life. That doesn't mean I've enjoyed it. Fortunately, I'm now in a position to sleep when I want and, usually, wake up naturally. Back to that all-important question, though: does being a night owl make you more creative and intelligent? Jokes aside (my answer is always "yes"), let's not confuse correlation with causation. More recently, researchers at Imperial College London studied UK data on more than 26,000 people who had completed various intelligence tests. The 2024 paper found that those who stay up late had “superior cognitive function” to morning larks. Like that, for example. My actual (non-comedic) hypothesis is that your mental function is optimized when you can most closely adhere to your chronotype, and a lot more owls are forced into a lark schedule than vice-versa. “The timing of the biological clock determines more than just when people like to go to bed and get up and when they feel most alert during the day,” says Van Dongen. 
“It also determines in part the kinds of activities they may end up participating in and the experiences they get exposed to.” If you naturally get up early, it may be easier for you to thrive in a corporate work environment, for example. Or, you know... that. One thing sleep experts all seem to agree on is that trying to force a routine that is at odds with your biological clock is unhealthy. And yet, people are forced into such routines, well, routinely. Being a night owl may come with other risks. A 2024 study from Stanford Medicine researchers found that being up late is not good for your mental health, regardless of chronotype. The reasons are unclear, but researchers suspect it’s because unhealthy behaviours such as drinking alcohol or eating junk food are more likely late at night. Counterpoint: I do most of my drinking (which does not happen every day) during the afternoon. I don't call it day-drinking, though. There's no such thing as day-drinking. There is only drinking. There's quite a bit more at the article, so you can go there if you want. Meanwhile, I'll be over here sleeping. |
| Oh, this is rich coming from Mental Floss, not the most trustworthy source of information. "Waltz, if they're not trustworthy, why link them?" Well, someone's gotta point this shit out. No outlet is right all the time. Not print encyclopedias, not Wikipedia, not the BBC, not ChatGPT, and certainly not me. If you’re looking for some unbelievable tidbits to whip out at parties, the latest episode of The List Show has plenty. You can read a few of them below, and get the full list by watching the video above. Because of all my script blockers and add-ons, I don't see the link to the video. I assume someone will. But I don't usually work with videos in here; I'd rather have text I can copy/paste. I really only have a few to comment on. Letters are called “uppercase” and “lowercase” because of where they were stored. The technical terms for the letters are majuscule for uppercase scripts and miniscule for lowercase scripts, which both have Latin roots. While this isn't incorrect, I noticed that they used both "minuscule" and "miniscule" as the lowercase spelling. According to at least one dictionary, the second is a common variant, though plenty of people still consider it an error. Sharks are older than trees. The first thing we’d call a tree, Archaeopteris, dates back 350 million years, and it’s now extinct. Sharks, meanwhile, have been around for 400 to 450 million years. Well, first of all, why name an old tree so similarly to the name for the flying dinosaur archaeopteryx, if not to confuse all of us? And second, I do hope that people understand that this means "sharks were around before trees" and not "the sharks swimming in the ocean right now are older than the living trees on land." Bananas are naturally radioactive. Oh no! Ooga-booga! Fear everything! I've also noticed an uptick in "Your ______ is absolutely loaded with bacteria!" articles, which experienced a downswing during Covid, because all the fearmongering was centered on that. For the record, almost everything is radioactive to some degree, and bacteria are almost everywhere. Now, when radioactive bacteria get loose and start taking over the world, maybe then you need to start being afraid. Butt is a real unit of measurement. If you’ve ever talked about having a “buttload” of something, hold on to your butts, because butt is a real unit of measurement. As a student of the lore surrounding fine fermented and distilled beverages, I knew this. I also know there's a large barrel called a tun, which is equivalent to four hogsheads. I know a butt-tun of this archaic information. Ladders kill more people than sharks. Sharks again? Fine. Cows also kill more people than sharks. Not just bulls. Cows. Lots of this is just availability bias. We don't share a habitat with sharks. Most of us don't share a habitat with cows, either, but many do. Sharks are also older than the North Star. As we saw earlier, sharks, in something resembling their current form, are thought to have been around for nearly half a billion years. Turns out they think Polaris is quite a bit younger. But maybe the real shocker here, for some, is that Polaris wasn't always the North Star, and in the future, it once again won't be. Farts can leak out of your mouth if you hold them in. And we're done here, so you can go try that one for yourself. |
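Since I bragged about knowing a butt-tun of archaic cask trivia, here's a little sketch of the traditional English wine-cask ratios as I remember them: two hogsheads to a butt, two butts to a tun. The gallon figures are the traditional wine-gallon values, which varied by era and by what was in the barrel, so treat them as ballpark.

```python
# Traditional English wine-cask measures (historical values varied widely).
WINE_GALLONS = {
    "tun": 252,        # 1 tun = 2 butts = 4 hogsheads
    "butt": 126,       # a.k.a. a pipe; the original "buttload"
    "hogshead": 63,
}

def buttloads(cask: str) -> float:
    """How many buttloads fit in the named cask."""
    return WINE_GALLONS[cask] / WINE_GALLONS["butt"]

for cask, gallons in WINE_GALLONS.items():
    print(f"1 {cask} = {gallons} wine gallons = {buttloads(cask):g} buttload(s)")
```

So a buttload is half a tun, and a hogshead is half a buttload, in case that ever comes up at parties.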
| Not my usual kind of thing, I know, but this one from Taste caught my tongue. I mean, eye. Whatever. The Chain That Defined “Mexican” for 1990s America Wants Back In Chi-Chi’s once filled dining rooms with fajitas and frozen margaritas. Will diners still buy what it’s selling? Send the immigrants back to Mexico but keep their food? Yeah, that's on brand for the US. When I was in college at California State University, Northridge, Monday nights meant sorority meetings that sometimes stretched late into the evening. You want me to make a joke about sororities here, but I'm not going to. Afterward, we’d usually end up at Denny’s for midnight mozzarella sticks, boneless Buffalo wings dipped in ranch, and water with lemon. Note the phrasing there: "end up at Denny's." That's because one doesn't choose to go to Denny's; one ends up at Denny's. For decades, Chi-Chi’s offered its own version of “Mexican” dining: chimichangas as big as your head to go with giant goblet-size margaritas and family celebrations staged in suburban strip malls. For the record, frozen margaritas are abominations. Founded in 1975 in the Twin Cities suburb of Richfield, Minnesota... I didn't make a joke about sororities up there, but I'm going to make a joke about Minnesota. Ready? What were they thinking? Minnesotans consider mayonnaise to be too spicy. Okay, I'm done. But the energy Alvarez remembers so vividly didn’t last. By the 1990s, the buzz that once made Chi-Chi’s a magnet for large gatherings had begun to fade as tastes shifted. Across the United States, taquerías and mom-and-pop Mexican restaurants gained traction, offering food that felt closer to home for Mexican Americans and newly arrived immigrant families and more enticing for diners searching for “authenticity.” I don't give a shit about "authenticity," and I don't believe in food as cultural appropriation, but, to be blunt, Chi-Chi's and its ilk were simply embarrassing. Not that I've never eaten at one. We had one here for a couple of years, before they went tetas-up. But they didn't last very long. See, we have a long-standing local restaurant here called Guadalajara, which is run by actual Mexican-Americans, and not Minnesotans in plastic sombreros. Now, after more than two decades, Chi-Chi’s is staging a comeback. Its first U.S. restaurant opened on October 6 at 1602 West End Boulevard in suburban St. Louis Park, Minnesota led by Michael McDermott, son of the chain’s co-founder. Wow. That's an ad. Dammit. I think I see what's going on here. Obviously, Chi-Chi's, while dormant, spent millions lobbying the US government to crack down on immigrants, and now they're hoping to recoup the cost by restarting their soulless fake-Mex chain. Arellano has long argued that chains like Chi-Chi’s mattered because they normalized Mexican food for mainstream America, even if in caricature, and primed the way for more authentic iterations of the cuisine down the line. Okay, I guess? Pretty sure that my town isn't the only one that had decent Mexican food before Chi-Chi's. I don't have much of a point here today except to rag on big chain restaurants. And maybe Minnesota, just a little bit. Okay, I'm kidding; last time I was there, I made it a point to eat at places that are generally known for spicy food, such as Indian and Thai. Neither one of them disappointed, and both were well-attended. But they weren't big soulless corporate chains. |
| You know how the British Museum is often criticized for appropriating bits of history from other countries? Well, here's a chance to turn the tables on the UK. From BBC: People can own a piece of Isaac Newton's home as part of a conservation project. And why would anyone want to do that? Well, I think there are more math/science/history nerds out there than pop culture would have us believe. We are legion. Still, I'm not going to bite. Why? Because I've been conditioned by the hucksters running rampant across my own country to believe that anything like that is ripe for grift. For instance, one could chip a piece of brick from any old house and call it Newton's. It is claimed the scientist, born at Woolsthorpe Manor near Grantham in 1642, observed an apple falling from a tree that led to his theory of gravity. "It is claimed," BBC? Get out of here with the passive voice. Newton himself started that bit of mythology (the article does nod to that at the end). What almost certainly didn't happen was the way it's generally portrayed: the apple falling on Newton's head while he sat contemplating under an apple tree. Anyone who's been around an apple tree for any length of time has seen an apple fall to the ground, which is why smart people don't sit under those trees when they're fruiting. Or any tree during a lightning storm. The giant intuitive leap for Newton was to connect that fall with the orbits of the moon and planets. Okay, I'll shut up now. Math/science/history nerd mode off. Ms Johns said the local limestone being replaced had been there for about 500 years. I forget who said it first: In the UK, a hundred miles is a long way. In the US, a hundred years is a long time. 500 years is twice the age of my country. But it is quite a long time for exposed limestone to last without eroding beyond use. Visitors can buy various sizes of the local limestone, with a donation that will go towards conservation work. So despite my cynicism above, I don't think this is an actual grift. Sizes vary from the "height of a desktop computer or a little slither that could act as a paperweight", Ms Johns said. Yes, in British English, "slither" can be a synonym for "sliver." So when reading that, I imagined chipping off smaller and smaller "slithers" from the base stone, until what's left is so tiny as to be... infinitesimal. That's a calculus joke. So is the title of today's entry. I'll be here all week, folks. Be sure to tip your servers. |
| I'm a little concerned about the name of this site (Popsugar), but the article was relevant enough to my interests to save it. Shouldn't that be "Sleep Unsupportive Family Trauma?" Like, who has trauma from a family who doesn't sleep-shame? I was today years old when I learned I don't come from a "sleep supportive family." Okay, that opening is Strike 2. The construction "I was today years old when..." was trendy and precious ten years ago when I first heard it; now, it's just tiresome. (The site name was Strike 1.) As Dana Joy Seigelstein describes it in her now viral TikTok video, a sleep supportive family is when the loved ones in your life encourage sleep. Aaaaaaaand strike 3: referencing a DikDok video and calling it viral. So why am I still here? Because my anger at people who sleep-shame exceeds my frustration at social media trends. If I so much as closed my eyes on the couch or spent a few extra minutes in my bedroom alone, I would have to endure passive-aggressive comments from my sisters for the rest of the day like, "Good morning, princess" or "Oh, are you done being lazy?" My parents would also decide that if I had enough time in the day to take a nap, I would have enough time to complete a new chore when I woke up. Bet I know where the sisters got it from. Naps weren't the only issue in my household, though: sleeping in was also not recommended. (Again, never a rule said out loud, but always implied. I'm sensing a theme here.) If I slept in too late, I'd wake up to different people taking turns poking their head into my room to see what I was doing — not in a loving, "Hey, I'm checking in on you way" but in a "Why are you still sleeping?" way. That would get real old real fast. And when it's time for future me to have children and be a mom, I plan on breaking the cycle. If you really wanted to break the cycle, you would remain childfree. For those out there who are not, though: I wonder how much of the prenatal "I'm going to do my parenting thing this way" goes right out the window when faced with the reality of a larval human? The real issue (apart from the trendy nonsense, which I can kind of excuse given the author's apparent age) is that there are people who like to impose their own sleep schedules and preferences on others, and I really wish that would stop. Not everyone's a morning person. Some of us are biphasic, relying on two shorter sleep cycles instead of one longer one. Like when they say "Good morning" when you wake up from an afternoon snooze. Or "Look who decided to join us" when you're the last one waking up. All of these things are major contributors to me wanting to live alone. Not that I currently live alone; apart from the cats, I have a housemate. But I've trained the cats out of waking me up (mostly), and the housemate is on an even weirder sleep schedule than I am, and we don't generally get in each others' face about sleep. Or much of anything, really; it's a pretty good arrangement, at least in my view. Because nothing, and I mean nothing, feels better than getting enough sleep. And my definition of happiness is going to bed knowing that there's not a particular time when I have to wake up. |
| A bit about our feline overlords from The Conversation: The headline alone makes me wonder if the author actually lives with cats. Because one of the first things they do to our brains is disabuse us of any notion of "owning" them. Cats may have a reputation for independence, but emerging research suggests we share a unique connection with them – fuelled by brain chemistry. As usual, note the language: "emerging research suggests." The author (one Laura Elin Pigott, whose pronouns I'm going to assume based on the name) is careful to hedge her declarations here, but it's easy to slip past that. All I'm saying is that this shouldn't be taken as definitive science, like gravity, relativity, or evolution; but as a description of current research that is still subject to review and confirmation. I'm not saying I think it's wrong, mind you. The main chemical involved is oxytocin, often called the love hormone. Well. Oxytocin has been having a moment this year. I've seen it bandied about quite a bit. As its Wikipedia entry shows, that's quite the chemical structure, isn't it? The formula is C43H66N12O12S2 (same source). So yeah, don't be afraid of "chemicals." If you are, there's a chemical regulating that, too. And now studies are showing oxytocin is important for cat-human bonding too. This shouldn't be surprising to any cat person, but as always, it's good to have studies to back it up. Oxytocin also has calming effects in humans and animals, as it suppresses the stress hormone cortisol and activates the parasympathetic nervous system (the rest and digest system) to help the body relax. We all know that pets help us relax (except maybe chihuahuas). It's good to delve into the mechanism. Scientists have long known that friendly interactions trigger oxytocin release in both dogs and their owners, creating a mutual feedback loop of bonding. Until recently, though, not much was known about its effect in cats. Yeah, probably didn't include chihuahuas in that study. Cats are more subtle in showing affection. Yet their owners often report the same warm feelings of companionship and stress relief that dog owners do – and studies are increasingly backing these reports up. Some cats are more subtle in showing affection. Again with "owners." Dogs may have owners, but cats have staff. In that study, women interacted with their cats for a few minutes while scientists measured the owners’ hormone levels. Right, because only women care for cats and only men care for dogs. Snort. Many people find petting a purring cat is soothing, and research indicates it’s not just because of the soft fur. The act of petting and even the sound of purring can trigger oxytocin release in our brains. And what's the stress hormone? Cortisol? Yeah, that's probably what gets triggered in me when a dog won't stop barking. And, to be fair, when the cat's on the table with something fragile near its edge, or when I wake up at 5am to the unmistakable sound of feline gastric distress. One 2002 study found this oxytocin rush from gentle cat contact helps lower cortisol (our stress hormone), which in turn can reduce blood pressure and even pain. I'm not a scientist. I just whistle at its butt as it walks by. And anecdotes aren't science, but this is my blog, so here's one: I used to have a calico cat named Maggie (not to be confused with my current calico, Zoe). Maggie was a very particular cat. Hated everybody and everything except me. 
Even my first wife, who picked her, couldn't really get along with her, so I got the cat after the divorce. The cat was never cuddly or attention-seeking; she'd occasionally let me pet her, but the only reason I knew she didn't hate me was that she didn't hiss, spit, or run away when I came near, the way she'd do with everyone else in the universe. (Now that I think of it, all of that describes my first wife, too.) One day, I had a massive pain in my neck, so bad I couldn't even get out of bed or roll over, so I just lay there in the most comfortable position I could manage and hoped that, eventually, someone would come by and help. Well, no one did, but Maggie jumped up on the bed, settled down on my upper chest on the side that was in pain, and started purring. I'm not going to claim that it healed me. Time did that, as usual. But she'd never done that before, and to do so on exactly the right spot, and in a way that didn't make the pain worse and maybe even relieved it a bit, well, that was really out of character for her. Toward the end of her life, she mellowed out some and would sit on my lap. Despite her neuroses, she was a good kitty. A February 2025 study found that when owners engaged in relaxed petting, cuddling or cradling of their cats, the owners’ oxytocin tended to rise, and so did the cats’ – if the interaction was not forced on the animal. That's the thing about living with cats: you learn to respect boundaries, if that somehow hadn't been drilled into you at an early age. Maybe humans could learn something from their feline friends on managing attachment styles. The key to bonding with a cat is understanding how they communicate. Isn't that, like, key to bonding with anyone? I don't trust the popular "love language" pop-sci, but communication is kinda important. As always, there's more at the article. No, dogs aren't completely left out of the narrative there, so it's not just about cat people. |
| An etymology lesson from NPR, which, yes, is still around: If you hear someone asking that question (or the similar "What's the tea?"), they're probably not referring to the steeped, hot beverage that has long been cherished the world over. Just to be clear that we're not talking about the world's most important beverage (sorry, coffee). Instead — to put it in other slang phrases of days past — they probably mean "what's the haps" or "what's the hot goss" or, for children of the pre-internet era, "what's the 411?" Or, you know, "dish" or "scoop" or any number of other synonyms for gossip, which may rival even "penis" for number of different names for it. The word traces its origins to Black gay culture. Okay, well, that shouldn't be too surprising. A lot of our slang traces to either Black or gay culture. "Three southern-born informants explained 'tea' as 'gossip,' such as that exchanged between 'girls' taking tea in the afternoon. They indicated that the expression was black and originally southern." Southern? So it's really sweetened iced Lipton. The original "tea" — the oolong or Earl Grey kind here — traces its English roots to the 17th century, where it was spelled and pronounced "tay," according to the Online Etymology Dictionary. "Tea" evolved from the word "chaa," which was derived, in part, from the Chinese "ch'a." That much, I knew. I've heard that there's only two words in all languages for the drink: ones that sound like "tea" and ones that sound like "cha." Like "chai" from India, or "thé" (pronounced tea) in France. Legend has it that in 2737 B.C.E., Chinese Emperor Shen Nung was seated under a tea tree when some of its leaves blew into a pot of water that his servant was boiling, leading the emperor to try the beverage, according to the Tea Advisory Panel, an industry group. Legend for sure. That has all the hallmarks of mythology. What probably actually happened was something like: someone from the lower classes who couldn't afford other drinks started steeping leaves, and discovered one that a) wasn't poisonous b) tasted good and c) gave them more energy with which to serve their overlords. The upper classes wouldn't ever admit that anything the peasants did could be good, so they made up the "apple fell on Newton's head" mythology. "So in 2014-ish, there started to be a lot of memes of Kermit the Frog, and one specific one — looks like it was posted to Tumblr — was a photo of Kermit the Frog sipping a Lipton tea," said Amanda Brennan, an internet culture and meme expert who is known as the Internet Librarian. Oh, a copyright and trademark violation all rolled into one! There's more history at the link; that is, if you consider stuff that happened this century "history." That's it, really. I just like knowing where words (and new uses for words) come from, so there it is: the tea. |
| I thought I might have addressed this at some point, and it turns out I did, in the previous blog, back in 2021. That entry can be found here: "My Baloney Has a First Name..." Carl Sagan’s Baloney Detection Kit: Tools for Thinking Critically & Knowing Pseudoscience When You See It Part of the reason I'm doing this is that I found an inconsistency. I wouldn't call it "baloney" or fake news or bullshit, but it illustrates exactly why we shouldn't take words on the internet to be absolute truth without some backup. Though he died too young, Carl Sagan left behind an impressively large body of work, including more than 600 scientific papers and more than 20 books. And yet, he was best known for his Mister Rogers-like TV personality. Sagan’s other popular books... are also well worth reading, but we perhaps ignore at our greatest peril The Demon-Haunted World: Science as a Candle in the Dark. Published in 1995, the year before Sagan’s death, it stands as his testament to the importance of critical, scientific thinking for all of us. It has been too long since I read that. If I can't even remember what I posted here four years ago, it would do me well to revisit it, since I try to promote real science in here. The article lays out the "Baloney Detection Kit." Or does it? Both this article and the one I linked in 2021 claim nine principles, starting with Wherever possible there must be independent confirmation of the “facts.” But today's link ends with "Occam's Razor," while the previously linked article had a lengthy bit about hypotheses needing to be, in principle, falsifiable, after that one. This doesn't sink to the level of fake news, in my opinion. It's, at worst, a different way to look at the source material (which, I reiterate, I haven't seen in decades). The article isn't a scientific paper. If you make a transcription error in a scientific paper, bad things happen. I had one in here a while back about the ooga-booga scare over black plastic cooking utensils; turns out they'd misplaced a decimal, and black plastic is about as safe as anything in your kitchen, and safer than most. I suppose even that is better than if the mistake went in the other direction, calling something safe when it's not, but still. The particular team involved in that, as I recall, had some sort of bias against the utensils (perhaps they were being paid by a manufacturer of different kinds of utensils, perhaps not), and it's that kind of bias that science is supposed to mitigate, as noted in the article: As McCoy points out, these techniques of mind have to do with canceling out the manifold biases present in our thinking, those natural human tendencies that incline us to accept ideas that may or may not coincide with reality as it is. If we take no trouble to correct for these biases, Sagan came to believe, we’ll become easy marks for all the tricksters and charlatans who happen to come our way. And there are more tricksters and charlatans than ever before. Or, at least, they have a broader range with the internet and all. Now, other sources break the principles up slightly differently, too. I suppose it's a bit like the Ten Commandments, which vary depending on which version and translation of the Old Testament you look at. “Like all tools, the baloney detection kit can be misused, applied out of context, or even employed as a rote alternative to thinking,” Sagan cautions. 
“But applied judiciously, it can make all the difference in the world — not least in evaluating our own arguments before we present them to others.” Sagan was, apparently, a far nicer person than I am, because I call it bullshit. Baloney is at least edible. Though "bullshit" holds the implication that it's deliberate, it's not always so. Whichever version you see, though, I think the principles are sound. I may not have memorized them, but I still find myself applying them and, often, find articles that come up short. This isn't always a science problem; most of the time, it's a writing problem. And that's what we're really here for, isn't it? |
| From Smithsonian over 4 years ago, a bit of cinema history. How ‘One Hundred and One Dalmatians’ Saved Disney Sixty years ago, the company modernized animation when it used Xerox technology on the classic film. Really, that's about all there is to it; the rest is explanation. Take a closer look at Walt Disney’s 1961 animated One Hundred and One Dalmatians film, and you may notice its animation style looks a little different from its predecessors. But to do that, you also would have to see those older films. Be careful, or you might become a movie history fan. That’s because the film is completely Xeroxed. No higher honor for a corporation than having their brand verbed. “The lines were often very loose because they were the animators’ drawings, not assistant clean-up drawings. It really was a brand new look,” says Andreas Deja, former Walt Disney animator and Disney Legend, about Xerox animation. Heh, you can say that again. Deja? Again? No? No? Come on, that was a good one. ...tough crowd. With animation growing more expensive, tedious and time-consuming in the mid-20th century, Xeroxing allowed animators to copy drawings on transparent celluloid (cel) sheets using a Xerox camera, rather than having artists and assistants hand-trace them. Thus continuing the trend of machines taking over our jobs. The article goes into some detail over how animation was done before machines took over those jobs, too. It's interesting, at least to me, but I don't have much to comment on. While Walt Disney didn’t necessarily dislike Xeroxing, he found it hard to get used to the harsh look, especially for a story like Dalmatians that he adored. “It took a few more films before he softened his attitude toward it,” says Deja. He was also more concerned with upholding Disney’s iconic quality and charm than with finances. Things are a bit different now, huh? As the article points out, they later moved on from this technique, too. Obviously. Nothing else to say, really. I can't say Dalmatians was my favorite movie or anything, but this little insight into animation techniques and their history is pretty cool. |
| If it seems like I post a lot about cheese, that's because I like cheese. An important article from Food&Wine: This Is Officially the Best Cheese in the World, According to the 2025 Mondial du Fromage A perfectly aged wheel from Switzerland triumphed over more than 1,900 entries to win Best in Show at the prestigious competition in Tours, France. You might say the wheel rolled over its competition. The 2025 Mondial du Fromage took place in mid-September in Tours, France, bringing together the global cheese trade for celebration, exposition, and competition. As much as I love cheese, and as much as I like being in France, I don't think I could have handled being there. If you think wine snobs are bad, wait'll you meet a cheese snob. Especially a French one. This year, Switzerland’s Le Gruyère AOP Vieux, crafted by Simon Miguet of Fromagerie La Côte-aux-Fées, earned Best in Show among more than 1,900 submissions. No wonder they won. They had help from the fairies. While the cheese has long been a top contender at international competitions, history was made on the cheesemonger side: for the first time, Americans took both gold and bronze. Ooof. That's gotta sting worse than when California started winning wine competitions. Comparable to Olympic gymnastics in its rigor, the contest tested every aspect of cheesemongering. “The challenges encompass every aspect of cheesemongering, from general knowledge, to blind tasting, to making exact cuts by weight, to service and presentation,” says Johnson. I know you were thinking it. So I had to include this bit to show that yes, indeed, there was a cheese-cutting competition. Having two Americans on the podium marked a breakthrough. “We have always been viewed as underdogs in the global cheese community,” says Johnson. I've long thought of the US cheese scene as a far distant cousin to that of France, Switzerland, and even the UK. But perhaps things are changing. I still don't see a lot of specialty cheeses of US origin in stores. I know they exist. Hell, there's a monastery near me that makes its own gouda for sale, much as Belgian monasteries make beer. It's highly local and small-batch, though. The monastery is, incidentally, run by nuns. Don't ask me; I don't know. Complicating matters, the generic use of “gruyère” in the U.S. allows domestic producers to flood the market under the name. “The word ‘gruyère’ has been allowed to be used out of context,” Moskowitz explains. “Le Gruyère AOP is a protected cheese from Switzerland, but the word ‘gruyère’ by itself in the U.S. is like ‘cheddar,’ and the market is about to be flooded with domestic gruyère.” And yes, this is akin to the use of "sparkling wine" to denote fizzy wine from outside the Champagne region. As far as I'm aware, there's no international body enforcing AOPs or AOCs, though. The UN has other things to do, I suppose. All the French can really do is look down upon the American upstarts with scorn, but they're going to do that regardless of what we call our cheese. |
| I wish other people would stop making up deranged portmanteaux. They all suck, except for the ones I come up with. Like this one from Kiplinger: Noctourism: The New Travel Trend For Your Next Trip It's night owls' time to shine, because some of the best travel experiences happen at night. Also, it's not "new," and it's not a "trend" just because someone's desperately trying to make it one so that they can profit off of it. Curmudgeonly rantings aside, I wholeheartedly support nighttime activities. Typically when you plan a trip, you're packing your day with activities. No, I'm not. I don't want to return from the trip more exhausted than when I left. But what if instead of focusing on the a.m. hours, you centered your vacation around the p.m.? Ugh. It seems that in their hurry to be precious about this "new" "trend" (it's neither), they can't even get the hours of the day right. Night stretches from p.m. to a.m., for instance. Noctourism refers to travel experiences that occur at night — think stargazing or nighttime excursions. Not to minimize the awesomeness of skywatching—anyone who's been following along knows that I absolutely encourage going out and looking at the night sky—but I'm of the firm opinion that nothing good happens before sunset. If I had my way, as a night owl, I'd sleep most of the day and enjoy the benefits of the dark hours. But the hospitality and travel industries are stubbornly stuck on a lark's schedule. Check-in is at 4 pm. Check-out is at 11 am. Breakfast is from 6-9 am only, hours when I prefer to be asleep. It can be downright freeing to plan a vacation with an emphasis on the evening, rather than feel the pressure to get up-and-at-'em each morning. If you're feeling pressure, maybe you should see a doctor. Or get better traveling companions. You're on vacation, for fuck's sake. Relax. Then the article goes into specific post-sunset activities. It's a not-very-well-concealed ad for various travel services. While I tolerate book ads here, I don't like double-promoting other products. The article's there at the link if you're interested; some highdarks (they're not highlights) include the auroras and finding a place with dark skies for stargazing, but also other more Earthbound activities that are, like everything except sunbathing, better at night. I overcame my resistance to posting stealth ads here so I could rant about the name they use for nighttime tourism activities, which makes me gag every time I see it in the article. But like I said, the concept appeals to me. Now, where did I leave that bottle of Type-B? |
| This conversation keeps coming back to life, this time courtesy of The Conversation: Well, with the usual caveat that I'm not a biologist, I can answer that: we're about as close as we are to infinity. Remember the other day, I talked about an AI meant to emulate the personality of a dead person? It's kind of like that, only with genetic manipulation instead of neural networks. US biotech company [redacted] says it has finally managed to keep pigeon cells alive in the lab long enough to tweak their DNA – a crucial step toward its dream of recreating the dodo. (Their name is redacted in this entry because they've been lawsuit-happy lately.) And if they're not using actual dodo DNA, it gets even worse. Life is a continuum, and the extinction of a species breaks that continuum. At best, you get some very confused pigeons. This is an achievement avian geneticists have chased for more than a decade. Not to downplay the actual achievement here, which can be important to science in other ways, as the article notes. Those cells, once edited, [redacted] spokespeople say, could be slipped into gene-edited chicken embryos, turning the chickens into surrogate mothers for birds that vanished more than 300 years ago. Okay, some very confused chickpigeons. Remember the hype over the "dire wolves" a few months back? Yeah. Same people. Also not dire wolves. Almost everything we know about bird gene editing comes from chickens, whose germ cells (cells that develop into sperm or eggs) thrive in standard lab cultures. Pigeon cells typically die within hours outside the body. Chickens are also tastier than pigeons. Contrary to popular belief, the dodo wasn't hunted to extinction. That process may create a bird that looks like a dodo, but genetics is only half the story. The draft dodo genome was pieced together from museum bones and feathers. Gaps were filled with ordinary pigeon DNA. If this sounds familiar, yes, I know, Jurassic Park. Since birds are actually dinosaurs, I suppose this is life imitating art. Sort of. Due to the fact it is extinct and cannot be studied we still don’t know much about the genes behind the dodo’s behaviour, metabolism and immune responses. I had an English teacher in high school who would knock our grades down one letter every time we used "Due to the fact that..." in a paper. Getting past that, though, yeah, what they said there. Plus, organisms don't exist on their own. They're part of an ecosystem. The ecosystem that supported the dodo has moved on. There is also the matter of the chicken surrogate. A chicken egg weighs much less than a dodo egg would have. In museum collections there exists only one example of a Dodo egg, and that is similar in size to an ostrich egg. Thus launching a whole new slew of "which came first?" jokes. These caveats are why many biologists prefer the term “functional replacement” to “de-extinction”. What may hatch is a hybrid: mostly Nicobar pigeon, spliced with fragments of dodo DNA, gestated in a chicken. And, really, I'd listen to biologists before I'd listen to me, here, and definitely before listening to a lawsuit-happy company with a vested interest in promoting their version of this. The “dire wolf” puppies unveiled in August 2025 turned out to be grey-wolf clones with a few genetic tweaks. Which all the biologists I follow were saying long before the puppies (more accurately cubs, but whatever) were born. 
And conservationists have warned that such announcements tempt society to treat extinction as something that is reversible, meaning it is less urgent to prevent endangered species disappearing. That's not a science problem; that's a PR problem. Even so, the pigeon breakthrough could pay dividends for living species... Germ-cell culture offers a way to bank genetic diversity without maintaining huge captive flocks, and eventually to reintroduce that diversity into the wild. Like I said, I'm not ragging on the science or the technological achievement, here. Just the claim that we're going to get dodos again, presumably so someone can put together a live-action Alice movie. For Mauritius, any return of dodo-like birds must start with the basics of island conservation. It will be necessary to eradicate rats (which preyed on dodos), control invasive monkeys and restore native forest. That's a rather tall order, considering that it's damn near impossible to eradicate rats. New York City tried. They even appointed a "rat czar," which I found extremely amusing, because it conjured up images of a giant Russian rat. Anyway, the rat czar resigned in disgrace, and the rat population of NYC continues to increase. But in the strictest sense, the actual 17th-century dodo is beyond recovery. Not just the strictest sense. In any sense (except that of marketing and PR). But, as I (and the article) keep repeating, there are other benefits to the research. I just feel bad for the living results of the process. |
| Sometimes, I wade into heavily controversial topics. Today, courtesy of some site called How-To Geek or How To Geek or How ToGeek or something, we have some potentially truly divisive assertions. I'll give 'em this much: at least they didn't call it "Top 7 Movies Better Than the Books." I’m an English professor, which means I spend a fair amount of my time uttering four magic words: “The book is better.” There are two types of English professors: those who accept that all genres can have value, and wrong ones. Fortunately, this guy seems to be one of the former. At least about writing in general. About the movies? We'll see. After all, books have hundreds of pages to flesh out detailed characters and build intricate worlds, and it can be tough to translate all of that into a two-hour movie. Yeah, well, even that is easier than summarizing it for TokTik, which is probably the only way to get anyone's attention these days. For 10 seconds at a time, anyway. However, every now and then, there comes a movie that is actually better (sometimes insanely better) than the book. I don't have an opinion on all of these. Sometimes, I didn't read the book. Sometimes, I didn't see the movie. Sometimes both, as is the case with the article's #7 (it's a countdown list like with Cracked). 6. The Lord of the Rings: The Fellowship of the Ring Wow. I hope this guy has good bodyguards and a paid-up life insurance policy. Okay, I can hear the Tolkien fans out there sharpening the daggers they got from the Barrows, but hear me out. Oh hell no, they're going for the nukes. However, the pacing of the book can be intimidating for beginners, especially those expecting a typical fantasy romp. Hell, man, Tolkien invented the typical fantasy romp. A dear friend once described what it felt like to read the Hobbits’ adventures for the first time as follows: “I’m going through the forest, I’m going through the forest … oh, look, mushrooms!” Okay, that's hilarious. 5. Jurassic Park It's been a long time since I saw that movie, and even longer since I read the book (I was on a Crichton kick for a while there, and I do remember that some of his books were better than others, by like a lot). As a kind of cautionary tale in a sci-fi wrapper, the book does a great job of exploring the ramifications of science let loose without the restrictions of morality. Here's where it's my turn to get angry. "A kind of cautionary tale in a sci-fi wrapper?" What the hell do you think science fiction is for? Nevermind, I'll answer that: at least 50% of it is meant to be a cautionary tale. 4. Jaws Another "it's been a long time" for both for me. Hell, I remember finding the book in my aunt's paperback stash when I was a young teen, visiting her with my parents. Since the cover featured a naked woman swimming, and you could barely make out a nipple, I felt like I had to keep it hidden from my aunt and my parents lest they accuse me of reading porn. I don't remember much else about the book, except that I did, indeed, read it. And watched the movie. Can't comment on this article's opinion, though. 3. Blade Runner Of all the book/movies on this list, this one is most relevant to me. Few movies are as different from their original source material as Blade Runner. The Philip K. Dick novel Do Androids Dream of Electric Sheep? features some of what movie fans might expect, including our protagonist (referred to as “bounty hunter” rather than “blade runner”) tracking down renegade robots (referred to as “androids” rather than replicants). 
For starters, Electric Sheep may or may not qualify as a "novel" if you go by word count, at least according to this source. But that's pedantic. Look at some of the other Dick stories at that link I just mentioned: "The Minority Report – 14,402" and yet it, too, was made into a feature-length film. Most interestingly, though, "The Man in the High Castle – 80,586." Novel-length by almost anyone's standards, it didn't become a movie, but a 40-episode TV series. But we're living in a world where a short novel like The Hobbit gets the three-movie expansion treatment, so, whatever. In this bleak dystopia, our hero must deal with heavy philosophical questions about what humanity and identity are, and what makes us different from the replicants he has been hired to kill. Which is, I say, one of the main purposes of science fiction (sometimes in conjunction with the "cautionary tale" thing above). I'm going to shut up about it now, though. By now, everyone who knows me is aware that the Director's Cut of that movie is my favorite of all time, but I'm not here to discuss it today. All I'll say is that since the book (novella or novel, whatever) and movie are so different, I think it's a little unfair to compare them. There's a couple more on the list, one famous for being a movie and the other famous for being a book. In neither case did I read the book, so no comment. |
| Here's a potentially misleading headline from PopSci. There's a persistent myth that's been floating around for as long as I can remember that the Inuit have multiple (the actual number changes and is ultimately irrelevant) words for snow. This is at best a misunderstanding. Or so the Wiki link there says, anyway. Regardless, my point is that this study seems to result in nine names for precipitation rather than the common three. Most of us generally think of precipitation in terms of three varieties: rain, snow, and sleet. Well, okay, but there's also hail. And frogs and dust and bird poo and locusts, but I'm willing to limit this discussion to some type of water falling from the sky. Which leads me to the simplest way to describe precipitation: water falling from the sky. It's a legitimate point of view to note that rain, sleet, snow, and hail are all forms of water. It's probably not very helpful, though. Even in hydrology, you need to know whether the water falls as snow and melts slowly (or not at all), or as rain and runs off quickly. The next obvious possible categorization, then, is: liquid or solid? This lumps sleet and snow into the same category, which, as far as I'm concerned, is absolutely fair. And it implicitly includes hail. What I'm getting at here is that this starts to look less like science and more like purely human definition. Like how big is a pebble? A rock? A boulder? The dividing lines are kind of arbitrary. Or how a planet is defined. Or hills vs. mountains, or oceans vs. seas. In fact, a team of researchers including NASA engineers spent almost a decade analyzing weather data to fine-tune these categories. Now, I am not saying this was a waste of resources. There are, as the article notes, good reasons to categorize things as they did. All I'm saying is that a different team of researchers, perhaps in a different country, could very well have come up with different categories. As they explained in a study recently published in the journal Science Advances, they aren’t trying to nitpick—they’re hoping to save lives. Another "categorization" example is the hurricane scale. It is, as far as I know, based on sustained wind speed. But the cutoffs are somewhat arbitrary. For example, look at Category 2 on the scale, here. One can tell at a glance that the Saffir-Simpson scale is American in origin, because it's based primarily on miles per hour. If a more civilized country had produced it, perhaps Cat 2 would have been 155-174 km/h. In practice, there's little difference between that and the current Cat 2 definition (96-110 mph, which works out to roughly 154-177 km/h), but it does give primacy to SI units rather than Imperial ones. And this isn't even getting into how that scale may not be adequate for risk assessment. The headline article also gives a temperature range in Fahrenheit. So, yeah, definitely US. It’s understandable to think that snow only enters a forecast when the temperature drops below freezing, but that’s actually not the case for meteorologists. Temperature changes with altitude, so no, I don't find it all that understandable to think that. The article describes the methods they used to record ground-level precipitation, and they're interesting, but I don't really have any comments on them. Except for this bit: To start, they installed a specially designed camera array from NASA called the Precipitation Imaging Package (PIP) at seven strategic sites across the United States, Canada, and Europe. I feel like the sample size here should be questioned.
While pretty much the entire world experiences precipitation—even Antarctica and the Atacama desert, to a very small extent—the patterns are different everywhere, and, moreover, the bulk of the world's population lives in Asia. If you're going to make categories based on how people are affected, maybe take into account the majority of the people? So, what are the nine technical categories to be on the lookout for this fall and winter? Oh, yeah, and they left out the entire southern hemisphere. You can read these categories at the link; they may be important to know, but they're irrelevant to the point I'm making here. Which is that ultimately, categorization, while sometimes useful, is a human activity subject to human biases. Also that frozen precipitation of any kind sucks ass. Admittedly, that's my bias. |
| Got a short one today, from Nautilus. The metaphorical microscope, anyway. This spring, I had the pleasure of sitting in a crowded school gym in middle America and watching my eldest son cross a stage and shake a few hands to mark his successful completion of middle school. I'm just going to say it: graduation ceremonies for anything but high school and college are a sure sign of a civilization in decline (as if we needed more proof). I'm not saying that because no one did it when I was a kid and we should stick to those ways because they were better (which is wrong), or out of jealousy (definitely not jealous). It's because after kindergarten, elementary, and middle school, further education is, at least for now, mandatory. High school might be the end of someone's formal education, and college probably is. But, you know, some of those middle schoolers will get shot, or OD, or commit suicide in high school and never make it to graduation, so I suppose there's some rationale for it. I digress. This article isn't about participation trophies, but the noises our hands can make. The organizers of my son’s ceremony sought to head off the outbursts of clapping and hollering that are wont to emerge at these affairs. Another reason not to have spurious graduation ceremonies: people simply cannot be trusted to act with restraint or consideration for others. But then something strange happened. Once the first student’s name was called, a small group of people (their family, I presumed) clapped in unison. One coordinated, staccato clap. Playful acquiescence to the rules. That's actually pretty cool. But then, as more names were read aloud, the joke reverberated. In short order, every announced name elicited that single, synchronized clap from the crowd. Of course. Monkey hear, monkey do. This emergent phenomenon got me thinking about lots of things: synchronous behaviors in humans, the sociology of groupthink, how sound travels in a gymnasium. This is how science gets started. So when a little publicized Physical Review Research study on the physics of clapping made the media rounds shortly afterward, I realized I hadn’t considered the acoustics of applause itself. And now I'm feeling a little shortchanged, because I wanted to learn more about "synchronous behaviors in humans, the sociology of groupthink, how sound travels in a gymnasium." But, okay, at least I'll learn something. Turns out that science had yet to fully elucidate the intricacies of sound produced by two hands brought rapidly together. So, we couldn't even answer "what is the sound of two hands clapping?" With an impressive variety of experimental methods—including recordings of clapping humans and models of soft plastic hand replicas—the researchers dissected the physics involved in several types of claps. The article's description of the science is brief and probably oversimplified, but it's good to see this kind of research being done. No, it's not cutting-edge physics or a new paradigm in biology, but the more we understand, the better off we are. One never knows when such seemingly mundane science will have applications in unexpected places. I'd end with a quip about applauding the scientists' efforts, but the article's author beat me to it. Sonofabitch stole my line. |
| It's not often I'll bother with History (as in The History Channel), because, every time I saw something from them, it was WWII or aliens or aliens causing WWII. Okay, that's not fair. There were also Secret Bible Codes, some of them put in there by aliens. I'll give 'em a chance with this one. Did the Trojan Horse Really Exist? Some writers have struggled to rationalize the Trojans' gullibility. Well, it's a legitimate question, I suppose. I think the importance of the Trojan Horse lies in its metaphor, whether it existed in consensus reality or not. Like with Eden or Atlantis. The story of the Trojan Horse has been celebrated for thousands of years as a tale of cunning deception... To the victor, as they say, go the spoils, as well as the ability to write history to make you look good and the enemy look like a bunch of fools. I'm going to assume everyone here knows about the Trojan Horse. If not, there's always the linked article. But at least one later Greek writer was struck by the gullibility of the Trojans in falling for this obvious ploy. The second-century geographer Pausanias described it as anoia—"folly" or "utter silliness." What might not be obvious from the article is that there's a 1500-year gap between the generally accepted time of the Trojan War and the time of Pausanias. 1500 years is, by any measure, a long time. Think about what you know about what happened fifteen centuries ago, during the sixth century C.E. Just why the Trojans were fooled by the Trojan Horse, without first checking inside it for enemy warriors, is more complicated than it may seem. Well, for starters, we know about the Horse due to an epic that included gods, sorcery, and an epic love triangle. Of those, the only thing I'd give any credence to is the love triangle, and even that was most likely exaggerated for effect. “But it’s myth,” Burgess says, adding, “The wooden horse is not nearly as strange or fantastic as most of the story.” Like I said. Homer doesn't actually say much about the Trojan Horse. As the article notes, that particular story was added to later on. Virgil gets mentioned, of course, but even Virgil was over a thousand years later. University of Oxford classicist Armand D'Angour, author of The Greeks and the New: Novelty in Ancient Greek Imagination and Experience, says archaeology indicates a war destroyed Troy VI—the sixth of nine ancient city layers discovered during excavations at Hisarlık near Turkey's Aegean coast. What the article seems to leave out is that the Troy excavation was the beginning of modern archaeology, and as far as I know, the best we can do is say this might have been Homer's Troy. That suggests Homer's epics contain echoes of true events, and the Trojan Horse may be one of them. Yeah, that's pretty damn common with myths and other ancient stories. The trick is figuring out which elements are factual and which are fictional. But even the fictional ones have meaning for us, which is what elevates it to mythological status. It's like historians trying to decide who the historical King Arthur was, or wondering who were the real Romeo and Juliet. Regardless of the answer, those stories are dug in deep in the soil of our collective consciousness, at least here in the West. "I like the theory that the 'horse' was based on the notion of a wooden siege engine covered in horse hides," D'Angour says. Back when I was in high school, trying to read Virgil in the original Latin, that's the interpretation I remember my teacher favoring. 
I don't know if it's true or not, but it tracks with what I know of the history of warfare, and it doesn't involve gods or monsters. There are also suggestions that Troy VI was destroyed by an earthquake, in which case the Trojan Horse could have symbolized such a disaster: Poseidon, the Greek god of the sea, was also the god of horses and earthquakes. And so I learned something new (to me) about Poseidon. D'Angour doesn't think the Trojan Horse was an earthquake, but he reasons there may have been some truth in the story. "What a feat of imagination that would be, if there were in fact no material counterpart," he says. And yet, humans are capable of such feats of imagination. We know this. Just read fiction or watch a movie. Humans haven't changed all that much in 3000+ years (though society and technology certainly have). According to Virgil, a Trojan priest of Apollo named Laocoön warned of danger, declaring "Timeo Danaos et dona ferentes”—Latin, which means “I fear the Greeks, even bearing gifts” in English. And I'm just including this bit because I've seen some confusion as to the origin of the phrase "Beware of Greeks bearing gifts." It wasn't Homer. And also so I can make this pun: if you made it this far, you know I don't really have an overarching point, so you may be disappointed. I don't charge for this service, though, so beware of geeks bearing gifts. |
| I was hoping for something less dense today, like maybe helium. But no, the random number gods have it in for me. From a source I don't remember ever linking before, cybernews: The more I hear about AI, the less I care. Okay, that's not really the case; I do care. It's just that, to paraphrase Malcolm Reynolds, my days of not ranting about it are definitely coming to a middle. Former CNN journalist Jim Acosta conducted an “interview” via “The Jim Acosta Show,” according to The Washington Post, which internet users and social media users have described as “ghoulish” and “disturbing.” I get why all those things in quotes are in quotes. It still made my eye twitch. The interview takes place on the journalist’s Substack and shows him talking to late teen Joaquin Oliver, who was killed in the 2018 Parkland high school shooting. Okay, you know the surrealist painting of a tobacco pipe with the French caption "Ceci n'est pas une pipe"? The intent, at least insofar as I understand it, is to point out that the image of something is not the thing. We take shortcuts, though, so if I showed you a picture of my cat and you said, "That's your cat?" I'd just agree. But the reality of it is that it's an image of what my cat looked like (probably extremely cute) whenever the picture was taken. Point being, cela n'est pas un étudiant (that is not a student). No, I'm not going to get into the difference between ceci and cela. Doesn't matter for this discussion. Oliver's parents reanimated the teen using artificial intelligence (AI) to discuss gun reform and gun-related violence in the United States. The thing is, when I saw the headline, I felt kind of a little bit outraged. How dare the interviewer do such a thing! It's a bit like setting up a strawman. But then I got to this part, where the parents did it, and then I'm like, "Huh. Now that's an interesting ethical dilemma." Because the kid wasn't a public figure, it feels wrong to approximate him with an LLM. For whatever reason, I don't have the same judgment about family doing it. More recently than the linked article, I saw a brief blurb about someone giving the late, great Robin Williams the LLM "resurrection" treatment, and his daughter spoke out against it. That feels different too, since he was a public figure. Oh, and no, they didn't "reanimate" the teen. Good gods, if you're going to do journalism, use better language. Yes, I know I sometimes don't do it, myself, but I'm not in the same league. Or even the same sport. “Oliver” responds in a way typical of an AI model: clinical and sterile. Media outlets have even compared the avatar’s responses to Yoda, as the model provides pearls of wisdom and asks Acosta questions that feel unnatural. Fuck's sake, that's because it's not actual AI, even if that's the accepted term for it. It's a Large Language Model. It's not like Data from ST:TNG, or even HAL 9000. Both of which are, of course, fictional. Nicholas Fondacaro, the associate editor of NewsBusters, a blog that attempts to expose and combat “liberal media bias 24/7,” spoke out against Acosta, dubbing the interview “very disturbing.” Why the hell should I care what that guy thinks? In the clip shared by Fondacaro, Oliver’s father tells Acosta that the teen’s mother spends hours asking the AI questions and loves hearing the avatar say, “I love you, mommy.” Okay, that's more worrisome, in my view, than an interview with the LLM. I can't even begin to comprehend the grief of a parent losing a kid, and I'm no psychologist, but that seems a rather unhealthy way to cope.
Acosta’s interview with Oliver caught the attention of many, including Billy Markus, the creator of Dogecoin, who simply said, “I hate this.” Another person whose opinion I can't give half a shit about. There's more at the article. There's probably pictures and X links, too, but to be brutally honest, I couldn't be arsed to play with my blocker settings to see them. Thing is, though: even if the consensus is that this is a Bad Thing, what can we do about it? Pass laws? What would such a law look like? "No using LLMs to pretend to be a dead person?" Then anyone who plays with it to, I don't know, "rewrite the script of The Fast and the Furious in the style of a Shakespeare play" would be breaking the law. I'd pay to see that, by the way. Just saying. Though I'm not a F&F fan. You could maybe hold it up as voluntary journalistic practice not to do such a thing, but these days, that doesn't mean shit because everyone's a potential journalist and many don't adhere to journalistic norms. About all we can do is judge them and shame them, which, well, refer to my entry from two days ago. Or, if you don't think this is such a bad thing (and I'm not trying to tell anyone how to feel about it here), then don't shame them. Still, I can say this: the use of LLMs was disclosed from the get-go. My biggest problem with what we're calling AI is when it's used without full disclosure. It is, I think, a bit like not putting ingredients on a food label: it takes important information away from the consumer. So, I don't know. For me, I feel like I feel about other forms of speech: if you don't like it, you don't have to watch or read it. I'd probably feel differently if it wasn't the parents who trained the LLM, however. I'm open to other interpretations, though, because I'm not an AI. Sometimes, I wonder if I'm even I. |