Items to fit into your overhead compartment |
All that text in my blog intro? It's nice to have some verification. From Mongabay:

The report focuses on Central America. As noted at the top of the article: "Unlike temperate regions with diverse scavenger communities, the neotropical forest system showed vultures as the primary vertebrate decomposers..."

I had to look up "neotropical," and apparently it simply refers to New World tropical zones. Anyway, point is, I guess, that other regions have other scavengers besides vultures.

“Absolutely disgusting, so grim, the worst fieldwork of my life, but also extremely rewarding in a very odd way,” said Julia Grootaers, describing her three months collecting data among rotting pig carcasses in Costa Rica’s Osa Peninsula.

I will not make a joke about her name. I will not make a joke about her name. I will not make a joke about her name.

Their findings, published recently in the journal Ecology and Evolution, reveal that in the absence of vultures, carcasses take twice as long to decompose, and fly populations double, with significant implications for ecosystem health and potential disease transmission.

It's probably good to note that there's more than one species of vulture.

The experiment consisted of 32 pig carcasses deployed in the southern Pacific region of Costa Rica, half in grassland and half in forest habitats. Eight carcasses were covered with exclusion cages for each habitat to prevent vulture access, while eight control carcasses remained uncovered. Half the experiment took place during the wet season and half during the dry season.

While I'm no expert, that sounds like a fair methodology.

One unexpected finding was how few vertebrate scavengers visited the carcasses, such as large cats or possums.

One might consider that those mammals/marsupials could have an aversion to carrion that's been handled by humans, as these carcasses were. That could be an unrevealed confounding factor. (Also, it's "opossum." Respect the Powhatan.)

Fly populations doubled at carcass sites without vultures, a finding with potential public health implications. Slower-decomposing carcasses could have important consequences for infectious and zoonotic (animal-transmitted) diseases in the tropics.

Flies are also important contributors to the ecosystem, but unlike vultures, they tend to land on your food and spread germs there.

This study is also significant because vulture research has almost exclusively concentrated on Old World species, those found in Africa, Asia and Europe.

And apparently, New World vultures represent an entirely different clade than the Old World vultures, not very closely related at all. Anyway, point is, disgusting though we find their habits, vultures are cool. And yeah, I couldn't resist the pun in today's entry title. How could I? |
From SciAm, an astronomically bad idea.

‘Space Advertising’ Could Outshine the Stars—Unless It’s Banned First

Astronomers are racing to protect the dark skies as private companies seek to place large advertisements in Earth orbit

Yes, I know there are worse things going on: human trafficking, slavery, rape, murder, war, celebrity gossip (to name but a few). That doesn't stop me from hating this as well.

Imagine stepping outside to stargaze on a clear summer night, only to see no stars but rather the garish glow of advertisements streaming across the sky.

As usual, science fiction came up with this first. The dystopia subgenre, anyway.

This seemingly science-fictional scenario isn’t actually implausible: private companies are inching closer to launching swarms of tiny maneuverable satellites to create billboardlike displays big and bright enough to be seen from the ground.

Just when you think we've reached peak capitalism, something like this gets floated. It's one thing to loft satellites up there to broadcast shows and provide internet connectivity, both of which result in a barrage of ads. But they're optional ads. You don't have to tune in or connect, and you can remain blissfully ad-free. This, however, would be inescapable, unless you just stay inside all the time.

The suddenly all-too-real prospect of large-scale space advertising prompted Piero Benvenuti, former general secretary of the International Astronomical Union, to raise the issue in February during a subcommittee meeting of the Committee on the Peaceful Uses of Outer Space (COPUOS), the United Nations body that governs the use of space for peace, security and development.

Speaking of internet connectivity, astronomers are already bitching about Starlink satellite constellations, which tend to be bright and get in the way of observations. Starlink (whatever your opinion of its company's CEO) at least serves a useful function, in theory, providing internet access in remote locations. This? This would serve no useful function to anyone who isn't doing the actual advertising (and even then, it's questionable).

“There is absolutely no reason why you should use space in such a useless way to advertise commercials,” Benvenuti says.

Well, I wouldn't say "absolutely no reason." Obviously, someone thinks there's a reason, and that reason is money.

In 2020 Russia granted Avant Space a patent for a laser-based technology to project messages, logos and other images for advertisers onto the sky.

Hey, look, actual space lasers. And they're not Jewish.

Their vision, Sitnikov says, is “to prove that space is not just for scientists, not just for the military—it is entertainment, too. And people like entertainment.”

It depends on the entertainment. I don't consider ads entertainment. I consider them an interruption of my entertainment. Yes, on occasion, there are entertaining commercials, but they are exceptions.

In 2000 such concerns helped to spur the U.S. Congress to pass a federal law that banned the issuance of launch licenses to companies for the purpose of ferrying payloads for obtrusive space advertising.

That's nice and all, but at my last count, there are at least six countries and one European Union with their own space capabilities, and the US is only one of them.

This region of space around Earth is home to thousands of defunct rocket stages, dead satellites and discarded hardware that all zip around our planet at dangerously high speeds.
On the plus side, maybe this orbital debris can finally have a good purpose: destroying the ad lasers. But, as the article notes, such collisions would create even more debris.

In case you can't tell, I'm completely against this idea. I hate ads to begin with, and appreciate astronomy (not to mention the simple beauty of the night sky, which is hard enough to see from most places now).

Hey, maybe the US Space Force can finally get something to do: take down the ads. |
The random number generator laughs at me once more. Here's another bit about happiness, this one from last year in Knowable.

Scientists scrutinize happiness research

From meditation to smiling, researchers take a second look at studies claiming to reveal what makes us happy

"Claiming" being the key word there. As if the answer is the same for everyone.

We all want to be happy...

[Citation needed]

...and for decades, psychologists have tried to figure out how we might achieve that blissful state.

Maybe it's by not paying any attention to psychologists?

But psychology has undergone serious upheaval over the last decade, as researchers realized that many studies were unreliable and unrepeatable.

This is my shocked face.

Here’s what we know so far, and what remains to be reassessed, according to a new analysis in the Annual Review of Psychology.

I'm skimming a bit. I'm late getting to this today, and tomorrow's entry may be early; plus, I just ragged on happiness research yesterday.

One long-standing hypothesis is that smiling makes you feel happier.

Spoiler: questionable, unverified. Which, again, absolutely shocks me (in a sarcastic way), because if there were ever a perfect example of confusing cause and effect, this would be it. I don't doubt it works for some people. But again, not everyone. For me, if I had to paste a fake smile on my face all day (say if I had to work a ret-hell job), someone would end up getting punched.

Researchers have also found that external agencies can promote people’s happiness. Giving people cash promoted life satisfaction, as did workplace interventions such as naps.

Huh. By absolute coincidence, having money and taking naps make me happy.

The researchers didn’t find clear evidence of benefits for volunteering, performing random acts of kindness or meditation.

I take it they also didn't find those things decreased happiness, so if you want to do them, do them.

Dunn and Folk didn’t find any preregistered studies at all on exercising or spending time in nature, two oft-recommended strategies.

Again, just me here, but I find that exercise has other benefits; spending time in nature, on the other hand, just means I have to check myself for ticks afterwards. It does make me appreciate my nice comfortable house and bed more, so I suppose there's that.

Anyway, most of the article is about applying greater rigor to psychology studies, which is probably a good thing overall. And that's probably all I have on happiness for a while. Maybe. Hopefully. |
I know I've touched on this theme before, but I don't think I've shared this particular article. It's from BBC, and it's a few years old.

Why our pursuit of happiness may be flawed

It is an emotion linked to improved health and well-being, but is our obsession with being happy a recipe for disappointment, asks Nat Rutherford.

Well, for starters, what's a Brit doing talking about a concept enshrined in the founding documents of the rebel colonies? Okay, fine, I'll give them a pass on that one.

Perhaps you want to spend more time with your family, or get a more fulfilling and secure job, or improve your health. But why do you want those things? Chances are that your answer will come down to one thing: happiness. Our culture’s fixation on happiness can seem almost religious.

By "our," I don't know if he's talking about British, Anglophone, or generally European and its derivatives. Because not all cultures are obsessed with happiness, but it does seem to be a Western thing.

It is one of the only reasons for action that doesn’t stand in need of justification: happiness is good because being happy is good. But can we build our lives on that circular reasoning?

As regular readers may remember, I distrust "happiness" as a goal. I think it's what happens (yes, those words, happy and happen, share the same root, an Old Norse word that meant something like "luck") when you're doing other things.

A survey in 2016 asked Americans whether they would rather "achieve great things or be happy" and 81% said that they would rather be happy, while only 13% opted for achieving great things (6% were understandably daunted by the choice and weren’t sure).

Fortunately, it's not a binary choice in reality. Neither is the choice between wealth and happiness. The idea that rich people are miserable while poor people are happy is a lie we tell poor people to keep them from getting too uppity.

There is some evidence that the obsessive pursuit of happiness is associated with a greater risk of depression.

While I don't trust "some evidence" necessarily, this tracks for me.

In his recent book, The Enlightenment: The Pursuit of Happiness, historian Ritchie Robertson argues that the Enlightenment should be understood not as the increase in value of reason itself, but instead as the quest for happiness through reason.

Oh look, it's a book ad. That should make the author happy. Or rich. Or both.

It’s easy to assume that happiness has always been valued as the highest good, but human values and emotions are not permanently fixed. Some values which once were paramount, such as honour or piety, have faded in importance, while emotions like "acedia" (our feeling of apathy comes closest) have disappeared completely.

From what I understand, honor (or honour, depending on your geographical location) is still paramount in some cultures. Not just Klingon, either.

Self-help books and "positive psychology" promise to unlock that psychological state or happy mood. But philosophers have tended to be sceptical of this view of happiness because our moods are fleeting and their causes uncertain. Instead, they ask a related but wider question: what is the good life?

I believe Conan the Barbarian answered that question definitively: "To crush your enemies, see them driven before you, and to hear the lamentations of their women." In short, I'm pretty sure the answer is different for each person. For example, lots of people find having children makes them happy (or at least claim it does). For me, that would be the very definition of Hell.
One answer would be a life spent doing things you enjoy and which bring you pleasure. A life spent experiencing pleasure would, in some ways, be a good life. But maximising pleasure isn’t the only option. Every human life, even the most fortunate, is filled with pain. Painful loss, painful disappointments, the physical pain of injury or sickness, and the mental pain of enduring boredom, loneliness, or sadness. Pain is an inevitable consequence of being alive.

Oh, you've been listening to Buddhists? Yeah, life has its ups and downs. In my view, the downs help us appreciate the ups.

Studies have shown that having loving attachments correlates with happiness, but we know from experience that love is also the cause of pain. What if pain is necessary and even desirable?

Yeah, no, not unless you're a masochist (not that there's anything wrong with that). But there's something to be said for purposely enduring the painful parts in hope that things will improve. Like getting a root canal, known to be painful and boring (that's a pun, by the way) in the short term, expecting that your toothache will go away.

Less dramatically, all the good things in life entail suffering. Writing a novel, running a marathon, or giving birth all cause suffering in pursuit of the final, joyous result.

I question those examples, especially the last one, but I wouldn't know. Well, except for the "writing a novel" part. I didn't suffer while writing mine; it was challenging, but I enjoyed the process. The "suffering" happened when I went to edit.

Friedrich Nietzsche, in The Genealogy of Morals, saw that we do not merely endure pain as a means to greater pleasure because "man…does not repudiate suffering as such; he desires it, he even seeks it out, provided he is shown a meaning for it, a purpose of suffering". In Nietzsche’s view, pain is not alleviated through pleasure, but instead through meaning.

Ah, well, too bad there's no meaning, then.

The American philosopher Robert Nozick came up with a thought experiment to make the point. Nozick asks us to imagine a "machine that could give you any experience you desired". The machine would allow you to experience the bliss of fulfilling your every wish. You could be a great poet, become the greatest inventor ever known, travel the Universe in a spaceship of your own design, or become a well-liked chef at a local restaurant. In reality though, you would be unconscious in a life-support tank. Because the machine makes you believe that the simulation is real, your choice is final.

One wonders if he came up with these things before, or after, Star Trek's holodeck and The Matrix, both of which came out during his lifetime (yes, I looked up his bio).

Would you plug in? Nozick says you wouldn’t because we want to actually do certain things and be certain people, not just have pleasurable experiences.

Okay, Nozick wouldn't. As I mentioned above, I'm pretty sure there are people who would. If the simulation feels real in every way, what's the difference? That you won't be remembered by history, like Einstein or Curie? News flash: most of us won't, anyway. But this touches on my own philosophical point, which is that we get happiness not by aiming for it, but through accomplishment.

Nozick’s experience machine aimed to disprove the essential claim of utilitarianism, "that happiness is desirable, and the only thing desirable, as an end".

But I can't fault the guy for railing against utilitarianism.
Dissatisfaction, unhappiness, and pain are part of the human condition and so "it is better to be a human being dissatisfied than a pig satisfied", according to Mill. He continued to believe that happiness was deeply important, but came to see that aiming at happiness will rarely lead to it.

Fuck me, I agree with John Stuart Mill about something. Shoot me now. (In my defense, with this philosophy, he contradicted his own earlier works.)

What Mill recognised was what Aristotle had argued two millennia earlier – the passing pleasure of happiness is secondary to living a good life, or of achieving what Aristotle called eudaimonia.

Why'd he have to name it in Greek? Oh... right.

Eudaimonia is difficult to translate into our contemporary concepts. Some, like the philosopher Julia Annas, translate it directly as "happiness", while other scholars prefer "human flourishing". Whatever the translation, it marks a distinctive contrast to our modern conception of happiness.

Literally, I believe it translates to something like "good spirit," but the problem with that translation is that "good" and "spirit" have multiple definitions. For instance, for me, Scotch is a good spirit. But I think the sense is more like virtue and a pursuit of perfection (though without expecting actual perfection). Virtue, also ill-defined and culturally relative, has fallen out of favor as a goal, replaced by the selfish "happiness."

Like our modern conception of happiness, eudaimonia is the ultimate purpose of life. But unlike happiness, eudaimonia is realised through habits and actions, not through mental states. Happiness is not something you experience or obtain, it’s something you do.

It's not necessarily the "ultimate purpose of life," but okay.

There's more at the article, of course, but I've said what I needed to say. I think that, while the author is better-versed in philosophy than I am (and British), we've reached similar conclusions.

And that makes me happy. |
Mother's Day has come and gone, here in the US, with my usual avoidance of anything related to it and ritual blocking of any business that emails me with MD promotions. This article, from Atlas Obscura way back in 2018, can be an exception.

The Ultimate Guide to Bizarre Lies Your Mom Told You

Turns out mothers all over the world are telling a lot of the same outrageous fibs.

One particularly famous parental fib involves avoiding tough conversations with your kids about death. Your dog gets run over by a truck, so you tell the kids he went to live on a farm where he'd be happy running around outside all the time. Well, that wouldn't have worked for us because we lived on a farm. And then they have the chutzpah to tell the kid that lying is bad and you shouldn't do it.

Being a mom is a tough job, in large part because you just can’t reason with small children. What you can do, however, is lie to them. In honor of Mother’s Day, we asked Atlas Obscura readers to send us the most outlandish white lies their mothers ever told them. As it turns out, moms all over the world are telling some wonderfully inventive lies.

I doubt many of them are "inventive." They were probably passed down from their own lying mother, and so on. Some do, however, have modern twists.

Many mothers still tell variations on the classics: If you make a funny face, it will stay that way; if you eat before you swim, you’ll get cramps (or die); moms have eyes in the backs of their heads, and so on.

Calvin and Hobbes did a great take on the funny face thing.

We couldn’t include all of the fantastic entries we received, but we’ve collected over 100 of our favorites below.

Clearly, I won't be commenting on all of them here.

“In order to keep us kids from stealing pennies from water fountains, my mother told us the water was electrified and we would die.” —G. Johnson, Georgia

Yeah, I would have still had to find out for myself. That's the kind of kid I was.

“Mom always knew when I was fibbing. She said she could tell because I had a black mark on my forehead. My grandma used to say the same thing. I would run to the mirror to see it, but it was never there. They said I couldn’t see it because fibbers eventually go blind! I was scared to death.” —Batzion, Chicago, Illinois

Would have been funnier if Grandma were blind.

“Eating the crusts of your bread will give you curly hair.” —Rosie, Farnham, United Kingdom

That seems to be a common one, coming from both UK and US sources. This is the first I've heard of it. Unlike apparently some kids, I never had a problem with bread crusts. Of course, the crap they passed off as "bread" (mostly Wonder brand) never had a real crust. My only hardline objection was I wouldn't eat the end slices of the loaf, and, as an adult, I still avoid them. But it hardly matters because I prefer real bread with firm, chewy crusts.

“Eating end of a bread loaf will help to grow breasts.” —Elina, Latvia

This is not why I avoided the ends, but it's hilarious.

“I wanted a pet very badly and my mother told me that if I could put salt on the tail of a bird, I’d be able to catch it. Hours were spent outside with the salt shaker and various homemade traps.” —Anne Falbowski, Colchester, Connecticut

"Mission accomplished." -Anne's mom, presumably.

“Not to play in rain puddles. Will get polio.” —Maryann Kelly, Boston, Massachusetts

I don't know about polio, but I don't doubt you can catch something from playing in rain puddles. Tetanus, perhaps.
“You get canker sores if you pee off a bridge.” —Stacey Henrikson, Rochester Hills, Michigan

I had many questions, until I remembered that Stacey is also a boy's name.

“My mother told me if I bit my nails, a hand would grow in my stomach.” —Mary Pagone, Los Angeles, California

Lie? Yep. Brilliant and effective? Also yep.

“Don’t let your umbrella open inside the house or your mommy is going to die.” —Norton McColl, Sao Paulo, Brazil

I expect Teenage Norton wasted many hours opening and closing an umbrella in the house, to no avail.

“That there was a man that traveled around town and he would chop off your middle finger if you used it to make crude hand gestures.” —G. Johnson, Georgia

"But I need that one!"

“To deter my brother and me from eating my mom’s delicious homemade chocolate chip cookies she told us the extra crunch to them were frog legs. Really they were walnuts.” —Jen, California

I should be offended on behalf of the French for that, but they already do "offended" so well.

“Never go swimming in the pool/ocean after eating watermelon (common parental lie in Israel).” —Sharon, Israel

One wonders why it's watermelon in particular. I got the "don't go swimming after eating" warning, but for everything.

“My mom told me that sugary foods had little bugs on them, and the bugs liked to eat teeth, but if I brushed, then it would take them off.” —Adam Drew, Calgary, Canada

I mean, as parental fibs go, that one's not far from the truth.

“Everything on the ice cream truck is poison.” —Jon Thierry, Dearborn, Michigan

That one too. Delicious, delicious poison.

“My pet chickens and rabbits had gone ‘to the farm’ when in fact my former farmer Dad had turned them into dinner.” — Pat, Arlington Heights, Illinois

Well? Someone's gotta keep therapists in business.

“She told us that if you kissed your elbow you would turn into a boy.” —Tara Bryan, Flatrock, Newfoundland

It's a little easier to do that nowadays.

“For as long as I can remember when we would drive to Rhode Island, she would tell me that the forest rangers used giraffes to prune the trees. I would always be looking in just the wrong direction and miss seeing one as we went by.” —Edward P. Steele, Connecticut

That's a prank, and a really funny one at that.

“My mom told me that the gum spots on the sidewalk were actually blood from the kids who didn’t look before crossing the road.” —Ava Moody, Fort Worth, Texas

And that one's brilliant.

Now, I'm not saying that lying to kids is always a bad thing. (Spoiler: this week's Fantasy newsletter will be about lies.) But you gotta admit, some of them are meaner than others. Lots more at the link, no lie. |
A few days ago, we had the article about a correlation between hairiness and the speed of wound healing. Well, this one, from PopSci, talks about the hair part.

How is head hair different from body hair?

There's a reason you can't grow your armpit hair to your belly button.

You know if we could, someone would turn it into a fashion statement.

Hair can be curly, straight, thick, thin, brown, black, blonde, or auburn. It can be long or short, frizzy or lush.

The musical.

We have two types of hair, says dermatologist Elizabeth Houshmand. Vellus hairs, or “peach fuzz,” cover virtually our entire body but aren’t easy to see. Our head, chest, armpit, and pubic hair consists of terminal hairs. These are thicker and darker.

The author forgot nose and ear hair in the latter category.

But not all terminal hairs are alike. For example, the hair on our head can grow far longer than that on the rest of our body. To understand why, we have to dive deep into our skin.

Phrases like that really get under my skin.

The article goes into a brief bit of scientific detail, then:

But bald men can still retain thick body hair. Radusky, who has worked on clinical trials for hair loss conditions, explains this is due to the conversion of testosterone as we get older. An enzyme called 5-alpha reductase changes the hormone into dihydrotestosterone.

And so we see how another problem is caused by testosterone.

The article's pretty short (unlike my hair) and, I would hope, uncontroversial, so I don't have anything else to say. I'm sure people have something to say to me, though, like "Get a haircut!"

To which I can only reply: No. |
This is a pretty long article from Vox, though I'll try to keep my commentary brief. Also, it's from December, so some of the information is already outdated, and I'm not always sure which information.

You’re being lied to about “ultra-processed” foods

Coverage of the latest nutrition buzzword is overly broad, arbitrary, and wildly misleading. The problem goes deeper.

Yeah, we're being lied to. We're always being lied to. Sometimes it's malicious; sometimes it's advertising; sometimes it's both.

“New research,” the Washington Post reported in June, “found eating plant-derived foods that are ultra-processed — such as meat substitutes, fruit juices, and pastries — increases the risk of heart attacks and strokes.” “Vegan fake meats linked to heart disease, early death,” the New York Post declared. There was just one problem: The narrative was totally fake.

Meat industry propaganda detected.

Robert F. Kennedy Jr., Donald Trump’s pick to lead US health policy, promises to crack down on ultra-processed foods and has called plant-based meats instruments of corporate control over our food system and humanity.

As opposed to the existing corporate control over our food system? You think all the USDA dietary guidelines are science-based? Think again.

The American food environment is unhealthy and disease-promoting, and the food industry bears much of the blame.

When I was taking a walking tour of Brussels, the tour guide pointed out, "You Americans eat like you get free health care!" No arguments here.

But the framing of that University of São Paulo–Imperial College study, and the promotional materials associated with it, might have made it easy for reporters to misunderstand what the research really found.

I know we say there's "lies, damned lies, and statistics," but there's also innocent mistakes and repeating something that you're convinced is true, but isn't. For example, "lies, damned lies, and statistics" is usually attributed to Twain, but he attributed it to Disraeli, but there's no evidence that Disraeli ever wrote that, and in the end, we're really not sure who coined the phrase. Attributing it to Twain may not be a lie, but it might be an innocent mistake.

If you’re confused, don’t feel bad — some of the world’s top nutrition experts are, too. “You look at these papers, and it’s still very hard to pin down what the definition [of ultra-processed] really is,” Walter Willett, a professor of epidemiology and nutrition at Harvard, told me.

Seems there's always a villain in the food story. I remember when the Bad Guy was fat, then carbs, then gluten, and now UPFs. Things are never simple when it comes to nutrition science, and none of these things are wholly evil. While I don't doubt that a carrot is "good for you" while a Chee-to of the same color is "bad for you," I'm not convinced the problem is processing in and of itself. Some foods absolutely need to be processed to be edible, and while most of us aren't in the kind of situation where we'd have to eat those foods, people have been processing meats, vegetables, and fruits for preservation for at least centuries.

This is further complicated, as I hinted above, by the lobbying and promotional efforts of corporations who want to convince you to eat their packaged food as opposed to the other guys' packaged food (or, for that matter, a carrot).
The relevant question about a novel scientific concept is not whether it happens to correlate with stuff we already know is true, but whether it adds something genuinely new to our knowledge, without also being wrong about a bunch of other things, as New York University environmental scientist Matthew Hayek pointed out to me. UPF, at least so far, doesn’t seem to clear that bar — it casts a net that manages to be overbroad while excluding some unhealthy forms of processing that have been around longer.

And I'm including this bit because I wanted to remember it for other science articles.

Having said all that: I get it. It feels intuitive to think there is something fundamentally not right about ultra-processed foods. I can understand why people would be freaked out by a vegan burger that looks and tastes like meat. I shudder at the junk that was normal for kids to eat when I was growing up — Gushers, Fruit Rollups, Coke — and think: That is not food.

No, that's marketing. And no, I'm not immune; I just try to recognize it as such.

The breadth and ambiguity of the campaign against “ultra-processed” foods make it vulnerable to sloppy thinking and manipulation by pseudoscience purveyors like RFK Jr. Combine that with a political climate in which multiple red states have banned cell-cultivated meat and meat producers seize every opportunity to thwart plant-based competitors, and you can imagine how plant-based meats could be targeted by an unprincipled, politicized application of ultra-processed food research.

I'm glad the author doesn't mince words here. After all, mincing is a form of processing.

There's a lot more at the link. But let's not undersell the social impact of the food argument; one of the basic things that holds people together as a community is eating. This is one reason so many religions and cults have dietary constraints of one sort or another: it sets them apart from the rest of humanity. If you can get people to battle each other over what we should and shouldn't be eating, you can control them more easily through a "divide and conquer" strategy. And that seems to be what's happening. |
While "fascinating" is a value judgement, and "facts" is questionable, I thought this Mental Floss thing was interesting enough to share. I just don't fully trust the accuracy. Well, first of all, did you know it's in the mint family? Oh, wait, you said time, not thyme. Did you know that a day on Earth used to be around six hours shorter than it is today? Yeah, like, billions of years ago. Or that Julius Caesar once implemented a 445-day-long year? That's not about time. That's about timekeeping. It's not like he slowed the Earth's orbit down, or that it affected anyone outside the Empire. Don't worry; I'm not going to comment on every single point. 1. Every person on Earth is living in the past. Our brains don’t perceive events until about 80 milliseconds until after they’ve happened. This fine line between the present and the past is part of the reason why some physicists argue that there’s no such thing as “now” and that the present moment is no more than an illusion. Which is what I've been saying all along. Except I'm shying away from using the word "illusion" (time itself is most definitely not an illusion) because it's been misused. I'd say the present moment is an infinitesimal. 2. Throughout history, different cultures around the world have experienced time in different ways. Those who read languages that flow from right to left, such as Arabic and Hebrew, generally view time as flowing in the same direction. The Aymara, who live in the Andes Mountains in South America, consider the future to be behind them, while the past is ahead. In their view, because the future is unknown, it’s behind you, where you can’t see it. Some Indigenous Australian cultures, which rely heavily on direction terms like north, south, east, and west in their languages, visualize the passage of time as moving from east to west. Newsflash: different cultures conceptualize things differently. 4. Science has a number of different ways of defining time. To cover just a couple: There’s astronomical time, which is measured in relation to how long it takes Earth to rotate on its axis. In astronomical time, a second is 1/60th of a minute. And then there’s atomic time, which dictates the numbers that you’ll see on a clock. I feel like this section is misleading. There's only one official way to define a second, as the article goes on to note. The duration of one second was based on the length of an average solar day, but, as implied up there, the length of a day is increasing over geological epochs, and at some point, the Earth's solar day will average more than the current 86,400 seconds. To complicated matters further, there's the sidereal day, which is the amount of time it takes the Earth to rotate with respect to some other star; this is different from a solar day because of the Earth's orbit. 8. Gravity is also the reason why our days are getting longer. Over a billion years ago, a day on Earth lasted around 18 hours. Our days are longer now because the moon’s gravity is causing Earth’s spin to slow down. A bit simplistic, but okay. In Earth’s earlier days, the moon wasn’t as far away, which caused Earth to spin much faster than it currently does. Causation reversal. Flag on the play. 15 yard penalty. 9. There are two ways to think of the length of a day on Earth. Though you probably learned that one day on Earth is 24 hours, it actually takes the planet 23 hours, 56 minutes, and 4.0916 seconds to rotate on its axis. Which I know I already said, but I'm glad the article acknowledged it. 12. 
And Caesar took all the credit, as usual. To be fair, "Julian calendar" is a lot easier to say and spell than "Sosigenean calendar" would have been.

13. But Sosigenes made a bit of a miscalculation, so the calendar continued to be a little off.

Yeah, well, this was over 2000 years ago, so I can forgive the miscalculation. It eventually led to George Washington having two birthdays.

16. Even with the advent of standardized time, people still struggled to keep their clocks in sync.

One London family used this to their advantage, and made a living by selling people the time.

That story is definitely interesting.

23. Sundials read differently depending on the hemisphere you’re in.

This one should be obvious, but not everyone thinks about it.

Our concept of “clockwise” is based on the way sundials in the Northern Hemisphere told time.

Yet another example of the Northern Hemisphere hegemonic conspiracy.

There's more at the link, as one might expect. Most of these aren't about time, though; they're about timekeeping, which is not the same thing.
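As promised, the back-of-the-envelope check on that solar-versus-sidereal business, as a little Python sketch. The simplification is mine, not the article's: assume a roughly circular orbit, and note that over one year the Earth spins once more relative to the stars than the number of solar days, because the trip around the Sun itself contributes one apparent rotation.

```python
# Derive the sidereal day from the solar day.
# Simplifying assumptions (mine): a circular orbit and a tropical year
# of 365.2422 mean solar days.

SOLAR_DAY_S = 86_400.0          # mean solar day, in seconds
SOLAR_DAYS_PER_YEAR = 365.2422  # solar days in a tropical year

# One extra rotation per year relative to the stars means each
# sidereal day is slightly shorter than a solar day.
sidereal_day_s = SOLAR_DAY_S * SOLAR_DAYS_PER_YEAR / (SOLAR_DAYS_PER_YEAR + 1)

hours, rem = divmod(sidereal_day_s, 3600)
minutes, seconds = divmod(rem, 60)
print(f"Sidereal day is about {int(hours)}h {int(minutes)}m {seconds:.1f}s")
# Prints roughly 23h 56m 4.1s, matching the figure quoted in the article.
```

Close enough for blogging purposes; the real number wobbles a little, because the Earth is not a metronome. |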
An interesting article from Smithsonian today, demonstrating that evolution involves trade-offs.

Human Evolution Traded Fur for Sweat Glands—and Now, Our Wounds Take Longer to Heal Than Those of Other Mammals

Even compared to chimpanzees, one of our closest relatives, humans’ scrapes and cuts tend to stick around for more than twice as long, new research suggests

I'm sure you realize by now that "research suggests" is a flag that this isn't (yet) a high-confidence finding. That's okay. It doesn't really affect our day-to-day lives, like cancer research or nutrition science.

In experiments, human wounds took more than twice as long to heal than those on other mammals—including chimpanzees, which are one of the closest relatives to Homo sapiens.

"What's your research about?" "We're going to cut chimps and see how long it takes for them to heal." This is why we have ethics committees, folks.

To gather data on the other mammal species, they anesthetized and surgically wounded captive lab mice, rats, olive baboons, Sykes’ monkeys and vervet monkeys. They also studied naturally occurring wounds—mostly caused by fighting—on five captive chimpanzees.

In case you were wondering about the ethics.

Researchers were not entirely surprised by the results, because skin healing is affected by hair. The follicles at the root of each hair contain stem cells, which, in addition to producing hair, can grow new skin when necessary. Since humans have much less hair than other mammals do, it makes sense that our wounds would also take longer to heal.

As we also know by now, just because something "makes sense" doesn't mean it's true. But it does mean possibly less chimp-cutting in the future, because now they don't have to figure out why some result doesn't make sense.

“When the epidermis is wounded, as in most kinds of scratches and scrapes, it’s really the hair-follicle stem cells that do the repair,” says Elaine Fuchs, a biologist at Rockefeller University who was not involved with the research, to the New York Times’ Elizabeth Preston.

Now, that does raise a couple of questions in my mind.

First: humans have places without hair follicles (we have them pretty much all over, but they only produce really thin hair). So what about comparing similar cuts from those places with the hair-having places? Like, we don't have hair follicles on our palms (well... YOU don't), so maybe compare a cut there to one on the forearm, which does have hair? (Unlike other animals, humans can consent to being used in research, up to a point.)

Second: humans aren't the only mostly-hairless mammals. Elephants come to mind, though they have a famously thick skin, more resistant to wounds. Seals, whales, hippopotamice, and other aquatic or semi-aquatic mammals are left out of the study. (To be clear, the "aquatic ape" hypothesis has been pretty thoroughly debunked, so no, we're not technically semi-aquatic.) Yes, "hippopotamice" is a made-up plural. I do that sometimes.

At some point in the evolutionary journey, humans lost most of their body hair.

Eh, as I said, not really; it just... changed. There's also a difference that's correlated, though not perfectly, to natal sex. This is where evolutionary trade-offs come in: whatever the reason for the change (there are hypotheses, my preferred one being sexual selection), if this research is on the right track, the "longer heal time" thing might be a by-product, because it certainly doesn't seem to increase survivability.
It also doesn't significantly decrease it, or it wouldn't have happened. One might be tempted to guess that men kept more body hair because early men got injured more than early women, but I'd shy away from such speculation.

It’s not entirely clear why early humans lost their hair. But it seems our species swapped the once-abundant hair follicles for sweat glands, which are not as efficient at healing wounds but help keep us cool in hot environments.

More evidence for my assertion that humans aren't meant to live in cold environments and that anyone who prefers the cold is an aberration.

In theory, this slower wound healing rate should have put humans at a disadvantage. But the researchers speculate that support from friends and family, as well as the use of certain plants as medicine, helped humans survive.

And I'm including this bit to support my (more serious) assertion that it's cooperation that got us to where we are today, for better or for worse. And also to address my "doesn't seem to increase survivability" bit above. Though, as noted, it's still speculative.

There was a bit of buzz a few months ago, as I recall, when observers saw an orangutan using a medicinal plant; this reminded me of that.

Like I said, interesting article and conjectures, despite the animal-rights angle. I'd say I'd like to see more, but while I don't object to eating meat, I feel like the true meaning of the cliché "curiosity killed the cat" is that scientists got out of hand with their curiosity. |
The Guardian asks the tough questions. From last year:

The cult of 5am: is rising at dawn the secret of health and happiness?

It has been called the morning miracle – getting up before everyone else and winning the day. But does it actually make you more productive and focused?

I'm just going to address the obvious first: if "everyone else" got up at 5am, you'd have to get up at 4am to awaken before them. Then there'd be articles about the wonders of waking up at 4am. And people will buy into it, and soon you'll have to wake up at 3. And so on, in a never-ending cycle of backing off in time. But that's about as realistic as everyone suddenly turning vegan. I just wanted to call out the logical fallacy.

It is 5.15am and I am walking down my street, feeling smug. The buildings are bathed in peachy dawn light. “Win the morning and you win the day,” suggests productivity guru Tim Ferriss. The prize is within my sights: an oat-milk latte, my reward for getting up ridiculously early.

This implies that the oat-milk latte is to be acquired from a coffee shop, not the author's own kitchen. This means, wait for it: the coffee shop is already open. This further implies that the workers there have awoken even earlier in order to get the magic beans, or whatever, prepared. Are the baristas "winning the day?" Or have you already lost because you've slept in later than they did? Or are they just NPCs to your main character?

On to the deserted six-lane high street where supermarket delivery vans and the occasional bus are the only signs of life.

More NPCs.

There is no coffee to be had at any of the eight shuttered cafes I pass...

Oh, so I guess the baristas got to sleep in, after all.

...so I head for a patch of green space to meditate.

Which is about the same thing as sleeping, so what's the point? Other than smugness.

Why am I doing this? Because, in an attempt to become one of the elite superbeings who are members of the 5am club, I am trying a week of very early starts.

It's okay to try something new. Doing so may even provide a temporary mood boost, as you are more deliberate in your actions and discovering new things, like the cafés being closed in this case. But doing it because celebrities are doing it? Or because some soi-disant "guru" says you should? I'm not impressed.

To a sceptic, there is a degree of magical thinking to much of this. If you can just do this one thing – get out of bed while others snooze – you will have time to get fit, eat healthily and achieve all your goals.

Again, if everyone did it, well, that infinite regression is something I've already covered. Also, the "you will have time" part is negated unless you get less sleep by going to bed at the usual time, because there are only 24 hours in a day (absent things like time zone travel or the clock switches in spring and fall). And getting less sleep isn't healthy for most of us, as I've noted before. So, yes, I'm skeptical (look, I understand British spellings, but outside of quotes, I'll generally use American English). And yet, as I said, changing one's routine can also have benefits. Sometimes that's what it takes to shoehorn in a workout or time to cook or whatever.

Ordinarily, I get up at 6.30am without an alarm. I am not at my best at this hour. I mainline instant coffee and doomscroll for 90 minutes, and then it is time to get ready for work.

Ah. I think I see the real problem.
When I was working, both blue and white collar jobs, I never had time in the morning for more than shower, dress, quick bite (maybe) and commute. The few times I woke up earlier than absolutely necessary (by choice or not), I didn't see the point in doing anything else before work. Of course, this was before "doomscrolling."

At 4.50am, my alarm, set to Arcade Fire’s Wake Up, blares out of my phone at top volume. There is a thud from above: I have accidentally recruited my neighbour into the 5am club.

On behalf of your neighbor: Piss off, wanker!

I decide to do some meditation, which is lovely, but 40 minutes later I have pretty much dozed off.

What'd I tell you?

There follows some description of the author's attempt, and then:

Why is this so hard? I put the question to Russell Foster, head of the Sleep and Circadian Neuroscience Institute at Oxford University. But he wants to know why I would want to sign up for the 5am club in the first place. To say he is scathing about the fetishisation of the early start would be an understatement. “There’s nothing intrinsically important about getting up at 5am. It’s just the ghastly smugness of the early start. Benjamin Franklin was the one who started it all when he said, ‘Early to bed, early to rise makes a man healthy, wealthy and wise’ and it’s been going on ever since. It goes back to the Protestant work ethic – work is good and if you can’t or won’t work, that is, by definition, bad. Not sleeping is seen as worthy and productive.”

As I've noted before, Ben Franklin was an epic troll, and I insist that this particular quote was satire. And yet, I have to agree with "it's just the ghastly smugness of an early start." Brits use "ghastly" more than we do here in the US, and I aim to change that. (Also, why are Brits listening to the words of someone who, to them, was a traitor?)

By day four of my experiment, I am grumpy and miserable. I’ve had to cancel a trip to the pub because, newsflash, an evening of merlot and a dawn wake-up isn’t a good combination.

Coward.

Day five is a new low. I sleep in until 5.43am and then eat a salted caramel Magnum for breakfast to compensate for missing out on the pub.

At this point, I laughed out loud.

On day eight, I wake up at 5.04am without an alarm. The morning beckons. Do I bound out of bed to seize the day? I do not. I decide to return to my usual wake-up time, only now with a renewed focus.

As I suggested above: changing one's routine does have benefits. But there's nothing magical about particular times. Humans naturally fall on a spectrum from extreme lark to extreme owl (I've talked about this before too, I know), and I hold the considered opinion that arranging your life around your sleep rhythm would be optimal. What happens, though, is that the world is generally made for larks, so larks do better at things like focus and creativity during the standard workday, leading to the classic conflation of causality and correlation (I'm going to have to remember that particular alliterative phrase). In other words, it's not waking up at 5am that does it for them, it's happening to possess a metabolism that wants to wake up at 5am. Musicians, for instance, who tend to play late-night clubs and concert halls, well, I can't see them benefiting from a schedule like that. But we can't all be musicians.

Still, I can't fault the author for trying. As an experiment, it fails because the sample size is exactly one.
But it's not like there's a perfect schedule that would work for everyone; we're all different, so I don't think science can ever answer the question "What are the ideal times for awakening and asleepening?" with a single answer that works for absolutely everyone. Because that answer depends on individual chronotypes, and we don't all fit into neat little industrial cog-boxes. |
There should be some relationship between the letters in a word and its pronunciation. Should be, but sometimes isn't. There are places like that in the US too, but no one visits the US anymore, so it doesn't matter as much.

Inhabited by a succession of Celts, Romans, Anglo-Saxons, Scandinavians and Normans, Britain has spent centuries simmering into a confusing toponymic soup of counties, cities and castles.

And that's one reason English itself is confusing.

Over time, their names have further shifted and skewed, taking on their own idiosyncrasies and sometimes becoming utterly unpredictable. Happisburgh is “Haze-bur-ruh.” Cholmondeley is “Chum-lee”. Leominster is “Lem-stuh.”

Some of these are just so the locals can instantly identify who doesn't belong there. As if having an American accent itself isn't enough.

London loves to find ways to befuddle.

At least most people get the pronunciation of London somewhat close. Except for the French. You know what they call it? Londres. We get even by calling Paris Paris instead of Paree. I'm only including a few of these here.

The well-to-do neighborhood of Marylebone is commonly mispronounced. It has a picturesque etymology which has everything to do with a Mary and nothing to do with bones; it stems from “St Mary at the Bourne,” a church called St. Mary’s built on the banks of the old Tyburn river.

Huh, and all this time I thought it was derived from mangled French. Oh, wait, it probably was. I told you English was confusing.

(Holborn) No doubt you’ve already guessed that it’s not “Hol-born”, because that would be too easy. What we’ve got here is “Ho-bun”, derived from “hollow spring.”

And now we have another meaning of something that sounds similar to "burn." Yes, sometimes a spring becomes a stream, but that's still a different thing.

Bicester and Cirencester

Screw this; I'll just visit a different city.

(Edinburgh) “Burg” is a common suffix for a number of European cities — think Hamburg in Germany, or Johannesburg in South Africa. It means “castle” or “fortified town,” and both of the above “burg”s (and many more besides) are pronounced as they’re spelled.

Astute readers may note that Johannesburg is not, in fact, a European city. But Europeans spread out all over the planet, and named some cities like the ones they were used to. Hence, here in the US, we have city names like Harrisburg or Fredericksburg, neither of which are particularly fortified. To make matters more complicated, lots of US place names end in "-ville," also, which is, you guessed it, French in origin.

(Frome) Picturesque winding cobbled streets welcome tourists to the Somerset town of Frome, although the locals must get exhausted correcting visitors whenever “Frome” leaves their mouths.

And this has nothing to do with Rome, which doesn't stop the article from making a terrible pun: "...when in Frome..."

(Beaulieu) One glance at “Beaulieu” tells you this is a French influence. The name of this idyllic Hampshire village — home to a 13th-century abbey and the National Motor Museum, which houses one of the vehicles from “Chitty Chitty Bang Bang,” no less — means simply “beautiful place”. Très simple.

Yes, except, for this native English speaker (albeit the American version), the French pronunciation is very difficult to wrap one's tongue around.

Except that if you think the “beau” here is said how the French would say it, you’ve got another think coming. It’s “Byoo-lee”.
That first part makes sense when you think of how we altered the word in, say, "beautiful," but the second syllable makes no sense whatsoever.

And that’s not even the strangest bastardized French name: this honor goes to Belvoir Castle in Leicestershire, which is pronounced… “Beaver Castle”.

Meanwhile, here in Virginia, we have a Fort Belvoir (which is not in a burg), and that's pronounced closer to the French version. And don't get me started on the mangled French-origin towns in the Mississippi Valley, like New Orleans or St. Louis, or Terre Haute or Versailles.

Llanfair...

I've been informed that long words (or place names) break the site on mobile view, so I'm not typing this one out. You know the town. I've been there. The only thing notable about it is the name, which I still haven't mastered. Somehow, it's also derived from a St. Mary's Church, which is probably about as common in Britain as Notre Dames are in France, and for the same reason.

Anyway, like I said, more at the link. I'd consider it required reading if you're planning to claim asylum in the UK anytime soon, for whatever reason. |
Earlier this month, I did an entry on smarts: "We Are Very Smart"

Well, I didn't find the exact quote, nor was I expecting to because I wasn't sure about its wording. But after I did that entry, I found the following article, an older one from Medium, and it describes a similar idea.

The famous Nobel winning physicist Richard Feynman understood the difference between “knowing something” and “knowing the name of something” and it’s one of the most important reasons for his success.

Well, that, and being really very insanely smart.

Feynman stumbled upon a formula for learning that ensured he understood something better than everyone else.

"Be very smart?"

There are four steps to the Feynman Technique.

Actually, there are five. The zeroth one is: you have to want to learn. Without that, you'll just be staring out the window, chin in palm, sighing, after asking "when are we ever going to use this in real life?"

Step 1: Teach it to a child

Write out what you know about the subject as if you were teaching it to a child.

This, I think, is from where I got the mangled quote above.

A lot of people tend to use complicated vocabulary and jargon to mask when they don’t understand something.

Yes, a lot of people do that, especially in the business world, which is all about optimizing synergies for sustainable innovative solutions to resource allocation issues (for example). But other people use it because it expresses nuance in a way that simpler near-synonyms cannot. While I agree that there's almost never a reason to use "utilize" instead of utilizing "use," "complicated vocabulary and jargon" are compression algorithms. Instead of using lots of words like "it's made up of different little parts that connect to each other" we can say "it's complicated." Still, I agree with the basic idea: run a decompression routine on the jargon. Not only does that show you've got a handle on the subject matter, but it makes it easier to communicate to outsiders (and children).

There are, of course, three other steps, as the article notes. I don't need to reiterate them here. Mostly, I just wanted to clarify the Feynman connection from the earlier entry. One thing I've found from experience is that teaching something (it doesn't necessarily have to be to a child) is a way to firm up and increase one's own knowledge of the subject matter. It may be one motivation for my blogging. |
Nothing bigger than infinity, right? Well, what about second infinity? From Quanta:

Mathematicians Measure Infinities and Find They’re Equal

Two mathematicians have proved that two different infinities are equal in size, settling a long-standing question. Their proof rests on a surprising link between the sizes of infinities and the complexity of mathematical theories.

When someone brightly proclaims, "Nothing is impossible!" I have two possible responses, depending on my mood:

1) Even in a vacuum, there exist electromagnetic fields, quantum virtual particles, etc., so yes, technically, it's impossible to achieve "nothing."

2) Okay, if nothing is impossible, please, go ahead and count to infinity.

And if I'm in a really bad mood, there's a third:

3) "You are."

But despite my #2 response, which demonstrates that there are things that are, in point of fact, not possible, mathematicians understand quite a bit about the very useful (but probably entirely abstract) concept of infinity, one bit being that the infinity of integers is of a lower order, aka smaller, than the infinity of real numbers.

The problem was first identified over a century ago. At the time, mathematicians knew that “the real numbers are bigger than the natural numbers, but not how much bigger. Is it the next biggest size, or is there a size in between?” said Maryanthe Malliaris of the University of Chicago, co-author of the new work along with Saharon Shelah of the Hebrew University of Jerusalem and Rutgers University.

Just so we're clear, in math, "natural numbers" means positive integers, and "real numbers" means integers, fractions, and irrational numbers (the ones with non-repeating decimals, like pi) all lumped together. From what little about this stuff that I understand, there is a greater infinity of real numbers just between the integers 1 and 2 than the entire infinity of integers.

Oh, and to make your mind spin even more, the infinity of natural numbers is the same size as the infinity of integers, which is the same size as the infinity of even numbers, which is the same size as the infinity of multiples of 100, and the infinity of prime numbers, and so on. You don't have to take my word for it, but mathematicians have proven this shit rigorously (I'll sketch the standard arguments below), as the article goes on to explain using pretty basic concepts, all because of the work of a dude named Georg Cantor. Then:

What Cantor couldn’t figure out was whether there exists an intermediate size of infinity — something between the size of the countable natural numbers and the uncountable real numbers. He guessed not, a conjecture now known as the continuum hypothesis.

Now, I'm obviously not a mathematician, but this stuff fascinates me to the point where I've read entire books about it. One of them, as I recall, pointed out later work that showed that the continuum hypothesis cannot be proven or disproven using the framework of mathematics, which this article also nods to:

In the 1960s, the mathematician Paul Cohen explained why. Cohen developed a method called “forcing” that demonstrated that the continuum hypothesis is independent of the axioms of mathematics — that is, it couldn’t be proved within the framework of set theory. (Cohen’s work complemented work by Kurt Gödel in 1940 that showed that the continuum hypothesis couldn’t be disproved within the usual axioms of mathematics.)

The article continues in even more detail and, no, there's not a lot of actual math in it; it's mostly written in plain language, and the only problem I had reading it was keeping all the names straight. Because while I don't understand much of mathematics, I understand it better than I understand people.
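As promised, here's a sketch of the two standard arguments behind those size claims. This is bog-standard textbook material in my paraphrase, not anything from the article:

```latex
% Same size: an explicit pairing (bijection) between naturals and evens.
%   f : \mathbb{N} \to 2\mathbb{N}, \qquad f(n) = 2n.
% Every natural matches exactly one even number and vice versa, and
% "can be paired off one-to-one" is all "same size" means for infinite sets.

% Strictly bigger: Cantor's diagonal argument. Suppose you had a complete
% list r_1, r_2, r_3, \ldots of every real in (0,1), written in decimal.
% Define a new number
%   d = 0.d_1 d_2 d_3 \ldots, \qquad
%   d_k = \begin{cases} 5, & \text{if the $k$th digit of } r_k \neq 5,\\
%                       6, & \text{otherwise.} \end{cases}
% Then d disagrees with each r_k at the kth digit, so d is missing from
% the list. No list can work, hence |\mathbb{N}| < |\mathbb{R}|.
```

Note that neither argument tells you whether any size sits strictly between those two; that's the continuum hypothesis part, covered above.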
While I don't understand much of mathematics, I understand it better than I understand people. My real point in posting this, however, is to show that some things are, in fact, impossible. But what is possible is to know what's impossible. |
This bit has cycled out of the news by now, but I don't really care. I have something to say about it anyway. From The Guardian: Celebrities criticize all-female rocket launch: ‘This is beyond parody’ ![]() Amy Schumer, Olivia Wilde and Olivia Munn are among the famous names calling out the much-publicised space trip The all-female Blue Origin rocket launch may have received plenty of glowing media coverage – but not everyone is impressed. Oh, you mean a publicity stunt sometimes generates bad publicity? But I've been assured that there's no such thing as bad publicity. The stunt has drawn criticism from a number of female celebrities who were not keen on the Jeff Bezos-owned Blue Origin NS-31 mission, which included Katy Perry... Cutting off the quote there because, in reality, no one gives a shit about any of the other passengers. Also, criticism from "celebrities," female or not, is about as meaningful as astrology. Model and actor Emily Ratajkowski... ...said she was “disgusted” by the 11-minute space flight, which featured Perry serenading her fellow passengers with a cover of What a Wonderful World and advertising her upcoming tour setlist in brief zero gravity. “That’s end time shit,” Ratajkowski said. “Like, this is beyond parody.” Okay, that's about as much of the article as I can stomach quoting. Here's the thing, though: I don't blame Perry or whoever those other chicks were. Well, maybe Sanchez, but who can really blame her for wanting to marry billionaire Lex Luthor? Sorry, I mean Jeff Bezos; I always get those two confused. Point is, it was a cunning stunt, and it worked. People talked about it for weeks. I'm talking about it now, but only because the article has been languishing in the pile for a month. I'm tempted to say "I don't care," but obviously, on some level, I do. And I wanted to try to articulate why. Let's start with the definition of "space." There's no well-defined upper limit to Earth's tenuous outer atmosphere. It's not like the ocean, which has a shifting and rolling but definable boundary; air molecules just kind of get more and more rare the higher you go. Plus, if you did pick some density and say "anything above this density is atmosphere and anything below it is space," you'd find that the altitude varies depending on what spot you're above on Earth, partly because it'll be higher where the air is warmer. So "space" is defined to begin at the Kármán Line, 100 km above mean sea level. Leaving aside that it's not really a line but a (nearly) spherical shell, this launch barely exceeded that altitude. So, yes, from a technical and internationally recognized legal perspective, they were in space. Briefly. Second, I'm going to say stuff about women in space, at great personal risk. The first woman in space was Valentina Tereshkova, who's still alive. Kate Mulgrew, the actor who portrayed Kathryn Janeway on Star Trek: Voyager, looks a bit like her (I doubt that this is a coincidence). Now, if they'd sent Mulgrew up there, I might have been more impressed. But someone sent Shatner up in a different launch, and he was all jaded and shit, so they probably soured on Trek actors. Anyway, my point is, Tereshkova was a trained cosmonaut. The first American woman in space, Sally Ride (which I always thought was an excellent name for an astronaut), was a physicist with a PhD. Katy Perry is a singer. I'm not ragging on singers. I'm not ragging on Perry. She's talented, though I can't say I personally like her stuff.
I'm just saying that her skill set doesn't say "astronaut." Hence: publicity stunt. I remember thinking a similar thing about Shatner when they lofted him up there, except Shatner doesn't exactly have the singing chops, as anyone who's ever been subjected to his cover of "Rocket Man" can attest. We're not at the point yet where we need singers in space. And let's not forget that it's pretty routine to send both men and women up to the ISS now, people who spend months there doing... whatever they're doing. Science, research, maintenance; you know, productive stuff. There was that one Canadian dude who brought a guitar with him and made some cool videos, but he's known as an astronaut, not as a musician. These spacers do actual work. Basically, this was a passenger flight, albeit a very expensive one. So, no, the passengers don't impress me, regardless of gender. And finally, shame on the media for breathlessly covering this like it's some sort of grand accomplishment. It's not. No new science, no barrier-breaking, no frontier-pushing. We won't be getting cool space-age tech from a suborbital passenger flight. Sure, first all-female crew, but they could have lifted pretty much anyone healthy enough to handle some extra Gs. I mean, yeah, on some level it's pretty cool that we have companies doing their own rocket launches, but it's not so cool that they can afford to use their powers for advertising. Go, I don't know, mine an asteroid for rare earths or figure out a way to stop the next one from hitting our planet. More likely, they'll figure out how to make the next one hit just the right spot on our planet. Their competition's headquarters, e.g. As anyone who reads my blog should know by now, I'm not against space exploration, and I believe that a variety of genders, races, and nationalities should be included—because it's about humanity in general, not about one government or company or culture. But this wasn't exploration. It was exploitation. |
I don't expect much from Stylist. Still, this doesn't clear even the low bar I set for them. Astrology fans, you’ve been reading the wrong star sign all this time: this is what your zodiac sign means now ![]() First off, to reiterate stuff I've said before: Yes, of course astrology is bunk. I consider it like folklore or fairy tales: obvious fiction, but still culturally significant, at least from an historical perspective. And it did give way to astronomy, similar to how alchemy evolved into chemistry. Any interpretation about what some particular stellar/planetary configuration "means," however, is purely made up. And second off, we've known about precession for decades. I remember great horror in the astrology community in the 1990s when it became publicized that zodiac signs are nearly one full sign off from their traditional calendar locations (which has been a thing for way longer than decades, but apparently, astrologers hadn't heard), and more freakouts in the noughties when astronomers announced that Ophiuchus was part of the zodiac, too. January is over, February is upon us, and we all know what that means: it’s Aquarius season! Yeah, this article has been languishing in the pile for a few months. So what? The original article is nearly 10 years old, anyway. That’s right; astrologers have promised that all planets are going direct and we have zero retrogrades this month, which means we can lean hard into the do-gooder spirit of this astrological season without any annoying complications. This is usually where I close the tab and give up, but I still want to hear her take on the changes. Well, astrology, on the surface, may be based on the position of the sun relative to certain constellations – and it may be influenced by the movements of the sun, moon, planets and stars, too. "May be?" That's what astrology is. Not that it has any bearing on objective reality, but that's the lore that we've inherited. However, it is absolutely not considered to be a science. Finally, the truth. Indeed, it’s been wholeheartedly rejected by the scientific community – with many pointing out that astrological predictions are too general, too unspecific to be subjected to scientific testing. That's hardly the only objection. Even those of us who dismiss astrology as a load of absolute nonsense know which star sign we are. Yes, and as an Aquarius, I dismiss astrology as a load of absolute nonsense. (I am, however, sometimes fond of nonsense.) Because, as you’ve no doubt read already, it was recently revealed that everything we thought we knew about the zodiac was a lie. "Recently," my ass. Suddenly, astrologers started paying attention to astronomers. Selectively. If they'd actually paid attention to everything science said, there wouldn't be astrologers. Nasa – as in, yes, actual Nasa – have confirmed that the sky today is completely different to how it was almost 3,000 years ago, when the Babylonians first invented the 12 signs of the zodiac. Sigh. I... I can't even begin. Pauline Gerosa, the consultant astrologer behind Astrology Oracle, tells me: “Ophiuchus has always been one of the constellations that fall along the ecliptic. It just wasn’t selected by the ancient astrologers to be one of the 12 zodiac signs.” To muddy the waters (that's an Aquarius pun) even further, the zodiac constellations (I suppose I should explain here that these are the made-up interpretations of stellar configurations that get crossed by the Sun, Moon, and planets) aren't all nicely uniform in size.
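Since neither the article nor I ever actually does the arithmetic on that "one full sign off" business, here's a rough sketch. The ~25,772-year precession period is the standard figure; the assumption that the sign boundaries got frozen about 2,000 years ago is my own ballpark, so treat the output as order-of-magnitude only.

```python
# How far has axial precession shifted the zodiac? Assumptions are
# mine, not the article's: a ~25,772-year precession cycle, and sign
# boundaries frozen roughly 2,000 years ago.
PRECESSION_PERIOD_YEARS = 25_772
SIGN_WIDTH_DEGREES = 30
YEARS_SINCE_BOUNDARIES_FIXED = 2_000

drift_per_year = 360 / PRECESSION_PERIOD_YEARS   # ~0.014 degrees/year
total_drift = drift_per_year * YEARS_SINCE_BOUNDARIES_FIXED

print(f"Drift rate: one degree every {1 / drift_per_year:.0f} years")
print(f"Total drift: {total_drift:.1f} degrees, "
      f"or {total_drift / SIGN_WIDTH_DEGREES:.2f} signs")
# -> about 28 degrees: nearly one full 30-degree sign, which is why
#    your "sun sign" no longer matches the actual sky.
```

Anyway, where was I? Right: constellation sizes.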
The article delves into that bit later. “It’s important to remember that astrology is NOT astronomy,” she adds. “Astronomy is a scientific concept based on 3D material reality. Astrology is a symbolic language, a philosophy, a multidimensional concept. They used to be seen as two sides of the same coin and hopefully they will be again.” You know, I can't really fault this quotation too much. It gives too much credit to astrology, perhaps, but considering the source, that's understandable. There are some more details in the article, but I just have one more takeaway: this represents an intrusion of actual science—well, not so much actual science as astronomical categorizations—into astrology. Maybe there'll be more of this, and astrology will be sent back to folklore, where it belongs, instead of being taken seriously as life direction and fate. |
Taking a chance here with a National Geographic link. I couldn't trust them once Fox bought them out (I fully expected headlines like "Global Warming: Myth Or Hoax?"), and I'm not sure Disney's much better. But I couldn't give this one a miss. Everything you think you know about spiders is wrong ![]() They're not attracted to your body lotion. They don't crawl in your mouth at night. In fact, they want nothing to do with you. Uh huh. That's what they want you to believe, to lull you into a false sense of security until, one night, you wake up, and something eight-legged and fuzzy is staring at you with eight hungry eyes. I mean, come on, did a spider write this? With hundreds of years of baseless myth to supply us, it’s no wonder as many as six percent of people are phobic of arachnids. Well, that's one spin on it. Another is that some people are phobic of spiders regardless of truth or fiction. Also, I find it difficult to believe that it's only six percent. One other thing: the author uses "myth" to mean falsehood, which is a perfectly acceptable definition, but many cultures have actual myths ("foundational stories") surrounding spiders, many of which paint the arachnids in the positive colors they deserve. It's just important to know which definition the spider who wrote this is using. These animals are stunningly diverse, ingenious creatures with so many characteristics worth admiring. Yes, and it's far easier to admire said characteristics on the exceptionally large members of order Araneae. Much of what is commonly touted about the spindly eight-legged invertebrates is a misconception, according to Rod Crawford, a spider expert and curator of arachnology at The Burke Museum. “Everything you thought you knew about spiders is wrong,” says Crawford. Hey, another spider getting quoted! First, they aren’t insects. Spiders belong to a completely different class called “Arachnida.” Yeah, you know what else is in Arachnida? Such cute and cuddly exoskeleton-owning bugs as scorpions and ticks. Studies show that, in some ecosystems, more than 40 percent of all insect biomass passes through spiders, making them the number one controllers of insect populations. Yeah, yeah, I know: they eat bugs. This is not the flex you think it is. Myth: Spiders are out to bite us Most people will never be bitten by a spider in their lifetime. Yeah, that's what they want you to think. I'm healing from a spider bite right now, and it's not my first. Although it’s common to wake up with small skin bumps and sores and blame a spider, there’s almost always no reason to believe a spider is responsible for the prick, says Dimitar Stefanov Dimitrov, a spider evolution expert at the University Museum of Bergen in Norway. I strongly suspect that Dimitrov would be singing a different tune if he lived in Australia instead of Norway. Myth: We swallow some spiders in our sleep every year Throughout the years, several online forums and publications have claimed we swallow as many as eight spiders in our sleep every single year. Okay, even if that were true, though: so what? Apart from the "gross" factor. Myth: Spiders lay eggs in the tips of bananas and other fruits No, of course not. They lay them in your pillow. I remember when I was a kid and Bubble Yum first came out (being a kid, this was exciting: a soft bubble gum? Count me in), there was a pervasive urban legend that it contained spider eggs, a myth probably started by the trolls over at Dubble Bubble.
Myth: Spiders can lay eggs under your skin and other crevices of your body The story goes like this: a woman returns from a holiday in a warm, exotic location and finds a bump on her cheek that’s pulsating and growing. Concerned, she visits a doctor, and when the specialist pries the welt open, hundreds of small spiders crawl out. We heard that story as kids, too, only it wasn't always a woman. I did once read an account of a biologist who got infested with a botfly larva, and who was so intrigued by the process that he just let it develop under his skin. There's some more at the link. Now, just to be clear, I'm mostly joking here. I admire spiders, preferably from a distance. That doesn't stop me from making "nuke it from orbit" jokes, or posting jump-scare gifs for your viewing pleasure. |
Not exactly cutting-edge physics, but this PopSci article showcases scientists using their noodles. Curious and hungry physicists whip up perfect pasta pan salt rings ![]() ‘Our simple observation of daily life conceals a rich variety of physical mechanisms.’ When you’re boiling water for pasta, throwing a bit of salt into the water can help it boil a little bit faster–if only by a few seconds. Not a good start, PopSci. Not a good start at all. That's not why you salt pasta water. It's for flavor and texture. (If anything, dissolved salt raises the boiling point slightly, so the water takes marginally longer to get there, not less.) ![]() With that, a white ring of salt deposits will often show up within the pan. A group of curious and hungry physicists harnessed the power of fluid dynamics to see what ingredients are necessary to create nicer looking salt rings–releasing larger salt particles from a greater height can help make more uniform salt deposits at the bottom of a pan. The findings are detailed in a study published January 21 in the journal Physics of Fluids. And with that, we come one step closer to a Unified Theory of Everything. Okay, no, now I'm the one lying, but at least I'm doing it for the sake of comedy. A team from the University of Twente in the Netherlands and the French National Institute for Agriculture, Food, and Environment (INRAE) were spending an evening playing board games and eating pasta, when they began to question what it would take for them to create uniform and “beautiful” salt rings. I can't help but note the absence of Italy from these experiments. The team set up a tank of boiling water in a lab and tested dropping in salt of various sizes at different speeds. By "various sizes," I assume they mean fine-grained to coarse-grained. “These are the main physical ingredients, and despite its apparent simplicity, this phenomenon encompasses a wide range of physical concepts such as sedimentation, non-creeping flow, long-range interactions between multiple bodies, and wake entrainment,” said Souzy. Jargon is a compression algorithm. I'll just trust that they can indeed relate this to other physical phenomena. Souzy also reports that he can use this data in the kitchen to “create very nice salt rings almost every time.” Which is cool and all, but I have to ask: how does the pasta taste? |
From Vox, a report that's sure to freak out a lot of people. The life-or-death case for self-driving cars ![]() Sorry, a robot is probably a safer driver than most humans. As usual, I'm not just accepting this at face value. Nor am I rejecting it outright. When the concept of autonomous vehicles (AVs) was first floated in the real world, I knew immediately what was going to happen: the public would freak out, and anyone who stood to lose revenue would take advantage of the freakouts to come down hard against AVs on "safety" grounds, playing into the general fear of robots. Municipalities, for instance, who stand to lose a significant source of revenue if they can't ticket people for speeding or rolling stop signs. And cops, who might have to switch to investigating real crimes like theft. And boy, have they been harping on safety. Humans drive distracted. They drive drowsy. They drive angry. Even when we’re firing on all cylinders, our Stone Age-adapted brains are often no match for the speed and complexity of high-speed driving. Ugh. They had to throw in a spurious reference to evolutionary psychology, didn't they? The result of this very human fallibility is blood on the streets. Nearly 1.2 million people die in road crashes globally each year, enough to fill nine jumbo jets each day. I admit I didn't check that statistic, but it tracks. I'm also not sure of the numerical comparison to ill-defined "jumbo jets," but I think the point is that every time we lose an airplane, "jumbo" or not, we hear about it for days or weeks afterward, while the road crashes are generally just background noise. Here in the US, the government estimates there were 39,345 traffic fatalities in 2024, which adds up to a bus’s worth of people perishing every 12 hours. I generally work with the nice round number 40,000 for average yearly traffic fatalities in the US. It's close enough to make the point I like to make, which is: that's about 110 fatalities each day, which translates to 4 to 5 per hour. Call it 4. Every time an AV so much as skins someone's knee, we hear about it, and it frightens people. And yet, if your phone pinged every time there was a fatality involving human drivers, you'd hear an alarm every 15 minutes or so. Injuries? With those running into the millions a year, figure one every 15 seconds or so. We're used to it. As I said, background noise. Obviously, there are a lot more human drivers than computer ones, so a direct comparison is more difficult. But my point remains: driving kills. In the US, it kills on the same order of magnitude as firearms. ![]() But the true benefit of a self-driving revolution will be in lives saved. And new data from the autonomous vehicle company Waymo suggests that those savings could be very great indeed. Obviously, we need to be very careful using data from a company whose existence depends on continued development of AVs. Fortunately, the article uses "suggests," implying that it would be good to look into this further, preferably without an Agenda (for or against). In a peer-reviewed study that is set to be published in the journal Traffic Injury Prevention, Waymo analyzed the safety performance of its autonomous vehicles... They then compared that data to human driving safety over the same number of miles driven on the same kind of roads. And that alleviates some of my concerns about impartiality. Not all of them, but some.
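Here's that arithmetic spelled out, for anyone who wants to check my rounding; nothing in it but the round number I already mentioned:

```python
# My usual round number for US traffic deaths, spelled out as rates.
# 40,000/year is my rounding of the official 39,345 for 2024.
DEATHS_PER_YEAR = 40_000
HOURS_PER_YEAR = 365 * 24  # 8,760

per_day = DEATHS_PER_YEAR / 365              # ~110 per day
per_hour = DEATHS_PER_YEAR / HOURS_PER_YEAR  # ~4.6 per hour
minutes_between = 60 / per_hour              # ~13 minutes

print(f"{per_day:.0f} deaths/day; {per_hour:.1f}/hour; "
      f"one every {minutes_between:.0f} minutes")
```

Back to the article: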
Back-of-the-envelope calculations suggest that if the same 85 percent reduction seen in serious crashes held true for fatal ones — a big if, to be clear, since the study had too few fatal events to measure — we’d save approximately 34,000 lives a year. Okay, first of all, I was morbidly amused at "the study had too few fatal events to measure." Look, I get that the number of motor vehicle fatalities can never be zero, unless there are no vehicles on the road whatsoever (and even then, I suspect it would still be nonzero). It would be ideal, sure, but it's unrealistic to expect that. What I've been saying is that if AVs could be shown to reduce fatalities and other serious accidents by 10-20% (perhaps to an annual fatality level of 32,000 to 36,000, down from 40K), it would be worth it from a safety perspective. This study flips that script, implying a fatality rate of about 6,000 per year—a truly significant reduction, way more (pun intended) than my personal threshold. Of course, there are plenty of caveats to the Waymo study and even more obstacles before we could ever achieve anything like what’s outlined above. Yes, and I'm glad the article includes said caveats. You can read them there; I've already mentioned one of them (it being a company study). Still, the data looks so good, and the death toll on our roads is so high that I’d argue slowing down autonomous vehicles is actually costing lives. And there’s a risk that’s precisely what will happen. As with most things, there are other factors to consider. Accidental death is, obviously, a bad thing, so it's a fine metric for studies like this. But some other considerations include: personal loss of freedom (I can almost guarantee that AVs will have kill switches usable by law enforcement, which could be hacked or abused); economic impact (taxi drivers, rideshare gig workers, truckers, and the like losing jobs); and ensuring that their routing is accurate (no driving into lakes or onto closed roads, e.g.); to name just a few. Not to mention the aforementioned loss of revenue to municipalities, which, frankly, I don't care about. There's also the issue of liability. Right now, if I hit a pedestrian on the street, I'd be personally liable. Who or what is responsible if an AV hits a pedestrian? Well, that's for lawyers to figure out and, hopefully, if this study holds water, there would be a lot fewer such cases. Which is another consideration: personal injury lawyers would make less money. Waah. The article continues with a reiteration of what I've already said up there: Too often the public focuses on unusual, outlier events with self-driving cars, while the carnage that occurs thanks to human drivers on a daily basis is simply treated as background noise. (That’s an example of two common psychological biases: availability bias, which causes us to judge risk by outlier events that jump easily to mind, and base-rate neglect, where we ignore the underlying frequency of events.) As I said, if the news had to cover every traffic fatality in the US alone, we'd be getting four or five alerts an hour. The result is that public opinion has been turning against self-driving cars in recent years, to the point where vandals have attacked autonomous vehicles on the street. You know what that reminds me of? Luddites. Change is scary. Machines are scary. Also, we might lose our jobs, and that's really scary. In other words, I doubt that's all about safety or traffic fatalities.
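And before I sum up, the article's hypothetical, spelled out the same way. The 85 percent figure is Waymo's measured reduction in serious crashes; applying it to fatalities is the article's big "if," not an established result:

```python
# The article's hypothetical: apply Waymo's ~85% serious-crash
# reduction to fatal crashes (the big "if" -- too few fatal events
# in the study to measure directly). Baseline is my 40K round number.
BASELINE_DEATHS = 40_000
WAYMO_REDUCTION = 0.85

lives_saved = BASELINE_DEATHS * WAYMO_REDUCTION   # ~34,000
remaining = BASELINE_DEATHS - lives_saved         # ~6,000

# My own, far lower bar was a 10-20% reduction:
threshold_band = [BASELINE_DEATHS * (1 - r) for r in (0.10, 0.20)]

print(f"Saved: ~{lives_saved:,.0f}/yr; remaining: ~{remaining:,.0f}/yr")
print(f"My old threshold would have settled for "
      f"{threshold_band[1]:,.0f}-{threshold_band[0]:,.0f} deaths/yr")
```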
To sum up, I'm not strongly for or against AVs. I do think that reducing fatalities is generally a good thing, but, as I said, there are other considerations, though maybe not life-or-death ones. We need more studies, preferably independent ones, and ideally, any switchover (which would realistically happen after I'm gone) should be based on statistics and science, not on fear. |
Big Think asks the tough questions again: The 6 strongest materials on Earth are harder than diamonds ![]() For millennia, diamonds were the hardest known material, but they only rank at #7 on the current list. Can you guess which material is #1? Thing is, while we can use words like "strong," "tough," and "hard" almost interchangeably in casual conversation, those words have specific meanings in materials science. And there they are in the headline, conflating "strong" and "hard." (I'm the one who muddied the waters with "tough.") I'll just pause here for a moment while you get juvenile strong/hard jokes out of your system. Ready? Ok, good. Worse, later in the article (spoiler alert), there's even more confusion. So I thought I'd put this up front. While I do have some background in this from engineering studies, I'm by no means an expert. Wiki has a section on these distinctions. ![]() As examples, glass is strong in many ways, but it's not very tough (except for certain specialty glass). Concrete is tough, not very hard (you can scratch it with many metals, as my sidewalk can attest after a snow-shoveling session), and has high compressive strength but low tensile strength. As for diamonds? Hard, but not tough; they'd make a lousy structural material. Their strength is fairly high, but there are many higher-strength materials, which is one reason I felt this article to be somewhat misleading. One final bit of pedantry before jumping into the article: As I'm sure everyone is aware, diamond is carbon. Graphite is also carbon, and it's one of the least hard materials known (which is why we can write with it; it needs to be mixed with some other material in pencils so as not to wear down too fast). What we'll see here is that carbon is even more versatile than those two materials would suggest, and that's not even counting its ubiquity in biology. Although diamonds are commonly known as “the hardest material in the world,” there are actually six materials that are harder. And here, I'm not sure if the author means simply scratch resistance or not. For example, hardness, scratch resistance, durability (as many very hard materials are also brittle), and the ability to withstand extreme environmental stresses (such as pressures or temperatures) all factor into how impressive a material can be. And here we have some muddle again. Durability is a measure of wear resistance; there are brittle materials, like glass, that are durable without being especially hard. If glass weren't durable, we wouldn't use it for windows. (To make matters even more muddy, there are types of glass that aren't very brittle.) On the biological side, spider silk is notorious as the toughest material produced by a plant, animal, or fungus. Sigh. Credit where due: dragline silk actually rates remarkably high on both strength and toughness for its weight. The problem is that the article never tells you those are two different measurements. While other materials may rank higher on the Mohs hardness scale than diamonds, they’re all easier to scratch than diamonds are. (And, consequently, can be scratched or otherwise damaged through contact with a diamond.) And this bit makes no sense to me whatsoever. Either I'm missing something, or it's just plain wrong. The Mohs hardness scale, I've known from a very early age, refers to scratch resistance. Like I said, I'm not an expert, so don't just take my word for it. Now, I've already banged on long enough, but hopefully you get the idea: don't confuse hardness, toughness, strength, elasticity, etc. The article goes on to list the materials that surpass diamond in some way, but I'm still unclear on which material property they really mean.
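Since I keep juggling three different properties, here's the cheat sheet I wish the article had opened with. The numbers are rough ballparks from memory, for illustration only; do not use them to build a bridge.

```python
# Cheat sheet for the properties the article keeps conflating.
# Values are rough ballparks from memory -- illustrative only.
#   mohs:      Mohs hardness (scratch resistance)
#   toughness: fracture toughness in MPa*sqrt(m) (crack resistance)
materials = {
    "diamond":          {"mohs": 10.0, "toughness": 3.0},
    "structural steel": {"mohs": 4.5,  "toughness": 50.0},
    "soda-lime glass":  {"mohs": 5.5,  "toughness": 0.8},
    "concrete":         {"mohs": 4.0,  "toughness": 0.7},
}

hardest = max(materials, key=lambda m: materials[m]["mohs"])
toughest = max(materials, key=lambda m: materials[m]["toughness"])
print(f"Hardest: {hardest}; toughest: {toughest}")
# -> Hardest: diamond; toughest: structural steel

# "Strength" also depends on load direction. Concrete, roughly:
# ~30 MPa in compression but only ~3 MPa in tension, which is why
# it gets steel rebar.
```

Point being: "hardest" and "toughest" crown different winners, and "strongest" requires you to say which direction you're loading.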
But I'm sure you'd at least like to know #1 on the list, because superlatives are always interesting: The #1 hardest material: Graphene At last, it’s the hardest material of all: a hexagonal carbon lattice that is only a single atom thick. And so we come full circle: hard diamond, soft graphite, ultrahard graphene. Carbon: is there anything it can't do? But the fun fact, left out of the article as far as I can tell, is that naturally occurring graphite consists of multiple layers of tiny bits of graphene. Sure, the interesting graphene is human-made, but it's not like we didn't know about its existence. Still, it's a little unclear whether graphene is hard, tough, strong, or some combination of the three. Whatever it is, though, it's damn interesting. |
Here's a source I don't think I've referenced before: Greater Good. Ever since I saw the movie Hot Fuzz (three or four times), that phrase has filled me with a sense of unease. And when I hear it, I reflexively repeat it, intoned in a British Northern accent, or as close to it as a US Southerner can get. Five Ways Nostalgia Can Improve Your Well-Being ![]() Some recent studies suggest that experiencing nostalgia about our past can make us happier and more resilient during times of stress. You know, used to be, nostalgia was considered a mental illness. Sigh... I miss those days. I often find myself nostalgic for days gone by—especially my young adulthood. Thinking about days when I could go backpacking with a friend on a moment’s notice or dance the night away at my wedding, without the constraints of child care or a limited energy supply, gives me a bittersweet feeling—a mixture of joy, sadness, and longing. Me? I've always had a limited energy supply, especially when it comes to group activities. And I've never had to deal with child care. Well, there was that one time my friend roped me into baby-sitting. I taught the kid how to make (and use) water balloons, and somehow never found myself in that situation again. Staying “stuck in the past” was often associated with being unable to adjust to new realities, like when soldiers were nostalgic for their faraway homes and experienced loneliness and dread. That may be an extreme example. While I've never been a soldier, I can understand how those emotions could happen, even without nostalgia. Not that long ago, some considered nostalgia to be a mental illness, akin to melancholy, which could lead to anxiety, depression, and sleep disorders. Yes, and I already made the joke about it up there. I just included this quote to support that the "mental illness" assertion was factual, unlike many of my jokes. Waltz's Second Rule: Never let the facts get in the way of a good joke. Or a bad one. Especially a bad one. But more recent findings on nostalgia suggest it can be good for us, increasing our well-being, making us feel connected to other people, and giving us a sense of continuity in our lives. Good. Let's drive another nail into the "living in the present moment" coffin. Rather than being a problem, nostalgia can help bring happiness and meaning to our lives. On second thought, happiness is overrated, and meaning is whatever we want it to be. Nostalgia makes us feel socially connected Nostalgia about our past often includes recalling important people in our lives—people who cared about us and made us feel like we belonged. Yeah, okay, but it can also highlight how you'll never see some of those people again. Seriously, though, the article links to some studies, which, full disclosure, I didn't read. Nostalgia helps us find meaning in life A sense of meaning in life involves knowing that your existence matters and that your life has coherence or purpose. It’s something we all strive for in one way or another. No, it's not. As one study found, nostalgia can increase your motivation to pursue important life goals, because it increases meaning—not just because it puts you in a better mood. Again, links to studies. Again, no clicky here. Nostalgia can make us happier Though it does seem to do just that—to boost our mood. Even though nostalgia is by definition a blend of positive and negative emotion, the positive tends to outweigh the negative, meaning we feel happier overall. I feel like the key words there are "can" and "tends to." 
Nostalgia puts us in touch with our authentic selves When thinking nostalgically about our past, we are the prime protagonists in our own life stories. I'm already the prime protagonist in my own life story. Perhaps for this reason, engaging in nostalgia can lead to personal growth. At least one study found that feeling nostalgia made people feel more positively about themselves, which, in turn, made them more open to experiencing new things, expanding their horizons, and being curious—all signs of psychological health. I'm still open to those things, nostalgia or not. But lately, what's been on my mind is: how long will that last, at this point? Nostalgia may help people who feel disillusioned or depressed Perhaps because of these potential benefits, people tend to engage in nostalgia when they are feeling down, lonely, or disillusioned. You know what helps me in those situations? Listening to really depressing music or watching really depressing movies. The article does go into some of the negatives: Of course, that doesn’t mean that nostalgia is always good or can’t have a downside. If nostalgia makes us spend too much time thinking about our past, it may prevent us from recognizing the joy in our lives right here and now. And, since we tend to engage in nostalgia when negative things occur, it could become an avoidance strategy that keeps us from dealing with present problems in more effective ways. Me, I'm not coming down hard on one side or the other. My feeling is that trying to squelch nostalgia, or any other emotion, simply on the basis of "I've heard it's bad to feel this way, so I won't feel this way" can't be good for you, but that's hardly scientific. I'll just point out that the "-algia" part of the word comes from a Greek root meaning "pain," and that the word apparently had the original sense of "homesickness," not a general longing for the past. The past, though, pain and all, is what made us who we are, so I can see why sometimes reflecting upon it can have positive outcomes. Let's just not lose ourselves in it. |