Not for the faint of art. |
Complex Numbers A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number. The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi. Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary. Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty. |
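The closing claim in that saved blurb (very simple transformations on the complex plane leading to fractals) is easy to make concrete. Here's a minimal sketch in Python, whose built-in complex type models a + bi directly; the Mandelbrot iteration z -> z*z + c is my choice of example, since the passage names no specific fractal:

```python
# A minimal sketch, using the classic Mandelbrot iteration z -> z*z + c as
# the "very simple transformation" (my example; the passage above names no
# specific fractal). Python's complex type models a + bi directly.

z = 3 + 2j                 # the complex number 3 + 2i from the passage
assert z.real == 3.0       # real part a
assert z.imag == 2.0       # imaginary part b
assert 1j ** 2 == -1       # the defining property: i^2 = -1

def escape_time(c, max_iter=50):
    """Iterate z -> z*z + c from z = 0; return how many steps it takes
    for |z| to exceed 2, or max_iter if it never does (points that never
    escape belong to the Mandelbrot set)."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter

print(escape_time(0j))      # 0 never escapes: prints 50
print(escape_time(2 + 2j))  # far outside the set: escapes at once, prints 0
```

The "enormous intricacy" shows up when you color each point c of the plane by its escape time; the boundary between "escapes" and "never escapes" is the fractal.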
I can't for any amount of money recall why I saved this particular Cracked article, but I'm going to link it anyway. History is full of success stories. These are not those. The idea that you need to “strike while the iron is hot” has morphed from a straightforward explanation of blacksmithing into an endlessly repeated cliché. It, or some more unwieldy modern rejiggering, is plastered over everything from horoscopes to aviator stock photos on hustle culture Instagram accounts. Every cliché was once profound wisdom. Plenty of people throughout history have filled their metaphorical blade with microfractures by waiting too long, while others have snapped the sword in half entirely because of ill-advised hesitation — losing a couple million dollars or an entire city in the process. It's not enough to invent something; one must also have the imagination to see the implications in one's invention. 5. Greeks Almost Inventing The Steam Engine Eh. This one wouldn't have gotten very far, anyway. The math wouldn't have been ready, which is also the fault of the Greeks, who insisted that irrational numbers were of the devil. 4. Not Signing The Beatles This is fairly well-known, I think, and usually held up as an example of big mistakes. Now, Decca Records weathered the blunder and has been plenty successful in its own right. I think maybe I had in mind, when I saved this article to my queue, something like "just because a publisher rejected your manuscript doesn't mean it sucks." However, it still might. It's just not a sure thing. 3. Passing on the Patent for the Telephone Orton and Western Union were offered the patent to a newfangled communication device invented by Alexander Graham Bell, one we know as “the telephone.” Sure, the asking price was high at $100,000, but you’d like to think the head of a communications company might see the value in the fucking telephone. 
Instead, he balked, apparently not seeing a future in the device that would, you know, literally define the future. This is where that lack of vision comes in. Western Union was used to doing business a certain way, and the telephone would have changed that. See also: Sears, which could have seen this whole Internet thing coming and positioned itself to take advantage of it, but no, Amazon won that race. 2. The Byzantine Empire Not Buying A Huge Cannon In general (pun absolutely intended), if you're at war, don't turn down enemy-destroying weapons. 1. Not Letting Hitler Drown In contrast to some earlier entries, this one's not a lack of vision. I'd like to think most of us would save a drowning kid, if we had the opportunity and didn't know for sure he'd grow up to be a genocidal autocrat. In 1894 in Germany, a man named Johann Kuehberger saw a drowning boy in a river and, without a second thought, strode in to save his life. Incredible, awe-inspiring, medals all around, right? At the time, sure. Time-travelers, take note: this is your moment. Except killing baby Hitler is the biggest cliché in time-travel fiction, and we don't know if the result would have been worse. "But what could be worse than 11 million dead people?" Well, someone just as evil and actually competent might have risen to power, someone who wouldn't have done the 20th-century equivalent of turning down a war-ending cannon by losing all their rocket scientists and nuclear engineers. |
The headline from today's article sounds like something Cracked would tackle, but no, it's LitHub. You'd think that out of all possible professions, writing would be among the least prone to occupational hazard. Maybe the occasional carpal tunnel, but that's about it, right? Well, you'd be right, but as this article shows, nothing is perfectly safe. Even the toughest of poets and strongest of Hemingways would have to admit that “writer” is not a particularly dangerous job. (Unlike, say, fisherman, miner, logger, knife-thrower's assistant.) This, of course, depends on the kind of writer you are. Novelist who sits at home and relies on Wikipedia for research? Sure. War correspondent? Not so much. Still, a few writers in history have actually suffered some serious health problems as a result of their writing practice—or in some cases, the drugs they used to fuel it. Yeah, drugs probably shouldn't count. Those aren't specific to the writing profession. As I noted above, this isn't Cracked, so no numbered list, countdown or otherwise. Orwell struggled with health problems from childhood, and things were not improved when he was shot in the neck in Spain. But as Ross writes at PW, “his health collapsed for the first time after the writing of Homage to Catalonia, and the heroic effort of writing and revising Nineteen Eighty-Four would kill him.” Or was it the government trying to suppress his work? Over at Poets & Writers, Anelise Chen notes that Herman Melville “dove with such intensity into his whale book that his entire family circulated letters conspiring to make him rest. Ignoring their pleas, he emerged from Moby-Dick plagued with eye spasms, anxiety attacks, and debilitating back pain.” What a coincidence. Those things happen when you try to read it, too. The nineteenth-century Italian poet Giacomo Leopardi... Who? ...whom Adam Kirsch has dubbed “the supreme poet of passive, helpless suffering,”... Clearly, Kirsch didn't know Leonard Cohen. 
...may have turned out that way because he suffered from debilitating scoliosis, which gave him a hump and turned him into, as he put it, “a walking sepulcher.” He blamed his condition on his “scholarly excesses.” Nah, it was all the sex he got from being a poet. Honoré de Balzac was famously addicted to coffee, which he loved for what it did for his writing. Take heed, fellow WDC authors. It didn't go perfectly well for Balzac himself, either; he reportedly died of caffeine poisoning at the age of 51. Fuck me, caffeine poisoning is a thing? Objectivist Ayn Rand also had an addiction that sprang from her writing process: amphetamines. Unfortunately for the world, they didn't cause her early demise. If you’re familiar with his work, it isn’t particularly surprising to find out that Franz Kafka put himself through emotional and physical hell to get his writing done. What? That's absurd. His throat closed up, precluding the ingestion of any food, and so he technically died of starvation, working on his story “The Hunger Artist” to the very last. Thus was the modern definition of irony born. |
Here's another opportunity for me to rail against the smug supposed superiority of wormcatchers. Not that there's anything inherently wrong with waking up early, as I've noted before. It's the whole "turning it into a moral mandate" thing that bugs me, and the utter disregard for those of us with less common sleep schedules. Still, this article (from, ironically, Time) makes a few good points I hadn't considered. A favorite trope of sleep research is to divide the entire human population into two cute, feathered categories: early birds (also called larks) and night owls. Often, these studies link people’s natural sleep patterns—called their chronotype—with some waking behavior or personality trait. Kind of like astrology. Research says that early birds are happier, more punctual, do better in school, and share more conservative morals. Night owls are more impulsive, angry, and likely to become cyberbullies; they have shoddier diets and, most critically, are worse at kicking soccer balls. "Research" done by larks. This is like those studies that conclude cats are psychopaths, which, when you look into the authors, you find they're all dogs. Well, dog people, at least. But can the population really be categorized so neatly? Or is the research painting an incomplete and overly moralistic picture? No. Yes. A study published May 24 in PLOS ONE by a group of Polish researchers takes a fresh look at the long-established link between being an early riser and being conscientious by examining a separate but potentially important variable that might underlie the link: being religious. The team found that people who woke up earlier tended to score higher on all dimensions of religiosity, leading them to conclude that being religious could help explain why early risers are more conscientious and more satisfied overall. In other words: correlation, not causation. Not that religion is any kind of predictor of a person's morality, either. 
“I think most people would recognize that, in reality, [chronotype is] more of a continuous type of variable,” says Brian Gunia, a sleep researcher, professor, and associate dean at Johns Hopkins’ Carey Business School. It exists on a spectrum: not everyone is always one or the other. But so much research uses this binary classification because people are usually able to self-identify that way, Gunia says. And there's that. Life is rarely binary. Is someone who wakes up at 6:59 am a lark, while someone who wakes up at 7 is an owl? I have similar issues with how "generations" are labeled. The bias that people who rise early are morally superior to evening people doesn’t just loom large in scientific research. It’s at the very heart of the U.S.’s founding principles of industry and hard work, says Declan Gilmer, a PhD student at the University of Connecticut who studies workplace psychology. “If someone gets up at 6 a.m., and they show up at work early, they’re viewed potentially as more committed,” he says. Founding principles, my ass. Also, one can show one's commitment to helping one's boss buy a second yacht by staying late. “Some of the better work in the topic area has been trying to identify the genes that are most tightly linked to morningness and eveningness,” he says—genes that, if understood, could open the door to a more nuanced view of the topic. Especially since genes alone can't give you the full picture. There are other factors, including environmental. Very few chronotype studies include information about the time of day during which the research was conducted, but Gunia’s research has found that this seemingly simple factor can change data a fair bit. 
In a 2014 study of chronotype and ethical behavior, for example, “we found that morning people are most ethical in the morning, and evening people are most ethical in the evening, so maybe it’s more of a fit between chronotype and time [of day] than it is this idea that morning people are better or worse,” Gunia says. Studies that don’t take time of day into account “are missing half the equation.” That seems so blindingly obvious to me that I should be surprised no one's thought of it, and yet, I'm not, really. I know if I had to wake up at 6 am to go be a guinea pig in a chronotype study, I'd be grumpy as fuck. Humans don’t always fit neatly into one of two categories, even when it comes to their sleep preferences. Duh. For instance, I'm biphasic. While I tend to go to sleep late, I also sleep most late afternoons. One of the great things about being me is that I'm usually able to go to sleep when I'm tired, and wake up when I'm not. This is the pattern I fell into, but even when I had to simulate being a wormcatcher, my gas tank would run out in the afternoons. I'm not saying this makes me superior. It also doesn't make me inferior. You don’t have to be a morning lark or a night owl. You can be any kind of bird you like—there are plenty of worms to go around. The whole wormcatcher thing always bugged me anyway. Worms? Eugh. That's why I latched onto the alternative phrase: The second mouse gets the cheese. Mmm. Cheese. |
Wrapping up my entries for June's "Journalistic Intentions" [18+]... jeté (This time, I added the accent aigu myself.) Long ago, I took a cinema class in college. I might have mentioned this before. No, it doesn't make me an expert on movies or anything; it was mostly an easy B and a different movie to watch every week. One film that stuck in my head was an artsy French science fiction short called La Jetée. While the words "artsy" and "science fiction" may give some readers pause, it can work. It's also more of a slide show than a "movie," but that's part of the art. The basic thrust (that's a pun, but you might not know it yet) of the film is that a kid witnesses a death on a jetée (which is the French name for the observation platform of an airport, and yes, it's related to the English "jetty"). Later, after the apocalypse, the guy's sent back in time and (spoiler alert) dies on the jetée, realizing at the end that he was the man whose death he'd witnessed in his youth. Does that sound familiar? It should. A few years later, when 12 Monkeys came out, I remember watching it and thinking, "I really hope Gilliam acknowledges La Jetée for this idea." And, indeed, he did. Of course, that synopsis fails to capture the metaphorical breadth and depth of the original film. It really is worth watching, and now that I know more French, I might even try to find a version without English subtitles. As a verb, the French "jeter" means to throw. The dance term, "jeté," has a meaning closer to thrust (also, it's a past participle); hence the really very obscure pun above. Because one doesn't really throw a leg (unless it's a prosthetic); one thrusts it. "Jetée" is a related word (feminine gender) for a structure that's thrust out into something else, like the abovementioned observation platform, or how a jetty is kind of thrust into the ocean. You can see I'm avoiding the more obvious, salacious puns related to gender and thrusting. Why? 
Not because they're salacious, but because they're too obvious for a professional punster such as myself. What I really want to know, though, is this: starting maybe about 15 years ago, a neologism entered the English language. I despise most neologisms, but this one is inherently funny. The word is "yeet," and it's a verb meaning something like "to throw without concern." As in "I watched him yeet the ball right through the asshole's window, then walk away." What I'm wondering is if it's inspired by the French jeter, as the "j" sound does sometimes shift to a "y" sound across languages. There's also some confusion over whether the past tense is "yeeted" or "yote," and I'm rooting for the latter. |
Everyone has an opinion on video games. Most of them are wrong. I'm going to go ahead and note the precious no-caps headline, and then forget about it. A few weeks ago, I interviewed Dr. Rachel Kowert about the new genre of alarmist rhetoric around kids’ pandemic gaming and screen time. As is often the case, I'm behind the curve, here; this article is from early 2021. But the "alarmist rhetoric" has been going on since long before the pandemic; enforced social isolation only changed the narrative a bit. You might not have kids, and you might not spend much time worrying about gaming. I don't, and I'm too busy playing games to worry about them. But you can still recognize that as a society, we often spend a lot of time worrying about how a cultural product is affecting a group of people — kids, teen girls, grown-ass women — and very little time actually talking to the people actually consuming it. Right, because teen boys and men can be safely ignored. The problem, then, is that some people don’t want things to be complicated. They don’t want to hear people talk about why they like things, because if they listen long enough, it will challenge their neat understanding of things that are “good” and “bad” — especially when it comes to children, or teens, or women. I'm going to spend the rest of this entry not being salty about the demographics there. I'll just note that this bit is otherwise pretty insightful, and argues against the pervasive good/bad binary. It's as if the great philosophers Beavis and Butt-Head live in people's minds, where everything is either cool, or sucks, and there's no nuance. One star or five stars, never anything in between. Kids don’t know everything. But they often do know themselves. So I wanted to hear them talk about their own relationship to the games they play: what they like about it, when they like to play, how games make them feel, who they like to play with, and how they respond to anxiety about their gaming/screen time. 
So, most of the article does just that. I won't highlight much from the interviews, but I found them interesting. There's just one caveat: I gave adults a list of potential questions, and then asked them to transcribe answers in as close to their kids’ voice as possible. Some of the answers have been shortened, but none of the wording has. I'd like to believe that, but this is, like, hearsay of hearsay. Some of the "voices" are questionable to me. So I wouldn't take any of them as accurate; some might be flat-out lying. I won't exacerbate that situation by quoting a lot from the actual interviews; just highlight a couple that I found to be interesting, whether it's actually a kid's "voice" or not. (5 year old kid) I hope you really enjoy video games too. They're invented to be really cool. They're invented to be engaging, rewarding, and at least borderline addictive. I say this as a gamer, but I'm also an adult (at least by chronology). (15 year old kid) Gaming is so new that there's no conclusive evidence yet to prove if it's actually harmful. It feels like they’re just trying to control us and tell us what to do. Now this, I can believe coming from a 15 year old. (13 year old kid) People need to make sure they don’t get correlation and causation mixed together. Which is what I've been saying. I'm tempted to not believe he actually said that, but other things in that kid's interview make me think he's rather advanced for a 13 year old. There's a lot more at the link. One final point, though: a lot of these kids focus on the social aspects of gameplay. I prefer playing solo games, myself. Unlike these kids, I followed the evolution of computer games from text-based adventures all the way to near-VR experiences. I've had a lot of good experiences playing multiplayer games, but I also think there's a toxic culture there. Not just online trolls, but the whole conversion of a leisure activity to cutthroat competition. I don't know. 
Maybe you think I don't get to have an opinion on the subject, because I don't have kids. That's your prerogative, of course, but these kids are part of the society I live in, so I do have an opinion. So do they. |
Today's throwback is from a bit over two years ago, and dealt with corporate jargon: "Put a Pin in This Synergy" The link, from NPR, is still there, as of today. No idea what NPR's archiving policies are. Sadly, despite another two years of pan(dem)ic with its much-discussed rise of work-from-home situations, corporate jargon endures. And it's likely morphed again since I wrote that entry, and since the original article's date of November 2020. I wouldn't know, though, since I don't keep up, and articles that unironically include it get X-ed into oblivion very quickly. I did want to touch on a few things I said back then, though: Look, the English language isn't exactly pure as driven snow. Not sure why I didn't mention this at the time (probably because it's irrelevant), but the cliché "pure as driven snow" deserves some attention. Today, when we talk about the verb "drive," from which "driven" derives (see what I did there), we think of cars or trucks. When vehicles kick up clots of snow, they're about as far from pure as one can imagine without being yellow. But the concept of "driving" a car comes from the pre-vehicular practice of driving a team of horses that are pulling, say, a carriage—in which case, "driven" would still not be associated with purity. Apparently, if my sources can be trusted, the verb "drive" has yet an older meaning, referring to windblown snow drifts, which were generally thought of as unsullied. And yes, the word "drift" is associated with that meaning of "drive." All of which is to say that today's corporate jargon is tomorrow's unintelligible cliché. And finally: Being able to complain about language changes is one of the many perks of getting older, along with joint pain and ragging on "kids these days." Which is why I push back when someone uses "literally" as an intensifier, or "decimate" to mean anything other than "remove 1/10th of." |
The penultimate June entry in "Journalistic Intentions" [18+]: Interval training I can't stress enough. ...oh, you want more? Fine. I can't stress enough the importance of training your interval. Wild intervals can be chaotic, unpredictable, noisy, and even dangerous in the wrong situations. In the early days of interval domestication, trainers lost fingers, toes, and even noses to these stubborn beasts. Even today, improper training can lead to catastrophes, such as spilled coffee, punching in late for work, and a bad hair day. So, you may be wondering, what would the proper procedure be for interval training? Well, that's the problem, see: every interval is different. Some differences are nearly undetectable, while others are chasms rivaling the Grand Canyon. You'll need to discover the method that works for you and the interval. It's a long, difficult process, but the rewards can be inconceivable. In the end, you'll find that it was the interval that trained you. |
Yes, this article is basically an ad for books. But it was like catnip for me. Guardian link, so UK spellings ahead. Few people would mistake a wolf for a dog. But if you saw the ancestor of the domestic cat in your backyard, your first thought would likely be “What a cool-looking housecat!” rather than “What’s an African wildcat doing in Manchester?” I'd be like "Hey, when did I get a house in Manchester? Cool! Where's the nearest pub?" What about behaviour, then? Which of the traits we commonly associate with our furry friends are the result of domestication, and which do they share with their wild relatives? This is why the article is interesting to me, though much about cat behavio(u)r remains a mystery—as it should. Let’s start with the classic cat sound. Yes, let's. What I find fascinating is not so much the meow itself, but the different interpretations of the sound across cultures. Oddly, this article spells it meow, when proper British cats go miaow. French cats say miaou—not so different (I suspect, but I'm not sure, that the plural is miaoux). Japanese cats say something like "nyan," which is the source of a popular meme from a few years ago. Most meows in other languages are at least close to onomatopoeic, though. I had always assumed that cats talked to each other by meowing, and that they were just including us in their social circle. However, detailed behavioural observations of unowned groups of cats living in southern England have revealed that they rarely meow among themselves. This appears to be borne out by other observations. I have a cat who almost never vocalizes. She'll purr when you expect her to, but I think I've heard her meow twice in eight years. I got her as an adult, and apparently she wasn't well-socialized with humans as a kitten. You'd never know it now, though, because she doesn't act feral at all. Domestic cats are thought to be like other felines: solitary, aloof and asocial. But that is not always the case. 
When unowned cats occur in large, dense populations – as happens when people provide a lot of food – they do live in groups, composed mainly of related females. Whoever thinks that doesn't know many cats. No, they're not always up in your face like those other animal companions are; to me, this is a feature, not a bug. But when they do want to hang with you, they won't leave you alone. Why, then, do we consider domestic cats to be loners? Remember that the key aspect of lion and domestic cat groups is that they are made up of female relatives. But when multiple cats are brought together in the same house, they often arrive at different times, from different families. Not surprisingly, they frequently don’t get along. My two (female) cats come from different places, and they have, at best, a wary armistice; at worst, a howling fight. Well, one of them howls. The other is silent, as usual. But they both get along well with my housemate's cats. And as for the disturbing claim that your cat would eat you if you died at home and your body weren’t discovered: don’t believe it. Research shows that dogs are the culprit much more frequently. I did want to address this libelous slander. Even if it were true... so what? What do you care? You're dead. And at that point, there is only one way for you to continue to fulfill your contractual obligation to your feline overlord. |
For a city whose very name is synonymous with "joke," Cleveland has had an outsize influence on pop culture. Rock and roll, for example, was named there, and the city is home to that Hall of Fame. Science fiction icon Harlan Ellison was born there. And then there were these dudes. How Two Jewish Kids in 1930s Cleveland Altered the Course of American Pop Culture On Jerry Siegel, Joe Shuster, and the Birth of Superman So. LitHub isn't all about highbrow art. Good to know. In a small attic bedroom in Cleveland, in the Jewish neighborhood of Glenville, Jerry Siegel tried to sleep. It wasn’t the summer heat that was keeping him awake nor his snoring older brother Leo snoozing noisily beside him. Twisting and turning, Jerry had a new idea for a story in his head. Because when I think of Cleveland, I think about how hot it gets in the summer. But this was before A/C, I suppose. Jerry, a nerd with glasses, had had few friends at Glenville High—ignored not just by the girls but the boys, too. Starting to sound familiar. Thanks to magazines, like Amazing Stories, that Harry brought home, Jerry discovered a new genre called science fiction. Point of order: Science fiction was already over 100 years old at that point. But, okay, the pulps really made it explode into American consciousness, for better and for worse. Science fiction magazines would change Jerry’s life—though not for the better. Just as generations of parents would warn their children. One of his favorite books, Gladiator, told the tale of a man with superhuman strength who could run faster than a train and jump higher than a house. Another favorite character was Doc Savage, a pulp magazine hero whose first name was Clark and who was known as “The Man of Bronze.” I've heard that Stan Lee considered Doc Savage a kind of forerunner to comic book superheroes. That adventure genre is still around, and still a huge force in movies and TV. 
While LitHub here follows the New Yorker School of Not Getting to the Fucking Point, eventually, they get to the point: Now, on that summer night when he couldn’t sleep, Jerry, twenty-one and unemployed, finally got up, put on his glasses, slipped into the bathroom so as not to wake his brother, and started writing. He went back to bed, then threw off the covers after a couple of hours and wrote some more. By dawn, he had a complete script. He got dressed and, story in hand, took the porch steps at a gallop. I don't know how historically accurate this really incredibly long opening anecdote is, but it does track with what I know about the writing process. Huffing and puffing, Jerry arrived at the dilapidated two-story Maple Apartments that Joe and the Shuster family called home. “Joe, you gotta draw this,” he said, waking him up, thrusting the script beneath his blinking eyes. Parallels to Archimedes? Really? The first drawing of Joe’s that Jerry saw was one that Joe had saved from 1928, inspired by the new Fritz Lang movie Metropolis, which had blown both of their young, nerdy minds. The movie’s lesson—the worker being kept down by the demented, greedy capitalist—was perhaps lost on them, though it was a harbinger of things to come. In case you were wondering if Superman's Metropolis had any connection to the eponymous movie. The pair took inspiration from a new art form called comic books, particularly the ones that editor Max Gaines started putting together in 1933, repackaging previously published Sunday comic strips into a separate booklet with a colorful cover and saddle stitches. Funnies on Parade and Famous Funnies were given away as a promotion to buyers of Procter & Gamble products, who had to clip coupons and mail them in for a copy. The newly packaged funnies proved so popular that Gaines decided to sell them for ten cents apiece. 
It can be confusing, I think, to refer to the superhero genre as "comic books," when not all of them are what we think of as "comic." This origin story might help. Jerry suggested Joe put an S on Superman’s chest—not just for Superman’s name, but for Shuster and Siegel—and a cape on his back that would whip around, one of the few ways for Joe to show dramatic motion. And this, in case you were wondering why the cape became such a standard thing in superhero comics. It would be over sixty years before "No capes!" became a catchphrase in a superhero movie. Why would you need a cape to demonstrate action in a movie? Superman would be an alien with super strength whose real name was Kal-El (“Voice of God” in Hebrew). Kal-El came to Earth as an abandoned baby, much like Moses in the Old Testament, just as his planet, Krypton, was destroyed. Like Jerry, his father would die while he was still young. Kal-El would be adopted by a couple of Gentiles and renamed Clark Kent, a name Jerry took from Clark Gable and B-movie actor Kent Taylor. By day, Superman was a mild-mannered goy with glasses ( just like Jerry’s and Joe’s). He would live, naturally, in a city called Metropolis. Most telling, I think, isn't the specific cultural origins of Superman, but that he was an immigrant. There's a lot more at the article, of course. And whatever you think of the character—appropriately, he's had his ups and downs over the years—it's not often that we get to see the origin story of an entirely new literary genre. Famously, Siegel and Shuster got boned while their creation became a worldwide phenomenon. But in the end, as is usually the case, the nerds came out the winners. |
Summer solstice today, for those of us in the One True Hemisphere (so of course it's cloudy all day here), and another entry for "Journalistic Intentions" [18+] that has nothing to do with axial tilt: Pas de cheval Those are all really simple French words, and they're once again in the Dance section of this month's prompt list, but I'm at a loss when it comes to understanding how it could have anything to do with dancing. So, in honor of nothing in particular, I'm going to do something different this time, and look it up. Hey, look! It's a song by Panic! At The Disco You know, I probably should have given that band a fair shot. But once, long ago, someone sent me a link to a song of theirs that I hated. This one's actually pretty good. It is, however, one of those songs where the title doesn't show up anywhere in the lyrics. Like Baba O'Riley or Bohemian Rhapsody. Which means I still have no idea what the phrase is all about. ...Oh. Nothing to do with the French negative, but a step. Horse step. Step of horse. I should have guessed that. You know, sometimes, words have two meanings (that's a line from Stairway to Heaven, a song whose title, unlike those above, does appear in its own lyrics). I wonder if we got "paw" from "pas," though, or vice-versa, so I'm taking another detour. Nope. You know what English word it is related to, though? Pace. And now things are starting to make sense, which is weird, because most things, once you really look at them, don't make sense. Sense is, in fact, overrated. And the word has at least two meanings. Dance is, ultimately, about movement. But you know what solstice means? Literally, "sun stopped." Kind of the opposite of movement. Wow, so I was able to relate the subject to today's astronomical significance after all. Sure, it was a stretch. But one needs to stretch if one is to practice ballet. |
As sometimes happens, we get two Cracked links in a row. This one deals with two of my favorite subjects: science and food. Not that I'm a big fan of breakfast cereal. I don't use milk enough to keep any in the house, so I mostly just eat it when I'm at a cheap motel whose idea of a free breakfast is a few stale muffins and some prepackaged cereal, along with some milk kept in questionable conditions. Nor does this article delve into the interesting history of mass-produced breakfast cereals. It's a fascinating combination of such core American values as puritanism, processed food, too much sugar, marketing, convenience, and profits. Maybe another time. 4. Derivatives of Position and the Rice Krispies Gnomes Gnomes? Huh. For some reason, I always thought they were elves. I guess Keebler had the monopoly on the latter. And yes, there's a difference; ask any FRPG player and they'll tell you. At length. Derivatives of position are a complex series of metrics within physics to allow for as accurate a description of movement as possible... And half the readers just closed their browsers in abject terror. It's not all that complex, really. First derivative of position is velocity; second is acceleration. I've mentioned this before. Those are things we're all familiar with on a daily basis. If you're pulling away from a stoplight, your car has position, velocity, and acceleration. Where things get wonky is that acceleration can be uneven, as well, so you get the next derivative, called jerk. It’s the fourth, fifth and sixth that start to get breakfasty — they’re called snap, crackle and pop. These measurements — strictly speaking, “derivatives of the position vector with respect to time” — owe their names to someone simply trying to make themselves laugh. Scientists have senses of humor, too. Why else would they call the bony spiky thing at the end of certain dinosaurs' tails a thagomizer? 3.
The Complexities of Fluid Dynamics and the Cheerios Effect Cheerios are inherently interesting because they're tiny toroids. I call them bagel seeds. You know how, in a bowl with only a few Cheerios left, they tend to group together? Physicists have noticed this, naming it the “Cheerios effect,” and investigated its potential ramifications within the field of fluid dynamics. Huh. I always figured that was related to surface tension and/or slight imbalances in electrostatic charge. But I never cared enough to investigate; after all, having them clump together is a feature at breakfast, not a bug. “The Cheerios effect arises from the interaction of gravity and surface tension — the tendency of molecules on the surface of a liquid to stick together, forming a thin film across the surface. Small objects like Cheerios aren’t heavy enough to break the surface tension of milk, so they float. Their weight, however, does create a small dent in the surface film. When one Cheerio dent gets close enough to another, they fall into each other, merging their dents and eventually forming clusters on the milk's surface.” I was half right. Yay. 2. The Tomography of Why the Big Pieces Go to the Top This never struck me as all that puzzling. Not if you invert the assertion: that the small pieces go to the bottom. You get breakage, and smaller pieces are more likely to slip through the voids. But again, I'm not actually a scientist. We’ve all been there — you pull the inner bag out of a new box of cereal to open it, and all the raisins (if you’re healthy) or marshmallows (if you’re not) are up at the top, like the head on a glass of beer but more infuriating. I think some people would call that a feature, not a bug, perhaps because all they can see is the immediate effect (more marshmallows, yay!) and don't think about the future (a bleak one indeed, featuring all the boring parts of the Lucky Charms). 
It's kind of like those perverts who don't shake the orange juice and get a nice smooth beverage, leaving extra pulp for the next family member. “This will allow us to better design industrial equipment to minimize size segregation thus leading to more uniform mixtures. This is critical to many industries, for instance ensuring an even distribution of active ingredients in medicinal tablets, but also in food processing, mining and construction.” See? It's not just a theoretical consideration. 1. Compressing Cocoa Puffs to Save Future Skiers This is really irrelevant, but I never liked Cocoa Puffs. Or anything described as "chocolatey" instead of "chocolate." They saw vastly more complexity in how the cereal was deformed (a science word, not an offensive one) than anyone was expecting — three different types of velocity-dependent deformation and a propagating compaction band recorded visually for the first time in granular matter. There goes another half of readers. They also tested Cocoa Puffs and Cocoa Krispies in the compressor, to see what difference chocolate made. It takes more pressure to crush some cereal if it has a chocolate coating. It. Is. Not. Chocolate. Maybe one day, science will be able to tell the difference the way my taste buds can. |
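Back to the Rice Krispies item for a moment: the derivative chain is easy to play with numerically. Here's a minimal sketch (my own code, not the article's), using simple forward differences and an assumed unit time step:

```python
# A minimal sketch: successive finite differences of a position
# series give velocity, acceleration, jerk, then snap, crackle
# and pop. Time step is assumed to be 1 for simplicity.
def derivative(series, dt=1.0):
    # Forward difference between consecutive samples.
    return [(b - a) / dt for a, b in zip(series, series[1:])]

# Position sampled from x(t) = t**3, so the discrete jerk is a
# constant 6 and everything past it is zero.
position = [t**3 for t in range(8)]
names = ["velocity", "acceleration", "jerk", "snap", "crackle", "pop"]
series = position
for name in names:
    series = derivative(series)
    print(name, series)
```

With position following t cubed, jerk comes out constant and snap, crackle, and pop are all zero; you need pretty exotic motion before the breakfast derivatives wake up, which is part of why they rarely appear outside of cereal jokes.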
When it comes to invention, some of us can be pretty clever. Well, not me. But some of us. From Cracked, so the usual countdown list and jokes. An icon of a floppy disk represents saving in Word documents. That’s even though floppies have been almost nonexistent for years, and haven’t been the primary place for saving docs in decades. And? Some "hang on a sec" computer icons display an hourglass, which is even more obsolete. “Kids today,” more than one commentator has speculated, “probably don’t even know what a floppy disk is. They just think that’s the ‘save symbol,’ and it doesn’t stand for anything in real life!” And those ignorant bastards also think # (technically called an octothorpe, which I mention because that's an awesome name) means something other than "number" or "pound." 5. Before Trains, Horses Pulled Goods Through Water Okay, I can kind of excuse being ignorant about obsolete technology. What I can't excuse is being ignorant about music, and if you weren't exposed to "Low Bridge, Everybody Down" as a kid, your teachers failed you. In England, people were using these horse barges as far back as Roman times, and the practice came back during the Industrial Revolution, once we finally had a lot to transport. In the 1790s, one single canal eased transport so much that the price of coal halved. The country approved dozens more canals in the years immediately thereafter, and by the middle of the next century, they measured 5,000 miles in length. The canal system in England is fascinating, and we don't hear much about it here in the US. 4. Before Plastic, We Mixed Sawdust With Blood Okay, some older technologies won't be missed. Though with current whingeing about plastics, perhaps this one will stage a comeback tour. Hemacite sounds like some sort of mineral. Yeah, that's because there is a mineral called hematite. But as all you vampires and fans of etymology know, hema- means blood. As does hemo- but not hemi- or homo-. 
People manufactured hemacite to deal with a pressing problem: a surplus of blood. Slaughterhouses produced blood in huge quantities, and people had limited success using the sticky stuff in fertilizer. Apparently, that's not as big an issue anymore. So if there's a resurgence in hemacite, maybe we can use blood from organ donors. Blood's an organ, right? 3. Pre-Satellites, We Used to Bounce Signals Off the Moon Scientists first used a satellite to relay human communications in 1954. This milestone will raise some eyebrows from most students of space history, who know that Sputnik didn’t launch until three years later. Saves me from quibbling about how the moon is a satellite. The military stopped using the “Moon bounce” in the 1960s, now that they had communications satellites. But amateur radio operators can still totally try EME themselves today. It’s not easy, but you can do it. May as well get to practicing now, so you’ll be prepared once a catastrophe knocks out all satellite communications and you need to send a message of hope to all survivors worldwide. Except the moon needs to be in the sky for this to work, so at best, you'd only catch about half the globe. Unless, of course, the flat-earthers are right [spoiler: they are not]. 2. Letterlocking: Envelopes Before Envelopes This bit is actually pretty cool, and much less gross than hemacite. I'd been seeing various articles about it recently. I don't know why; maybe because it's more secure than Twatter? The most secure way to send a message is physically. Seal it in an envelope, and the spies who pursue you will not easily be able to open it without leaving some evidence behind. Because we're all being spied upon, even outside the internet. 1. People Had to Listen Really Closely for Planes And I can't really do this one justice; you have to see the illustrative photographs. So, not really a lot of deep meaning in today's article. Just some cool history stuff. |
It's Time Machine Day, as looking at random posts from the past is what I usually do on Sundays. So I did my usual random number generation, which pulled up this post from April of 2010: "Nothing's unforgiven in the four corners of Hell" Which was, essentially, content-free. Sorry about that. Not that I'm above linking videos, but I generally do it as a bonus, not as basically the entire entry. I remember going to that concert, and I remember I enjoyed it, but apparently I never said another word about it in the blog. It's been many years since I went to see a concert. This is not because I'm old or introverted; that's never stopped me before. It's also not because of COVID; I quit some time before that mess. No, it's because of two things: 1) Live Nation / Ticketmaster. "Oh, look, concert tickets are only $125." *goes through the process* "Those tickets just cost me $250." 2) I flat-out refuse to pay for any events at venues named after corporations. On the first point, if the up-front price had been $250, I might very well have paid it without complaint. But no, they have to add on service charges, convenience fees, venue fees, venue naming rights fees, taxes, the CEO's daughter needs a big wedding fees, overdraft fees, and fuck-you-I'm-a-monopoly charges. They're worse than banks or airlines at this bullshit, and that's saying a lot. I did read an article recently about how they might stop these deceitful practices, but as far as I'm concerned, the damage has already been done. Screw them. As for point 2, with all the above excess hidden fees, as well as at-venue advertising and overpriced concessions, you'd think venues would make enough money to keep their names. But no, there's no such thing as "enough money," so they whore out their very identity to some faceless corporation.
One day, if it hasn't been done already (I don't have the intestinal fortitude to look it up) there will be a Live Nation Arena, and hopefully a black hole of corporate greed will form underneath it and suck it into oblivion (while empty, of course; I'm not a monster). So anyway, that Flogging Molly concert wasn't the last live show I ever attended. I think that was Brandi Carlile sometime in the mid-tens. A fine way to end a hobby I used to love. |
Today's article, from Atlas Obscura, covers a subject of major world importance. Everything You Need to Know About the True Origins of the Everything Bagel There’s a lot of history in every bite. Somehow (I think due to well-meaning gift-givers), I have not one, not two, but three containers of Everything Bagel Seasoning on my spice rack. I have never, not even once, used any. When I think about what to put them on, I'm always at a loss. Other breads? Somehow, that just doesn't work. Vegetables? Blasphemy. Meats? Heresy. And when I want an actual everything bagel, there are lots of bakeries around here that prepare them; some of them are even good. The everything bagel is the king of bagels. On this there should be no argument. With that kind of diversity, I think we're past hierarchical power structures. In the same way that it combines all of the key bagel toppings—sesame and poppy seeds, dried garlic and onion, and coarse salt—it’s also a combination of ancient traditions and new fads, Eastern ingredients and Western techniques. Not to mention its topologically interesting toroidal shape. With cream cheese and lox, it creates, more or less, the perfect bite. Dammit! Now I'm hungry. And I just had an everything bagel with lox and cream cheese yesterday. There are, however, arguments about who invented the everything bagel, and none of them are particularly compelling. This seems to be a trend in food (and drink). Several New Yorkers have staked their claims as its inventor... One thing we can say with some certainty is that it had to have been introduced in New York City. Let’s be honest, it’s probably not possible to have “invented” the concept of putting several different existing bagel toppings on a bagel. I imagine this is akin to putting pepperoni, pineapple, crab meat, ghost peppers, and figs on a pizza and claiming to have "invented" it. ...if there are five popular bagel toppings, it is fairly obvious to make a bagel with all of those ingredients. 
That’s not invention. Maybe not, but an everything bagel is obviously superior to my purely imaginary pizza creation up there. But there is one element of the everything bagel that is invention, and that’s the name. “Everything” is the accepted name for a fairly specific combination of toppings: It is not a “combo bagel” or a “spice-lover’s bagel” or, as the Canadians might call it, an “all-dressed bagel.” It is an everything bagel, and someone had to come up with that piece of clear, descriptive branding. Especially since it's clearly not "everything." By his own and most other accounts, that person was David Gussin. Around 1979 or 1980, he says, he was a teenager working at Charlie’s Bagels in the Howard Beach neighborhood of Queens, New York. If you're curious, that neighborhood is located very close to JFK Airport. He was doing typical teenage job stuff: cleaning, working the counter—and cleaning the oven, where excess bagel toppings accumulated when they fell off. “One day instead of throwing them out like I usually did, I gave them to Charlie and said, ‘Hey, make a bagel with these, we’ll call it the everything bagel.’ It wasn’t that big of a deal; we weren’t looking to make the next big bagel. Charlie was probably more interested in what horses he was going to bet on.” It's also not far (unless there's traffic, which there always is) from Belmont Park, where they race horses. Soon, a shop across the street started selling their own everything bagels, and word slowly spread. If you're skeptical about having two bagel places across the street from each other without one of them going out of business, well, all I can say is: Welcome to Queens. But there’s more to this story. What exactly is an everything bagel? And more importantly, why did it catch on? I think we know the answers. This article has already described the "what," and the "why" is self-evident: it is delicious. The article proceeds, regardless. 
In 2009, I moved to San Francisco after spending my entire life in the Northeast. At the coffee shop in my new neighborhood, I ordered an everything bagel. It came with sunflower seeds on it. I appreciate California, but they cannot do bagels. Or pizza. People have strong feelings about the right and wrong ways to prepare and consume certain foods, particularly beloved or traditional ones. Really? I hadn't noticed. I won't rehash the article's short description of the evolutionary origins of the bagel, but it's worth reading. In New York, bagels first gained widespread attention thanks to the sometimes vicious bagel strikes of the 1940s and 1950s. I'd heard rumors about these before. Dark times indeed. Glad I hadn't been born yet. It wasn’t until the 1960s that the bagel went national, thanks to a few innovations: machine-rolling, freezing, and pre-slicing. The Lender’s bagel combined all three of those processing conveniences and blew up the entire bagel industry. Suddenly the entire country was awash in bagels—and not particularly good ones. I want to emphasize that last bit. Lender's bagels are edible, and better than the abominations you can get at McDonald's, but that's the best I can say about them. The room-temperature variety is marginally better than the frozen one, but I don't know what preservatives they use to make that possible. I'd wager the originals were pretty good, as Mr. Lender was a Jewish immigrant from Poland. But now it's just another processed, mass-produced food product. Back to the point of the article, which is the toppings. I won't go into that, but it's enlightening. To summarize, the toppings came from many and varied locations, as befits a thing invented in New York City. So who invented the everything bagel? An entire culinary tradition spanning continents and thousands of years. That, and David Gussin, since no one called it an everything bagel before he did. Probably. An actual new invention is rare.
But sometimes, even an improvement on an existing invention can change the world. |
I probably overuse parentheses (but I don't care). "Best" in this case (and so many others) is entirely subjective, but okay. I love a good aside. I live for literary intrusion. I want comments on my comments, discursive thinking, footnotes. Footnotes can be a massive pain in the ass, especially in traditionally printed works without benefit of hyperlinks. What I’m saying is this: I can’t get enough parentheticals. So on the watery occasion of it being parenthesis-master Vladimir Nabokov’s birthday (April 22), I have collected a few of my favorites... Like many articles I feature, this one's a bit old, over two years. But it's unlikely any truly great parentheticals have been introduced since then (especially here in my blog). Vladimir Nabokov We know this guy was a great writer, at least by lit-fic genre standards, but the article author's explanation of why the particular excerpt achieved greatness is (in my opinion) worth reading. Virginia Woolf No. Elizabeth Bishop Who? ...seriously, though, another good analysis, which I think would be more clear to me had I read the entire poem (which I haven't, and won't). Jamaica Kincaid To be honest with you, I could have chosen almost any sentence from A Small Place for this list. The book positively bristles with parentheses; Kincaid uses them to explain, to criticize, to condescend, to name, to emphasize, to speak the unspoken, to distance herself and at the same time, to implicate herself—reflecting her dual relationship to Antigua (both native and non-resident) and to the text. Well, that's a ringing endorsement for not reading Kincaid. Just because I overuse parentheses doesn't mean I tolerate it when other writers do. I also have a problem with repeating words too often, and it grates on my very last nerve when other writers do that, too. e.e. cummings ah,yes—the verysoul of pretentiousness Okay, yes, I'm biased; I'd rather slog through 10 bad science fiction books than one snobby lit-fic tome.
But I think I get what the article is saying about parentheses, and I think we can all benefit from (at the very least) reading this article. |
Time for another entry for "Journalistic Intentions" [18+]: Strength Interesting word, strength. You don't get too many clusters of four consonants in English; twelfth and angst provide two other examples. Extend them to plurals and you can get five consonants in a row. Together with the three-consonant initial cluster in "strength," you get a strange word indeed: seven consonants with one vowel. There's apparently an archaic word, strengthed, which holds the record, but no one uses that anymore. Apparently we started using "strengthened" instead, giving it two syllables. Some say that the word "rhythm" contains the longest consonant cluster in English, but as the "y" acts as a vowel there, I disagree. I'd also argue that the "-thm" cluster there contains a hidden schwa. We use these words often enough that their oddities barely register. As for the concept of strength itself, well, you could say the word does some heavy lifting. That is, you could say it if you wanted people to yeet tomatoes at you. In its most common usage, it refers to a person's physical power, as in how much weight you can curl or press or whatever. It can also refer to mental fortitude, though; and it's not always about muscular or mental power, but often it refers to the amount of stress an inanimate object can take before it snaps. The dictionary definitions of strength generally mention that it's "the quality of being strong" or some such. Which, to me, illustrates one of the problems with dictionaries; you have to also look up "strong." Any language (as opposed to translation) dictionary is entirely recursive. What does recursive mean? Let's check the Waltz Dictionary: Recursive (adj.) see recursive It takes a great deal of mental strength to deal with English. It's no wonder so many of us resort to glossolalia. |
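For the truly curious, the consonant-cluster trivia in the Strength entry is easy to check mechanically. A quick sketch (my own code, and note it treats "y" as a vowel, per the argument about "rhythm"):

```python
# Count the longest run of consecutive consonant letters in a word.
# "y" is counted as a vowel here, which is a deliberate choice
# matching the argument about "rhythm" above.
def longest_consonant_run(word):
    vowels = set("aeiouy")
    best = run = 0
    for ch in word.lower():
        if ch.isalpha() and ch not in vowels:
            run += 1
            best = max(best, run)
        else:
            run = 0
    return best

for w in ["strength", "twelfths", "angst", "rhythm"]:
    print(w, longest_consonant_run(w))
```

Sure enough: four in a row in "strength" and "angst," five in "twelfths," and only three in "rhythm" once you give "y" its vowel credentials.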
Not even two weeks ago, I mentioned the writer Cormac McCarthy in a quoted passage. Now I hear he's passed on. So I've been racking my brain to come up with another name to drop here, to test my newfound powers. Sadly—or not, depending on your perspective—I can't think of anyone I'd actively wish death upon. Not even that politician you're thinking of right now. As Clarence Darrow (not Mark Twain) once noted, "I have never killed any one, but I have read some obituary notices with great satisfaction." To clarify, I didn't hate McCarthy (at least not that McCarthy), but he had little impact on my life. I never read his books or, to my recollection, saw any of the movie adaptations. So I had no reason to wish him dead, or to mourn his passing any more than that of any other individual. But I admit to great satisfaction upon hearing of the death of Pat Robertson last week. (Segue!) Today's article, from Cracked, has nothing to do with death, but does touch on the subject of commonly misattributed quotes. When you were young, you learned about how George Washington cut down a cherry tree, Ben Franklin flew a kite in a lightning storm and an apple hit Isaac Newton on the head. True or not (they are not, except maybe, to some extent, the one about Ben), those are "myths" in the original sense: foundational stories, kin to, but much more recent than, stories about Olympians or Asgardians. They reveal more about us than about the original subjects. You cursed your kindergarten teacher for feeding you those myths and gleefully accepted a new teacher — the internet. For me, that took a while, as the internet as we know it was invented maybe 20 years after I was in kindergarten. But obviously the joke here is that there's more misinformation on the internet than in a whole school full of urban-legend-spreading children. 5.
Myth: Marie Curie’s Books Are Now Stored in Lead The Story: ...Her notebooks will be radioactive for another 1,000 years, and the French National Library (Bibliothèque Nationale de France, or BNF) stores them in lead boxes. Setting aside that we know from the setup that this is a myth, it's not like radioactivity happens for a set amount of time and then suddenly disappears. Almost everything is radioactive to some degree, and the activity fades gradually over time. The question is when it's safe enough to handle, and for how long. We were unable to visit Paris this week, but we reached out to the Bibliothèque Nationale de France. They asked exactly which publication we represented. The answer must not have impressed them because upon hearing it, they ceased replying. Probably with a muttered "putains américains." Vairon found the lead idea improbable, but he dutifully checked in with BNF, who confirmed that, no, the books are not stored in a lead-lined container, just in plastic, and they’re kept with the other precious archives. Coincidentally, that's exactly how my comic books are stored. Other people may be getting the fact from Wikipedia, which cites 2005’s A Short History of Nearly Everything by Bill Bryson, who is best known for writing funny travel books. Here, the name does give you a good idea of what it is: It’s a pop-sci book with too great a scope to be an authoritative source on anything. You shouldn’t cite it, for the same reason that you shouldn’t cite Cracked listicles or Wikipedia; you should look to their sources, if they have any. Which is what I've been saying. Yes, I do link to Wikipedia on occasion, but it's not like I'm writing scholarly articles here. As for Bill Bryson, I seem to be the only person on the planet who doesn't find him especially funny. 4. Ernest Hemingway Never Wrote ‘For Sale: Baby Shoes, Never Worn’ In fairness, this does sound like something Hemingway could have written.
But as the article notes, he did not (or if he did, he plagiarized it). See also my comment above regarding a quote misattributed to the actual American humorist, Mark Twain. 3. Alfred Nobel’s Inspiration for the Nobel Prize Well, you see, Nobel lived next to a church and got tired of the bells ringing when he was trying to take a nap. So when someone finally removed the bell's clapper, Alfred awarded that person the first No-Bell Peace Prize. Yes, I just made that up. Fortunately, we're in no danger of that joke going viral. Anyway, to summarize this section, people think Nobel founded the Prizes in a fit of remorse after inventing something that blows shit up, but that turns out not to be the case. 2. We Have No Proof Alan Turing Killed Himself The Story: Alan Turing saved the world by cracking the Nazis’ Enigma code, and we did him dirty in return. After the war, Britain prosecuted him for homosexuality and forced him into chemical castration. That’s a terrible way to live. Turing ended up killing himself, in a fairy-tale manner: He injected cyanide into an apple then took a bite from the poison apple and died. As the article notes, there is no proof of this. But as apples have figured prominently in many much older myths and stories (see the above bit about Newton, e.g.), it's easy to understand how this story would have staying power. 1. Franz Ferdinand’s Assassin Never Stopped for a Sandwich This one is maybe a bit obscure unless you're a WWI history buff. Most everyone knows about the assassination itself, but the sandwich detail is the possibly obscure part. Regardless of details, though, we humans love a neat, tidy origin story. WWI started with the assassination of Ferdinand. The Trojan War happened because Helen batted her long, silky eyelashes at some Greek dude. The American Revolution started when some colonists polluted Boston Harbor. WWII was a direct result of the Germans bombing a different Harbor. That sort of thing. 
In reality, wars usually have complex causes that take actual work to tease out. The Story: Gavrilo Princip managed to shoot Ferdinand on June 28, 1914, thanks to a sandwich. Earlier plans to assassinate the guy didn’t go so well, so Princip ducked into a café for a quick bite. By amazing coincidence, Ferdinand’s revised motorcade route passed right outside this café. Princip dashed outside and took his shot, resulting in World War I, and all of the world history that followed. As the article also notes while dispelling this enduring myth: Never mind whether, if Princip missed that shot, the group backing him would have pulled off the assassination on some other day, or whether some other event would have sent all the dominoes tumbling regardless. I just have to say one more thing about this, to debunk another myth: The sandwich was invented in the 18th century, and by 1914, plenty of people in different parts of the world ate sandwiches, but not in Sarajevo. The sandwich was not invented in the 18th century. The idea of putting other food into bread and eating it together goes back at least to Roman times, and probably earlier. Hell, the sage Hillel reportedly had the idea back in the first century BCE, and I seriously doubt he invented it. That Earl of Sandwich story? Great story, and maybe it's even true (though I'm somewhat disappointed that it was the Earl of Sandwich and not Lord Penistone, which would have given great humor to the question, "Is a hot dog a penistone?"). Sliced bread, as we know it, wasn't invented until the 20th century, as I noted in "The Greatest Thing Since...". Before then, people had to actually use a knife or break the bread with their hands. Saying that Montagu, Earl of Sandwich, invented the food is a bit like saying that Henry Ford invented the automobile—it's more like Ford invented the assembly line that made cars cheaper, thereby popularizing them.
We all have to watch out for misinformation, especially when it sounds like it could be true. Such as the false definition of the Blue Moon, which I'm bringing up again (I hear you groaning) because we're scheduled to have another false Blue Moon in this coming August; the actual next Blue Moon is in August of next year. I'm not entirely immune to myth, myself. But at least I try to switch to the facts whenever I find them. |
Today's article, from Slate, is almost a year old. Still relevant, though, apart from a few details. Why the Myth of the Miserable Lottery Winner Just Won’t Die Actually, most people do just fine with their millions of fresh new dollars. On Friday night, Mega Millions held a drawing for a whopping $1.28 billion jackpot, the third largest in American history. That's the no-longer-relevant detail. Other sources put it at 1.337 billion, which I find amusing because 1337 is computer nerd code for "leet," meaning elite, meaning awesome. Also, it turned out that the ticket was redeemed anonymously, apparently by a two-person partnership. Some lotteries let you do that. Should you ever win the lottery, I highly recommend going that route, tempting as it may be to shout your good fortune from the rooftops. One in eight American adults play the lottery at least once a week, and almost half buy at least one ticket a year. I'm a gambler, but I don't buy lottery tickets. Because I do play other games of chance, I don't heap scorn upon those who do. Everyone has their thing. Unfortunately, even if you bought your annual ticket sometime in the last few days, that winner is probably not going to be you—unless you happen to be that lucky Illinoisan. Obviously, this article was written before the anonymous partnership stepped forward to claim the winnings. Maybe that’s okay, you tell yourself. Don’t lottery winners end up broke and miserable? It would be great to be rich, but I don’t need $1.1 billion. No one "needs" $1.1 billion. I wouldn't turn it down, though. However, the risk/return ratio on a lottery is too high for my taste. Except most lottery winners do not wind up broke, or miserable, or bankrupt. This is the important part of the article, as evidenced by the headlines. Lotteries have been characterized as a tax on people who are bad at math. I can't fully disagree with that, though it's not an absolute certainty that if you play the lottery, you're bad at math. 
What is probably true is that if you were bad with money before you win the lottery, you'll be bad with it afterward, as well. Money fixes a lot of things, but carelessness isn't one of them. However, in this particular case, everything about the outcome (as reported in the second link above) signals that the winners made good decisions:
- Forming a partnership, which most likely clearly spelled out who would get how much of the winnings;
- Remaining anonymous;
- "Working with professional legal and financial advisors to support the claim process."
Stories about regretful lottery winners are trotted out whenever jackpots get big. But as much as jealous losing bettors might want to think that winners’ unfathomably good luck is balanced out by bad, most people who strike it rich this way settle into lives of quiet, comfortable anonymity. You only hear about the bad-luckers, much as you generally only hear about the few planes that crash, as opposed to the millions that take off and land without major incidents. And yet, the myth of the miserable lottery winner persists. The history of this myth reveals a longstanding national discomfort with gambling, and exposes deep-seated cultural beliefs about the connection between wealth, work, and merit. And as with plane crashes, this sort of thing does happen, and with far more frequency than plane crashes. But, again, it's hardly a certainty. From the 18th through the 20th century, newspapers in the United States recounted the misfortunes of lottery winners from across the globe: A baker and his pregnant wife murdered for his winnings by an employee (Paris, France, 1765). A squandered jackpot invested in a failed shipping venture (Newburyport, Massachusetts, 1883). A winner dying of a heart attack immediately upon hearing news of his windfall (Bilbao, Spain, 1934). Isn't that ironic? University at Buffalo sociologist H.
Roy Kaplan interviewed around 100 early lottery winners in the mid-1970s and found most of them happy, despite these challenges. Nonetheless, a narrative was born.

I can't really blame journalists, though. It's a well-known truism that bad news sells better than good. "Man wins lottery, gets robbed" is inherently more interesting than "Man wins lottery, keeps working, dies happy."

Abraham Shakespeare was killed by an acquaintance three years after winning $30 million in 2006.

I'm just including this quote because that dude had an awesome name, worthy of a $30 million win.

As an aside, yes, I know that the actual payouts are less than the stated payouts. The advertised $30 million (or $1.337 billion, or whatever) is the total of an annuity paid out over a couple of decades; one usually has the option to take it as a lump sum up front, but the lump sum is only the present value of that annuity, so it's considerably smaller; and taxes (at least in the US) are automatically withheld at delivery. But the stated payout numbers are still relevant, because they're a way to compare the different jackpot sizes directly. Kind of like how a window sticker announces a car's gas mileage at the dealer, but no one expects to actually get those numbers on the road.

Their stories are repeated so often not because they are representative but because they are some of only a few examples of regretful winners. The vast majority of jackpot recipients collect their novelty checks at press conferences and are never heard from again.

That may not be the best phrasing. "Never heard from again" can also imply someone's gone missing or dead.

Research into winners in Germany, Singapore, and Britain found that winning the lottery does, in fact, make people happier, and a 2004 study found that 85.5 percent of winners in Ohio kept working, a sign of how many carried on with their normal, pre-jackpot lives.

Some people actually like working. Not me. But some people.
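The annuity-versus-lump-sum aside above can be sketched numerically. This is a toy model, assuming equal annual installments (the first one immediate) and a 5% discount rate I picked for illustration; real lotteries set their own payment schedules and rates.

```python
# Toy model of the advertised-jackpot-vs-lump-sum gap.
# Equal annual installments and a 5% discount rate are assumptions
# for illustration, not any real lottery's actual terms.

def lump_sum_value(advertised_jackpot, years, rate):
    """Discount each annual installment back to today and add them up:
    the present value, which is roughly what the 'cash option' pays."""
    payment = advertised_jackpot / years
    return sum(payment / (1 + rate) ** t for t in range(years))

# A "$30 million" jackpot paid over 20 years, discounted at 5%,
# is worth roughly $19.6 million up front (before taxes).
cash_option = lump_sum_value(30_000_000, 20, 0.05)
```

At a zero discount rate the lump sum would equal the advertised total, which is why the headline number is only meaningful as a relative comparison between jackpots, much like that window-sticker gas mileage.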
I knew a surveyor who had won a lottery; the payout in his case was "only" about a million bucks, but this was back when a million bucks meant something.

Money, it seems, really can buy happiness.

I joke a lot about how money can't buy happiness, but it can buy beer, and that's good enough for me. But on a more serious note, the way that old cliché is phrased is a problem: it implies that you trade money for happiness (because when you buy something, you trade money for it). That is rarely the case. On the other hand, having money can result in peace of mind and contentment, which is basically happiness.

Why, despite all the available evidence, does the myth persist? What does it mean that this narrative is believed so widely?

The article goes on to answer those questions, though not in as much detail as I would have liked.

But Americans’ enduring love of gambling has long been in conflict with an important element of the nation’s mythology: that the United States is a meritocracy founded on hard work, a place where the smart, the savvy, and the deserving rise to the top, no matter their background. The implication of this ethos is that hard work always yields a just reward. By design, the meritocracy leaves little room for chance.

What the article leaves out is something that is self-evident to me: that this "mythology" is a myth in both senses of the word: a foundational story, and a fiction. It's abundantly clear that hard work does not automatically lead to financial success; if it did, sharecroppers would be millionaires. Plenty of the "smart, savvy, and deserving" are living paycheck to paycheck, while others make their millions, or more, by cheating and dodging (which, again by American foundational ethics, are no-nos). Also, even people who come by their fortunes "honestly" (hard work, etc.) can often lose it all: sports idols, rock stars, business owners, etc.
And it's not always drugs or profligate spending that brings them down; here in the US, sometimes it just takes one visit to the emergency room to go bankrupt.

I've written in here before about the role of luck in a person's fortune (financial or otherwise); this is no more or less true with regard to lottery winners. A lottery winner is no more or less deserving than a company CEO or a Hollywood superstar. Sure, the latter two presumably put in the work, but they were also lucky enough to have the talent to do whatever it is that made them money. In truth, the concept of "deserving" is a social construct, a story we tell ourselves.

And in the end, just as good luck can happen to any of us, so can bad (the difference, of course, is subjective). And we all die in the end, so why not enjoy things while you can? If that means, to someone, playing the lottery and dreaming of a better or easier life, let them have the fantasy.

I'd be more focused on how the government is spending their share. You hardly ever see stories about that. |
Another one for "Journalistic Intentions" [18+]...

Swimming

Ever wonder why you don't have fur? Other apes are covered with their own pelts, rather than those of other animals. And yet, we humans, who manage to survive in some of the world's most extreme climate conditions, don't. Some are hairier than others, of course, and evolution played an especially big joke on men, who, with age, tend to lose the hair on our heads while gaining it in our ears and noses. (Not me, of course. But some other men.)

One hypothesis that was presented a while back is called the "aquatic ape hypothesis." Noting that most aquaphilic mammals are mostly hairless, the idea is that, at some point after diverging from our common ancestry with chimpanzees around six million years ago, some proto-humans spent a lot of time swimming. Fur is generally not suited to swimming. Drag, I suppose. The obvious examples are whales, seals, and any number of other aquatic mammals. The same can be said for feathers; hence, penguins. And then, by this theory, we left the water and yet maintained an affinity for it, as illustrated by the popularity of swimming. There are, of course, counterexamples, such as beavers. Maybe polar bears.

The problem with the aquatic ape hypothesis, however attractive it may be, is that it's wrong. Well. Probably.

Part of the problem is a misunderstanding of how evolution works. Another problem is the old chicken/egg thing: did we become (relatively) hairless because we were swimming, or did we get (relatively) good at swimming because we were hairless? There's little doubt in my mind that lack of fur makes for better swimming (not to mention drying off after), but there are several other possible reasons for the adaptation. Including that it's not a true survival adaptation at all, but a side effect of other genetic changes, or a result of sexual selection, or some combination of factors.
What we do know is that, whatever the evolutionary reason, humans have the capacity to learn how to swim, and many do, and derive great pleasure from it, or even from watching other people swim. Not me, of course (though I did learn as a child and can at least still dog-paddle), but other people. And it can provide a survival advantage, especially when you get thrown off the boat for being too pedantic. |
Today is Throwback Day, and the random number generator pointed me to this short entry from October of 2018: "Amazon"

It really is incredibly short, but I still have a few things to say.

Upon looking at my Amazon account today, I saw: "Customer since 1997."

It was a better internet all around back then. Except for the relative lack of streaming video. I was in my "no TV" phase, because broadcast TV was crap, and I flat-out refused to pay for cable. "Wait, you want me to pay you and still have to watch idiotic commercials? No."

That's right. My Amazon account is now old enough to drink.

Obviously, it's even older now. Old enough to drink more due to disillusionment with life.

Which is ironic, because Drunk Me keeps buying surprises from Amazon for Sober Me, like Doctor Who boxed sets (complete with sonic screwdriver) and lame Halloween costumes. And, once, a breathalyzer.

That doesn't happen as much anymore. Not even during the pandemic. Not because I didn't get blitzed, but because Amazon has become difficult enough to figure out when sober, and almost impossible when you're reeling from that seventh shot of tequila.

Like, you go and search something simple, like, I don't know, Star Trek bedsheets. You'd expect, when searching for that particular three-word phrase, that it would bounce back a list of Star Trek-themed bed linens. But no. First you're confronted with Sponsored Items that have nothing to do with Star Trek, or sheets. Star Wars shower curtains. PVC plumbing supplies. DVD boxed sets of Survivor. Glitter-infused dildos. Electric drills. Psychedelic pencils. And then, after the barrage of ads, you still get the same eclectic mix of irrelevant (to the search terms) crap, with maybe, three pages in, one solitary offering of Star Trek bedsheets. For a single-sized bed. Which I suppose is fair.

I didn't actually search for Star Trek bedsheets while preparing this entry, so I don't know if those particular items come up.
I don't want that in my "suggested items" list for the next ten years.

It would be different stuff depending on what you've searched for before, which is fine; if you're going to give me ads, at least give me targeted ones. But my point is, there's a whole lot of irrelevant stuff, most of which doesn't have an actual product name in its title, just a string of search-engine keywords, like, I don't know, Home Generator Maintenance Supplies Maintenance Supplies for Home Generator Electrical Mechanical Bolts Nuts Replacement Parts Oil Pan Drain Plugs.

I really should get an ignition interlock breathalyzer for Amazon. I guess I haven't been drunk enough to do that yet.

Turns out I didn't have to, because Amazon has made it virtually impossible to browse drunk.

And I'm not even going to get into the other issues with the company. I'm only talking about the site's user interface here. I still use them, because if there's anything I hate worse than ads, it's shopping in an actual store in an actual building.

But damn, I miss the days when it was just a bookstore. |