Not for the faint of art. |
Complex Numbers A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is a square root of -1). For example, 3 + 2i is a complex number. The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, b (the coefficient of the bi term). Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary. Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty. |
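Incidentally, for the programmers out there: Python happens to have complex numbers built in, writing i as j (an electrical-engineering convention), so you can poke at the definition above directly. A quick sketch:

```python
# The complex number 3 + 2i from the definition above.
z = 3 + 2j          # Python spells i as j
print(z.real)       # 3.0 -- the real part, a
print(z.imag)       # 2.0 -- the coefficient b of the imaginary term
print(1j * 1j)      # (-1+0j) -- i squared really is -1
```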
Did you think Urban Dictionary was something new? Like most kids, I amused myself in my youth by looking up certain words as soon as I encountered a new dictionary. Kid Me derived great enjoyment from finding Forbidden Words in school library dictionaries. A “dictionary of the vulgar tongue” may sound like some kind of prank gift, something you pick up as a means of upping the ante on your name-calling or adding some spice to your conversations for all occasions. But you won’t find this dictionary at Spencer’s Gifts. It’s tucked away at the British Library in London, shelved and looking prim and proper in its original 1785 binding. How much better, then, to find the 18th-century proto-Urban Dictionary in such a staid institution as the British Library? It's like finding fart jokes at the Library of Congress. [The dictionary] looks a lot like another noted 18th-century dictionary—Samuel Johnson’s Dictionary of the English Language. The only thing differentiating these two is their focus: the English language versus the “vulgar tongue.” As the article points out, "vulgar" meant a different thing back then. Basically, it was peasant-speak. Today, “it’s one of the more important slang books ever published,” says lexicographer Jesse Sheidlower, an adjunct assistant professor at Columbia University. “Johnson made a specific effort to keep out this kind of language.” It is not possible for me to think of Samuel Johnson's dictionary without recalling one of the funniest comedy shows of all time. Anyway... The entries in Grose’s dictionary run the gamut from words and phrases common to laborers, military personnel, and bar frequenters to cant—the jargony language of criminals. 
Among the pages are such listings as “cheeser,” another word for a fart; an “Admiral of the narrow seas,” someone who drunkenly vomits into the lap of the person sitting opposite him; and “to dance upon nothing,” meaning “to be hanged.” One wonders how many words the dictionary contained for private parts. I once participated in an impromptu roundtable discussion where we listed every synonym for "penis" we could come up with, and while I don't remember the exact final tallywhacker, it was well into the hundreds. But this isn’t just a collection of fun phraseology, explains Sheidlower. It’s a window into a crossroads of language at the heart of 18th-century Britain. Terms used in various underground criminal enterprises—like “bean feakers,” or bill counterfeiters—intermingle with simple words used among commonfolk, like “lobkin,” which is just another word for a house or home. (Some of the words and phrases included live on into the modern vernacular with their centuries-old meaning, such as “to screw” and “to kick the bucket.”) Meanwhile, I'm willing to bet the vast majority of the words and phrases have been lost from common speech, having lost their relevance or been replaced by more modern equivalents. And yet, a quick glance at the contents shows that others have, indeed, persisted; some have even entered formal language. I will point out, for example, that one of the definitions of "punk" therein is "a little whore." I will also note that, since it still amuses me to do so, I looked up some of the naughtier words and was delighted to find that some of them were included—albeit with apparent self-censorship. Grose and his dictionary gave the world a peek inside various groups in danger of having their cultures steamrolled, and made the language of commoners as worthy of study as that of aristocrats. And the article ends with a link to the digitized version of the dictionary. If you're too lazy to go to the article to find it, here it is. 
Incidentally, there's a word in there: Frenchified. (I found it when I was looking for other words starting with F.) It means "Infected with the venereal disease." I can only assume that there exists, or at least existed, a French version of the dictionary, in which the word is "Anglified." |
Most of you probably know by now that there's a new merit badge available to the community, one I commissioned and, of course, The StoryWitchress implemented. Here it is, called "Complexity": While obviously named after this blog, it's not restricted; anyone on WDC can send it to someone else on WDC. But the meaning might be obscure to some; therefore, I will explain it. This may be the only time I ever do so. Fair warning: math discussion ahead. But not a very technical one. The header for this blog has always had a brief definition of what a complex number is in mathematics. As I've stated before, though, for me the title is a pun of sorts; both "complex" and "numbers" have other definitions. For example, the former can describe a kind of psychological disorder, and the latter also can be a synonym for a musical composition. Also, all blog entries have an identifying number. Point is, I thought it was appropriate. Now, there's a deceptively simple iteration you can do on any number, which I won't go into in detail (you can find all the detail you want by searching for it, or look at the Wiki link I provide below), only point out that when you do this iteration, the result either blows up to infinity, or it doesn't. If the number is a complex number, and the iteration doesn't tend to infinity, it's in the Mandelbrot set, usually graphically represented in black. Other colors are assigned to numbers outside the Mandelbrot set, depending on how quickly the iterations tend to infinity. The cool thing, though, is that no matter how close you zoom in on a point on the boundary between "goes to infinity" and "doesn't go to infinity," you get the same sorts of spirals, whorls, and intricate designs. The boundary is self-similar at all scales, to any number of decimal places. 
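(For the curious: the "deceptively simple iteration" is z → z² + c, starting from z = 0, where c is the number being tested; that much is standard. Here's a minimal, unoptimized sketch in Python — the function name and the iteration cap are my own choices, not anything official:)

```python
def escape_time(c, max_iter=100):
    """Iterate z -> z*z + c starting from z = 0.

    Returns the step at which |z| exceeds 2 (after that, the iteration
    is guaranteed to blow up to infinity), or None if it stays bounded
    for max_iter steps -- meaning c looks like a member of the
    Mandelbrot set.  The escape step is what renderers use to assign
    colors to points outside the set.
    """
    z = 0
    for step in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return step
    return None

print(escape_time(0))      # None: 0 is in the set (z stays 0 forever)
print(escape_time(0.5))    # escapes after a handful of steps
```

Feed it complex numbers near the boundary (say, `escape_time(-0.75 + 0.1j)`) and the escape times start varying wildly, which is exactly the intricacy the pretty pictures are painting.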
(This is what I mean, in the blog header, by "Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty.") In the "real world," this can't happen. Zoom in closer and closer and eventually you get to atoms, and the stuff inside atoms, and there's a smallest size. To me, this is a metaphor for how imagination can extend reality. To see the image the MB is based on, go look at this article. There's not a lot of math there, either; it's more of a philosophical essay and a discussion of the life of Benoit Mandelbrot, after whom the set is obviously named. True to form, I'll quote from the article. (Just one passage, though) Mandelbrot needed a word for his discovery — for this staggering new geometry with its dazzling shapes and its dazzling perturbations of the basic intuitions of the human mind, this elegy for order composed in the new mathematical language of chaos. One winter afternoon in his early fifties, leafing through his son’s Latin dictionary, he paused at fractus — the adjective from the verb frangere, “to break.” Having survived his own early life as a Jewish refugee in Europe by metabolizing languages — his native Lithuanian, then French when his family fled to France, then English as he began his life in science — he recognized immediately the word’s echoes in the English fracture and fraction, concepts that resonated with the nature of his jagged self-replicating geometries. Out of the dead language of classical science he sculpted the vocabulary of a new sensemaking model for the living world. The word fractal was born — binominal and bilingual, both adjective and noun, the same in English and in French — and all the universe was new. Now if you want a more technical discussion and lots more pretty pictures (including an animation of a deep dive into the set boundary, showing its self-similarity), you can always peruse the Wikipedia page. 
Unlike the boundary of the Mandelbrot set, this blog won't go on forever, and neither will I. This Merit Badge will, however, hopefully outlast both of us. Oh, and if you want one, just comment below. I'm feeling magnanimous. That feeling won't last, so you have until midnight tonight, WDC time. |
I don't usually link to The Torygraph, but I am today. The answer to almost every "Why are..." question concerning airlines is either a) it makes or saves them money, or b) some regulation requires it. From Ryanair and British Airways to American Airlines (the world’s largest carrier), airlines across the board incorporate various shades of blue in their cabin seats, and it’s no coincidence. There does appear to be some psychology behind it. Because blue is associated with depression, which is one of the outcomes of suffering the indignities of 21st-century air travel? Blue is associated with the positive qualities of “trust, efficiency, serenity, coolness, reflection and calm,” according to Colour Affects, the London-based consultancy run by Angela Wright, author of The Beginner’s Guide to Colour Psychology. Funny, those are also the qualities associated with [insert your zodiac sign here]. The astrology is probably more accurate. Nigel Goode, lead aviation designer and co-founder at Priestman Goode, which has been designing aircraft interiors for airlines for 30 years (most recently the Airbus Airspace cabins), states: "Our job as designers is to reinforce the airline’s brand and make it more recognisable, but our primary concern is to deliver an interior that maximises comfort to create a pleasant environment." Well, then, you failed. “It’s all about making the travelling experience less stressful, and blue is said to evoke a feeling of calm. While some of the more budget airlines might use brasher, bolder shades, most others go with muted tones. The overarching aim is to create a home-like relaxing feel, so airlines tend to use muted colours that feel domestic, natural and earthy for that reason." How is blue "domestic, natural and earthy"? Also, if you want to make "the traveling experience less stressful," there are less questionable ways to do it like, oh, I don't know, not packing people in like olives in a jar? 
Cabin lighting is also geared towards creating a stress-free atmosphere on board, particularly in newer planes which have introduced soft LED lighting to replace the harsher light used in earlier models. Right. LEDs not having to be replaced nearly as often, and requiring less power, has nothing to do with it. As a general rule, most long-haul carriers won’t install leather seating because they can get unpleasantly sweaty. Fixing this for you: because leather's more expensive. Synthetic fabrics breathe, which makes for a more comfortable experience. They also absorb the farts of the last slob who sat in your seat. In fairness, I've sat in leather (I think it was faux-leather but whatever) seats on an airplane before, and the problem isn't sweat, but less friction. “Lighter-coloured interiors, however, are more commonly found in first and business class seats, given not as many people fly in those cabins as they do in economy, where you’ll see more of the darker shades," Mr Goode explains. That explanation makes no sense at all. There's a fixed number of each on an airplane, and these days, they're all filled. I'm betting it enforces class distinctions. Found only on wide-body aircraft like Boeing’s Dreamliner, there is a small room hidden behind what looks like a small cupboard from the exterior. Known as the 'crew rest', this tiny compartment opens up to a steep staircase that leads down to a few seats and bunk beds where crew members can sleep. Et alors? There are large areas on wide-body aircraft that are not used in the lower level, which have not been converted into seating areas because of the low height and lack of windows - both of which make the space too claustrophobic for passengers. Quoi? That's never stopped them before. Passenger seats with dummies strapped to them are put through the 16G test, a process which involves hurtling the seat down a 'sledge' ramp at high speeds to simulate a plane crash setting. 
All seats are required to withstand a 16g dynamic force. Right, because people can withstand a 16g force. I'm going to let the "dummies" thing slide. I'm feeling magnanimous. Seats on many low-cost carriers like Ryanair don’t have a recline mechanism, which also cuts weight, and others go as far as removing the seatback net for magazines to help reduce weight. Want to start an argument online and sick of talking about Chicago "pizza"? Assert that a passenger in a reclining seat has every right to use said reclining mechanism to its fullest extent. They do, incidentally. “There’s a big push at the moment for the magazine pouch to be relocated just a tiny bit higher to allow just a little bit more space," notes Mr Goode. Wow. As an aside, for a while there, at least one airline (I think it was American) forbade passengers from sticking anything into the pouch other than the stuff that's native to it (safety card, barf bag, overpriced consumer goods ads). This was likely the most idiotic change to airplane rules since the non-smoking section (kind of like a non-pissing section in a pool, and I say that as a cigar smoker). Last time I flew that airline, though, there was no mention of this. Anyway, snark aside, some of this stuff is interesting, whether you fly or not. Me? I used to love flying, but these days, it's nothing but a hassle. I might change my mind, though, if they relocate the magazine pouch just a tiny bit higher. |
Everything I need to know about medieval peasants, I learned from Monty Python and the Holy Grail. And maybe today's article. What Did Medieval Peasants Know? The internet has become strangely nostalgic for life in the Middle Ages. The period, which spans roughly 500 to 1500, presents some problems for people trying to craft uncomplicated stories. “No age is tidy or made of whole cloth, and none is a more checkered fabric than the Middle Ages,” Tuchman wrote. Historians, she noted, have disagreed mightily on basic facts of the era... We know enough for comedy movies. You know Holy Grail is almost 50 years old now? Still one of the greatest and most quotable movies of all time. ...the Middle Ages have been a common hobbyhorse for people of all political persuasions who suspect modernity might be leading us down the primrose path, especially as the internet has become a more central and inescapable element of daily life. Our ancestors of the distant past can be invoked in conversations about nearly anything: They supposedly worked less, relaxed more, slept better, had better sex, and enjoyed better diets, among other things. Sure, you can prop up straw men and just make shit up about the past. But look, Hobbes (the philosopher, not the tiger) described our lives as "nasty, brutish and short" for good reason. Namely, the past sucked. It sucked hard. I can kind of understand the Renaissance Faire people, but even they know they're just role-playing, and if one of them got a foot infection, they'd go to a modern hospital and not have a barber treat them with leeches. Or whatever. The present is no walk in the park, but the problem isn't the internet; it's endgame capitalism. And at least we have global trade and laser eye surgery. The problem is that these assertions about our glorious history usually don’t quite check out—they tend to be based on misunderstandings, disputed or outdated scholarship, or outright fabrications long ago passed off as historical record. 
But that doesn’t stop people from regularly revisiting the idea, counterintuitive though it may be, that some parts of life were meaningfully better for people who didn’t have antibiotics or refrigeration or little iPhone games to play to stave off boredom. What, exactly, is so irresistible about a return to the Middle Ages? Boredom? I'd bet the peasantry had lots of problems, but boredom wasn't one of them. But as the article points out, I don't know shit. He [Clark] now thinks that English peasants in the late Middle Ages may have worked closer to 300 days a year. He reached that conclusion by inspecting the chemical composition of fossilized human remains, as well as through evidence of the kinds of goods that urban peasants in particular had access to. This is in reference to the claim that European peasants worked like 150 days out of the year. Which, even on the face of it, has got to be suspect. After all, what was the ruling body of Europe, the one thing that defined and united the continent during what we call the Middle Ages? The Church, obviously. And what's the Official Doctrine of the Church? "Six days shalt thou labor..." It's right there in the beginning. Now, sure, there's a question about how much the lower classes could actually read, but they'd have gotten the message. So 365 minus 52 Sundays, and take off maybe a few other days for Suck the Duke's Dick Day or whatever, and 300 sounds reasonable. Compared to that, our standard two-day weekend is luxury itself. Clark and his colleagues have revised their estimates upward, but the school of thought his previous numbers belonged to still has many academic supporters, who generally base their estimates of how much peasants worked on records of per-day pay rates and annual incomes. “This other view is that they were quite poor, but they were poor kind of voluntarily, because they didn’t like work and didn’t want to do a lot of work,” he told me. 
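(The back-of-the-envelope arithmetic above, for the spreadsheet-inclined; note that the 13 "other days" are purely my assumption, picked to show how little slack you need before landing on a round 300:)

```python
days_in_year = 365
sundays = 52
other_days_off = 13   # assumption: feast days, Suck the Duke's Dick Day, etc.
workdays = days_in_year - sundays - other_days_off
print(workdays)       # 300
```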
There were probably signs up at medieval convenience stores: No Onne Wonts Too Werk Anymoor. “It allows people to make their own medieval mythology and cling to that,” Janega told me. “They’re just kind of navigating on vibes.” Like I just did. Only I know I did it. The clear delineations that people assume between work and personal life just aren’t particularly tidy for peasants doing agrarian labor. “They’re thinking of these people as having, like, a 9-to-5 job, like you’re a contracted employee with a salary and you get vacation days,” she told me. “The thing about having a day off is like, well, the cows ain't gonna milk themselves.” So while people are correct that European peasants celebrated many more communal holidays than modern Americans, in many cases, that just meant they weren’t expected to do a particular set of tasks for their lord. Minding the animals, crops, and themselves never really stopped. Their vacations weren’t exactly a long weekend in Miami—after all, they didn’t really have weekends. I'm imagining a feudal serf clocking in and out of their field. Er, their lord's field. Whatever. There were, of course, some other obvious downsides to medieval life. A huge chunk of the population died before the age of 5, and for people who made it out of childhood, the odds of seeing your 60th birthday weren’t great. There wasn’t any running water or electricity, and there was a very real possibility that mercenaries might one day show up and kill you because your feudal lord was beefing with some other lord. At least there was beer. On the other hand, there was a nonzero chance that you'd be burned as a witch for brewing it. If you’re looking for a vision of history where people were generally peaceful and contented, though, you might want to check in with societies outside of the Middle Ages. Perhaps look for a group of people not perpetually engaged in siege warfare. 
“Medieval peasants are a weird one to go to, because, you know, they were rebelling constantly,” Janega noted. “Why are they storming London and burning down the Savoy Palace, if this is a group of happy-go-lucky, simple folk who really love the way things are?” And maybe don't be so hyperfocused on Europe? I get it; the vast majority of Americans have (relatively) recent ancestry there. But other areas have much to teach us as well, as indicated by, for instance, the Chinese invention of the compass, the Indian invention of the concept of zero, and the Arab development of algebra—all of which took place during the European Middle Ages. The article actually has quite a bit more relevant information than what I've copied here, but the takeaway, at least for me, is: don't look at the past with rose-colored glasses (spectacles were a medieval European invention). |
Today's your lucky day. Oh, wait... no. Well, maybe it is. But I'm here to talk about luck. The radical moral implications of luck in human life Acknowledging the role of luck is the secular equivalent of religious awakening. I think the headline goes too far, bordering on clickbait. But that doesn't mean the content is wrong. In July 2018 (when we first published this piece), there was a minor uproar when Kardashian scion Kylie Jenner, who is all of 22, appeared on the cover of Forbes’s 60 richest self-made women issue. As many people pointed out, Jenner’s success would have been impossible if she hadn’t been born white, healthy, rich, and famous. People say the Kardashians are famous for being famous. I don't know. I don't keep up with them (pun intended). I don't care about them. Their antics mean exactly nothing to me. I'm rather disappointed that Forbes, a magazine that my very serious father always read very seriously, would fall so low as to cover one of them for any reason, but hey, publications change. But anyway, no, they're not famous for being famous; they've invested a lot of time and money into self-promotion, netting them more money (if not more time). I can't help hearing about them, no matter how strenuously I try not to. I get the feeling that if I moved out to the middle of Nevada and started living under a rock in an abandoned silver mine, within a week someone would come trudging into the mine, lift the rock and go, "Hey, did you hear about the latest thing some Kardashian did?" All of which is to say that it's not just luck. Or if it is, it's lucky that they're so talented at self-promotion. She built a successful cosmetics company — now valued at $900 million, according to Forbes — not just with hard work but on a towering foundation of good luck. "Now" apparently being the date of the updated article, early 2020, still in the Before Time. 
And I will reiterate that the idea that "hard work" (whatever that is) alone brings success is easily refuted: if it were the case, migrant laborers would be billionaires. Around the same time, there was another minor uproar when Refinery29 published “A Week in New York City on $25/Hour,” an online diary by someone whose rent and bills are paid for by her parents. A publication I've never heard of. But I've seen similar articles, usually with the tone of "This young couple managed to buy a house and pay it off in full by the time they were 30," while if you actually read the article, you find that they were able to do so because their parents paid for most of their crap. It’s not difficult to see why many people take offense when reminded of their luck, especially those who have received the most. Allowing for luck can dent our self-conception. It can diminish our sense of control. It opens up all kinds of uncomfortable questions about obligations to other, less fortunate people. Oh, I don't take offense. I just smile (insofar as I can), narrow my eyes, and go, "So?" Nonetheless, this is a battle that cannot be bypassed. There can be no ceasefire. And now we're back to the hyperbolic tone of the headline. Individually, coming to terms with luck is the secular equivalent of religious awakening, the first step in building any coherent universalist moral perspective... Building a more compassionate society means reminding ourselves of luck, and of the gratitude and obligations it entails, against inevitable resistance. I find that to be somewhat contradictory. If I'm playing craps and I make my point, and I'm an atheist (remember, they're talking about secular morality here), to whom or what do I express gratitude? The dice? That's silly. God? Nonexistent. Lady Luck? Still atheist. Fortuna, Roman goddess of luck? Still atheist. Some nebulous concept of the quantum fluctuations of the universe? See "dice." 
Sure, if an actual person does something nice for you, you express gratitude. And sure, gratitude is actually an emotional state and is intransitive (meaning it doesn't require an object), but if I win at gambling, I don't say "thank you"; I just bask in the fortune. All of which is to say I'm not convinced that "gratitude and obligations" are a necessary byproduct of being lucky. But I'm willing to read on to see what the author might have to say about that. How much moral credit are we due for where we end up in life, and for who we end up? Conversely, how much responsibility or blame do we deserve?... How you answer these questions reveals a great deal about your moral worldview. To a first approximation, the more credit/responsibility you believe we are due, the more you will be inclined to accept default (often cruel and inequitable) social and economic outcomes. People basically get what they deserve. I think those are fair questions, and I accept that people will answer them differently. In my worldview, we only have the illusion of being able to make decisions. It's more like we do whatever it is that we do, and then either justify or regret it afterward. It's also very clear to me that people do not, in general, get what they deserve; it's more like they get something, and have to (or get to) live with it. The idea that people get what they deserve is pernicious. You end up worshiping successful people, and scorning those in poverty, on the basis of "well, they must have done something to deserve their state." (The article does delve into this morass later.) Of course it is true that you have no choice when it comes to your genes, your hair color, your basic body shape and appearance, your vulnerability to certain diseases. You’re stuck with what nature gives you — and it does not distribute its blessings equitably or according to merit. But you also have no choice when it comes to the vast bulk of the nurture that matters. On that point, I can agree. 
You can no more choose your parents, or your childhood environment, than you can choose your eye color (please don't tell me about colored contact lenses; you know what I mean). Here, a distinction made famous by psychologist Daniel Kahneman in his seminal Thinking, Fast and Slow is helpful. Kahneman argues that humans have two modes of thinking: “system one,” which is fast, instinctual, automatic, and often unconscious, and “system two,” which is slower, more deliberative, and emotionally “cooler” (generally traced to the prefrontal cortex). Our system one reactions are largely hardwired by the time we become adults. But what about system two? We do seem to have some control over it. We can use it, to some extent, to shape, channel, or even change our system one reactions over time — to change ourselves. The key word there, to me, is "seem." This argument implies that we are somehow separate from our selves, that there's a ghost in the machine, pulling the levers, and we're the ghost. The problem with that implication is that, well, we're not. System one, system two, whatever; they're both products of brain activity—products of a physical process. We do change, sure. Other environmental inputs give us more information, and the brain itself changes over time. Everyone is familiar with that struggle; indeed, the battle between systems one and two tends to be the central drama in most human lives. When we step back and reflect, we know we need to exercise more and eat less, to be more generous and less grumpy, to manage time better and be more productive. System two recognizes those as the right decisions; they make sense; the numbers work out. But then the moment comes and we’re sitting on the couch and system one feels very strongly that it doesn’t want to put on running shoes. It wants greasy takeout food. It wants to snap at the delivery guy for being late. Where is system two when it’s needed? It shows up later, full of regret and self-recrimination. 
Thanks a lot, system two. This is usually represented in cartoons with an angel on one shoulder and a devil on the other. As an aside, I take issue with the idea that we should be more productive. Productivity has led to myriad problems. Maybe we could do with being less productive. I'm skipping a bunch here, but something else I have issue with: The promise of great financial reward spurs risk-taking, market competition, and innovation. Markets, properly regulated, are a socially healthy form of gambling. No. They are not gambling. I mean, sure, if you take a short-term view, they can be. But unlike, say, gambling in a casino, investment in the stock market gives you the house edge. So unless you're also prepared to call running a small business or a casino "gambling," this is another pernicious misconception. And there’s no reason we shouldn’t ask everyone, especially those who have benefited most from luck — from being born a certain place, a certain color, to certain people in a certain economic bracket, sent to certain schools, introduced to certain people — to chip in to help those upon whom life’s lottery bestowed fewer gifts. Oh, you can ask all you want. You can even enforce the ask with taxes. But those types are lucky enough to be able to afford lawyers and accountants to minimize their tax burden. In the end, it doesn't matter whether someone knows they were lucky or not; what matters is how much they give a shit. |
As this entry is later than usual—as I vaguely remember mentioning in a note yesterday, I might have consumed an excessive amount of ethanol—I'm not going to comment too much. But it's an interesting bit of history, illustrating some of the best and worst of humanity. The Story of Charles Willson Peale’s Massive Mastodon When a European intellectual snubbed the U.S., the well-known artist excavated the fierce fossil as evidence of the new Republic’s strength and power In the 18th century, French naturalist Georges-Louis Leclerc, Comte de Buffon (1706-1778), published a multivolume work on natural history, Histoire naturelle, générale et particulière. This massive treatise, which eventually grew to 44 quarto volumes, became an essential reference work for anyone interested in the study of nature... The Comte de Buffon advanced a claim in his ninth volume, published in 1797, that greatly irked American naturalists. He argued that America was devoid of large, powerful creatures and that its human inhabitants were “feeble” by comparison to their European counterparts. Obviously, Buffoon wasn't familiar with Sasquatch. As for "feeble," well, those "European counterparts" were busy systematically destroying the inhabitants by means of more advanced technology. The claim infuriated Thomas Jefferson, who spent much time and effort trying to refute it—even sending Buffon a large bull moose procured at considerable cost from Vermont. You know, this is about when, normally, I'd stop reading. Why? Because according to this article, the Comte died in 1778. The Wikipedia page claims he died in 1788 (I know there was a major calendar switch in the 18th century, but not that major). The ninth volume was, again according to this article, published in 1797, either 19 or nine years after his death. And yet Jefferson sent him a moose? To what, his mausoleum? 
So, okay, something's really wonky about the dates here, and that definitely needs resolved (especially as Leclerc was a noble and the French revolution was mostly a 1790s thing). In 1739, a French military expedition found the bones and teeth of an enormous creature along the Ohio River at Big Bone Lick in what would become the Commonwealth of Kentucky. I'm mostly just including this quote so that those of you unfamiliar with Kentucky can have a sensible 12-year-old chuckle at "Big Bone Lick." Of course, the local Shawnee people had long known about the presence of large bones and teeth at Big Bone Lick. It's right there in the name, folks. What? You didn't actually think the other definition applied? For millennia, bison, deer and elk congregated there to lick up the salt, and the indigenous people collected the salt as well. The Shawnee considered the large bones the remains of mighty great buffalos that had been killed by lightning. This is completely tangential to the article, but I've had this working hypothesis for a while now that the reason so many cultures have dragon myths is because they'd occasionally find dinosaur bones. Having no concept of deep time, they had to make up stories about how such enormous skeletons got to be part of the landscape, and those stories became dragon legends. I have no real support for this, but it tracks with what I know about humans. Anyway. Not much else to say, except that the article calls out my hometown, which I always think is cool (unless it's to recall the events of 2017). The rest of the story details the process of figuring out what those bones were (spoiler: mastodon), and, like I said, is an interesting look into the history of scientific discovery. Oh, but before I go, don't give much credence to those stories you keep finding about people trying to Jurassic-Park the mastodon back into existence. Most of them are sensationalist. |
No, I'm not trying to turn this into a cooking blog. Ugh. But sometimes a food article catches my eye, and also sometimes the random number generator spits them out back-to-back. In this case, and the last one, the hook isn't cook, but science. How to Use Baking Soda Like a Scientist This ingredient belongs in both the laboratory and the kitchen. Next time someone spouts off about not eating stuff with chemicals in it, you can point out that sodium bicarbonate is, by definition, a chemical, and they almost certainly eat delicious pancakes, which are made with baking powder (baking powder is baking soda with other stuff added). If that doesn't work, mention sodium chloride. If they're still being stubborn about it ("But I only eat organic salt"), point out that every ingredient contains chemicals. It still won't stop the ignorance, but at least you've gained the high ground. Whipping up a recipe can feel awfully similar to conducting a science experiment. Either one could involve adjusting burners or measuring out various powders and liquids, all while carefully watching to make sure your project doesn’t explode, burn, or turn a funny color. Be fair, now. Sometimes exploding is the point. Cookbook author and food writer Nik Sharma happens to be both a cook and a scientist. Oh, that name is so close to being an aptronym. Nik Shawarma would be an awesome name. And most scientists are also cooks. It's not like they get paid enough to hire a full-time chef. The only question is whether they apply one activity to the other. The important thing is that he's also a writer. In his writing, especially 2020’s The Flavor Equation, Sharma is educating the food world on the science behind the most common cooking techniques and ingredients. Yeah, this is kind of a book promo. But it seems like a useful book. The article then switches to the author/scientist/cook's point of view. My fascination for cooking and chemistry developed simultaneously. 
It all started in my high school chemistry lab, during a lesson on the relationship between acids and bases. When he was bitten by a radioactive papier-mâché volcano? Sodium bicarbonate—or baking soda—was one of the first ingredients that made me realize that a kitchen is, in essence, a laboratory. Depends on your definition of "laboratory." Most cooks don't approach it from a scientific perspective, instead using recipes or their ancestral knowledge. Nothing wrong with that, as the goal is to provide something appetizing and edible, but it's not science. The Rise of Baking Soda Oh, ho ho ho. I see what you did there. I approve. Bakers have used carbonates as chemical leaveners since the Middle Ages. These substances release carbon dioxide bubbles when dissolved in water, or mixed with an acid. In a batter, this has a lightening, lifting effect. It's interesting to me that this postdated the use of yeast, which also has the effect of leavening baked goods (in addition to the magic it works in delicious fermented beverages). In either case, something's producing carbon dioxide. I would have expected it to be the other way around, as people didn't actually know what yeast was until, like, microscopes. Yes, yeast was used for thousands of years before someone said "holy shit, it's alive!" Sodium bicarbonate, NaHCO₃, is of course not a living organism. Fun Fact of the Day: NaHCO₃ can be found in the wild, in a mineral called nahcolite. Yes, its name is a pun on the periodic table symbols involved. I find this amusing. Baking soda is naturally alkaline, raising the pH when added to liquids or foods. Often, to reduce the acid in coffee or a very sour soup, I’ll stir in a tiny pinch of baking soda to neutralize and counteract the acidity. Someone once told me to add a bit of baking soda to ground beef before frying it, to make it brown better. I figured it couldn't hurt, so I tried it (science!).
I was displeased with the results, but it did produce a faster browning action. Not sure of the chemical reason for that, but since I don't plan to do it again, it's not near the top of my curiosity list. If I’m cooking dried beans, I’ll first soak them overnight in a brine made with baking soda and salt, or cook pre-soaked beans with a smaller quantity of both. If you’ve ever cooked dried beans, only to have them turn out unpleasantly hard, this is the trick for you. No way. I get my beans from a can, as God intended. But the reason I'm quoting this line is to point out that "salt" as a culinary ingredient is almost exclusively sodium chloride, but scientists call any substance with a certain ionized crystalline form a "salt." Here, he's using the culinary definition, as baking soda is itself, chemically speaking, a salt. Baking soda can also act as a catalyst in two important food reactions. A tiny pinch of baking soda added to vegetables or meats while roasting or sautéing accelerates the rate of sugar caramelization, and supercharges the Maillard reaction, in which the amino acids in proteins react with sugars. Well, that's that low-level curiosity satisfied. You didn't think I'd actually leave you hanging, did you? Every time I look at the jar of baking soda sitting inside my pantry, I smile. Well, I hope you change it out every so often. Unlike table salt, it has a relatively short shelf life. No, that's not just Church & Dwight (the makers of Arm & Hammer, which is probably the best-known brand of baking soda, at least in the US) trying to get you to buy more of their product. Using it as a fridge freshener, now, that's them trying to get you to buy more of their product. There is little actual evidence that this works, but damn, they're good at marketing. By the way, it occurred to me the other day that I never really explained why I bang on about marketing in here sometimes.
It's because this is, ultimately, a writer's blog; writers tend to want to publish, and publishing requires effective marketing. At the same time, I despise marketing excesses. There's a balance. Kind of like with adding sodium bicarbonate to food: too much and it leaves a bad taste. The article ends with a "Tips and Tricks" section, about which I only have a couple of comments: A pinch of baking soda mixed into a glass of water acts as an antacid to reduce heartburn. The jury's still out on whether sodium actually affects heart health. But to be on the safe side, I try not to overuse sodium salts, including NaCl and bicarb. No, "heartburn" doesn't really involve the heart; it's just that, having had a heart attack, I'm wary of certain overindulgences (booze doesn't usually contain sodium, except for margaritas). I'd check with an actual doctor before using this "trick" if you've got heart issues. Outside cooking, baking soda has many uses. It can also be used as a mild soap along with vinegar to clean kitchen counters, and stubborn grease marks. This, I can vouch for. It's also fun to watch the baking soda/vinegar reaction (the chemical result is sodium acetate, water, and carbon dioxide, but you know it as the middle school science fair "volcano" I referred to above). I'm also told that it can remove stains from nonstick pan surfaces. I haven't had much luck with that; I have a pan that badly needs a deep cleaning, but nothing has worked yet. Anyway, the real point here is: chemicals are your friends. |
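For the record, the baking-soda-and-vinegar "volcano" reaction mentioned in the entry above balances out just as stated (sodium acetate, water, and carbon dioxide): NaHCO₃ + CH₃COOH → CH₃COONa + H₂O + CO₂↑. The fizz is the CO₂ escaping.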
Today, some hard-hitting journalism sure to be eligible for a Pullet Surprise. Is Garlic Getting Easier to Peel? To solve the mystery, I had to talk to horticulturalists, farmers, chefs—and also my local grocery store. Mmmm... garlic chicken... Where was I? Oh yeah. Most of the time, the answer to any question posed in a headline is "No." For most of my adult life, I was a garlic-phobe. Not because I didn’t like the flavor! No, like basically every human on earth, I adore garlic... I've known people who didn't like garlic. I avoid them, figuring they're vampires. I loved eating garlic. What I hated, for years, was peeling garlic. If only you could have relaxed your culinary snobbery long enough to purchase pre-peeled garlic. Or use (gasp) garlic powder. It has its place, you know. What a drag it was! Let’s say your recipe calls for three cloves of garlic. Your onions are fizzing, your pasta is bubbling, and you’re cursing, trying to separate the garlic’s skin from the cloves within. You know, I don't consider myself a gourmet chef, though I get by. And I've ranted before, in the Comedy newsletter and probably here too, about the frustration of garlic-peeling. But there's a concept called mise-en-place, which I knew about even before I started learning French. Literally "put in place" (the "place" is pronounced differently, of course), it means prepping most of your ingredients before starting cooking, so you don't find yourself struggling with peeling garlic, chopping onion, or whatever at critical stages in your cooking. It's not always necessary. If you're broiling chicken or whatever, you can do some chopping while it's cooking. But having the garlic peeled and minced before you even turn the stove on is, in my estimation, pretty basic. Like juicing a lemon or opening a stuck jar, peeling garlic is one of those kitchen tasks so pesky that gadget companies are forever trying to solve it for you. I’ve tried the shakers. I’ve tried the silicone rollers.
I’ve tried cutting off the stem end and rolling the clove between my hands. Cut off the stem end, twist the cloves a bit to loosen the skin (this also releases some flavor), and rub them together between your hands while standing over a trash can to catch the paper. You do run the risk of dropping the garlic, so don't do that. Notice I said "cloves." No matter what a recipe says, a single clove is never enough. NaNoNette recommends squeezing them under the flat of your vegetable knife, and that can work too, but it requires more tools than just using your fingers. Also, juicing a lemon? Come on. The only trick to that is to do it through a sieve so the seeds don't squirt out onto your shrimp or whatever. But over the past three years or so, something strange has happened to the garlic I buy at the grocery store. It’s become so much easier to peel! The more you do something, the easier it gets. How did my garlic transform from sticky nightmare to user-friendly flavor dispenser? Have America’s garlic breeders suddenly focused on peelability as a saleable trait? Has competition from pre-peeled garlic somehow forced a change in the garlic farming world? I had to know the answer, so I called everyone I could think of who might know anything about garlic. So, not vampires. “I don’t know that anybody’s measured that, peelability,” said Barbara Hellier, a horticultural crops curator with the United States Department of Agriculture in Pullman, Washington. Like a home chef with a particularly tough clove, she wrestled to unwrap the subject:... Stretch metaphor. I asked if anyone was breeding garlic specifically to improve peelability, and she told me something I hadn’t previously known: “There’s hardly anyone breeding garlic at all.” Garlic, it turns out, isn’t like other crops, where you plant seeds, grow a plant, harvest it, and then, next year, plant a new seed. 
Garlic seeds, from fertilized flowers on garlic plants, look a little like onion seeds, but hardly anyone generates and plants them—because why would you? All you need to grow a new garlic plant is just one garlic clove off a garlic bulb. (When you’ve left your garlic sitting around so long a clove sprouts a green shoot, you’ve begun that process.) No garlic lasts that long around me. The issue, Kamenetsky explained, is that unlike other crops, garlic mostly can’t flower and be fertilized. “In garlic, this was damaged in ancient times,” she said. “For 5,000 years, people selected for bigger cloves, and they continually selected against flowering.” Okay, now, see, that's interesting. It's kind of like with bananas. Doesn't have anything to do with a peel, though. Oh yeah, I made that pun. Within a year of opening, Serafini found a California farm that sold pre-peeled garlic, which is where the Stinking Rose now sources all its A. sativum. “What they do is they put it in screens that heat the skin to a certain temperature, dry it, and then they put it through a wind treatment, like a wind tunnel almost.” The wind blows the skins off. Serafini now swears by pre-peeled garlic: “It’s really the best way! It’s more consistent.” See? Stop being a snob. But I hadn’t solved this mystery! My grocery store wasn’t getting some special easy-peel garlic. No one was really breeding easy-peel garlic. So what had happened? It was time to go to the source: Harris Teeter, my grocery chain. Ah, the problem begins to resolve itself. The article goes on to describe how garlic shipped from further away has more time to dry and thus might be easier to peel. I guess I’m grateful that the impossibly convoluted complexity of the intercontinental produce supply chain—which makes modern life more convenient in the short term but is destroying the planet for the long term—is the likely cause of my garlic’s new peelability. Yeah, at this point, I'll take convenience. 
We're doomed anyway, so we might as well enjoy the ride. One of the few actual benefits of living in late-stage capitalism is global trade. And the planet's not getting destroyed; only the biosphere. The result is a system in which buying local, freshly harvested produce can result, bizarrely, in a worse product. There's nothing at all bizarre about it. If buying local actually gave us better products, we'd never have switched to a global trade model. The only reasons to source local are to support local farms and give yourself something to brag about on social media. The former of which I normally support, except when I know the farmers are voting for the wrong politicians. But whatever. So, in general, like I said way back at the beginning, the answer is "no, garlic isn't getting easier to peel." Except in some individual circumstances. I'm going to keep wrestling with those little buggers anyway, because the taste is worth it. |
To be clear, I started saying "common sense is neither" before I found this article. But then I found this article, so of course I had to put it in my queue. It's from 2011, just to put some of its content into context. I started getting the idea that "common sense" was just another phrase for anti-intellectualism and know-nothingness. Some politicians run on a "common sense" platform, and the ones that do are all a bunch of down-home anti-book-learnin' ideologues, so I don't trust them. "Let's ignore scientific evidence and instead just go on my personal experience." Common sense, defined as "sound judgment derived from experience rather than study," is one of the most revered qualities in America. That's a silly damn definition. It's not sound, it's not judgment, and if it's derived from experience, then it's not common, is it? It's personal. It evokes images of early and simpler times in which industrious men and women built our country into what it is today. Let's be real here. Most "common-sense" folks are only thinking about the men from history. They only care about women in terms of how many babies they can make. People with common sense are seen as reasonable, down to earth, reliable, and practical. Nice passive voice there. Not by me. But here's the catch. Common sense is neither common nor sense. Which is what I've been saying. If common sense was common, then most people wouldn't make the kinds of decisions they do every day. People wouldn't buy stuff they can't afford. They wouldn't smoke cigarettes or eat junk food. They wouldn't gamble. And if you want to get really specific and timely, politicians wouldn't be tweeting pictures of their private parts to strangers. People wouldn't do the multitude of things that are clearly not good for them. Okay, whoa, hold the fuck up there. People do all those things for plenty of reasons, not all of them being that they lack any kind of sense. 
People buy stuff they can't afford because they're convinced by ad agencies that it will improve their lives more than saving money will. People smoke cigarettes because they're addicted, or because it's genuinely pleasurable. People eat junk food because it's cheap and easy, cheaper than healthy food; I can't reconcile this, at least for poor people, with buying only stuff you can afford. Gambling can be a problem, but for some of us it's just another entertainment expense, equivalent to going to a sportsball game or music concert. We all do things that aren't good for us because, on some level, they are good for us. (The "private parts" thing is a reference to a scandal that was current when the article was written.) This doesn't change my agreement with the general thrust of the article, but that bit is condescending as hell. This is the important bit: And common sense isn't real sense if we define sense as being sound judgment because relying on experience alone doesn't usually offer enough information to draw reliable conclusions. Heck, I think common sense is a contradiction in terms. Real sense can rarely be derived from experience alone because most people's experiences are limited. Our senses, our lived experiences, tell us the Earth is flat. It looks flat, doesn't it? Especially if you live in Kansas. And that we're at the center of the universe. A person who survives a car accident while not wearing their seat belt might believe that it's safer to not wear a seat belt. The news covers plane crashes religiously, but rarely car wrecks, so you'd think flying would be more dangerous than driving (it is not). You took horse dewormer and got better, so obviously the horse dewormer made you better, right? That's just common sense. It takes science, research, knowledge, book learnin' to extend our senses, to help us see reality beyond our limited individual experiences. No, science doesn't have all the answers. But it has way more than you do alone. 
Science tells us the planet's roughly spherical, and how gravity affects orbits, and that the universe doesn't have a "center." Research shows that you're safer with a seat belt than without one, but that doesn't mean you're invulnerable. Studies show that dewormer only cures worms; if you got better from something else while taking it, that was a coincidence. The word common, by definition, suggests that common sense is held by a large number of people. But the idea that if most people think something makes sense then it must be sound judgment has been disproven time and time again. Further, it is often people who might be accused of not having common sense who prove that what is common sense is not only not sense, but also completely wrong. Plus, common sense is often used by people who don't have the real knowledge, expertise, or direct experience to actually make sound judgments. "Global warming isn't real! Look at this snowball." I think we need to jettison this notion of the sanctity of common sense and instead embrace "reasoned sense," that is, sound judgment based on rigorous study of an issue (which also includes direct experience). No, this doesn't mean going through youtube or social media in search of things that ring your confirmation bias. Yes, I'm aware that this article rings my confirmation bias. No, this is not a contradiction. A course in scientific thinking and methodology for everyday life should be a requirement for all students. Such proactive education about precise thinking and real sense might reduce the number of truly dunderhead things that subsequent generations will do (the current generations are probably beyond remediation). Wow, this guy's a tool. Doesn't mean he's wrong, though. Without being receptive to answers that we may not want to hear, we might as well just ask ourselves what we want to be true and go with that, which is what many people with so-called common sense (most efficient, but often wrong). 
Which is what a lot of people seem to do anyway. (This article could have used an editor.) Let's be realistic. No one likes to see their "theories" disproven. This is not true. For instance, I have a "theory" (which isn't one in the scientific sense) that technology-using beings, such as us, are extraordinarily rare in the universe, to the point where there's not another one in this galaxy. I'd love for that to be disproven (unless of course it involves their technology blowing up the Earth). And it would be very easy to disprove it. Anyway. Quibbles aside, the main thrust of the article is something I absolutely agree with. Which of course makes it suspect. But I can live with that. |
I don't have a lot to say about today's article; I just think it's a good, fairly simple, example of how science gets scienced. Which Weighs More, a Pound of Stone or a Pound of Styrofoam? It’s not a trick question: your brain answers differently, depending on whether the materials are part of the same object or not Ah, but it is kind of a trick question, isn't it? There's objective weight, which can be measured by a scale, and subjective weight, which is what your muscles anticipate and feel. Consider two packages of the same weight, one large, one small (maybe the large one contains nothing but packing pillows). The small one will be easier to pick up and carry if only because of the bulk involved. That's not what this article is about, though. For more than a century, scientists thought they knew the answer to a curious question: why does 10 pounds of a low-density substance such as Styrofoam feel heavier than 10 pounds of stone? It isn’t heavier, of course, but repeated experiments have shown that it feels that way. Science tip #1: Just because you think you know the answer, doesn't mean you do. Always test. Now psychologists say their initial explanation may have been incomplete, and the new explanation could have far-reaching consequences, including for the way Netflix designs the algorithms that recommend movies to its customers. Science tip #2: Your experiment doesn't have to have practical, everyday applications. But if you can come up with one, it'll be easier to communicate it to the teeming hordes. The article goes on to describe the experiment and their results, which, as the headline hints, are that people are ass at guessing weights. Or something. Knowing how the brain estimates weight isn’t just an interesting experiment—it can actually help scientists develop smarter technologies that we use every day. 
Now that we know more about how context changes the brain’s decisions, programmers might be able to update technologies such as Netflix to imitate the brain more accurately and provide more fine-tuned recommendations for users. Article is from 2019. Netflix is still trying to recommend shows and movies to me that there's no way in hell I'd watch. Anyway, like I said, not much else to say, and you'll have to go to the article to see what the actual experiment was, because it wouldn't be easy to take any of it out of context. And I'm all about easy. Give me the smaller package. |
I'm not above making up words if I don't know a good one for the context. Or even just for fun. None of them have caught on, but English has some words that caught on for a while, and then... caught off? If your dream is to talk like Moira Rose from Schitt’s Creek... Who? No. ...look no further than Mrs. Byrne’s Dictionary of Unusual, Obscure, and Preposterous Words, one of the dictionaries Catherine O’Hara used to tweak her iconic character's lines. Still not interested in the show. The following terms for everyday things are ones you'll want to add to your lexicon ASAP. People who grew up with the internet seem to think they had the monopoly on turning words into acronyms. They did not; ASAP predated widespread use of computers. Some sources claim it's about a hundred years old. They also didn't invent weed or sex. Just saying. On the other hand, a bunch of words that people think were acronyms weren't. Those are called backronyms because people love portmanteaux (I, on the other hand, do not). An example of a non-acronym is the ever-useful F word. Anyway. The article lists too many words for me to copy all of them, and besides, lawyers exist. So I'll just highlight a few. 3. Baragouin Another word for gibberish that dates back to the early 1600s. But why bother? Gibberish is easier to spell, pronounce, and remember, and also has the advantage of sounding like what it is. 4. Bumfodder Why yes, this is a 17th-century word for toilet paper. Again, easier to say "bumwad," or even "loo roll" if you're of a British bent. 9. Clinchpoop If you get into a confrontation with a jerk, consider calling them a clinchpoop, which the OED defines as “A term of contempt for one considered wanting in gentlemanly breeding.” I mean, sure, hit 'em with that word if you want them to hit you with their fists. 13. Eructation A fancy word for belching... Everything sounds more proper when using Latin root words. That's one reason we have so many. Defecation. Urination. Flatulence.
This one, though, is just showing off. 18. Forjeskit “Forjesket sair, with weary legs,” Scottish poet Robert Burns wrote in 1785’s “Second Epistle to J. Lapraik.” It was the first use of the word, which means “exhausted from work,” according to Mrs. Byrne’s Dictionary. Still no definitive word on whether Scots is a dialect of or sister language to English, but I'm pretty sure this word belongs in that language. That's not the only Scots word at the link. 24. Join-hand Another word for cursive handwriting. Some words went obsolete for a reason. 31. Maquillage Another word for makeup that dates back to the late 1800s. And that one's French. Yes, English stole a bunch of words from French, but most of them came from way before the late 1800s. 32. Matutolypea According to Mrs. Byrne’s Dictionary, this term means “getting up on the wrong side of the bed.” Macmillan Dictionary notes that the word “is derived from the Latin name Matuta from Matuta Mater, the Roman Goddess of the dawn, and the Greek word lype meaning 'grief or sorrow.’” Unnecessary. Cumbersome. 40. Ombibulous According to Mrs. Byrne’s Dictionary, ombibulous describes “someone who drinks everything.” It was coined by H.L. Mencken, who once wrote, “I am ombibulous. I drink every known alcoholic drink and enjoy them all.” Finally! One I have reason to steal. 47. Scacchic “of or pertaining to chess,” according to the OED. As far as I've been able to figure out, the French word for "chess" is the same as the French word for "failure." Unsurprisingly, it seems to be related to this one, which comes from Italian. 51. Tapster Another word for a bartender. Hey look, another one I can actually use. Anyway, like I said, many more at the link. Personally, I think most of these, however unusual, have better words to describe their concepts. But some writers seem to take great delight in vexing their readers with obscure synonyms, so the article might be useful to them. |
About a year ago, I commented on a Bloomberg article about dining economics. This entry: "Oh Yeah, I Vaguely Remember 'Dates'." I had to go find it because today's Atlantic article rang a couple of chimes in the old belfry. It's not the same thing, so here is what you might call Dining Economics, Part 2. The Economic Principle That Helps Me Order at Restaurants If you’re just eating one dish, you’re missing out. I order at restaurants using just one criterion: Does it look like I'd enjoy it? The only "economic principle" involved in dining out should be "Can I afford this?" In the 19th century, when European thinkers began developing the economic principle of diminishing marginal utility, they probably weren’t dwelling on its implications for the best strategy for ordering food at a restaurant. Probably because they were doing their thinking at taverns, not restaurants. The basic concept that these early economists were getting at is that as you consume more and more of a thing, each successive unit of that thing tends to bring you less satisfaction—or, to use the economic term, utility—than the previous one. This may be a valid economic principle (I'll grant that it is), but for me, it simply doesn't apply to dining. My last bite of delicious steak is just as satisfying as the first. The crust of the pizza makes me just as happy as the point. And my third beer may, in fact, be more satisfying than my first. Recently, Adam Mastroianni, a postdoctoral research scholar at Columbia Business School, invoked this idea in his newsletter, Experimental History, to explain why a flight of beer can be more satisfying than a larger glass of a single brew. “The first sip is always the best sip,” he wrote, “and a flight allows you to have several first sips instead of just one.” This may be true for others. For me, it's not about diminishing marginal utility. It's about trying as many beers as possible without getting too drunk to enjoy them.
Which I suppose plays right into the thesis of today's article, except that I don't have the same desire for variety in food. The same principle, I’d argue, applies to first bites: If the first half of a dish tends to be more satisfying than the second half, why not have the first half of two dishes instead of one whole dish? In other words, when you go to a restaurant, just share every dish with whomever you’re with. That way, you get more first bites. But what if they've ordered something you know you won't like? I've said this before, but just to reiterate: With a few exceptions, such as some appetizers, or pizza, I don't share food. Long ago, I was on a second or third date with someone. On previous dates, she'd pulled the old "I'll just have a salad" thing that some women think all men appreciate, and then proceeded to steal more than half of my fries. So I ordered more fries. Unsurprisingly, this was our last date. While I'm on the subject of sharing appetizers, can someone tell me why they always come in prime-number servings? Like, five. Or seven. Or whatever number is NOT equal to or easily divisible by the number of diners. It's freaking annoying. Or it would be if I didn't almost always dine alone these days, thus rendering whatever psych trick they're trying to pull invalid. Diversification can free you from indecision when you’re torn between menu items that sound equally awesome. For shit's sake, just commit. For instance, it is the answer to the classic conundrum of brunch: sweet or savory? What? The classic conundrum of brunch is: beer, mimosa, or bloody mary? Even I, a prolific meal-splitter, acknowledge that this approach has downsides and limitations. It can be difficult when people have different dietary restrictions or different budgets, and it doesn’t make sense if there’s a dish you know you don’t want to share. I'm especially not sharing my hamburger. That's gross. Or you have to slice it, which is blasphemy for a burger. 
Restaurants do show flickers of awareness that many people don’t want to be locked into eating all of a single dish: They serve buffets, which are basically just meals shared by every customer, and they commonly offer to serve dessert with multiple spoons. Confession: With the exception of some Indian restaurants, and even then only sometimes, I do not like buffets and would rather opt for menu ordering. No, this isn't a pandemic thing; I was like this in the Before Time. But a world in which meal-sharing is the default would represent a shift not just in logistics but in values. Whereas a one-dish-per-person paradigm prizes individual choice—and perhaps even endorses a notion of private property—sharing a meal elevates compromise and negotiation. Big fan of compromise and negotiation, and happy to practice it with something other than my pastrami reuben. In conclusion, no, economic principles (which may or may not be sound in the first place) don't apply to restaurant dining, except possibly the law of supply and demand. |
The best kind of article is the kind that confirms what I already believe. Because it's an epic waste of time? I'm kidding about the "best kind" thing, of course. But let's see what Art of Manliness says about this. No, it's not "make your wife do it." It's not promoting that kind of "manliness." I have a confession to make: I don’t make my bed. I never saw the point in it; I’m just going to mess up the covers again that night. Exactly. Also, I don't like sleeping in a "made" bed. When I'm in hotel rooms, for example, the first thing I have to do in the bed is yank all the tucked-in covers out. I don’t spend much time in my bedroom, and my guests don’t spend any... Well, maybe if you made your bed, it would have guests. I realize that not making your bed has a bit of slovenly shame associated with it. Nope. No shame here. I don't have a drill sergeant coming to bounce a quarter off the sheets or whatever the hell they do in the Army. In fact, when it comes to daily habits, it even has some cool cachet. It’s the kind of foundational habit a four-star naval admiral could base a commencement address, and a book, around, and even claim could very well help you change the world. You know what you could be doing in the ten minutes you've wasted making a bed? Changing the world. Or, better, sleeping an extra ten minutes. Believe me, when I was working, that ten minutes was damn precious. Given this cool, cleanly cachet, I felt surprised (and a little vindicated) when I came across the following passage while recently reading A Bachelor’s Cupboard — a manual on independent living for young men, published in 1906: A woman who, as the mother of several sons, has many young men as guests at her large country house, says she can invariably judge a man from the care he takes of his room. A young man who has been well brought up, she says, never fails to turn back his bedclothes [sheets and blankets] upon arising in the morning... 
So, hang on, don't make your bed, but do futz with the covers so they're all folded and whatnot? Still a waste of time. So, the standard for neatness and cleanliness a century back was the opposite of what it is today: rather than pulling his sheets and covers back over his bed up to the headboard, a well-bred gentleman was supposed to drape his bedding over the footboard, leaving both the blankets and the sheet-covered mattress entirely open to the air. Such an airing out was thought to promote freshness and good health (hence why you would also place your pillows by an open window). I will note that 1906 was before washing machines became a widely adopted thing. Back then, washing sheets and blankets would have been an all-day chore. Nowadays, you can even do it as often as once a year. Kidding. But I did just have to buy a new washer/dryer because the old ones broke under the weight of my bed covers. In 2005, a study was published which found that not making your bed may be better for you than making it. "A" study? Not impressed. I don't not do it for scientific reasons, but out of sheer laziness. More than a million dust mites live in your bed. These microscopic critters feed on the flakes of skin you slough off in your sheets, and thrive in warm, moist environments. Remember that next time you're in a hotel room with their sanitized sheets (not so sure about the blankets, though). I'm always sneering at the clean-freaks who take black lights to hotel rooms and never once consider that they're sleeping in far worse conditions at home. Plus, then they go to Airbnbs, which don't have the big industrial-scale steam power washers. Sleeping with dead, dehydrated dust mites may not seem significantly more appealing than lying with moist, live ones, but it’s their fecal matter (yes, dust mites are pooping in your bed) that triggers allergies and asthma. As I don't have allergies or asthma, I'm pretty sure I'm doing it right. 
Of course, to get the full, freshening effect of this, you should drape your bedding over the footboard of your bed, as The Bachelor’s Cupboard instructs, rather than leaving your sheets in a half-on/half-off rumple. Who the hell has a footboard these days? Okay, me. I do. This is because I went through a period when I'd wake up with severe calf cramps, and pressing my foot against the footboard for a stretch is less disruptive of my sleep cycle than standing up to do it. So I bought a bed frame. Anyway, point is, make your bed, or don't make your bed. I don't care. I'm far more concerned about how you treat other people than what you do in bed. |
Hot on the heels of my citrus-themed Comedy newsletter this week, an article about orange juice. (If you missed the NL, here's a link: "Lemons" ) How orange juice took over the breakfast table Orange juice used to be a treat you had to squeeze out yourself. More than a century ago, an overproduction of oranges helped create the morning staple we know and love. Love? Speak for yourself. I'd rather eat an orange. Curious, I tried to look up whether the fruit or the juice is considered better for health; unsurprisingly, as with so much of nutrition science, I got mixed results. I shan't post the links here; just google "orange or orange juice." It’s bright, but somewhat boring, and bears the dubious halo of being something good for you. Few of us give it much thought, other than to recall its oft-trumpeted Vitamin C content. Lots of things have Vitamin C. I've gone months without citrus products, and most of my teeth haven't fallen out. Except in that dream. You know the one. But processed orange juice as a daily drink, you might be surprised to learn, is a relatively recent arrival. Its present status as a global phenomenon is the creation of 20th-century marketers, dealing with a whole lot of oranges and nowhere to dump them. Of course it's marketers. It's always marketers. The same people who convinced us to buy chicken wings, pet rocks, and bottled tap water. Much has been made about how "Evian" spelled backwards is "naive," but that's ridiculous; after all, "star" spelled backwards is "rats," for example. No, I'd be way more concerned that one of the French words for the kitchen sink is "évier." But I digress. The fruits were shipped all over and eaten fresh or juiced in the home, producing a delicious honey-coloured elixir. California relied on the navel orange and the Valencia orange; the latter was the best for juicing. The navel orange itself is interesting. You know how it's got this little knob inside, by the "navel"? That's a buried other orange. 
It's apparently the tradeoff from genetic engineering for them to have no seeds. Seeded oranges are annoying; they're too much work. Citrus itself is quite interesting, genetically speaking, but that's another topic. Florida, however, grew four varieties, and all of these were decent juice oranges. That meant that when, in 1909, the growers met to deal with a burgeoning problem – a glut of oranges, too many for the market to bear – juicing them, rather than curbing their production, was considered a feasible solution. For context, in-home refrigerators didn't start to be a thing until like the 1920s or 30s. Be that as it may, oranges, juiced and otherwise, were the subject of a strenuous advertising campaign by orange interests in the 1920s, when the discovery of vitamins was a current event. Vitamin: what you do when your friends drop by. Vitamin C was a perfect reason to consume more oranges. Things really got off the ground when nutrition personality Elmer McCollum popularised a mysterious ailment he said resulted from eating too many "acid-producing" foods, like bread and milk: acidosis. Right, because no one associates citrus with acid. In fact, true acidosis, which has a variety of causes, cannot be remedied by eating lettuce and citrus, as McCollum claimed. But that didn’t stop the imagination of the citrus industry from taking advantage of this new fear. Classic marketing: If demand doesn't exist, create it. I've mentioned my First Rule of Comedy before: "Never let the facts get in the way of a good joke. Or a bad one. Especially a bad one." Well, substitute "marketing campaign" for "joke," and you have the basics of marketing conquered. The promise of a new way to make juice that could be kept frozen, then reconstituted in people’s homes, prompted them into even more production, however. They ramped up tree planting in the 1940s. 
Again, though, nothing in here about the parallel development and adoption of the in-home refrigerator/freezer, essential if you're going to sell orange juice concentrate. You might as well write a story about the rise of the automobile and fail to mention gas stations. Wait, this is the BBC. Petrol stations. When John McPhee checked into a Florida hotel for a reporting trip more than 50 years ago, he discovered that even in the heartland of oranges, fresh juice was a dim memory. When you only consume inferior product, you start thinking that it's the best. See also: wasabi, milk chocolate, light beer, dried mashed potatoes. "Next door was a restaurant, with orange trees, full of fruit, spreading over its parking lot," he wrote in his book Oranges. "I went in for dinner, and, since I would be staying for some time and this was the only restaurant in the neighborhood, I checked on the possibility of fresh juice for breakfast. There were never any requests for fresh orange juice, the waitress explained, apparently unmindful of the one that had just been made. "Fresh is either too sour or too watery or too something," she said. "Frozen is the same every day. People want to know what they’re getting." Which also helps to explain why tourists go to McDonald's instead of local restaurants. It had taken a few decades, but with the help of advertising and processing technology, the dumping ground for extra oranges was solidly ensconced as its own product, far outpacing oranges themselves in sales. And don't misunderstand me: people like what they like, and I understand that, even if I sometimes mock it. The problem is that most people only think they've made their own choices. They haven't. Ad agencies have. |
As far as I'm concerned, history ended in 1969; I can trace my first memories to the moon landing that year. After that, it's "lived experience." Part of my lived experience was being a photographer. I learned early on how to develop film—black and white film, that is; color was beyond my resources, if not expertise. Usually I'd take my color rolls to a reputable photo shop. Sometimes, I'd use Fotomat. Fotomat was in the photography business, offering tiny huts situated in shopping plaza parking lots that were staffed by just one employee. Men were dubbed Fotomacs. Women were known as Fotomates, and management required them to wear short-shorts, or “hot pants,” in a nod to the strategy used for flight attendants at Pacific Southwest Airlines. "But at least we're equal opportunity!" Seriously, though, if you don't remember those kiosks, look at the picture in the article. How's anyone supposed to know if the "Fotomate" is wearing hot pants? Or pants, period? Cars pulled up to the Fotomat location and dropped off film they wanted processed. After being shuttled via courier to a local photo lab, it would be ready for pick-up the following day. Meanwhile, everyone involved got a good look at the naughty pictures you'd taken. In the 1960s, Americans were fond of Kodak Instamatic cameras and film. People submitted the familiar yellow spools full of images from weddings, birthdays, trips, and other social events to photo processing labs, which might take days to return prints. "...and this is a picture of my thumb... and a close-up of my thumb..." You might expect me to rag on Instamatics. I will not do so. They had their place. One of the first lessons I learned while doing photography was, "It's not the camera; it's the photographer." This did not stop me from owning a Nikon and lusting after a Hasselblad. 
The concept of a kiosk where people could easily drop off and pick up film that would be ready overnight originated in Florida, where Charles Brown opened the first location in 1965. After buying Brown's stock shares and arranging for a royalty, Fleet and Graham founded the Fotomat Corporation in 1967, with Graham as president and Fleet as vice-president. I didn't know they were quite that old. But that stretches back into what I'm calling "history." Charles Brown isn't exactly a rare name, though I question the morals of any Brown naming their kid Charlie after the mid-50s. Another famous Charles Brown ran AT&T for a while, and, if I recall, oversaw its breakup. Or maybe it was the same guy; I can't be arsed to research it. That Charlie Brown was one of the reasons I got into computers, but that's a story for another day. While it was relatively easy to slot in a Fotomat hut in a parking lot, a business operating as an island surrounded by traffic had its problems. Remembering an old Fotomat in New Dorp on Staten Island, residents on Facebook recalled plowing into the kiosk or backing into it. It had a bright yellow roof for a reason, but Staten Idiots are blind. There was also the matter of bathrooms: there weren’t any. Employees often made arrangements to duck into local supermarkets or other stores when nature demanded it. I guess Amazon followed their lead on a much larger scale. But Graham’s controversial business practices made him a short-timer. In 1971, he was ousted from Fotomat over allegations he was misusing funds for his own personal gain, including his political interests—Graham was a supporter of both Richard Nixon and football player-turned-congressman Jack Kemp, who became an assistant to the president in the Fotomat corporation and referred football pros to become franchisees. Now, see, if I'd have known that, I'd never have used them. Even though 1971 was long before I was able to drive up to one. 
By the early 1980s, Fotomat—now minus Fleet, who had sold off his shares, and Graham—had opened over 4000 locations. That was both impressive and problematic. Fotomat had far overextended itself, sometimes opening kiosks so close to one another it cannibalized sales. You'd think Radio Shack would have learned from that, but they did not. The real death blow for Fotomat, however, wasn’t over-expansion. It was the emergence of the one-hour minilab. Also known in the photography business as "Freakin' Sorcery." The company tried to recalibrate, converting home movies to videotape and even offering VHS rental during the VCR boom of the 1980s, but it wasn’t successful. Maybe if they'd offered a VHS-by-mail service? Fleet, who had exited Fotomat years prior—the company had been sold to Konica—was no worse for the wear. Prior to his death in 1995, he authored a book, Hue and Cry, which called into question the authenticity of works attributed to William Shakespeare. Questioning the authenticity of works attributed to William Shakespeare is a giant red flag that you're an elitist asshole. "How could a commoner have penned such art?" Graham’s future after Fotomat was far more colorful. Promoting a bogus gold mining operation he named Au Magnetics, he promised he could turn sand into gold. Instead, he was accused of fleecing investors. You can turn sand into gold. It would cost more than you'd get for the gold. That doesn't stop people who have a hard-on for gold. As for the Fotomat locations themselves: Following the company’s collapse, many were repurposed into other businesses. Some became coffee shops; others morphed into watch repair kiosks, locksmith huts, windshield wiper dealers, or tailors. Presumably, none of the owners who took over mandated their employees wear hot pants. It's been a while since I've seen one; I think all the old kiosks around here were completely removed. It's not like they were built to last. 
Only their memory remains, a faded photograph on a questionable website. I feel no nostalgia for them, however; they were a product of their time, and today's digital photography is far superior to that of the Instamatic era (though not to the late, lamented Kodachrome). And on that note, this song is appropriate today: They give us those nice bright colors Give us the greens of summers Makes you think all the world's a sunny day, oh yeah I got a Nikon camera I love to take a photograph So mama, don't take my Kodachrome away |
Another Cracked article, this one tickling my confirmation bias, so I'm sharing it. I know I've written about this sort of thing before. Here's one from eleven months ago: "Out There" . Short version: I note the similarities between alien abduction stories and sleep paralysis, because I've experienced the latter far more than I'd like (that is, more than 0 times). If you’ve followed my work here on Cracked or ever had me corner you at a house party, you know that I have a complicated relationship with the supernatural. It’s something which I am deeply fascinated by, desperately want to be true, and am also completely convinced it’s all B.S. of a Q-Ray Bracelet order of magnitude. I can relate, except for the "desperately want to be true" part. I want explanations, not confirmations. Now, I'm not going to do a lot of article pasting today. I just don't have the time. But the article goes into one of the most famous alien abduction stories, Betty and Barney Hill. Then, Abductees generally report a sense of missing time, strange lights, looming ominous presences, and, most notably, some sort of bodily violation, usually with metallic instruments. This is the famous ‘anal probing,’ which, when you think about it, is really bizarre. When we humans discover a new species here on earth, our first step isn’t usually to jam stuff up its butt. Never had that happen to me, but then, I find I'm way less fascinated by butts than most humans. I was just reading yesterday about some guy who went to the ER with a hot water bottle rammed up his ass (Rectum? Damn near killed 'im.) There is only one way to have that happen, and it is not by accident, even if you think you've managed to convince the ER staff that it was. And it is something that I would never, ever consider. As far as I'm concerned, that orifice is strictly Exit Only. But I digress. Yes, odds are you know someone with an alien abduction story. And if you don’t, you do now – because I have one of my own. 
And yes, he explains this later: Look, I’m not saying there isn’t life on other planets. Probabilistically speaking, there almost certainly is. But I don’t think any intelligent species has visited Earth, at least in any of our lifetimes. We have equipment sensitive enough to detect minor seismic activity on other planets. The amount of energy it would take to propel an object through space at a speed anywhere approaching practical would be so mind-bogglingly enormous we’d definitely know about it. Not only that, even assuming that aliens were visiting from Proxima Centauri, the nearest star system to ours, and even assuming they were traveling at speeds approaching c, we’d still see them coming for about four and a half years. Minor nitpick: no, we wouldn't, because an object traveling at near the speed of light relative to us, headed in our direction, wouldn't... oh, hell, I don't have time to wrestle with special relativity right now. Point is, I still agree with his first points there, as I've banged on in here about way too often. So here’s my theory, based on what happened to me. When I was a teenager, around fourteen or so, I started having night terrors. You don’t remember night terrors, but it didn’t take too long for my grandmother (whom I lived with) and my friends whose houses I would stay the night at to start waking me up in the dead of night, asking if I was okay. Some of you lucky bastards don't remember night terrors. I’d occasionally have nightmares I’d remember, and they’d always be variations of the same themes: I’d be lying down, completely immobilized. I wanted to move, but couldn’t. There was a blindingly bright light and I could hear what sounded like the cadence of speech, but couldn’t make sense of it. There was a looming figure I could vaguely sense, and then there was a blinding pain in my legs. Like hot razorblades being dragged along my calves. 
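To put a number on that nitpick: the "we'd see them coming for four and a half years" claim only holds for a slow ship. For a ship moving at nearly the speed of light, the ship arrives almost on the heels of its own light. A back-of-envelope calculation (illustrative figures, not from the article; Proxima Centauri is roughly 4.24 light-years away):

```python
# How much warning would we get of a ship from Proxima Centauri?
# Light from the departure arrives after distance/c; the ship arrives
# after distance/v. The warning is the gap between the two.

distance_ly = 4.24   # distance in light-years (approximate)
v = 0.99             # ship speed as a fraction of c

light_arrives = distance_ly / 1.0   # years until we could first see it leave
ship_arrives = distance_ly / v      # years until the ship itself shows up

warning_time = ship_arrives - light_arrives
# At 0.99c this comes out to roughly 0.04 years -- about two weeks of
# warning, not four and a half years.
```

So the 4.5-year figure is really the light-travel time, not the heads-up we'd get; the closer v gets to c, the shorter the warning.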
Apart from the pain, which I fortunately haven't experienced, that sounds a lot like my own experience with sleep paralysis. In other words, I had almost all of the signs of an alien abduction. But I know I wasn’t abducted by aliens. In fact, I know exactly what happened. See, when I was thirteen, I had a surgery on my legs because I was prone to walking on my tiptoes and it was giving me the back pain of a retired roofer. And that might explain why my experience is different from his: I never had surgery as a child. So, yes, I think that people have been waking up from general anesthetic during surgery, but remember it mostly subconsciously, like a dream. The bodily violation. The feelings of fear and immobility, like sleep paralysis. Confusion. Bright lights. Menacing creatures with bluish-gray skin, no nose (and sometimes no mouth), often with no hair, and strange, dark eyes? What, you mean like this? [picture of surgeon] I think that's a fair hypothesis. It certainly doesn't explain every report of alien abduction, but it doesn't have to. People probably start feeling all of this terror and strange dreams, and, unsure what’s causing it, latch onto the culturally ubiquitous idea of alien abduction. Remember when I mentioned that the Hills’ case, which set the pattern for the prototypical alien abduction story, took place in 1961? Well, in 1956, halothane was introduced as an inhalational general anesthetic. Sounds like a valid point. Of course, I wouldn't accept it as absolute truth until they do serious studies on it, but it's absolutely worth looking into. Fentanyl, another drug that came to be used in general anesthesia, was first synthesized in 1960. All of this happened right around the time that there was an explosion in alien abduction claims. Right now, we have possible correlation, but no evidence for causation. So, sure. I've been saying that alien abduction experiences track with sleep paralysis, but this works too. 
It could very well be both, or even have other explanations. Like I said, though, don't think I'm just swallowing this whole. That would be just as rash as wholeheartedly believing in alien abduction itself. But it's a perspective that might have some merit. |
I've mentioned before that if you ask four economists a question, you'll get six answers. None of them will be right. If you want "right," you have to go to comedy sites like Cracked. Okay, so not completely right. Plenty of people want to work, and would even if they could still eat without working. But I'm not one of those people, so I'll accept the title as hyperbole, not literalism. An older generation disapproving of the incoming one is a tale as old as time. I don't disapprove of the younger generation. I disapprove of every marketing gimmick assuming that they're the only ones that matter. Well, every marketing gimmick except ads for Depends. And even there, I'm not so sure. The latest bug to crawl up the butt of the boomer generation, in what they’re calling “the Great Resignation,” is the idea that these ding-dang gosh-darn millennial snowflakes just simply don’t want to work anymore. Considering that the youngest of what marketers have decided is the Boomer generation are pushing 60 right now, I know who doesn't want to work anymore, and it's the folks near or past retirement age. The definition of "retirement" is "not wanting to work anymore." Instructor at University of Calgary and Twitter user Paul Fairie, smelling the stench of an argument endlessly made, put together a thread of quotes from publications going back a full century accusing workers of not wanting to work. It's an image, so you'll have to go to the link above to look at it. In summary, it's variations of the phrase "no one wants to work anymore" in publications going back to the 19th century. Once again, the people in need of labor for incredibly unrewarding jobs have pointed their finger at a willpower-less public. After all, they’re talking about fruit-picking, a job that is literally often used as a punchline or example of undesirable work. 
One that is now the go-to example of the work taken by desperate and less-than-documented migrant workers because, well, the job f**king sucks. I can think of a few jobs I'd hate more than fruit-picking, but the point stands. What’s even more bizarre about these froth-mouthed claims is that simple economic data doesn’t back up the narrative. The current unemployment rate in the U.S., via the Bureau of Labor Statistics, is 3.6%. This is one of the lowest unemployment rates in the last 50 years or more. Mostly I'm including this quote because you're going to hear people claiming that unemployment is at an all-time high. Usually followed by "Vote for me in November." I guess politicians want to work. Or at least provide the appearance of working. So apparently, the amount of people that want to work is the highest it’s ever been, but no one wants to work anymore. How do these two things co-exist? It’s pretty simple: people don’t want to work TERRIBLE JOBS, and they don’t have to. And as far as I'm concerned, this is great. There's a word for being forced to work a horrible job just so you don't die, and it was supposedly outlawed in the 19th century here in the US. The unemployment rate shows that there is a large supply of jobs. If no one wants to work for you, that likely means whatever job you’re offering is in very low demand. So maybe do what the businessman you hold in such high regard would do in this kind of situation: examine why what you’re offering sucks so much that nobody wants it. Of course, it's not always about the paycheck, or at least not just about that, but you know how people would say "I wouldn't do X for a million dollars?" Well, how about ten million? "Hm, I'll think about it." Is paying your workers a living wage not financially viable? Well, unfortunately it seems like your genius business plan was actually financially unviable and was held afloat by simple desperation. 
We can argue about what a living wage really means, but at the very least it means a single person being able to afford water, food, shelter, and electricity. The words you’re looking for aren’t “nobody wants to work anymore,” they’re “I can’t find anyone desperate enough to do awful things.” Which, for anyone looking at a picture beyond numbers, isn’t the worst state of things. It's not just "doing awful things." I derive immense satisfaction from stories about people quitting decent jobs because of shitty management. Because they can. Employers (remember, I used to be one) and managers have had the upper hand for far too long, and it's about time someone gave them a reality check. By the way, the entire statement “no one wants to work anymore” is inherently dumb. Nobody likes to work, except for mattress testers and candy tasters. People work because it’s necessary to live and do the things in life that actually bring them joy. This is the only part of the article I have an issue with. I know several people who, if they weren't somehow contributing something (aka working), would feel bad. Like I said, I'm not one of those people, but I know they exist. The open question is: would they take that job as a sanitation technician (second class) if they didn't need the money? In conclusion, we're at the point right now where demand is high and supply is low. Classically, when that happens, prices go up to balance things. That's Econ 101, which I actually passed. The product in question, though, isn't bananas, but labor. And yes, this may be a cause of inflation in the short term, but in the long term, you have more people with more money they're willing to spend, and if production ramps up to compensate, things stabilize again. If not, if inflation continues at an astonishing rate, then we've just proven what I've been saying all along: capitalism requires an underclass. Don't like it? Change the system. 
Not my problem anymore, because I don't want to work at solving it. |
We're going to have a bit of pun today. Difficulty: other language. Why the Japanese Calendar Is Full of Unofficial Food Holidays A linguistic quirk created both Banana Day and Strawberry Day. Over the last few years, an increasing number of unofficial food holidays have popped up all over the Japanese calendar. Banana Day, Strawberry Day, Curry Day; the list goes on. The only things better than food holidays are drink holidays. I like to think it's always Beer Day somewhere. Each is celebrated with a flurry of online attention and clever marketing tie-ins from restaurants, trade groups, and food companies. Because of course it is. But how does a random date become Strawberry Day, for instance? Part of a larger trend known as dajare no hi, or (bad) pun days, most of these celebrations depend on wordplay, to which the Japanese language is particularly well-suited. You know, if I'd known how much learning other languages would up my pun game, I'd have done it long ago. I only know a few Japanese words, mostly ones related to food, martial arts and cars, but even there, there are opportunities for puns. Nissan, the car company, for example: I've been calling it "23" because it sounds like the Japanese for 2 and 3. That's not where the name came from, though. To make matters worse, 1 through 5 in Japanese is "ich ni san shi go," which I can never forget now because of its resemblance to "Each Nissan, she go." I'm probably butchering Japanese here. If so, sumimasen. But I do know a bit more French, and I am still amused about the French word for seal (the animal, not the seven things holding back the apocalypse). I've probably mentioned this in here before, but just in case: The word is phoque, but it's pronounced just like a certain versatile Germanic curse. To make matters worse, if you want to say, "I'm going to feed the seals" in French, you'd say "Je vais donner à manger aux phoques." 
Since the final s isn't pronounced, and the "aux" is pronounced almost exactly like "oh," well, don't say that in front of your mom. But I digress. Take Banana no hi (Banana Day) for example. Broken down, the Japanese pronunciation—ba-na-na—corresponds to the Japanese for eight (ba) and seven (nana). The seventh day of the eighth month, August 7, thus becomes “Banana Day.” Don't you dare scoff. We have Star Wars Day here. May the Fourth be with you. The only surprising thing there is that it goes month/day like in the US and not day/month like in civilized countries. Similarly, the Japanese for strawberry—ichigo—corresponds to the number one (ichi) and five (go). Depending on one’s location, Strawberry Day, or ichigo no hi, is marked on either January 15, or even the 15th day of each month. Depending on how many strawberries there are for sale, I suppose. In a 1999 paper for the Japanese Society for Language and Humor Studies, the linguist Heiyo Nagashima explained how puns have existed in Japanese since the time of the Manyoshu, a collection of poetry published in the later Nara Period (710-794). If you have to study humor, it's not funny anymore. Regardless, this long history of puns is probably one reason I've always been interested in Japanese culture. Japanese is also a heavily inferential language, with emphasis placed both on the ability of a writer to convey information as succinctly as possible, and that of a reader to infer the correct meaning. Without knowing that many food-day puns depend on dates, they can be something of a riddle. And that's why I never started any kind of formal study of the language. Pocky no hi: Pocky Day. Observed on November 11, since 11/11 resembles the long skinny shape of the popular sweet snack. Also known as Pocky and Pretz Day, which includes the savory version. Which is probably for the best, not focusing on the other association of that date. 
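The pun-day scheme (goroawase, Japanese number wordplay) is basically a lookup table: digits have syllable readings, and any word you can spell from those readings maps to a date. Here's a minimal sketch using only the readings the article mentions; real goroawase has many more readings per digit, and the function name is my own invention.

```python
# A simplified subset of goroawase digit readings (many are omitted).
DIGIT_READINGS = {
    "ichi": 1, "i": 1,
    "ni": 2,
    "san": 3,
    "shi": 4,
    "go": 5,
    "nana": 7,
    "ba": 8,   # voiced form of "ha", from hachi (eight)
    "ku": 9,
}

def pun_date(*syllables):
    """Map a word's syllables to digits, e.g. a month/day pun."""
    return [DIGIT_READINGS[s] for s in syllables]

# ba-nana -> [8, 7]: Banana Day is August 7
# ichi-go -> [1, 5]: Strawberry Day is January 15 (or the 15th of a month)
# ni-ku   -> [2, 9]: Meat Day falls on the 29th
```

Run it in reverse and you get the riddle the article describes: without knowing the date convention, "why is August 7 Banana Day?" is opaque.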
And I'm glad this one came up out of my rotation today and not tomorrow, which is the anniversary of Japan's surrender at the end of WWII. Niku no hi: Meat Day. Corresponding to two (ni) and nine (ku), Meat Day is commonly marked by stores and restaurants on the 29th of each month, although February 9 (2/9) is also often observed. The use of the Japanese particle "no" is interesting, by the way. From what I understand, in these contexts, it indicates a kind of possessive. Like we use apostrophes. These puns are not just confined to food. February 22, for example, is Cat Day. In Japanese, cats say nyan, rather than “meow,” and nyan may also be read as the number two. 2022 was particularly notable for its 2/22/22 date, which was naturally dubbed “Super Cat Day.” Any country with a Cat Day is okay by me. How can both "nyan" and "ni" be "two"? you ask. Well, it's not unheard of for there to be several names for one concept, and also, these are transliterations. While bad puns in Japanese produce the same eye-rolls as their English counterparts, there’s no denying that clever wordplay and special foods can make ordinary days into something worth celebrating. A pun is only funny to the perpetrator. At least until the victims come after them with pitchforks and/or katana. |
Today's article—something from NPR—mostly applies to the US. People from other countries, feel free to shake your heads in disbelief.
1. Move to another country; or
2. Just die.
But medical billing and health insurance systems in the U.S. are complex, and many patients have difficulty navigating them. This is by design. One of the great certainties in life, apart from death and taxes, is that you will need medical care at some point. Some people have figured out that we don't really want to die (or move to another country), so they find ways to extract money from us for that privilege. Some cost is, of course, unavoidable, but if you consider other actual needs, you'll find that they're free or heavily subsidized:
Air - generally free
Water - Low to no cost, depending on where you live. Mine is provided by *gasp* the government
Food - Can certainly be expensive, but some of the basics are subsidized
Beer - Okay, you got me there. The cheap stuff is basically water, anyway.
The only other basic need that's not cheap is housing. I'm not going to go into the reasons for that, but as with health care, the profit motive can get out of control. If you're worried about incurring debt during a health crisis or are struggling to deal with bills you already have, you're not alone. Some 100 million people — including 41% of U.S. adults — have health care debt, according to a recent survey by KFF (Kaiser Family Foundation). 41% is absolutely mind-blowing. But as you should know, there's debt and then there's debt. I'm technically in debt, for example, paying for this dang computer every month—but it's a 0% interest promotional offer, and I'm prepared to pay it in full before the promotion period expires, which will probably piss Dell off, but I don't care. "It shouldn't be on the patients who are experiencing the medical issues to navigate this complicated system," said Nicolas Cordova, a health care lawyer with the New Mexico Center on Law and Poverty.
But consumers who inform themselves have a better chance of avoiding debt traps. Especially when some of the patients aren't in a position to figure out complicated payment options. People with mental health issues, for example. Or people in comas. Even people with health insurance can land in debt; indeed, one of the biggest problems, consumer advocates said, is that so many people are underinsured, which means they can get hit with huge out-of-pocket costs from coinsurance and high deductibles. I don't think I'm stupid or ignorant, but figuring out what my insurance policy will and will not cover is harder for me than understanding calculus. It seems completely arbitrary. Get the best insurance coverage you can afford — even when you're healthy. Make sure you know what the copays, coinsurance, and deductibles will be. And then watch as you get sick and you find out you're still not covered. If you're uninsured but need health care, you might qualify for public insurance like Medicaid or Medicare. Ask the provider or hospital if they can help you check your eligibility before you commit to a care plan — and then stay with providers who participate in those programs. Easier said than done. I've heard horror stories about people who, while lying helpless in a hospital bed, had a doctor come in to check on them. The doctor happened to be out of network. They got boned. Note, I'm not blaming doctors here. Highly trained medical specialists deserve to be paid. It's a question of who's doing the paying, and how many layers there are between you and the actual doctor. After your doctors map out your treatment plan, check whether all the providers you need to see are in-network and whether any part of the treatment needs to be preauthorized. Ask lots of questions of your insurance provider, doctor's office, or hospital, especially for planned procedures, said Joy Dockter, a lawyer at Central California Legal Services, a public interest law firm.
That's great in theory, and it may even work in practice for, say, a broken arm. But some medical emergencies are absolutely time-sensitive, and/or you're not going to be in any position to ask questions. Additionally, said Mark Rukavina, a program director at health equity advocacy group Community Catalyst, if the drug you want isn't covered by your insurance, ask whether the drugmaker has a patient assistance program; many do, though eligibility requirements vary. My doctor wrote me a prescription for a certain medication. It's very expensive, so the insurance won't cover it. They say they'll cover it when I get diabetes. Not before. Never mind that this medication is supposed to keep me from getting diabetes in the first place. Another oddity: I'm on three pills. They're relatively low-cost compared to that other one. The doc writes me a prescription for a 90-day supply of each of them. I take it to the pharmacy, where I find out that my insurance will only let me get a 30-day supply of each at about $30 each. That's $90 a month. Or, if I tell the pharmacist to forget I have insurance, I can get the 90-day supply—also at a cost of $30. That averages out to $30 a month. I also "forgot" to tell the eye doctor that I had insurance, and thus my cataract surgery was cheaper. If you're uninsured, ask for a cost estimate in advance. Rukavina noted that the federal No Surprises Act, which took effect in January, requires providers to give uninsured patients "good faith" estimates of what planned care will cost. From what I've gathered, those estimates would fit nicely in the Fantasy section of your local bookstore. Almost every hospital offers some form of financial assistance, or "charity care." Each hospital sets its own eligibility requirements but typically will waive or discount bills for patients earning less than two to three times the federal poverty level. (Three times the federal poverty level for a household of four in 2022 would be $83,250.) 
I'm a household of one (housemate doesn't count in this calculation), and my income is sometimes below poverty level, sometimes above. I assume the income calculation is based on last year's tax return. What if I need care in the year after my income was slightly larger? Even if you're not sure whether you qualify, it's worth trying. Gather up documents such as pay stubs or income tax returns. Do not expect this to be an easy process. For example, Walker said, health care providers often require documentation to be faxed. "One of the most common refrains I heard from experts: Persistence pays," Walker said. So, hypothetical scenario: I'm lying in bed, unable to move, barely able to even stay awake, and I'm supposed to gather up all my old tax returns, then find a fax machine (nearest one's at a Staples about two miles from me, and I don't have a car)? Ambulance services, which can lead to huge bills, might offer charity care programs, so ask whether you qualify. Another hypothetical scenario: I've just been hit by a bus, still alive but concussed, and I'm supposed to ask the EMTs if I qualify for charity care? Keep an eye on costs as they come up, said Louisville cancer patient Lori Mangum, who is now chief operating officer of Gilda's Club Kentuckiana, a cancer support group she relied on. Ask a family member or a support group to help you keep track, she said. I don't have family or a support group. Consumer protections in the No Surprises Act should help limit out-of-network charges. HAHAHAHAHA Rukavina noted that if you are not insured or not using your insurance and asked for an estimate in advance, you can dispute bills that exceed the estimates by $400. For patients seeking more information about the No Surprises Act and what it covers, Rukavina recommended calling the government's No Surprises Help Desk at 800-985-3059. I haven't called that number, but if I did, and got put on hold for four hours, that would be No Surprise.
That's what happens when I contact any other business or government agency that's supposed to "help" and gets no direct benefit from it. If you know you cannot pay the bill, negotiate with the hospital administration or billing department. "That's almost always possible" because hospitals want to avoid the costly administrative burden of sending bills to collections, said Ge Bai, a professor of accounting and health care policy at Johns Hopkins University. You know what would avoid costly administrative burdens? A system where everyone pays based on their income, and everything that's not an elective procedure is covered. And isn't tied to work. Yeah, we're back in the Fantasy section. But I look at it this way: I don't have kids, and yet I finance public schools in two localities through taxes. I'm okay with this, because an educated populace is in my—and everyone's—best interests. Similarly, a healthy populace is in everyone's best interests. And in the end, there's no real difference between paying for health insurance and paying taxes. But a simpler system would benefit everyone except the leeches who currently suck the blood out of every medical transaction. Unfortunately, those leeches are also lobbyists, so it ain't gonna happen. |
Sometimes, I can relate to the things I share here. This is not one of those times. You just... eat. That's how. But I guess that's not enough for extroverts. In the spring of 2020, as my world shrunk to the square footage of my apartment, food became a mode of injecting pleasure and delight into an otherwise bleak and lonely period of my life. Oh great. A pandemic whine. As time passed, I wondered when, or if, I’d get to dine with friends and family again. I entered a state of despair. Don't get me wrong. I enjoy sharing meals with friends and family. But for me, the despair would be if I didn't know when I'd get to eat alone again. I relied on books, Netflix, and even work to distract myself at dinner. The horror. Eventually I downloaded TikTok, and then that became my new dining companion. The horror! (This time it's not meant sarcastically.) I began seeing myself mirrored on my “For You” page, which served up videos of other people eating alone. In the videos, creators talked to their presumed audiences in animated voices: “I’m so proud of you for eating today,” “No matter what, you deserve to nourish your body,” or “I’m going to take a bite, and then you take one.” Why were these people filming an ordinary, solitary experience and sharing it online? And why were millions of strangers, myself included, watching them every night? At this point, I got the feeling that even if the article answered those questions, I still wouldn't comprehend. How do you go about "not eating"? My entire day, no matter what else I'm doing, is taken up with one of four things: Preparing food, eating food, thinking about the next meal, and, while sleeping, dreaming about food. Even if I decide something like, "I need to not eat for the next twelve hours," I'm still thinking about what breakfast will be. Many of these videos are designed to encourage viewers, especially those with eating disorders or mental-health diagnoses, to eat in tandem with the creator. Okay.
Okay, I can accept that some people have mental health issues about it. By no means am I trying to minimize those; I have my own, different issues. But, as with people who are gay or who enjoy anchovies on pizza, I can't fully understand; I can only accept. They found me, in the strange way that the TikTok algorithm knows you better than you know yourself. Which, right there, is enough reason for me to avoid that platform like poverty. ("Like the plague" is overused, clichéd, and it turns out people don't avoid that.) One account that I visited frequently was @foodwithsoy, run by Soy Nguyen, a food influencer based in Los Angeles. I did have a nice laugh at that aptronym, until I got to the "influencer" part and nearly puked. It would be pronounced like "Soy Win." The videos can also balance out messages pushing diet culture and weight loss, says Jaime Sidani, an assistant professor of public health at the University of Pittsburgh. There are real concerns that apps like TikTok can serve as a conduit for harmful eating behavior and poor body image. Okay, so... that much I can appreciate. In conclusion, while I felt no affinity to the article, I'm posting it here in the hopes that it might help someone else. Unless you're already using DikTok, in which case there's no hope for you. |