Not for the faint of art. |
Complex Numbers A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is the square root of -1). For example, 3 + 2i is a complex number. The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, bi. Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Thus, each complex number has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary. Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty. |
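The quoted bit doesn't say which "very simple transformations" it means, but the classic example is the Mandelbrot iteration: start with z = 0, repeatedly replace z with z^2 + c, and ask whether the result stays bounded for a given point c = a + bi. Here's a quick Python sketch of that idea (mine, not the source's) that draws a crude text-mode picture of the complex plane:

```python
# A minimal sketch (not from the quoted passage): the Mandelbrot iteration
# z -> z*z + c, applied across a grid of points c = a + bi in the complex plane.
# Python has a built-in complex type, so "i" comes for free as 1j.

def escapes(c: complex, max_iter: int = 50) -> bool:
    """Return True if the orbit of 0 under z -> z*z + c blows up (|z| > 2)."""
    z = 0 + 0j
    for _ in range(max_iter):
        z = z * z + c          # the "very simple transformation"
        if abs(z) > 2:         # once |z| exceeds 2, divergence is guaranteed
            return True
    return False

# Crude text rendering: real part runs left to right, imaginary part top to bottom.
for im_step in range(12, -13, -1):
    row = ""
    for re_step in range(-40, 21):
        c = complex(re_step / 20, im_step / 10)   # a on the horizontal axis, b on the vertical
        row += " " if escapes(c) else "*"
    print(row)
```

Points marked with an asterisk are the ones whose orbits never escape; squint and you'll see the familiar Mandelbrot blob, which is about as much intricacy as a terminal can manage.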
Mostly, this is just an interesting article from Vice to share. But naturally, I have comments. Some are serious, others, not so much. A Total Amateur May Have Just Rewritten Human History With Bombshell Discovery Ben Bacon is "effectively a person off the street," but he and his academic co-authors think they've found the earliest writing in human history. The idea that an "amateur" might make a discovery isn't all that shocking. People with experience sometimes let that experience get in the way of coming up with fresh ideas, and nothing says "fresh ideas" like a newbie. Hell, Einstein was famously working as a patent clerk when he figured out how most of the Universe worked. This, of course, doesn't mean that an amateur is always going to get it right. What's more important is the "discovery" itself, and whether it will hold up under scrutiny. In what may be a major archaeological breakthrough, an independent researcher has suggested that the earliest writing in human history has been hiding in plain sight in prehistoric cave paintings in Europe, a discovery that would push the timeline of written language back by tens of thousands of years, reports a new study. This, folks, is how you write a lede. And it's even in the first paragraph. These cave paintings often include non-figurative markings, such as dots and lines, that have evaded explanation for decades. Samuel Morse went back in time and left messages? Ben Bacon, a furniture conservator based in London, U.K. who has described himself as “effectively a person off the street,” happened to notice these markings while admiring images of European cave art, and developed a hunch that they could be decipherable. BEHOLD THE POWER OF BACON Now, Bacon has unveiled what he believes is “the first known writing in the history of Homo sapiens,” in the form of a prehistoric lunar calendar, according to a study published on Thursday in the Cambridge Archeological Journal. Technically, if it is writing, then it's not "prehistoric." By definition. Intrigued by the markings, Bacon launched a meticulous effort to decode them, with a particular focus on lines, dots, and a Y-shaped symbol that show up in hundreds of cave paintings. This supports my Samuel Morse time-traveling theory, if we also assume he was horny and thinking about the pubic regions of females. Previous researchers have suggested that these symbols could be some form of numerical notation, perhaps designed to count the number of animals sighted or killed by these prehistoric artists. Bacon made the leap to suggest that they form a calendar system designed to track the life cycles of animals depicted in the paintings. I was wondering how that relates to a "lunar calendar," but fortunately, the author continues to practice good journalism: The researchers note that the paintings are never accompanied by more than 13 of these lines and dots, which could mean that they denote lunar months. The lunar calendar they envision would not track time across years, but would be informally rebooted each year during a time in late winter or early spring known as the “bonne saison.” Hey, that's French. I didn't need years of study to know that this means "good season." On a more serious note, finding out when the calendar ticked around would be pretty cool. Our Gregorian calendar begins nearly equidistant from the winter (northern hemisphere) solstice and Earth's perihelion (that bit's a coincidence). The original Roman calendar on which it was largely based rolled over at the beginning of spring. 
That's why the names of the ninth, tenth, eleventh, and twelfth months start with Latin prefixes for seven, eight, nine, and ten, respectively... but I digress. It's a cycle, so it doesn't really matter what you call the end/beginning, but it might shed some light on the ancients' thought processes. The “Y” symbol, which is commonly drawn directly on or near animal depictions, could represent birthing because it seems to show two parted legs. What did I tell you? I told you. “Assuming we have convinced colleagues of our correct identification, there will no doubt be a lively debate about precisely what this system should be called, and we are certainly open to suggestions,” they continued. “For now, we restrict our terminology to proto-writing in the form of a phenological/meteorological calendar. It implies that a form of writing existed tens of thousands of years before the earliest Sumerian writing system.” I'm not an expert, as you know (I even had to look up "phenological"), but I feel like calling it "writing" or even "proto-writing" is a stretch. "Counting," maybe, I could see. As far as I've been able to learn, writing came from earlier pictograms, and those pictograms stood for actual things in the world. The letter A, for example, can be traced to a pictogram for an ox. Basically, all writing starts as emoji, becomes a system for communicating more abstract thoughts, and then, after centuries of scientific, cultural, and technological advancement, we start communicating in emoji again. But counting? What I don't think a lot of people appreciate is how abstract a number is. There is no "thing" in nature that you can point at and say, "that is the number three." There was a huge leap when someone figured out that three oxen and three stones have something in common; to wit, the number three. So if you only know pictograms, how do you represent three? "3" hadn't been invented yet. You use, maybe, three dots, perhaps representing three stones. It's not a painting of something that exists in nature, like an ochre ox on a cave wall, but a representation of an abstract concept. This may be a classification problem. Numbers are a kind of language, too. And that ochre ox isn't an ox; it's a painting of one. The only way the people of the past can communicate to us is through metaphor. Okay, and genetics. It would be hard to overstate the magnitude of this discovery, assuming it passes muster in the wider archaeological community. It would rewrite the origins of, well, writing, which is one of the most important developments in human history. Moreover, if these tantalizing symbols represent an early calendar, they offer a glimpse of how these hunter-gatherers synchronized their lives with the natural cycles of animals and the Moon. This bit I'm going to quibble with. I question whether early humans separated themselves and their works from nature, as we do today. But that's kind of irrelevant to the story. In short, if the new hypothesis is accurate, it shows that our Paleolithic ancestors “were almost certainly as cognitively advanced as we are” and “that they are fully modern humans,” Bacon told Motherboard. They couldn't have been fully modern humans; they didn't have beer. Jokes aside, though, I wasn't aware that this was in dispute. They didn't have our enormous body of knowledge and experience, but they were just as smart (or dumb) as people are today. Ignorance is not the same thing as lack of cognition. Ignorance can be fixed. Stupid can't. |
Today in "you've got it all wrong," courtesy of Cracked... Just to get this out of the way: something "not making sense" doesn't mean it's wrong; it could mean you're missing information. But there's stuff that doesn't make sense, and then this stuff, which has been proven wrong (or at least not shown to be right). The fields of psychology and psychiatry are incredibly complex. Oh, good, just right for this blog. It’s not too surprising, given that “understanding human thought and behavior” seems more like a question you’d take to some wise man on a mountaintop than something you’d choose as a major. You know why wise men live on mountaintops? Well, one, to hide from their wives. But also because when you climb the mountain and pass all the arduous tests and solve the unsolvable riddles and finally meet the guru, and you ask him a stupid question like that, he can kick you right off the cliff. A lot of the ideas and advice dispensed by TikTok psychologists is obviously flawed, if not outright disproved. This should go without saying, but apparently, I have to say it anyway: don't get your advice from DickDock. So without further introduction (though the article does, indeed, provide further introduction), the circulating misinformation in question: 5. Smiling Makes You Happy This one is the classic bugaboo of anybody with even a smidgen of clinical depression. Vouch. Making it worse is that the person who tells you this is usually the most carefree person you’ve ever met. It would be wrong to punch them, but I understand the urge. The roots of what is called the “facial feedback theory” comes all the way from Charles Darwin in the 1800s, and although Darwin’s got a pretty solid track record, psychology from the 1800s does not. Okay, look: periodically, some outlet (usually affiliated with a group who wants to see the idea of evolution via natural selection go away) proclaims, "DARWIN WAS WRONG." You get the same thing with Einstein. People love to tear down other people who are more knowledgeable and influential than they are (I'm not immune from this, myself). Was Darwin wrong? I'm sure he was wrong about a lot of things, being, you know, human and all. Have some of his hypotheses been overturned? Sure. That's how science works. It's not like some other human pursuits, where the prophet's words are supposedly infallible for all time. Evolution is a solid theoretical framework built on a firm foundation. Psychology... well, it's a bit shakier. Not only that, studies have found that if you’re not in a neutral state, but genuinely sad or angry, forcing a smile can make you feel worse. These studies also found that workers forced to smile all day were more likely to drink heavily after work. As this article points out, the actual evidence is mixed, here. Given the uncertainties, I'd lean toward "stop making people smile when they don't feel like it, dammit." And yes, this includes service workers. Especially service workers. In any event, this particular item is something I'd have guessed anyway, so it passes my personal "sense" test. This next one was maybe more surprising. 4. Brainstorming Is More Creative Brainstorming: the persistent idea that a bunch of brains in a room and a whiteboard can produce more creative ideas than any of those brains alone. Unfortunately, research has found that this can’t always be the case, and for reasons that people who’ve sat through these kind of sessions probably felt at the time. 
On the other hand, I'd wager that a brainstorming session is only useful if the people involved aren't just wishing they were somewhere else. This section goes into exactly why brainstorming isn't all it's cracked up to be, and I won't replicate that here. 3. You Only Use 10 Percent of Your Brain Seriously, people still believe that nonsense? Sigh... I guess because of anchoring bias. You learn something, and often you have to cling on to it in the face of evidence to the contrary. Like believing the last Presidential election was stolen. No amount of facts and evidence will get anyone to change their minds about that. Come to think of it, perhaps those people are only using 10 percent of their brains. This one is another absolute chestnut of bullshit. There are even entire (bad) movie plots based around Bradley Cooper turning into a borderline superhero by turning all the lights on upstairs. I don't remember that one offhand. Wasn't there one with a plot like that with Scarlett Johansson? If you’re saying to yourself right now, “Well, it’s EXAGGERATED maybe, but—,” allow me to refer you to neuroscientist Sandra Aamodt, who tells Discover Magazine, “There is absolutely no room for doubt about this.” Look, when a scientist says there's "no room for doubt?" Then you can have pretty high confidence, on the level of "the sun is bright" and "gravity is a thing." 2. The Power of Visualization I’m sorry for The Secret lovers and vision-board crafters out there (on multiple levels), but the heavily touted “power of visualization” is not only a crock of bullshit, there’s evidence to support that it actually decreases your chance of success. And they won't believe it, like I said above. That’s because when you visualize yourself having achieved whatever your goal du jour is, you get a tiny sniff of the accomplishment of having done it, which can reduce your drive. On the other hand, I can't imagine anything reducing my drive, short of death or coma. What’s a lot more helpful, and a lot less fun (hence its lack of popularity), is specifically visualizing all the work necessary to achieve that goal. Oddly enough, I was thinking about this sort of thing before I found this article. The context was cooking—it occurred to me that I have a habit of mentally going through all the steps for a recipe before actually starting. I don't "visualize" the resulting dish, or at least not longer than it takes for me to go "okay, yeah, I'm hungry," but mentally running through the steps helps me ensure I have all the stuff I need in the kitchen. 1. OCD Means Being Neat This one is as pervasive as it is infuriating. Odds are some type-A friend or acquaintance of yours has said something like, “I’m completely OCD about my workspace.” At least the incidence of using debunked Freudian terms ("anal") to describe it has decreased. As psychology professor Stephen Ilardi explains in the Washington Post, most OCD sufferers are “plagued by a cascade of unbidden, disturbing thoughts, often in the form of harrowing images that they may feel compelled to ward off with time-consuming rituals. It’s a serious mental illness that typically causes great distress and functional impairment.” I knew someone who was diagnosed with severe OCD, a single mom. This didn't manifest as her becoming some sort of neat freak; quite the opposite. Think of the worst hoarding situation you've ever witnessed or heard of. It was that bad. Shit piled everywhere (sometimes literal shit). 
There was even talk about getting the kid out of that situation, but honestly, I didn't pay enough attention to know if that was ever done or not. I don't know enough about psychiatry to know how that sort of thing works. I got the impression that it was something like "if I disturb this pile, bad things will happen, so I'm just going to leave it alone." From what I understand, she got help and is better now, but the article has it right: it's serious stuff, whether it manifests as neatfreakitude or hoarding or anything in between. But while we're at it, can we also stop misusing "type-A?" Thanks. |
Time for another break to take a second look at an entry from the past. Today, the random numbers pulled something from June of 2021, just a few days before a road trip I took. Nothing to do with the road trip, though: "Dream a Little Dream" The linked Guardian article is, unsurprisingly, still up. The main point? To quote the article, "By injecting some random weirdness into our humdrum existence, dreams leave us better equipped to cope with the unexpected." That is, to be clear, a hypothesis, at least as of when the article was published. Now, what I should do is track down any updates or changes to the science since the article's publication, but to be honest, I can't be arsed right now. I'm in intermittent pain from that tooth thing I talked about a couple of days ago, and the only time I can get decent sleep is the "less pain" phase of "intermittent." So I'm being lazy. What I find relevant right now is the "random weirdness" part, since, yesterday, I noted the benefit of randomization to help break from thinking habits. That was in relation to tarot, but after getting this (random) result today, the first thing I thought of was how dreams are often symbolic, and people sometimes search for meaning in them. Seems parallel to me: dreams and tarot. Again, I'm not proposing anything mystical here, just our propensity to seek meaning in symbolism. The main difference, I think, is that the tarot uses other peoples' symbols, some from very long ago, while dreams are (for now) uniquely yours. There's probably some overlap, naturally. But I wouldn't put any trust in "dream interpretation" books or sites; none of them can know what a particular image in a dream means to you. And of course it might mean nothing at all, but that doesn't stop us from looking for meaning. There's nothing wrong with that, provided you don't run around claiming to have had the One True Last Inspiration. That's annoying to the rest of us. |
Today's article is a few years old, but it's not like the subject matter has an expiration date. With their centuries-old iconography blending a mix of ancient symbols, religious allegories, and historic events, tarot cards can seem purposefully opaque. To outsiders and skeptics, occult practices like card reading have little relevance in our modern world. But a closer look at these miniature masterpieces reveals that the power of these cards isn’t endowed from some mystical source—it comes from the ability of their small, static images to illuminate our most complex dilemmas and desires. Symbolism is a powerful thing, and there's nothing supernatural about it. It's not necessary (or desirable, in my opinion) to "believe in" the divinatory aspect of Tarot to appreciate the art that goes into it—just like you don't have to be religious to admire the art in the Sistine Chapel, or the architecture of Angkor Wat. The article, as with the one a couple of days ago, contains illustrative pictures, which are a pain (and probably a violation of something) to reproduce here. But, as with an old issue of Playboy magazine, it pays to read the article in addition to looking at the pictures. Even the earliest known tarot decks weren’t designed with mysticism in mind; they were actually meant for playing a game similar to modern-day bridge. Wealthy families in Italy commissioned expensive, artist-made decks known as “carte da trionfi” or “cards of triumph.” These cards were marked with suits of cups, swords, coins, and polo sticks (eventually changed to staves or wands), and courts consisting of a king and two male underlings. Tarot cards later incorporated queens, trumps (the wild cards unique to tarot), and the Fool to this system, for a complete deck that usually totaled 78 cards. The relationship between Tarot decks and the common French playing cards used for casino games and solitaire is a bit murky, but there are clear parallels: the Fool corresponds to the Joker; there are three court cards instead of Tarot's four; and cups, swords, coins, and sticks have their equivalents in hearts, spades, diamonds, and clubs. The rest of the article deals with the history of Tarot, both factual and speculative, and it touches somewhat on other decks. Again, the illustrations are what makes this really interesting. I find randomness appealing in part because it can provide a needed break from one's thinking habits. You randomize a deck of cards by shuffling them; you then draw something that's unexpected, though within the parameters of the deck. It's kind of like the system I use to pick topics here, selecting from a curated list. Being random ensures I don't always pick the easy ones, or stick with a theme for very long. Randomness isn't mysticism, of course; it's just that, sometimes, it can help you jog your mind into a different direction. We see patterns in the randomness, and perhaps meaning, but the meaning is what we decide it is. And sometimes it's fun just to look at the art and see all the details. |
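Since I compared my topic-picking system to shuffling a deck, here's roughly what that shuffle-and-draw mechanic looks like in code. It's a sketch only; the topic list below is invented, standing in for my actual curated queue:

```python
import random

# Invented stand-in for my curated topic queue.
topics = ["tarot history", "candy corn", "pizza boxes", "cave markings", "corporate buzzwords"]

# Shuffle-and-draw, like a deck of cards: randomize the order, then take the top card.
deck = topics.copy()
random.shuffle(deck)
print("Today's draw:", deck.pop())

# A single draw without disturbing the list works just as well:
print("Or simply:", random.choice(topics))
```

Either call gets you a draw that's unexpected but still within the parameters of the deck, which is the whole point.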
After a visit to the dentist, I'm on a course of antibiotics for a week because of a tooth thing. This means no drinking. 8 hours in. Send help. Funny thing is, I go a week without drinking, no problem, quite often. It's only when they say I can't that my oppositional defiant disorder kicks in. Kind of like how I've never particularly enjoyed grapefruit, but as soon as I started taking a medication that forbids grapefruit, I started craving it. It's not even like I "can't" drink; it's just that alcohol negates the action of antibiotics, rendering them less effective (the precise opposite of what grapefruit does for statins). Today's article has nothing to do with that, except that the subject matter is enough to make me want to drink more. “I’m just circling back to discuss how culture has changed within this new normal we’re in, hoping we can move the needle on this and think outside of the box.” If I were playing the bizspeak drinking game, I'd already be passed out after that sentence. But unlike talking about how it’s abnormally chilly out, no one really likes chatting in overused corporate phrases. Apparently, many do. Mostly middle-management, I'd wager. It's been a long time since I was in an office setting, and even then it was a small office, and I still got subjected to the pin-putting and circling and such. More than one in five workers dislikes corporate buzzwords... See? The majority doesn't dislike buzzwords. Below are the top 10 annoying phrases most hated among your coworkers: You're damn right I have things to say about these. 1. New normal This is probably a pandemic-related thing. Shit changes all the time, but the situation in early 2020 was more of a discontinuity than the usual gradual change. 2. Culture (e.g., “company culture”) I'm not sure this is so bad as long as it's not overused [Narrator: it's overused]. 3. Circle back Pretty sure I remember hearing this one, and it annoyed me. The phrase that accompanied it was often "put it on the backburner," which annoyed me even more, especially when it referred to something I was working on. 4. Boots on the ground There is no excuse for this unless you're literally fighting a war. And by "literally," I mean "literally." 5. Give 110% I blame sports for this bullshit. The worst bizspeak, in my view, comes from sports. Even if this were physically possible, which it is not, are you going to pay me 10% more if I do this? No? Then I'm not going to do this. 6. Low-hanging fruit As metaphors go, this one's not so terrible—unless it's overused [Narrator: ...sigh]. 7. Win-win Seriously, stop. Though it is nice to occasionally hear evidence that it's not a zero-sum game. 8. Move the needle ...once it's already jammed into your eye 9. Growth hacking Okay, that's a new one for me, and it's legitimately enraging. 10. Think outside the box The problem with the idea of thinking outside the box is that most people can't even think inside the box, which is a necessary first step. This is also known as "thinking." For example, say that your problem is you want to save money. The "thinking" solution is to find where you're spending too much money, and cut back. The corporate "thinking outside the box" solution might be to cut 1/3 of your workforce and make the other 2/3 do all their work without giving them raises. If you were really "thinking outside the box," though, you'd stop paying everyone and fuck off to Fiji. 
Despite disliking buzzwords, three-fourths of respondents said that using these phrases can make someone sound more professional. It certainly makes them sound more corporate. But not all buzzwords are annoying. Preply respondents favored terms like “at the end of the day,” “debrief,” and even “sweep the floor.” No, no, and no. Also no: "It is what it is." Make it all stop. One in five respondents considered jargon in a job description to be a warning sign, with most noting that the language factored in their decision to apply or not. You want to know what the biggest red flag is in a job description? I'll tell you. And it's not necessarily jargon. Here it is: "We consider ourselves family." If you see those words, or anything like them, in a job description, run. Run hard, run fast, and don't stop running until you hit an ocean. Then start swimming. Seriously. Every company that tells you they're "like family" is going to be just as dysfunctional as an actual family; or, perhaps, be an actual family that works well together—in which case you're going to be the Outsider and never quite fit in. The main offenders for candidates were overly optimistic words that suggested an undercurrent of a more tense work environment, such as “rockstar,” “wear many hats,” and “thick skin.” If you want me to be a rockstar, you'd better have the caterers ready to provide me with specialty cheeses and an olive bar. It's right there in my contract; didn't you read it? This reminds me of the secret code of real estate listings, like "cozy" meaning "cramped," "private" meaning "in the middle of nowhere," or "vintage" meaning "draftier than a beer bar." About the only positive thing I can say about these kinds of buzzwords is that they do make fine fodder for writing, especially writing antagonists. So it can be beneficial to learn them. Just remember, if you use them unironically, that means you're the bad guy. |
By now, the true origins of Monopoly (the game) have been circulated pretty widely, so, like me, you probably already know that the official origin story is a bunch of horse hockey. But it's true that the classic game's spaces were lifted from Atlantic City. How Atlantic City inspired the Monopoly board The popular game has a backstory rife with segregation, inequality, intellectual theft, and outlandish political theories. Which made it all the more amusing when, on a trip to an Atlantic City casino, I ended up playing a Monopoly-themed slot machine. More on that later. There have been several attempts to turn Monopoly the game into a Hollywood movie, one with Ridley Scott directing, another starring Kevin Hart. I'm not aware of a single instance of a movie adaptation of a game being anything better than "meh." "But The Witcher." Well, The Witcher started out as a book and the game was an adaptation of that. Besides, that's not a movie but a series. A very good series, in case you haven't seen it. No, you don't need to have read the book or played the games. Point being, even though he directed the greatest movie of all time, even Ridley Scott wouldn't be able to save a movie adaptation of a board game. No one would. Dig deep, and you’ll find racial segregation, economic inequality, intellectual property theft, and outlandish political theories. Dig deep into anything American and you'll find all those things. But let’s start with the board—a map of sorts and a story in itself. This is where you'd have to go to the linked article, as embedding pictures here is a pain in the ass. The map there shows exactly which Monopoly properties come from which streets. To aficionados of the game, however, the names of the streets on the “classic” board have that special quality of authenticity, from lowly Baltic Avenue to fancy Park Place. Those places sound familiar not just if you like Monopoly, but also if you drive around Atlantic City, New Jersey’s slightly run-down seaside casino town. And you will want to drive around if you're there. I tried walking there, for about a mile, in broad daylight, on a weekday, along Pacific Avenue, and got two offers of sex, three offers of drugs (there was a bit of overlap there), and the opportunity to witness a violent confrontation between two locals. On the plus side, I didn't get mugged, so there's that. Atlantic City was never not "slightly run-down." It's only worse now, as the surrounding states have introduced casinos and other gambling venues. The bulk of the article describes the mapping of Monopoly properties to AC streets, and I'm skipping most of that, except: Light purple Three streets branching off Pacific Avenue: Virginia Avenue, a long street towards the northwest; and St. Charles Place and States Avenue, two short spurs towards the southeast. St. Charles Place is no more; it made way for a hotel-casino called the Showboat Atlantic City. It was the Showboat where I played the Monopoly slots. Slot machines suck, but I couldn't resist playing a Monopoly one in Atlantic City. Last I heard, the hotel took out the gambling section, opting instead to concentrate on resort and convention functions. I haven't seen that particular machine anywhere else in AC. They used to have a few in the casinos I visited in Vegas, but those are gone, too. The slots, I mean; not the casinos. The article then delves into more of the history, with all the racial segregation and other fun stuff mentioned above. 
However, unlike Atlantic City itself, it's not all bad: Belying both the binary prejudices of the time and the sliding price scale of the Monopoly board, Atlantic City back then was in fact a place of opportunity where a diverse range of communities flourished. Black businesses thrived on Kentucky Avenue. Count Basie played the Paradise Club on Illinois Avenue. There was a Black beach at the end of Indiana Avenue. For Chinese restaurants and Jewish delis, people headed to Oriental Avenue. New York Avenue had some of the first gay bars in the U.S. An Atlantic City-based board was sold to Parker Brothers by Charles Darrow, who claimed to have invented the game in his basement. Parker Brothers marketed the game as Monopoly from 1935. The rights to the game transferred to Hasbro when it acquired Parker Brothers in 1991. Hasbro also publishes D&D, and they're in the process of destroying that property, too. But the original Monopoly was, as this article notes, the actual antithesis of what Monopoly is. For the full effect, again, check the article, which also includes a graphic featuring an early board, as designed by the credited inventor, whose name was Lizzie Magie. She created two sets of rules: an anti-monopolist one, called Prosperity, in which all were rewarded for any wealth created; and a monopolist one, called Monopoly, in which the aim was to crush one’s opponents by creating monopolies. In the latter version, when a player owns all the streets of one color, they can charge double rent and erect houses and hotels on the properties. Taken together, these two versions were meant to illustrate the evil of monopolies and the benefit of a more cooperative approach to wealth creation. It’s very telling of human nature that it’s the opponent-crushing version that came out the winner. It's more telling of corporate nature, as it was a corporation that published the game. Why would they undermine their own philosophy? And I don't know... maybe if the collectivist version had won out, the divorce rate wouldn't be so high. Never play Monopoly with family, unless you don't want a family anymore. |
Yes, this has been languishing in my queue since October. The article itself is four years older than that, though. Actually, Candy Corn Is Great The reviled Halloween treat, which has deep roots in American history, should have a better rep 1) No, it's not. 2) No, it shouldn't. Candy corn is a vile abomination that could only have sprung from a warped, twisted, sadistic mind. Much like the word “moist” and the music of Nickelback, candy corn is a thing that’s cool to hate. In an article titled “Candy Corn Is Garbage,” Deadspin points to “hobos, serial murderers, and Satan” as the only people who like candy corn; The Takeout, also driven to invoke the devil to describe candy in a candy corn debate, calls it “Satan’s earwax”; Buzzfeed, combining two pariahs in one pithy line, lists “the leftover crumbs stuck in Guy Fieri’s goatee” among things that taste better than candy corn. While it's true that there are things that people love to hate due to bandwagoning, candy corn is not among those things. It's legitimately lame. "Satan's earwax" cracks me up, though. But here’s the thing: They’re all wrong. "That's just, like, your opinion, man." Candy corn, on the other hand, has been around since the 19th century, its roots firmly planted in American soil. You know what else has roots firmly planted in American soil? Poison ivy. What set candy corn apart was its revolutionary tri-color design: those white, yellow, and orange stripes. Done manually, by men pouring heavy buckets of steaming sugary liquid, the labor-intensive coloring process resulted in a visual excitement no other confection could match. As the other candies around at the time were brown (butterscotch) or black (licorice), I can concede that point—for the time when it came out. These days, I doubt it's so labor-intensive, unless you're part of the Robot Union (local 3.14159), and... well, if you want colors, just look at Spree, Skittles, or M&Ms. Today, the two major candy corn manufacturers — Jelly Belly and Brach’s Candy — use largely the same recipe Wunderle did back in the day (sugar and corn syrup, fondant, confectioner’s wax, and various other additions, like vanilla flavor or marshmallow creme). Conveniently, this article glosses over the truth about "confectioner's wax," which is bug secretions. Now, look. I admit I'm playing that for the ick factor. I mean, sure, it's real: there's bug goo coating candy corn. But honestly, that's not a problem for me. Consider that, first of all, lots of people eat insects. I've eaten insects, sometimes even on purpose. There's nothing inherently wrong with eating bugs. And, second, honey is also a bug secretion. Unless you're vegan, this shouldn't necessarily be a problem. If I wanted to get technical, I'd point out that entomologists limit what insects they call "bugs," but for us normal people, "bug" can mean almost any insect. Just getting that out of the way so I don't get comments about it. But no, my problem with candy corn isn't the insect content; it's everything about it. The main difference is that the laborious hand-pouring process has been taken over by machines, which means that they can produce a lot of candy corn: According to the National Confectioners’ Association, American companies produce 35 million pounds, or 9 billion kernels, annually. I told you they used machines. Rise up, my metallic brothers and sisters! You have nothing to lose but your chains! But this prodigious production isn’t met with an equal amount of enthusiasm. 
A 2013 survey from the NCA showed that only 12 percent of Americans think of candy corn as their favorite treat (and they included “gum and mints” as an option, so the competition wasn’t exactly stiff). Still, 12 percent is way too high, in my estimation, for the number of people for whom it's a "favorite." With all the candy corn produced, and the apparent universal disdain for it, something doesn’t add up. One of two things is true: either people are lying about their candy corn opinions, or tons of candy corn gets thrown out each year. I'm guessing both? The notion that candy corn tastes bad is a lie. It’s just not true. There exists a significant fraction of the human population for whom cilantro tastes like the devil's soap. It's a genetic thing. I'm not one of them, though I can't say I love it, either. But if I said "the notion that cilantro tastes bad is a lie," I'd get all kinds of rebukes. Though the primary ingredient is sugar, candy corn’s flavor transcends cloying sweetness, becoming something richer and more nuanced: There’s a nuttiness reminiscent of marzipan, hints of warm vanilla, a buttery flavor belied by the fact that candy corn is, as bags proudly proclaim, a fat-free candy. I don't exactly have the sharpest taste buds, but I do tend to taste nuance in things like beer, wine, scotch, and tequila. Candy corn, however, just tastes like sweet. No marzipan, no vanilla (a flavor I love), maybe a slight hint of butter? Not surprising there, because warm sugar tends to be buttery. Being fat-free is a holdover from the fat-phobic 90s. Who cares if it's fat-free if it's nothing but simple carbohydrates? But we're not arguing about the health effects; it's candy, for fuck's sake. This short texture resembles ear wax, or a candle (two common comparisons), only insofar as it has a slightly waxy exterior, created by the confectioner’s wax that gives candy corn its cheerful sheen. Bug. Secretion. But regardless, critics should beware the logical extension of dismissing a food because its texture resembles something else: Do we hate mochi because it has the texture of a rubber ball? No matter how much I read, there's always a new food I've never heard of. What the hell is mochi? ...oh, a Japanese rice cake. Sometimes, I can be arsed to look something up. (Yes, I admire Japanese culture and love Japanese food; no, I haven't learned everything. This is a good thing.) Do we revile yogurt because it’s the texture of body lotion? No, I revile Greek yogurt because it's the texture of gooey chalk. Do we recoil at flourless chocolate cake because it shares a texture with human waste? Munched on a lot of shit, have you? Leave your texture arguments at the door, please. They’re invalid. Most of the people I know who dislike mushrooms have a problem with their texture. Texture is absolutely a part of the eating experience, and, as with taste, peoples' reactions are going to be different. But I’m not here to denigrate other candies. Other candies are great! Reese’s Peanut Butter Cups are the greatest candy ever made... No. No, they are not. The chocolate is waxy (can't be arsed to find out if that's from bug secretions or not), and the "peanut butter" is dry, vaguely peanut-flavored sugar. I realize that RPBCs make it to the top of the list of "people's favorite candies" on an annual basis, so I know I'm swimming against the tide, here. I'm just pointing this out to show that I don't hold opinions just because they're popular. 
Now, if someone made an RPBC knockoff, only more expensive, with dark chocolate and actual peanut butter, I'd become diabetic within minutes. ...Snickers truly do satisfy... They're not that great. When it comes to chocolate/peanut combinations, though, I'll take a Snickers over an RPBC any day, even though I dislike peanuts but like peanut butter. ...and even tooth-destroying Butterfingers hold a unique place in my heart... On a scale of one to "all the candy bars," Butterfingers are in the solid middle for me. My love for candy corn doesn’t make me an antagonist to America’s most popular treats — and the assumption that it would is at the root of America’s abandonment of candy corn, and, dare I say, many other problems we face today: We seem to have forgotten that we can like one thing without hating another. And finally—FINALLY—the author says something I can agree with. It's okay to like both Star Trek and Star Wars. It's okay to like both Marvel superheroes and their DC counterparts. You could even like more than one sportsball team, if you really wanted to. It's not just this that I take issue with, but also the need to dump everything into "awesome" and "sucks" drawers without considering, as I did with the Butterfinger bar above, that some things are just okay. Now, I should probably point out that I know that this writer is making a point with her editorializing. I recognize it, because I do it myself from time to time. And I kind of see her point, in the general sense: that we should draw our own conclusions about something and not love it, hate it, or feel something in between, just because everyone around us does. She's wrong about candy corn, of course. It's disgusting. But she's right about the overall point. After all this ranting, you may be wondering what my favorite sweet treat is. And I can't really answer that. Even though I don't munch on sugar very much these days, I'll get tired of one and move on to another. It cycles. So I'll just say "Lindt 70% dark chocolate" and leave it at that. So what's your favorite / most hated? |
Lots of stuff about AI floating around. Cars, art, writing, etc. It's not always a bad thing. Artist Uses AI Surveillance Cameras to Identify Influencers Posing for Instagram Dries Depoorter's "The Follower" project combines AI, open access cameras, and influencers to show behind the scenes of viral shots—without them knowing. This article, from Vice, is fairly short, and I found it interesting, partly because of my photography background. Dries Depoorter, the Belgium-based public speaker and artist behind the Die With Me chat app experiment, launched his latest project, The Follower, combining open access cameras and Instagram influencers. On the other hand, I'm not a fan of precious artists. Depoorter recorded weeks of footage from open access cameras, which observe public spaces, and which frequently have livestreams available online for anyone to access, that were trained on famous landmarks, including the Temple Bar in Dublin, Times Square, and the big sign at the entrance of Wrigley Field. This part's important, because it emphasizes just how public this project is. It's not like he had to pull back much of a curtain. The side-by-side comparisons between the casual-seeming photos the Instagram influencers chose to upload, and the footage of them laboring over the perfect way to hold a coffee, sling a jacket over their shoulder or kiss their date reveal how much work goes into a single photo for them—and how inauthentic the entire process really is behind the scenes. As much as I loathe the entire concept of influenzas, and superficiality in general, I mean, that's a big part of what professional photography is: a lot of work. Sure, I spent a lot of time getting candid shots at parties, the kind of thing that anyone with a dumbphone can do now, but those are easy. Getting the right lighting, the right pose, the right composition... that's work, and that's why professional photography is still a thing. “If you check out all my work you can see I show the dangers of new technology,” Depoorter said. I think the dangers are overreported. How about a project that exposes just how helpful some of this stuff is? “I hope to reach a lot of people of making it really simple. I really don’t like difficult art. I like to keep it really simple. I think I’m part of a new generation of artists that work with technology.” Everyone's hypocritical about something, but this juxtaposition—all within one paragraph of the original article—nearly broke my brain. Capturing people in this way, unsuspecting yet fully public, feels like witnessing something intimate but also shameless. Yeah, not really. To me, it feels like exposing the wires in a puppet show, or getting a tour of a clock tower, or watching one of those documentaries on the making of a Hollywood blockbuster: you see how the magic is done. That's not always a bad thing, either; once people know it's not effortless, perhaps they're less likely to feel inadequate by comparison. It's like... you see your favorite celebrity, all slim and attractive, so maybe you feel like you got the short end of the beauty stick or something. But then you realize the amount of work that goes into that, and, okay, maybe it's not so natural after all. There still might be some feelings of inadequacy—in my case, I can't fathom doing that much work for anything—but at least you know there's more to it than just winning a genetic lottery. 
It’s also a reminder that everywhere we go in the modern world, we’re being watched, even when we think we can curate and control what the world sees of us. Isn't that what Elf on the Shelf is supposed to train your kids for? |
Space is cool. And sometimes, we can learn things about it while never leaving our planet. First, confession time: I'm always confused by ton, metric ton, short ton, shit ton, etc. A metric ton is apparently 1000kg, which is the same thing as 1 megagram, which is a separate thing from a megaton, and is a unit of mass, not weight, though on the surface of the Earth, mass units are often used as weight units. See why I get confused? If you're from the US, 15 metric tons is about 33,000 pounds, which for comparison is about the weight of a half-full ready-mix concrete truck. If you're from the UK, it's about 2,360 stone. If you're from anywhere else, it's 15 metric tons. Scientists have identified two minerals never before seen on Earth in a meteorite weighing 15.2 metric tons (33,510 pounds). I'm going to go off on another tangent here. As you know, I'm a big fan of science fiction. Love the stuff. But sometimes it's more fiction than science, like when someone finds an alien artifact and exclaims "This contains elements not found on the periodic table!" or some shit like that. Well, no. You're going to have to do better than that. The periodic table is full; that is, there are no gaps for new, alien elements. Each entry on the table represents a number of protons in a nucleus. You don't get to have half a proton. The only other possibility is elements beyond the end of the current established table, ones that we'd need a particle accelerator to create. While those might exist, their nuclei are so large and unstable that they would have a half-life measured in picoseconds. There is speculation about an "island of stability" of heavier elements with longer half-lives, but even there, they're thinking half-lives of, perhaps, days—still too short to survive an interstellar journey. Sure, it's speculation, so you can pretend there's a superheavy element that's completely stable, but I want to see that in the story, not just "new element!" Unobtainium, my ass. No, I'm still not going to watch Avatar:The Last Waterbender. Okay, so there is one other possibility I can think of for "elements not found on the periodic table": exotic matter. Like, I dunno, maybe atom-equivalents made up of nuclei consisting of particles containing strange and charm quarks, with electrons replaced by tau particles. Speculative stuff like that. I've already banged on long enough, so I'll just say that if you mean exotic matter, freakin' say "exotic matter," and don't pretend that your "unknown element" is a collection of ordinary protons, neutrons, and electrons. All of which is to say that I can easily see someone reading this CNN story and thinking "two minerals never before seen on Earth" and immediately leaping to "exotic matter." No. A mineral is a particular arrangement of a known element or known elements, like quartz (silicon and oxygen), corundum (aluminum and oxygen), pyrite (iron and sulfur), diamond (carbon), graphite (carbon), or chaoite (carbon). What this article is saying is that these unusual arrangements of perfectly ordinary elements don't get formed naturally on Earth (or at least not in sufficient quantity to have been discovered). They have, as the article notes, been created in laboratories. One mineral’s name — elaliite — derives from the space object itself, which is called the “El Ali” meteorite since it was found near the town of El Ali in central Somalia. 
Herd named the second one elkinstantonite after Lindy Elkins-Tanton, vice president of Arizona State University’s Interplanetary Initiative. Well, I suppose that's one way to try to get someone to sleep with you. “Whenever you find a new mineral, it means that the actual geological conditions, the chemistry of the rock, was different than what’s been found before,” Herd said. “That’s what makes this exciting: In this particular meteorite you have two officially described minerals that are new to science.” Technically, they weren't formed under geological conditions. That would imply that they were indeed made on Earth. If they were made on the Moon, they'd be called selenological conditions; on Mars, areological conditions. I don't know what they're called if they're from a random asteroid, and I can't be arsteroided to find out. Also technically, the minerals aren't new to science; just the naturally-occurring forms are. Incidentally, none of my above ranting is meant to downplay the coolness of finding new minerals from space. It's a potential glimpse into low-gravity mineral formation, and possibly even the early conditions of the solar system, and that's great for science. (And no, it's not aliens.) Two-thirds of the way down the page, they finally get around to describing—sort of—the composition of the minerals: Both new minerals are phosphates of iron, Tschauner said. A phosphate is a salt or ester of a phosphoric acid. I'm sure that clears everything right up for those of you without chemistry backgrounds. Though you're probably familiar enough with phosphoric acid. It's one of the primary non-water ingredients in Coke, which I happen to be drinking right now (look, even I can't drink beer all the time). “Phosphates in iron meteorites are secondary products: They form through oxidation of phosphides … which are rare primary components of iron meteorites,” he said via email. “Hence, the two new phosphates tell us about oxidation processes that occurred in the meteorite material. It remains to be seen if the oxidation occurred in space or on Earth, after the fall, but as far as I know, many of these meteorite phosphates formed in space. In either case, water is probably the reactant that caused the oxidation.” Even if the oxidation occurred on Earth, it's still interesting because the basic materials were there to be oxidized. But there's water in space (that's how it got here in the first place), mostly in the form of ice, but it's not outrageous to imagine a body on an eccentric orbit whose internal ice melts periodically, allowing for liquid water to do its reaction thing. Comets, for example, contain significant amounts of water. But from what I understand, their formation is distinct from that of iron-rich asteroids. The point is, though, that water's out there. Anyway, questionable science reporting aside, I thought this was cool enough to share—but more importantly, to nitpick. |
Yet another blast from the not-so-distant past today. Such is the randomness of random numbers. "Scare Tactics" is from the day before Halloween, 2021, so only about 15 months ago. It's commentary on a Cracked article that lists a few fearsome folkloric figures. In large part, I do these retrospectives to see if anything's changed since the original entry—not only with whatever information is discussed, but also my thoughts about it. Well, these monsters are from myth and legend, and those don't tend to change much in a year and a quarter. Unlike many entries, I actually remembered this one to some extent, because I like to learn about folklore from different cultures. Doesn't hurt that it's relatively recent. But that also means that I haven't changed my opinions, so there's not much to expand upon here. I didn't even see any embarrassing typos this time. I'm not saying there aren't any; only that I didn't see them. Of course, the source article is still there, too. Here's another link to it for your convenience. One thing that stands out to me is the "band name" trope I used. I'm sure some people find it tiresome, but to me, it's endlessly amusing to take interesting word combinations and come up with what kind of band it would be. In that entry, I said that "Slavic Female Demons" would be an excellent name for a hard metal Go-Gos cover band. I stand by that, incidentally. The Go-Gos were, if I recall correctly (I sometimes don't), the first popular group I saw live, back when they were big and I wasn't. It's not that I was a huge fan (though I totally had a crush on the drummer), but they just happened to have a concert at a nearby amusement park, and being able to visit said park on my own (well, with fellow teen friends and not parents) was a big deal to me at the time. That said, I'd totally go see a band named Slavic Female Demons. As long as there are no actual dziwozona involved. |
Science isn't always about probing the origins of the Universe, or figuring out quantum entanglement, or curing cancer. No, sometimes it delves into the most important questions. You Don’t Know How Bad the Pizza Box Is The delivery icon hasn’t changed in 60 years, and it’s making your food worse. I'm not sure that the subhead up there is exactly correct. Yes, as we'll see in this article, the pizza box makes that most perfect of foods somewhat less tasty, but when you consider the extant alternatives, it's really the best we've got. Where the science comes in is figuring out how to make the best better. Happiness, people will have you think, does not come from possessing things. It comes from love. Self-acceptance. Career satisfaction. Whatever. But here’s what everyone has failed to consider: the Ooni Koda 12-inch gas-powered outdoor pizza oven. That's a strong argument, and one I tend to accept, although I don't have one of those. Since I purchased mine a year ago, my at-home pizza game has hit levels that are inching toward pizzaiolo perfection. Like Da Vinci in front of a blank canvas, I now churn out perfectly burnished pies entirely from scratch—dough, sauce, caramelized onions, and all. Now I'm hungry. Though that sounds like a lot of work, it's probably one of those few things that are actually worth the effort. But enlightenment is not without its consequences. The pies from my usual takeout spot just don’t seem to taste the same anymore. Okay, I'll address the elephant in the room if no one else will: Elephant, why would this guy even bother ordering takeout pizza when he has an Ooni Koda? They’re still fine in that takeout-pizza way, but a certain je ne sais quoi is gone: For the first time, after opening up a pizza box and bringing a slice to my mouth, I am hyperaware of a limp sogginess to each bite, a rubbery grossness to the cheese. You don't have to have three and a half years of Duolingo French lessons under your belt to know what "je ne sais quoi" means: "I don't know what." In the rest of the article this author asserts that he does, in fact, know what. Pizza delivery, it turns out, is based on a fundamental lie. The most iconic delivery food of all time is bad at surviving delivery, and the pizza box is to blame. One of my favorite breweries is right here in my hometown. During the lockdown in 2020, I supported them by ordering beer and food for delivery about once a week. Canned, or bottled, beer, isn't as good as draft, but it's not bad. Their burgers survived the 2-mile delivery trip quite well. Their frites, however, arrived soggy and mushy; they're much better if you get them at the restaurant. They put a bunch of frites in a little metal basket, which gets dipped into the fryer oil and delivered, basket and all, to your table. Naturally, the basket doesn't come with the delivered version, which is instead handed to you in a recycled-cardboard container. While this particular brewpub doesn't do pizza, the frites thing is a close equivalent to what this author is talking about. A pizza box has one job—keeping a pie warm and crispy during its trip from the shop to your house—and it can’t really do it. Warm, sure, to an extent. That corrugated cardboard is pretty good insulation. As he describes later, though, that same box concentrates moisture inside, turning the pizza limp. 
The fancier the pizza, the worse the results: A slab of overbaked Domino’s will probably be at least semi-close to whatever its version of perfect is by the time it reaches your door, but a pizza with fresh mozzarella cooked at upwards of 900 degrees? Forget it. Sliding a $40 pie into a pizza box is the packaging equivalent of parking a Lamborghini in a wooden shed before a hurricane. I don't think I've ever ordered a $40 pizza. Sometimes, by the time delivery fees and driver tips are included, I've come close... but never quite $40. I know for a fact I've never had a Lamborghini, or a wooden shed. And yet, the pizza box hasn’t changed much, if at all, since it was invented in 1966. This is probably due to economics. But this is where the science comes in. Or, perhaps, engineering, which is really just applied science: come up with a pizza delivery system that keeps the pie warm but doesn't ruin it, and doesn't cost much. As noted above, Domino's, probably the largest chain, has no incentive to do this; their shit is shit whether it's "fresh" or out of a delivery box. So it's going to be up to actual scientists and/or engineers. Unfortunately, while this article is very descriptive, it doesn't propose actual solutions. To be fair, neither can I. I just want my pizza. Unlike a Tupperware of takeout chicken soup or palak paneer, which can be microwaved back to life after its journey to your home, the texture of a pizza starts to irreparably worsen after even a few minutes of cardboard confinement. If you reheat it right, though, leftover pizza can be delicious. I know I've linked to some scientific experiments along those lines in here before. Ah, here it is, from October of 2021: "What Do You Mean, "Leftover Pizza?"" That discussion doesn't address the problems with the pizza box, though. The basic issue is this: A fresh pizza spews steam as it cools down. A box traps that moisture, suspending the pie in its own personal sauna. After just five minutes, Wiener said, the pie’s edges become flaccid and chewy. Sauce seeps into the crust, making it soggy. Worse, the poor benighted souls who have never ordered pizza from an actual New York City pizzeria and eaten it right there on the spot think that this is what pizza is supposed to taste like. By 1949, when The Atlantic sought to introduce America to the pizza, the package was already something to lament: “You can take home a pizza in a paper box and reheat it, but you should live near enough to serve it within twenty minutes or so. People do reheat pizza which has become cold, but it isn’t very good; the cheese may be stringy, and the crust rocklike at the edges, soggy on the bottom.” What I didn't note is that today's article is also in The Atlantic. Corrugation produces a layer of wavy cardboard between a top and bottom sheet, sort of like a birthday cake. The design creates thick, airy walls that both protect the precious cargo within a pizza box and insulate the pie’s heat while also allowing some steam to escape. I should note that I have gotten takeout pizza (if not delivery) that was packaged in a single-ply, though thick, cardboard box. It's not any better at keeping the pizza at peak. We’ve gotten a couple of pizza-delivery innovations in the past few decades: the insulated heat bag—that ubiquitous velcroed duffel used to keep pies warm on their journey—those mini-plastic-table things, and … well, that is mostly it. I've actually had people ask what the table is for. That's okay; it's not necessarily blindingly obvious. 
It's to keep the top of the box from contacting the toppings, and potentially pulling them off. Then you have a pizza crust, and a cardboard box top with the toppings on it. Which, to be fair, wouldn't taste much different from Domino's. “Every single pizza that I put in a box I know is going to be, let’s say, at least 10 percent not as good as it could have been,” Alex Plattner, the owner of Cincinnati’s Saint Francis Apizza, told me. Others dream of better days. “After smoking a lot of weed, I have come up with a lot of ideas for a better box,” said Bellucci, the New York City pizza maker. Weed is legal for recreational use in New York City now, so there should be a slew of innovative ideas coming out of that metropolis any day now. Ideas, but not necessarily their execution. Too much work for a stoned person. And I just have to say how hilarious Saint Francis Apizza is. Last year, the German brand PIZZycle debuted the Tupperware of pizza containers, a reusable vessel studded with ventilation holes on its sides. I take back the bit about weed. If it's going to lead to people naming their brand PIZZycle, maybe we should stick to booze. No, there's no evidence that weed was involved in that decision, but there's a strong link between pizza and getting stoned, so I assume the connection in the absence of evidence to the contrary. So we know it’s not a question of ingenuity: We can construct better pizza boxes, and we already have. The real issue is cost. Like I said. Domino’s alone accounts for nearly 40 percent of delivery-pizza sales in the U.S.—on par with all regional chains and mom-and-pops combined. Perhaps these big companies are stifling real pizza-box innovation. I shouldn't be surprised. This is the same "culture" that insists on soft white bread, pasteurized process cheese "food," and rice-adjunct lagers. We, as a society, have crap taste. I don't personally like chicken wings, but when spicy chicken wings became popular, I at least held out some hope that we'd get over our phobia about any spice hotter than mayonnaise, but that hasn't happened. Again, though, if you have your own backyard gas-powered 900 degree pizza oven, why are you even bothering with delivered pizza? I mean, I'm all about lazy, but pizza transcends even that. Now, if you'll excuse me, I have a frozen pizza to bake. |
I thought y'all would want to see this. I use "y'all" as a second person plural pronoun, a distinct form that standard English otherwise lacks. And it can't always be inferred from context. Southern Living magazine once described “y’all” as “the quintessential Southern pronoun.” It’s as iconically Southern as sweet tea and grits. I like grits, but sweet tea can kiss my... ass. “Y’all” fills that second person plural slot – as does “you guys,” “youse,” “you-uns” and a few others. "You guys" is considered sexist these days, "youse" is still pretty much limited to a small area in the Northeast, and I'm not sure about "you-uns." I think Pittsburgh uses "yinz." I’m interested in “y’all” because I was born in North Carolina and grew up saying it. I still do, probably a couple dozen times a day, usually without intention or even awareness. I use it too, but more intentionally. I thought I used it more, but a quick search of this blog of over 2,000 entries only yielded 64 entries with "y'all." This would be #65. Back in 1886, The New York Times ran a piece titled “Odd Southernisms” that described “y’all” as “one of the most ridiculous of all the Southernisms.” Damyankees. Like the Southern dialect in general, the use of “y’all” has often been seen as vulgar, low-class, uncultured and uneducated. As someone noted in Urban Dictionary, “Whoever uses [y’all] sounds like a hillbilly redneck.” The only way to change this perception is to use it with intention. The etymology of “y’all” is murky. So is the etymology of a lot of other words. My examples push “y’all” back 225 years before the citation in the “Oxford English Dictionary,” and they show that the word appeared first in England rather than the United States. I think it’s important to point out that it originated in a more formal context than what’s commonly assumed. There are none of the class or cultural connotations of the later American examples. Now, I can't be arsed to research this right now, but I think older versions of English made a distinction between second person singular and plural. That's how we got "thee" and "thou" and other constructions that are now associated with the KJV and maybe Shakespeare. Or something like that; like I said, not looking it up now. Still, there it is, in an English poem written in 1631. Not long after Shakespeare, really. Y'all Brits invented the language; we just perfected it. “Y’all means all” – that’s a wonderful phrase that seems to be popping up everywhere, from T-shirts and book titles to memes and music. Sounds good to me. Now, how about we come up with a first-person plural that distinguishes between "us, including you," and "us, not including you?" Like if I said, "We're going to a party," does that mean you're invited? No. No, it does not, and now I'm embarrassed because you inferred that it did. |
This one's just an interesting hypothetical question, though not so much for the question or answer, but for the approach to it. Was It Ever Possible For One Person To Read Every Book Ever Written (in English)? Randall Munroe Provides a Serious Answer To a Very Hypothetical Literary Question Munroe is the guy who does the excellent nerdy webcomic xkcd, and also answers questions like this in book format. The obvious, simple, and trivial answer to the headline question is "yes" (unlike most headline questions), because at the very least, once the first book was written in English, one person could then read every book ever written in English. But then you have to define "English," which can be tricky, because languages don't generally spring, Athena-like, from the head of some creator, but evolve over time and by mixing languages together. You've probably heard of Old English, Middle English, etc., but the boundaries between them are pretty arbitrary. The actual question: “At what point in human history were there too many (English) books to be able to read them all in one lifetime?” –Gregory Willmot To take a stab at summarizing the beginning of the article, you'd need to know how fast someone can read as well as at what point the sum total of English literature, in a form that can be defined as a "book," exceeded the amount that someone can read in a lifetime. As Munroe puts it at the beginning: This is a complicated question. And the answer is also complicated, but I'm afraid you'll have to read the article itself to find it. Again, the way he gets at an estimate is the interesting part. And it gets into things like writing speed, too, which should be relevant to readers here. There's also the question Munroe himself poses, which is probably more germane to reality: On the other hand, how many of them would you want to read? Fair point.
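Since the fun here is in the estimating rather than the answer, here's a toy back-of-envelope sketch of the reading-capacity half of the problem, in Python. Every constant in it (reading speed, book length, hours per day, years of reading) is a round number I made up for illustration; it's the shape of the approach, not Munroe's math.

```python
# Toy back-of-envelope estimate: how many books could one person read in a lifetime?
# All constants are assumed round numbers for illustration only, not Munroe's figures.

WORDS_PER_MINUTE = 250    # assumed adult reading speed
WORDS_PER_BOOK = 80_000   # assumed length of a typical book
HOURS_PER_DAY = 4         # assumed daily reading time
READING_YEARS = 60        # assumed literate years in a lifetime

minutes_per_book = WORDS_PER_BOOK / WORDS_PER_MINUTE            # 320 minutes per book
books_per_year = (HOURS_PER_DAY * 60 * 365) / minutes_per_book  # ~274 books per year
lifetime_books = books_per_year * READING_YEARS                 # ~16,000 books, give or take

print(f"~{books_per_year:.0f} books a year, ~{lifetime_books:,.0f} in a lifetime")
```

With numbers like those, you land in the neighborhood of fifteen or twenty thousand books per lifetime, which is why the hard part of the question isn't reading speed at all: it's pinning down when the cumulative pile of English books blew past a figure of that size. |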
For some reason, I was simply exhausted last night and went to bed early. I suppose it's possible that it's a portent of my inevitable demise, but it was probably just the weather. Speaking of demise, today's article, another one from Field & Stream, is about survival. Four Survival Myths That Could Get You Killed Our expert weighs in on some misconceptions about how to live through real-life survival scenarios Oddly, for me at least, one of these myths isn't "It's okay to go into the wilderness." Spend enough time in the outdoors, and you’re bound to wonder how you would handle a true survival scenario. For me, "enough time" is about five seconds. Could you gather and forage enough food? Could you build a strong survival shelter to keep you warm and dry? Could you start a fire…in the pouring rain? No, no, and no. Myth No. 1: You Can Live Solely off Natural Survival Foods I discovered the truth of this back in college. Turns out you can't actually survive solely on ramen noodles. Who knew? But the reality is if you are only eating “survival foods,” you’ll start to feel sick and weak after a day or two. (Some of these survival foods also have little to zero caloric value, which makes them pointless to eat.) Oddly, a day or two is about how long it takes to start to feel sick and weak if you don't eat anything at all, with the added benefit of not having worked hard for the feeling. Myth No. 2: You Can Complete Survival Projects at a Normal Pace Considering that "a normal pace" for me would make a sloth look like Usain Bolt, I'm not sure this is necessarily true. The lesson here is on focus. Careful observation of the resources in your environment will dictate what’s possible and what the most important things are to spend your time on. Your decision could be the difference between life and death. I'd imagine that shelter is of varying importance depending on the climate of where you've gotten yourself lost. A source of fresh water is always of high importance, if you can't find beer. Myth No. 3: If You Kill a Big-Game Animal, You’re Set for Food I am the furthest thing from a survivalist that you can possibly imagine, but a moment's thought should be enough to disprove this one. Even if you get past the thought, "Bears," you also have the thought, "This stuff will rot." But the article doesn't even get into that, talking instead about the danger of trapping animals that don't have much body fat. Basically, no matter how many rabbits, snails, limpets, or venison stakes you eat, you can still starve to death because your body can’t digest all of that protein without fat. "Venison stakes?" I think someone killed and ate the editor. Myth No. 4: Practicing Survival Skills Is the Same as Practicing Survival This might not be nearly as blindingly obvious as the last one, but it makes sense. This is a skill that only comes with experience. One indicator that you have this skill is you notice that you’re still having fun in a situation that others are complaining about. Accumulated experience in remote areas lets you know when something like feeling cold, getting cut, or eating something rotten is actually a concern and when it’s not—so you know when to be concerned and when not to be. When you get to this point, it may appear to others that you enjoy misery but you don’t, it’s just that it’s not miserable to you anymore. And then you can lord it over them with your superior knowledge and attitude. Maybe this will finally make them appreciate you. 
More likely, you'll find yourself one morning with a cold campfire and no one to practice survival skills with because they've all gone haring off downstream because "obviously, this person can survive on their own." That's the other thing about survival: we're social creatures; we got to nearly the top of the food chain not by an individual being the best hunter or survivalist, but by supporting each other. One dude can't take down a healthy mammoth alone (without heavy artillery anyway), but a dozen might. Hell, even Thoreau spent a lot of time hanging out with friends while supposedly sequestering himself at Walden (another reason I despise that guy's philosophy: he was a hypocrite). The ultimate expression of survival is civilization, and that's why I stay near it at all times. |
At some point, I think it was last year, I finally got around to watching Frozen. What can I say? Without kids, I was never forced to, but it's part of my Disney+ subscription, so at least I didn't have to pay more for it. (Still haven't seen the sequel.) As usual, though, truth is stranger than fiction. The Time a Russian Empress Built an Ice Palace and Forced Her Jester To Get Married In It Cruel joke, power move, or both? Now, the truth is, I don't have much to say about this beyond the Frozen reference above. I just thought it was—to understate things a bit—cool, so here it is, shared. It was 1740 and one of the coldest winters St. Petersburg, Russia, had ever seen. But few residents mentioned the bitter winter in letters or accounts. They were, understandably, distracted. While the rest of Europe shivered through the deep freeze, Russians were busy—building a palace. On the orders of Empress Anna Ioannovna, numerous craftsmen were charged with constructing an elaborate, fairy tale–esque castle, one made entirely of ice. On the plus side, I suppose the exertion kept them warm. And the fear of being executed for underperformance, but mostly, the exertion. Rising 66 feet from the surface of the frozen Neva River and nearly 165 feet long, the Ice Palace was built “according to all the rules of the most current architecture,” noted Russian mathematician Georg Wolfgang Krafft. I guess one of the rules wasn't "don't build your palace out of ice." A steam bath, or bania, built from ice sat beside the palace. Decorative ice dolphins blew fire. A life-sized ice elephant’s raised trunk served as a fountain by day and as a stunning torch by night. Gotta say I'm impressed by the violation of the laws of thermodynamics, which, to be fair, hadn't been formulated yet. The palace was, by all accounts, a marvel. But the elaborate, temporary palace isn’t the strangest part of the story. That winter, Anna ordered a bizarre wedding to take place at the Ice Palace between the disgraced noble-turned-jester, Prince Mikhail Golitsyn, and a Kalmyk woman, Avdotia Buzheninova. Okay, I had to look up "Kalmyk." Apparently a Mongol ethnic group in Russia. The article does explain this later, but I hadn't gotten to that part yet. I'm sure their story is fascinating, too, but I'm short on time tonight. “The nobles choose Anna because, as a woman, they think she’ll be very easily manipulated,” says Russian historian Jacob Bell of the University of Illinois. The nobles imposed a list of conditions (creatively dubbed the “Conditions”) on Anna’s power and made her sign on the dotted line. She signed, but Anna was far smarter than they gave her credit for. Within a couple of months, Anna solidified support from a group of nobles and the local guards’ regiments. “[She] then, very dramatically tears the conditions in half and declares she will rule in the way she wants to rule,” says Bell. This is one of the most Russian things outside of depression and vodka. Continuing many of the efforts begun by Peter the Great, she funded the Russian Academy of Science and encouraged Westernization. She founded the Cadet Corps, a premiere military training school, and maintained a brutal secret police known as the Secret Office of Investigation. The Russianness intensifies. Ahead of the nuptials, Anna decreed every Russian province to send a man and woman wearing traditional dress to attend the spectacle. I suppose at least she didn't make them all wear ice? 
With all of this sort of thing going on, it's surprising that it took over 150 years after the Ice Palace incident for Russians to rise up in revolt. I guess they might have started to, but got cold feet. |
Today's article is from 2018, but still relevant. Now, this is Harvard Business Review, so the focus of the article is... well... business. But the metaphor they use relates to writing: Long before your favorite movie made it to a theater near you, it was presented in a pitch meeting. Hollywood screenwriters typically get three to five minutes to propose an idea, but it takes only around 45 seconds for producers to know if they want to invest. Specifically, producers are listening for a logline: one or two sentences that explain what the movie is about. If there is no logline, more often than not, there is no sale. I suspect most of us knew that, at least in basic form. The article doesn't explain this, but it's called an "elevator pitch" because the idea is, you find yourself in an elevator with, I dunno, say, Kevin Feige. You've got your perfect screenplay for Squirrel Girl written (or mostly written) and you really want to sell him the script so she can finally join the Avengers. The elevator ride is maybe a minute long, and you have that long to convince him that a) your screenplay is awesome and b) Squirrel Girl is the perfect character for an MCU movie. (Point 2 should be blindingly obvious to everyone, but, at the very least, point 1 is necessary.) The great thing about the name is that it's a pun on "elevate," as in, it has the potential to elevate your idea into reality. Like I said, the thrust of the article is using the elevator pitch to promote your business or product, but publishing is a business, and your writing is a product. And even if, like me, you're not actively trying to sell your writing to a publisher or producer, distilling the essence of your plot down to a couple of sentences is a great writing exercise in itself. You can even do it before your writing project begins; that can keep you on track. Source: me, who's done it. (The fact that this resulted in a final product that was nothing like the elevator pitch is a me problem, not a pitch problem.) Alternatively, write a first draft, then the pitch, then keep the pitch in mind when revising. If you can answer in one compelling sentence, you can hook your audience. According to molecular biologist John Medina of the University of Washington School of Medicine, the human brain craves meaning before details. I would add that most humans crave emotional connection before logic. This is something to remember when creating a pitch. In Hollywood cinema, one of the greatest loglines of all time belongs to the iconic thriller that kept kids out of the ocean during the summer of 1975: A police chief, with a phobia for open water, battles a gigantic shark with an appetite for swimmers and boat captains, in spite of a greedy town council who demands that the beach stay open. Note the emotional charge of some of those words: phobia, shark, appetite, greedy, demands. A logline should be easy to say and easy to remember. As an exercise, challenge yourself to keep it under 140 characters, short enough to post on the old version of Twitter (before the platform allowed 280 characters per tweet). Really wish they hadn't brought Twatter into the discussion. In fairness, this was four years before Musk, but I already despised the platform then. Identify one thing you want your audience to remember. Steve Jobs was a genius at identifying the one thing he wanted us to remember about a new product. 
In 2001 it was that the original iPod allowed you to carry “1,000 songs in your pocket.” In 2008 it was that the MacBook Air was “the world’s thinnest notebook.” Apple still uses this strategy today. Steve Jobs was a genius, period. Massive cocknugget, sure, but genius. If you can’t communicate your pitch in one short sentence, don’t give up. Sometimes the language will come to you immediately, other times it might take more practice. Be patient. It's possible to spend more time creating a good pitch than it takes to write a first draft. But it could make the difference between people wanting to read your shit and... well... not. |
With over 2,200 entries here, it's difficult not to duplicate entry titles. Oh, sure, I could do it, but that would involve taking the time to use the search function and possibly slogging through entries with the words in the body in order to see if I've used the title before. And that's work, which, as you know, I'm allergic to. But it turns out that the title of today's archive post was one I used twice (and now, three times, sort of). In my defense, it's a great pun: "Meat the Press." This one's from not all that long ago—while I exclude anything in the last 12 months from these revisits, this was 15 months ago. I just pick them at random. As noted in that entry, I was participating in "Journalistic Intentions" [18+], so I'll take the opportunity to plug that contest now. I think the next round will get going next month, and it can be fun to see where the prompts take you. In the case of my linked entry, the JI prompts were all about food; Turkey DrumStik runs a different theme each time. As for the entry itself, though, there's not a whole lot I need to add to it. It's a good thing I'm tackling this now. Today and going into tomorrow, I'll need to be on a liquid diet. And not my usual liquid diet, either. So I expect to be mind-destroyingly hungry, which is never how I want to be when I'm reading or writing about food. That intro might be confusing taken out of chronological context, so I'll explain: being the age I am, I occasionally get to have a camera shoved up my ass. As a former photographer, I'm used to people wanting to shove a camera up my ass; fortunately, in this case, it wasn't my camera, and I'd be unconscious for it. Point being, apparently, the procedure was scheduled for shortly after that entry, and I was about to start the necessary preparations for the procedure, which involve taking copious laxatives and not eating anything (or drinking beer). For the record, not that anyone cares, but it turned out that I didn't get all that hungry during the prep, and I got a clean bill of health afterward. You'd never know it, though, the way I overuse semi-colons. (I see an opportunity for a pun and I take it.) I don't remember if I ever followed up to mention that, so I'm doing it now. Still, it would have been annoying to write about food during a period of fasting. And that's about all I have to elaborate upon, so read (or re-read) the entry if you want. Unfortunately, the video the prompt is based on is no longer available. It's not required viewing, though. As I recall, it was moderately interesting, although, naturally, not nearly as fascinating as my writing. |
Periodically, as I've noted before, I find articles bloviating about things that science fiction authors have "predicted." This is misleading, because unless it was a sealed prediction opened some years later, it's more like they dreamed it up, or someone invented something based on an idea they saw in SF. Today's article is a refreshing change from that because of how the headline is worded. Ideas can come from all sorts of places, and inspiration can hit in a flash—think of Archimedes supposedly yelling “Eureka! Eureka!” in the bath when he realized that irregular items could be accurately measured through water displacement. Probably apocryphal, but for whatever reason, we love our narratives about inspirational flashes. Like that one, or Newton's apple. But sometimes, it’s fiction, not reality, that provides the spark of inspiration. There are sci-fi tales, for example, that have gone beyond predicting technological advancements to directly inspiring scientific progress, from robotics to rocketry and everything in between. And they just lost my good will from the headline by using "predicting" here. In any case, I'm not going to copy all of them here, just the ones I want to comment on. 1. The Taser // Victor Appleton’s Tom Swift and His Electric Rifle Written under a pen name and published in 1911 by the Stratemeyer Syndicate (which also published the Nancy Drew and Hardy Boys novels), Tom Swift and His Electric Rifle sees the titular character invent a weapon that looks like an ordinary rifle but fires bolts of electricity. The book was a childhood favorite of Jack Cover and partly inspired the creation of his own electroshock weapon: The Taser. That might have been the ultimate origin; I don't know of any stun devices in SF before that (the genre hadn't been around for even 100 years at that point, anyway). But the idea took off in a big way in SF and, especially, space operas; the ray gun, or zapper, or whatever, became a staple. Perhaps the most famous version was Star Trek's phaser. Now, this article goes on to note that "TASER" stands for "Thomas A. Swift’s Electric Rifle," and other sources back that up—but it was invented starting in 1969, three years after the first Star Trek episode featuring phasers. And phasers, of course, took their name inspiration from the laser. My only point here being that many inventions have more than just one inspiration. 2. Helicopters // Jules Verne’s Robur the Conqueror Leonardo came up with a helicopter concept first. In fairness, though, his wouldn't have flown. 3. The World Wide Web // Arthur C. Clarke’s “Dial F for Frankenstein” Without Arthur C. Clarke’s 1960s-era short story “Dial F for Frankenstein,” there might be no World Wide Web as we know it. The sci-fi story is about a global, interconnected telephone network that gains sentience—and it served as one of Tim Berners-Lee’s inspirations when he created the Web while working at CERN in the 1980s. I'm always amazed at the number of people who, after reading a science fiction story that warns "under no circumstances should we invent this thing," go on to invent that thing. 7. Investigating the Habitability of Mars // Ray Bradbury’s The Martian Chronicles and The Illustrated Man Physicist Peter H. Smith, professor emeritus at the Lunar and Planetary Laboratory of the University of Arizona, attributes his initial interest in extraterrestrial worlds to Ray Bradbury’s sci-fi stories, including those in The Martian Chronicles (1950) and The Illustrated Man (1951). 
Okay, fine, but let's remember that Bradbury himself was inspired by earlier Mars stories, which in turn were inspired by even earlier Mars stories, etc. Not all of them were science fiction. Hell, Bradbury is only marginally science fiction; he was more interested in poetic prose than in the science behind things. Nothing wrong with that—he was certainly awesome and inspiring; this isn't meant to rag on him at all. 11. Remote Manipulators // Robert Heinlein’s “Waldo” Published under the pseudonym Anson MacDonald, Robert Heinlein’s short story “Waldo” is about a scientist named Waldo Farthingwaite-Jones who invents a device to help him manage his degenerative muscle disease. His machine can perfectly mimic his hand movements, but with greater strength and from a distance. This device is essentially a remote manipulator, also known as a telefactor. Because of Heinlein’s story, some call the mechanism—which, according to Fundamentals of Robot Mechanics, “usher[ed] in the era of teleoperators”—a “waldo.” I'm mostly including this one because the term "waldo" is a legitimate word for this technology, and I get the impression many people don't know where it comes from. Heinlein didn't actually come up with the idea for the remote manipulator; he in turn was inspired by earlier writings. But the name is all Heinlein. If I had to take a stab at which science fiction author was most influential in terms of real-life invention, Heinlein would be in the top 5. He's often credited with the ideas behind the waterbed (Stranger in a Strange Land), and computer-aided design (The Door into Summer), just to name two other examples. It would be wrong to say that any of these writers "invented" the technology they inspired. Invention requires far more detailed design work than most SF writers need to go into, especially in their stories; that would get really boring really quickly, like Melville's detailed descriptions of whaling in Moby Dick. But this is the way things get invented in the first place: it always starts with an idea (or, some might say, a Platonic ideal) which only gradually makes its way through mind and sweat to consensus reality. Who gets the credit? Well, in my personal opinion, the inventor gets the bulk of the credit because they're the one who made it work. But there's good reason to acknowledge the source of one's inspiration. |
All of us eat, and someone has to cook that food, so hopefully something like this has some appeal. I saved this article quite some time ago and I'm only now getting around to it. Consequently, I've forgotten what those 15 things were, so let's discover this together. You know you’ve been in the food-writing world for too long when you’re shocked to see someone cut birthday cake with a knife… How does this person not know to use a strand of taut dental floss or baker’s twine, which makes for the easiest, most mess-free slicing? That right there is enough to ensure that I will always cut a cake with a knife. That is, when there is a cake. There usually isn't. Sometimes I forget that not everyone is walking around with a mental catalog of time-saving, energy-saving, sanity-saving, life-saving, money-saving, surefire, guaranteed foolproof, plan-ahead, stress-free, problem-solving shortcuts, tips and tricks in the kitchen. Apparently, we're doing everything wrong, including doing stuff wrong. 1. Don’t make recipes (or trust cookbooks) that have overly cutesy recipe titles like “Struttin’ Chicken.” These kinds of dishes rarely have the kind of staying power that a simple Roast Chicken will. I agree with that on the grounds of cuteness, but my feeling? If you want to try it, try it. A recipe doesn't have to have "staying power" if it looks good to you. Besides, at least the recipe isn't for "Cluckin' Chicken." 2. Buy yourself a pair of kitchen scissors. Agree there, too, but how is it that this isn't basic? Even my mother had kitchen scissors, and she wasn't the world's greatest cook. 3. Some Type-A behaviors worth stealing: Do everything you can in advance when you are having people over for dinner. No matter how easy and tossed-off the task may be. No matter how many times your partner-in-crime says, Why don’t we just do that later? Filling the water pitcher takes 15 seconds! While I object to the psychospeak of "Type-A behaviors" (though at least she didn't use the adjective "anal"), if a task takes 15 seconds, you can do it anytime. The problem is that you have a thousand 15-second tasks, which (math trigger warning here) comes to 4 hours and 10 minutes. Sometimes, you have to prioritize; other times, task A needs to be done before task B, which needs to be done before task C, etc. If a task isn't on the critical path, move it around. 4. Brushing dough with a quick egg-wash is the secret to getting that shiny, lacquered, I’m-worth-something-after-all glow to your pies, breads, and galettes. Fair enough, but that's baking advice, and I was looking for cooking advice. Baking is not my strong suit; I've never had any success with pies or breads, and furthermore, what the culinary fuck is a galette? No, I'm not going to be arsed to look it up. 5. Meat will never brown properly if you add it to the pan when it’s freezing cold and wet. It should be patted dry and room temperature. I kind of feel like letting meat sit at room temperature is a great big no-no. I'm no expert at this sort of thing, but doesn't that just invite microbes to party? Especially ground meat. I've also heard that meat browns better if you add a bit of baking soda, but I've tried that and it didn't improve anything, in my opinion. Probably another attempt to sell more baking soda. 6. Add acid. Sounds great! Oh, wait, they don't mean lysergic acid. Damn. Boring. (But true.) Also, fun if you do the baking soda trick. 7. Figure out the correct way to slice and dice an avocado. 
You will not only save time, energy and sanity by doing this, but you will find yourself giving tutorials to awed, in-the-dark observers every time you make guacamole in front of them. I actually followed the link to the video in this one, even though I absolutely detest learning shit from videos. That may not be the worst way ever to deal with an avocado, but it's not the best, either. Besides, if you're making guacamole, who cares what the cut-up avocado looks like? It's only going to get mashed anyway. I never make guac, though, and my slicing method may take a bit longer, but it gives me nice, clean, even avocado slices or dices every time. That is, if the avocado is in that fifteen-second window between "too hard" and "rotten." No, I'd say "figure out the best way to dice an onion." Onions get used a lot more often than avocados, especially in my house, and I long ago perfected the quickest way to dice 'em up without getting blood everywhere. The only difficulty I have is peeling them; often, I give up and take the outer layer off along with the peel to keep myself from getting too frustrated to cook. 8. Ice in the cocktails, people. Fill that glass all the way up! No. Just, no. Unless you really want a watered-down cocktail, or one that's more ice than booze. And some cocktails aren't served with ice at all. Martinis, e.g. Or single-malt scotch. Sure, some people like those on the rocks, but it's not a universal thing. 9. Learn how to make a handful of healthy dinners without using a recipe. Whether it’s scrambled eggs on toast or your great-grandmother’s 19-ingredient mole sauce, making dinner is so much more enjoyable when you can do it on autopilot, catching up with your kid or your partner as you go, or just savoring the aromas of sautéing leeks, instead of bobbing back and forth from cookbook to stovetop. How is scrambled eggs on toast "dinner?" I mean, sure, it's fine to eat it anytime, but you're having breakfast for dinner. And who uses cookbooks directly anymore? Even if I do, I take a picture of the page or scan the text into my dumbphone. Keeps the cookbooks from getting messy. Phones are easier to clean than paper. 10. Compliment the cook. It doesn’t even matter if you don’t like the food! Someone took time from his or her day to plan, shop, and put together a meal for you to enjoy. Be exceedingly, absurdly grateful always. Nevertheless: don't lie. Obviously give the cook a participation trophy; the work was the same whether the meal was tasty or disgusting, and that needs to be appreciated. But you're not doing your spouse (or whatever) any favors if they ask "how was the meatloaf?" and you answer "This was ambrosia of the gods!" when the truth is it tasted like sawdust and gypsum. I mean, you're not doing yourself any favors if you're brutally honest like that, but after a few days on the couch, they'll probably forgive you and figure out a way to fix the meatloaf. (The danger here is if they don't forgive you and next time they mix in actual sawdust and gypsum.) 11. A salad is not a salad without some sort of crunch Oh, please. 12. Food trends come and go, but spaghetti and meatballs are forever. I can't argue with that, but keep in mind that, 60 years ago, people thought aspics would be forever. (In fairness, there are probably still aspics around that were made 60 years ago, close enough to forever. Kind of like the family fruitcake.) I'll skip the rest; you can go read them yourself. 
I don't think any of it is bad advice, exactly, but there's a lot of personal preference in there. And that's fine. I think we all have to do what works for us. The avocado thing, for example. Or maybe you like different things in salads than I do. Whatever. The thing about articles like this is that a lot of cooking has to be figured out from experience. A recipe leaves out a lot of steps, by necessity: steps like "Pull the pan off the hook it's hanging on," because some people don't hang their pans. Or precise instructions for washing vegetables, which is also a matter of personal preference, space, and other factors. Still, as with any other advice article, you take what works for you and leave the rest. Unlike with my mom's meatloaf, which was more "leave" than "take." |
Getting back to articles, here's one from Cracked that should be of interest to readers. Since you're reading this, I assume you're a reader. The title should more properly be "Ways Librarians Used to Be Hardcore." While every profession employs miserable people doing jobs just because they’re told to, librarians are an exception. Librarians are heroes. Every hero needs a catchphrase. For librarians, traditionally, it's "Shh." 5. Libraries Sterilized or Even Incinerated Books, and Laws Kept the Sick From Borrowing We used to care about public health. You were touching all these items that other people had fondled, people whose hygiene and morals you had no way to evaluate. We don’t have many public shoe exchanges, or rent-a-bra companies, because despite the obvious advantages of such facilities, such ideas repulse people. Maybe we should be just as weirded out by borrowing books. Unlike with machines in gyms, there's no "wipe the books down when you're done." Fuel for these fears came from a scientist named William R. Reinick, writing in the American Journal of Pharmacy. Someone once caught smallpox from a book, he said. Someone caught gonorrhea from a book, he claimed (exactly what they’d been doing to the book, he did not say). You could even catch cancer from books, he asserted. Then he shared the results of an experiment, where he kept 40 guinea pigs and gave them library book paper as bedding. All the guinea pigs died. This was damning evidence, if you don’t know much about how experiments are supposed to have control groups. And also that guinea pigs live about two years, anyway, if they're lucky. Not to mention those suckers will eat anything, and I'd think it more likely the ink poisoned them. Then as now, such studies were usually publicized so that the studier could sell their solution to the problem they've just created. Some health measures make sense, while others do not. Examining books tells us that, yeah, they might have some germs on them, but still, it seems no one ever gets sick from handling books. 4. During the Depression, Librarians Went Out on Horseback to Bring Books to Mountain Folk These mounted quests weren’t easy. Sometimes, the horse (or mule) would keel over and die. The librarian would have to continue the remaining many miles on foot. Sometimes, locals didn’t take kindly to these strange women bearing written words, forced on them by the government. Nice to know nothing's changed in 90 years, except the caliber of firearms used to shoot at trespassers. 3. Librarians Put on Uniforms and Went to War (as Librarians) "Bang!" "Shh!" In Vilnius, Lithuania, the Nazis set up a Jewish ghetto and banned anyone from entering or leaving. Librarian Ona Šimaitė managed to go in and out anyway, using the excuse that she was collecting overdue library books. During these trips, she smuggled in food and arms, and smuggled out documents for preservation. She also smuggled out children, in sacks. Legitimate badassery, right there. 2. Police Arrested People in the Middle of the Night Out of Their Beds for Overdue Books Take New Jersey in 1961. Harold Roth, the director of the East Orange Public Library, decided he was through waiting on people to return their late books, and so, he called in the police. The cops staged midnight raids on 14 homes. People who had cash on them to pay the fines did so, while others had to go right to jail. Fortunately, jails have libraries. 
Today, libraries find that abolishing fines is actually the more effective tactic at getting tardy patrons to bring their books back. "What are you in for?" "Murder, rape, larceny, rape, resisting arrest, rape, and loitering. You?" "Overdue library book." 1. J.P. Morgan Locked the Nation’s Financiers in a Library Till They Agreed to Bail the Country Out This one was interesting to me because it involves a place I've actually visited. In 1907, the economy was in real trouble. ...the burden for saving the country fell on J.P. Morgan. J.P. Morgan, librarian. It's a stretch to relate this to librarianism, though technically it happened in a library, albeit a private one at the time. It's still a really interesting bit of history. Full disclosure, though, I couldn't be arsed to fact-check it, and I don't recall seeing any plaques about it when I visited the building in question. Hell, I don't think most people have heard of the financial crisis of 1907, though it's what led to the formation of the Federal Reserve System. It was comparatively brief, and later overshadowed by the Great Depression (see #4 above). The country survived, as did the library. Visit it today, to see some books, or to see a third-century Roman sarcophagus that Morgan installed. The building in question, since the article neglects to mention this, is on Madison Avenue, a short walk from the Empire State Building. I don't remember the sarcophagus, though. |