Not for the faint of art. |
Complex Numbers A complex number is expressed in the standard form a + bi, where a and b are real numbers and i is defined by i^2 = -1 (that is, i is a square root of -1). For example, 3 + 2i is a complex number. The bi term is often referred to as an imaginary number (though this may be misleading, as it is no more "imaginary" than the symbolic abstractions we know as the "real" numbers). Thus, every complex number has a real part, a, and an imaginary part, b (the coefficient of i). Complex numbers are often represented on a graph known as the "complex plane," where the horizontal axis represents the infinity of real numbers, and the vertical axis represents the infinity of imaginary numbers. Each complex number therefore has a unique representation on the complex plane: some closer to real; others, more imaginary. If a = b, the number is equal parts real and imaginary. Very simple transformations applied to numbers in the complex plane can lead to fractal structures of enormous intricacy and astonishing beauty. |
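To make that last point concrete, here's a minimal sketch (mine, not from any source quoted here) of the most famous such transformation, the Mandelbrot iteration: start at z = 0, repeatedly compute z = z*z + c, and call c a member if z never escapes past magnitude 2. The escape radius and iteration cap are the conventional choices, nothing more.

```python
# Minimal Mandelbrot sketch: one very simple transformation on the
# complex plane, repeated, yields a famously intricate fractal.

def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    """Return True if z = z*z + c (starting from z = 0) stays bounded."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c          # the "very simple transformation"
        if abs(z) > 2:         # escaped; c is not in the set
            return False
    return True                # survived every iteration; call it a member

# Crude ASCII view of the complex plane: real axis horizontal,
# imaginary axis vertical, exactly as described above.
for im in range(12, -13, -2):
    print("".join(
        "#" if in_mandelbrot(complex(re / 30, im / 15)) else "."
        for re in range(-60, 31)
    ))
```

Run it and the familiar bug-shaped blob appears in your terminal, filaments and all, from nothing but squaring and adding.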
Yeah, sorry, but I'm going to be phoning it in today. No article, no Revisited (that's tomorrow), and not enough stuff has been going on in my life to talk about it. That last bit is a good thing, though: it means my life's goal of avoiding drama is still successful. So, taking advantage of it being the final day of September, well... WAKE UP! (Yes, that's a Green Day reference.) I did want to plug a couple of things for October, which is usually a busy time around here. First, there's:
Even if you're not planning on doing NaNo, it's a good way to get your stuff organized for a longer story. Hell, you can use it if you're running a role-playing game and want to do your own adventure. It runs through October, and there's still time to sign up. And then there's:
As that is a blogging activity, I'll be participating. The prompts are even more open-ended than usual for that contest, and it should be interesting to see what people come up with. If you're intimidated by having such intimidating competition as myself, well, don't be; I never win. I'm just in it for the prompts. You only need to do 8 entries during the month. So that's about it for now, though I know there are numerous other activities here this coming month. I'll be back tomorrow with more usual content. |
It's been a few weeks, I think, since the last time one of the solar system articles came up from my queue. This one's about Saturn. Saturn Could Lose Its Rings in Less Than 100 Million Years Recent discoveries suggest that the planet’s distinctive feature may be gone in the cosmic blink of an eye I guess this is considered noteworthy because, with a few notable exceptions such as comets or sunspots, we tend to think of astronomical objects as relatively unchanging. And, from a human perspective, 100 million years is a very long time indeed. 100 million years ago, our ancestors were mouse-sized quadrupeds, hiding from dinosaurs. I did an entry a while back on Jupiter, that other gas giant in the solar system, and we've witnessed the shrinkage of its iconic Great Red Spot over the past couple hundred years—in comparison, lightning-fast. Now it might be more properly called the Okay Red Spot. Anyway, back to Saturn... If someone asked you to draw a planet other than ours, you would likely draw Saturn, and that is because of its rings. Rings are very common in SF art, to indicate "hey, a different world." And yet, of all the planets we're most familiar with, only the one has easily identifiable rings. It was Galileo Galilei who first spotted something there. His primitive telescope gave him only a slightly better view of the heavens than did the naked eye, and in 1610 he thought he saw two undiscovered bodies flanking Saturn, one on each side. I think he called them "ears." Some of his, and others', early drawings are reproduced in the article. As the article notes, when they're edge-on, they basically disappear from Earth-based views. The particles in the inner rings move faster than those in the outer rings, because they are fighting against a stronger gravitational pull. That's... well, that's mostly correct. Except they're not "fighting against" anything. The average thickness of the main rings is believed to be no more than 30 feet. A recent study showed that parts of the B-ring—the brightest ring of all—are only three to ten feet thick. I've known this for a while, but it still amazes me. Considering the overall breadth of the rings, this makes them far thinner, proportionately, than a sheet of paper. Most of Saturn’s rings lie within what’s known as the Roche limit—the distance a satellite can orbit a large object without the planet’s tidal force overpowering the object’s own gravity and tearing it apart. Which might well be how they formed in the first place. The article goes on to describe how one researcher figured out (probably) how the rings were disappearing. In 2012. Just goes to show that there's no apparent end to discovery. There's also lots of cool pictures of the rings, of course. From close-up. Because we've sent robots there. We're also treated to what I consider an excessively long biography of the researcher in question, which, admittedly, I skimmed. “Take a Good Look at Saturn Before It’s Too Late,” Time magazine cheekily warned, “Because It’s Losing Its Rings.” And this is why I have issues with most science reporting. There's another article in my queue, which I'll get to eventually, about the recent discovery of potential life-indicating chemicals on an exoplanet. Popular media immediately jumped to techno-aliens in flying saucers. But. This article shies away from such sensationalism. Because the reality of it is sensational enough. I mean... just look at those fucking rings! |
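A footnote on the ring-speed bit above: it's plain orbital mechanics. For a roughly circular orbit, speed is sqrt(GM/r), so the closer a particle orbits, the faster it has to move; no fighting involved. A back-of-the-envelope sketch (mine, not the article's; the ring radii below are rough figures, so treat the outputs as ballpark):

```python
import math

# Circular orbital speed: v = sqrt(GM / r). Smaller r means larger v.
GM_SATURN = 3.793e16  # Saturn's gravitational parameter, m^3/s^2

def orbital_speed(radius_m: float) -> float:
    """Speed of a circular orbit at the given radius from Saturn's center."""
    return math.sqrt(GM_SATURN / radius_m)

inner_b_ring = 92_000e3   # ~92,000 km: inner edge of the B ring (approx.)
outer_a_ring = 137_000e3  # ~137,000 km: outer edge of the A ring (approx.)

print(f"inner ring: {orbital_speed(inner_b_ring) / 1000:.1f} km/s")  # ~20 km/s
print(f"outer ring: {orbital_speed(outer_a_ring) / 1000:.1f} km/s")  # ~17 km/s
```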
England invented English. Here in the US, we perfected it. Mostly, I'm just sharing this because I find it intriguing. Though, as usual, I didn't do extensive fact-checking. English settlers faced with unfamiliar landscapes and previously unknown plants and animals in the Americas had to find terms to name and describe them. Gosh... if only there had been people here who had already named such things. They sometimes borrowed words from Native American languages. "Sometimes." By the time of the American Revolution, English had been evolving separately in England and America for nearly two hundred years, and the trickle of new words had become a flood. Given all that, it's truly a wonder that we're mutually intelligible. Most of the time. Corn offers an example of how English words evolved in America. Before 1492, the plant that Americans call corn (Zea mays) was unknown in England. The word corn was a general term for grain, usually referring to whichever cereal crop was most abundant in the region. For instance, corn meant wheat in England, but usually referred to oats in Ireland. When American corn came to Britain, it was named maize, the English version of mahiz, an Indigenous Arawakan word adopted by the Spanish. When the first colonists encountered it in North America, however, they almost always referred to it as corn or Indian corn, probably because it was the main cereal crop of the area. Interestingly, the Arawak weren't North American, and apparently their staple food crop was cassava root, not maize. Yeah, it gets complicated. Also, the French word for it is maïs, pronounced similarly to maize. As with aubergine and courgette, this is one of those food words that sometimes trip up US/UK relations because the British names are closer to the French than they usually like to admit. Much of the landscape of North America was new to the English, so many early word inventions applied to the natural world. Often these simply combined a noun with an adjective: backcountry, backwoods (and backwoodsman), back settlement, pine barrens, canebrake, salt lick, foothill, underbrush, bottomland, cold snap. Plants and animals were similarly named, for instance, fox grape, live oak, bluegrass, timothy grass, bullfrog, catfish, copperhead, lightning bug, garter snake, and katydid (a grasshopper named for the sound it makes). Now, see, that last one is another example of me having been wrong about something, and I can admit it. We have katydids here; I see them every summer. The cats like to try to catch them, which I discourage, because katydids are cool. But I'd never really heard whatever sound they make, so I assumed (yes, I know the line about assuming, but shut up) that it was a corruption of "caryatid." Americans repurposed other English words as well. For example, bug, which meant a bedbug in England, broadened to cover any insect... Here's where I get pedantic: in science, "bug" refers to a subset of insects, ones with specific mouth morphology. Here in the American South, "bug" can refer to pretty much anything with an exoskeleton that lives on land, including spiders, which of course aren't insects. Other usages restrict "bug" to pests, such as midges, mosquitoes, or cockroaches (or, yes, bedbugs). Which is correct? Well, all or none, depending on your point of view. Interestingly, though, no one ever seems to refer to butterflies as bugs, though they're insects, and I don't know of anyone who considers lightning bugs to be pests. 
Several words for bodies of water changed meanings between the old country and the new. In England a pond is artificial, but in America it is natural. Creek in British English refers to an inlet from the sea, while in American English it describes a tributary of a river. Eh... sometimes. Sort of. In common usage, a "creek" is like a stream or brook: a (mostly) permanent, small flow of water in a channel. Here in the South, people sometimes call it a "crick," which is usually down in the "holler." But if you look at a map of the Chesapeake Bay, especially on the Virginia side but also sometimes in Maryland, you'll see a lot of tidal estuaries called creeks. These were (I suspect, though again, not a lot of fact-checking on my end) likely named by John Smith, who was, of course, English, so I suspect he was using the "inlet from the sea" meaning. Many of them have local names followed by "creek." One example is Potomac Creek, in Virginia, which wasn't named for the river it connects to; rather, both were named for the native group, part of the Powhatan Confederacy, whose main village was located on its banks. This usage can confuse people who aren't from the area. "That's a 'creek?' It looks like a bay!" Well, it is. Both. You get into categorization problems, like with mountains vs. hills, seas vs. lakes, or what makes a world a planet. Especially with the Chesapeake Bay, which is really the flooded lowlands of the Susquehanna River, which became tidal the last time a bunch of glaciers melted to raise sea level. Whatever. I spent my childhood in the area, so I probably think about it more than most. Back to the article. An English watershed is a line or ridge separating the waters that flow into different drainage areas, but in America it’s a slope down which the water flows, or the catchment area of a river. And that was part of my career, so I always used it in the "catchment area" sense; the ridge that separates watersheds, we simply called the watershed boundary. English speakers also adopted words from other colonial countries. The language that influenced early American English the most is Dutch. You see that mostly in the New York and New Jersey area. My favorite, though, is the Murderkill River, in Delaware, which has nothing to do with murders; it's from Dutch words meaning "mother" and "river," so its name, translated, means something like "mother river river." It's actually a creek. Bushwhacker, from a Dutch term meaning forest keeper, made its first appearance in print in 1809, in Washington Irving’s comic novel A History of New York, written under the pseudonym Diedrich Knickerbocker. He describes a gathering of prominent Dutch settlers as “gallant bushwhackers and hunters of raccoons by moonlight.” I will take this opportunity to point out that "raccoon" wasn't Dutch, but came from the Powhatans' Algonquian language. They also gave us "opossum." Yankee is also almost certainly a Dutch contribution. And we have yet another thing for which we can blame the Dutch. |
Been a while since I landed on a Cracked article. No, Greek Fire isn't one of these. But rest assured, there are far more than 6. We have some Russian doodles dating back to the year 1260, showing a six-year-old imagining himself as a knight fighting monsters, but we have other whole years where our records are blank. D&D is apparently older than we thought. For example, let’s be honest, can you remember a single thing that happened in 2019? Only when I look at that year in my blog. As usual, it's a countdown... 6. What Volcano Blackened the World, Half a Millennium Ago? It seems like we should know. An eruption that big should have created immediate columns of ash 15 miles tall and would have been audible a thousand miles away. However, we don’t have records from anyone who saw or heard it. Clearly, it was the one that wiped out Atlantis. 5. What Does pH Mean? I gotta be honest; this one surprised me. My father, who had an actual degree in chemistry, told me it was "percent hydrogen," and, being younger than 13 at the time, I took that to be the Final Truth on the subject. Once I entered my rebellious years (which never ended), I was too arrogant to ever question that. You know all about the pH scale if you use quack remedies that promise benefits from balancing the pH in your body. No, you don't (yes, I know this is what we in comedy circles sometimes refer to as a "joke.") The H stands for hydrogen — that much, we know. But the p? It doesn’t stand for “percent,” though pH does measure the proportion of hydrogen. Nor, for that matter, does it stand for “proportion.” Many people think it stands for “power,” because pH takes into account exponents, which we sometimes describe using the word power. Various urban legends say it stands for various foreign words, like puissance, Potenz, pouvoir, potential and pondus. French, German, back to French, English (which isn't foreign to Cracked readers), and... Latin? The guy who came up with it was Danish; why not look in that language, which I know pretty much nothing about (though it is somewhat related to English and German)? (Whatever the p stands for, the formula it names is simple enough; there's a quick sketch at the end of this entry.) 4. Who Was the Guy on This Coin? Obviously, "this coin" is pictured in the article. Silbannacus must have been emperor for a couple months in the year 253. There was enough of a gap in the timeline for him to reign for a little bit. For a short spell in September and October, exactly 1,770 years ago this week, he was the most powerful person in the world. Then he was gone, leaving behind only two coins. For once, the RNG gave me a recent article, so I need not explain the historical context of the writing I link. I just want to know if he lasted longer than Liz Truss. 3. What Is This Battery Made Of? I'd guess... battery stuff? In 1840, the science of electricity was taking its first faltering steps. Oxford professor Robert Walker bought a curiosity called the Clarendon Dry Pile, which he set up in his lab. It used some kind of chemical reaction to generate electricity, which sent a clapper back and forth between two bells. Fun fact: we still use "pile" for "battery" sometimes, but from what I understand, the French word is "pile" (pronounced more like "peel.") You can say "battery" in France, but they'll think you're saying "drum" (batterie) which, when you think about it, makes a hell of a lot more sense than English. It’s been running almost 200 years now, without pause. And yet, my smartphone battery starts to fail after a month. The only way to find what’s in the battery is to cut it open.
This would end the ringing and may also end the world. Unless that long-ago inventor managed to stumble upon a way to circumvent the Laws of Thermodynamics, it's not a perpetual motion machine. At some point, perhaps in the distant future when humanity is either dead or living amongst the stars (or both), it'll stop. Then we'll know. 2. What Do Levi’s 501s Mean? Jeans trivia is its own special branch of knowledge. Did you know that the rivets in denim are essential for holding the fabric together, and aren’t there simply to look cool? That was true when they were invented. Now, it would be possible to use advanced materials technology to do that with thread alone. But then it wouldn't look nearly as cool. Did you know that the leather patch that bears the brand logo is called a Jacron? Okay, I learned something. And that the tiny upper jeans pocket (the one that fits a single key or an emergency ecstasy pill) was designed to hold your pocket watch? You do if you read this blog. And did you know what the “501” in Levi’s 501s stands for? Nope, you didn’t, and as you might now guess, having read this far in the article, no one else today knows that answer either. You'd think maybe it was just a cool-sounding number, which companies like car makers pull out of thin air all the time, these days. But according to the article (which, in case it's not obvious, shouldn't be taken as the final word on anything, and neither should this blog), there was a whole history behind it. Said history was lost in a fire, much like the contents of the Library of Alexandria. 1. Why Was Ivar the Boneless Boneless? I'm guessing because his dick didn't work. Was he missing an arm maybe, or a leg? That sounds reasonable, but it would still leave him with plenty of other bones and may not justify the “boneless” sobriquet. One theory is that he really did have no bones, as we normally know them, but had cartilaginous bones, a condition known as osteogenesis imperfecta. If he did, you’d think every account of his life would have mentioned this fact, long before detailing which parts of Northumbria he raided. I suppose it's also possible that he was extraordinarily flexible. Physically, anyway. However, one other theory remains. The name could have been a way of noting that Ivar couldn’t get a boner. We have no record of Ivar having children, and his name may have mocked his impotence. Or his dick didn't work. |
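As promised back at the pH item: whatever the p stands for, the quantity itself is just a negative base-10 logarithm of the hydrogen ion concentration, which is where all the talk of exponents and "power" comes from. A quick sketch (mine, not Cracked's):

```python
import math

# pH = -log10 of the hydrogen ion concentration in moles per liter.
# The "p" is the unsolved mystery; the arithmetic is not.

def ph(hydrogen_molar: float) -> float:
    """Compute pH from hydrogen ion concentration (mol/L)."""
    return -math.log10(hydrogen_molar)

print(ph(1e-7))   # pure water: 7.0, neutral
print(ph(1e-2))   # acidic: 2.0
print(ph(1e-12))  # basic: 12.0
```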
I can't resist a piece on Poe any more than my housemate's kitten, Edgar Allan Purr, can resist a dangling string. The cat's name was my idea, incidentally. I don't know much about this source, Crimereads, by the way. I just found this article while searching for something else, and, well, like I said, dangling string. Even though literature had, for centuries, brimmed with clever problem-solvers, from tricksters to reformed thieves to wise men to police prefects, Edgar Allan Poe’s detective story, “The Murders in the Rue Morgue,” still awed the literary world when it appeared in 1841. Detective stories were, for Kid Me, kind of like crossword puzzles: fun to complete, but the secrets of their creation might as well have been quantum physics for all I understood them. I would read a science fiction or fantasy book and go, "Hey, I bet I could write like that." Not so with the detective genre. The police are stumped. But C. Auguste Dupin, a chevalier and rare book aficionado, solves the mystery at home after reading the details in the paper, becoming literature’s first bona fide detective character and starting a genre revolution. Astute readers will note that thus, science fiction predated detective fiction as a genre. As literary critic A. E. Murch writes, the detective story is one in which the “primary interest lies in the methodical discovery, by rational means, of the exact circumstances of a mysterious event or series of events.” Critic Peter Thoms elaborates on this, defining the detective story as “chronicling a search for explanation and solution,” adding, “such fiction typically unfolds as a kind of puzzle or game, a place of play and pleasure for both detective and reader.” Like I said. Crossword puzzle. The well-heeled Dupin is an armchair detective who solves puzzles because he can, using a process called “ratiocination,” in which he basically ‘thinks outside of the box.’ I've known several people who prided themselves on "thinking outside of the box." Without exception, all of these people were lousy at thinking in the first place, and just did free-association. This is quite a different procedure than "ratiocination." (And it’s a good thing he does, or no one will solve these crimes; the murderer of “The Murders in the Rue Morgue” turns out to be an escaped orangutan. It might be safe to say no one else would conclude that.) Goddammit! SPOILERS! If Poe had not solidified the conventions that we recognize as marking the modern detective story, others likely would have done the same not long after. Okay, but this is true about pretty much any invention or discovery. If Einstein hadn't worked out relativity, someone else would have. If Nikola Tesla hadn't invented the alternating current method of electricity transmission, others would have. That in no way diminishes Poe's legacy. And no nineteenth-century detective lineage would be complete without Eugène-François Vidocq, a criminal-turned-criminologist who lived from 1775-1857 and who founded and ran France’s first national police, the Sûreté nationale, as well as France’s first private detection agency. His life inspired countless (swashbuckling) adaptations, including an American adaptation published in Burton’s Gentleman’s Magazine in 1838, entitled “Unpublished passages in the Life of Vidocq, the French Minister of Police,” which Poe very well might have read. Interestingly, there’s a character in that story named “Dupin.” Ahem. "Good writers borrow. Great writers steal." -T.S. Eliot, and everyone who ever quoted him.
Dupin’s ability to read extraordinary meaning into clues makes him rather the first semiotician, elucidating the relationship between signs, signifiers, and ‘signifieds’ some 75 years before Ferdinand de Saussure’s work on the subject was published in 1916—particularly because Dupin finds his clues through linguistics rather than physical objects. Thus, Poe also invented postmodernism. Before modernism was even a thing. Years later, Arthur Conan Doyle wrote, “Each [of Poe’s detective stories] is a root from which a whole literature has developed… Where was the detective story until Poe breathed the breath of life into it?” He stole. Indeed, Doyle construed his detective Sherlock Holmes as an intellectual descendant of Holmes, having Watson (who also participates in a lineage offered by the Dupin stories, but of Dupin’s supportive narrator/chronicler and friend) cite Dupin upon first witnessing Holmes’s deductive genius. Clearly, that second Holmes should have been "Dupin." Everyone makes mistakes. Even fictional detectives, from time to time, when the plot requires it. As the article notes at the end, Holmes wouldn't have existed, at least not as we know the character, without Dupin. Neither would Batman. |
Some people peak in childhood. Some of us never peak at all. Here's a peek. What Happens to Spelling Bee Champions When They Grow Old? Spelling contests have remained a feature in American life since the Puritans landed on Plymouth Rock — but is peaking at 12 years old all it’s cracked up to be? Yeah, just to be clear, I'm all for stretching facts to comedic ends, but that bit about Plymouth Rock is plain bullshit with no humor in sight. Spelling was pretty fluid until the early 19th century, and the earliest mention of a spelling contest in the US comes from the first decade of that century. Anyway. Thinly veiled colonialism aside, the article: Sixty-two-year-old Brad Williams remembers what he ate for breakfast and lunch — Corn Flakes and hamburgers, respectively — on May 3, 1969, the day he won the Wisconsin state spelling bee after spelling grisaille and lamprey correctly. He also, miraculously, remembers exactly what he was doing on my birthday, September 2, 1978. See, now, that's the opposite of me, almost. I couldn't tell you what I had for lunch last Thursday, let alone on September 2, 1978. I'm told I was alive then; that's about all I know. And yet, I'm pretty good with spelling. Not perfect, and I'd be stumped by grisaille because I don't even know what that is, but I'm pretty sure it's French. Williams credits his spelling prowess in large part to the hyperthymesia [a rare condition granting near-total autobiographical memory], since if the word in question happened to be on a study list, he could instantly visualize it on the page. I'd credit a spelling bee loss to hyperthymesia. If that was one of the words. As an aside, one snippet of memory I do have from the late 70s involves a spelling competition. Just a minor one, classroom only, and I don't even remember which grade I was in, but it was at least 4th. One girl was asked to spell "dough" and she said d-o-h. That probably stuck in my mind because it was the moment I realized I was better than everyone else. Then, probably a few days later, some bullies beat the shit out of me and I realized maybe I wasn't, after all. The 4-foot-5 kid had a trophy nearly half his size in his bedroom and considers winning the state bee as one of his life’s major accomplishments. That's fair. One of my life's major accomplishments was managing to wake up early enough for a job interview, once. It was invariably a part of Colonial education throughout the 1700s and 1800s, he writes, and the term “bee” referred to many different social events — e.g., “quilting bee,” “barn-raising bee” and “corn-husking bee” — in which an entire community came together, like bees in a hive, for a common goal. Again, I have some doubts about the practice before 1800. But I'm including this bit to illustrate what "bee" means in this context. This, though, is why I'm featuring the article in the first place: I'm a sucker for Mark Twain quotes, and apparently, he opened a local spelling bee in Connecticut in 1875: “Some people have an idea that correct spelling can be taught, and taught to anybody,” Twain remarked. “That is a mistake. The spelling faculty is born in man, like poetry, music and art. It is a gift; it is a talent. People who have this talent in high degree need only see a word once in print and it is forever photographed upon their memory. They cannot forget it.
People who haven’t it must be content to spell more or less like thunder, and expect to splinter the dictionary wherever the orthographic lightning happens to strike.” Another of my misunderstandings, early in life, was that because I was pretty good (not perfect) at spelling and punctuation, I had everything I needed to be a writer. Boy, was I wrong. The first national spelling bee was held 25 years later in Cleveland on June 29, 1908. I was also pretty good at math. For instance, 1908 is 33 years later than 1875. A 500-boy choir sang, three bands performed and one newspaper enthused that “thousands of electric lights will furnish illumination.” Marie Bolden, a 14-year-old African-American daughter of a Cleveland mail carrier, won the competition with a perfect score. Perhaps due to the shock of a black contestant besting all of her white competitors, Maguire suggests the next national spelling bee wasn’t organized until 1925. This would be funny if it weren't for the goddamned racism. With probably a dose of sexism thrown in for good measure. As for whether spelling is still an important skill (the answer is yes), the article goes into that, then notes: Also, cautions Tracey Sturgal, a linguist professor and director of business communication at Marquette University, spell-check isn’t perfect. “People still have to have a certain level of spelling competence,” she explains. “The number one spelling error I get in college papers is when students flip definitely with defiantly, which is surely a spell-check error.” Spell-checkers also can't tell its from it's, one of my markers when privately assessing someone else's intelligence. Others include your vs. you're, and there, they're, their. I have to cut this short because my cat has a vet (that's veterinarian, not veteran) appointment, but in brief: unlike child actors in movies and television, who often seem to burn out and/or start using drugs (with notable exceptions like the kid who played Charlie Bucket in Wonka, who pursued a career as a vet (the first kind)), it seems aging former spelling bee champions mostly have their shit together. |
Today, we're going only about 4 years into the past for this entry from August of 2019: "What's Good for the Propaganda" The entry was, as mine have mostly come to be, prompted by an article. And the article, from Fast Company, is still up for now. It's a quick read if you're interested; it's about design as propaganda. I don't have much to say about my own comments, so I'll get them out of the way: That last tide seems to be turning, now, thanks to Elon Musk making electric cars that people might actually not be embarrassed to be seen in, as opposed to the pug-ugly "Smart Car." Boy, this bit didn't age well, did it? Though I stand by my assertion that Tesla's cars are aesthetically pleasing (and, reportedly, usually functional), Musk went ahead and crashed and burned like one of his other companies' rockets. As a result, today, I'd be embarrassed to be seen in a Tesla, regardless of how cool it might look. It's a disgrace to its namesake now. I don't usually think about it, which makes me a sucker. But I'm going to work on it. And I think I have, though I'd forgotten this particular article and entry over time. What I didn't do in that entry is take a look at the article itself, so I'll do a bit of that now. So the following quotes are from the article, not my entry: By elevating everyday, inexpensive objects that fit the museum’s criteria of “good design,” MoMA paved the way for modernism to hit the mainstream, launching the careers of seminal designers like Eero Saarinen and Charles Eames and displaying designs that visitors could actually buy. There's a problem with the word "modern," and I think I've touched upon it before. It can be used as a noun or adjective to describe something that's roughly contemporary, as opposed to obsolete or outdated. That definition is a bit slippery, as some cultural or technological things change faster than others. The other meaning of "modern" describes a particular artistic or aesthetic movement, one which probably peaked around 1950 or so (but I'm not an expert on these things). Whatever the actual end date was, what's called "modern" in art is now obsolete, thus rendering it not "modern" by the other definition. Oh, and in case you're not aware and don't want to read the article, "MoMA" refers to New York's Museum of Modern Art, which, despite me being a critic of just about everything inside it, is a great place to visit in the city. Or at least it used to be; I haven't been there for several years. There was a secondary motivation for the Good Design institution as well: economic expansion, both at home and abroad. According to Kinchin, the Good Design exhibitions, which were established in conjunction with the Chicago Merchandise Mart, played a key role in educating the American consumer about why they should be buying these kinds of American household products. So, the whole thing was basically an ad. In other words, the Good Design exhibitions were marketing. Stores that wanted to sell objects that had been featured could emblazon them with the program’s logo, a red dot with the words “Good Design” written inside. Catalogues for the exhibitions also included exactly where people could buy each product, like a glorified showroom with an institutional stamp of approval from the MoMA curators. Yep. Ad. The competitions, which were accessible to all, also opened the door for European designers to find a foothold in the U.S. market. 
For instance, MoMA’s 1950 International Competition for Low-Cost Furniture Design was meant to generate new furniture that could be mass-produced. Clearly, this resulted in the hegemony of Ikea, which I also mentioned in that earlier entry, accompanied by the words "stay the hell away from." That's about it for me. There's obviously more in the article. |
Rather appropriate for the first day of fall... Yes, I said fall. I get really, really tired of saying "or autumn if you're in an Anglophone country that's not the US, or spring if you're in the southern hemisphere, but some cultures considered this to be the midpoint of the season, so it gets complicated." As mine is the only perspective that matters, it's fall, and today sucks because we're getting spitting rain thrown off from a minor hurricane. Anyway. Why is it appropriate? Because while "equinox" literally means "equal night," implying that day and night are of equal length, that's not technically the case on the equinox, and part of the reason is twilight. For example, this is the almanac for my area today (courtesy of Weather Underground):

Sunrise: 7:03 AM / Sunset: 7:11 PM
Civil Twilight: begins 6:37 AM, ends 7:38 PM
Nautical Twilight: begins 6:06 AM, ends 8:08 PM
Astronomical Twilight: begins 5:35 AM, ends 8:40 PM
Length of Visible Light: 13 h 0 m
Length of Day: 12 h 8 m

"Length of visible light" takes twilight into account. "Length of day" is, you'll note, 8 minutes longer than it would be if day were actually equal to night. That's because the atmosphere refracts sunlight, so the sun appears to rise earlier and set later than it would if we didn't have an atmosphere (in which case there would be no one around to be picky about these things). The three twilights there... well, that gets us back to the Atlas Obscura article. When the sun slides out of sight—or, more accurately, when Earth rotates so that your particular place on the planet is no longer exposed to our day star—but before darkness consumes the landscape, there is the magical, muted time of twilight. One of my all-time favorite words is "crepuscular." Animal activity can be crepuscular, or there might be crepuscular rays visible in the sky due to distant clouds. It's an adjective meaning "of or pertaining to twilight." I'm unaware if it has a noun form; "crepuscule" just sounds wrong. Most of us think of twilight as a single period of transition from bright day to deep night, but there are actually three twilights: civil, nautical, and astronomical. Each is defined by how far below the horizon the sun is, and hints at our lives before artificial light and GPS. Or, for those of us who care, our lives now. Civil twilight has nothing to do with being polite; it’s simply the evening’s first twilight, which starts at sunset: the moment when the sun’s center is exactly at the horizon or, to get technical, 0 degrees below it. Again, this is not precisely true because of that refraction thing. It also happens, in reverse, near sunrise. Once the sun is 6 degrees below the horizon, nautical twilight arrives. If you’re not a degrees kind of person, hold your arm out to the horizon, with your hand turned so your palm is facing the horizon and your fingers, parallel to the horizon, are together but extended. Close one eye. The width of your first three fingers is roughly 5 to 6 degrees, depending on the size of your hand. That's nice to know and all, but at that time, you can't bloody see the bloody sun, so how do you know how far below the horizon it would be without math and a clock? Nautical twilight is the period after sunset when, though no longer able to read a chart without a lantern or torch, sailors could still take accurate readings. Reverse that for morning. Also, this doesn't take into account cloud cover or moon phase, which can shorten or lengthen the time when one could see the horizon.
Also also, things are a bit different on land because you sometimes have hills and mountains and stuff. Your own height of observation plays into it as well. Regardless of all these variables, it's still cool stuff to know. |
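To gather the three twilights in one place, here's a small sketch (mine, not Atlas Obscura's). The article gives the 0 and 6 degree marks directly; the 12 and 18 degree cutoffs for nautical and astronomical twilight are the standard definitions:

```python
# Twilight is classified by how far the sun's center sits below the
# horizon: civil to 6 degrees down, nautical to 12, astronomical to 18.

def twilight(sun_elevation_deg: float) -> str:
    """Classify by solar elevation in degrees (negative = below horizon)."""
    if sun_elevation_deg >= 0:
        return "daytime"
    if sun_elevation_deg >= -6:
        return "civil twilight"
    if sun_elevation_deg >= -12:
        return "nautical twilight"
    if sun_elevation_deg >= -18:
        return "astronomical twilight"
    return "night"

for elevation in (10, -3, -9, -15, -30):
    print(f"{elevation:>4} degrees: {twilight(elevation)}")
```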
Well, here's a source I don't think I've quoted before: Architectural Digest. 9 Creepiest Places You Should Probably Never Visit From an abandoned asylum to a church adorned with human skulls, these sites are not for the faint of heart Dude. I've spent the night at the Clown Motel in Tonopah, Nevada. You think these, or any, locations can faze me after that? Except for, you know, actual dangerous places like active volcanoes or Baltimore. Article is from October (hence the theme) of 2019, but I doubt if much has changed. Though popular attractions, these creepy places have the benefit of being rooted in history with layers of culture, which means they’ve been well preserved over time. That also means maybe don't use other cultures for cheap thrills without their permission. Instead, these are places packed with bones, scrawny cats, and the paranormal. Bones are just part of the life cycle. I hope the kitties are okay. And the paranormal is all in your head. Now, remember, for full effect, you'll need to click on the link for helpful pictures. Island of the Dolls, Mexico Somehow, I doubt that's its official Mexican name. Isla de las Muñecas, as it's called in Spanish, is south of Mexico City... Or that city name, for that matter. ...the island is largely deserted, save for hundreds of dolls hanging in the trees... Gosh, I wonder why. Now, would someone please bet me lots of money that I can't spend the night there? Mansfield Reformatory, Ohio Oh gods, no! Not Ohio! It's no longer in operation—it closed in 1990—but you can go on a guided or self-guided tour... Alone. Nagoro, Japan Small-town Japan is quaint, but this eerie village in the Iya Valley has just 30 residents—and over 400 large dolls. Again with the dolls. Though I'm pretty sure these are technically mannequins. La Recoleta Cemetery, Buenos Aires, Argentina At least they tried to get the language right, this time. ...this Buenos Aires resting place is seriously haunted—even the city's tourism website endorses the Neo-Gothic cemetery’s status. How much you wanna bet the city had some giggling workers put in speakers and animatronics on the sly? Trans-Allegheny Lunatic Asylum, West Virginia The hospital earned National Historic Landmark status in 1990 but closed down in 1994—and rumored ghosts have haunted the premises ever since. Oh, fun! Not just ghosts, but batshit hillbilly ghosts! Sedlec Ossuary, Kutná Hora, Czech Republic The design dates back to 1870, when a local man was hired to take bones stored in a crypt and turn them into art. I might have to go Czech this out. Veijo Rönkkönen, Finland Named for the artist who created the 550 concrete sculptures within—all human figures in a forested setting—it can appear overwhelming, as if you are being watched or maybe even judged. Basically, more dolls. The Hill of Crosses, Lithuania Based on the picture in the article, this is actually pretty cool. Akodessewa Fetish Market, Togo ...just keep in mind that "fetish" has at least three meanings, and this is one of the less salacious ones. Still, the appropriate definition here includes the phrase "believed to be inhabited by a spirit," so, you know... good luck. So, by my quick scan, there's at least one of these on every continent, save for Antarctica and Australia. Way to be inclusive, guys. Antarctica's probably left out because there's just not much there, and Australia because, well, that whole continent/island is creepy. I'd visit all of them. Wouldn't you? |
Might as well call this "How To Stop Being Awkward." How To Read The Room Like A Pro Reading the room is about seeing and hearing what’s both spoken and unspoken. And it’s a skill well worth mastering. Lots of skills are well worth mastering. That doesn't mean we have the time, patience, or opportunity for all of them. For instance, to practice reading a room, one would have to repeatedly go out among people. You walk into a conference room, dinner party, or group of playground parents and make a comment that immediately shifts the ballast of the conversation. Eyes dart at you. Their message is clear: Dude, read the room. But you’ve already said or done something out of sync with what’s appropriate in the moment. Despite what I said above, I do go out among people from time to time, or at least I used to. While I can't say I've never embarrassed myself, I do think I have some skill at reading body language and facial expressions. Conversely, though, I've sometimes been one of the people saying, "Dude, read the room." Learning how to read the room is an important skill, one that can be honed by pausing to observe a few key details. No time for that. I have jokes to tell! Think about if you’re home alone. You know that you can act a certain way, i.e., wear no pants. I think I'm one of those weirdos who just feels better, even at home alone, wearing pants (well, sweatpants, but whatever). And a shirt. And a hat. If company is there, you know enough to put on clothes. Not if I want them to go away. You want to practice, and like with a recipe or golf swing, you can get better if you invest the time. Find a partner who’s willing and will provide honest feedback. "Find a partner." Right. Like that's easy. Look at how people’s shoulders are angled. Then notice where their chests are pointing. For fuck's sake. I've spent all of my adult life, and teenage years, forcing myself to NOT stare at half the people's chests. You also want to notice people’s expressions as you listen to what they’re talking about, keying in on the paraverbals — the cadence, tone, volume, pace. Hey, I learned a new word! If "paraverbals" is really a legitimate word. My browser underlines it with that squiggly red line. It’ll take time, and you’ll make mistakes, but because you’re trying, they’re usually non-fatal. That's too bad, because if it's non-fatal, you spend the rest of your life waking up in a cold sweat at 4 am going, "Goddamn, I wish I hadn't said that." Anyway, there's a lot more at the link. To me, it highlights the dangers of dealing with people. Fortunately, I've made enough mistakes of that sort that I never get invited anywhere anymore. Things are much easier online, and I'm all about easy. |
I fail to see how this is better. We may earn revenue from the products available on this page... Popular Science used to be a credible publication. Washing dishes is awful. Well, sure; any chore is. But I can name a whole lot worse. Thankfully, a woman named Josephine Cochrane, who was really concerned about her fancy china getting chipped while being hand-washed, stepped up and invented the first dishwasher. Yes, I've referenced her before, here: "Pretty Petty" . I would like to reiterate that she invented it so she wouldn't have to do the dishes herself because her servants were incompetent. Presumably, she'd chip the china too, and then who was she going to blame? Eh, probably the servants, anyway. Now, other than the brainpower and Tetris skill you need to load the machine, you’ve only got to put some detergent in the soap compartment and press “start.” Brainpower? Tetris skill? I like a good hyperbole, but it's really not that difficult. Sure, Cochrane’s invention saves you time, but you still have to buy detergent. The horror. Make your own dishwasher tablets, though, and you’ll save some money. The draw of a dishwasher is that it saves you time. It lets you be lazy, or at least concentrate on working on something else, if you're one of those weirdos who has to be productive all the time. Making your own dishwasher tablets defeats the purpose, much as making pancake batter from scratch defeats the purpose of buying frozen microwavable pancakes. It’s easy... Having read the instructions, no, it's not. ...makes cleaning up a bit more exciting... A number can be less negative, but still negative. and will leave your most likely not-so-fancy china shiny and smelling of fresh lemon… Yep, that's the goal: have your plates and silverware, with which you're going to eat, smelling like fruit or flowers. Not. Time: 20 minutes (with a minimum 24-hour drying period) In my experience, you have to triple the time (at least) of any recipe, food or not. Material cost: $27 ($5.40 per batch—or $0.14 per pod) Similarly, the given cost is always underestimated. Difficulty: easy Lie. So, the rest of the article is the actual recipe, presented in the same style as food recipes. I'll give them this, though: the intro wasn't 20 pages long, though some of the steps have meta-information to make up for the brevity of the intro. Considering that it's finicky, and involves a lot of cleanup in itself (all the equipment needed, plus the inevitable powder spills on your counter), I'm not convinced you're saving anything. You get bragging rights, maybe, but doing all that work defeats the purpose of having a dishwasher in the first place. At least they seem to have gotten the science right, which, considering the source, is a bare minimum. |
I wasn't aware there was still any doubt, here. Animal magic: why intelligence isn’t just for humans Meet the footballing bees, optimistic pigs and alien-like octopuses that are shaking up how we think about minds Just a few gripes in the above before we get into the meaty bits: 1) Why muddy the waters by calling it "magic?" 2) As I'm sure we can all attest, intelligence isn't for all humans; 3) How do we know octopuses are alien-like, when we haven't met any aliens yet? Hopefully the rest of the article, and the book it's pushing, isn't that sloppy. Guardian link, so beware of British spellings. It's fairly long, so just a few highlights: But what makes a pig optimistic? In 2010, researchers at Newcastle University showed that pigs reared in a pleasant, stimulating environment, with room to roam, plenty of straw, and “pig toys” to explore, show the optimistic response to the squeak significantly more often than pigs raised in a small, bleak, boring enclosure. In other words, if you want an optimistic pig, you must treat it not as pork but as a being with a mind, deserving the resources for a cognitively rich life. I don't want an optimistic pig. I want tasty bacon. If treating it like Pig Royalty makes the bacon taste better, great. We don’t, and probably never can, know what it feels like to be an optimistic pig. Objectively, there’s no reason to suppose that it feels like anything: that there is “something it is like” to be a pig, whether apparently happy or gloomy. Until rather recently, philosophers and scientists have been reluctant to grant a mind to any nonhuman entity. Philosophically, we can't know what it feels like to be a different human, either. Oh, sure, we can take some guesses, and maybe even listen to them blather on (or read the blather) about this or that which makes them happy or angry or whatever—they seem to like that—but there's no way to really know what it's like to be them. Except that it probably sucks. To René Descartes in the 17th century, and to behavioural psychologist BF Skinner in the 1950s, other animals were stimulus-response mechanisms that could be trained but lacked an inner life. Perhaps we, too, are stimulus-response mechanisms, and our "inner life" is entirely illusory. After all, as Charles Darwin pointed out, we all share an evolutionary heritage – and there is nothing in the evolutionary record to suggest that minds were a sudden innovation, let alone that such a thing occurred with the advent of humans. This is true, but we didn't come by the "humans are special" philosophy by means of science, but through religion. Consider the often maligned bird brain. Compared with bird neurodiversity, humans are a monoculture. Birds’ minds are scattered widely in mind-space, their differences and specialities tremendously varied. Some birds excel at navigation, others at learning complex songs or making elaborate nests. Well, okay, but you can't compare {the set of all humans} with {the set of all birds}. Well, obviously, you can, but it would be like comparing the works of Beethoven with the entirety of EDM. Compare mammals with birds, or humans with ravens, or you have a categorization problem that confuses SAT-takers. We award pride of place in the hierarchy of bird minds to tool-using species, especially corvids (crows, ravens, rooks). The most masterful of them is the New Caledonian crow of the south Pacific, which will design and store custom-made hooks for foraging, and even make tools with multiple parts. 
Among animals, great apes, dolphins, sea otters, elephants and octopuses are the only others known to use tools. I have yet to hear of any nonhuman animal using a tool to make another tool. That, in my view, signals an ability to plan beyond the present and near future, and seems to be a key trait of humanity. Maybe it's happened and I just haven't heard of it. Although animal communication can be subtle and complex, it’s generally thought that no animal besides a human uses symbolic communication, where one concept is represented by another, as it is in writing. None, that is, except perhaps the honeybee, which conveys information about a distant food source to its hive members by dancing. Pretty sure symbolic communication is more widespread than that, but I'm no expert. Still, I doubt they grasp the concept of metaphor. Hell, half of humanity doesn't grasp the concept of metaphor. The article goes on to describe octopus cognition, and it's fascinating too. But, lest all of this talk makes you want to become vegan (as for me, it just makes me want bacon): Even this, however, might sound tame compared to the idea that plants have minds. Yet that proposition is no longer confined to the fringes of new-age belief; you can find it discussed (relatively) soberly in august scientific journals. There, it often goes by the name of “plant neurobiology” or, in a more extreme form, “biopsychism” – which supposes that every living being from bacteria up has sentience of a sort. "Biopsychism" is at least more palatable to me than panpsychism, which I've ragged on in here before, at length. That's the unprovable and unfalsifiable philosophy that all matter has rudimentary consciousness. We might not (and may never) agree about whether plants, fungi or bacteria have any kind of sentience, but they show enough attributes of cognition to warrant a place somewhere in this space. This perspective also promotes a calmer appraisal of artificial intelligence than the popular fevered fantasies about impending apocalypse at the hands of malevolent, soulless machines. But those fantasies are fun. Likewise, most of our fantasies about advanced alien intelligence suppose it to be like us but with better tech. That’s not just a sci-fi trope; the scientific search for extraterrestrial intelligence typically assumes that ET carves nature at the same joints as we do, recognising the same abstract laws of maths and physics. But the more we know about minds, the more we recognise that they conceptualise the world according to the possibilities they possess for sensing and intervening in it; nothing is inevitable. There's good reason for this assumption: with it, we know what to look for. Again, though, I doubt the existence of technological aliens close enough for us to detect. We also have some idea how we might detect signs of any sort of life, including microbial, in the atmospheres of exoplanets... because we've studied our own biosphere. If we knew to look for different signals indicating a different biology (or technology), you bet we'd be looking for them, too. Also, I didn't miss the inherent carnivorous metaphor in the above quoted bit: "carves nature at the same joints as we do." I find it amusing in a text that, while not explicitly pro-vegan, could certainly nudge a few people in that direction. |
Wood isn't known for its long-term durability. Smithsonian notes an exception. This Wooden Sculpture Is Twice as Old as Stonehenge and the Pyramids New findings about the 12,500-year-old Shigir Idol have major implications for the study of prehistory Some environments preserve wood better than others, though: Gold prospectors first discovered the so-called Shigir Idol at the bottom of a peat bog in Russia’s Ural mountain range in 1890. For our purposes, I'm going to call this carved hunk of tree St. Peat. The unique object—a nine-foot-tall totem pole composed of ten wooden fragments carved with expressive faces, eyes and limbs and decorated with geometric patterns—represents the oldest known surviving work of wooden ritual art in the world. I'm wondering how they determined it was ritualistic and not just, you know, ars gratia artis. Was it like "Oh, our ancestors were all cowardly and superstitious, and whatever they did, they did to appease the gods and/or spirits?" Because they definitely weren't cowardly (couldn't be), and what religion they did have probably wasn't regarded as superstition any more than you regard your religion as superstition. Based on extensive analysis, Terberger’s team now estimates that the object was likely crafted about 12,500 years ago, at the end of the Last Ice Age. Technically, by some accounts, we're still in an Ice Age, just one that's in retreat. “The landscape changed, and the art—figurative designs and naturalistic animals painted in caves and carved in rock—did, too, perhaps as a way to help people come to grips with the challenging environments they encountered.” Or perhaps because art changes over time. Maybe not as quickly as today's deliberate art "movements," but then, as now, people get bored and/or inspired and make art. I've heard that hunter/gatherer societies had lots of free time, more than us civilized folks. The debate has major implications for the study of prehistory, which tends to emphasize a Western-centric view of human development. I imagine it's hard to make definitive conclusions about a society from one lone artifact, but that's no excuse to let one's preconceptions fill in the gaps. Prevailing views over the past century, adds Terberger, regarded hunter-gatherers as “inferior to early agrarian communities emerging at that time in the Levant. At the same time, the archaeological evidence from the Urals and Siberia was underestimated and neglected.” H-Gs aren't "inferior." Just different. Don't get me wrong; I like civilization. But it does have its downsides. João Zilhão, a scholar at the University of Barcelona who was not involved in the study, tells the Times that the artifact’s remarkable survival reminds scientists of an important truth: that a lack of evidence of ancient art doesn’t mean it never existed. Rather, many ancient people created art objects out of perishable materials that could not withstand the test of time and were therefore left out of the archaeological record. Much is made of cave art, and for good reason: it's a window into the thoughts of humans (and related species) of the past. But I find it difficult to believe that they confined their paintings to cave walls; that's just where the art would be best preserved. Artists today (using the term very broadly) would leave their mark everywhere, if they could; and some do. For all we know, every stone was covered in graffiti, every tree carved, every cliffside covered in murals. We're not so different from the people who carved St. 
Peat, in other words: creative, curious, aware of our mortality, at least moderately intelligent, social, communicative. Sure, we likely have different priorities in life ("make money" instead of "look out for tigers," e.g.), and we, or at least most of us, have more knowledge about the world around us. But as I keep saying, don't conflate knowledge with intelligence. And don't be so sure that a 12,500-year-old work of art was what you interpret it to be. |
Archaeologically excavating the bones of the midden heap that is my blog, we come to this early attempt at an entry from near the end of November, 2007: "Another Saturday Night" The title comes from an old Cat Stevens song, but it was one of his more popular ones, so I guess I figured people knew the reference. That's the problem with reference jokes: not everyone is in on it. Though one could make the same observation about any joke that doesn't involve things we all share. Like farts. But fart jokes aren't funny. I say all this because the whole entry was about me racking my brains for a Comedy newsletter editorial. I've done thirteen of those a year ever since the year I wrote that linked entry, and while it's sometimes easy to find a topic, I'm often stumped until the looming deadline forces me to just pick something, dammit. And not jokes about impending deadlines, either; that only worked once. Or maybe twice. I don't know; I'm certain I repeated many topics over the years. Which reminds me, I need a Fantasy newsletter topic today. What will it be? I don't know yet; it's not close enough to the deadline for me to feel enough of a sense of dread, which is when my brain finally wakes up. Anyway, from that entry: Thing is, I didn't start out to be funny - I started out to write science fiction. The two are, of course, not mutually exclusive, but one risks being accused of ripping off Hitchhiker's. Unless you're a Star Trek writer; they do a great job at jabs with Lower Decks. A common trick in science fiction, one which goes all the way back to its roots in Mary Shelley's Frankenstein, is to write in the Alien Observer. Ugh. The naïveté of youth. Half of this entry is embarrassing to me now, being so obvious. I guess it was new to me at the time. So I won't paste more of that. I'll just note that, when I was done, I realized I had written myself into an editorial, assuming a few changes. It's been so long, though, that I don't remember if I actually used it or not, and can't be arsed to go back that far in the newsletter archives. It may have been the first time I turned a blog entry into a newsletter editorial, but it certainly wasn't the last. |
Ever been told to stay in your lane? The Windy History of Penny Lane: The Beatles, the Slave Trade and a Now-Resolved Controversy Was Penny Lane named after a notorious slave trader? Recent protests reignited the debate If you've been following along, you know I rarely stay in my lane. One excursion I often make is into the subject of etymology, and, in the past, I've been especially fascinated by the etymology of currency. For instance, do you know what dollars and Neanderthals have in common? Both were named after valleys in what were then German-speaking regions: Joachimsthal and Neanderthal (those are old spellings; modern orthography omits the h from the th). Joachimsthal was a silver-mining location, and the silver coins made from its product were Joachimsthalers, and it's easy to see how you get from Joachimsthalers to thalers to dollars. (Neandertal has a linguistic connection to "new man," as it's partly a Graecized version of the surname Neumann, or "new man," but that seems to be one of those apt but unrelated coincidences that happen occasionally.) The origin of the pound (as a unit of currency) is similarly well-documented. But for something so common and widespread as the penny, well, they're not entirely sure. It's somehow related to the Roman denarius, which is why, for example, 12d nails are pronounced "twelve-penny nails." What I'm fairly certain of is that it's completely unrelated to the similarly-spelled penis, which in Latin was a word for tail. Penis. Anyway, it gets even murkier when you have people with the surname Penny or Penney. Apparently, one of them was a slave trader, and of course that's one of the occupations we don't name things after anymore. An old theory linking the street to a notorious slave trader had resurfaced due to the protests surrounding the police killing of George Floyd — and a cadre of local historians discovered that their research was now thrust into the public eye. That idea doesn't deserve the label "theory." At best, it's a guess. “It’s been an academic debate, really. So it’s a bit of a surprise to us all, to be honest; we’re sort of taken aback. We’re not used to this larger media interest in the names of streets going back to this, you know, 17- and 1800s — it’s not the usual thing that makes the news.” My hot take on this is: even if it was named after James Penny (for which, as the article indicates, there is no evidence whatsoever), that was so long ago, and the word is so common in other contexts, that it just doesn't matter anymore. Besides, any metadata on that particular street changed drastically nearly 60 years ago, when it became forever and indelibly associated with the Beatles. Following the graffiting of the signs, though, Liverpool’s Metro Mayor Steve Rotheram made international news after proclaiming the famed street name may be changed if there was evidence it was named after 1700s slave trader James Penny. On the one hand, it's their city and they can do what they want with it. We fought and won a war so that England couldn't tell us what to do anymore; it's only fair that we don't interfere with them, either. But, as you also know if you've been following along, I really hate to see falsehoods become accepted as truth. Enter MacDonald and other historians, who have been researching the area for more than 10 years and claim there is no connection between Penny Lane and the slave trade. It's remarkably hard to prove a negative. 
The only thing a historian can do in such a case as this, if they can't definitively show that Penny Lane was named for someone (or something) else, is to indicate that there's no known connection. So, if not some arsehole trader of human beings, what was it named after? They, um, don't know. According to the historian, the earliest mention of the lane was from the 1840s, when it was listed as Pennies Lane. In maps going back to the 1700s, it was merely an unnamed country road. Meanwhile, James Penny died in 1799 — plus, he already had a street named after him: Arrad Street, named for his birthplace in Ulverston, Cumbria. Okay, but that last bit is hardly definitive. Plenty of slaveholders had multiple streets named after them here in the US. I think the important bit is "Pennies Lane," though. Even with the very fluid spellings of the 18th and early 19th century, you'd think that it would have shown up as Penny's Lane or at least Pennys Lane, if it was connected to anyone with that surname. “Penny Lane about that time would have been a fairly rural country lane,” MacDonald says. “So that struck me. It would be very odd that a lane in the middle of the country would be named after somebody in the same way that prestigious streets in the town center would.” You know what would make sense for a rural country lane in 19th century England? If you had to pay a toll to traverse it. Tuppence, perhaps, or thruppence with inflation. Hence, "pennies lane." But did these boffins investigate that blindingly obvious (to me) possibility? Well, maybe. Maybe not. The article doesn't say. So don't go quoting me on the last paragraph; it's just as much a guess, albeit a harmless one, as the spurious James Penny connection. In any event, as the article eventually winds around to telling us, it seems that "no evidence of a connection" is sufficient to keep the iconic street name around... for now. These controversies, however, have a way of coming out of remission later. If that's because some evidence is actually discovered, well, fine. But if it's just another urban legend, well, I trust the Liverpudlians will treat that with all the attention it deserves: about two pennies' worth. |
It's a promo on LitHub for a book, but whatever. We're all readers here. And it tracks with what I already knew. How the Banana Came To Be—And How It Could Disappear Emily Monosson on the History, Evolution, and Biological Enemies of a Staple Fruit Yesterday, I bought some bananas, and noticed that since the summer, they had doubled in price! ...from 12 cents each to 24. (This is one reason you have to beware of that sort of emotional language when you encounter it.) I still find it difficult to believe that a banana's actual worth, a couple thousand miles from where they're grown, is less than a quarter. I'm sure there are all sorts of subsidy shenanigans going on, and low-paid workers involved, but I can't be arsed to research it. Bananas are a fruit that unites the world. Not so sure about that, but as fruits go, they're definitely one of the easiest. Peel and eat, no messy juice, no seeds. I saw a video once where a creationist took that and ran with it as "proof" of intelligent design. Well, they were intelligently designed, all right: by intelligent humans, who genetically engineered a seed-infested fruit with very little meat and turned it into a snack as easy to eat as a Milky Way bar. Though there are thousands of varieties, most of us in the western world eat only one: the Cavendish. Or, as we tend to call them here in the US, "bananas." Of the twenty-two million tons of bananas exported to the United States, Europe, and Asia, most are grown in Latin America and the Caribbean. I still remember a brief tour of a banana plantation in Belize. But this article goes into a lot more detail around the growing and shipping process, which I'm not going to repeat here. The banana plant is easy to mistake for a tree, but it is the largest known herbaceous flowering plant. This is a distinction much like tomato being, botanically, a fruit. The plants grow, produce flower and fruit, and die. Ready-made metaphor there. The article also goes into the rise and fall of the previous kind of shipped banana, the Gros Michel (commonly translated to English as "Big Mike," but I prefer "Fat Mikey.") That's fairly widespread knowledge now, as are the whispers, going on for some years now, of the supposedly certain demise of the Cavendish. Hasn't happened yet, but with banana prices doubling here, maybe it's happening? What the article doesn't get into is whether there's a suitable, fungus-resistant successor to the Cavendish. We can be pretty clever with our bioengineering and selective breeding, so I expect they'll find a way. Otherwise, based on the numbers in this article, a whole lot of people are going to find themselves unemployed. And, worse, I'll have to find another easy fruit to like. And that, for me, holds no ap-peel. |
Having lived through the Great Coke Crisis of 1985, I'm still fascinated by how companies can sometimes shoot themselves in the foot. Coca-Cola, of course, survived their mistake (or perhaps it was a calculated conspiracy). But can we really blame marketing for taking the L out of Schlitz? Throughout the first half of the 20th century, the Milwaukee-based Joseph Schlitz Brewing Company held the gold crown as America’s largest brewer. Which glosses right over the dark, dreary days of Prohibition, a significant chunk of "the first half of the 20th century." Like other brewers, Schlitz endured through diversification. But that's irrelevant right now. Then a series of business decisions, including a disastrous ad campaign, dubbed the “Drink Schlitz or I’ll kill you” campaign, precipitated the downfall of America’s biggest beer brand. That campaign lasted less than three months, coinciding with a tumultuous year for consumerism in general (1977). It's said that "no publicity is bad publicity," but I'm not so sure about that; still, I'm pretty sure other "business decisions" were more proximate. Like how they stopped making beer and started making formulaic adjunct lager like their competition. By the late 1950s, Schlitz lost its top title to another quintessential American beer brand: Anheuser-Busch. Calling that "beer" is an insult to beer, much like calling Kraft Singles "cheese" is an insult to cheese. During the 1970s, in an attempt to cut production costs and keep up with growing demands, Schlitz’s owners decided to shorten the beer’s brewing time by implementing a process called “accelerated batch fermentation.” This, by itself, should have been enough to kill the brand. But, as with New Coke, Americans will put up with a lot of enshittification. They also opted to replace its malted barley with a cheaper ingredient, corn syrup, and began experimenting with the use of a silica gel to prevent haze once the beer was chilled. That first bit, there, was enough to make it not-beer. Not necessarily bad; lots of great beverages start with corn syrup. But not beer. And nowadays, of course, lots of people love hazy beers. Sales dropped as Schlitz’s customers grew frustrated with the brand and started returning cases of beer. As tasteless as many Americans are, you mess with their favorites at your own peril. (The silica gel thing resulted in a recall, incidentally; never a great thing for any business.) In an effort to stem its declining sales and improve its spiraling reputation, the company hired an ad agency, Leo Burnett & Co., to launch four television spots. Now, here's the thing: that could have worked. Marketing, done right, can sell anything, including pet rocks and bottled water. You can have the shittiest product in existence, and if you have the right marketing, it'll sell anyway. And truly great products, without marketing, can fade into oblivion. But by the time Schlitz did those ads, it had already become a terrible product. Schlitz closed its Milwaukee brewery in 1981. For context, home brewing was legalized in October of 1978, yet more proof that Jimmy Carter was a severely underrated President. That paved the way for smaller breweries to get some market niches. And Sierra Nevada opened in 1981, still producing craft beer today (I'm not a fan, but I can't argue that it's not real beer). Today, of course, is a wonderful time to be a beer snob. I'd call it a "Golden Age," but Golden is associated with Coors, and they suck too. 
But there are still beer marketing controversies, as I'm sure you remember from earlier this year. I mention these things sometimes, because even if you're not trying to market anything (a book, perhaps), we're all affected by marketing every day, and a little cynicism can't hurt. And maybe this will help keep you from getting stranded up Schlitz creek without a paddle. |
One of my favorite tropes is the mad scientist who, when no one believes them, tests the creation on themselves. In stories, that rarely ends well. In real life... well, it usually doesn't end well there, either. But here's an exception... sort of. The Inventor of Ibuprofen Tested the Drug on His Own Hangover Stewart Adams’ headache subsided—and his over-the-counter pain reliever became one of the world’s most popular medications A hangover cure—that is, a real one, not the folk remedies like "mix a raw egg with last week's coffee grinds" nonsense—is one of the most important inventions one can make. This is not just because it pisses off the puritans who get apoplectic when someone's not punished for having a good time, though that's a huge bonus. In retrospect, perhaps toasting the success of a new medication he helped invent with several shots of vodka in Moscow was not a good idea. Are you kidding me? That would be epically awesome! Well, not while there's a war going on over there, but any other time. English research scientist Stewart Adams was faced with the consequences of his actions: a serious hangover. But what if there don't have to be consequences? He reached for that new drug and swallowed a 600-milligram dose. For reference, most OTC ibuprofen comes in 200mg tablets, and the usual dose is 2 pills for 400mg. I've been prescribed as much as 800mg by dentists and the like. So this is not outside of the realm of sanity, mad scientist trope aside. Stewart Adams and his associate John Nicholson invented a pharmaceutical drug known as 2-(4-isobutylphenyl) propionic acid. It was later renamed ibuprofen and is now one of the world’s most popular nonsteroidal anti-inflammatory drugs (NSAIDs)... Sadly, my doctor insists I shouldn't be taking NSAIDs unless they're prescribed. She had no comment on my drinking, though. Stewart Adams began his career in pharmaceuticals at the young age of 16, when he started an apprenticeship at a drug store owned by Boots UK Limited, then known as Boots the Chemist. He went on to earn a degree in pharmacy at the University of Nottingham and then received his PhD in pharmacology at the University of Leeds. Looks like he was one of the most British people to ever Brit. Now if someone could explain to me why they call that other pain relief medicine acetaminophen here, but paracetamol over there, that'd be great. I can't be arsed to look it up. |
We're back to the solar system articles today. This is a hot one. The article's from 2016, but that's a bit over 10 days on Venus (...yes, I did the math; the article came out 10.1893 Venusian days ago). On the day that I was born—winter solstice, 1959—a headline in Life magazine proclaimed “Target Venus: There May be Life There!” In that era, we were trying to find the slightest hint of how life might exist on any other planet. It told of how scientists rode a balloon to an altitude of 80,000 feet to make telescope observations of Venus’s atmosphere, and how their discovery of water raised hopes that there could be living things there. For context, this was after Sputnik 1 but before humans put other humans into orbit. The very first thing that scientists discovered with a mission to another planet was that Venus was not at all the Earthly paradise that fiction and speculative science had portrayed. From what I can gather, the surface of Venus will kill you even quicker than vacuum will. As a possible home for alien life, it has been voted the planet least likely to succeed. You probably heard the breathless hype over the detection of phosphine in that planet's atmosphere a few years back, touted as a possible sign of some sort of life there. What you probably didn't hear was that the data didn't withstand scrutiny, and we're back to square one. Russian and American spacecraft also found hints that the primordial climate might have been wetter, cooler, and possibly even friendly to life. Which is a far cry from claiming that life still exists there today. For most of Earth’s history, Venus may have been the nearest habitable planet and possibly even home to a thriving biosphere. For billions of years, our solar system may have had two neighboring wet, geologically active, habitable rocky worlds. They may even have very occasionally exchanged life when meteors struck and catapulted shrapnel from one planet to the other. I just want to emphasize, again, that this doesn't imply sentient Venusians, now or in the past. In any case, the article is necessarily full of speculation such as this, because we just don't have enough data. Which is a shame, because it's often the closest planet to us. The Moon isn't technically a planet, Mars is sometimes closer to Earth than Venus, and, according to some calculations, the nearest planet on average is Mercury, which makes sense when you think about relative orbital speeds. What's not in dispute is that it's the planet with the closest orbit to ours. While the author bangs on about climate change and the greenhouse effect, I can excuse that in this case because he's actually a planetary scientist. Well, an astrobiologist, anyway. Close or not, Venus presents challenges Mars simply doesn't. The latter doesn't have much atmosphere, for instance, while the former has, arguably, too much of one. We can handle near-vacuum, but not corrosive acid at immense pressure. Not well, anyway. But we're trying, and we're clever and curious. We'll probably have more data in a few days. A few Venus days, anyway.
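For anyone who wants to check my math, here's a minimal sketch of it in Python. Caveats: I'm treating Venus's sidereal rotation period (about 243 Earth days) as one "Venusian day," and the dates below are illustrative guesses of mine, not the article's actual publication date.

```python
from datetime import date

# One "Venusian day" here = Venus's sidereal rotation period, ~243 Earth
# days. (Its solar day, ~117 Earth days, would roughly double the result.)
VENUS_SIDEREAL_DAY = 243.0  # in Earth days

# Illustrative dates; the article's exact publication date is my guess.
article_published = date(2016, 12, 20)
entry_written = date(2023, 10, 1)

elapsed_earth_days = (entry_written - article_published).days
print(f"{elapsed_earth_days / VENUS_SIDEREAL_DAY:.4f} Venusian days")
```

With those assumed dates, it prints 10.1893, which is at least the right ballpark; swap in the solar day instead and Venus gets even weirder. |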
No external link today, just some personal reflections on 19 years of Writing.Com. The problem with having an account creation anniversary today is twofold: One, it falls close on the heels of WDC's Birthday Week celebration. On this site, that's rather like being born just after Christmas in the US: everyone's burned out on celebratory cheer, and ready to go back to being their usual grumpy selves. Or, possibly, gearing up for NaNoWriMo (shameless plug for "October Novel Prep Challenge" [13+], which is taking signups and looking for more Contest Round judges). And two, it coincides with another anniversary. That other anniversary was very fresh in everyone's minds back in 2004, when I signed up here. This year, I haven't seen any mention of it at all, and I follow the news cycle pretty damn well for someone who shuns popular social media (RSS is still around, and it's easy to set up your feed the way you want it; it's just not as easy as getting it spoon-fed by the algorithms on Xitter and the like). But it wasn't supposed to be September 11; it was supposed to be the 10th. As I recall, I had some problems getting registered way back then, and it wasn't until about 2am on a Saturday that it finally worked. That's also the reason for the annoying 02 at the end of my username; earlier attempts to register as just cathartes failed, but locked out that option. I keep meaning to change my username, but after 19 years, it seems like that would cause more problems than it would fix. I've told that story before, but who has read all the entries here? I sure haven't. For the curious, I'm pretty sure this was the first thing I posted in my port: "Ghost Poem #1" [ASR] Considering my evolution since then, it's strange that my first bit was a poem, and a serious one at that. I still do them occasionally, but I'm more tuned to comedy these days. At least I'm not embarrassed by that poem, which is older even than my account. Not embarrassed much, anyway. I barely remember last week, let alone 19 years ago, and I don't know a way to view one's portfolio in chronological order. So I could be wrong about that being my first item. Doesn't matter; close enough, as it was the same day I joined. That, of course, made me a Registered Author. I don't remember when I became Preferred. I also don't remember the exact date I got the blue case, but it was sometime in August of 2006... probably. Like I said, my memory sucks. I still think of myself as one of the newer moderators, when that's objectively not the case (pun intended). And I started this blog the following New Year's Day. I took a years-long hiatus from it, but decided to keep going with the same item rather than start a new one, mostly because come on, the name is awesome. Unfortunately, it's now almost full. Around 95% by byte size, and 82% by number of entries, not counting this one. I doubt it'll last another year at the rate I post (daily, relatively long entries), so this might be my last anniversary retrospective here. I won't delete entries, and it's unclear to me whether I'd get more space in the blog if I switched to a Premium Plus membership. I'll design that bridge when I get to the chasm. Right now, just let me end by thanking everyone who has left me anniversary notes and done Anniversary Reviews, including one of that ancient poem of mine I linked above. This is not me begging for more, only acknowledging those who have done these things. 
It's been a great community, and I've made a lot of friends I never would have otherwise met, both online and in person. You know who you are. While I doubt I'll be around for another 19 years, as I'm pretty sure I'm not an immortal, I'm not planning on going anywhere anytime soon. So you're stuck with me... unless, of course, you put me on ignore. |