Items to fit into your overhead compartment |
As a (mostly) solo traveler myself, this article from Business Insider caught my attention. Okay. Mostly I wonder why they bothered to publish this. Is it some sneaky pronatalist propaganda? Shill for the travel industry? Just a way to get eyeballs on the site? For most of my 20s, travel was my whole personality. Huh. Most of us had "struggle to find an entry-level position and not get laid off" personalities in our 20s. So, when I started feeling a little stuck in the summer of last year at almost 29 years old, I did what had always worked before: I packed a bag, booked a one-way ticket, and left. Oh, no. The horror of turning 29. But then, one afternoon, hiking through the jungle, watching scarlet macaws flash across the sky, I felt it: nothing. No awe, no wonder, just a dull, creeping awareness that I'd seen this all before, that I could be anywhere, that none of it was touching me the way it used to. This Just In: people change as they get older. It's not always about "growing up." It's not about "putting away childish things." It's just change. I'm certainly not the person I was in my 20s, and while I can't point to a certain event and say "This was the watershed moment, the point at which my tastes changed," it happened. Perhaps gradually. Now, travel just felt like I was running away. I wasn't discovering new things about myself. I wasn't growing. I wasn't even particularly interested in where I was. Okay, well, I'll give her points for recognizing this and not holding on to old habits just because they're old habits. When I came back to the US, I expected to feel relief. Instead, I felt restless in a way that travel couldn't fix. Yeah, that's what happens when you've changed and you haven't yet figured out what you want to do next. A deeply meaningful life isn't found in constant movement; it's built over time. It's in the friendships that deepen over years, not days. The sense of belonging that grows from showing up again and again. The purpose that comes from committing to something, even when it's not thrilling every moment. Perhaps the problem is looking for meaning when there isn't any. But really... this is not some grand revelation. This is, again, an age thing. It hits some people earlier or later than others, but eventually, I think, most people get there. Travel will always be a part of my life, but I no longer see it as the answer to everything. That's because there is no one answer to everything. No, not even religion. I, too, enjoy travel, but I don't see it as some grand solution to all of life's problems. It's just nice to get out and do something different every now and then. If travel is the only thing you do, the "something different" may be settling down, as it was with this author. When I was a kid, there was a house on my road with a shingle outside proclaiming its name: "Journey's End." I didn't understand that as a kid. I think I do now. Please don't think I'm ragging on this chick. I only question why BI decided to publish this particular piece, which seems more like a blog entry than an opinion piece for a magazine (not that there's anything wrong with blog entries, either). I can't help but think it's some sort of propaganda, but I might be paranoid about that. |
From Popular Mechanics, some science reporting that I'm not going to get too skeptical about this time, promise.

Scientists Found Evidence of a Megaflood that Shaped Earth’s Geologic History

The flood may have refilled the entire Mediterranean Basin in just two years.

Not that it shouldn't be approached with a level of skepticism; it's just that I don't know enough about the subject to know what questions to ask. However, I do question the headline: certain people see that headline and immediately think of one particular story involving rain, animals, and an ark. Hopefully the subhead is enough to disabuse one of any such notions, not to mention the article itself.

Ages, epochs, periods, and even eras are often defined by some sort of geologic trauma. The Chicxulub asteroid, for example, pushed the Earth into the Cenozoic Era, and 65 million years later, experts are pondering if we’ve entered a new geologic age induced by modern humans (and their predilection for greenhouse gasses).

If you ever look at a geologic time scale (here's one from Wiki), you can see how all those ages, epochs, periods, and eras nest inside one another.

As for the "new geologic age induced by modern humans," I don't know for sure, but I thought they discarded the concept of the Anthropocene. Of course, "they" aren't a monolith and there might still be debate.

Around 6 million years ago, between the Miocene and Pliocene epochs—or more specifically, the Messinian and Zanclean ages—the Mediterranean Sea was cut off from the Atlantic Ocean and formed a vast, desiccated salt plain between the European and African continents.

If there's no ocean or sea between the continents, are they separate continents? By ancient convention, Europe and Asia are considered different continents, so I suppose so.

Until, that is, this roughly 600,000-year-long period known as the Messinian Salinity Crisis suddenly came to an end.

Messinian Salinity Crisis would make an excellent name for a 70s prog-rock band.

At first, scientists believed that the water’s return to the Mediterranean took roughly 10,000 years.

I have a bit of an objection to this wording. It's not like scientists took it on faith; there was evidence. It's entirely possible that the evidence was misinterpreted, but, as this article shows, scientists change their views when new or reinterpreted evidence shows up.

But the discovery of erosion channels stretching from the Gulf of Cadiz to the Alboran Sea in 2009 challenged this idea, suggesting instead that a powerful megaflood may have refilled the Mediterranean Basin in as little as two to 16 years.

Other than wondering why the author didn't just say "Straits of Gibraltar," which is probably better-known globally than "Alboran Sea" and "Gulf of Cadiz," there's a really, really big difference between 10,000 years and something on the order of a decade. Specifically, about three orders of magnitude (a factor of roughly a thousand). Quite a few discoveries move whatever needle by a tiny amount, like if there's evidence that the Sun is 5 billion years old but new evidence comes in that suggests 5.1 billion (I'm not saying this happened, just an example my head came up with). But this difference is a major shift. So I'd be looking for lots of evidence to back it up. Extraordinary claims require extraordinary evidence, and I call a three-orders-of-magnitude change extraordinary. But, again, I'm not saying it's not true; I just don't know much about this subject.
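Just to put numbers on how big a revision that is, here's a minimal back-of-the-envelope sketch. It assumes a Mediterranean volume of about 3.75 million cubic kilometers (a commonly cited ballpark figure I'm supplying myself) and pretends the basin refilled from empty at a constant average rate, which is obviously a simplification:

```python
import math

# Back-of-the-envelope check on the Zanclean megaflood numbers.
# Assumption (mine, not the article's): Mediterranean volume of ~3.75 million km^3,
# refilled from empty at a constant average rate.
MED_VOLUME_M3 = 3.75e6 * 1e9      # 3.75 million km^3, expressed in m^3
SECONDS_PER_YEAR = 3.156e7        # ~365.25 days

OLD_ESTIMATE_YEARS = 10_000       # the earlier ~10,000-year refill estimate
NEW_ESTIMATE_YEARS = (2, 16)      # the megaflood hypothesis: 2 to 16 years

for years in NEW_ESTIMATE_YEARS:
    ratio = OLD_ESTIMATE_YEARS / years
    avg_discharge = MED_VOLUME_M3 / (years * SECONDS_PER_YEAR)  # m^3 per second
    print(f"{years:>2}-year refill: {ratio:,.0f}x faster than the old estimate "
          f"(~{math.log10(ratio):.1f} orders of magnitude), "
          f"average inflow ~{avg_discharge:.1e} m^3/s")
```

So the new estimate is a factor of several hundred to a few thousand faster (call it roughly three orders of magnitude), and the implied average inflow works out to tens of millions of cubic meters per second, which at least lands in the same ballpark as the discharge figures quoted next.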
That likely means this flooding event—now known as the Zanclean megaflood—featured discharge rates of roughly 68 to 100 Sverdrups (one Sverdrup equals one million cubic meters per second).

Case in point: I'd never heard of the Sverdrup. So of course I looked it Sverdr-up.

It shouldn't be surprising that they came up with a larger unit. This is analogous to how star masses are reported in terms of solar masses, or interstellar distances in light-years or parsecs. It keeps us from dealing mathematically with huge numbers, like billions or trillions, or having to use exponents. At any rate (that's a pun there), even if the numbers (68 to 100 in this case) are comprehensible, the amount of water flow is almost certainly not.

The article goes into a discussion of the evidence that led to this extraordinary conclusion. I don't know enough to say whether it's compelling or not, but I did find it an interesting read. But then:

This model shows that flooding could have reached speeds of 72 miles per hour, carving deep channels as observed in the seismic data.

Look, I get using nonstandard units to make enormous quantities somewhat manageable in calculations, but switching from metric/SI to "miles per hour?" That, I cannot abide. Pick one. (It's about 115 km/h.)

Now, let's see if I can find a lead singer for Messinian Salinity Crisis. And some musicians. Because I have no talent, either. |
Once again, Mental Floss tackles the important topics. Like many kids, I found history classes boring. Later, history became a favorite topic. I often wondered why that's the case. Part of it is because kids lack context, I'm sure. But another part is that they never taught the history of pizza. The history of pizza is a large pie—half Margherita and half lies. If you have to order a half-and-half pizza, you have failed at diplomacy and compromise. The most famous story about its origins, in which the classic tri-color pie was created to honor Queen Margherita of Savoy, is a work of fiction. And yet, it's the first thing people hear, so they'll stick with the fictional version. U.S. soldiers did not fall in love with pizza en masse during their time fighting World War II and bring it back to the States. Pretty sure I've never heard that tale. And the pizza in New York is not good because of the magical tap water. That bit, I knew. The pizza there is good because it's New York pizza. While New York City tap water is remarkably good for drinking, it doesn't contribute much to the taste of New York's most perfect food. Nor does it do anything to improve the taste of beer from their local breweries. Let’s take a look at some iconic pizza styles... Some of which aren't pizza, but okay. In 2014, newly-elected New York mayor Bill DeBlasio set off a small international incident when he was photographed eating his pizza with a knife and fork... So, was the then-mayor wrong? Right? Obviously, he was wrong, as he's a politician. The answer is both, and that’s because pizza is at once internationally recognizable and completely regional. That’s why some people look at a Hawaiian pie and see the greatest crime ever committed to leavened bread and others see a beautiful story about immigration, intercultural influence, and innovation (or, at least, lunch). The only thing I love more than watching Chicago vs. New York pizza arguments is watching the pineapple-on-pizza arguments. Well, actually, I love pizza more than any argument, but they still amuse me. The article goes into the Margherita thing, then: According to food historian Tommaso Esposito, up until the mid-20th century, pizzas were usually ordered by simply listing the ingredients you wanted on top. Esposito wrote a book all about pizza songs (yes, that’s a thing) from the 16th century up until 1966 and found that none of the songs mentioned specific pizza types by name. Hey, I still order by listing the ingredients I want on top. Also, how come I don't know any pizza songs? Neither of those two famous Neapolitan pie varieties would have been possible without tomatoes. And I'm glad the article acknowledges this. While something resembling pizza undoubtedly existed long before tomatoes were brought over from the Americas (I've seen histories tracing it back to classical Rome), it took the nightshade fruit to really make pizza what it's recognizable as today. When we think of pizza today, tomatoes—a crop the Aztecs had introduced to the Spanish—often seem like an essential ingredient. That's a kind way of putting "the Spanish stole tomatoes from the Aztecs." The Oxford English Dictionary, in fact, defines pizza as a dough “baked with a topping of tomatoes, cheese, and any of various other ingredients.” I don't accept dictionary arguments, but this one reflects common usage. Anyone who’s ever had a white pie might blanche at that definition. Ha ha. I see what you did there. 
There’s a written record from Gaeta, about 60 miles up the coast from Naples, dating back to the end of the 1st millennium CE. It lays out an agreement in which someone owes a local bishop 12 pizzas every Christmas and Easter Sunday. As the article notes, this was in the B.T.E. epoch (Before Tomatoes in Europe). We don’t have any way to know exactly what that proto-pizza looked or tasted like, but consider what the simplest version of a pre-Columbian-Exchange pizza might entail: a simple Mediterranean flatbread. Kind of like … a pita. Now here's where the article gets into that linguistic parallelism, something I've wondered about often myself, but never cared enough to look up. Plenty of sources think this is no accident, and draw a linguistic line straight from pita to pizza. That’s not the only possible etymology for the word, though. There's one important difference between pita and pizza, though: the former is generally baked on its own, while pizza dough is topped and then baked. Now, I've had things called "pizza" which feature pita or naan or other flatbread, pre-baked, topped with traditional pizza toppings (tomatoes, mozzarella, pepperoni) and then baked again, but I've always thought it's not true pizza. It can be good, though. If we define pizza as a flatbread with toppings, we can imagine it being “invented” more or less independently by the Ancient Greeks, Egyptians, and Natufians (from modern-day Jordan, who were apparently making bread more than 14,000 years ago). Yes, putting stuff on bread is as old as civilization, I can accept that. I can also easily see someone putting another hunk of flatbread on top, so I've never truly accepted the "Earl of Sandwich" story for eating something between two pieces of bread. The article backs me up on this, too: The idea of putting something delicious inside a pizza-like bread likely dates back thousands of years. They talk about figs as the "something delicious" before going on with: Eventually, pizza with figs became popular beyond those who ate it out of economic necessity. Wealthier eaters embellished the simple dish with prosciutto, creating a new variation that harkens back to pizza’s historical roots and remains popular today. This parallels the history of a lot of cheap eats. You take what's available in an area, and it feeds the masses. Then, later, it becomes a gourmet delicacy. Hell, France made basically a national cuisine out of that idea. Snails and frog legs, anyone? The Hawaiian pie was invented in 1962, according to most accounts, by Sam Panopoulos, a restaurateur living in Ontario. Sam was originally from Greece, and the boat he left on stopped, fortuitously, in Naples, where he first became acquainted with pizza. Unlike the murky origins of pizza itself, that story checks out. I like it because it's international: Greek, Italian, Canadian, Polynesian, American. The article also discusses other styles of pizza, like Detroit and Chicago, which I don't consider pizza. Again, though, it can be good. A different approach to that same long cook time may have given us Ohio Valley-style pizza. One of its defining features is the last-minute additions of cold toppings, including cheese. Unlike some other regional pizzas, Ohio Valley style tends to stay in the Ohio Valley. There's a lot more at the link. 
I won't belabor it further, except to say that regardless of categorization arguments, I only have one piece of advice about pizza, or pizza-adjacent concoctions: if you like it, eat it, and don't listen to those of us who need to be purists or pedants. |
A few days ago, I shared an article about how to tell if someone is rich. This one's like that, only it's about smart. From Upworthy:

How do you know someone is very smart? Here are 15 'subtle signs' others notice.

"You can understand both sides of an issue and still think one is wrong."

It's probably a lot easier to tell if someone's stupid. That's easy: they are. Everyone is stupid; even, sometimes, very smart people.

A Redditor named Occyz wanted to know how people tell the difference by asking them to share the “subtle” signs that someone is very intelligent.

Oh, great, an article that summarizes a Reddit thread. In other words, don't believe a word of it. (See? I is smart.)

A big takeaway is people think highly intelligent people are mentally flexible. They are always interested in learning more about a topic, open to changing their minds when they learn new information, and they're acutely aware of what they don’t know.

So, people of questionable intelligence, plus a bunch of AI bots.

In fact, according to the psychological principle known as the Dunning-Kruger effect, there is a big confidence chasm between highly intelligent people and those who are not. Low-IQ people often overestimate what they know about topics they need to familiarize themselves with. Conversely, people with high IQs underestimate their knowledge of subjects in which they are well-versed.

In fact, starting a paragraph with the words "in fact" does not, in fact, mean that what follows is fact.

Here are 15 “subtle” signs that someone is highly intelligent.

"They don't tell everyone how smart they are" seems to be missing from the list.

Incidentally, the article opens with a big picture of Steve Jobs. Now, there's no denying that Jobs was intelligent. He started a company with a couple of friends in a garage, and by the time he died, it was the most valuable company in the world (based on market capitalization). But he also eschewed evidence-based medicine, leading to quite possibly an early death. I'd argue that's not very smart. On the other hand, had he held out a little longer, Apple wouldn't have been the most valuable company in the world anymore, so maybe he was playing n-dimensional chess and winning? I don't know. Point is, smart isn't everything, just like money isn't everything. You can be smart and still a raging asshole, like Jobs reportedly was.

I won't bore everyone with comments on every single item in the article. Hopefully, the ones I mention here will be enough to get my point across.

1. They admit their mistakes

"When someone can admit a mistake and they know they don’t know everything."

This sounds more like learned behavior. It is a good trait to have in most situations, I think, but I can't say it correlates with general intelligence. There are a few on the list like this.

2. Great problem-solvers

On the other hand, this one strikes me as the actual definition of intelligence.

3. They appreciate nuance

"'I can hold two opposing ideas in my head at the same time.' Anyone who is willing to do that is intriguing to me."

I'd agree with that. I've said many times that life isn't binary; it's not all good/bad, black/white, whatever. I'm just not sure one has to be a genius to do it.

5. They have self-doubt

The great American poet and novelist Charles Bukowski once wrote, “The problem with the world is that the intelligent people are full of doubts and the stupid ones are full of confidence,” and according to science, he’s correct.
Yeah, well, Yeats wrote it first (I think): "The best lack all conviction, and the worst / Are full of passionate intensity."

9. They can simplify big ideas

Okay, but to me, that's less a marker of intelligence and more a sign of... I don't know. Empathy? What do you call wanting other people to understand something? And also of being so well-versed in the "big idea" that they can explain it to the uninitiated. Richard Feynman, who gets my vote for smartest dude of the 20th century (edging out the perennial icon Einstein), reportedly once said, "If I could explain it to the average person, it wouldn't have been worth the Nobel Prize." And yet, he spent a lot of time explaining stuff. I wish I could find out who said something like "If you really want to learn something, figure out how to explain it to a fourth-grader." I thought it was Feynman, but I'm having trouble finding the quote. If indeed it exists.

11. They're humble

"They don't continually need to tell people how intelligent they are."

Okay, so up there, where I said, '"They don't tell everyone how smart they are" seems to be missing from the list.'? I was wrong. See what I did there?

There are more in the article, as you might have inferred based on the number-skipping (and the fact that I told you I was going to skip some), because you're smart.

Now, just to be clear, I'm not saying these are bad things. Everything on that list is what I'd consider a desirable character trait, to one degree or another. I just question their correlation with what we call intelligence, which, as I noted above, is notoriously hard to quantify in general. Sure, there are IQ tests, but I don't think such tests measure all possible forms of intelligence. And, just to reiterate something I've said before, it's best not to conflate intelligence with knowledge. Someone who does well on trivia questions has a lot of stuff memorized, but that doesn't necessarily mean they can figure something out that's unfamiliar to them. It's like, I don't know, if you have the dictionary memorized, you'll be able to make more Scrabble words, but will you be able to place them on the optimal score-enhancing spaces? The former is knowledge; the latter may be intelligence.

In conclusion, there's a whole lot of other dimensions to a person than just "smart." Or how much money they have. Which also aren't necessarily correlated. I mean, everyone knows, or should know, that the only thing that matters is how attractive you are. |
I couldn't let this hate-review from SFGate go by without comment.

A stay at the decrepit tomb of what was once the Vegas Strip's coolest hotel

When it opened, Vegas had never seen anything like Luxor. Now, it's one of the most hated hotels on the Strip.

Who the hell wrote this? Someone working for the competition? There's a lot of competition there, but I'd suspect Harrah's (owner of Caesars Palace).

And within 10 minutes of my arriving at Luxor, it was clear why it’s one of the most reviled hotels in Las Vegas.

Really? Because within 10 minutes of me arriving there, I'm already relaxed and ready to gamble.

I pulled into the porte cochere shortly before noon and headed inside with my luggage in tow. Hoping to stow my bag while I explored the resort, I walked over to the bell services desk. The employee gestured for me to come closer, then angrily pointed behind me. “That’s the line,” she said. I turned to see a queue about 10 feet away. It extended all the way through the lobby to the casino floor.

Okay, a few things to unpack here. Let's start with the last bit. That makes it sound like the lines at Disney. This is bullshit. There's not much space between the front desk and the casino floor. Now, and here's the major, epic fail of this takedown piece: the author is channeling Yogi Berra here. "No one goes there anymore. It's too crowded." So now I begin to suspect that this writer, "mortified" (her own word) at her faux pas, simply got a bad first impression and then found everything she could to rag on.

Then, by some miracle, I got a text: My room was ready. I passed two broken moving walkways, a closed cafe and a long, blank wall lined with employee-only doors before finding the ancient-looking bank of elevators.

Yeah, I know that route. I also know the quicker, alternative route, which takes you through the casino floor. Had she gone that way, she might have written about how the hotel forces you through the noisy, flashy, money-sucking part of the first floor. As for broken walkways, yeah, that happens in an aging building. I've never seen the place not having some sort of construction going on. Finally, it's not like elevators were a thing in ancient Egypt. The least they can do is style them like older elevators.

When the first one opened, the electrical panel was exposed, wires spilling out. The doors shuddered shut, and the ascent began. Because Luxor is a pyramid, the elevators are more like funiculars, climbing sideways at a 39-degree angle.

Okay, okay, I'll grant that the exposed wires, which seem to be confirmed by a pic in the article, are a major fail on the part of maintenance and/or management. While there are laws about under-21s in casino areas in Vegas, plenty of families stay in the hotels. I'm not a big "think of the children" person, but kids do have a tendency to get curious about stuff like that.

The elevators rattle uncontrollably, shaking the occupants like a martini all the way up. They’re also incredibly slow. I was on the 21st floor, and it took over a minute to get there.

Waaah, they're slow. I think of them as Wonkavators. They are a bit rumbly and shaky, but that's part of their charm. As I put it to anyone in there with me (captive audience), "Hey, we came here to gamble, right?"

Things did not improve when I reached the room. As I closed the door behind me, I saw that there was no deadbolt, no bar lock, no privacy latch.
Okay, first of all, I've been in lots of hotels, from fleabags in rural Montana to the Ritz-Carlton in DC, so I don't recall specifically if the Luxor rooms lack those features. Seems to me they do have them, but it's possible that some rooms don't. Second, the pyramid is not the only place to stay there. They have two "tower" facilities with more traditional elevators and rooms without sloping walls. As I recall, you only pick the pyramid rooms by your own choice.

Does Luxor mistrust its guests so much that it doesn’t provide interior locks? I wondered how many times a day its staff had to force their way into rooms, and why.

Look, I'm no expert on the hotel industry, but management has ways to bypass those "security" features. People die in hotel rooms on a regular basis (not because they're in hotel rooms, but just because a lot of people stay in hotels and everyone dies at some point). Also, let's not forget that Luxor is immediately adjacent to, and for a long time shared an owner with, Mandalay Bay, and Mandalay Bay was where the infamous concert shooter stayed.

The dark exterior of Luxor made for a perpetual tint in the room, worsened by the fact that one of the windowpanes was crusted in desert dust. This is probably a great setup for someone with a blistering hangover, but it gave a depressing pallor to the space.

Counterpoint: "I couldn't sleep in because the room was too bright!"

There were two positives. One was the Wi-Fi, which was strong enough to seamlessly maintain a video call.

You're on the 21st floor of the pyramid, and you're trusting the hotel Wee-Fee over your phone's hotspot? Your priorities are backwards.

The other was the toilet, which flushed with the force of a cruise ship lavatory.

I'm glad she counts that as a positive, but the engineer in me wants to know how they get pressures like that at the top. Is there a hidden water tank at the tip of the pyramid? The tip which famously has a giant sun-bright spotlight pointing at the stars?

If you’re eating at the Luxor buffet, this is no doubt a hygienic necessity.

I don't get the love for buffets. I've never eaten at that one. I only eat at buffets when my friends pressure me into it. If there's one thing that Vegas doesn't lack, it's casinos. If there's another, it's restaurants, including ones where you don't have to do half the work.

The article goes into some of the property's history, which is interesting but somewhat irrelevant. Then:

Stripped of its novelty, though, the gloomy interior is now bare and brutalist.

You say that like it's a bad thing. It is not.

With limited food options at the hotel, I ate elsewhere for dinner.

I will grant that, compared to some other Vegas properties, the Luxor has fewer dining options. There's a food court for fast food, a breakfast/lunch diner style area, a deli, a couple of Starsuckses, a tequila bar with food, a sushi place (which is incidentally very good), and the aforementioned buffet. This is "limited" in Vegas, true, but when you consider that all you have to do is ride up an escalator to the passageway between Luxor and Mandalay Bay, which is a mall with various shopping options and, yes, many restaurants, this complaint falls short for me.

That night, afraid of falling asleep without a security lock, I dragged an armchair in front of my door. At $299.32 for two nights, it felt particularly absurd to be redesigning the room for safety.

I don't mean to be rude or anything (okay, I kinda do), but that exhibits a level of paranoia I just can't get behind.
Like I said, hotel staff can burst into a room at any time if they have to. And anyone who's not staff shouldn't have a key. Hell, those rickety gambling Wonkavators won't even take you to your floor if you don't use your room key (unless, I suppose, the panel's broken and the wiring's exposed, which, as I said, is one legitimate complaint). And, I might add: $300 for two nights? What the Egyptian Underworld? I've never paid more than $50 for a night, and it's usually even less because it's comped (yes, this means I spent more at the blackjack tables, but ignore that).

After a fitful night’s sleep, I stumbled down to the lobby Starbucks.

Which one? Seriously, the overabundance of Starsucks is my second-biggest problem with Luxor, after the really quite tiny and understaffed high-stakes table games room. Okay, no, third, after the high-stakes room and their deal with Pepsi (I'm a die-hard Coke guy).

Now, look, I know tastes are different. You want high-end? Plenty of other options in Vegas. You want real cheap? Those options exist, too, usually without the shows and casinos. Luxor may not be "cool," but it's cheap (this author got price-gouged, sorry) and the beds are comfortable, especially if you stay in one of the towers instead of the pointy thing.

Las Vegas properties have a relatively short half-life. Luxor has already passed that point. I fully expect it to go the way of Golden Nugget and other casinos that were the Vegas version of historical-register buildings. Meanwhile, though, I wasn't about to let this absolute hit-piece stand without comment. |
As I've noted before, I try to be skeptical of articles that confirm what I believe. Like this one from The Guardian.

Night owls’ cognitive function ‘superior’ to early risers, study suggests

Research on 26,000 people found those who stay up late scored better on intelligence, reasoning and memory tests

One wonders if the study was conducted by night owls.

The idea that night owls who don’t go to bed until the early hours struggle to get anything done during the day may have to be revised.

Eh, getting anything done is overrated.

It turns out that staying up late could be good for our brain power as research suggests that people who identify as night owls could be sharper than those who go to bed early.

We're also funnier, better looking, and richer. Seriously, though, the first thing I had to ask myself was this: Are we smarter because we stay up later, or do we stay up later because we're smarter? Or is there some factor that contributes to both, like, maybe, a willingness to go against the grain of society and do one's own thing, regardless of the schedule imposed upon us by cultural pressure? Or, and I'm still being serious for once, do larks as a group score lower on these traits because some of them are actually owls who were pressured into their schedule by relentless society?

Researchers led by academics at Imperial College London studied data from the UK Biobank study on more than 26,000 people who had completed intelligence, reasoning, reaction time and memory tests. They then examined how participants’ sleep duration, quality, and chronotype (which determines what time of day we feel most alert and productive) affected brain performance.

Well, now, they could have said up front that sleep duration and quality were also being considered as factors. I think it's pretty well-established that people who get a good and full night's sleep (whether it takes place technically at "night" or not) tend to do better with things like memory and reaction time. From a purely speculative viewpoint, this brings me back to wondering if some larks aren't getting decent sleep because they should be owls. I can't think of a mechanism by which merely shifting one's sleep hours could help with cognition, unless one's sleep hours already should be other than what they are. In other words, I'd expect to see the reverse result in such a study if it were generally larks being forced into night owl mode, rather than the reality of the other way around. I imagine we could get some data on that if they just studied people like late-shift workers or bartenders, people who need to follow an owl schedule even if their chronotype is more lark.

Going to bed late is strongly associated with creative types. Artists, authors and musicians known to be night owls include Henri de Toulouse-Lautrec, James Joyce, Kanye West and Lady Gaga.

I also imagine way more musicians are owls just because they, too, can be forced into a stay-up-late schedule for work, whatever their natural chronotype. For writers, it's a different story (pun intended), because creative writers, at least, often set their own schedules. At any rate, I'm glad the article uses "strongly associated with" instead of implying causation in either direction.

...the study found that sleep duration is important for brain function, with those getting between seven and nine hours of shut-eye each night performing best in cognitive tests.

Which I was speculating about just a few minutes ago.

But some experts urged caution in interpreting the findings.
Jacqui Hanley, head of research funding at Alzheimer’s Research UK, said: “Without a detailed picture of what is going on in the brain, we don’t know if being a ‘morning’ or ‘evening’ person affects memory and thinking, or if a decline in cognition is causing changes to sleeping patterns.”

Fair point, so my skepticism here is warranted for reasons I didn't even think of.

Jessica Chelekis, a senior lecturer in sustainability global value chains and sleep expert at Brunel University London, said there were “important limitations” to the study as the research did not account for education attainment, or include the time of day the cognitive tests were conducted in the results.

Hang on while I try to interpret "sustainability global value chains," which sounds to me more like a bunch of corporate buzzwords strung together haphazardly. Regardless of the value, or lack thereof, of that word salad, her note about limitations is important to account for.

The main value of the study was challenging stereotypes around sleep, she added.

And I think that's valid (maybe not "the main" but at least "a" value), because us owls are generally seen as lazy and unproductive. Well, okay, I am lazy and unproductive, but that doesn't mean I'm not an outlier. |
This article's a few years old, and it's from PC Gamer, a source I don't think I've ever quoted before. No, I don't follow them, even though I am a... wait for it... PC gamer. But this one's not about gaming.

Wi-Fi is something most of us use every day. It's a miraculous technology that allows us to communicate and share large amounts of digital information to multiple devices without the use of cables.

The great big machine that went BING and fixed my heart problem, that was miraculous technology. Wi-Fi? Just technology.

But what does it mean?

I know I do philosophy in here from time to time, but "what does it mean" is just too big a ques- Oh, you mean, what does "Wi-Fi" mean.

Wireless Fidelity? Wrong. Wireless Finder? Nope. Withering Fireballs? Not even close, my friend.

From now on, in my house, it's Withering Fireballs.

According to MIC...

So here I am, quoting an article that quotes an article that quotes another (20-year-old) article. Sure, I could have just gone to the original source, but where's the fun in that? Then I wouldn't have been able to make jokes about Withering Fireballs. Here's my take: it means what it means. Every word has a meaning, except maybe for "meaningless."

Rather, Wi-Fi was a name settled on between a group now known as the Wi-Fi Alliance and some brand consultants from Interbrand agency.

"Now known as?" One wonders what they were known as before they invented the term Wi-Fi. Let's look it up, shall we?

"In 1999, pioneers of a new, higher-speed variant endorsed the IEEE 802.11b specification to form the Wireless Ethernet Compatibility Alliance (WECA)."

WECA, now, that's a meaningless acronym because they're not called that anymore. I know a few people in Wicca, but that's a different thing.

Ten names were proposed by the brand agency, and in the end the group settled on Wi-Fi, despite the emptiness the name holds.

"Despite?" I'd have guessed "because of." You may not want your brand to connote other meanings. It can lead to confusion. Different story, but that's kind of what happened with .gif. The creator of the Graphics Interchange Format went to his grave insisting that it's pronounced with a soft g, and he was wrong. We're still arguing about it to this day, and .gifs are older than Wi-Fi.

"So we compromised and agreed to include the tag line 'The Standard for Wireless Fidelity' along with the name. This was a mistake and only served to confuse people and dilute the brand."

Like I said.

A word that many of us say potentially several times a day is actually straight up marketing nonsense.

Fun fact: in French, it's pronounced "wee-fee," which I find highly amusing. No relation to "oui." At any rate, every word is made up. Some were made up more recently than others, is all. Some get passed around for a while and then fall out of favor, while others become Official Scrabble Words or whatever (I wonder if I'd get dinged for using "yeet" on a Scrabble board.) Perhaps sometime in the future, a newer technology will replace what we know today as Wi-Fi. They'll try to give it a different name. We'll just keep calling it Wi-Fi. Maybe we'll even drop the hyphen, which seems to be the pattern for lots of made-up words. And the French will go on pronouncing it differently. |
Today's article, from Nautilus, is even older than most that grab my attention: first published in, apparently, 2013. That's ancient by internet standards. Well, technically, each species is unique in its own way. But it's unsurprising that humans would be most interested in the uniquity of humans. (I just made that word up, and I like it.) If you dropped a dozen human toddlers on a beautiful Polynesian island with shelter and enough to eat, but no computers, no cell phones, and no metal tools, would they grow up to be like humans we recognize or like other primates? That's a lot of restrictions for one experiment. How about we just drop them off on the island? (Ethics bars the toddler test.) Annoying. Neuroscientists, geneticists, and anthropologists have all given the question of human uniqueness a go, seeking special brain regions, unique genes, and human-specific behaviors, and, instead, finding more evidence for common threads across species. And yet, evidently, there is something that makes humans different from nonhumans. Not necessarily better, mind you. But if there weren't a unique combination of traits that separates a human from a chimpanzee, or a mushroom from a slime mold, we wouldn't put them in different conceptual boxes. Meanwhile, the organization of the human brain turns out to be far more complex than many anticipated; almost anything you might have read about brain organization a couple decades ago turns out to be radically oversimplified. And this is why the date of the article matters: in the twelve years since it came out, I'm pretty confident that even more stuff got learned about the human brain. To add to the challenge, brain regions don’t wear name tags (“Hello, I am Broca”), and instead their nature and boundaries must be deduced based on a host of factors such as physical landmarks (such as the hills and valleys of folded cortical tissue), the shapes of their neurons, and the ways in which they respond to different chemical stains. Even with the most advanced technologies, it’s a tough business, sort of like trying to tell whether you are in Baltimore or Philadelphia by looking out the window of a moving train. Yeah, you need to smell the city to know the difference. Even under a microscope human brain tissue looks an awful lot like primate brain tissue. That's because we are primates. When we look at our genomes, the situation is no different. Back in the early 1970s, Mary-Claire King discovered that if you compared human and chimpanzee DNA, they were so similar that they must have been nearly identical to begin with. Now that our genomes have actually been sequenced, we know that King, who worked without the benefit of modern genomic equipment, was essentially right. "Must have been nearly identical to begin with." Congratulations, you just figured out how evolution proceeds. Why, if our lives are so different, is our biology so similar? The first part of the answer is obvious: human beings and chimpanzees diverged from a common ancestor only 4 to 7 million years ago. Every bit of long evolutionary history before then—150 million previous years or so as mammals, a few billion as single-celled organisms—is shared. Which is one reason I rag on evolutionary psychology all the time. Not the only reason, but one of them. Lots of our traits were developed long before we were "us," and even before we diverged from chimps. 
If it seems like scientists trying to find the basis of human uniqueness in the brain are looking for a neural needle in a haystack, it’s because they are. Whatever makes us different is built on the bedrock of a billion years of common ancestry. And yet, we are different. I look at it like this: Scotch is primarily water and ethanol. So is rum, gin, vodka, tequila, other whisk(e)ys, etc. But scotch is unique because of the tiny little molecules left after distillation, plus the other tiny little molecules imbued into it by casking and aging. This doesn't make scotch better or superior to other distilled liquors, but it does make it recognizable as such. (I mean, I think it's superior, but I accept that others have different opinions.) I was unable to find, with a quick internet search, the chemical breakdown of any particular scotch, but, just as I'm different from you, a Bunnahabhain is different from a Glenfiddich, and people like me can tell the difference—even though the percentages of these more complicated chemicals are very, very small. Point is, it doesn't take much. But trying to find this "needle in a haystack" (how come no one ever thinks to bring a powerful electromagnet?) might be missing the point. And yes, that pun was absolutely, positively, incontrovertibly intended. Humans will never abandon the quest to prove that they are special. We've fucking sent robots to explore Mars. I say that's proof enough. But again, "special" doesn't mean "superior." Hell, sometimes it means "slow." |
Here's a relatively short one (for once) from aeon. It's a few years old, but given the subject, that hardly matters. The headline asserts that believing without evidence is always morally wrong.

And right off the bat, we're getting off to a bad start. Proclaiming that something is "always" (or "never") something just begs someone to find the one counterexample that destroys the argument. In this case, that someone is me.

You have probably never heard of William Kingdon Clifford. He is not in the pantheon of great philosophers – perhaps because his life was cut short at the age of 33 – but I cannot think of anyone whose ideas are more relevant for our interconnected, AI-driven, digital age.

33? That's barely old enough to have grown a beard, which is a prerequisite for male philosophers. Or at least a mustache.

However, reality has caught up with Clifford. His once seemingly exaggerated claim that ‘it is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence’ is no longer hyperbole but a technical reality.

I'll note that this quote is not the same thing as what the headline stated. I guess it's pretty close, but there's a world of difference between "without evidence" and "upon insufficient evidence." There is, for example, no evidence for a flat Earth beyond the direct evidence of one's senses (assuming one is in Kansas or some other famously non-hilly location), and overwhelming evidence that the Earth is basically round. Okay, not a great example, because flat-Earth believers can be shown to be wrong. But morally wrong? I'm not so sure. I hold the belief, for a better example, that murder is wrong. There's no objective evidence for this, and moreover, we can argue about what constitutes "murder" as opposed to other kinds of killing, such as assisted suicide or self-defense. And yet, it seems to me that believing that murder is wrong is, on balance, a good thing for people's continued survival, and thus morally right.

His first argument starts with the simple observation that our beliefs influence our actions.

Okay, that seems self-evident enough. The article provides examples, both practical and ethical.

The second argument Clifford provides to back his claim that it is always wrong to believe on insufficient evidence is that poor practices of belief-formation turn us into careless, credulous believers. Clifford puts it nicely: ‘No real belief, however trifling and fragmentary it may seem, is ever truly insignificant; it prepares us to receive more of its like, confirms those which resembled it before, and weakens others; and so gradually it lays a stealthy train in our inmost thoughts, which may someday explode into overt action, and leave its stamp upon our character.’

I've heard variations on this argument before, and it does seem to me to have merit. Once you believe one conspiracy theory, you're primed to believe more. If you accept the concept of alien visitations, you can maybe more easily accept mind-control or vampires. That sort of thing.

Clifford’s third and final argument as to why believing without evidence is morally wrong is that, in our capacity as communicators of belief, we have the moral responsibility not to pollute the well of collective knowledge.

And that's fair enough, too. So why do I object to the absolutist stance that it's always wrong to believe on insufficient evidence? Well, like I said up there, I can come up with things that have to be believed on scant-to-no evidence and yet are widely considered "moral." The wrongness of murder is one of those things.
That we shouldn't be doing human trials for the pursuit of science without informed consent and other guardrails. That slavery is a bad thing. And more. I'm not even sure we can justify most morality on the basis of evidence (religious texts are not evidence for some objective morality; they're just evidence that someone wrote them at some point), so to say that belief on the basis of insufficient evidence is morally wrong (whether always or sometimes) itself has little evidence to support it. You have to start by defining what's morally right and wrong, or you just talk yourself in circles.

While Clifford’s final argument rings true, it again seems exaggerated to claim that every little false belief we harbour is a moral affront to common knowledge. Yet reality, once more, is aligning with Clifford, and his words seem prophetic. Today, we truly have a global reservoir of belief into which all of our commitments are being painstakingly added: it’s called Big Data.

Again, though, that's a matter of scale. People have held others to certain standards since prehistory; in the past, this was a small-community thing instead of a global surveillance network. None of this is meant to imply that we should accept the spread of falsehoods. The problem is that one person's falsehood can be another's basic truth. That makes it even more difficult to separate the truth from the lies, or even to accept the reality of certain facts. Yes, having evidence to support one's beliefs is a good thing overall. But we're going to end up arguing over what constitutes evidence. |