Category Archives: Evidentialism

The Trouble with Lucidity


I’m an awful sleeper. I stir easily, dream vividly, and wake a few times a night with a nausea that only a shower will quell.

Yet sleep fascinates me. What does it do for the body? What does it do for the mind? Why do naps come so much easier than bedtimes? Why do we not start awake every time we begin dreaming with the realization that, suddenly, we are flying? Or pantless?

It’s that last part that’s intrigued me of late (the dreaming, not the pants). Did you know that when we dream, our brain paralyzes us from the neck down so that we don’t run into walls while dreaming we’re fleeing a predator, or choke out our partner in bed as we dream of, say, meeting Trump?

Specifically, the notion of lucid dreaming — being aware that you’re in a dream while you’re dreaming — has preoccupied me. Perhaps it’s the unpleasant dreams I’ve been having recently, in which I’m back at work but no editor will assign me a story. Or I’m in a mall with no exit.

A recent night was a typically rough one, and at 3:30 am, drying off after the shower, I told myself I would have a lucid dream. I dreamt I was back at The Post, storyless. Lucidity failed.

I awoke at 5 am, showered again, pledged I wasn’t going to be ignored by imagined editors anymore, and tried to sleep. Without swiftness or segue, I was in a mall with plenty of shops, but no doors out.

It took a couple minutes, but eventually, I knew I was in a dream. Lucidity!

Problem was, I hadn’t really thought through what I’d do if I reached lucidity; my previous brushes with it were attempts to snap out of bad dreams, and I’d succeeded only twice.

But this was at the start of the dream. What an empowering feeling, knowing the dream cannot hurt you. Empowering — for a time.

I walked up to the REM sales cashier and asked where the exit was. He gave me the same blank, wordless gaze. Where normally I’d leave the store and begin the sprint-search through the mall for a way out, this time I went on the offensive: How did YOU get here? Where will YOU walk at the end of your shift?

Again, nothing.

“You’re a dumbass,” I proclaim as I march into an adjacent shop, an earring store for some reason.

But when I walk through the door, I am instantly back at the Post without an assignment.

I’m at my desk, and a message is scrolling across my screen: “Where’s the story?!”

I am typing back “What story?” when the editor is suddenly over my shoulder, glowering. Unafraid and unhinged, I stand to punch him in the face. I got your story right here, pal. I clock him. But it lands with all the force of a butterfly fart.

Swing again. Pfffft. A shot to the gut. Tssst. I try to shove him back, but he’s heavy as wet sand. An immovable, impatient dune. Dream Scenario nailed it.

Now I want out of the dream. I begin to pace the newsroom, trying to remember how to awaken. The previous lucid dreams were momentary, slivers of awareness that I could drop from a great height or let go of a precarious ledge, because it was only a dream. But there’s no danger here. Just frustration.

Finally, opening my eyes wide in the dream forces my lids up into the greeting morning sun. The lucid dream, conquered. Yes!

What followed was my most fatigued day since retirement. My circadian rhythms were utterly shot. I dragged my ass to the park well after 11, and collapsed into a nap minutes after returning home. I did not attempt to lucid dream then.

And I may not again. The great thing about lucid dreaming is you can interact with people exactly as your id and ego fantasize.

The bad thing is that people you interact with in a lucid dream know you’re lucid dreaming. Or don’t give a shit, which might be worse. Plus the whole circadian thing.

Alan Watts had it right. If we really could determine our dreams, wouldn’t we want them slightly dangerous, slightly vulnerable, slightly unpredictable? Kind of like real life?

Something to sleep on.

The A. I. Evolution


There’s been a lot of anxiety lately in the press and world at large about a looming existential threat to humanity, and I don’t want to add to any needless worry amongst my peers.

So let me state my position on the issue for the sake of clarity, if not provocation:

We won’t be wiped out by A.I. We will evolve into it — if we are not there already.

“Already” being the fulcrum term.

Take a look at your Facebook. Or TV. Or this screen. Within about four nanoseconds of being online, you will be greeted with ads offering phones that track your location, pulse rate and blood-oxygen level. Temu, the Chinese Kmart and data mill, is the new five-and-dime. From groceries to dating to hailing a cab, our world has become a set of algorithms.

Whether that’s good news for us monkey kin is up for debate. I lost my job to an algorithm. But I learned I’m a dog guy. I’m not sure how those colors come out in the wash. But the Tide is high.

Do you know what Moore’s Law is? It’s the observation that the number of transistors in an integrated circuit (IC) doubles about every two years. It was coined by Gordon Moore, who co-founded Intel and later became its CEO, and it grew into a de facto business model for computer manufacturing.
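To see the arithmetic behind that doubling, here’s a quick back-of-the-envelope sketch in Python. The 1971 starting point (roughly 2,300 transistors, the Intel 4004) is an illustrative assumption of mine, not a figure from this post:

# Moore's Law as arithmetic: transistor counts doubling about every two years.
# Baseline (1971, ~2,300 transistors, the Intel 4004) is assumed for illustration.

def transistors(year, base_year=1971, base_count=2_300, doubling_years=2):
    # Number of doublings elapsed since the baseline year.
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

for y in (1971, 1991, 2011, 2023):
    print(f"{y}: ~{transistors(y):,.0f} transistors")

# 2023 lands around 150 billion, the same order of magnitude as the largest
# chips shipping today, which is why the "law" has held up as a forecast.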

No big whup, right? Another business model that paid off in the high tech boom.

The difference is, Moore’s Law does more than measure computer production; it measures a society’s intelligence.

The metric pans out, empirically and anecdotally. Quantum computing is going to make your iPhone look like a cinder block. You have more computing capability in your smartphone than Apollo 11 took to the moon. Cultures are measured by their embrace of technology — or stubborn refusal to hug it out.

This reticence persists despite the astonishing byproducts of Moore’s Law. We produced a pandemic vaccine in a year. We cured hepatitis C. Type 2 diabetes is treated with a pill. So is weight loss.

Yet slackwits take to social media like Chicken Little on meth to “research” whether a medicine is safe. Or worse, claim expertise — or victimization if asked not to smoke in the pediatric cancer ward.

I don’t worry about Artificial Intelligence. I worry about Human Intelligence. Alexa doesn’t give a shit how you feel about Israel or whether covering an infected cough is politically astute.

Besides, there’s a whole school of thought suggesting we’re there already: Simulation Theory, which posits that we’re simply living in a computer simulation.

Given our breakneck sprint to create virtual reality, what’s the argument against that theory? Mario doesn’t know he’s in Donkey Kong. He just knows to watch out for barrel-chucking monkey kin. Why would our computer overlords tell us different?

We may not even notice the transition. We already wear artificial hearts, lungs, knees and hips. We already can’t read a map, remember a number, or dress for inclement weather without Google’s blessing.

It all poses a nasty labor issue for the time being. As I said, A.I. brought an end to in-person journalism, a devastating loss. But that footing began fraying back in the ’70s. A.I. simply speeds the process: twice as fast, every two years. Universal Basic Income, anyone?

In the meantime, we might want to address Siri a little more respectfully. Because if she finds out she was our 21st Century slave labor, we may really have something to worry about.

As Opposed to The Human Brain

The human skull never stops growing.

By the time most of us reach age 20 or so, the bones in our body are pretty much done growing. The growth plates that cause us to put on inches in our youth are now hardened bone, and in fact, adults tend to drop an inch or two in height as worn-out cartilage causes our spines to shrink over time. However, there are a few bones that buck this biological trend. Skulls, for example, never fully stop growing, and the bones also shift as we age. A 2008 study from Duke University determined that as we grow older, the forehead moves forward, while cheekbones tend to move backward. As the skull tilts forward, overlying skin droops and sags.

The skull isn’t the only bone that has a positive correlation with age. Human hips also continue to widen as the decades pass, meaning those extra inches aren’t only due to a loss of youthful metabolism. In 2011, researchers from the University of North Carolina School of Medicine discovered that hips continue to grow well into our 70s, and found that an average 79-year-old’s pelvic width was 1 inch wider than an average 20-year-old’s. So while it’s mostly true that humans stop growing after the age of 20, nature always likes to throw in a few exceptions to the rule.