Uncle Orson Reviews Everything
January 7, 2016
First appeared in print in The Rhino Times, Greensboro, NC.
Force Awakens, McKinley, Unforgettable
I finally overcame my disgust with people who were responding to Star Wars: The Force
Awakens as if it were a religious experience, a revelation direct from, not God, but the Star Wars
universe itself.
Mostly because our daughter assured us that it was, in fact, a very good movie.
Not just "better than the prequels," a huge category that includes old phone books, used diapers,
and the classic weird movie Pootie Tang, written and directed by Louis C.K. in 2001, and filmed
for a budget of about a buck fifty.
No, she said it was actually good.
And, after my wife and I joined her in watching it, we had to agree.
There was nothing actively embarrassing or shameful in it, so right there it became one of
the top three movies in the very uneven Star Wars franchise.
But even with Lawrence Kasdan as one of the writers, and J.J. Abrams as director, there was
something perfunctory about the first few scenes. It's as if they were trying to touch all the bases
from the original Star Wars (I loathe calling it "A New Hope." It was released as Star Wars,
period, and since I prefer to ignore the prequels, I see no reason not to call it Star Wars.)
So we've got a droid with a precious secret which is being hunted by dangerous enemies, and it
ends up in the hands of a naive kid -- in this case, a young woman, Rey (Daisy Ridley), who is
waiting around among scavengers for her long-departed family to return. There's a vague
cantina scene. There's a guy dressed up like Darth Vader. There's an evil empire going under a
different name, while the rebels are now the Republic. And there's a badder-than-ever Death
Star replacement that draws its power, ludicrously, from blacking out the nearby sun.
In other words, it looks like a remake, and Star Wars didn't need a remake, in exactly the way
that The Hobbit didn't need to be made into three bad movies. It needed people to leave the story
alone and make something new.
But those beginning scenes were well acted and well directed, and the computer graphics were
great, especially the motion-capture character Maz Kanata, who feels like she was written for
Linda Hunt but is actually played by Lupita Nyong'o.
Even though the formulas were creaky, Daisy Ridley is wonderful as Rey, and John Boyega
does a good job as Finn, a white-armored storm trooper who decides to take off the suit and join
the human race. And even though I loathed Adam Driver's character in Girls, he is a good actor,
and he plays Kylo Ren very well.
So I kept watching.
Then Han Solo came onto the screen.
Yes, it was a nostalgic pat on the head to the fans, but it was more than that.
And of course, Harrison Ford classes up the joint in every movie he's in.
But it was more than a great actor doing his shtick, and more than a trip down memory lane.
Every time Harrison Ford and Carrie Fisher were on screen together, the film came to life.
I actually started to care. To be involved in the story. Everybody else was in an adventure movie
that was imitating Star Wars. Ford and Fisher were in a story that mattered, and they spread that
magic over the rest of the movie.
Already the fans are taking it all way too seriously -- though I have to say that the fan writer who
makes a case for Rey being Obi-Wan Kenobi's granddaughter is pretty convincing, and the
screenwriters should take note, because it's probably a better idea than anything they had in mind.
But all the weaknesses of the Star Wars franchise are still there, despite the fact that this movie is
quite good while the previous three-and-a-half sucked pond scum.
It's now quite openly a religious movie, using terms like "faith" in the Force. And the Force
is still a third-grader's version of philosophy, its deep stupidities all the more obvious the more
people talk earnestly about them. All the fighting and Force-using feel like badly conceived
magic, as if the outcome of fights and chases were being decided by a Dungeonmaster rolling dice.
Self-parodying jokes, like when somebody says that of course there's always a simple way to
destroy a massive weapon, aren't really funny, because in fact this movie, like the first one,
absolutely depends on one of those shots-in-the-dark.
No. No, really. You don't win real wars or real battles by sending out one guy to bring down the
superweapon by closing his eyes and letting fly.
That would be the story of David and Goliath, and stories in which God ("The Force") simply
bestows victory on the outgunned kid are not really worth repeating.
We got it. Goliath big, David small, God picks. Done. Move on.
I enjoyed this movie. A lot. I'm never going to watch episodes II and III because naps are better.
But I'll probably come back for episodes VIII and IX, as long as this team of writers and director stays in place.
But let's get a little perspective. There was another franchise reboot this year: Jurassic
World. It too touched all the bases from the original. It too tried to succeed by being Bigger and
Badder -- and it did. It was enjoyable, and it had Chris Pratt the way this movie has Daisy Ridley.
There were two other sci-fi extravaganzas, though, that were easily better than Star Wars: The
Force Awakens. First, there was Mad Max: Fury Road, another reboot, but this time much
better than any of the previous installments. It did not rely on nostalgia, ever. It told a
powerful story, it was brilliantly acted, it moved with breathtaking speed, and while it's too much
to hope for movie sci-fi to be smart, it was surprisingly not-dumb.
And then there was the only "hard-sci-fi" movie of the year: The Martian. It was real, it was
well-written, the acting was excellent, it was based on a smart novel and kept everything from
the book that could be shown on screen.
Mad Max: Fury Road and The Martian were both better movies than Star Wars: The Force
Awakens, mostly because they didn't rely on nostalgia to keep us watching until things finally
started to matter.
I think Lawrence Kasdan, J.J. Abrams, and Michael Arndt did as good a job of making a silk
purse out of George Lucas's ear as could possibly be done -- but the prequels are a festering sore
in the franchise and they are still poisoning even this good movie, just because it cannot rise
above the deep stupidity of the overall story.
It all still depends on a bunch of magic people in whom "the Force is strong" -- you know, Old
Testament prophets who are chosen by God -- and the interplanetary politics is still
embarrassing. Everything magical depends on heredity -- who is descended from whom -- so
you only matter in this universe if you're one of the Chosen Few.
In short, it's so Calvinist that it hurts. Irresistible grace is not an attractive doctrine to me; it
makes for bad fiction. I prefer my characters to choose rather than be Chosen. I don't like it
when one person handles an ancient relic and immediately has overpowering visions, while other
people can touch it and nothing happens. It's an unearned power. It's simply Bestowed.
That's not the moral universe I believe we live in; I don't think God plays favorites. I like it
much better when characters -- when people -- take the hand they're dealt, good and bad, and
still try to use it to accomplish something without screwing things up too much.
I don't like mysticism, and the Star Wars universe has become all robes and hoods, people
coming out of disguise and handling magical artifacts and having wizard duels.
We're one dragon short of a cliche fantasy.
Science fiction can be, and should be, more.
It's so discouraging to be only a few pages into a biography when the author makes a mistake so
howlingly dumb that you marvel that he somehow graduated from high school.
Karl Rove's The Triumph of William McKinley: Why the Election of 1896 Still Matters began
well enough, with an overview of the life of a president who is largely overlooked. In fact, as
American history is usually taught, McKinley matters only because he was the President who
chose Theodore Roosevelt to be his running mate, and then was conveniently assassinated so that
the great Teddy could become President and save the world.
Rove gives us a different take on the man, a genuine Civil War hero, a man of firm principles
(including being anti-slavery when people with that view were reviled as "Abolitionists"). His
election led to a historic realignment of political forces.
But in talking about McKinley's upbringing in eastern Ohio, Rove mentions that the family lived
just across the Ohio River from a slave state -- Virginia, which provided the pro-slavery
Confederacy with its capital city.
Immediately I thought, Come on! Everybody knows that the Ohio River separates Ohio from
Kentucky, which, while it was a slave state, remained in the Union.
Filled with righteous indignation at the mistake -- surely a copy editor should lose his or her job
for letting such a slip go by -- I suddenly gave myself a shake and realized:
Oops. Ohio is also across the Ohio River from West Virginia. When McKinley was growing up,
there was no such state as West Virginia. West Virginia was still part of Virginia, because it
only became a state when the anti-slavery counties of the west seceded from Virginia and
rejoined the Union in the midst of the Civil War.
So Virginia, as then constituted, was across the Ohio River from the easternmost portion of Ohio.
It was a slave state and did host the Confederate capital in Richmond.
I was the bonehead who had forgotten the historical geography of the region. Only for about
two minutes of self-congratulatory high dudgeon, but ... a humbling experience all the same.
It would have been nice if Rove had reminded us that the part of Virginia across the river from
the McKinleys' portion of Ohio later became West Virginia ... but then, maybe he figured that
the tiny number of Americans educated enough in geography to realize the "mistake" would also
be so well educated in history as to realize that it was not a mistake.
Which I did. Eventually.
Everybody makes mistakes. That's one of the laws of human nature that I learned as a
proofreader at a university press. Everything was read carefully by two different proofreaders --
and the second reader often caught almost half as many mistakes as the first.
One thing we all knew: No matter how many errors we caught, no matter how carefully we
proceeded, there would always be some shamefully obvious typographical error that
everybody missed. It would only become obvious after the book was printed and the error could
not be corrected.
I once worked for a magazine that allowed "Odyssey" to be spelled "Odyessy" in a headline, so
that it sat there in big black type, screaming at the reader, "This magazine is edited by idiots."
But it wasn't edited by idiots -- it was edited by humans.
In this column, I check everything before I send it in. Plus, I don't make many mistakes in the
first place. But in every installment, our fearless copy editor catches at least one typo or error of
fact that slipped past me. And then, after publication, it happens way too often that a reader
catches a mistake that all of us missed.
So I was prepared to sermonize about how somebody who's writing history or biography should
be much more rigorous than to put Ohio across the Ohio River from Virginia. And then I
realized a lesson I've had to keep relearning for the past forty years: Before I correct somebody
else, I should make sure that I'm not the bonehead.
Whenever you set out to chastise someone else for their mistakes, it's a wise practice to stop and
think: Am I missing something here? Is there some reason why the thing I'm correcting isn't
actually wrong? Is my correction even wronger than the error I'm correcting?
Just last night, somebody on my Facebook newsfeed "liked" a little discourse on correct
pronunciation. "Are you pronouncing these words incorrectly?"
I always enjoy reading these with evil glee, because in almost all the "use correct grammar" or
"pronounce words correctly" posts that pop up on Facebook, about half the "corrections" are
wrong, and about half the "wrong" examples are perfectly correct and often have a long
history of being good English.
Last night's pronunciation guide told people that "sherbet" should be pronounced "SHER-bit"
instead of "SHER-bert." Well, that extra R is certainly not in the written word, but that only
proves that two near-rhyming vowels in a row want to become the same vowel. The English-language retroflex R actually becomes part of the vowel (rather the way it does in some versions
of Chinese), so ER followed by E wants to be ER followed by ER.
Still, by all means, I encourage you to stop saying "sherbert." But don't change to another
equally ignorant mistake! Because there really is a difference between short E and short I,
between "sherbet" and "sherbit."
In the South (and other areas) this difference is so close to disappearing that many people feel the
need to say "inkpen" so people don't think they're saying "pin" when they mean "pen."
But the second syllable of sherbet is "bet," with a short E. And if you're not going to give it a
full short E, it should be thought of as a schwa, not a short I. So the "pronunciation guide" was
substituting one natural error for another just as wrong.
The pronunciation guide for "wheelbarrow" listed the first syllable as "weel." Um, no. The H is
not silent -- not in the dialect of English I grew up with. I could never understand people
who thought "witch" and "which" were homophones, or "where" and "wear." In my family, we
all pronounced the H.
Admittedly, the H begins before the W and probably should be written first, as it used to be in
Old English, as with the word "what": "Hwaet!" But the WH or HW sound is an ancient
combination, one of the distinctive English sounds. In my mind, only ignorant and lazy people
drop the H. But, of course, that is simply the way the language is evolving, and "witch/which" as
a pun was already common when I was a kid in the 1950s.
My point is that if you set yourself up in the business of correcting other people's pronunciation,
make sure you are following best practices yourself.
Even when it makes you crazy, you have to recognize that every language shifts constantly.
Words pile on new meanings or lose old ones or swap meanings with other words.
Pronunciations change, and to those who say things the old way, the new way can sound insanely wrong.
Language is like a haircut, though, in the old joke: "What's the difference between a bad haircut
and a good one? Two weeks."
You think you're hearing a bad pronunciation change that needs desperately to be obliterated?
Within two generations, speakers of the language won't realize there ever was a change; it will
seem "sensible" to speak in the way that sounded insane only two generations before. Or else
the new pronunciation will be gone and forgotten.
I'm really hoping that "bae" for "baby" and "po-po" for "police" disappear Real Soon Now. But
that's my preference. The language will do whatever it does, and when I'm dead, I'll stop being
annoyed by stupid babytalk changes.
So when somebody says, "If you pronounce the word that way, you'll sound illiterate," smile and
thank them -- and then decide for yourself. If you hear smart people saying it your way, then
relax. Or if all your friends say it the same way as you, then call it a dialect and keep
speaking the language as you know it.
Even the much reviled "nucular" pronunciation of "nuclear" may turn out to be the way
everybody says it in another couple of generations, just as "mischievous" wanted to become
"mischievious" to rhyme with "devious."
It doesn't mean we're getting stupider. It means that our language is still alive and changing.
I've known sci-fi writer Eric James Stone for a long time now. He's won the Nebula Award
(given by fellow writers) and been nominated for the Hugo (given by seriously committed fans).
He won the Writers of the Future Contest early in his career.
He's also a really good guy, so I'm happy to report that he has recently published his first novel
and it's a wonderful adventure built around a clever, original idea.
The main character, Nat Morgan, was born with a weird birth defect: He's forgettable.
No, not a little bit nondescript: Quantum mechanics functions in such a way that people cannot
maintain any memory of him. Computers can't keep any record of him. Video surveillance
equipment can't record him. A minute after a person or machine registers his presence, he's
gone from their memory.
The standard thing to do with a character who has a totally damaging "talent" like this is to have
the talent arise only when he reaches puberty, so he begins to deal with it only after having been
fairly normal for a dozen years.
But Stone isn't doing anything the easy way. Nat Morgan was so forgettable as a baby that
every morning his mother woke up wondering why there was a baby in her home. She had
to keep a journal that explained who this baby was and leave it in the kid's crib so she could
discover, all over again, that this was her child.
As long as she kept the baby with her and noticed him at least once a minute, he was safe. But
then, when Nat was about fifteen, there was a house fire, and while both Nat and his mother
survived, her journals were burnt up and they were taken to the hospital in separate ambulances.
She had no idea who he was, and had no memory of ever having a child. This meant he was on
his own from then on.
When people forget you easily, you can live by stealing. In effect, you're invisible -- not right
now, but in a minute. Walk out of the store with a few days' worth of food and sure, they might
chase you for a little bit. But if you can find a hiding place and remain undiscovered for sixty
seconds, they forget why they ran out of the store and they certainly stop looking for Nat.
The novel takes place after Nat has managed to convince some key people at the CIA that he can
get information nobody else can get. In order to report in, he has to phone his one contact person
and tell him to read the report in a certain drawer of his desk. This reminds his contact that Nat
even exists, and that his reports should be taken seriously.
This is a splendid premise, and it has adventure-movie written all over it. Or even, with the right
showrunner, a long-running television series.
It solves the problem of invisible man stories on screen: You pay the salary of a top star to
play the title role, but the actor is almost never on screen.
In Unforgettable, he's always on screen.
Aw, forget movies and television. Unforgettable is already a much better thing -- a novel,
smart and exciting. You can order the trade paperback or ebook from online stores, or look for
it at Barnes & Noble here in town.