Jar of Lights

Recently, a fellow indie author made a request of me, to which I overreacted.

“I’m 68 years old. You are 30-31. If, for some reason, I should never get to write/finish my novel, would you be willing to consider finishing it as a novel based on my story? I would consider it a great honor.”

I stared at the screen, scared, angry and annoyed. I typed up a scathing response, but hesitated to send it. How dare you? I barely know you and you’re anointing me as your successor, bypassing your legitimate heirs to make me responsible not just for honoring your legacy, but finishing it for you? I left the email draft alone to think about it. A few days later, I gave a gentle response, somewhere between “no” and “we’ll see.” To which my colleague backpedaled, confused by the artificial tension I’d created.

Why did I react so violently? It took me a bit to figure it out.

June 20, 2009

“…And I can only hope you find peace.”

I finished and stepped down from the podium, wiping away a tear. No applause followed. I had just given the eulogy for my twenty-two-year-old high school friend, Richard Yee. I met Richie in middle school, when my family moved from Texas to the Atlanta suburbs. I fell in with his group, but it was years before the two of us started writing. Once we started, we couldn’t stop. I wrote my stories, he wrote his. We distributed them, soliciting unofficial ratings and reviews from our circle, often trying to one-up each other by writing competing versions of the same plot.

We went to different colleges, pursuing technical degrees. Our emails were always alight with new manuscripts, heated exchanges over commas, repetitive words, nuances of usage, and the relative strength or weakness of a new story. After college, we started looking at the nascent worlds of blogging, ezines, and self-publishing. In 2009 we tried to get a periodical going, but after some head-butting it was clear that he would have his blog and I would have mine. The last time I saw him alive was his birthday, April 28th, 2009. In mid-June, I got the call. Richie had been found alone in his apartment, dead of an alcohol overdose. What he had been drinking to celebrate or to mourn, I’ll never know.

I have print and digital copies of everything Richie ever wrote. When we were both alive, writing was always a passion, a hobby, a fruitful exchange of ideas. We were two boys running around a green lawn in the dusky twilight, catching fireflies. Who had more, whose were brighter, and how to improve our game was what occupied us all those years. If I read dead authors as often as live ones, why should he be any different? John Steinbeck is no more dead or alive to me than he is to other avid readers. He just is. He is The Grapes of Wrath, East of Eden, Of Mice and Men. He’s the sum total of all the words he wrote when he was alive, in a very particular order. To him, to his family, that’s his legacy. But to me, that’s his body, his identity. Steinbeck was never a person to me, a man or a woman or a life, only a crisp rectangular stack of books like a message left in a bottle.

After Richie died, I didn’t stop writing, but I slowed. Between 2009 and 2015 I wrote only twelve stories (not counting the throwaways, of course). I walked the lawn, alone, still feebly catching fireflies, but his jar sat on the porch now, a lid put on it. If you, dear reader, have ever grieved, I want you to know I don’t pretend that losing my friend makes me your equal. But when I talk about death, I’m not talking about capital-D Death, the essay topic. I don’t fetishize books or the creative writing process. It’s just that at a young and formative age the world taught me a vivid lesson about the intimate connection between death and the written word. When I die (hopefully decades from now), my family will know me as the man, but my broader legacy will be my body of work. My best-case scenario will be a fifteen-year-old boy writing a book report on Gingerbread, and my birth and death years will just be numbers, as arbitrary as Steinbeck’s 1902 and 1968.

May 21, 2016

So no, I was not enthused by the prospect of finishing an old man’s novel for him. I thought it was a selfish cheat. Why should I lease out the years of my life to accomplish something he had 68 years to accomplish himself? Richie only had 22 years to live, and he lived them. He created his legacy. He may not have finished it, but what’s there will have to do. The human body is an interesting thing. I’ve been carrying this as a part of me for seven years, and it manifested as an indignant, passive-aggressive reticence. For the record, I managed to diagnose this complex and bring the issue back down to the hard earth of civility. Still, the experience scraped away that scar tissue and reinforced that lesson in my value system.

Interviewers often ask us why we write, or when we started writing. It’s a more complicated answer than can possibly fit in a single blog post, because for us writers, writing is our life, it’s how we define ourselves. Regardless of our marriage, our family, our occupation or hobbies or residence, writing is the deepest light within us. It’s the part of us we know will live on past our physical death because we read the lights of others every day. Whether it be Roald Dahl or John Steinbeck or Ayn Rand or Harper Lee or Edgar Allan Poe, we read dead authors and they make us feel alive. Writing, among other things, is a bid for immortality. Because eventually night will come and we want something left on our nightstand to remind us that the darkness isn’t all dark.

Inner Demons

Review of The Better Angels of Our Nature: Why Violence Has Declined by Steven Pinker, Chapter 8

(Credit: All block quotes are excerpts from the book.)

Chapter 8: Inner Demons

“The myth of pure evil bedevils our attempt to understand real evil.”

It was Carl Jung who said “Knowing your own darkness is the best method for dealing with the darknesses of other people.” One of the most damaging sources of violence in our world today is the persistent myth of good vs. evil. We often perceive ourselves as good guys suffering an unprovoked attack by bad guys, and therefore see ourselves as justified in routing them, not necessarily for personal revenge, but for the general safety and welfare. But what happens when the bad guys feel the same way about themselves? In our personal conflicts, in world events, in our perception of crime, we are naturally biased toward our own position.

The Perpetrator’s Narrative: The story begins with the harmful act. At the time I had good reasons for doing it. Perhaps I was responding to an immediate provocation. Or I was just reacting to the situation in a way that any reasonable person would. I had a perfect right to do what I did, and it’s unfair to blame me for it. The harm was minor, and easily repaired, and I apologized. It’s time to get over it, put it behind us, let bygones be bygones.

The Victim’s Narrative: The story begins long before the harmful act, which was just the latest incident in a long history of mistreatment. The perpetrator’s actions were incoherent, senseless, incomprehensible. Either that or he was an abnormal sadist, motivated only by a desire to see me suffer, though I was completely innocent. The harm he did is grievous and irreparable, with effects that will last forever. None of us should ever forget it. (p 713)

Sound familiar? Consider for a moment when you have witnessed this phenomenon firsthand, as a third party observer to, say, a road rage incident or a spat over money or a fight between siblings. During the rare chances you may have had in your life to be an unbiased observer, you have probably seen how naturally the same facts can be construed in opposite narratives. Now, consider a few times when you yourself have been wronged and think of which narrative you used to relate the story. What about the times you’ve been guilty of wronging someone else?

Once you become aware of this fateful quirk in our psychology, social life begins to look different, and so do history and current events. It’s not just that there are two sides to every dispute. It’s that each side sincerely believes its version of the story, namely that it is an innocent and long-suffering victim and the other side a malevolent and treacherous sadist. And each side has assembled a historical narrative and database of facts consistent with its sincere belief… The victims of a conflict are assiduous historians and cultivators of memory. The perpetrators are pragmatists, firmly planted in the present. (p 716)

The ultimate lesson here is that we are all human, with common motives, desires, and safety mechanisms. What we perceive as evil is all too often “perpetrated by people who are mostly ordinary, and who respond to their circumstances, including provocations by the victim, in ways they feel are reasonable and just” (p 720). Does this imply that they deserve a free pass? After all, they are only acting out the same impulse we all feel and can relate to, right? No. The purpose of empathizing with the perpetrator, and not just the victim, is to understand what drives criminals to commit crimes (or countries to commit atrocities, or individuals to act coldheartedly). By understanding bad behavior we are better able to prevent it. Empathizing only with the victim lets us do no more than vilify the perpetrator and console the survivors, but empathizing with the perpetrator lets us anticipate, and hopefully prevent, future tragedies.

No baby is born evil. Ideology is the real killer. Most ideologies persist in a vacuum, filling their constituents’ minds with one-sided rhetoric and arming them with a strong sense of revulsion for any hint of dissent. Repeatedly, we have seen in this book that one of the root causes of the decline in violence is the increase in awareness, empathy, and instant communication in the world. Knowledge is an ideology-buster, but the thirst for knowledge must out-compete the indoctrinated revulsion for new and conflicting facts. The advance of science is also a force for good, not just in its ability to extend lifespan and connect disparate parts of the globe, but also for its moral imperative: that doubt is good, that any open mind welcomes new and conflicting facts because they stimulate questioning. Historically, players on the world stage have only ever seen the world through their own eyes, but increasingly, we are learning to anticipate our opponents’ moves, and in anticipating them, we are understanding and empathizing with them.

Increasingly we see our affairs from two vantage points: from inside our skulls, where the things we experience just are, and from a scientist’s-eye view, where the things we experience consist of patterns of activity in an evolved brain, with all its illusions and fallacies.

Pinker echoes Jung in his assertion that by understanding our own personal inner demons, we can understand, and then destroy, the demons “out there” in the world.

If all of this sounds intriguing, even counter-intuitive, watch Pinker’s TED Talk for a wonderful summary of the book.

Review: The Gingerbread Collection

The Gingerbread Collection by Victor A. Davis

Wow, so incredibly humbled by and proud of my first public review. Thank you, Joel R. Dennstedt of Readers’ Favorite! https://readersfavorite.com/book-review/the-gingerbread-collection

Reviewed by Joel R. Dennstedt for Readers’ Favorite

The Gingerbread Collection: Short Stories by Victor A. Davis immediately reveals the efforts of a master craftsman hard at work creating what appears to be an effortlessly produced, highly polished, perfectly edited, exquisitely written, fine tuned set of finished tales. Upon opening the book to partake in these delights, the reader immediately relaxes, knowing that he or she is in the hands of a professional writer and master storyteller. The style of writing is impeccable, exhibiting a perfect balance of necessary information, descriptive detail, and allocated momentum uniquely relevant to the action immediately at hand, with finely measured doses of anxious tension to make one hesitate before recklessly plunging on ahead. All for the purpose of entertaining the avid reader with intriguing, helplessly engaging plots.

The title story, called simply Gingerbread, retells the story of Hansel and Gretel in a totally modern setting with more anticipatory involvement than the original, and with a decidedly more gripping – and perhaps more morally demanding – finale to the tale. You will be deeply touched, affected, and morally offended … that is guaranteed. You will also have been deeply involved and entertained … that too is guaranteed. This holds true with all of the stories in The Gingerbread Collection (especially one really horrifying tale), in which each tale – so completely unique unto itself that choosing a favorite is not only impossible but somehow inappropriate – seduces the reader into a new and different spot to be, watching with a kind of participatory gaze the events of life so particular to that specific tale. I am attempting to convey here the incredibly lucid sense of reality that permeates each story, immersing the reader quite helplessly as he becomes a participant in the telling. That is what a great storyteller does, and make no mistake, Victor A. Davis is a great storyteller.


The Rights Revolutions

Review of The Better Angels of Our Nature: Why Violence Has Declined by Steven Pinker, Chapter 7

(Credit: All block quotes are excerpts from the book.)

Chapter 7: The Rights Revolutions

I have a dream that one day this nation will rise up and live out the true meaning of its creed: “We hold these truths to be self-evident: that all men are created equal.” ~ Martin Luther King, Jr.

A long time ago, I remember watching a documentary about cave people. There was a scene in which a family sat in a cave with a blizzard raging outside. They looked cold, hungry, and destitute. A mother tried to comfort a screaming baby. The father gathered his dignity, took the baby gently from its mother’s arms, and carried it outside to smother it. This winter was too harsh to be caring for an infant. I remember thinking at the time what a horrible world they found themselves in, not what horrible people they were. As it turns out, the killing of newborns is fairly common in mammals and primates. Mothers must hedge a biological bet, taking stock of their situation before determining whether the child has a reasonable chance to reach adulthood or whether the potential leech must be killed before it dies anyway and squanders all their effort.

In fact, this has been the rule rather than the shocking exception for most of human history. Myths and fairy tales abound in which infants are left for dead, but grow up to become heroes, like Romulus and Remus. Myths give us a dramatized record of ancients’ actual practices. Sparta was not the only civilization that discarded weak or sickly babies. In 1527 a French priest wrote that “the latrines resound with the cries of children who have been plunged into them.” Enter Oliver Twist, the 19th-century English exposé about an orphan growing up in a workhouse. When we think of the evil conditions of orphanages of times past, we forget that they were a moral improvement upon infanticide. That is to say, at some point the killing of unwanted children became unpalatable to our ancestors, so social movements cropped up calling for institutions that could take them in. Our concern for children has continued to grow so much that the workhouses of Dickens’ day, with their near ninety-nine percent mortality rate, seem repugnant to us.

The historical increase in the valuation of children has entered its decadent phase. Now that children are safe from being smothered on the day they are born, starved in foundling homes, poisoned by wet nurses, beaten to death by fathers, cooked in pies by stepmothers, worked to death in mines and mills, felled by infectious diseases, and beaten up by bullies, experts have racked their brains for ways to eke infinitesimal increments of safety from a curve of diminishing or even reversing returns. Children are not allowed to be outside in the middle of the day (skin cancer), to play in the grass (deer ticks), to buy lemonade from a stand (bacteria on lemon peel), or to lick cake batter off spoons (salmonella from uncooked eggs).

What’s going on here? If the long historical arc from infanticide to orphanages to child labor to public education to children’s rights to political correctness is real, then is this just one more instance of the overall decline in violence? Yes. In 1693 Locke popularized the notion of the “tabula rasa,” or “blank slate,” describing for the first time in history the concept of a child’s brain as an “empty” version of an adult’s, one that must be filled with wholesome knowledge if the child is to become a responsible adult. Neglect a child’s education and he/she will grow up to become a bandit, a fiend, a godless peasant. People picked up this Enlightenment call for public education. But what were they turning away from? What Locke was arguing against was the age-old medieval idea that all children were possessed of demons inherited from their sinful creation, and that child-rearing consisted of “beating the devil out of them” and replacing Satan’s stranglehold with scriptural knowledge and grace. (Note how modern expressions stick around long after the practices that coined them have disappeared.)

The subject of children’s rights is just one instance of a slew of rights revolutions that have blossomed, most notably in Enlightenment Europe and the 1960s and 70s. Civil Rights, Animal Rights, Children’s Rights, Women’s Rights, Homosexuals’ Rights, and (a bit further back in time) Laborers’ Rights have all become familiar by now. It is not difficult to conjure up images of the terrible violence that must have provoked them. The UN’s 1948 Universal Declaration of Human Rights seems to have marked an historical turning point, a reawakening of some of the Enlightenment ideals of freedom and equality for all.

When we look back in time, it is difficult to imagine how any rational person could justify an act we would today call a hate crime or a hate killing. But that just goes to show how our cultures have changed over time. People used to exist who really believed Jews had horns on their heads under their hair, or that Negroes were physically unsuited for education. Discrimination, and worse, ethnic violence, starts with things like dehumanization. Even that very word is telling, for it implies that today we consider every 46-chromosomed person a real, fully whole and deserving human, and that our ancestors (or, unfortunately, contemporaries) have de-humanized them, have stripped them of their inherent humanity in order to demonize them. But for them, it’s the other way around. Someone raised in pervasive ignorance, within a cultural norm that has never considered a class of people human in the first place, cannot be guilty of de-humanizing them, only of ignorance. That is why the rights revolutions have always been driven by information, education, engagement, and, in short, the spread of knowledge.

If I were to put my money on the single most important exogenous cause of the Rights Revolutions, it would be the technologies that made ideas and people increasingly mobile. The decades of the Rights Revolutions were the decades of the electronics revolutions: television, transistor radios, cable, satellite, long-distance telephones, photocopiers, fax machines, the Internet, cell phones, text messaging, Web video. They were the decades of the interstate highway, high-speed rail, and the jet airplane. They were the decades of the unprecedented growth in higher education and in the endless frontier of scientific research. Less well known is that they were also the decades of an explosion in book publishing. From 1960 to 2000, the annual number of books published in the United States increased almost fivefold.

If all of this sounds intriguing, even counter-intuitive, watch Pinker’s TED Talk for a wonderful summary of the book.

The Morals Science Teaches Us

Most people, when they think of science at all, think of lab coats, mice and monkeys in cages, particle accelerators, and astronauts. Although subconsciously we know how pervasive science is in our everyday lives, we rarely acknowledge it. The first reason is, well, most of us are not scientists, nor do we know any. The second is that there still exists a stigma about it. After all, morals are taught by culture and religion, and science, for all its boons, is bent on destroying those things. Right? We know that scientists are not evil (at least, not in real life), and yet most of us still think of them as super-intelligent heathens who develop toys and technologies for our everyday consumption and ask little in return. They are, and always have been, representative of Prometheus, the titan who defied the gods to bring man fire. We are both grateful and fearful, but wary of worshipping or idolizing this false god.

I challenge you to break out of this mindset by considering a few non-controversial, unsung science facts. None of the following are ambiguous, unnatural, disputed, or divisive. You have probably always known them without realizing that nearly every generation that has come before you did not know them. Many scientific discoveries have bestowed something upon us or challenged our worldview, more than can possibly be listed. It is one thing to know now what a star is made of, but that hardly translates into moral progress. Yet these are a few that have challenged long-held beliefs affecting the moral constitution of our civilizations. Culture and religion taught us one thing, and a scientific discovery slowly changed what we teach our children, without ever demanding the credit, or, it must be said, without degrading or invalidating the religion it contradicted.


The earth is not the center of the universe. This one is a bit obvious, which makes it a good place to start. Early astronomy placed earth at the center of creation, and man at the center of the reason for earth’s creation. Consider the lesson in humility science teaches us by obliterating that totally understandable, but ultimately incorrect belief. By relegating mankind to an observer of the universe and not its kingpin, science has instilled in us a sense of wonder and responsibility toward the world around us, rather than subjugation. Moral: There is no center of the universe, and no single reason for its existence.


There are structures both vastly bigger and vastly smaller than the human senses can experience. Microscopes and telescopes teach us of the existence of microbes and galaxies, and the world of unfamiliar forces they experience. While this doesn’t necessarily have an immediate impact on our morality, per se, it does teach us that our bodies were not designed to be capable of directly sensing all of creation. We cannot see ultraviolet rays with our eyes, nor feel the neutrinos passing through us. Again, this should teach us to respect the invisible truths all around us, and seek them out, acknowledging that no creator ever intended for us to know them. Moral: Worlds exist in parallel all around us, that our natural organs were never intended to experience.


Race does not exist. Before the discovery of DNA, the word “racist” meant the belief that one race of men was superior to another. It could mean that Hutu was superior to Tutsi, that white was superior to black, or that Persian was superior to Arab. The contests of superiority are well-documented. What is underappreciated is the fact that this antiquated definition of “racist” presupposes that race exists in the first place. DNA encodes information about your lineage, and historically, people from the same geographic area shared lineages because they did not travel extensively. Biologically, this leads to a divergence in physical features over a long period of time that can sometimes become so extreme that two individuals from different genetic pools are no longer compatible enough to interbreed. Early naturalists called this process speciation. Human beings, for all our physical variety, have never diverged into distinct species. A man and a woman from two different lineages are, statistically, as fertile as a couple from the same lineage, because the species is so biologically young. The connotative word “mulatto” was used to describe the illegitimate children of white masters and black slaves. Today we use the equally connotative, but politically correct phrase “mixed race.” Both are misnomers. The child of an Irish father and a German mother is as “mixed” in lineage as the offspring of a black-white couple, yet only the latter was ever called “mulatto.” Race is a social invention used exclusively to divide and subjugate. There is no single gene for dark skin, curly hair, blue eyes, or short stature. These phenotypes are the result of a tangled web of genes, which is why the physical features of parents appear “blended” in their child. Thus, a DNA test may reveal a person to hail from the West Indies, or from East Africa, but there is no pass/fail DNA test that shows him or her to be black. By comparing a single DNA specimen against databases of different lineages, a person can be placed on a branch of the human tree. That tree represents the travelling (and conquering) history of peoples geographically. Nowhere does the science suggest that there exist pools of people in the high-walled gardens we’ve always traditionally called “race.” Implicitly, we’ve accepted this. Today, the word “racist” means a feeling of superiority toward people of different ethnic background or geographic origin. Eradicating racism will involve, among other things, continuing to allow scientific discoveries to subtly change the literal definitions of words like “mulatto,” “mixed,” “foreign,” and “race.” Moral: We all belong to the same race, and our physical features originate from our geographic ancestry. There is no such thing as a “mixed” person.


You inherit exactly 50% of your genes from each parent. No, this was not obvious before genetics! Many mothers abhor the idea that a father can lay claim to the baby in her womb on account of the fact that they “planted the seed.” This treats her as a mere carrier, not co-creator, of a child. This analogy of seed-planting, while euphemistic today, has been taken quite literally by cultures past. In this analogy, the baby contains 100% of the father’s genetic history, and the mother’s womb is the “soil” that allows this “seed” to grow into a “fruit.” After “harvest,” the new mother is the caretaker and steward of the father’s genetic history. This literal rendition of seed-planting was the norm for most of cultural history, until science said otherwise. Although a thinking person could determine from certain clues that babies blend their parents’ physical features, it wasn’t until the discovery of DNA, chromosomes, mitosis, and meiosis that the mechanics of reproduction evinced the 50% rule: that babies represent an exactly equal share of their mother’s and father’s genetic history. While this is not the sole misunderstanding underpinning gender inequality, it behooves us to bury the egregious “planting of the seed” analogy for good. Moral: You represent exactly half your mother’s and your father’s genetic history. The mechanical differences of male and female reproductive roles do not change this ratio.


Infant mortality is at historical lows. We all know that families used to have more babies in generations past, but most people vaguely chalk it up to a change in “culture” or our “cultural” definition of family. In fact, there are dozens of factors that contribute to this trend. The two biggest ones: the near eradication of childhood disease, and contraception. Although there are lots of reasons for it, no one denies that families are smaller today than they have been in times past. Previously, women had very little control over the number of babies they had in their lifetimes, and sex was the only way to modulate this number. Counterbalancing this overproduction of babies was the fact that the odds of a child surviving to parenthood were low. Once medical science caught up with the science of microbes and things like vaccinations made survival more and more probable for babies and children, society encountered a bit of an arithmetic problem. How could the local economy support all these babies who, a generation ago, would not have survived long enough to need this support? Any society experiencing a rapid decline in infant mortality faces this grave economic challenge. Healthcare improving survival prospects leads to overpopulation which stresses the local economy, causing problems such as undernourishment and unemployment. Contraception, in its various forms, was the solution to this economic problem. “Natural” reproduction involves making babies quickly and often, from puberty to menopause, and losing most of them young, analogous to what we see in the animal kingdom. “Modern” reproduction divorces sex from baby-making, using contraception to control when a woman chooses to have children, as many as she wants or can take care of. This model has no analogy in the natural world, and we are on our own in making it work. Moral: Birth control is a necessity in a world where all babies are expected to reach adulthood.


“You” are your frontal lobe. You’d still be you if you lost your arm, or had to have a liver transplant, or your appendix removed. People did not always know what the brain was. The Egyptians threw it out during the mummification process, and only preserved the lungs, stomach, intestines, and liver. We take it for granted today that the electrical signals in our brain somehow encode who we are, even though brain science cannot yet explain precisely how. Although there are different interpretations of the word “soul,” the less connotative word “I” communicates the basic idea: I would not be me after a brain transplant. One horrific “treatment” of unruly mental patients used to be the frontal lobotomy, a gruesome practice that, while pacifying the patient, also deprived them of their “I.” In Greek mythology, gods were perfect specimens of men and women. Throughout history, moral tales can be found calling deformed people half-people. Think of lepers, cloven feet, cleft palates, dwarfs, etc., and the subhuman status they’ve been awarded in the fables and history books, relative to today. Moral: No physical deformity makes a human being less human.


Many mental illnesses have physical causes, and are now treatable. This is a point that cannot be over-emphasized. For all the bad press our over-medicated culture gets, consider the alternative as it stood before this drug revolution. Words and phrases such as “cabin fever,” “going postal,” “crazy,” “lost touch,” “backed off the edge” will take generations to purge from our vocabulary. I recently read Brain on Fire, Susannah Cahalan’s memoir of her “month of madness,” when a rare auto-immune disease wreaked havoc on her spine, brain stem, and brain. One of the first and most obvious symptoms was a horrifying psychotic change in personality her family was at a loss to explain and that she has no memory of. A physical treatment for her physical ailment brought her out of it to a complete recovery, but it raised a question. The drug she received was very expensive, very new, and only available in small doses in a few top hospitals. Even her diagnosis was a stroke of luck. How many people, past and present, have exhibited her symptoms and been institutionalized for the rest of their lives, for having “lost it”? When we feel powerless over something, we often compartmentalize it. This is the moral lesson of our historical treatment of the insane: They have been forsaken, marginalized, boxed up, cut away, and buried from view. Increasingly, our care for our fellow human beings has led to rigorous methods of diagnosis and treatment for the insane. Science has enabled us to reach into that box and save many of them, and it is possible to envision a future in which nearly every form of mental illness has been cataloged as the microbes once were. We live in the anteroom of that world, where certain ailments like depression, anxiety, hyperactivity, and mania are manageable, while those we don’t have a handle on, like schizophrenia, continue to be studied with fervor. Moral: Mental illness could very well prove to be symptomatic of physical, treatable diseases.

 


“True science teaches us to doubt and to abstain from ignorance.” ~Claude Bernard. Here is a legacy shared by two men living in two different ages: Gutenberg and Galileo. Their legacy was the divorce of truth from authority. When Gutenberg invented the printing press and started printing Bibles, he made scripture cheap and plentiful, and vernacular translations soon followed. This sent the strong statement that any common man could read, comprehend, and interpret the Bible for himself, without necessarily going through a priest. When Galileo turned his telescope to the heavens and found evidence that Copernicus was correct, he was made to recant under threat of torture and lived out the rest of his life under house arrest. Why? Not because the priesthood upheld any sacred astronomical text in the Bible “proving” the sun revolved around the earth, but because the telescope as a truth-seeking device directly threatened their monopoly on truth. Science is the ultimate democratization of truth. Just as an informed electorate is necessary for the operation of a democratic state, a doubtful, questioning, skeptical public is necessary for an educated society. Moral: Doubt is the ultimate weapon against ignorance.


For all the goods and evils science has been responsible for, we have advanced so much that we now take these statements for granted, or probably will within the next few generations. No religion or culture anticipated them. You should teach them to your children as your secular upbringing has taught them to you. Perhaps in addition to vaccines, smartphones, jet planes, soap, and refrigerators, science can also help us along with our continued moral progress. Those men and women in the white lab coats are human beings with hearts full of care, and have shaped our morals more than they will ever ask credit for.

There is no center of the universe, and no single reason for its existence. Worlds exist in parallel all around us, that our natural organs were never intended to experience. We all belong to the same race, and our physical features originate from our geographic ancestry. There is no such thing as a “mixed” person. You represent exactly half your mother’s and your father’s genetic history. The mechanical differences of male and female reproductive roles do not change this ratio. Birth control is a necessity in a world where all babies are expected to reach adulthood. No physical deformity makes a human being less human. Mental illness could very well prove to be symptomatic of physical, treatable diseases. Doubt is the ultimate weapon against ignorance.

The New Peace

Review of The Better Angels of Our Nature: Why Violence Has Declined by Steven Pinker, Chapter 6

(Credit: All block quotes are excerpts from the book.)

Chapter 6: The New Peace

“Macbeth’s self-justifications were feeble – and his conscience devoured him. Yes, even Iago was a little lamb too. The imagination and the spiritual strength of Shakespeare’s evildoers stopped short at a dozen corpses. Because they had no ideology.” ~ Aleksandr Solzhenitsyn

This ironically titled chapter talks about quantitative declines in the most horrific categories of human violence: genocide and terrorism. It is possible to compartmentalize archaic forms of violence, such as soldiers dying by the thousands in musket line advances, because we know we have advanced past the point of ever seeing that again. It is much harder to prepare a subject like ethnic cleansing for objective, palatable academic treatment when so many people’s lives today have been ruinously touched by it. How can one’s blood not boil when looking at a chart displaying a downward trend in terrorist attacks, when that tiny blip at the far end represents those who died on September 11th, 2001?

Perhaps it is time to take a breath and remember the theme of the book. There is a verse from the Koran, often quoted out of its original context, that goes “if any one killed a person, it would be as if he killed the whole of mankind; and if any one saved a life, it would be as if he saved the life of the whole of mankind.” This is exactly the nerve that is touched by the detached academic vivisection of the emotionally devastating, such as when we say “sure, slavery was abolished on paper, but black people still suffer needless discrimination.” Or, “yes, women can vote in most countries, but that doesn’t mean there aren’t still sex traffickers in the world.” This tendency to conflate the individual with the whole renders “progress” a dirty word. It is painful, and even offensive, to take two evils and attempt to weigh them against each other, to say that the more recent one being less harmful amounts to “progress,” or that a crime with ten victims is ten times “worse” than the same crime with one victim. Yet we have to be able to take these part-whole blinders off in order to make a fair judgment. It is disingenuous to claim that because black people still suffer needless discrimination, “we’ve made no progress” since the times of slavery. That is factually incorrect and patently absurd, yet we hold it as a moral principle that if any one suffers needlessly, then the whole is broken.

The effort to whittle down the numbers that quantify misery can be heartless. But there is a moral imperative in getting the facts right, and not just to maintain credibility. The discovery that fewer people are dying in wars all over the world can thwart cynicism among compassion-fatigued news readers who might otherwise think that poor countries are irredeemable hellholes. And a better understanding of what drove the numbers down can steer us toward doing things that make people better off rather than congratulating ourselves on how altruistic we are. (p 468)

So long as any evil exists in the world, it is not the time for back-patting. But this is not what the book is encouraging us to do. Pinker’s running theme is that recognizing a decline in violence is prerequisite to understanding what exactly it is that we have been doing right, so that we can pick up the threads of that effort and continue pushing it forward. Pundits love to use phrases like “we live in dangerous times,” and “we are at a crossroads in history,” and “there is more at stake today than ever,” or most histrionically, “this is a battle for the soul of XYZ.” What they are doing is rousing us to action, engaging us to get involved in an issue or buy their sponsors’ products. They are exploiting the natural human misperception that times past were innocent because our ten-year-old eyes were incapable of perceiving wickedness. As we grow up and learn to see the world for what it is, our growing disillusionment can feel like a real decline in wholesomeness and purity. There is an ancient Greek word, kairos, which means “a passing instant when an opening appears which must be driven through with force if success is to be achieved.” We want to believe that we live in kairos, that we must be the vigilant driving forces to overcome the unique historical crisis we have for the first time found ourselves in. This is an illusion. There is nothing special about the time we live in other than the fact that we are here to live it. We are always living in kairos, in every age.

Baby boomers like to hearken back to the “innocent age” of the 1950s, when marriages were stable, families were whole and respectful, and jobs were plentiful. But this was also a time when the lynching of a black man might not even make national news, when schoolchildren were taught their duck-and-cover drills in case the Russians dropped the bomb, and when the fedayeen were terrorizing a nascent Israel. Has the world really gone to hell since these innocent times? Do we really want to go back there? Or was it simply that the five- to fifteen-year-olds of this time period remember it as being a time of childlike innocence because they were children?

Pinker writes that “[A] quantitative mindset is in fact the morally enlightened one. It treats every human life as having equal value, rather than privileging the people who are closest to us or most photogenic.” Thus, if the quantitative trends are real, then it means that fewer people suffer, and that we can thank the better angels of our parents’ nature for recognizing and working to decrease the suffering they saw in their world in order to build ours. We must pick up that torch.

My Year in Books 2015

I’d have to say I had a pretty solid year in books for 2015. I hope 2016 is as fruitful! I read 43 books. The vast majority were good or great. I discovered new authors, both traditional and indie. After reading 100% female authors in 2014 as a New Year’s Resolution, I pledged to continue striving to keep my reading list gender-balanced. Setting a few inapplicable titles aside, I read 28 books by male authors and 12 by females, for a 70/30 split. Nothing to write home about, but it’s much more balanced than my previous years, by focused effort. My favorites for the year were Flowers for Algernon, The Shell Collector, Lolita, Isaac’s Storm, Persepolis, Another Country, Night in Funland, The Cruelest Miles, and Ship Fever. My newest favorite indie author discoveries were Dan Buri, David Hull, and the very talented Jaq Hazell.

Without a doubt though, the best book I read in 2015 was the last. Eowyn Ivey’s The Snow Child truly took my breath away. This debut novel made it to the finalist round of the 2013 Pulitzer Prize judging, an exclusive group of only three books each year.

Click the links below for more information about each book on Goodreads!

Atlas Shrugged or Fountainhead?

Before becoming active on Goodreads, I was asking & answering a lot of book questions on Quora. I stumbled across this one and thought it worthy of a repost. The question was: “Which is better, Atlas Shrugged or The Fountainhead?”

Atlas Shrugged.

Since “better” is so subjective, let me rephrase your question: “Which of the two books encapsulates Ayn Rand‘s worldview and delivers her point more wholly?” Atlas Shrugged. (Disclaimer: this is my favorite book; I’ve read it three times. Though I have read The Fountainhead also, once.)

First of all, both books say exactly the same thing, have exactly the same theme, and are of about equal length. So, what’s the difference? The Fountainhead focuses on a single protagonist, Howard Roark, as he struggles to succeed as an individualist. There are only a handful of main characters. The “big monologue” comes toward the end in a courtroom scene, but I won’t say more and spoil it. Atlas Shrugged spans years of time and focuses on a huge group of protagonists represented in the book as genius industrial innovators. There are literally hundreds of characters to keep up with, but they all fall into a few well-defined classes, so it’s not too difficult to keep track. She called these classes “prime movers” and “moochers,” although there are shades of grey, specifically characters I would classify as “the wretched” or “the fallen.” The “big monologue” comes toward the end as a radio broadcast from John Galt. In both, the “big monologue” is kind of the kernel of ideas, much like a college thesis embedded in the novel.

While The Fountainhead was inspired by the life and work of Frank Lloyd Wright and his modernist, “form follows function” architectural style, Atlas Shrugged feels more like social science fiction. Much like George Orwell did in 1984, Ayn Rand takes the world that she knew (specifically 1950s McCarthyism) and attempts to push it forward by a couple of decades. She takes the complex political question of communism vs. capitalism and frames it simply as a social “good vs. evil” struggle. Though she sets both novels in New York City, The Fountainhead never leaves NYC, while Atlas Shrugged is a story that spans the entire U.S., specifically Colorado and Washington D.C., and also Mexico (there is even a “pirate on the high seas” character).

Most importantly, Ayn Rand would probably tell you the same thing. Look at the progression of her work: 1936’s We the Living is a semi-autobiographical account of a young woman growing up in the worsening conditions of Soviet Russia. The American public, increasingly warming to the concept of communism (particularly the burgeoning, left-leaning movie industry), rejected it as propaganda. 1938’s Anthem was a short sci-fi novella about a human race in the distant future so collectivized that society took on a form akin to an ant colony, where people had lost the faculty of recognizing themselves as individuals apart from society. 1943’s The Fountainhead expanded this idea into a full-fledged, well-crafted, novel-length story set in the present day. According to Rand, she began journaling Atlas Shrugged the moment she finished The Fountainhead, because she envisioned a much grander stage upon which to set her story. In 1957 she achieved this, delivering another full-length novel, Atlas Shrugged, with exactly the same theme, though an order of magnitude greater in scope, clarity, and power. Then she stopped writing fiction. For the remainder of her days, until her death in 1982, she wrote prolifically on her worldview in the form of non-fiction philosophic treatises. From her own point of view, she had accomplished her lifelong quest to produce her masterpiece, that seminal work that summed up her ideology for posterity.

This places her in the ranks of a very small subclass of artists like Robert M. Pirsig and Harper Lee, who, upon finishing what they believed to be their masterpiece and receiving the critical acclaim they desired, retired from writing fiction. They had something to say, they said it, and that was the end of it. Ayn Rand strikes me as one who grew up in ideological and economic poverty, came to this country and worried it was going in the same direction, and dedicated her life to correcting public opinion on the matter. Although the U.S. left’s sentiment for communism would not wane until the collapse of the USSR in 1991, I believe she played a prominent role in speaking out against it and preserving our more historically capitalist ideals. I once heard a quote that went something like “An artist is a lover of art who sees a void in the art world and feels compelled to fill it.” In that sense, she was an artist, through and through.

By all means, read all of her books, but know that her ideas crystallized as she got older and thus each of her books was “better” than the one before it. (Also, as an aside, the 1949 movie The Fountainhead with Patricia Neal and Gary Cooper is masterful (despite the cheeky trailer), in part because Ayn Rand wrote the screenplay. The recent Atlas Shrugged movies were terrible (despite the very good trailers).)

So, if you’ve heard a lot about her or her books and are looking to see for yourself what she’s all about, that’s my recommendation. Why anyone would voluntarily sit down and read a fifteen-hundred-page book is a tough question to answer. I hate long books. But, contradiction that I am, Atlas Shrugged is not just my favorite long book, it’s one of the most influential books I’ve ever read. It’s one of those rare long books where every page is engaging, and it’s difficult to imagine cutting it down without sacrificing its quality.

The Long Peace

Review of The Better Angels of Our Nature: Why Violence Has Declined by Steven Pinker, Chapter 5

(Credit: All block quotes are excerpts from the book.)

Chapter 5: The Long Peace

“War appears to be as old as mankind, but peace is a recent invention.” ~ Henry Maine

Name five 20th-century wars as quickly as you can. Go! … Pretty easy, right? Now, name five 17th-century wars as quickly as you can. Go! …(crickets)… Why is that so hard? Don’t you know that the 1648 Treaty of Westphalia ended the Thirty Years’ War, which took approximately seven million European lives? Well, don’t feel too bad. A lot of people don’t know it either. Nor have they heard of Russia’s “Time of Troubles” in the early 17th century, which took another five million, or the Fall of the Ming Dynasty (also 17th century), a conflict whose death toll runs to around twenty-five million. You’re getting the picture.

To take some random examples, the Dano-Swedish War (1516–25), the Schmalkaldic War (1546–47), the Franco-Savoian War (1600–1601), the Turkish-Polish War (1673–76), the War of Julich Succession (1609–10), and the Austria-Sardinia War (1848–49) elicit blank stares from most educated people. (p 341)

This is all indicative of a phenomenon called “historical myopia,” and professionals are as guilty as laypeople. Recent events weigh on people’s worldview more heavily than events of the distant past. The result in this context is that people almost universally believe that we live in the most deadly, violent time in all of human history. This is simply, factually not the case. You may know that World War II claimed more lives than any single war in world history, fifty-five million. But you also know there were many more people alive in the 1940s than at any previous time. As a proportion of world population, World War II ranks ninth in deadliness among the major conflicts in human history. What takes first?

The worst atrocity of all time was the An Lushan Revolt and Civil War, an eight-year rebellion during China’s Tang Dynasty that, according to censuses, resulted in the loss of two-thirds of the empire’s population, a sixth of the world’s population at the time. (p 294)

Don’t worry. I hadn’t heard of it either. Thirty-six million 8th-century people perished. For comparison, had that conflict taken place in 1940, when the world population was far higher, the equivalent death toll would have been four hundred twenty-nine million. The point is, our intuitive narratives of history are based on our perceptions, and our perceptions are skewed myopically toward the present. When we adjust for this skew and create a narrative of history based on actual data, a new trend comes to light: the frequency, duration, and deadliness of wars, worldwide, have been in steady decline from the ancient to the modern world. This trend is visible whether you count the casualties individually or as a proportion of the population. To rephrase, your chances of dying at the hands of another human being, whether on the battlefield or otherwise, as opposed to a natural death, are lower than at any time in human history.

So again, why has war declined? Nobody knows for sure, and again, it’s probably a complex combination of technology, the capacity for empathy, globalism, democracy, medical science, and humanism. If you ask a typical person today “Why did World War III never happen?”, you’ll generally hear that weapons have become so advanced (nukes, drones, biological) that a large-scale war would be too damaging to justify any standard war motive. While that may be partly true, the trend toward peace has been in the works a very long time. In fact, the “weapons too deadly” argument (besides being slightly oxymoronic) fails in application. Most historians attribute the high casualties of the American Civil War to advances in riflery, which shows that nations war on in spite of higher casualties. Also, World War I was iconic for its extensive and horrifying use of poison gas, which so revolted leaders that World War II was fought entirely (on the battlefield, anyway) without it. So, nations do go to war while leaving their deadliest toys at home.

Which brings us to the big one: nukes. No country has used nuclear weapons in war since 1945. You may invoke deterrence and Mutually Assured Destruction, but for the first few years of their development, there weren’t enough in existence for MAD to work. Even as the world’s nuclear stockpile grew, the length of time the world has gone without using these weapons ticked ever upward, so much so that small countries have given up their nuclear programs while bigger ones have started dismantling their stockpiles. While “fear of fallout” can’t be completely discounted, the fact of the matter is that war as an institution is fizzling out. It is increasingly unpopular, decreasingly deadly, and marginalized to the poorest and least stable nations in the world. Phrased another way,

If one were to calculate the amount of destruction that nations have actually perpetrated as a proportion of how much they could perpetrate, given the destructive capacity available to them, the postwar decades would be many orders of magnitude more peaceable than any time in history. (p 368)

Historians call the period from 1945 to the present “The Long Peace.” That phrase is not intended to be disrespectful to the millions who died in the Korean War, Vietnam, Kosovo, Iraq, or Afghanistan. It is meant to signify that the world powers have transitioned from constantly feuding neighbors to international peacekeepers, and that interstate war has changed from an institution “needed as a cleansing and invigorating therapy for the effeminacy and materialism of bourgeois society” to an increasingly unnecessary evil.

Again, history is not driven by physics equations, and no trend is guaranteed to continue, so the most important lesson here is not that we can breathe easy and ignore the sufferings and threats in the world today. The lesson is that as a species, we are doing something right, and it is important to identify those somethings (multi-country coalitions, democratic self-government, aversion to torture, women’s rights, etc), so that we can keep doing them.

If all of this sounds intriguing, even counter-intuitive, watch Pinker’s TED Talk for a wonderful summary of the book.

The Humanitarian Revolution

Review of The Better Angels of Our Nature: Why Violence Has Declined by Steven Pinker, Chapter 4

(Credit: All block quotes are excerpts from the book.)

Chapter 4: The Humanitarian Revolution

“Those who can make you believe absurdities can make you commit atrocities.” ~ Voltaire

Iron Maiden. Breaking on the Wheel. Drawn and Quartered. Burned at the Stake. Crucifixion. Impalement. Disembowelment. Put in Stocks. Beheading. These are not pleasant images to think about, but they were common punishments in Europe for centuries, even in response to such tame crimes as working on the Sabbath or insulting the crown. Why do these punishments no longer exist in modern times? If your answer is along the lines of “those were unenlightened times” and “we have a constitutional amendment protecting us from cruel and unusual punishment,” then you already know about the Enlightenment. But what caused the Enlightenment? For that matter, what caused human beings to be so cruel in the first place?

One attempt to answer the latter question is the “life is cheap” hypothesis. The idea is that life expectancy was so low, natural death and disease so common and horrific, and nature so forbidding, that primitive peoples were violent by default, because life did not carry with it particularly high premiums:

“Their primitive world was full of dangers, suffering, and nasty surprises, including plagues, famines, and wars. It would be natural for them to ask, ‘What kind of god would create such a world?’ A plausible answer was: a sadistic god, a god who liked to see people bleed and suffer.” So, they might think, if these gods have a minimum daily requirement of human gore, why not be proactive about it? Better him than me. (p 211)

This is an interesting theory, and it seems to jibe with the quantitative trend. As medical science advanced, quality of life improved and the valuation of life increased, so people were less likely to destroy each other. It’s a cute theory, but still a bit circular: Did medical science cause a humanitarian revolution, or did an appreciation for human life spur on medical science?

Between the sixteenth and eighteenth centuries, several prominent movements began, spearheaded and recorded by the likes of Adam Smith, Locke, Hobbes, Voltaire, Kant, Spinoza, Rousseau, and Newton. You probably recognize every name on that list without ever having read their books, and that’s okay! Reading their books now is a very academic exercise, because the kinds of concepts they advanced are so ingrained in our modern worldviews that they seem almost silly and self-evident. This was not the case in medieval Europe. Daring to posit that a witch being burned at the stake was in full possession of a soul, a sense of pain, a feeling of fear, and the love of her family was not only revolutionary, it was blasphemous.

Here are just a few of the moral issues tackled during this time period, culminating in several full-on abolition movements: slavery, dueling, cruel and unusual punishment (torture), debtors’ prisons, separation of church and state, animal cruelty, capital punishment, corporal punishment, witch-hunting, women’s rights, the scientific revolution, democracy, free market economy. Because of the timing, several of these issues (with one very notable exception) were explicitly dealt with in the American Constitution.

So, what caused such a sudden explosion of moral and reasoned social change? Most of us remember from our schooling that it was a time of “rediscovery” of the ancient classical thinkers and a renunciation of the crooked, oppressive Catholic Church. While there are many complex and interconnected causal threads to this intellectual awakening, there is one “exogenous” cause that Pinker discusses at length in this chapter: books.

Johannes Gutenberg invented the printing press in 1440, dramatically reducing the production cost of books. Over the next century, its efficiency improved even more. Suddenly books became big business as swaths of people from lower and lower socioeconomic strata could afford them. Literacy rates skyrocketed. But what exactly is the connection between reading and humanism? The most notable emotional change is empathy. Books teach us to see the world through different sets of eyes, effectively lifting us from our own bodies and placing us in others’ temporarily. Once this concept passed from spooky to commonplace, people began to empathize with the burning witch, asking for the first time in history what that person must be feeling, and how they themselves would feel in her place.

So, if the Enlightenment was so overwhelming, how can one explain the atrocious violence of the 19th and 20th centuries? First of all, as the next chapter demonstrates, the 19th and 20th centuries were nowhere near as violent as the Middle Ages. Secondly, history does not obey neat, tidy physics equations. Trends ebb and flow. Humanitarianism rose and tended to erode some of the disgusting practices of times past, but there were other trends at work in the world too. The one historians most often blame for the two world wars: Nationalism. There was a time when borders changed so frequently that peasants never thought of themselves as a national or ethnic group. Once those borders stabilized, people began to take pride in their “blood and soil,” their common languages, religions, and values. This culminated, unfortunately, in interstate wars at unimaginable scales. No longer were two rival princes recruiting knights to battle over a barony. Now, generals were conscripting entire male populations to battle neighboring countries for dominance.

Despite these setbacks, the point is that attitudes about government change over time. The differences are stark between tribal leaders, noblemen, kings, and senators, and the attitudes of their corresponding flocks. Historically, we elide these differences and innately assume that the power dynamic between rulers and people has stayed more or less the same. Although each development added or subtracted from the overall bloodthirstiness, the landscape we see today was shaped by the thinkers who were considered radical in their time.

Instead of taking government for granted as an organic part of the society, or as the local franchise of God’s rule over his kingdom, people began to think of a government as a gadget—a piece of technology invented by humans for the purpose of enhancing their collective welfare. (p 245)

If all of this sounds intriguing, even counter-intuitive, watch Pinker’s TED Talk for a wonderful summary of the book.