What does DNA mean?

After several decades of gradual formation, a deep understanding of evolution was first published in Darwin’s On the Origin of Species, in 1859. The ideas in the book then gradually seeped into cultural, social, and political consciousness. After forty years of this seeping, some began to write about competition for survival among human types. Twenty years later, serious support emerged for proposals to cull supposedly “weak” people from the human herd. Twenty years after that, notions of racial superiority led to mass genocide. The ideas of Darwin were then banished to the social and political wilderness. Yet the 1980s and 1990s saw new arguments and evidence for the concept of a “human nature,” features that have been hard-coded into us by the evolutionary process. More than a century after the book that brought evolution to light, with lots of mistakes on the way, we are still processing the idea in its social, cultural, and political dimensions.

The example of evolution tells us that discoveries about the fundamental nature of existence can take a very long time to make their way into the world of our daily lives. We are about forty years into processing the reality of the Big Bang, the fact that our universe actually had a beginning at a discrete moment. We’re only just beginning to work through virtual reality. There are many examples, discoveries that may take centuries to unpack.

Where are we with DNA? Most of us credit Watson and Crick with the discovery of this curious acid in 1953, but they did not so much discover it as identify its structure, the double helix. Watson and Crick were also not the first to associate DNA with heredity. That honor goes to Oswald Avery, Colin MacLeod, and Maclyn McCarty, who reported in 1944 that the hereditary information in bacteria was apparently transmitted by the cell’s DNA, not its proteins. The work of these scientists has led, over the ensuing decades, to a richer understanding of the genetic basis of the human person.

We now understand that DNA is code.

This is rather profound. It means that a person is not the matter of his body. The matter changes all the time. What keeps the person as the person he is, is not a constancy of matter, but the constancy of the code that builds the matter. That code is contained in his DNA.

DNA holds the code of a person, a single person, and no two people have the same code. This means that no two people are the same. Moreover, we have learned that the environment can affect which parts of the code express themselves. This means that even clones are unique. They may bear the same code, but the code will operate differently due to different circumstances.

What makes a person, then, is the interaction of his unique DNA with the world around him.

Our world is still trying to process what this means, culturally, socially, and politically. These scientific discoveries make it completely clear, for example, that a new person is created whenever a new DNA is created. A person is not primarily made of matter; a person is a code acting on matter. Once a new code starts acting on matter, a new person exists. Life begins at conception. Our world is still not ready to accept what this means.

Code is immaterial. Infinite. Immortal. In the case of DNA, the code that makes us is a set of instructions rendered on a piece of matter, a harmless acid. The particular piece of matter can be destroyed; cells die. But the code is forever. It is notional, conceptual, intangible. It does not live here in this world; it is only rendered here. Code lives in the Platonic realm of the forms. If we are real at all, we are real in our combination of code and matter. A person is the rendering in the physical world of a code that exists outside the physical world. Our bodies are born and they die, but our code, our unique selves, existed before our bodies were here and will continue to exist long after. In fact our code does not participate in the time-world at all. The thing that makes us unique is immortal. Again, our world is still working on what this must mean.

A person is code acting on matter. This means a person is neither pure spirit nor pure matter. Many philosophies say that the material world is worthless, that all important things are part of our minds. Many others say that only the material world matters. Still others admit that the human person has spiritual and material aspects, but they are distinct and, according to some, at war with each other. Science now tells us that none of these views are correct. The human person combines the material and immaterial. But in our world, we still hear voices saying that our minds create who we are, or, conversely, that we are nothing but particles. Or, that we have a good, spiritual side and an evil, material side. We may take many decades to understand that we are a fusion of matter and spirit. We may take a long time to recognize that our physical bodies are fused with and partake in the eternal.

We partake of immortality, being dignified in spiritual and material personhood from the moment of conception. May this scientific knowledge continue to seep into our world.



Ancient wisdom, common sense


A cool kid plays the choking game.

Lately the two terms in the title of this post have been running through my head quite a lot. When you have kids (and students), you find yourself searching for simple ways to express deep thoughts, and I have been trying to think of a simple way to explain why one shouldn’t necessarily go along with the popular movements of the day. It seems as though – especially where I work, at the university – the minute someone labels an idea as being on the “right side of history,” there is instant pressure to demonstrate support for it. You know, put a sticker on your car, change your Facebook portrait, wear the correct T-shirt, avoid using certain words, and start using others (but not too much or it’ll look fake). What if you don’t go along? What does it mean? There may be good reasons to object to any particular movement, but I think there are good reasons to stand back, at least for a while, from any movement like this. It’s not always good to go along. If you’re asking me to put my heart behind something, it’s not enough to say that “everyone” believes it is on the right side of history. It’s important to have high standards about the things you support.


Highly effective.

To put this more simply, two standards that have come to my mind recently are ancient wisdom and common sense. The ancient wisdom idea says, check the claim that a certain movement is on the right side of history. How much history are they talking about? It seems to me that many of the ideas labeled this way have come about only since 1950, although a few of them date to 1800 or 1700. Well, humanity has been around for a lot longer than that. Take for example the proposal of a truly secular society, one where religion is available only in private places and has no voice in the public square. I don’t know of a society where that has happened naturally, and recent attempts to impose it by force have not ended well. Churches thrive again in Albania, as though Communism never happened.

Common sense says, keep the theory to no more than a couple of steps. What I mean by that is, if your train of reasoning contains more than one or two “which implies that” phrases, the conclusion is probably unreliable. I’m talking about human affairs here, of course, not mathematical proofs. Purely conceptual thinking about human affairs dries up after a couple of implications. An example: Factory owners are selfish, therefore they will pay their workers the lowest wage possible, therefore the workers will do better if they own the factory themselves. That’s one too many therefores. Common sense says that running factories to the benefit of the people who work in them isn’t a simple matter of putting employees in charge. Which employees? Who owns what? What is the correct wage? Theory can’t answer these questions, at least not definitively. Common sense tells you that the situation is too complex.

Is this a conservative way of thinking? I suppose so. But conservative voices have their place in the conversation; it’s not terribly bad if a few people say, “Hang on a second.” I admire the work of Jonathan Haidt, who has done so much to clarify the differences between conservative and progressive ways of thinking without condemning either. The current election could certainly use some ancient wisdom and common sense. Who knows? Maybe the follies of our time will bring more people around to this way of thinking.




Publishing games is hard, but I have to try

They say we have to keep on our feet as we age. Somehow over the past 15 years I have wormed my way into a job as Professor of Media specializing in game design. I’ve never published a game, not professionally anyway. Not for money. Not with full review and development, you know, REAL publishing. It makes me feel inadequate, like I am pulling a trick on the world. Who am I to be teaching in a game program when I haven’t done the thing I am teaching?

I have some backdoor justifications, but they seem awfully weak. I could argue that my qualifications come from my training in economics and political science, where I did so much social modeling… and making models is the same thing as making games, right? Well, yes and no. It’s the same thought process, certainly. You have in your mind a certain human social behavior that you want to reduce to its core forces. You write down some rules for how the system works and then derive what happens when the system is up and running. It’s basically the same thing as designing a game and a set of AIs to play it, then analyzing the outcomes.

Let me give an example. In a market, the rules of the game are that the consumer can’t get anything unless he pays for it. The setup is, the consumer has an income Y and goods are offered at prices P. The quantity purchased is X. Y is a scalar; P and X are vectors. This means the consumer’s purchases X must satisfy PX <= Y. That’s the game. Here’s the AI. The consumer’s goals are specified by a function U(X) with U' > 0 and U'' < 0 for all arguments. The consumer seeks to maximize U(X) subject to PX <= Y. A basic optimization problem… get your engineering book out and solve. The result is a demand function X(P, Y) that tells us – the designers – what the consumer / player will buy given the game we set up. If we designers change prices or income, the consumer / player will react accordingly.
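The setup above can even be sketched in a few lines of code. As a minimal illustration (my assumption here, not in the original: a Cobb-Douglas utility, U(X) = Σ aᵢ log xᵢ, which satisfies U' > 0 and U'' < 0 and has the closed-form demand xᵢ = aᵢY/pᵢ once the weights are normalized):

```python
# Demand function for a hypothetical Cobb-Douglas consumer:
# U(X) = sum(a_i * log(x_i)). Maximizing U subject to the budget
# constraint PX <= Y gives the closed-form demand x_i = a_i * Y / p_i
# (with the weights a_i normalized to sum to 1).

def demand(prices, income, weights):
    """Return the utility-maximizing bundle X(P, Y)."""
    total = sum(weights)
    return [w / total * income / p for w, p in zip(weights, prices)]

# The "AI" reacts to whatever game we set up: change P or Y and re-solve.
bundle = demand(prices=[2.0, 4.0], income=100.0, weights=[0.5, 0.5])
print(bundle)  # → [25.0, 12.5]: each good gets half the budget

# The budget constraint binds at the optimum: P·X == Y.
print(sum(p * x for p, x in zip([2.0, 4.0], bundle)))  # → 100.0
```

For a general U(X) without a closed form, the same exercise is a numerical constrained-optimization problem, but the designer’s logic is identical: specify the rules, specify the goals, and derive the behavior.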

This is so close to game design thinking… but not quite. Game design (as commonly understood) has the additional requirement that the process of choosing has to be enjoyable for the consumer / player.

Now, economists would say that this common understanding, that games have to be enjoyable, is unnecessary for one of two reasons. The first would be that the only point of the analysis is to predict behavior. Game theory, they would say, is about analyzing games that are played, not making games that people want to play. Fair enough. If I use “game design” in the way my students use it, though, it means they expect me to know not only how to analyze games that are made but also how to make new games that engage people. Because I am not an economist pure and simple, but a media professor, I have to think about engagement.

The second reason economists would balk at engagement is that there should be no effects outside the model. The idea is to construct the utility function so that the player’s goals are fully accounted for, including any desire they may have to actually play. But this gets us into a nasty nesting problem. In order to take account of engagement in a theoretically consistent way, I would have to embed the consumer demand game I described above inside of a bigger game of choosing games. What a mess. It’s not a mess for common sense, but it is a nasty mess for mathematical model construction. My students are not interested in the design of metagames that mathematically depict choices within the game of choosing games. They want to make games that people want to play. And the most helpful way to handle that engagement is to keep it separate from the task of designing the game system.


Thus my economics background is really only a half-justification for being a game design professor. I can’t tell my students that game theory is all they need, nor can I show them a good theory of game design that includes psychological satisfaction. So I have to become a game designer as it is commonly understood: a person who makes an interactive artifact that people like and play, one that receives notice from publishers and critics and the public. This is the same standard we apply to professors of English who teach creative writing. They can give lectures about the craft of writing creatively, but they also have to write something and publish it in some way in order to be credible.

I need to publish games, but it is hard to jump into a completely new area of performance. I spent my early career writing (bad) research papers. Then I wrote books. Now I am trying to design and publish games. It’s hard. My coding skills are not so good, and my first love is board games, so I have been trying for several years to get something published there. Well, this area is swamped with would-be designers. Board games are not like books and papers. In books and papers, there are specific places where you can send things. Submissions are welcomed. They are often rejected, but at least you get a review and some comments. Then you send your work somewhere else. There’s no such thing in board games. Publishers are overwhelmed with submissions and don’t have the time to wade through them all. If you’ve never published before, they have no reason to review your work. It is very hard to get a publisher to even look.

As a result, these last few years have been humiliating. It’s like being young again, in a bad way. I keep getting rejected over and over! I try to talk to publishers at conferences and I keep getting the cold shoulder. I stand there for 30 minutes waiting to get one word with someone, only to get a brusque “Nah, nobody would publish that game.” A couple of publishers took my prototypes and I never heard from them again. Not even a rejection letter. It reminds me of the early years, trying to get big-time professors to notice my (bad) little papers. And failing… ouch!

In these moments my heart goes out to everyone trying to launch a new career or a new direction. These challenges to the ego… truly a test of inner strength … after each rejection to get up and try again. Very hard. Keep going!

Yes, keep going, because there may be a light at the end of the tunnel. After years of quietly trying and failing, I may finally have something in the hopper with a game company. Not yet! But closer.

Even though I have not published a game, trying to do so has still been the right thing to do. The journey has taught me so much about how board games are made. Real-world insights, not stuff from game theory. And I guess this is how a professor can sleep at night. If you have given your whole life to your subject, until your heart is bleeding, then you have some credibility in the front of the classroom. You may not be the biggest success in your field, but you do know what you are talking about, because you have lived it.


Hope for old men without hope


I like to have lunch with buddies and lately, these lunches have gotten pretty morose. My friends are a broad mix of ideological and philosophical types, but they’re all experienced – you know, old. And nobody’s excited or even optimistic.

I spent a lot of time studying public policy, back in the day, and I agree that there do not seem to be any reasonable policy solutions for the things going wrong. We are all standing around yelling at the government to do something, but there’s nothing it can do. Government can’t change culture. It can’t change anyone’s philosophy or their life commitments. It can’t make people behave better. And, you know, it can’t change the facts of the planet and technology. It can’t lower the oceans or stop the robots from taking over.

Is this a moment for reflection about what we expect from this world? Where does hope come from, anyway? The world has never been fully bad or good, just mixes of it. If we are hoping for a perfect world, we’re going to be disappointed. And if we thought the world should always get better, without ups and downs, well, that view of Progress was dispelled by the 20th century, wasn’t it?

In this dark hour, perhaps it is worth considering whether there’s hope that doesn’t come from the affairs of the world. I believe there is. May we all find it!


This sad season

In a WSJ article, Peggy Noonan writes of the sadness the political season provokes in her soul. Me too! Something seems awfully wrong.

The historian in me says that it is nothing new. Nations go through long periods of ineffective, bad, even tragically horrible leadership. It is not uncommon. Sicilians have been ruled by all kinds of people, but Sicilians are still around, still Sicilian.

Nonetheless, things are not pretty today. Candidates today represent impractical, unreasonable approaches to policies; a casual attitude toward morals, even crime; and a lack of serious intellectual heft on the right. It is sad to see the country I grew up in sink to such a low level in its political conversation. Kicking around in my head for causes, I encountered one that hits quite close to home, uncomfortably so: I fear that universities are partly to blame for all this.

In 1950, the American university was a place of learning that people around the world respected. The GI Bill which encouraged higher education was received by common people as a gift, a truly good thing. There was no doubt in anyone’s mind that what happened at universities – whatever it was – was good for a person. It made a person better. Going through a university education was a sure way to rise up in your life. Universities were respected that way.

No longer. If you talk to the average person outside a university today, they are quite skeptical about us professors and the job we do. It is certainly not assumed that we have anything valuable to teach. Indeed, more often than not, the concern of a parent is not what kind of positive effect we will have, but rather, how to protect their child from us. People “out there” have come to see universities as having three basic kinds of experiences: Drunken orgies; radical indoctrination; and training for a job. They generally want their kids to get the latter and avoid the first two. When I say I am a professor, the line of questions and conversations seems to develop around which of these three experiences I represent. People seem to wonder, is this guy one of those crazy critical theory nuts, or does he teach those huge, meaningless, easy courses whose purpose is to clear time for more drinking? Or does he teach anything useful? The idea that I might be a model for their child, a pursuer of great and good ideas, abstract ideas that are unquestionably good to know, is not part of the thinking. Something has been lost.

These attitudes may or may not be very accurate, but they are not illusions. The culture of the campus has changed over the last 50 years. No one can deny that. And I fear that these changes have played a role in creating our current politics.

Here’s the connection I am seeing this morning: Over the last half-century, the American university lost touch with the American people. On the Humanities side, every college tried to become a little Paris, populated by sullen bohemians criticizing everything the world considers normal. On the Science side, a little Frankenstein lab, populated by geeky obsessives too immersed in their work to be involved in anything that, again, the world might consider normal. Between manic nerdiness and moody criticism, the American university lost touch with important strands of many folkways.

One lost connection is to the American Thinker. By this I mean purveyors of homespun wisdom, of which America has produced many fine examples in the past: Ben Franklin, Flannery O’Connor, Mark Twain, the Spencer Tracy character in Judgment at Nuremberg, Ernie Pyle, Laura Ingalls Wilder. You might call it farm-bred wisdom with an intellectual punch. Will Rogers: “I joked about every prominent man of my time, but I never met a man I didn’t like.” Humble, honest, pragmatic, with lines best delivered with your hat tipped back and a stick of grass hanging out of your mouth. Committed to ideals but incapable of being riled up to do anything nutty. Placid and skeptical, but warm and forgiving.

True, people like this were usually self-taught, not produced by the university system. Yet this mode of being used to be such an important part of our culture, and one might hope that our universities would be places where this kind of calm pragmatism is honored, respected, and emulated to some extent. But it is nowhere to be found on American campuses today. Instead, haven’t we all become awfully strident? (Me too, of course.)

A second connection is to historical American conservatism. I’m not talking about the “conservatism” of contemporary politics, but rather, the core ideas of down-home American ways of being. These would include a commitment to self-reliance, limited government, and small institutions of the family, religion, and locally-owned business. I think of it as a small-town farmer’s approach to policy. These ideas have traditionally been voiced by the right, but they are also represented to some degree in the folk movement and other rural Americans who try to live in a simpler way.

Think about it: For 50 years, these ideas have had no voice at all on American campuses. On the contrary, it is well-documented that the American university is dominated by the left side of the political spectrum. The domination is so total that we now have a counter-movement that argues for “viewpoint diversity.” The argument by these admittedly left-of-center academics is that whatever we may think of the right, surely its message of limited government and self-reliance, of simple virtues of family and church, has a place in the conversation. We are all better off if that place is occupied by well-formed, honest thinkers. By denying this quintessentially American way of thinking its place on the campus, for decades, we have guaranteed that it presents a raw, even savage, voice in our politics, rather than a refined and sober one.

And so here we are. One party seems enamored of a European model that probably doesn’t fit America as a whole (although there are parts of America where it would be great). That same party’s institutions, however, have the feel of corrupt bureaucracies: FIFA for government. The other party has terribly weak leadership, both practically and intellectually.

This combination of radical Europhilia and oppressive bureaucracy is quite familiar to those of us who work at universities, as is the complete absence of reasonable conservative voices. But it is not familiar to America. America’s culture is not like that. Our roots do have a strong conservative tone: Families, religion, hard work, limited government, humility, pragmatism. That way of thinking is still around in large parts of the population. But our elites, following our campuses, have gone off in a different direction. While lots of people like that direction, the result has been an estrangement between the people who run the country and many of the people who live in it. Estrangement brings sorrow; and so it is a sad time.