Summer, Fall, Winter, Spring, Summer

One year ago, we vowed.

My wife is as warm as June.

As happy as July.

Steady like an August moon

Or morning kiss from September sky.

October breaks just as her light.

Her laugh is November gold.

Her spirit gentle as December’s white

and her calm a January glow.

Shy like February’s count

But her smile is the hope of March.

Gives of herself like an April fount

In May, to my unworthy heart.

I love you.

Do Kids Need Social Media to Succeed?

When you think of the things children need to succeed later in society, you probably think of a good education, a stable family life, and lots of love and emotional support. One thing you probably don’t think of? Social media. I’m not sure I’ve ever read a parenting book or a sociological report on childhood well-being that emphasized how important social media and iPhone apps are to a kid’s personal and economic flourishing.

Well, there’s a first time for everything, I suppose.

At The Washington Post, Crystle Martin and Mimi Ito, two researchers out of the University of California at Irvine, present their case that “digital inequity” is a serious social threat to the upward mobility of children from lower-income families. Most of the article is socioeconomic research that comports well with common sense: Wealthier families have more money to spend on things like smartphones and carrier plans, which in turn means kids and teens from those families are statistically more likely to be using the social media platforms on those phones than kids from families with less money in the budget.

Seems pretty logical to me. I’m not entirely sure why extensive academic research is necessary to confirm this, but there you go.

But the raw data alone is not the point of the research. Martin and Ito present the information in order to argue that there is a serious social and economic disadvantage to be suffered from not having access to the same social media platforms as wealthier Americans. “The emerging smartphone divide is troubling,” they write, and it must be addressed:

Teens’ access to Snapchat and Instagram may not seem like something we should be terribly concerned about, but it is an indicator of deeper and troubling forms of digital inequity. Social, digital, and networked media use is where young people gain everyday fluency and comfort with the technology and social norms of our times. Whether it is managing a LinkedIn network or learning to code, young people who lack digital fluency and full access will always be a step behind their more connected peers.

If you’re not reading carefully, the words “digital fluency” might race past you as you find yourself reluctantly agreeing with the authors’ conclusion. After all, aren’t the computer and the internet both integral parts of our modern economy? Isn’t an inability to use email and a web browser an obstacle to employment, even in the most non-tech industries?

Well, yes, but that’s not what is being talked about here. Rather, the researchers seem to be equating familiarity with smartphone-only applications like Instagram and Snapchat with “fluency.” They apparently believe that the social and economic obstacles that encumber Americans who aren’t good with computers or the internet also await young children and teens who aren’t able to harness the latest hardware and applications that their wealthier classmates might be using.

This kind of prognostication faces an obvious problem: How do you ever end up with reliable data when the products are so new and diversifying so quickly? The research presented in this piece relies heavily on projection and a functional equivalence of social media with other digital tech like email. Why accept that equivalence, though? Facebook is wildly different, both in form and function, from what it was when it opened to the public in 2006. MySpace, once considered the crown jewel of social media, is decrepit, and the much-hyped Google+ is almost totally irrelevant to the average American teen. Forecasting what tomorrow’s social media platforms will even be is difficult enough; predicting their economic impact seems like a fool’s errand.

But that’s not this article’s biggest problem. Its biggest problem is its unqualified view of social media as something that automatically enriches a child’s life.

Let’s assume the research’s premise for one moment, and imagine that it really is true that not having an iPhone or Galaxy disadvantages teens in their future economic mobility. The next question should be: So what? Does an economic disadvantage automatically trump any and all other concerns that parents and families might have about allowing teens (and younger) unfettered access to social media?

And there are many such concerns. Of course there are the usual ones: Sexting is pandemic among American teenagers, and it is well known at this point that compulsive social media use (and many teens know no other kind of use) can have serious effects on mental health. But even beyond this, there are worthy questions about why young people need to be trained on social media at all. After all, the astronomical usage rates of “connectors” like Facebook and Twitter are accompanied by the near-universal acknowledgement that our culture–particularly Millennial culture–is marked by pathological loneliness and personal fragmentation. The stated goal of social media is to connect people with each other. If it is failing in that goal–and our reasons for suspecting it is failing are mounting–then why is it necessary?

Rather than assuming that the latest novelties from Silicon Valley will dictate our children’s futures, we should empower parents and churches, of all income levels, to take a greater stake themselves. How many teenagers spend hours on Twitter, enchanted by the most banal and transitory “Trends,” because they are left to themselves without any inkling of the delights of imagination and wonder, delights that exist right outside their window? Are many of the children I see in restaurants glued to their iPads having their mental and moral faculties shaped by corporations, merely because their parents are too absorbed in their email and Facebook notifications to notice?

It is one thing to submit that economic flourishing benefits the young. It is altogether reasonable as well to suggest that the youngest generation be trained in the tools of the modern economy. But it is quite another thing to urge parents to hook their children up to the most dehumanizing and trivial portals of diversion, merely out of the fear of having an incomplete resume.

If raising people who are capable of living healthy, rich lives apart from the soft blue glow of digital enchantment means a slightly thinner college application binder, so be it. One’s life, after all, does not consist in the abundance of possessions–or followers.


Movie Review: “Captain America: Civil War” (2016)

The best superhero film of the millennium (thus far) is Christopher Nolan’s The Dark Knight. It’s a brooding masterpiece, drenched in noir and teeming with the questions of life that we face every day. That its hero is a comic book warrior is almost irrelevant; it is a film rooted firmly in the moral battles of real life.

Captain America: Civil War is not as good as The Dark Knight, but it is closer than anything we’ve seen since 2008. It’s Marvel’s masterpiece and one of the best films of the year.

Surprised? Me too.

As some of you will know, I am one of those who believe that the dominance of the superhero genre (and especially the superhero sequel) right now is a weakness, not a strength, of the film industry. I’ve said before that the way franchises have consumed the movie market tends toward lower quality from studios and less risk-taking from filmmakers. I still believe that. I still believe that on any given day a sixth installment of a film series—especially a comic book one—is probably designed to help its audience expect less from a film.

But the great thing about movies is that sometimes, it all just falls into place. Sometimes your expectations and carefully thought-out analyses get broadsided by a great story, compelling characters, and bold, smart filmmakers. What’s great about movies is that sometimes you get one like Captain America: Civil War.

Civil War builds extensively on the events of Captain America: The Winter Soldier. A few years ago it was probably easy for someone who had never seen an Avengers movie to jump right into the latest installment. Not anymore. If you don’t know at least the basic universe and events of the previous movies, there’s practically nothing to grab onto here. None of the characters are “introduced” (save for two new superheroes, a familiar web-slinger and a prowling prince) and most of the action is thematically anchored in the past. This makes for an unusually intelligent and perceptive script, but a pre-movie refresher is mandatory.

Do I need to describe the plot? A quick glance at the trailer would at least explain the film’s title to you. The most important thing to know is that at the heart of Civil War is a question that haunts not just the Avengers but every superhero story I’ve ever heard: What about the humans who are in those buildings that always blow up? What about the faceless, nameless average folks who are not hero, villain, or rescued? Most movies in this genre either seem to pretend that these people don’t exist (the amount of vacant real estate in New York City is astonishing) or pretend that they can somehow withstand being caught in the middle of a supernatural apocalypse. Civil War drops both these illusions. Like Nolan’s Dark Knight, Civil War uses the mythology of the superhero to ask moral questions of its characters, and its audience.

Should those trying to save life care about “collateral damage”? Is the power to intervene for good always tempered by the potential to do harm? Who and what determines innocence? This is normally the stuff of Oliver Stone war pictures, not comic book adventures. Here is that rarity: a superhero film willing to question itself, to not drown out thought in a torrent of CGI destruction.

As Civil War opens, the powers that be believe that the Avengers, heretofore an independent, apolitical group of “enhanced” warriors, need governmental oversight. The debate amongst the heroes centers on whether their power to defend life is helped or hindered by submission to political bureaucracy. Some of the Avengers agree with Iron Man (Robert Downey Jr.) that the lost lives of innocents demand that the heroes surrender some of their autonomy; others side with Captain America (Chris Evans), who insists that such submission will only handcuff their abilities.

This may sound like another edition of the “hero or vigilante” trope so common in this genre. But where this theme is often treated with either glib humor (think Sam Raimi’s Spider-Man trilogy) or a kind of meandering sanctimony (think Man of Steel), Civil War takes it seriously and asks the audience to as well. An early encounter between Tony Stark and the mother of a young man killed in one of the Avengers’ battles is a deeply affecting and uncomfortably realistic sequence. There’s a maturity and confidence in this writing that elevates Civil War far above the level of live-action cartoon. Children will still delight in these heroes, but adults will leave thinking more seriously about a superhero’s world than perhaps they have in a while.

One thing I noticed about Civil War is that its action sequences seem more grounded and physical. I’ve seen a lot of Marvel films where the heroes defy the laws of physics in a way that doesn’t feel thrilling. Here the visual effects seem to have more humanity; the biff-bam-pow spirit of the comics is more evident than the flawlessly pixelated violence of video games. This too was true of Nolan’s Batman films (a very different sort of comic book film, of course). Except for some inexplicably jittery photography in the movie’s very first battle, Civil War features some of the best superhero battling I’ve seen in years.

Though the title says this should be Captain America’s film, it’s really another volume for the Avengers as a whole. That’s good news because Robert Downey Jr. and Chris Evans together are far and away the best asset this franchise has. Their rivalry is the soul of Civil War. Marvel deserves credit for not turning its cast into human placeholders for green screen, which would rob us of the serious talent on display here. The two new heroes are particularly well picked, and Martin Freeman has a great (though short) time as a government agent.

Not just another episode of digital playtime, Civil War offers the superhero genre humanity, thoughtfulness, and a higher plane of excitement than it has seen in a while. It all works, from the intelligent and even surprising screenplay by Christopher Markus and Stephen McFeely, to Joe and Anthony Russo’s confident direction. If future comic book films learn the lessons in craft found in this movie, our death-by-nostalgia Hollywood may yet have a fighting chance.


Debunking 3 Myths About Homeschooling

When you hear the word “homeschool,” what do you think? Do you think of underground bunkers filled with fringe evangelical families trying to avoid any infecting contact with the outside world? Do you think of rural nightmares where parents stash children at home like pets and feed them anti-government conspiracies?

If you do, you desperately need to read Matthew Hennessey’s exceptional article on the rise of urban homeschooling. It’s a lengthy piece, but worth every second of your attention.

Hennessey’s research into the rise of homeschooling within urban communities is especially helpful in tearing down some noxious myths about homeschooling families. Here are just a few that his piece thoroughly dismantles:

Myth #1: Homeschooling Is Just For Conservative Evangelical Families Wanting a Conservative Evangelical Curriculum. 

Hennessey’s essay makes a powerful case that the current rise in homeschooling, particularly in urban contexts, is not particularly religious. While the booming emergence of the national homeschool movement in the 1980s and ’90s did owe much to conservative Christian culture and politics, Hennessey notes that religious reasons are becoming less common among homeschooling families. An excerpt:

 …[T]he homeschooling population has continued to grow dramatically, while also becoming more secular. In 2002, according to a DOE survey, 72 percent of homeschooling families cited “a desire to provide religious instruction” as one of their reasons for educating in the home. By 2012, 64 percent cited religion as a motive for homeschooling; only 16 percent called it most important. “Most people assume we’re doing it for some sort of strange, creationist religious reason,” says Rachel Figueroa-Levin, a homeschooler who lives in Inwood, a middle-class neighborhood at the northernmost tip of Manhattan. “But we are stereotypical secular Jews.” Indeed, concern about “the environment of other schools” has supplanted religion as the Number One reason given for homeschooling, according to the DOE survey. Ninety-one percent of homeschooling parents cited school environment as at least a contributing factor.

As the rest of Hennessey’s piece demonstrates, families are more likely to opt for homeschooling as a response to low-quality or insufficient public education than merely to provide an explicitly religious curriculum. This undermines the common criticism that homeschooling is a cause of struggling public education rather than a response to it. Too often public education officials, teachers unions, and politicians claim that fixing schools requires roping in students who are kept out of the system for ideological reasons. But this misrepresents the homeschooling movement entirely.

Myth #2: Homeschoolers Receive a Poorer Education

Hennessey pulls some helpful data to make a convincing case that not only are homeschoolers not behind their public school peers, they’re actually leading them. Here’s Hennessey again:

Some critics claim that homeschooled kids won’t be prepared to do college-level work, but available data suggest otherwise. In 2009, NHERI’s Ray looked at the standardized test results of 12,000 homeschoolers from all 50 states, as well as Guam and Puerto Rico. He found that homeschoolers scored 34–39 percentile points above the norm on the California Achievement Test, the Iowa Test of Basic Skills, and the Stanford Achievement Test. A recent study published in The Journal of College Admission found that homeschooled students had higher composite ACT scores than their non-homeschooled peers and graduated college at higher rates—66.7 percent, compared with 57.5 percent. “In recent years, we’ve admitted ten or 12 homeschooled students” per year, says Marlyn McGrath, admissions director at Harvard, where each class numbers about 1,600.

Personally, I didn’t need to read this data to know that my homeschooling peers were ahead of the pack academically. I’ve had more public-schooled friends than homeschooled ones, but I can say with no hesitation that the homeschooled friends I had achieved more academically in college and, generally speaking, did better in their careers right out of school. And it’s not that my publicly educated friends were dimwits; far from it. It’s that the homeschoolers in my social circle weren’t just marginally better at college; they were a lot better. There are just so many resources for homeschooling families nowadays that only the completely uninformed could imagine that a homeschool education is de facto a disadvantage.

Myth #3: Homeschooling Families Need More Government Oversight

Of all the myths about homeschooling, it’s probably easiest for me to empathize with this one. It makes sense on the surface: We have lots and lots of federal standards for education in public schools. Why do homeschooling families deserve special treatment?

The first answer is that the premise behind the myth is wrong. There really isn’t a compelling reason to believe that loads of federal standards on public education do a lot of good. Regardless of where you fall on the political spectrum or what specific policies you endorse, it’s difficult to ignore letters like this one in The Washington Post, describing a public school teacher’s frustration with the invasive and pedagogically difficult standards.

Secondly, strict oversight of homeschooling families is often at odds with basic civil liberties. The line between trying to help children receive a quality education and violating personal rights is a notoriously fragile one. Listen to this story that Hennessey cites in his piece:

In November, on behalf of homeschooling parents Laura and Jason Hagan, the Home School Legal Defense Association filed a federal civil rights lawsuit against two members of the Nodaway County, Missouri, sheriff’s department. The sheriffs had forced their way into the Hagan residence after being called by a child protective-services caseworker investigating a report that the home was “messy.” The Hagans refused entry to the investigators, so the sheriffs pepper-sprayed them, tasered Jason, and threatened to shoot the family dog—all in full view of the Hagan children. The sheriffs charged the Hagans with resisting arrest and with child endangerment. At trial, however, a judge ruled that the lawmen had violated the Hagans’ Fourth Amendment rights by entering their home without a warrant.

You may think that’s an isolated, outrageous incident. But we’ve already seen an astonishing number of examples in the past year of enforced “helicopter parenting,” of legal retribution against families that don’t strictly supervise every minute of their child’s day. In my view, the burden is on critics of homeschooling to give compelling reasons why increased regulations on homeschoolers won’t result in egregious violations of personal and religious liberty. Until that concern is answered in a clear way, aiming additional legislation at homeschoolers is worrisome.

Anyway, you really should read all of Hennessey’s piece. Send it to friends, especially friends dubious of homeschoolers.

Tassels and Truth

I spent about four hours of my Monday night at a college graduation. My wife was being awarded her degree in elementary education, and she was joined by (according to the college president) 995 other undergraduates. Graduates were welcomed, inducted, charged, presented, and awarded, in that order. The night was long; speeches repeated, processionals and recessionals slogged, and of course, each of the 995 students was called, conferred, and congratulated individually.

It was a ceremony clearly not tailored to the entertainment generation or the babies of endless social media connectivity. Nor did it cater to those “radicals,” found so often on college campuses, who detest tradition and protest uniformity. Students marched in step behind large banners, signifying their membership in one of the university’s schools. Everyone wore the same traditional black gown and cap. Songs older than many US states were sung. It was, in many ways, a kind of religious ceremony, in which tradition, institution, and (academic) success made up the liturgy.

I realized at one point that for all the endless intellectual coddling and culture policing that characterizes the contemporary American university, a bachelor’s degree culminates in an event that defies such self-expressive autonomy. Graduation invites students, faculty, family and friends to believe that they are participating in something greater than themselves, to find satisfaction and joy in the idea that what they have achieved has been achieved before and will be achieved again. Yes, graduates have their names called, and yes, graduates receive their own degrees. But the entire ethos of the ceremony is one that says: “This is not ultimately about you.”

This is the opposite, of course, of what many undergraduates learn in the college classroom. We hear almost daily updates on an American university culture which at every turn empowers freshmen and sophomores to authenticate themselves through protest, rather than sit and learn about an imperfect world at the feet of imperfect people. Much of young adult life is what Alan Jacobs calls the “trade-in society,” a life of loose connection and easy escape from situations that become difficult. If institutions become ornery, if they cease to align perfectly with my individual desires and goals, then the solution is to either give up on the institution or else demand that it change.

Nihilism in higher education has been rampant for some time. But if what I saw Monday night was any indication, it has mostly failed to leave its imprint on graduation. Presidents and executive administrators sat on the stage, above the floor of graduates; no one protested this obvious hierarchy. I didn’t see any letters to the editor in the following days demanding that the school change its individualism-stifling policy on the robe and cap. Nary a thought was given to whether the school fight song, written in 1892, might have been penned by someone with questionable social or political opinions. In other words, there seems to be no pressing need to remake commencement in our sociopolitical image. The ritual is allowed to be ritual.

Why is this? Why, among all the college unrest and university politics in our culture today, is there no national movement to “democratize” commencement? Why is there no formidable backlash to its rigidity and solemnity?

Perhaps one answer is that graduation is one of the few moments remaining in our culture where achievement needs tradition. What a conferring of degrees means is dependent on what, or who, is conferring them. This is, after all, the difference between a college education and a few bucks paid to a diploma mill at a PO box. Anyone can write anything on a piece of paper. But the bigness—we might even say transcendence—of the commencement ceremony befits an occasion where students are declared graduates by those with the (trigger warning) power to say so.

A commencement invites students to become not just graduates, but alumni. That’s why so much of the chancellor’s speech on Monday was given to exulting in the university’s history and prestige. Students aren’t just receiving degrees; they’re receiving membership, a form of covenant (however informal) that ties them to a specific place and a specific body. Implicit in the commencement is the idea that people need to belong, and that belonging to something greater than and outside oneself is not opposed to individual achievement and success.

Unfortunately, from August to April, much of college life teaches the opposite. From radical deconstructionism in the humanities, to rank scientism in mathematics and biology, to the campus hookup culture—all of these coalesce into a living liturgy of lonely autonomy and hopeless self-authentication.

Is the unraveling of the American campus really a surprise? I can’t see how it is. If everything in the classroom and commons area screams that transcendence and God are nothing but ciphers for the powerful, might one eventually want to apply the rules learned about home, country, and religion to the college itself? Why be oppressed? Higher education was comfortable directing this energy toward the general culture for decades; the only problem now is that the barrels are turned the wrong way. If Lady Thatcher was right that running out of other people’s money was the trouble with socialism, you might say the problem with nihilism in education is that, eventually, you run out of other people’s safe spaces.

So the drama of higher education continues. In the coming years we will see just how strong an institution it is, as it tries to fend off the threats of digitalization, debt, and decay. It very well could be that the internet age was created for such a time as this, to rescue the university from itself and provide a generation with the knowledge and intellectual formation that a coddling college culture has defaulted on. In many ways it would be, as Ross Douthat has noted, a punishment that fits academia’s crime.

Whatever the future holds, let’s hold off on tampering too much with commencement. It’s indeed tedious and self-congratulatory. But it’s also a spark of meaning and permanence and truth in the cavernous culture of higher ed. As tassels move to the left, it could be that something much bigger moves to the right.

Of Mothers and Imaginations


In his poem “Elegy Written in a Country Churchyard,” the English poet Thomas Gray memorably reflected on the legacy of un-famous lives buried in a rural graveyard.

Let not Ambition mock their useful toil,
Their homely joys, and destiny obscure;
Nor Grandeur hear with a disdainful smile
The short and simple annals of the poor.

With tender lyrical beauty, Gray conveyed the worth and righteousness of a small, obscure life, one spent in the ordinary hum of love of God, family, and neighbor. It’s a sentiment that cuts across our fame-seeking, platform-building digital age. The idea of living and dying while the world isn’t watching fills many of us with horror. But that is the fate of so many who, Jesus said, would be called great in the kingdom.

Here’s something to consider this weekend: Of all these noble unnamed, how many are mothers?

How many women have given their lives to their children? How much thicker would the books of history be if we could record the daily love and loss of women whose hearts were with their homes? I doubt it could even be imagined. When it comes to bearing the burdens of our very humanity, surely mothers carry the heaviest and hardest loads. And yet how many of these years–or rather, how many of these lives–of sacrifice ever cue public applause or congratulations?

Meditate with me on two women, two mothers, whose names will probably be strange to you: Mabel Suffield and Flora Hamilton.

Mabel Suffield lived to be only 44 years old, dying of disease. She was widowed less than ten years into her only marriage, left to raise two children by herself. To make life even harder, she was shunned by both her family and her in-laws when she joined the Catholic Church during a time of rampant English anti-Catholic sentiment. Living in charitable housing and often relying on the kindness of priests and strangers, Mabel poured her whole self, her entire earthly well-being, into protecting and raising her children.

By earthly standards, her life was a tragic waste. She had daringly married an adventurer who died in Africa, thousands of miles away. She had chosen religion over relationship and financial support. Nothing about Mabel Suffield’s existence registers on the scale of worldly success. What success she did enjoy, however, was in shaping the imagination and talents of her youngest son. She gave him

…more than a lovely world in which to grow up; she gave him an array of fascinating tools to explore and interpret it. We know little of her own education, but she clearly valued learning and vigorously set about transmitting what she knew…She taught him to draw and to paint, arts in which he would develop his own unmistakeable style.

Mabel was clearly talented, but her talents did not earn her the rewards of ambition or the approbation of her peers. They went, instead, to her son. That was to be, in divine Providence, the outermost border of her life, her “short and simple annal.”

Flora Hamilton likewise died young, at 46, of cancer. In many ways her life is more obscure than that of Mabel Suffield. You won’t find anything named after Flora in her native Northern Ireland. Even her love life was cool and temperate; she responded to passionate letters from her husband with, “I wonder do I love you? I am not quite sure. I know that at least I am very fond of you, and that I should never think of loving anyone else.” Imagine if those kinds of words appeared today, anonymously in an advice column. They would be met with pity and calls to radical action.

But nothing about Flora was radical. Her life was small and given to her children. She loved books and taught her boys to love them too. She was imaginative and rational, and educated her boys to think with both logic and fervor. When she passed away, few took note, except for her family. Her youngest son would write years later that Flora’s death had signaled that “all settled happiness, all that was tranquil and reliable, disappeared from my life.”

Flora and Mabel lived brief, small lives. They invented no great thing and built nothing amazing. The only structures that bear their names are likely gravestones. What they did do was love, nurture, and teach their children. Their legacies were made in young hearts, not the hearts of adoring fans or thankful shareholders but the hearts of their sons.

What appeared wasted at the time was anything but. Mabel’s youngest boy would pour her sacrificial spirit into the characters of his fiction—characters like Gandalf, and Aragorn, and Frodo and Sam. J.R.R. Tolkien’s mother may have been mere biographical trivia to the millions who were moved by The Lord of the Rings, but for Middle-earth itself, she was a specter whose love and faithfulness and resolve are dazzlingly bright in the pages of her son’s masterpiece.

And Flora? I think we see her too. I think we see the mother of C.S. Lewis in The Magician’s Nephew. She is, I believe, Digory’s deathly ill mother. It’s not outrageous to think that Aslan’s gift of Narnia’s healing fruit is the moment of joy and life that Lewis always wished had come to Flora. She was a beam of happiness in his young life, and it’s not hard to hear lingering sadness in the description of the healing of Digory’s mother:

About a week after this it was quite certain that Digory’s mother was getting better…And a month later that whole house had become a different place. Aunt Letty did everything that Mother liked; windows were opened, frowsy curtains were drawn back to brighten up the rooms, there were new flowers everywhere, and nicer things to eat, and the old piano was tuned and Mother took up her singing again, and had such games with Digory and Polly that Aunt Letty would say “I declare, Mabel, you’re the biggest baby of the three.”

Without the brief, small, hard lives of Mabel and Flora, we may never have known the lives of Frodo and Sam, or Digory and Polly. Without the quiet, unremarkable love of two mothers, how much more impoverished would countless imaginations and faiths be?

How thankful we ought to be for homely joys, and destinies obscure!

__________

Biographical information is taken from The Fellowship: The Literary Lives of the Inklings: J.R.R. Tolkien, C. S. Lewis, Owen Barfield, Charles Williams, by Philip Zaleski and Carol Zaleski.

Sitting Athwart History

Timothy George’s profile of Capitol Hill Baptist Church and its senior pastor, Mark Dever, is a joy to read. It was a joy for me personally because my wife and I are members of a church in Louisville that owes much to Dever and Capitol Hill. My pastor, Greg Gilbert, studied under Dever, and Third Avenue Baptist bears much resemblance to the vision that Dever has cast in his 9Marks ministry.

I was raised in very traditional Southern Baptist churches. These churches, I am told, thrived during the middle of the last century. I have to rely on the testimony of others for that information, because by the time I was old enough to notice, many of the churches I saw—including the ones I attended—were losing members yearly, becoming more insular and less evangelistic, and were often more enthralled by their internal politics than by the doctrines of Christianity. I spent my teenage years in an evangelical culture that desperately wanted to regain relevance. Thus, much of the preaching, teaching, singing, and “discipleship” that I heard was crafted carefully in the image of the “seeker-friendly” movement, which sought to make the experience of church palatable to Gen Xers and millennials who demanded entertainment and variety.

I didn’t fully realize what was going on until I arrived at Third Avenue. Then it became ridiculously obvious. For the sake of those accustomed to the secular liturgies of American culture, evangelicalism had tried to make the local church recognizable; instead, it had made it invisible. The intellectual and spiritual formation of members was being neutered by the effort to make church fun.

George describes how Dever pulled Capitol Hill away from this trend:

…[Dever] began to preach sermons that lasted upwards of one hour. Next, the church excised from its rolls hundreds of inactive members—some so inactive that they had long been dead! The practice of church discipline was begun. Members were also required to subscribe to a confession of faith and to say “an oath”—this is how a secular journalist described the church covenant—at the monthly communion. Entertainment-based worship was replaced by congregational singing, including many long-forgotten classic hymns from the past.

This describes perfectly my experience at Third Avenue. These churches are counter-cultural, not only in the content of their gospel but in the character of their pedagogy. And yes, pedagogy is the right word, because for churches like Third Avenue and Capitol Hill, the worship culture of the church is designed not merely to amuse or entertain, but to teach. The teaching doesn’t just begin and end with the sermon. The whole mode of worship is one that demands—and trains—intellectual and emotional maturity. Times of silence invoke the kinds of reflection and meditation that a smartphone culture often finds impossible. Old hymns with archaic but theologically rich vocabulary remind singers of big truths that require old words, not just mantras that could be found in any young adult novel. At any given point in the service there is a sense that members aren’t just spectating or even just participating in an event, but that they are learning in both word and desire.

This is the personal formation that has been lost in the noise of much evangelical church culture. It’s a loss that may carry a higher price tag than we ever thought. Could it even be that our current political crisis—and a crisis it is—is due at least in part to the fact that millions of self-identified “evangelicals” are in churches that keep their attention but don’t teach them much? I’m not even talking mainly about the failure of churches to explicate a Christian view of political engagement, though that is certainly part of the problem. I’m talking mainly about the millions of people who name themselves members of evangelical churches, and yet find that reality-TV lewdness and Twitter demagoguery are “speaking their language.” Instead of quibbling over whether such people are actually “evangelical,” it might be better to acknowledge the possibility that many churches have failed to teach their members a better language.

Imagine a member of an entertainment-oriented church. He attends once per week, faithfully but passively. He absorbs many contemporary worship songs, some of which seem inspired by the Psalms but many of which seem inspired by Hallmark. Though he doesn’t consciously register it, the language and ritual he hears in church overlap with those of commercialism. Everything about the church service is “accessible” to him as an average, working-class American Christian. Everything feels new, and interesting, and immediately useful (or would, if he could remember it after lunch). The hour he spends on Sunday morning feels like time well spent, mainly because it wasn’t much time and because there’s little cognitive dissonance between life in the church and life in the world.

Can this kind of spiritual formation provide any ballast against economic hardship, cultural alienation, or political anger? Not at all. For those who aren’t being actively formed to think deeper thoughts, the rhetorical power of talk radio and social media demagogues is too invigorating and too empowering. Much of our American political rhetoric is pure showmanship, training the audience to respond as quickly as possible, as emotively as possible, to the world around them. Outrage, mockery, and hysteria feel so real, and when a moral imagination has not been trained to want something more, there’s no defense against them. If the moral imaginations of evangelicals aren’t being formed in church, where will they be formed?

The local church’s mandate of discipleship is a mandate for maturity. If evangelicalism has failed in the voting booth, perhaps that is because it is failing in the pews. Perhaps evangelical church culture cannot be satisfied with “relevance.” Perhaps what it really needs is transcendence, to risk sounding out of date and out of place if it means thinking big thoughts about big questions. This isn’t a call for civics lessons from the pulpit. It’s a call for the recovery of the Christian tradition that stood up to Roman emperors for the cause of religious freedom, and stood up to kings and presidents for the end of vicious slave trades. It’s a call for the church to be more than accessible—to be formative, to meet people where they are in order to raise them up.

There is a God-appointed time for Christians to come together, with unity in diversity, and learn to look at the world the way God sees it. That time is the gathering of the local church. Before evangelicals can stand athwart history, we need to sit athwart it first.

What I’m Reading

I thought I’d share some of my current reads. Right now I have a list about 6-7 books deep that I hope to get through before my wife’s due date of August 1. We’ll see about that, I suppose.

Keep in mind that none of these blurbs are necessarily recommendations, for the simple reason that I’m currently reading them and haven’t finished yet.

One note: Please consider buying these books and using the links in this blog post to do so. I’ve recently become a partner in the Amazon Associates program for bloggers. If any of these titles appeal to you, I’d be grateful if you’d use these links to make your purchase.

The Fellowship: The Literary Lives of the Inklings: J.R.R. Tolkien, C. S. Lewis, Owen Barfield, Charles Williams, Philip Zaleski and Carol Zaleski

This one is #1 on my summer priority list. The Zaleskis set out to write something of a four-way intellectual biography of the most important members of the Inklings: J.R.R. Tolkien, C.S. Lewis, Charles Williams, and Owen Barfield. I’ve loved the legacy of the Inklings for years and have always read about them in individual biographies of the writers. This work looks like a treasure trove for anyone interested in these incredible minds and how they intersected with one another.

Bowling Alone: The Collapse and Revival of American Community, Robert Putnam

Robert Putnam is one of the most well-known and influential social scientists of our time. Bowling Alone is an older work, published in 2000, but it is frequently referenced as a seminal study on the disintegration of close social bonds in America.

A Peculiar Glory: How the Christian Scriptures Reveal Their Complete Truthfulness, John Piper

This is John Piper’s new book, which looks like a brief theology of Scripture and the Christian use of it. Piper is one of the few authors on my “Read no matter what” list (a list everyone should have, by the way).

The Moviegoer, Walker Percy

I haven’t read anything by Walker Percy, and from what I’ve heard this is the best place to start. I decided to jump into this one after constantly seeing references to Percy in some of my favorite non-fiction writing.

The Devil in the White City: Murder, Magic, and Madness at the Fair That Changed America, Erik Larson

Erik Larson is quickly becoming one of the most celebrated history writers in the country. This book tells the true story of the Chicago World’s Fair and the serial killer who stalked it. I’m very early into this one, and already Larson’s rich prose and keen historical eye have me hooked.

Modern Philosophy: An Introduction and Survey, Roger Scruton

Roger Scruton is one of my favorite living philosophers. I’ve read his books How to Be a Conservative and The Soul of the World. Modern Philosophy looks to be an introduction to contemporary issues in philosophical writing. Scruton has a gift for incisive scholarship, a crucial talent for any philosopher.