The Origin of Species

Lessons learned or relearned in the six months since I last wrote an entry for this blog:

1.) Some of us work on who we are, while neglecting what we are. Others work on what they are, while neglecting who they are. As a “Who” person, I can say that we are always evolving, without ever quite becoming; it is important that we always have people around to remind us that life is not preparation for anything. I cannot speak for “What” people. I think I admire them, because to have a trade is to have an estate; to a prospective employer, they have all the syllogistic clarity of an algebraic function (and should be pretty good with those as well). Ergo, the ideal LinkedIn profile should look like a Who while the sum of its parts adds up to a What.

2.) I enjoy teaching, and only wish it paid more.

3.) The First Amendment, as popularly conceived, is a brass ring. It always was. The right to speak is not the same thing as the right to be heard, which is all that most public grievances about free speech violations today seem to be about. Why choose a college campus to broadcast, say, white supremacist ideas, when permits are available for thousands of other venues? Why haven’t any higher education institutions exercised their own First Amendment rights, which Citizens United restored to corporations, to pick and choose their guests without reprisal?

4.) After numerous interviews, and still more numerous unanswered applications, I realize that my interest in a position as an international student advisor has always been rooted in a misunderstanding. The job title itself was always an atavism, a throwback to a simpler time when F-1 caseloads were more manageable, i.e., before university administrators discovered that this demographic could hold the answer to their growing budgetary woes. I found a brief, antediluvian New Directions paper (Wood & Kia, 2000) that described the role as encompassing orientation, counseling on personal issues (insurance, dependents, loneliness, etc.) and post-graduation opportunities, and building cross-cultural bridges with faculty, students, and staff, in addition to visa status management. The authors cite the “marginalization” of international student affairs at some institutions and the need for on-campus advocacy to raise the visibility of the office. That’s it (and it’s a lot). Aggressive recruitment strategies, expanding the school’s social media presence, ensuring that staff be well versed in Salesforce—all of this was in the not-too-distant future, though I’m sure few anticipated it at the time.

5.) Everyone has the right to declare the end of history for at least six months. By the end of history I mean permission to stop following the news via print, radio, or Internet. I tried this after Trump was elected, and managed to stretch it beyond his first month in office, but had no long-range plan for what to do while driving in Atlanta at rush hour, and soon relapsed. I estimate that I own over 7,000 hours of great music on CD, at least half of which I have not listened to in many years. Since this country appears to be nowhere near the end of its dark night of the soul—much less realizing it is in the midst of one—I have an excuse to rediscover Albéniz’ Iberia, Nusrat Fateh Ali Khan, the Renaissance polyphonists.

I am old enough to remember entering the digital era reluctantly, resentfully, but by choice. Even before I opted out of a stable but dreary future in retail banking to pursue the life of an itinerant man of letters, I was saving book reviews on a floppy disk and hand-delivering them to Creative Loafing, or literally mailing them to what few out-of-state gigs I could find. This was the late 1990s, and for a long time I’m sure my editors thought me eccentric at best. It’s not that I was oblivious to the paradigmatic unrest around me. On the mornings after working the night shift at the bookstore, I would go to the YMCA for a lap swim, and for some reason at that time of day in the locker room the TV would be on, and inevitably the same Cisco commercial would be playing, with its spooky parade of kids only a little younger than I was asking me if I was ready. I was not. I still maintained rich and lively par avion correspondence with friends and relatives on three continents, but on a more basic level, I didn’t identify with the otherworldly messengers in the ads. When I finally opened an email account, with MindSpring, it must have been around 1999, shortly after the publication of Francine Du Plessix Gray’s At Home With the Marquis De Sade; my review was a rush job for the Chicago Tribune and could not wait for the postal service.

By accident or design, of course, we’ve evolved from being patrons of the Internet into its product. Now, it really isn’t possible to opt out of a life under some kind of surveillance, benign or otherwise, and what surprises me is that I don’t think about it very much even though I know I should. Searches, purchases, and even thoughts I didn’t know I had leach into the algorithm-rich groundwater. I can’t bring myself to resent Mozilla Pocket’s recommendations any more than I can feel stalked when the book cover of, say, Azar’s English Grammar suddenly flashes in the margins of websites I’m visiting for the first time, like the red dwarf in Don’t Look Now. In fact, one of Pocket’s most provocative reads described a study that would not have been possible without the cooperation of a surveillance society: after studying tens of thousands of mobile traces, the researchers found that the average person revisits no more than 25 places at any moment in time, providing further evidence that humans are distinguished by their restless curiosity for new experiences. (Not sure how visits to virtual spaces would confound these findings.) If a similarly low number could be found for self-serving political convictions (and conventions), I would be even more hopeful.

I haven’t seen evidence that the Internet’s dataphages have sniffed out my rediscovery of Charles Sanders Peirce, but I suppose now they will. “A young man would hardly be persuaded to sacrifice the greater part of his thoughts to save the rest; and the muddled head is the least apt to see the necessity of such a sacrifice,” wrote the great logician in 1878. “Time will help him, but intellectual clearness comes rather late, an unfortunate arrangement of nature, inasmuch as clearness is of less use to a man settled in life, whose errors have in great measure had their effect, than it would be to one whose path lies before him.” Hardly the most incisive insight in the misleadingly titled “How to Make Our Ideas Clear,” but one that I can identify with, as I fool around with statistics and rudimentary calculus in my spare time with no obvious path to using them, at least not to the extent that my LinkedIn page will practically sing “What Person: Quant” any time soon. But Peirce’s famous if bizarre essay—a key text of pragmatism which basically ends with a prediction of a post-human universe where questions will nonetheless still matter—seems forgiving of late bloomers like me; indeed, given his liberal attitude toward time, it is not surprising that he says of paths that they are equivalent so long as they begin and end at the same point. No doubt William James had his Johns Hopkins colleague in mind when he contrasted Romeo with a bunch of iron filings, since the lover’s path “may be modified indefinitely.” More on that in a moment.

I seem to gravitate towards ill-fated university types, of which Peirce is another classic example. Even if a 19-year-old Peirce had not recorded in the 1859 Harvard Class Book that he entered college as a fast man turned bad school boy, embraced the pursuit of pleasure as a sophomore, and in his senior year “gave up enjoying life and exclaimed ‘Vanity of vanities,’” his personal idiosyncrasies and ideas—the latter sometimes too clear, especially on religion—were such that he could count on few allies but James at Hopkins and, later, at Harvard, where he had already gone out of his way to make enemies of the Board of Trustees. Charles Eliot would have nothing to do with him. I had been unfamiliar until recently with his writings on higher education; they are few and not particularly interesting, except that he wrote them. As a Hopkins man, Peirce was very much of his time and place (and to some extent, ours) in his insistence that teaching be subordinated to the production of knowledge, and even his argument that the true object of a liberal education is to train students to think logically sounds a lot like today’s truisms about teaching critical thinking. But even in a piece of hack work like “Definition and Function of a University” there is this gem: “[I]n my youth I wrote some articles to uphold a doctrine I called Pragmatism, namely, that the meaning and essence of every conception lies in the application that is to be made of it…Subsequent experience of life has taught me that the only thing that is really desirable without a reason for being so, is to render ideas and things reasonable.” Whether the ailing philosopher felt this intellectual clarity came too late to be useful, he does not say, but even that does not seem to matter to him either. The whole point of reasonableness is to reach for a more integrated, more comprehensive sense of wholeness, he says. “In the emotional sphere this tendency towards union appears as Love; so that the Law of Love and the Law of Reason are quite at one.”

It is no exaggeration to say that Eliot’s distaste for Peirce may be one of the few notable anti-mentorships in the history of higher education. Eliot had known Peirce as an undergraduate and lectured him in chemistry and math, just as Franz Boas introduced Edward Sapir to Native American linguistics at Columbia and Benjamin Mays taught Martin Luther King about social justice at Morehouse, and if young Charlie had been a model Victorian like his father Benjamin, founder of the Harvard Observatory, he might have had a very different path before him. But Eliot’s track record as a judge of men is hard to beat, and this spring, as my wife and I hiked the circumference of Jordan Pond on Mount Desert Island, threading between boulders by the emerald light of the pines, I confirmed for myself that the man also had a curator’s eye for untamed landscapes—and their potential. I saw no stone marking the precise spot where he and George B. Dorr were photographed over one hundred years ago, midwifing the birth of Acadia National Park, but I’m sure we passed it. We had pushed back lunch in order to get in as much exploring as we could, so no doubt all I had on my mind was lobster rolls and beer.

Declaring the end of history does not mean giving up the study of it. My inner Victorian will always be in search of origins, even if the search brings me face to face with my future, as Wallace Stevens—no less a deconstructed Victorian than Peirce—suggested:

If ever the search for a tranquil belief should end,
The future might stop emerging out of the past,
Out of what is full of us; yet the search
And the future emerging out of us seem to be one.

Stevens almost hits a Peircian note here, since the godfather of Pragmatism did insist that the establishment of belief is the only reason for thinking. But can too much reading of history devolve into a substitute for self-study? I have been excited to discover the work of the late Constitutional historian Leonard Levy (1923-2006), who has been keeping me well supplied with origin scholarship—and penmanship—of a very high order, whether he is writing about the origins of the First or Fifth Amendments, trial by jury, the Establishment clause, or the Bill of Rights itself. (Though subject to criticism since it appeared in 1960, his Legacy of Suppression: Freedom of Speech and Press in Early American History remains a tour de force of revisionism, amassing considerable evidence that the question of what speech should be free was not settled until after the Alien and Sedition Acts, and perhaps not even then.) And aren’t I supposed to be trying to become more of a “What” person anyway, working on old stats problems from grad school and learning how to use my Stata package to conduct meaningful research on the free speech practices of today’s undergraduates while my brilliant and overworked friend and I have access to certain datasets (a matter we discussed on my last day in Boston, only hours after S and I had walked the surface of Cadillac Mountain)?

Of course, as long as I am running classrooms for non-native speakers back in Atlanta, I am a “What” person. Just the other day, another of my students passed her citizenship test, and despite everything I say about “wanting to make more of a difference” by being an international student advisor at a large research university, I feel that I am making a difference for a great many of the determined, decent people—Venezuelan CPAs, Bangladeshi grandmothers, South Korean housewives, an Ethiopian math whiz—who arrive every week in search of a better life through learning. I remain amazed at the number whose sending countries are not priorities for the current administration but who think Trump is great. It could be a Strong Man thing, but I’m reluctant to generalize. All I can say is that the immigrants I know are supremely practical people, “What” people in their nations of origin yet still too burdened by language to be the same thing here (or for now at least, they are Bangladeshi grandmothers and South Korean housewives). Though freedom of speech becomes infinitely more complicated when you don’t command the language, I take some comfort in the fact that, as immigrants, most of these students only want to be free enough in English that they no longer feel shut out of American society. This is an achievable goal, although it demands a good deal of self-work, since language is integral to who we are or believe ourselves to be. It is even true of American undergraduates who believe their native language to be English, only to arrive on a campus where the lingua franca is Discursive Normative English—indeed, I suspect that protesting against Political Correctness is, for some students, a politically correct way of fighting back the persistent demands that they work harder at making their ideas clear.

I can sympathize. It’s no fun to go for so long believing yourself entitled to, if not a scenic path through life, then at least a visible one, that the belief is part of who you are. On the other hand, I don’t see how anyone can avoid running the risk at one time or another. Everything I’ve read about Peirce indicates that he started out this way, able to tinker brilliantly if idly with pendulums simply because of his good fortune to be the son of the man running the U.S. Coast Survey. I myself did it, doing the man of letters thing in the tranquil belief that it would always be a viable profession, instead of getting “ready.” But I’m not sure I want to sermonize on this point too strongly, because the line between entitlement and hope can be hard to discern. No one, for example, would call a parent entitled who expects his son to outlive him, as Eliot likely did Charles, Jr., whose meteoric trajectory as a landscape architect (i.e., his What) was surpassed only by his legacy as a land conservationist and the inspiration—after his sudden death at 37 from spinal meningitis—for Acadia National Park.
