The other day I wrote a bit about remembering, and things otherwise unremarked tied together to make the cloth of history. This was framed a bit as a reaction to purgers, cleansers, simplifiers and the norm, but really? Really it was a circle’s line crossing several times (they can do that, you know).
I use that phrase “drawing a circle” often. It’s Charles Fort’s, originally. Go look it up if you need reminding. He was a crazy old crackpot of a fellow, Charlie was. Amusing, though: he could after all tell an engaging story.
Anyway, a decade is a foolishly long time to draw a circle; fifteen years, twenty years, a lifetime even crazier. You get wild sometimes along the path, or jittery, or buffeted off course, or just plain bored. Eventually you don’t end up with anything quite so precise or Zen as a “circle”. But you draw, and what you draw you call a circle because that’s what the metaphor demands: it’s always a circle.
I think 1997 it was I had a chance to visit Trinity. The place with the bomb, and the place about which I wrote my first blog-ramble thing about my father’s stories of working on high-speed cameras capable of taking photographs of nuclear explosions, and how (because surely the records are gone, or at least unremarked) those photos of the growing blossom of Trinity’s explosion must therefore be his, mine, ours, that camera’s. I called it “nanohistory” back then; it was a bit of a gag on nanotechnology I thought, and the future, and also multiscaled phenomena which we were all doing back in the day, but that was the thing I saw and wrote.
By now some of the folks who ate Green Chili Cheeseburgers with me at the Owl (best in the world) and rode the Trinity bus that day have had strokes. Some Ph.D.s have been granted; some wives have died. They’ve moved out of their well-preserved Moderne homes along the Turquoise Trail, they’re still living in Santa Fe, they’re where they went. And so on. Years of little accumulated drifts have piled up around the line they drew collectively. Unremarked (at least by me) since the day I filled a roll of 35mm film with their portraits lit by White Sands sun.
And in that time I’ve emptied out the house where my father’s and mother’s material stories, their “memorabilia”, were stockpiled, and picked up new habits and careers, had deaths and all kinds of wiggles of my own. Even the rambling essay I wrote in Santa Fe is nearly gone from the world. Its earlier versions are surely gone, as I have never bothered to keep older versions, early edits, that sort of thing, and I just threw away the diskettes a week or two back. In the trash. There is no machine in my house that could read them, after all, so why bother with drafts of unread rambles?
This is normal stuff. Mundane; entirely of the world. But it’s about remembering. Being reminded.
We were inventing Big Data back there in the late 1990s. Have I told you that? I think I’ve apologized for it already. But some of the very people at Trinity with me that first Saturday in October fifteen years back were the founders of bioinformatics. Some of us are the data miners who wrestle piles and reams of ASCII and pixels into cobbled-together contraptions we built from folk wisdom and jury-rigged repurposed components we dragged out from the garage. We were discovering how to render data down into clarified, burning utility: models, predictions, and above all controls.
Control was a big one, and I think the most ironic. After all, we were complexologists: for fuck’s sake we were the End of Science, with our hand-waving anecdotal subjective contingent agent-based models. We were about emergence, the not just uncontrolled but inexplicable.
Though it didn’t really work out that way somehow. Nowadays not many of us are left here in the Proverbial Woods. There’s a fad or a revolution or a war or something, or so I hear, and the vast majority have put on ill-fitting suits and gone down to the City to be hired up by Big Data Distilleries—Big Brother, Data Science, and even a few at the Tower of Words. Those folks stroll the aisles now under suspended ceilings of fluorescent-lit data centers, patting earnest workers in (proverbial) white lab coats on their shoulders. Either that, or they fell to the service of corporations, and their work became the jargon of the Street, which dearly loved our Edges of Chaos and Emergences and Nonlinearities as handy excuses for doing what had already been decided: making this world we live in.
Just a few of us left here Ending Science these days. A little bit at a time, the work goes on until we’re all bought off or dead. We’re not a colony in any sense now of course; more in the role of folksy fogeys in the shadows of the diner downtown, talking up contingency and narrative, while clinging to an obsolete humanistic definition of “emergence” and “path-dependence” nearly all worn to thread like a quilt in a barn.
Yeah, well. At least there’s coffee.
Hey, here’s a funny thing: Did you know it’s no longer obsessive-compulsive disorder when you collect a petabyte of data from a particular rat neuron and spend months of your attention focused on just the lovely patterns in the spike trains? Or that it’s no longer hoarding when you’re driven to stockpile every digitized book in the entire world? Or that even the old saw about trying the “same thing over and over and expecting different outcomes” doesn’t really come into play nowadays when the things you keep trying are the functional capacities of combinatorial variants of protein sequences? And! And! It isn’t eavesdropping—you are not a scary neighbor lady—when all the phone calls of a city are pressed into your service of knowing what those damned kids are doing over there, with their parents away (it shouldn’t be allowed)!
That’s inference now, not madness.
It’s the frontier we (and others not far out along our social networks) opened up for you all, about the time we rode the dusty road into Trinity. All those things are now new kinds of service. Not a sad lone madness left among them.
[“Isn’t that interesting, isn’t that interesting.” That’s what my sharp old friend Lew Tilney would have said, without a single question mark at all, when I was dazedly walking the halls of Leidy Labs trying desperately to discover what was wanted of me by my superiors. He’d walk up and slap his hand down on your shoulder and say, “Tozier! You know about trees! I was just reading about trees! Did you know there’s absolutely no damned way water can get to the top of trees? Physics won’t handle it! Now isn’t that interesting.” And he’d stride down the hall in his red socks and I’d wish I could see what came of that thread, instead of having to justify the counting of combinatorial proteins’ functions to people who found it mad. I learned years too late that Lew was always right every time he told you, “Now isn’t that interesting.” It was and is always interesting, salient, connected. There is never any question to mark.]
So a point is that I wouldn’t be surprised if there was a time around 1900 when talking into boxes and expecting an answer stopped being considered madness. Or a time when acting as though you knew what a person far away was doing that very day didn’t make folks laugh. And so on. You get that picture? Now isn’t that interesting.
At any rate, some of the people on that bus to Trinity, and plenty more who didn’t make the trip that day, or who I met later or earlier in my life by a few years one way or the other—they made all these madnesses into stuff you see on magazine covers and RSS feeds.
And I love that. I can’t tell you how lucky I’ve been to fall into this hobby of watching smart people noticing things.
It feels like “madness” periodically becomes the fabric of societies, in turns, as new transformative technologies come online and escape and spread and do their stuff. I could be more focused I’m sure, more journalistic. I could refer to one of those Philosophers of Science you only really see in epigrams these days, Kuhn or Lakatos or somebody. But not this time; this is mere folksy rambling, not observation of a sort that’s useful.
I just noticed, is all. Way I see it, this is me just having fun watching smart people starting to try to realize they ought maybe to notice something again. And undoubtedly I’ll just sit here and watch for a while more, and when nothing’s forthcoming, maybe I’ll just change the subject.
Not worked it out? Well, that’s fine, that’s fine. No reason to stop chatting, is it?
Maybe we ought to shift gears, talk about the humanities for a while. Wikipedia (I smile for some reason whenever I link there these days) says the humanities are disciplines that study the human condition. “Disciplines” is another word that makes me smile nowadays, too, thank Abbott.
You know, I have a fond respect for those poor folks in the humanities. Personal fondness even. When I was a kid, it was decided I was either going to go to Case and be a biologist, or go to Oberlin or what’s that other place’s name that begins with a D—I can’t recall—and be an English major. A writer sort. Senior year it was old Bill Cawley, my high school English teacher (so hard not to say “professor”, isn’t it?) who slapped a hand down on my shoulder and told me people actually still could make a living, if a hard one, writing. But I picked the other, and luckily too because I met my beloved wife of twenty-five years (amusingly enough in a History of Science course, about stories, words, though we barely paid attention at the time for love), and as a pretty good science sort I got eventually to that bus to Trinity, and learned or to some extent made up the skills of Big Data. And here I am. A folkways practitioner of complexology.
Along the way I spent time in various academies and such. Over there sat the archaeologists, writers, the historians and all those other humanities folks (who I swear actually wear tweed sometimes), clinging to shrinking islands of departments in the context-focused Transformed Universities of the Austere Era. Trying diligently to instill a love of letters, or story, or memory or something in the thousands of kids who trooped through the lecture halls.
Kids are still, at least for the moment, expected to get an embedding cultural framework slapped around them, if only to keep them good citizens and informed voters and able to see perspective on the human condition. Though not too much.
Whatever is the “human condition” these days? Surely it’s 2.0 by now. It’s a kind of madness to think it hasn’t changed, that people haven’t been transformed utterly by all this networking and having machine intelligences at hand with which they can sift the raw data of the revolution to produce information, utility, weal and woe of various sorts. I mean: we have a new ubiquitous sensorium! A different world, in which Science didn’t End at all.
And see, all of public policy seems now to want to do away with the waste Great Works entail, the distraction from what kids want and what’s best for them. Ideally they should be getting jobs, and learning skills, and preparing for whatever it is Big Data uncovers “automatically”. That’s what I seem to hear. Politically conservative folks want to do away with the thoughts that the humanities provoke; politically liberal folks want to do away with the ties to benighted and inhuman Bad Old Hegemonic Times the humanities rehumanize. In both cases I think it’s maybe the sense of inconsistency you get from reading literature and discussing history that’s the biggest threat. We talk of the humanities in terms of waste and inutility, but really they’re seen as a threat.
They’re confusing. They dilute the story of the present and the future.
Letters, you see, are complex. History isn’t glib, it’s really never glib: it’s got folds tucked into its folds, and everything seems to mean something else to somebody else. The humanities are onerous because they’re all so tied together by these confusing personal subjective accidental ramified networks that reach back down into the stacks of libraries we’re emptying, and meanings and usage we’re glossing over these days.
And so they’re dangerous. Lean times call for leanness; what’s needed now is an efficient ability to frame every actionable item and sort it on the basis of delivered value. History doesn’t have a lane on the kanban.
It’s a wasteful kind of madness to dive down too far into old books. And a dangerous kind of madness to force kids who might better be working in the present and building our future to sit quiet and look instead into the past. What could they possibly gather up from that well-trod cemetery soil? Things are different now.
By now you’re thinking I’m bemoaning the end of the humanities departments and the closure of libraries and the loss of all that tweed. Really? You know, that would be a nice simple story you could distill out of this path if you like: “Dagnabbit, wouldn’t it be better if we taught kids Greek again? Why not add Letters and History to STEM, and make it… STEMLH. Crap. We’re going to need more vowels. Get Art on the phone, stat.”
But no, that’s not what I’m encircling. That’s been done, and besides I’m supposed to be Ending Science.
The trick is, Science is all tied and twisted up in the Humanities, Snow notwithstanding. They’re jealous siblings, copying one another in turn. Now isn’t that interesting.
Here’s what I love about the humanities, at this juncture: Just as every family gathering has that memorable crazy Aunt or Uncle, the humanities still insist on coming to our metaphoric Thanksgivings and rambling on about their personal hobby horses.
Brutal frankness: I like them humanities folks much better these days than I like most of my Sciencey Engineery cohort, or most any of the folks who sit with me at conferences of learned societies nowadays when I deign to drag myself down to the City and attend. They’re all good people who compute and sift and train up the Future, but they are nonetheless a boring old bunch. That stereotype is still just as true as the tweed humanists’ trope.
Ah but see, those humanities folks, they can tell a story. And they remember stuff. Crazy stuff, like how to read the shipping manifests of third century Asia Minor, or how some elliptic references to “death” are really horny poet-talk while others are about tuberculosis. And this one is best, as I see it: They’re willing to use the word “remember” to refer to acts of construction.
They apologize a little bit to the rest of us when they “remember”, just to explain the weird affectation they have that telling a story is building a thing. The mode in science these days, and also engineering, is that remembering is paring away mistakes, and disclosing the real truth of the world so it can be shared and consistency may reign on Earth as it does… (well you know the rest of that one). Among themselves the humanities folks all know remembering is a special kind of making, that recalling and recording is constructing novelty, that it’s not computation or reduction or scouring away matrix. And even better: they know how to make this special mad kind of making useful, or at least engaging and entertaining. Often as not they spend most of their time entertaining one another, reading their papers aloud at conferences and such, but sometimes one will be lifted up from their shrinking island preserve and be presented in the popular press, as a kind of Outsider Artist or something.
That thing they do, I like that. I like their mindfulness, that they act as if knowing were making.
Not many of us like it so much any more, though. It’s a mad notion when you look at it from a modern perspective: history and literature, poetry and classics, archaeology and dancing about architecture. “Mad” for the same reasons you’d be put away in a rest home for standing up in a busy public place where people are trying to go off and get their proper work done, yelling and ranting and invoking archaic names in ceaseless demands that they slow down and notice, see what’s there—or more likely what isn’t there.
Crazy people tell folks to slow down in lean times. They question what’s real and known and true all over again, stuff we’ve shipped, the truth we’ve accumulated. As if when you examined it again for the hundredth time, the old photograph of a bomb exploding would this time be more than an image of reality hanging on a fence in a desert. Some kind of story you made up on the spot, different next time.
But of course you and I know remembering is simply looking stuff up.
It’s not making things up; it’s data access, which is why we’re all so earnest in our recording and curation of the facts. Data access is what drives Big Science now, and marketing and all the statistical miracles that have come to pass and are nascent in the world. It’s the real world, the world of data, that’s important, not the made-up world of fiction and history. A cancer cure is not a story, nor is the money in the bank you made from high-speed trading, nor even the counts of the number of times gendered pronouns appeared in our digitized Early Modern books. Those are facts, written down right there in public.
And yet there are still a few of these other poor folks, sitting down and quietly reading old stuff and acting as if modern statistics and data-driven explanations were anything at all like story-telling. Mad folk. Fiddling in back-country hollers of the academy, little ivy-covered museums and even lone shacks off the beaten track, refusing for whatever reason to move down to the City and get themselves a proper job adjuncting or something.
No, that was it. I was just thinking out loud about the humanities, is all. Sad to see them go, you know. But it’s for the best.
Say, I bet you know about data! I’ve been thinking a little about data lately. Did you know that there’s so much data now that there’s no damned way to consider every model, prediction, or control mechanism—even for one given data stream? Let alone all of them! It makes no sense. Data’s all there, models are simple to build, and so now all the work is boiled down to arguments over technique, concocting various approaches and invoking conflicting proofs, and worrying about utility functions and constraints and contingencies. Hell, it’s like now we have the data, only the hard part is left: figuring out what questions to ask first.
Now isn’t that interesting.