An autographed game-used baseball—bearing personalized inscriptions by two players on the Minnesota Twins, Chuck Knoblauch and Hall of Famer Kirby Puckett—is the sole surviving physical evidence of a childhood consumed by sports. Although it’s not inaccurate to say I was born DeafBlind, since I have the progressive-blindness condition known as Usher syndrome, it’s often more helpful to say I was born Deaf and gradually became blind, growing into my DeafBlind identity. As a kid, I had tens of thousands of baseball cards that I would strain my eyes to read. My shrines to that time are all gone now, save for this one baseball. I never thought I would entertain the idea of giving it away, but here I am. Should I keep it? It doesn’t take up much space. But what does it mean to me now? It’s like a moon rock, a lonely object out of space and time.
Jim Fuller, a staff writer for the Minneapolis Star Tribune, was the one who, beaming, presented me with the baseball. He first came to our house to interview my father about his efforts to establish a bilingual Deaf charter school. Jim soon discovered that I was a sports fanatic. We got to scribbling notes back and forth about our beloved Twins. Jim said he would have a surprise for me the next time he stopped by.
Although he gave me the baseball in the summer of 1993, it evokes my happiest sports memory, which took place two years earlier, when the Twins played in the greatest World Series ever. When Kirby walloped it out of the park to take us into Game Seven, I could hardly breathe. Then it was John Smoltz and Jack Morris taking turns on the mound. In the bottom of the tenth, with Smoltz out of the game and Alejandro Peña in relief, a Twin stretched a bloop into a double. A few at-bats later, the bases were loaded, and when the next batter made contact, the lead runner tore off. As Dan Gladden, a.k.a. Clinton Daniel Gladden III, a.k.a. the Dazzle Man, flew down the home stretch, the universe, the whole world, my very being rushed toward him. Nothing can do justice to the moment he leaped and landed on home plate except witnessing it with your own eyes. Any attempt to describe it is futile. Description can only serve a roundabout purpose.
It took me a long time to realize this. I continued to follow sports after I could no longer witness events with my eyes. It pleased me to believe I still had access through sports news and box scores, through occasionally enlisting someone to sit next to me and relay games, and, above all, through fine sportswriting. Wasn’t baseball synonymous with literature? It therefore baffled me when I found myself keeping up with sports less and less. I skipped the Super Bowl a few years after the television screen ceased to be legible, breaking a tradition going back as far as memory. I unsubscribed from ESPN The Magazine even though the list of magazines available in hard-copy Braille is short and precious. Then it was down to one sportswriter, Bill Simmons, a most diverting raconteur. I accepted that I now required good writing to maintain my interest in sports. But even the Sports Guy’s columns began to lose their charm after a while. What was going on?
At first, what I read or listened to live through an interpreter teemed with players I had worshipped with my own eyes. I knew their faces, their tics, the way they licked their upper lips or groaned or stared or gasped in horror or with joy. As they faded into retirement, there was less and less poetry in what I gathered, replaced by new and strange and meaningless names. Direct experience goes a long way. It meant that sports did resonate with me for years after my last eloquent encounter. But without direct experience, I learned I couldn’t access the same life.
Disability rights activists have long fought for access, most often in the form of basic and unobtrusive accommodations. Today, billions of dollars are poured into projects seeking to increase inclusion. It’s not that I don’t appreciate it when a restaurant has a Braille menu. A device attached to a streetlight post that vibrates when it’s safe to cross the street is huge for me. Programs created in the name of access make it possible for me to write this essay. Ramps, elevators, wide doorways, flashing lights, railings, benches, assistants, care workers, and myriad technologies make all the difference in the world. But the way those things are lobbied for, funded, designed, implemented, and used revolves around the assumption that there’s only one world and ignores realms of possibility nestled within those same modes.
The question I am asked most frequently by hearing and sighted people is “How can I make my [website, gallery exhibit, film, performance, concert, whatever] accessible to you?” Companies, schools, nonprofits, and state and federal agencies approach me and other DeafBlind people all the time, demanding, “How do we make it more accessible?”
Such a frenzy around access is suffocating. I want to tell them, Listen, I don’t care about your whatever. But the desperation on their breath holds me dumbfounded. The arrogance is astounding. Why is it always about them? Why is it about their including or not including us? Why is it never about us and whether or not we include them?
In my community, we are in the midst of a revolution. We have our first truly tactile language, called Protactile. We insist on doing everything our way, fumbling around, groping along, touching everything and everyone. We are messing with traditional spaces, rearranging them to suit us better, rather than the other way around. The Protactile movement is obsessed with direct experience. As Robert Sirvage, a DeafBlind architect, put it in a recent conversation, the question we begin with is not “How do we make it more accessible?” Instead, we start by asking, “What feels beautiful?” When hearing and sighted people join us, they pick up Protactile and learn how to work and socialize with us in our space. They often find themselves closing their eyes, either literally or by dimming their visual processing, because sight isn’t necessary. Bodies in contact become as normal to them as they are to us.
When the word access comes up, it usually refers to tools or avenues that complement the sensory experience people already enjoy. Captions for movies, TV shows, and videos are excellent examples. They are said to provide access for Deaf people, who, I need to stress, already have a relationship with the images flitting across the screen. When blind people ask for audio descriptions, this accommodation merely supplements what they already hear. For example, when a king shouts, “Follow me, ye good knights!” the audio description might helpfully note that “the king is waving his sword, his cloak billowing in the wind.” But then there are the efforts to feed captions into Braille displays so DeafBlind people can have “access” to radio, TV, and film. This isn’t complementary access. It’s a replica, divorced entirely from the original. This is how we frequently find accessibility features—as sorry excuses for what occasioned them in the first place. Access itself is too often all we have, a dead end, leading nowhere: captions without images, lyrics without music, raised lines without color, labels without objects, descriptions without anchors.
In the United States, there are tens of thousands of American Sign Language interpreters, who are trained to facilitate communication in the most accurate and impartial manner possible. You could say they are human captions. The rigor with which they strive to translate between ASL and English, and between various cultural frames of reference, may be a wonderful way for sighted Deaf people to gain access—which is to say, complementary access—to many settings. But ASL interpreters are an atrocity for DeafBlind people, constantly inserting themselves between us and other people in order to facilitate conversations, but instead getting in the way of direct connections. This was one of the things that unwittingly helped give birth to Protactile in 2007.
Protactile took root only when a group of DeafBlind leaders in Seattle decided to conduct meetings and workshops without any interpreters. DeafBlind community members were shocked by how well those events went, with participants communicating directly and rotating from cluster to cluster. This success emboldened us to break many taboos related to touch, including touching one another’s bodies instead of just moving our hands in the air. A grammar soon developed to coordinate all that contact. A new language was born. It’s no accident that this explosion occurred when we took a break from the most prevalent manifestation of access in our midst: ASL interpreters.
My experiences on September 11, 2001, provide an illustration of why, before the Protactile era, it was so frustrating to work with interpreters. I went to my postcolonial literature class at the University of Minnesota without having read any news online earlier that day. I found my two ASL interpreters already there. They immediately asked me, “Did you hear about an airplane hitting two poles?”
I laughed. “No, but that’s funny. So today they’ll be talking about Rudyard Kipling’s Kim. How about we give Kipling this ASL name and Kim this ASL name? To distinguish between the author and the character? Good?”
A long pause.
I repeated, “Good?”
“Yes... that’s fine,” they said. They were acting strange. When the professor arrived, the energy was weird. He asked if everyone was all right. Did anyone have family in New York? Did anyone need to leave class?
It wasn’t until hours later that I read the news and understood what had happened. People must have been upset and crying. All the TV screens running the same footage over and over. And my interpreters had failed—miserably—to convey any of it to me in a meaningful way. Why? Because they were there only to “provide access,” primarily to the spoken English content of the class.
Fast-forward to one of the Protactile movement’s biggest achievements to date: creating a new kind of interpreter. In 2017, we launched the DeafBlind Interpreting National Training and Resource Center and began hosting week-long immersion trainings for interpreters, led by DeafBlind Protactile experts. When my colleagues and I started developing the program, we quickly realized that the point wasn’t just to help interpreters become fluent in Protactile. It had to facilitate a complete reinvention of their role. Instead of providing “accurate and objective information” in a way that futilely attempts to replicate their own experience of the world for us, Protactile interpreters must be our informants, our partners, our accomplices. Typically, ASL interpreters are system-centered, leashed to a platform or classroom or meeting room or video call, jerked into action every time someone speaks in English. There’s usually a power imbalance, such as between a hearing teacher and a Deaf student, a hearing doctor and a Deaf patient, or a hearing boss and a Deaf employee. With this power imbalance in mind, we can understand why ASL interpreters often “belong” to the hearing party more than to the Deaf party. This is problematic for sighted Deaf people, but it is devastating for DeafBlind people. Protactile interpreters, by contrast, are consumer-centered, firmly aligned with us, following our lead as we figure out how to hack into situations. We recognize that there’s little value for us in most distantist spaces—that is to say, spaces where people are rarely touching but remain visible to one another. For us, the question in working with an interpreter then becomes: What do we want to get out of it? What we want is never what hearing and sighted people plan or propose to do, because they never ask us, “What shall we do together? How do you want to do this?” They merely wish to include us.
A story to illustrate what has changed with this new role for interpreters: Early in the COVID-19 pandemic, a DeafBlind friend told me about her recent experience working with a Protactile interpreter I’d helped train. She had a doctor’s appointment, and the Protactile interpreter met her at the entrance to the building.
“Wow,” the interpreter said as they entered the waiting room, “everyone here is tense and talking about COVID. The TV over there: it’s on COVID. Do you want me to relay that, on the TV, or eavesdrop on what the doctor over there is saying to a cluster of people… something about masks?”
My friend dismissed it all with a sweep of her hand across the interpreter’s chest. “Not interested. So how was your trip to—”
“Yes, yes,” he interjected, “we can talk about my trip, but I just want to make sure. Do you know what COVID is?”
“I have no idea.”
“Whoa. Okay, okay, okay. Listen, COVID is earthshaking news.” He grasped her shoulders to mock-shake them for emphasis.
After he explained COVID-19, my friend was awed and now wanted to know what the TV was saying, and had many questions for her doctor.
Here, the Protactile interpreter operated as my friend’s partner, making subjective yet vital contributions. When he found her dismissal of COVID-19 odd, he pressed her on the topic. An ASL interpreter would never have done that, unless they allowed their instincts to overrule their training.
When I teach ASL interpreters that they must share their opinions and assessments, they always protest, “But I don’t want to influence the DeafBlind person!”
“If you’re worried about influencing us,” I reply, “you give yourself too much credit and us too little.”
Another thing ASL interpreters habitually do is describe the whole of things. Upon entering a room, for example, they stop and say, “This is a midsize room with a few tables, here, there, and over there. There are… let’s see, one, two, three, four, five, six, okay, six windows—”
Here I stop them. “Why are you telling me, telling me, telling me things? Your job isn’t to deliver this whole room to me on a silver platter. I don’t want the silver platter. I want to attack this room. I want to own it, just like how the sighted people here own it. Or, if the room isn’t worth owning, then I want to grab whatever I find worth stealing. C’mon, let’s start over. What we’ll do is start to touch things and people here, together, while we provide running commentaries and feedback to each other.”
Although I travel places and enter spaces alone all the time, interacting with the environment and people I encounter according to how things unfold, it’s often nice to have a sighted co-navigator, such as an interpreter. It may mean being able to approach someone who is not standing where I’d typically be exploring, along a wall or from landmark to landmark. If the person doesn’t know Protactile, the interpreter can translate my quick explanation of what I need them to do—put their hand on my hand and give me feedback with their other hand—and why I am using their upper chest or arm or leg to describe something. I can establish that contact with strangers without an interpreter, but it may take a few false starts before they “get it.” They may forget to give me adequate feedback, so the interpreter will relay to me their reactions to our exchange.
Giving quick reads without getting bogged down in details is an important skill for a spy doing live reconnaissance. But ASL interpreters are at first inhibited by notions of neutrality and objectivity. They start by offering something like “Walking by over there is a tall, thin, light-skinned person with curly dark hair down to here, wearing a white tank top, blue jeans, and brown boots…” They’re pleased that they’ve avoided race and gender.
“No, no, no.” I brush my hands back and forth across their arm in vigorous negation. “That’s not the way to do it. The very same description could be applied to a gorgeous Latina in chic, expensive boots, who oozes money, or to a pasty, rangy white man, hair a mess, boots falling apart, maybe looking angry. Like, they’re the opposite of each other? Yet they share the same sanitized description.”
Because of their fear of bias, I discuss four implicit safety nets to help them feel better about uttering an assessment:
First, we’re not so fragile that saying something wrong will topple us. We know what we are doing. We—not they—are in charge of our missions. Responsibility lies with us, not with them.
Second, I tell them a story about the best interpreter I worked with before the Protactile era. He was a volunteer rather than a professional interpreter, and because of this, his commentary was so unvarnished that I picked up a ton through him. He also happened to be a racist and misogynistic Deaf man, but I was able to separate his bias from the information he gave me. I ask my interpreting students, “Are you an unabashed bigot? No? Then you have that much less to be worried about.” This interpreter wasn’t good at his job because he was bigoted; rather, he was good because he functioned as an open channel of information, and so everything in his brain was revealed, his bias along with it. “You don’t want his bigotry,” I tell my students, “but you want his talent for not thinking twice.”
Third, if they’re so terrified of letting slip their own opinions, I tell them, then they should consider what I call “collective subjectivity.” Suppose a hundred sighted people see someone sauntering into a room. In that Gladwellian blink of an eye, they all come to a hundred slightly different conclusions based on their own life experiences. An interpreter may happen to be a fashion maven and know the person’s expensive-looking boots are knockoffs, for example. Nevertheless, there will be certain cultural signifiers that are recognizable to the majority of those hundred people, however correct they may or may not be. The question is: What is it that is being broadcast to the collective? We don’t have time to listen to a long deposition, the thousand words that a picture is rumored to be worth, before reaching a reasonable conclusion—if we can even reach such a conclusion, since ours is not a visual world. It’s so helpful to have an aide-de-camp to tell us whether someone is receptive to us or if our charm is being wasted.
Fourth, there’s the Gladwellian blink of an eye, and then there’s the Gladwellian—or Clarkian!—slide or pat or jiggle of the hand. By bumping into, sniffing, tapping, brushing past, we are gathering intelligence of our own. That’s why we shouldn’t stop while our interpreter attempts to construct a replica but should instead continue picking up important information that may confirm, contradict, or qualify what an interpreter contributes.
After nudging two hundred–plus ASL interpreters through the travails of rebirth as Protactile interpreters, I began to understand why people who work around access cling to the concept of accuracy. This commitment to accuracy, to perfect replication, is a commitment to the status quo. We are expected to leave it untouched, or, if it must be altered, then to do so as little as possible. Access, then, is akin to nonreciprocal assimilation, with its two possible outcomes: death by fitting in or death by failing to fit in. The Protactile movement is the latest pulling away from replication. In Deaf history, generations of hearing educators have tried to use sign language expressly to represent the dominant written language, first by fingerspelling it letter for letter and, after tiring of this, by signing it word for word. It was always shaped around the dominant language and never about what real sign language had to offer. One unintentionally hilarious attempt at accuracy was a system called Signing Exact English. In blind history, reading by touch started with raised lines that followed, exactly, the lines of printed letters. When the lines proved painfully slow to trace with one’s fingers, sighted educators grudgingly allowed them to be more blocky and a bit easier to feel. Braille—as a different world, a world of dots as opposed to lines—was still a long way from being developed at that point, and hasn’t, in fact, been fully embraced as a medium in its own right even to this day, with so many people still concerned about representing print accurately. Of course, the problem isn’t accuracy, per se, but whose accuracy.
In recent years, there has been a rush on the internet to supply image descriptions and to call out those who don’t. This may be an example of community accountability at work, but it’s striking to observe that those doing the most fierce calling out or correcting are sighted people. Such efforts are largely self-defeating. I cannot count the times I’ve stopped reading a video transcript because it started with a dense word picture. Even if a description is short and well done, I often wish there were no description at all. Get to the point, already! How ironic that striving after access can actually create a barrier. When I pointed this out during one of my seminars, a participant made us all laugh by doing a parody: “Mary is wearing a green, blue, and red striped shirt; every fourth stripe also has a purple dot the size of a pea in it, and there are forty-seven stripes—”
“You’re killing me,” I said. “I can’t take any more of that!”
Now serious, she said it was clear to her that none of that stuff about Mary’s clothes mattered, at least if her clothes weren’t the point. What mattered most about the image was that Mary was holding her diploma and smiling. “But,” she wondered, “do I say, Mary has a huge smile on her face as she shows her diploma or Mary has an exuberant smile or showing her teeth in a smile and her eyes are crinkled at the edges?”
It’s simple. Mary has a huge smile on her face is the best one. It’s the don’t-second-guess-yourself option. My thinking around this issue is enriched by the philosopher Brian Massumi’s concept of “esqueness.” He exemplifies it by discussing a kid who plays a tiger:
One look at a tiger, however fleeting and incomplete, whether it be in the zoo or in a book or in a film or video, and presto! the child is tigerized… The perception itself is a vital gesture. The child immediately sets about, not imitating the tiger’s substantial form as he saw it, but rather giving it life—giving it more life. The child plays the tiger in situations in which the child has never seen a tiger. More than that, it plays the tiger in situations no tiger has ever seen, in which no earthly tiger has ever set paw.
Just as the child and an actual tiger are not one bit alike, the words Mary has a huge smile on her face have nothing in common with the picture of Mary holding her diploma. Yet the tiger announces something to the world, its essence, and a kid can become tiger-ized and be tiger-esque, their every act shouting, I am a tiger. The picture of Mary at her graduation is shouting something, and the words Mary has a huge smile on her face are also shouting something. It is at the level beyond each actuality, in the swirl that each stirs up, that the two meet.
We would do well to abandon the pretense that it’s possible to reproduce base things in realms other than those that gave birth to them. Instead, we can leave those things well alone where they belong, or, moved by possibilities, we can transgress, translate, and transform them. We can give foreign things new purposes, which may be slightly or extremely different from their original intent. Take the card game UNO, one of the games widely available in a Braille version. The standard cards have dots at the corner that say things like “Y5” for a yellow card with the number five. In practice, playing the game with the Brailled cards is painfully slow. If Protactile hadn’t given us permission to rip sighted norms into shreds, I would still be fingering those dots like a fool. The way to go is with textured shapes, as in our homemade version of UNO, called Textures and Shapes. In this Protactile version you feel the player ahead of you hesitate and make a joking gesture before depositing a velvet star. Now the attention shifts to you, with some hands feeling yours as you deliberate, while a couple of knees tauntingly jostle your knees. Should you unload your velvet square or your rayon star? But the transformation doesn’t end there. Ideally, there are up to four players, who can feel everything at all times if they want to follow the action, or can chatter in three-way Protactile while the fourth attends to their turn. With four players as the ideal limit, there are somewhat fewer textured shapes than there are UNO cards in a set, and there are further tweaks to the rules. And the “wild card” is a delightful eruption of fabrics! It’s a different game, and one that is naturally more inclusive than UNO could ever be. Our environment has endless potential for life. For centuries, however, much of our vitality was forbidden. We were forced to stick with the effects of the hearing and sighted world. Now, though, we are all in varying stages of flight.
Sighted and hearing people have always had a hard time accepting that we are happy for them. Why have they never been happy for us? They wish only to be happy for themselves through us. Part of the fear many of them feel when encountering DeafBlind people comes from the way we naturally decline so much of what they cherish. They seek relief from this anxiety by insisting that we take in their world. Then they ask us a rhetorical question: “It’s great, isn’t it, this world of ours?” This is the awful function of access: to make others happy at our expense. Until Protactile plunged us into the churning currents of being, we didn’t know what we were giving up by consuming access. And the sighted and hearing didn’t know what they were missing out on by not entering our world.
In April of 2020, I made a small but telling gesture. An online journal wanted a photo of me to go along with three poems it was publishing. I had long wanted to do something about this photo business, even if there were an image description to make it “accessible.” Since I don’t see author images, I’m not immersed in the conventions of that particular species of media. So why should I provide a headshot as if I knew what it conveyed and knew that it was what I wanted to convey? To my surprise, the magazine agreed to my request: No photo! Instead, a few words, a tactile description suggestive of what it’s like to touch me in person. I now tinker with it like I do with my bio. My current line goes something like this: “Short hair of feline softness. Warm and smooth hands. A scent of patchouli. Flutters betray his exhilaration.”
A few months later, I went aflutter when Terra Edwards, a dear hearing-sighted friend and a leading Protactile researcher, told me she wanted to henceforward avoid images as much as possible for the materials we publish related to Protactile. Our research team set up a website called the Protactile Research Network. No pictures, icons, or graphics. Text only! Under “People,” where our bios and CVs reside, there is a “tactile impression” of each member of the team. I love Terra’s: “Strong hands. Heats up in conversation. Frequent and enthusiastic tapping likely.”
And here is Hayley Broadway, a DeafBlind researcher currently working with DeafBlind children on Protactile language acquisition: “Wears fashionable, textured attire. Wiggles her fingers on you when she is deep in thought. When you talk to her, you feel a steady stream of taps and squeezes. Sometimes, when she is really excited, she slaps you. Engage at your own risk.”
As always, Jelica Nuccio—my dearest friend, personal hero, and rock of the Protactile movement—has the last hug: “Her stories are smooth and come with the scent of lavender. She draws you in slowly and then grips. When she laughs on you, you can’t help but laugh too.”
John Lee Clark is a National Magazine Award–winning writer and a 2020–21 Disability Futures Fellow. His first collection of essays is Where I Stand: On the Signing Community and My DeafBlind Experience (Handtype Press, 2014), and he is at work on his second collection. He makes his home in St. Paul, Minnesota, with his family.