One afternoon in September 2018, six years after the work accident that destroyed his left forearm and hand in an industrial conveyor belt, a North Carolina man named Brandon Prestwood stood in front of his wife with an expression on his face that was so complicated, so suffused with nervous anticipation, that he looked torn between laughter and tears. In the little group gathered around the Prestwoods, someone held up a cell phone to record the curious tableau: the pretty woman with long hair and glasses, the bearded guy with a white elbow-to-fingertips prosthetic, and the wiring running from a tabletop electrical device up under the guy’s shirt and into his shoulder.
Right through the skin, that is, so that Prestwood—his body, not his prosthetic—was, for the moment, literally plugged in. As part of an audacious set of experiments by an international network of neurologists, physicians, psychologists, and biomedical engineers, Prestwood had let surgeons at Cleveland’s Case Western Reserve University slice into the end of his left arm and affix tiny electrical conductors to the truncated nerves and muscles. The surgeons then guided four dozen thread-thin wires up inside his half arm and out his shoulder. Afterward, whenever he peeled off the patch that covered them, Prestwood could see the wire leads poking out of his skin.
Welp, yep, they’re wires, Prestwood would say to himself. Coming out of my arm.
He’d wasted too much time lost in depression after the accident. He felt purposeful now. And for some months he’d been making regular trips to Cleveland so that researchers could help him fasten on an experimental prosthetic arm, one of a new generation of artificial limbs with internal motors and sensor-equipped fingers. These devices are of great interest to rehab experts, but what the Case Western Reserve team most wanted to study was not simply the improved control the new prostheses provide. What really fascinated the researchers—the focus of their exploration, each time they sat Prestwood down in the lab and plugged his wire leads into a computer—was the experience of human touch.
Because it’s damnably, wondrously complicated, this critical interplay of skin, nerves, and brain: to understand, to measure, and to re-create in a way that feels … human. That’s not a very scientific way to put it, but Brandon Prestwood is a case in point. Inside the Sensory Restoration Lab, as the Case Western Reserve researchers ran him through tests, there were encouraging developments; when Prestwood made the prosthetic hand close around a foam block, for example, he felt a pressure against the foam. A connection. A tingling that seemed to be coming from fingers he no longer possessed.
Amy Prestwood had never been able to join her husband, though, during his lab sessions in Cleveland. It was not until that September afternoon, when she went to the Maryland research symposium where Brandon was among the demonstrators of new technology, that the two of them could stand within reaching distance while Brandon was wearing the experimental prosthesis with his shoulder wires plugged in.
Brandon keeps on his phone that video recording of what happened next. He still loses his composure when he talks about it. No one has edited or polished this clip; all you see is two people facing each other in a big room, uncertain and awkward, like adolescents at a first dance. Brandon looks at his feet, looks at his prosthetic fingers, grins. With his right arm, the intact one, he points Amy toward his left: Come over here.
The growing literature about our sense of touch is rich with new science, conjecture, and fantastical propositions for the future—but there are four seconds in that video I want to describe, just as Amy wraps her fingers around the hand of Brandon’s prosthetic. His head snaps up. His eyes widen. His mouth falls open. She’s watching him, but Brandon is staring straight out, plainly not seeing anything. “I could feel,” he told me. “I was getting feedback. I was touching her. I was crying. I think she was crying.”
She was. The day he showed me the video we were deep into the pandemic and sitting outdoors; Prestwood had been in the Cleveland lab for hours and wanted a cigarette. We’d met in person for the first time that morning. I can’t remember how we resolved the hesitant do-we-shake-hands-or-not moment—hesitant not because Prestwood has only one hand, but because it seemed as though everybody on Earth was still trying to figure out how to approach each other, how fully to close the distance, how to touch.
Maybe you recall the photos of people pandemic-embracing through shower curtains or hanging plastic. This magazine published an especially affecting one, a clear drop cloth clipped to a clothesline; with the plastic between them, a woman and her daughter clung to each other for the first time in months. I swear I know the sound and feel of that moment: My own daughter improvised something similar, after the weird aching season of across-the-backyard distanced visits, and I can still bring to mind the mercy of that hug.
Through a barrier, yes. Crinkly, slippery. Plasticky. A diminishment, you’d think. But my “need state,” as Liverpool John Moores University neuroscientist Francis McGlone puts it, was too heightened for me to notice. “It’s like being low on a vitamin,” McGlone told me. “You needed to be topped up again.”
Topped up with what, exactly? My grandmother would have looked at me sideways for imagining that was a question that merited answering. But neurologists and psychologists have biological markers now to explain what seems intuitively obvious to so many of us—that most human beings require the physical presence of others, the comforting touch of others, in order to stay healthy. Take in this academic-sounding prose, before I tell you where it first appeared:
Touch is a fundamental aspect of social interaction, which is a fundamental human need … Social touch calms the recipient of the touch during stressful experiences … can reduce activation in threat-related regions of the brain … can influence activation in the stress pathway in the nervous system, reducing levels of the stress hormones … has been found to stimulate the release of oxytocin, a neuropeptide produced in the hypothalamus … Elevated levels of oxytocin are associated with increased trust, cooperative behavior, sharing with strangers, the more effective reading of others’ emotions, and more constructive conflict resolution.
That’s from a federal lawsuit against solitary confinement. Lawyers bringing the decade-old suit, on behalf of inmates at a California maximum-security prison, argued that its practice of isolating inmates for years—with an elaborate security system that nearly eliminated physical contact with others, even guards—amounted to unconstitutional cruel and unusual punishment. Details of a settlement are still being fought over in court, but part of the permanent record now is the expert report written by University of California, Berkeley psychology professor Dacher Keltner, who for more than 15 years has been teaching and supervising research in the science of touch. “It’s our earliest and, you could argue, our fundamental language of social connection,” Keltner told me.
Earliest evolutionarily, he means: We humans are believed to have used “tactile communication,” as the scientific papers say, before we began figuring out speech. And earliest individually: Touch is now understood to be the first sensation a fetus perceives. At birth and during the initial months of life, it’s an infant’s most critical and fully developed sense—the way babies start exploring the world, developing confidence, learning where their bodies end and everything else begins.
One of psychology’s most influential and disturbing studies of touch featured babies, in fact, although in this case they were lab monkeys. In the late 1950s, a team led by University of Wisconsin psychologist Harry Harlow took newborn rhesus macaques from their mothers and isolated them in cages with two vaguely monkey-shaped surrogates, one made of bare wire and the other covered with soft terry cloth. In one of Harlow’s experiments, only the wire surrogate dispensed milk. The babies taught themselves to drink from it, but as soon as they were done feeding—and whenever the scientists thrust a horrible head-waggling mechanical monster at them—they scurried to their softer fake mothers, grasping the cloth midsections in a clutch that can best be described as a desperate-looking embrace.
There’s an old Harlow monkey video floating around online, and it’s awful to watch: Harlow in his lab coat, calmly narrating to an onlooker as a lone caged baby scooches up against the terry cloth. But the psychologist had what was then a heretical point to make. Influential Western child-rearing authorities of his era were instructing parents not to touch their babies any more than absolutely necessary—to regard cuddling and kissing infants and small children as antiquated forms of overindulgence. (Your children will grow up weak and dependent, they insisted; besides, it’s unsanitary.)
Harlow’s monkey experiments were ethically repugnant by modern sensibilities, but they’re part of the reason we now know how wrong those authorities were. The baby macaques, our close evolutionary cousins, needed what Harlow called “contact comfort” so profoundly that they spurned a steady food source in favor of a soft touch.
Post-Harlow studies have multiplied the evidence for the power and chemistry of contact comfort. Scientists working with lab rats, for example, have found that gently handling and stroking the rats benefits the animals’ ability to learn and to manage stress. The right kind of skin-to-skin touch produces specific, measurable improvements in human babies’ health too: heartbeat, weight, resistance to infection. Neonatal incubators were designed to hold preemies and other low-birth-weight babies in protective sterile isolation, but some hospitals now also treat these babies with a vividly named protocol called kangaroo mother care—placing the newborns against their mothers’ bare chests, as soon as possible after birth, and keeping them there for many hours at a time.
Babies held skin-to-skin against their mothers have constant, immediate access to breast milk and can absorb protective maternal microorganisms. Hospital studies have also found that when the mother is ill or otherwise unable to hold the baby for long periods, another adult can substitute as a temporary skin-on-skin kangaroo. It’s not romantic hyperbole to say that the physical warmth and touch of a mother—or a father or any other attentive person who understands the delicacy required—can keep a newborn baby alive.
“Feel these,” Veronica Santos said, and pulled four rectangular tiles from a desk drawer. “With your eyes closed.”
What my fingers told me, within seconds: Plastic, all four tiles. Pits on one. A bump on another. Curves. Angles. A raised square about the size of a stamp.
If you have the use of at least one hand, you engage in this kind of instant skin-to-brain memo over and over every day. Which of the shapes inside your purse is the pen you’re groping for? Is your wallet still in your back pocket? Did the kids leave the car upholstery sticky again? Right now, assuming you’re wearing a garment, try feeling the cloth: pants, shirt, pajamas, doesn’t matter. Just don’t look down.
Santos, an engineer who directs the Biomechatronics Laboratory at UCLA, had me do that too—describe the texture of the skirt I was wearing, without looking at it. You and I likely reacted the same way: We didn’t plunk down a finger, the way we might point out a spot on a map. Instead we moved a fingertip or two lightly back and forth across the fabric, or rubbed it between forefinger and thumb.
Anatomy taught us this, not culture; we humans are wrapped, as I once heard another scientist say, in “an incredibly complex sheet covered with sensors.” The skin, that is, our largest organ. Its layers contain hundreds of thousands of receptor cells, unevenly distributed around the body’s surface, specialized for various jobs. Some shoot the brain signals about temperature or the harmful disruption we perceive as pain. Some seem specialized to soothe; neuroscientist Francis McGlone is part of an international group of scientists studying receptors, densest in the hairy skin of the arms and back, that produce a pleasant feeling when the skin that contains them is brushed or stroked.
And some receptors send the brain the kind of informational detail that helps tell us, all day long, what we’re touching and doing and using. Mechanoreceptors, these are called; conveniently, evolutionarily, their density is especially high along the palmside skin of the fingertips and hand. They’re working for you—again, if you have the use of at least one hand—at this very moment. You’re reading this story on some sort of device, presumably: a computer or tablet or phone. Try closing your eyes and running a finger around the edges. Let your fingertip find metal, plastic, corners, ridges.
Done? OK. Just now, from your hand to your brain, so much was happening. The pressure against your fingertip pads, the distortion to your skin, the vibrations you didn’t notice as you slid your finger over surfaces—each of these tiny alterations to your own sensor-covered sheet was stimulating its mechanoreceptors. Four varieties of such touch receptors have been identified, each with a subspecialty of its own; your vibration-sensing mechanoreceptors, for example, were firing away as your fingertips moved across textures of machine and cloth. Nerves carry those signals from the skin up to the brain, which instantly sorts and labels: Smooth! Different kind of smooth! Denim! Corduroy!
None of this takes place in isolation, of course. Context—smells, sounds, memory, situational input—affects everything. I know that’s corduroy because I learned long ago what corduroy feels like. It’s why the touch of another’s hand can please in one context and repel in another. “The entirety of our perception is built against the lifetime of our experience,” said Case Western Reserve University biomedical engineer Dustin Tyler. “The system we’re working with”—the interplay of receptors, nerves, and brain, he means—“is always taking information in, filing it, associating it, connecting it, and creating our us. There is no beginning and end to it. We’re trying to tap into that.”
Tyler leads the multispecialty team working with Brandon Prestwood and eight other patients, all of whom—because of amputation or in one case paralysis—have lost at least one limb’s natural capacity to feel touch. A conversation with Tyler can ricochet between metaphysics and plainspoken exuberance; I once asked him how he’d found his way from a college engineering major to sensory restoration experiments, and his thoughtful reply included “Holy crap!” and “Awesome.” Electrical engineering, awesome. Neural networks, same. Neural networks run on the body’s internal electricity, after all; it’s electrical pulses that carry signals up and down the nerves. “I was fascinated by the brain,” Tyler told me. “I’m still amazed every day at how the machine we ride around in works.”
The overlap of neuroscience and engineering has a long history. During the 1960s and ’70s, for example, scientists began successfully using electrical stimulation and electrodes, surgically implanted or attached to the skin, to activate the muscles of people with paralysis. Tyler’s wife, Joyce, is a retired occupational therapist, and her work with post-amputation patients helped draw his attention to a parallel 21st-century neuroengineering challenge: What about touch? With so many post-9/11 Iraq and Afghanistan war veterans suffering explosives injuries, the U.S. Veterans Affairs and Defense Departments were heavily funding prosthetics research. In their quest for “near natural,” a term the researchers sometimes use as they incorporate new technology into new kinds of replacement limbs, could they make these limbs feel near natural as well? Might a prosthesis with built-in sensors, in conjunction with implanted electrodes, let an amputee perceive touch through the device, as though it were a living body part?
The answer, based on investigations at Case Western Reserve and a half dozen other research centers, is yes. Sort of. “We’ve found this is a challenge with all our subjects—what words do you use?” Tyler said. “ ‘Tingle’ is the most typical. A lot of the time they have no reference frame. It’s not like anything they’ve felt before.”
Like a cold drop of water, one patient told him. Or that prickly sensation after your hand or foot has fallen asleep and is just starting to come back. “I use the word ‘buzzing’ sometimes, but that’s almost too strong,” Prestwood told me. “Like somebody’s taking the tip of a sewing needle, and not trying to prick my skin—just touching it.”
Each center is experimenting with its own combination of implants and prostheses; the graphic, created with the guidance of engineer Max Ortiz Catalán at Sweden’s Chalmers University of Technology, shows the arrangement Chalmers scientists have developed. Here’s the central idea: A post-amputation patient—a man like Prestwood, say, who’s lost his whole forearm—has truncated nerves within the part that remains. Those nerves are still able to send signals the brain perceives as coming from the missing limb; this can be one of the causes of phantom limb sensation.
So the trick is restoring the signaling. The sensors built into these experimental prostheses convert contact with a surface—a prosthetic finger touching a tabletop, for example—into electric signals. The sensors send that data to a computer, which determines which nerves must be stimulated to make the brain perceive the touch in the appropriate place. (Index finger? Thumb? Second knuckle on ring finger?) The computer sends pulses down the patient’s implanted wiring to an electrode, which stimulates the indicated nerve, setting off biological electric pulses of its own. Voilà: sensory information, ideally the right information, en route to the brain.
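For readers who think in code, the chain just described—sensor reading in, stimulation command out—can be sketched in a few lines of Python. Everything here is illustrative: the electrode numbers, the pressure-to-frequency mapping, and the function name are hypothetical stand-ins, not the Case Western Reserve system.

```python
def encode_touch(sensor_id: str, pressure: float) -> dict:
    """Map one prosthetic sensor reading to a stimulation command.

    A toy model of the pipeline in the article: a sensor on the
    prosthesis reports contact; software decides which implanted
    electrode contact to pulse, and how fast, so the brain perceives
    touch in roughly the right place.
    """
    # Hypothetical lookup: which electrode contact evokes a sensation
    # the wearer localizes to this region of the missing hand.
    electrode_map = {"index_tip": 3, "thumb_tip": 7, "ring_knuckle": 12}
    # Firmer contact -> faster pulses, capped at a safe maximum
    # (a simplified encoding, with made-up numbers).
    freq_hz = min(10 + pressure * 90, 100)
    return {"contact": electrode_map[sensor_id], "freq_hz": freq_hz}

# A light touch on the index fingertip sensor:
command = encode_touch("index_tip", pressure=0.5)
```

The real systems must do this continuously and nearly instantaneously, for many sensors at once, which is part of why each participant spends so many hours tethered to the lab computer answering “Where do you seem to be feeling that?”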
When it’s working correctly, all this should happen nearly instantaneously, from the brain’s perspective, like the neural signaling we’re born with. But no two bodies are exactly alike, and for the participants who volunteer, so far about two dozen in U.S. and European research hospitals, the process demands forbearance: serious surgery followed by many hours in research labs, answering questions while tethered to a computer. “Where do you seem to be feeling that?” “How about now?” Even so, Prestwood and other participants told me, they signed on mostly for the chance to help scientists learn how this will play out—whether injured veterans and other amputees may someday be able to wear a near-natural limb that actually feels that way.
“I just wanted to see if I could pay it forward,” said Keven Walgamott, a Utah real estate agent who lost parts of his right arm and foot two decades ago after a power line sparked while he was lifting a pump out of a well outside his home. Starting in 2016, Walgamott spent more than a year as a research volunteer at the University of Utah, where he was temporarily implanted with electrodes, including some developed by scientists there. Inside their lab, wired into a computer, Walgamott would put on one of the new sensorized prostheses—this one named the LUKE, for Life Under Kinetic Evolution but also for Luke Skywalker, the Star Wars Jedi who loses his hand in a light-saber fight with Darth Vader. By the end of The Empire Strikes Back, Luke has a prosthetic that can apparently do everything, including feel. If you enter “Walgamott eggs” or “Walgamott grapes” into a search engine, you’ll see him in a Utah lab with the LUKE: Concentrating, his face sober, he’s performing the kind of simple tasks that are almost impossible for hands that can’t feel.
He lifts a raw egg in its shell, with just the right delicacy, and sets it gently into a bowl. He holds a grape cluster with his actual hand, closes a prosthetic thumb and finger around a single grape, and pulls it off without squashing it. Video clips from other research centers show similar small triumphs: at Case Western Reserve, a blindfolded patient using sensorized prosthetic fingers to pinch and pull off the stems of cherries; in Sweden, a Chalmers patient inside his own garage, using tools with both his natural hand and his prosthetic one.
But what many of the study volunteers most wanted to feel—what they longed for, they told Tyler and other scientists—was the touch of human skin. “I was amazed at how many of them just wanted to connect with somebody,” Tyler said. “It wasn’t functional. It was just: ‘I want to hold my wife’s hand.’ ”
Once I asked Prestwood, after apologizing for the boorish question, why it mattered so much to perceive Amy’s fingers around his missing left hand when his intact right hand had been there all along. He didn’t take offense. He said it was hard to put into words. Finally he said: It made him feel whole. “Because it’s something I lost,” he said. “For six years I had not held my wife’s hand with my left hand, and now I was. It’s the emotion that goes with any kind of touch. It is … it’s being complete.”
Tyler found this at once moving, profound, and provocative. What does it mean to feel the joy of a loved one’s touch when the sensation is like the tip of a sewing needle? And if the right circumstances can make a certain kind of zap to the cortex register as the squeeze of human fingers, what might that imply for individuals separated by distance? “Like, holy cow. What could we do?” Tyler said. “This is way beyond prosthetics.”
Which brings us to Veronica Santos, and her Los Angeles lab full of robots. “Biomechatronics” essentially means what it sounds like, the blending of biological and mechanical science, and Santos specializes in developing sensors for robot hands. Much of her work is meant to make robots more useful in medical settings and in places that are dangerous for humans, like the depths of the sea. But three years ago she began collaborating with Tyler on a series of experiments in … well, the nomenclature is still unsettled. “Remote touch.” “Distributed touch.” Just picture this: One person in Los Angeles, one person in Cleveland. Across 2,000 miles—the distance from UCLA to Case Western Reserve—they’re trying to shake hands.
A robot is involved, and I’m about to explain how; Santos and Tyler decided to hook me up for a go as the Cleveland end in one of their experiments. Scientists and science fiction writers have for many decades considered how this might play out, a person in one place making what feels like physical contact with a person or an object somewhere else. If you’ve ever felt a cell phone vibrate, you’re part of the endeavor: That’s a wireless signal, from another locale, firing a minuscule motor that fires mechanoreceptors in your skin.
The engineering term of art is “haptics,” from the Greek haptikos: relating to the sense of touch. Any technology designed to set off touch sensations is haptic—those restaurant pagers that buzz in your hand when your order’s ready, for example. You can buy virtual reality gloves now, to be worn with virtual reality goggles and wired to make your actual fingers and palms feel something like contact as your virtual hands touch virtual things. (You see a wall in the virtual room your goggles are displaying; lifting your actual hand puts your virtual hand against the wall, and a force in the gloves pushes back to create the illusion that you can’t bust through. Or your virtual fingers touch a virtual tractor on virtual farmland, and your actual fingers feel the engine throb.) Gamers are currently the biggest consumer market for such gloves; they’re also being used to make VR training devices, such as flight simulators, feel more realistic.
Compared to the symphony that is natural human touch, though, the technology has a long way to go. That’s not my metaphor, the symphony; I heard it from three different scientists trying to help me appreciate the orchestral coordination behind sensations we take for granted. “I’m making do with these amazing engineered materials, and they’re still our kludgy way of trying to re-create what my little nephew was just born with, nine months ago,” Santos told me. “I’m still humbled by that.”
The day I set out to feel her fingers from eight states away, Santos was wearing a T-shirt, blue jeans, and a pandemic face mask. I caught a wobbly glimpse of her, livestreamed and in 3D, through the VR goggles two Case Western Reserve researchers had strapped onto my head. Then she tipped abruptly sideways, vanishing from view, and what was I seeing now? Floor tiles. A desk leg, two shod feet—Oh. Santos’s feet. I raised my goggled eyes. “Hi,” Santos said.
What she was greeting was a wheeled robot, which after bumbling around the UCLA lab furniture had finally stopped to point its video camera face at hers. To use the researchers’ parlance, I was “embodying” that robot, seeing through its eyes, hearing through its microphone, and lurching like a drunk because of the human incompetently navigating from Cleveland. Nothing so remarkable about that, in the era of drones; the novel part was my own right hand, which was—here’s that word again—embodying the metal and plastic hand of the rolling Los Angeles robot. Taped to my gloved palm and my index finger were two metal disks. Wires connected the disks to a lab computer, which was internet connected to the robot, which had tactile sensors on its own robot fingers. Whenever the robot touched a surface, the sensors shot pulses to its robot brain—its computer. Those pulses zipped across the country, down the lab wiring onto my hand disks, through my skin, and then up my nerves into my somatosensory cortex.
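The round trip I was riding—robot fingertip sensor, internet, metal disks taped to my palm—amounts to a small telemetry loop. A minimal sketch of that loop follows, with invented names and scaling; the labs’ actual software is not public, so treat every detail as an assumption.

```python
import json

def robot_touch_event(finger: str, force: float) -> bytes:
    """Serialize a tactile event from the robot's finger sensors,
    ready to be sent across the network (hypothetical format)."""
    return json.dumps({"finger": finger, "force": force}).encode()

def drive_skin_disk(packet: bytes) -> dict:
    """Decode a received packet and choose a stimulation level for the
    wearer's palm disks. The force-to-amplitude scaling here is
    illustrative, not the labs' calibration."""
    event = json.loads(packet)
    # Scale robot contact force (0-1) into a gentle pulse amplitude.
    return {
        "finger": event["finger"],
        "amplitude_ma": round(event["force"] * 2.0, 2),
    }

# The robot's index finger grazes the wineglass in Los Angeles;
# a pulse command comes out the Cleveland end.
pulse = drive_skin_disk(robot_touch_event("index", 0.4))
```

In the real experiment this exchange runs continuously in both directions—my glove’s motions steering the robot, the robot’s sensors firing back at my skin—fast enough that the tap of glass on glass arrives as a single felt event.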
Buzzing, Prestwood had said, but fainter. A needle’s tip. Those were good words for this—plus a pressure against my fingers when I, meaning the robot, closed my hand around the plastic wineglass on a table beside Santos. The experiment was designed to suggest two separated people celebrating a business deal with a glasses-clinking toast and a handshake. I failed the toasting part; my robot self kept dropping the glass. But the researcher whose place I had temporarily taken, a Case Western Reserve graduate student named Luis Mesias, was much more adept by now with long-distance touch. He’d learned how to maneuver his gloved hand expertly enough to pick up the glass in Los Angeles by the stem and tap it against a second glass his counterpart had raised. Feeling the tap, in Cleveland. Clack.
Mesias, embodying the Santos lab robot, has long-distance peeled a banana. He has long-distance squeezed a toothpaste tube with the gentle precision of a person preparing to brush his teeth. Give the research enough time and you can conjure a future in which touch is transmitted, as vividly as sight and sound are now, into tele-everything: work, travel, shopping, family gatherings. Consolation. Sexual intimacy. Medical care, the kind that requires a practitioner’s touch. Maybe in the metaverse, that not-yet-realized virtual gathering place that has leaped from sci-fi imagining into corporate business models, something we put on our actual bodies—gloves, a suit, whatever—will convince our brains that we truly are feeling virtual people, virtual animals, virtual things.
Maybe. If I hadn’t been looking right at Santos’s face as she went in for the handshake … if I hadn’t once shaken her actual hand and walked beside her in Los Angeles learning the timbre of her voice … in another context, I mean, the sudden buzzing-and-needles jolt to my skin would have felt nothing like the clasp of human fingers. It made me suck in my breath, though. I could see her face, as she laid her bare hand over the robot’s, and for a long time afterward I thought about Brandon and Amy Prestwood, and the solidity of my daughter’s embracing arms beneath that barrier of plastic, and how fully the mind can meld story and setting with the pulses running along human nerves.
Two years ago, in the early weeks of pandemic lockdown, a pastor told me about his first Sunday services on Zoom. What his congregants missed most acutely, he said, was the passing of the peace—the murmuring of “peace of Christ be with you” and the quick clasp of hands, there in the pews, one person to another. It didn’t occur to either of us to wonder just then about the biology of that touch, a two-second deformation of skin cells making humans feel connected to each other and to their God. The neural diagrams now taped to my office walls include many explanatory labels, Receptor Sites and Impulse Conduction and so forth, and when I asked Tyler just how much of this might eventually be replicated by bioengineering—how much of the symphony, via body electrodes and computers?—he corrected me before my question was finished. “ ‘Replicate’ is dangerous,” he said. “We struggle a lot with this. We don’t have the fidelity to replicate the natural scheme. The general term we use is ‘restore.’ ”
From my Merriam-Webster, the red clothbound edition my grandmother gave me a long time ago: Restore: to renew, to rebuild, to give back. Newer dictionaries inhabit my cell phone, but I keep that volume in reach because my palm on its worn cover sends my brain a story it understands. I’ve watched Brandon Prestwood speak before audiences of scientists; it still makes him nervous, he told me, but he’s learned simply to tell them what happened to him and watch them sit up straighter when he reaches the part about feeling Amy’s hand.
“In one of the speeches I gave, I talked about the military guy that’s been stationed in Afghanistan or whatever for a year,” Prestwood told me one of the last times we spoke. This was a hypothetical military guy, you understand, and Prestwood was riffing, imagining where the experimentation might lead. “And before he left, his wife got pregnant, he’s never seen his daughter, but he’s able to, you know, reach out and touch her via this system somehow. Or the businessman who hasn’t been home in six months. The National Geographic photographer headed off to the Ivory Coast.”
He meant Lynn Johnson, whose images accompany this article and who spent time with the Prestwoods at their home in Hickory, North Carolina. She had mentioned impending work in Africa, Prestwood said, and he was envisioning Johnson, her luggage of the future containing some over-the-counter version of nerve-stimulating electrodes and tactile sensors, with a matching setup at her widowed father’s home in Arizona. “Just to be able to give, and receive, the reassuring touch,” he said.
The National Geographic Society, committed to illuminating and protecting the wonder of our world, has funded Explorer Lynn Johnson’s storytelling about the human condition since 2014. Learn more about the Society's support of Explorers.
This story appears in the June 2022 issue of National Geographic magazine.