
Drones Are Doing the Dirty, Dangerous Work of Search and Rescue

As drones get less expensive and computer vision systems improve, rescuers are getting help from artificial eyes in the sky


A few years ago a Scottish mountaineer descending Broad Peak, a menacing, 26,400-foot Himalayan crag on the Pakistan-China border, fell from an ice cliff. When he failed to return, his team concluded that he had fallen and could not possibly have survived, and they continued down to a lower camp. When other mountaineers there heard about the lost climber, they realized they might be able to help: they happened to have a drone. Maybe they could use it to find the man, or his body?

The mountaineers flew the quadcopter through cold, blustery winds to the climber’s suspected position, scanning with its camera. And there he was, alive after 36 hours alone without supplies, clinging to an icy cliff face with his crampons. They tagged his location with GPS coordinates, and a rescue team reached him within hours, ending the harrowing ordeal. “Without use of the drone, locating the climber and executing such an efficient rescue would have been unlikely,” according to a 2019 case study published in Wilderness & Environmental Medicine.

The climber was one of thousands of people who get lost, hurt or killed in the wilderness every year. Mushroom hunters wander, hikers turn their ankles and mountaineers get “cliffed out,” finding themselves in a spot where they can neither go up nor down. Volunteer search-and-rescue (SAR) teams often need to step in. With the survival clock ticking, those teams have typically had to scout by guesswork and meticulously comb large patches of remote territory on foot. Traditionally, even when they knew approximately where a missing person was, but needed to, say, scale a rock wall to reach that person, they had to gather their geographic intelligence from the field. All of this is now changing as SAR teams gain access to cheaper, easier-to-use drones, which help them find lost people faster while keeping rescuers safer.




Danger, Dirt and Drudgery

When Colorado’s Chaffee County Search and Rescue–South started using remotely piloted craft in 2018, it was only because a team member bought and donated four commercial video camera drones. Today this SAR team has also gained a newer model that was half the price and designed specifically for public service rather than cinematography, according to drone program leader Bill Sample. The drones are deployed from a mobile command center—a cargo trailer with two large monitors for viewing camera feeds and generators that recharge batteries. A few licensed members of the team operate the drones.

The drone operators are currently training to use the machines to help their team with swift-water rescues, looking for people who have fallen in the Arkansas River. “You can hop on a raft, and you can go down the stream at eight miles an hour,” trying to search while avoiding boulders or other hazards, Sample says. “Or you can get the drone, and you can go downstream at 25 or 30 miles per hour with much less risk.”

The view from a drone flying above the steep McCoy Gulch on the southern side of Colorado's Mount Shavano. Rescuers have lowered a harness to a man about to be lifted to safety.

Chaffee Search and Rescue South

Chaffee County Search and Rescue–South now uses drones for about 20 percent of its missions—the ones that, for searchers using traditional techniques, would fall within categories Sample calls the “three D’s.”

“Give me the dangerous jobs” (such as scouting precarious cliffs), he says. Or give him the dirty jobs, he continues—such as the time his team slogged through a swamp in search of a missing person. Finally, give the drone scouts the drudge work, such as tediously scouring a large area. In Chaffee County, that might mean a steep mountainside above tree line. “If you’ve got good visibility, you can cover almost a square mile an hour” with a drone, Sample says. People (and their feet and eyes) are much slower and would take far more precious time to reach a mountaintop.

A Drone’s-Eye View

These drone applications mirror other success stories across the country. Last fall, for example, a drone spotted a missing hiker who had fallen hundreds of feet down a cliff north of Los Angeles. And aerial vehicles are even more useful when coupled with artificially intelligent software that can quickly parse a drone’s video feed and, often far faster than a human eye, identify a person among natural features. One such piece of software, called Loc8, seeks a specific color in photographs or video frames, looking for clustered pixels an eye might miss—such as a hint of orange if a missing climber is reported to have been wearing an orange jacket.
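Loc8’s internals aren’t public, but the clustered-pixel color search the article describes can be sketched in a few lines of Python with NumPy. The grid size, color tolerance and hit threshold below are illustrative assumptions, not Loc8’s actual parameters:

```python
import numpy as np

def find_color_clusters(image, target, tol=30, cell=16, min_hits=10):
    """Flag grid cells containing a cluster of pixels near a target color.

    image: H x W x 3 uint8 array; target: (r, g, b); tol: per-channel
    tolerance; cell: grid-cell size in pixels; min_hits: matching pixels
    needed to flag a cell. Returns (row, col) cell coordinates worth a
    closer look by a human searcher.
    """
    mask = np.all(np.abs(image.astype(int) - np.array(target)) <= tol, axis=-1)
    hits = []
    h, w = mask.shape
    for r in range(0, h, cell):
        for c in range(0, w, cell):
            if mask[r:r + cell, c:c + cell].sum() >= min_hits:
                hits.append((r // cell, c // cell))
    return hits

# Synthetic test frame: gray terrain with a small orange patch (a "jacket").
frame = np.full((128, 128, 3), 90, dtype=np.uint8)
frame[40:46, 70:76] = (255, 120, 0)               # 36 orange pixels
print(find_color_clusters(frame, (255, 120, 0)))  # → [(2, 4)]
```

A handful of orange pixels scattered across a frame would escape a tired human eye reviewing hours of video; a per-cell count like this never blinks.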

This type of computer vision is an active area of international academic research. Because desktop computers can be awkward in the wilderness, researchers in Scotland are creating object-detection software simple enough to run on a cell phone. In Croatia two scientists trained a neural network to recognize humans using a dataset of drone images collected mainly for search-and-rescue operations in their region.

In Austria computer scientists have created techniques to spot humans who are hidden—even to an object-detection algorithm—by tree cover. In papers published between 2020 and 2024, computer scientist Oliver Bimber of Johannes Kepler University Linz has been pursuing the idea of “airborne optical sectioning.” This technique helps compensate for the fact that drone cameras typically have small lenses. “That means everything that you capture, no matter how far it is away from the camera, is in focus,” he explains.

A larger lens lets an operator zoom in on a specific focal distance, leaving the rest of an image blurry. In a forested area, for example, a user could deliberately blur the treetops while keeping the ground beneath them in focus. With a big enough lens, the foliage becomes so unfocused that its reflected light is spread across the picture. “The occluding trees are still in the image,” Bimber says. “But the image signal has been weakened so much that they disappear.” What remains are the sharp features below them—such as, hopefully, a lost or injured human.

Inside a trailer that serves as the drone control center for Chaffee Search and Rescue South. The map on the left monitor shows the drone's location and takeoff point, while the other monitor (right) displays the pilot's view, including a live video feed from the drone.

Chaffee Search and Rescue South

Bimber’s approach simulates a camera with a larger lens to mimic the focusing effect. (For his purposes, “larger” would mean many meters wide—obviously impractical for a small quadcopter.) Essentially, it combines images that were captured either in sequence from an individual drone or simultaneously from a drone swarm. The synthesis of these many images into a cohesive picture achieves the sharpening effect. “This technique,” he says, “can remove occlusion in real time,” digitally pulling the leaves back to reveal what’s beneath. The published tests have involved single drones and a simulated swarm of them, but a forthcoming paper in Science describes a study that used a physical swarm.
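Bimber’s pipeline is far more sophisticated, but the core intuition, that ground-plane features align across registered views while elevated occluders are displaced by parallax and smear out when the views are integrated, can be shown with a toy one-dimensional simulation (all numbers here are invented for illustration):

```python
import numpy as np

# A drone photographs the same stretch of ground from several positions.
# After registration to the ground plane, a "person" on the forest floor
# lands at the same pixel in every view; the canopy above shifts between
# views by parallax. Averaging the stack keeps the ground sharp and
# spreads the occluder's signal thin.
width = 200
ground = np.zeros(width)
ground[100] = 1.0                      # target on the forest floor

views = []
for shift in range(-10, 11, 2):        # 11 simulated camera positions
    occluder = np.zeros(width)
    occluder[60 + 3 * shift] = 1.0     # canopy displaced by parallax
    views.append(np.maximum(ground, occluder))

stack = np.mean(views, axis=0)
print(round(stack[100], 2))            # 1.0  (target reinforced)
print(round(stack[60], 2))             # 0.09 (occluder suppressed)
```

The occluder appears at a different pixel in each view, so its averaged signal drops to one-eleventh of its strength, which is the “weakened so much that they disappear” effect Bimber describes.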

With data gathered from the synthesized large lens, Bimber says, his research group uses neural networks—similar to those developed by other teams for SAR applications—to pick out and track human-shaped objects. In the researchers’ newer work with drone swarms, he adds, they have also advanced to “anomaly detection”: searching beneath the trees for things that could be “abnormal with respect to color, temperature, or motion pattern” and then tracking them. SAR teams can then examine those anomalies specifically for signs of people.
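As a rough illustration of the anomaly-detection idea (not the group’s actual method), flagging things “abnormal with respect to color, temperature, or motion pattern” can be as simple as marking values that sit far from the scene’s statistics; the z-score threshold here is an arbitrary assumption:

```python
import numpy as np

def flag_anomalies(values, z_thresh=4.0):
    """Return indices whose value deviates from the scene mean by more
    than z_thresh standard deviations — a crude statistical stand-in
    for anomaly detection over, say, a thermal channel."""
    mu, sigma = values.mean(), values.std()
    return np.flatnonzero(np.abs(values - mu) > z_thresh * sigma)

# Simulated thermal readings: cool forest floor with one warm spot.
rng = np.random.default_rng(0)
scene = rng.normal(8.0, 0.5, size=1000)   # ~8 °C background
scene[421] = 30.0                          # a warm, human-sized anomaly
print(flag_anomalies(scene))               # → [421]
```

The detector doesn’t know what a person looks like; it only knows what the background looks like, which is why rescuers still examine each flagged anomaly themselves.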

In their tests—in which people hide in the woods while a camera drone flies above them—the researchers found their quarry more than 90 percent of the time when they used a single drone to seek a stationary person. With the “lost” person on the move, a drone swarm yielded similar results.

Tools like Bimber’s could potentially be paired with others’ research products, such as software that would map a drone’s optimal flight path, for human searches. The drones could even help with radio communication, acting as repeaters that would boost signals in the steep and remote areas rescuers often find themselves in. In southern Utah a SAR team helped researchers test such a system by trekking to locations where they typically had trouble connecting. They were able to call back to base with no problem. In the U.K. the Warwickshire Search and Rescue team has even partnered with Virgin to test a drone that directly delivers 5G connectivity to rescuers, no matter where they are.

While people remain central to search and rescue, drones can be useful partners. Figuring out how that looks in Chaffee County has been rewarding for Sample, who is a retired engineer. “I just love to solve problems,” he says. “That's what engineers do: solve problems. Keeping the drones in the air and trying to find people with them does that.” And whether an organic or synthetic eye spots them first, the wandering or wounded will be just as grateful.

Sarah Scoles is a Colorado-based science journalist, a contributing editor at Scientific American and Popular Science and a senior contributor at Undark. Her newest book is Countdown: The Blinding Future of Nuclear Weapons (Bold Type Books, 2024).
