
Robots’ abilities still far from human, but getting ever closer


It may seem uncomfortably close to science fiction, but robots are moving ever nearer to acquiring humanlike abilities to see, smell and sense their surroundings, allowing them to operate more independently and perform some of the dangerous, dirty and dull jobs people don’t want to do.

They can smell gas leaks, conduct underwater surveillance and even sort boxes by shape and color, tossing them into the appropriate warehouse bin. Advances in sensor technology and software let these machines make split-second decisions, such as how to follow a scent trail or where to go next, without human masters overseeing them.


“They are gaining human capabilities, whether it’s smell or touch or recognizing our voices,” said Daniel Wilson, who has a doctorate in robotics and authored “Robopocalypse,” a techno-thriller about what happens when robots go wrong.

“If they are going to solve human problems, they will have to have human abilities. Those are things that robots will have to understand if they play a role in our lives,” he said.

Until now, robots have had to navigate via small infrared sensors that keep them from bumping into things. Some have relied on video cameras that send images to human operators. But a new generation of robots is gaining the ability to understand voices, see objects with humanlike depth perception and use grasping arms with dexterity close to that of humans.

Of course, none of them is yet as lifelike as Sonny, the android of the movie “I, Robot,” inspired by Isaac Asimov’s stories, who can feel, think for himself, move on his own and, in a limited way, emote. Most robots with advanced sensing abilities are still in the experimental stage. More than toys but not yet tools, they work well in the laboratory but cannot yet handle real-world situations.

Take, for example, the robot that sorts boxes, picking them up and placing them in the right container. It uses 2-D and 3-D video cameras and software to gauge the size and shape of each object and then decide where it should go.
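As a rough illustration of that kind of decision rule, here is a minimal sketch in Python. Industrial Perception’s actual software is not public, so the measurements, thresholds and bin names below are invented for the example:

```python
# A hypothetical sketch of the sorting logic described above; the
# fields, thresholds and bin labels are invented for illustration.
from dataclasses import dataclass

@dataclass
class BoxObservation:
    width_m: float         # dimensions estimated from 2-D/3-D camera data
    height_m: float
    depth_m: float
    rectangularity: float  # 0..1, how box-like the point cloud looks

def choose_bin(box: BoxObservation) -> str:
    """Pick a destination bin from an object's estimated size and shape."""
    if box.rectangularity < 0.8:
        return "manual-review"   # not box-like enough to sort automatically
    volume = box.width_m * box.height_m * box.depth_m
    if volume < 0.01:            # under roughly 10 liters
        return "small-parcels"
    if volume < 0.05:
        return "medium-parcels"
    return "large-parcels"

print(choose_bin(BoxObservation(0.3, 0.2, 0.15, 0.95)))  # -> "small-parcels"
```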

The robot works pretty well — as long as the boxes are pretty much rectangular and aren’t moving, said Stanford University computer science professor Gary Bradski, co-founder of Industrial Perception, the start-up that invented it. But the robot isn’t quite ready to replace human workers in the mailroom or on the factory floor.

“It’s easy to get 80 or 90 percent of the way there,” he said. “But it’s getting the speed and reliability to make it economic. You can’t fail very often; otherwise, you’re not saving any labor.”

Getting robots to smell is one of the bigger challenges.

A project at the University of Tokyo is taking a step in that direction. Scientists there recently unveiled a tiny robot that is driven by a male silkworm moth responding to a female moth’s seductive pheromone aroma.

The researchers built a motorized wheeled car that moves when a moth, spurred by the smell, launches into a mating dance of repeated zigzags on top of a trackball, similar to the ones used inside a computer mouse. As the moth does its dance, sensors transmit its motions to the robot’s motors, allowing it to follow the path chosen by the male.
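In essence, the trackball turns the moth into a joystick. The Tokyo team’s control code is not public, so the interfaces and sign conventions in this minimal sketch are assumptions for illustration:

```python
# A hedged sketch of the trackball-to-wheels coupling described above.
# read_trackball and set_wheel_speeds are hypothetical interfaces.

def drive_from_trackball(read_trackball, set_wheel_speeds, gain=1.0):
    """Translate the moth's walking motion into robot wheel commands.

    read_trackball() -> (dx, dy): ball displacement since last call;
        dy > 0 as the moth walks forward, dx > 0 as it veers right.
    set_wheel_speeds(left, right): differential-drive motor command.
    """
    dx, dy = read_trackball()
    forward = gain * dy   # forward speed follows the moth's stride
    turn = gain * dx      # zigzag turns become wheel-speed offsets
    # To turn right, drive the left wheel faster than the right.
    set_wheel_speeds(forward + turn, forward - turn)

# Demo with a fake reading: the moth steps forward while veering left.
drive_from_trackball(
    read_trackball=lambda: (-0.2, 1.0),
    set_wheel_speeds=lambda l, r: print(f"wheels L={l:.1f} R={r:.1f}"),
)
```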

The researchers said the “odor-tracking behaviors of animals” could eventually be “applied to other autonomous robots so they can track down smells, and subsequent sources, of environmental spills and leaks when fitted with highly sensitive sensors.”

Noriyasu Ando, an associate professor at the University of Tokyo’s Research Center for Advanced Science and Technology who worked on the moth robot, said the challenge is to develop a robot that can “behave alone, free from external wired connections because the silk moth turns quickly and rotates very often.”

He said the ultimate goal is to develop a robot with its own smelling capabilities, one that can follow a trail just like the moth on the trackball. The team is currently trying to build an artificial brain they’ve named Kei. The moth-driven robot, steered by the insect’s sense of smell, is one step toward that goal, he said.

Achim Lilienthal, who directs the mobile robotics and olfaction lab at Örebro University in Sweden, said smell is more complicated for robots than vision. Cameras can see an object as long as there is enough light, while odors drift as plumes and patches in the air and are not consistent in strength, which makes finding the source difficult.

He gave the example of methane emanating from an old landfill. The town managing the site had set up devices to capture the gas produced by the landfill’s decay and burn it to heat the local hospital. But over time, as the plastic lining beneath the landfill developed cracks, more than half of the methane was escaping capture. The town hired someone to walk around the site and sniff for leaks, but that didn’t work very well because the human nose is not very sensitive to the gas.

Enter Lilienthal’s gasbot, which looks like a lawn mower with a big metal eyeball perched on top of a metal pole. This mini all-terrain vehicle picks up smells using two laser beams: One reads the chemical signature of the methane, measuring how strongly the gas absorbs the laser’s light to determine its concentration in the air; the second helps build a three-dimensional map of the gas plume.

The advantage of the gasbot is that the lasers detect the gas remotely, without machine or human having direct contact with the plume.

“For most gas sensors (such as smoke detectors), you need to (physically) encounter the smell,” whereas the gasbot uses its lasers to detect gas at a distance, Lilienthal said.
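To make the two-laser idea concrete, here is a simplified sketch of how one remote concentration reading plus a range measurement could be dropped into a 3-D gas map. This is not Lilienthal’s code; the interface and the simplification of attributing the reading to the beam’s endpoint are assumptions for the example:

```python
# Illustrative only: project one remote gas measurement into 3-D space.
import math

def plume_point(robot_xyz, pan_rad, tilt_rad, range_m, ppm):
    """Place a gas reading in world coordinates.

    robot_xyz: sensor position; pan_rad/tilt_rad: beam direction;
    range_m: distance from the range laser;
    ppm: methane concentration sensed along the beam (in reality an
    integrated path measurement, simplified here to a point sample).
    """
    x0, y0, z0 = robot_xyz
    x = x0 + range_m * math.cos(tilt_rad) * math.cos(pan_rad)
    y = y0 + range_m * math.cos(tilt_rad) * math.sin(pan_rad)
    z = z0 + range_m * math.sin(tilt_rad)
    return (x, y, z, ppm)   # one sample in the 3-D concentration map

print(plume_point((0.0, 0.0, 0.5), pan_rad=0.0, tilt_rad=0.1,
                  range_m=10.0, ppm=120.0))
```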

Scientists are also working to create effective underwater robots. This task is highly challenging because there is often not enough light for cameras to function well, while swirling currents and eddies play havoc with smells and chemical plumes.

To deal with this, a European group has built a robot that uses something called lateral line sensing. The lateral line is a series of nerve cells in fish that runs from just below the head to the tail and allows them to sense the speed and direction of currents, helping them catch food and swim in schools without bumping into each other.

More than 30,000 fish species have lateral line sensing, according to Maarja Kruusmaa, a professor of bio-robotics at the Tallinn University of Technology in Estonia. She and colleagues set out to create an electronic equivalent that would allow underwater robots to navigate more efficiently through currents.

After four years of work they came up with FILOSE (Robotic FIsh LOcomotion and SEnsing), a robot that is shaped like a rainbow trout. The researchers developed tiny sensors to monitor pressure differences in the water flowing around the robot, allowing it to follow in the wake of an object to cut energy use, according to Kruusmaa.

“It is similar to reducing your effort in the tailwind of another cyclist or reducing the fuel consumption of your car by driving behind a truck,” she said.
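One simple way to picture the control problem: compare pressure readings along the robot’s two flanks and steer toward the calmer water of the wake. The sensor layout and control law in this sketch are assumptions in the spirit of FILOSE, not the project’s actual code:

```python
# A simplified sketch of wake following with lateral-line-style
# pressure sensors; the layout and gain are assumed for illustration.

def wake_following_command(left_pressures, right_pressures, gain=0.5):
    """Steer toward the low-pressure wake behind a leading object.

    left_pressures / right_pressures: readings from pressure sensors
    along each flank of the robot's body (pascals).
    Returns a steering command: negative = turn left, positive = right.
    """
    left = sum(left_pressures) / len(left_pressures)
    right = sum(right_pressures) / len(right_pressures)
    # Stronger flow (higher dynamic pressure) on the left flank means
    # the wake lies to the right of the body, so steer right.
    return gain * (left - right)

# Example: stronger flow on the left suggests the wake is to the right.
print(wake_following_command([110.0, 112.0], [101.0, 99.0]))  # positive
```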

The robot is driven by a small electric motor and can be outfitted with a video camera for surveillance or with chemical sensors to detect pollution. The next step is to take FILOSE for a swim outside the lab to see how it does in the real world.

Meanwhile, Kanna Rajan, principal researcher for autonomy at the Monterey Bay Aquarium Research Institute, is designing underwater robots that are programmed to make their own decisions. They use sensors to determine where to hunt for oil spills, for instance, or to swim toward a place in the ocean that scientists want to study, such as a feeding ground for fish or a range of underwater volcanoes that might erupt.

Rajan, a former NASA researcher who helped develop the rovers that landed on Mars almost 10 years ago, said it is harder to build smart robots that work underwater than ones that function in outer space.

That’s because communications for the latter travel through the relatively quiet vacuum of space. Underwater communications, on the other hand, are often blocked by layers of warm and cold water, slow-moving underwater storms and the sounds of passing ships and wildlife. The ocean’s salinity also tends to degrade robots much more than the cold temperatures of space, Rajan said.

“The ocean is a lot more harsh,” he said. “If you go in deep space, there’s not much going on, you are very careful as to what you do. Even on Mars you can talk to your robot.”

So what are the prospects for a seeing, smelling, sensing robot that can work in the real world? That probably won’t materialize anytime soon, according to Jelle Atema, a professor of biology at Boston University and the Woods Hole Oceanographic Institution.

“When it comes to a broad, robust exposure in the natural environment and still being able to perform the task,” Atema said, “animals have it all over human engineering.”


