Robots and Ethics


The next great consumer technology will arrive in the form of personal robots, says Ron Arkin, a Regents professor in the College of Computing and director of the Mobile Robot Lab.

The innovations will be accompanied by a host of ethical concerns about human-robot interaction, adds Arkin, who co-teaches a course on robots and society with Charles Isbell, an assistant professor in the College of Computing.

The introduction of robots to the general public may be sluggish at first, but it is inevitable, says Arkin, reflecting a consensus among roboticists worldwide. Among the tasks frequently mentioned as suitable for personal or domestic robots are housecleaning, cooking, helping care for elderly or disabled people, tutoring and secretarial tasks.

As robots become more animated and sophisticated, Arkin says, they may even be designed as humanoid companions, teaching humans how to dance, for example.

This first generation of personal robots is rising from advances in microelectronics, machine vision, voice recognition, microelectromechanical systems (MEMS), artificial intelligence and numerous other computing technologies.

"Georgia Tech is starting to push together as a coherent group," Arkin says, "trying to make big things happen here by drawing faculty from mechanical engineering, the College of Computing, aerospace engineering, industrial and systems engineering, electrical and computer engineering, biomedical engineering and the Georgia Tech Research Institute."

In the United States, the overwhelming majority of financial support for robotics research and development comes from the Department of Defense.

That doesn't mean advances in personal or other nonmilitary robots are entirely neglected, says Arkin, whose Mobile Robot Lab is one of the funding recipients.

Much of the lab's work aims to combine reflexive behaviors with cognitive functioning to create autonomous, decision-making robots. The process is aided by techniques that help a robot "learn" from its interaction with the environment.

Human-robot interaction and military applications are among the issues addressed in Arkin's robots and society class.

"What are we doing in terms of military applications? Is this appropriate use? Should robots be able to employ lethal force?" Arkin asks rhetorically. "At some point, do we trust the machines more than we trust ourselves?

"My concern right now is not to formulate doctrine, but rather to formulate a consciousness among roboticists and robotic scientists that these questions need to be asked," he says. "Georgia Tech, through this course development, has provided me a wonderful forum to share those questions with my undergraduates."

Although these questions are meant to stir discussion rather than settle it, there is virtue in considering the ethics of a new technology in advance of its deployment, according to Arkin.

"Historically, technologists have been woefully ignorant of the implications of what they created," he says. "I would probably put myself in that category until a few years ago. Research and development will move forward, but we still need to understand what the consequences are, then come to grips with them and determine whether we should do anything about them."

©2005 Georgia Tech Alumni Association

 