Artificial Intelligence and Mobile Robots: Case Studies of Successful Robot Systems

Kybernetes

ISSN: 0368-492X

Article publication date: 1 March 1999

Citation

Andrew, A.M. (1999), "Artificial Intelligence and Mobile Robots: Case Studies of Successful Robot Systems", Kybernetes, Vol. 28 No. 2, pp. 243-246. https://doi.org/10.1108/k.1999.28.2.243.3

Publisher

Emerald Group Publishing Limited


Mobile robots are of particular interest from a number of points of view. They have a special appeal as the traditional popular image of a robot, and there are many purposes that can be served only by a mobile device. They provide a rigorous “proof of the pudding” testing‐ground for the relevant AI techniques. It is also an area where useful comparisons can be made between animal and machine performance. None of the robots described imitates the outward shape of a person, but they clearly have human‐like characteristics including, in some cases, two‐way speech communication.

The 13 chapters of this book, following a preface and a useful introduction, give descriptions of the same number of successful mobile robot systems, chosen to represent the best available from leading universities and research laboratories. Many of the robots have distinguished themselves, usually with first or second place finishes, at indoor and outdoor mobile robot competitions. As the editors point out in the introduction: “These are not robots that simply work in the laboratory under constrained conditions. They have left the lab and been tested in natural and unknown environments. They have all demonstrated their robustness many times over.”

In the introduction the state‐of‐the‐art is reviewed in more detail. The performance of mobile robots has been greatly enhanced in recent years by a paradigm shift prompted mainly by a 1986 paper by Rodney Brooks which introduced the “subsumption architecture”. Prior to the shift, mobile robots generally reviewed their sensory data and formed a plan of action. The perception of a previously‐unnoticed feature, such as a hole in the floor in the robot’s path, could be taken into account only by tediously repeating the whole process of analysis and planning.

The newer paradigm follows the example of biological systems, particularly vertebrate motor systems relying on spinal reflex arcs. Means of obstacle avoidance are embodied, and can produce a deviation in the robot’s path or actions when necessary, but without abandoning the overall plan. The higher levels subsume these lower (“spinal”) levels; hence the name of the architecture.
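As a rough illustration of this layering idea (not drawn from the book, and with invented behaviours and thresholds), the sketch below shows a low‐level avoidance reflex that can deflect the command produced by a higher‐level goal‐seeking behaviour, so an unexpected obstacle causes a local deviation rather than a full cycle of replanning:

```python
# Minimal behaviour-layering sketch (illustrative only; names and thresholds
# are hypothetical, not taken from the book or from the original paper).
# A low-level "avoid" reflex can deflect the command issued by a higher-level
# "seek goal" behaviour, so an unexpected obstacle causes a local deviation
# without abandoning the overall plan.

from dataclasses import dataclass

@dataclass
class Command:
    speed: float    # forward speed, m/s
    turn: float     # turn rate, rad/s (positive = left)

def seek_goal(heading_error: float) -> Command:
    """Higher layer: steer towards the goal heading."""
    return Command(speed=0.5, turn=0.8 * heading_error)

def avoid_obstacle(range_ahead: float, plan: Command) -> Command:
    """Lower ("spinal") layer: deviate when something blocks the path,
    otherwise pass the higher-level command through unchanged."""
    if range_ahead < 0.5:                      # obstacle closer than 0.5 m
        return Command(speed=0.1, turn=1.5)    # slow down and swerve
    return plan

def control_step(heading_error: float, range_ahead: float) -> Command:
    plan = seek_goal(heading_error)
    return avoid_obstacle(range_ahead, plan)

print(control_step(heading_error=0.2, range_ahead=2.0))  # follows the plan
print(control_step(heading_error=0.2, range_ahead=0.3))  # reflex deviation
```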

In each of the 13 chapters a separate complete system is described, so readers have the chance to compare whatever feature of the systems they choose. The principles of operation are described in a fair amount of detail, including sections of pseudo‐code. Nevertheless the papers are grouped under three headings, according to the aspect to which they contribute most strongly. The three parts of the book are concerned with Mapping and Navigation (four papers), Vision for Mobile Robots (three papers), and Mobile Robot Architectures (six papers). Each part has an introductory section by one or two of the editors.

Of the four papers in part one, three are explicitly aimed at robots to operate in an office environment, and the other describes a similar system, but without explicit reference to this application. Such robots can act as messengers, or collect rubbish, or can be tour guides. Navigation can depend on maps of different kinds, one being a metric map, usually stored as a grid, and another being a topological map in the form of a graph in which nodes correspond to objects (obstacles or waymarkers) linked by arcs denoting relationships. The two forms of mapping may be combined and one of the schemes incorporates a means of deriving topological maps from the grid kind.
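The two kinds of map can be sketched as follows (the contents are invented for illustration and are not taken from any of the papers): a metric map is a grid of occupied and free cells, a topological map is a graph whose nodes are places and whose arcs record traversable relationships, and a very naive derivation of the latter from the former treats every free cell as a node and grid adjacency as an arc.

```python
# Illustrative sketch of the two map kinds (contents invented, not from the book).

metric_map = [            # 1 = obstacle, 0 = free space
    [0, 0, 1, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
]

topological_map = {       # adjacency list: place -> reachable neighbouring places
    "office_212": ["corridor_A"],
    "corridor_A": ["office_212", "corridor_B"],
    "corridor_B": ["corridor_A", "mail_room"],
    "mail_room":  ["corridor_B"],
}

def free_neighbours(grid, r, c):
    """Free cells 4-connected to (r, c) in the metric grid."""
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
            yield (nr, nc)

# Naive grid-to-topology derivation: every free cell becomes a node, grid
# adjacency becomes an arc. Real schemes segment the grid into larger regions.
derived_graph = {
    (r, c): list(free_neighbours(metric_map, r, c))
    for r in range(len(metric_map))
    for c in range(len(metric_map[0]))
    if metric_map[r][c] == 0
}
print(derived_graph[(0, 0)])   # -> [(1, 0), (0, 1)]
```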

The robot described in the fourth chapter is said to roam the corridors of Carnegie Mellon University, and uses a navigation architecture based on “partially observable Markov decision process models”, or POMDPs. These have found applications in other areas in which decisions must be made on uncertain data, and standard algorithms are available.
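The belief‐state update at the heart of such POMDP navigation can be sketched as below (state names and probabilities are invented purely for illustration): the robot maintains a probability distribution over where it might be, and revises it after each action and observation rather than committing to a single assumed position.

```python
# Sketch of Bayesian belief updating, the core of POMDP-style navigation
# (illustrative only; the transition and observation models are invented).

states = ["corridor", "junction", "office"]

# P(next_state | state) for the action "move_forward"
transition = {
    "corridor": {"corridor": 0.7, "junction": 0.3, "office": 0.0},
    "junction": {"corridor": 0.1, "junction": 0.2, "office": 0.7},
    "office":   {"corridor": 0.0, "junction": 0.1, "office": 0.9},
}

# P(observation = "see_door" | state)
observation_model = {"corridor": 0.1, "junction": 0.4, "office": 0.9}

def update_belief(belief, obs_likelihood, trans):
    """Predict through the action model, then weight by the observation
    likelihood and renormalise."""
    predicted = {
        s2: sum(belief[s1] * trans[s1][s2] for s1 in belief)
        for s2 in belief
    }
    weighted = {s: predicted[s] * obs_likelihood[s] for s in predicted}
    total = sum(weighted.values())
    return {s: w / total for s, w in weighted.items()}

belief = {"corridor": 1.0, "junction": 0.0, "office": 0.0}
belief = update_belief(belief, observation_model, transition)
print(belief)   # probability mass shifts towards "junction" after seeing a door
```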

In the introduction to part two, on Vision for Mobile Robots, it is mentioned that for a long time designers of mobile robots were reluctant to include vision, with its large overhead of processing. Where the main requirement is the detection of obstacles to allow navigation among them, simple ultrasonic or infrared rangefinding devices are more convenient. Visual input is needed where scenes and objects have to be examined in detail, and is increasingly coming into use.

Of the three papers in part two, one describes an early visually‐guided robot, unfortunately no longer operating, which offered to guide visitors around a floor of the AI Laboratory of MIT. The second is an exposition of a principle of “action‐oriented perception”, according to which any perceptual process should be planned with regard to a perceptual context, which includes analysis of the information requirements of the overall task. The principle is illustrated by its application to an unmanned land vehicle. The remaining paper in this part is also concerned with automatic vehicle guidance, with an immediate practical application, namely the safe steering of a truck onto the hard shoulder of a motorway, should the driver fall asleep.

In the papers in part three the focus is on architecture. In the first two, the treatment is in the context of highly versatile mobile robots covering, in the second paper, the line of development that includes the well‐known “Shakey” and “Flakey”, both products of the Stanford Research Institute. “Flakey” is able to respond to speech commands of simple syntax, and provision is made for a human‐like immediate response such as a nod or an appropriate spoken reply, or, if the utterance has not been recognised, something like “what?”

The applications considered in this part range widely. There is another paper on vehicle guidance, one on robots able to collaborate as a team, and another on robots to operate alongside humans. Another describes an autonomous underwater vehicle, and “virtual reality” simulators to allow it to be tested in the laboratory under conditions corresponding, as far as sensory input is concerned, to a great depth in the ocean.

A paper from a group at Oxford University describes advanced methods for the guidance of mobile robots used to convey goods in manufacturing industry. Robot trucks for this purpose are widely used, guided by cables buried in the floor or by painted lines. The use of rigidly defined routes has disadvantages, since there will certainly be occasions on which some activity requires blocking of the route. In the paper an alternative scheme for guiding robot trucks is described in considerable detail. The autonomous vehicles scan with laser beams to locate reflecting beacons, and can thereby navigate to their destination, finding an alternative route when the previously‐known one is found to be blocked, and remembering the new route for future trips.
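The replanning‐and‐remembering behaviour can be illustrated roughly as follows (this is not the Oxford scheme itself; waypoint names are invented): the vehicle keeps a graph of traversable links between beacon‐marked waypoints, drops any link found to be blocked, searches the graph for an alternative route, and retains the amended graph so the detour is reused on later trips.

```python
# Illustrative sketch of route replanning over a waypoint graph
# (names invented; not taken from the paper).

from collections import deque

route_graph = {                     # waypoint -> reachable neighbouring waypoints
    "dock":    ["aisle_1", "aisle_2"],
    "aisle_1": ["dock", "cell_A"],
    "aisle_2": ["dock", "cell_A"],
    "cell_A":  ["aisle_1", "aisle_2"],
}

def shortest_route(graph, start, goal):
    """Breadth-first search for the fewest-hops route, or None if unreachable."""
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None

def report_blocked(graph, a, b):
    """Remove a link found blocked, so future planning avoids it."""
    graph[a].remove(b)
    graph[b].remove(a)

print(shortest_route(route_graph, "dock", "cell_A"))   # ['dock', 'aisle_1', 'cell_A']
report_blocked(route_graph, "aisle_1", "cell_A")       # obstruction discovered
print(shortest_route(route_graph, "dock", "cell_A"))   # ['dock', 'aisle_2', 'cell_A']
```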

There is a wealth of useful material in this collection, with all of it, as emphasised in the Preface, referring to systems that have been well tested outside the laboratory and shown to be robust in operation.
