What is good service? We can certainly name examples of it: a mechanic who fixes cars on time, a restaurant that accepts special food orders from its customers. But according to Wayne Iba, associate professor of computer science at Westmont College, good service is not as easily defined as we think. “We know it when we see it,” says Iba, “but we can’t always define it.” On October 19, Iba gave a talk at Westmont exploring service rendered by artificial intelligence in simulated environments.
“[Service] is more than just competency,” said Iba, citing as an example of programmed service the notorious Microsoft paper clip that used to “help” Word users with their spelling and grammar. The paper clip, which would appear onscreen whenever the user made a spelling or grammar error, was known for making undesired corrections and for re-enabling functions that the user had disabled. According to Iba, the quality of service can be measured by competency, attention, anticipation, persistence, deference, and integrity.
Iba created virtual environments to examine how artificial intelligence learns to provide service. He gave a demonstration using a simulator called MÆDEN: a green triangle was assisted by a blue triangle (both called intelligent agents) in overcoming obstacles in a two-dimensional grid to acquire “food,” with varying degrees of success. Service developed in two ways: through responsiveness to the agent’s needs, and through anticipation, the ability to predict and fulfill the agent’s needs in an efficient or inventive manner.
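The distinction between responsive and anticipatory service can be sketched in code. The toy simulation below is not Iba’s MÆDEN simulator; it is a hypothetical one-dimensional corridor in which a “needy” agent walks toward food past an obstacle, and a helper agent serves it either reactively (clearing the obstacle only after the agent bumps into it) or anticipatorily (clearing it just before the agent arrives). All names and parameters here are illustrative assumptions.

```python
def run_episode(helper_style, food=9, obstacle=5, max_steps=30):
    """Return how many steps the needy agent takes to reach the food.

    helper_style: "react"      -- helper clears the obstacle after a collision
                  "anticipate" -- helper clears it one cell before the agent
                  anything else -- no help; the agent stays stuck
    """
    pos = 0
    blocked = True  # the obstacle starts in place
    steps = 0
    while pos != food and steps < max_steps:
        steps += 1
        # An anticipatory helper predicts the need and clears the path early.
        if helper_style == "anticipate" and blocked and pos == obstacle - 1:
            blocked = False
        next_pos = pos + 1
        if next_pos == obstacle and blocked:
            # The agent is stuck for this step; a reactive helper only now
            # notices the need and clears the obstacle.
            if helper_style == "react":
                blocked = False
            continue
        pos = next_pos
    return steps

print(run_episode("react"))       # reactive service: one step is wasted
print(run_episode("anticipate"))  # anticipatory service: no wasted step
```

In this sketch the reactive helper costs the agent one lost step at the obstacle, while the anticipatory helper produces a collision-free run, illustrating why anticipation counts as a distinct, higher dimension of service than mere responsiveness.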
By studying how service develops in these environments, computer scientists can learn more about what service means. That matters, according to Iba, because “computers continue to revolutionize life, so more computer scientists are concerned with service.”
Iba also touched upon the shortage of computer scientists. The demand for computer science positions has been increasing steadily, yet there has not been a proportional rise in computer science majors. “We need to extend the three Rs,” said Iba, referring to a popular shorthand for basic educational priorities: reading, writing, and ‘rithmetic. “I think ‘Rogramming [programming] should be added.”
Also speaking at the talk were Kim Kihlstrom, another associate professor of computer science, and Chris Hoeckley (via videoconferencing), adjunct instructor of philosophy. Kihlstrom talked about the need for diversity in the computer science field, given that computer science “has changed how other disciplines are done. It’s gone beyond data storage.” He noted that men and women are equals when it comes to using the Web, but not in development. Hoeckley talked about how sometimes “anticipation is unwelcome,” and about the importance of considering what kind of help is wanted from computer-driven assistants. By relying on “inventive” computer assistants, we “cede control to the machine’s designer,” Hoeckley said.
Iba concluded by raising a provocative question: Will the artificial intelligence of a program exhibit the same behaviors and values as the person or persons who created it? Or will it develop different behaviors and values as it learns?