Should insurance companies be allowed to use your purchasing history to set prices on your policies? Should self-driving cars be allowed on public roads? What are the implications of robotic police?
These are some of the questions Frank Wattenberg explores during his webinar, Robotics and Artificial Intelligence – Shaping a Future Shared with Robots. Frank is a professor in the Department of Mathematical Sciences at the United States Military Academy (USMA) and a Co-Principal Investigator of NCSCE’s Engaging Mathematics initiative.
USMA uses robotics and artificial intelligence as interdisciplinary topics that cut across the curriculum, helping to unify the academic experience. The topics provide a rich context for discussing a wide range of interrelated areas. For example, an assignment on self-driving cars raises questions in economics (What would happen to cab drivers?), ethics and law (Who is liable for accidents?), and logistics (Should cars driven by humans and self-driving cars be allowed on the same roads?).
Students need what Frank calls “intellectual integrity” to successfully program a robotic vehicle to pull into a garage and park itself without crashing into the garage wall. The kits he uses with his cadets cost about $300 and give students hands-on experience with building and coding. It isn’t enough to get a problem 80% right: if the final product is going to be used in the real world, it needs to work, and perfecting the design demands rigor, tenacity, and attention to detail.
As part of his webinar, Frank guided Yuxi Chen, who helped with filming and webinar production, as she built her own self-driving, self-parking robotic car. She used her hand to stand in for the garage wall. An ultrasonic range finder mounted at the front of her car sensed the presence of her hand with sound waves, telling the car to stop before a collision.
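The stop-before-collision behavior can be sketched in a few lines. This is a minimal illustration, not code from Frank's actual kit: the sensor readings, the speed-of-sound constant, and the 10 cm stopping threshold are all assumptions. An ultrasonic range finder reports the round-trip time of a sound pulse, from which the car computes distance and decides whether to brake.

```python
# Illustrative sketch of ultrasonic stop logic (not from the webinar kit).
# An ultrasonic sensor emits a ping and measures the echo's round-trip time;
# distance is half the round trip times the speed of sound.

SPEED_OF_SOUND_CM_PER_US = 0.0343  # speed of sound in air, cm per microsecond

def echo_to_distance_cm(echo_time_us: float) -> float:
    """Convert a round-trip echo time to a one-way distance in cm."""
    return echo_time_us * SPEED_OF_SOUND_CM_PER_US / 2

def should_stop(echo_time_us: float, stop_distance_cm: float = 10.0) -> bool:
    """Stop when the wall (or a hand standing in for it) is too close."""
    return echo_to_distance_cm(echo_time_us) <= stop_distance_cm
```

With these assumed numbers, a 500 µs echo corresponds to roughly 8.6 cm, inside the stopping threshold, while a 2000 µs echo (about 34 cm) lets the car keep rolling.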
Designing self-driving cars is a good starting point for cadets because their real-world use is clear, and because the assignment covers core content, such as linear functions. Another assignment Frank does with his students relates to sentries, or soldiers who stand guard, controlling access to a place. He describes the sentry job as both boring and dangerous, a bad combination for a human but a perfect task for a robot.
The two big worries with robotic sentries are false positives (taking unnecessary defensive action) and false negatives (failing to take defensive action when there is a real threat). In his webinar, Frank discusses this in the context of a parking garage whose access is managed by a robotic arm. His students graph different scenarios, showing possible behavior patterns of approaching cars that are authorized or unauthorized to access the garage. Students also discuss the varying cost levels of defensive action. Blowing up an unauthorized car before it drives through the gate would be an extreme measure with high cost, whereas raising tire shredders or sounding alarms and flashing lights would be lower cost. This concept of cost, Frank notes, is analogous to other situations, such as medicine, where different health interventions carry different costs.
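The trade-off the students graph can also be framed as an expected-cost calculation: each defensive action has its own cost, and the sensible choice depends on how likely the approaching car is to be a real threat. The sketch below is one illustrative way to model this; the action list, costs, effectiveness figures, and threat-damage value are all invented for the example, not taken from Frank's assignment.

```python
# Illustrative expected-cost model for choosing a defensive action.
# All numbers are made up for the example.

ACTIONS = {
    #                 cost of taking    probability the action
    #                 the action        stops a real threat
    "do_nothing":    {"action_cost": 0.0,    "stops_threat": 0.0},
    "alarm_lights":  {"action_cost": 1.0,    "stops_threat": 0.6},
    "tire_shredder": {"action_cost": 10.0,   "stops_threat": 0.95},
    "destroy_car":   {"action_cost": 1000.0, "stops_threat": 1.0},
}

THREAT_DAMAGE = 500.0  # assumed cost if a real threat gets through

def expected_cost(action: str, p_threat: float) -> float:
    """Action cost plus expected damage from a threat that isn't stopped."""
    a = ACTIONS[action]
    p_gets_through = p_threat * (1 - a["stops_threat"])
    return a["action_cost"] + p_gets_through * THREAT_DAMAGE

def best_action(p_threat: float) -> str:
    """Pick the action with the lowest expected cost."""
    return min(ACTIONS, key=lambda a: expected_cost(a, p_threat))
```

Under these assumed numbers, a car that is almost certainly harmless warrants no response, a likely threat warrants the tire shredder, and the extreme option never pays off because its cost exceeds the damage it prevents. The same structure applies to the medical analogy: interventions with different costs and different effectiveness, chosen against the probability of the underlying condition.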
For more on how Frank uses robotics and artificial intelligence with his USMA cadets, view his webinar.
Frank and his colleague Matthew Mogensen, an instructor of mathematics at USMA, also explored these topics with participants at the 2015 SENCER Summer Institute through a hands-on robotics workshop and a panel discussion on the civic implications of robotics and artificial intelligence. Frank (firstname.lastname@example.org) and Matt (email@example.com) invite you to email them with questions about using robotics and artificial intelligence in the classroom, or to continue the discussion.
Photo credit: Jenn and Tony Bot (CC BY-NC 2.0)