Aimee van Wynsberghe is proof that the detour is sometimes more compelling than the intended destination.
Since her studies at Western veered from cell biology to the almost-life of artificial intelligence nearly 15 years ago, van Wynsberghe, BSc’06, has emerged as one of the world’s leading experts on robot ethics.
In January, she began a five-year appointment at the University of Bonn as an Alexander von Humboldt Professor. One of Germany’s most prestigious academic and research professorships, the appointment brings with it €3.5-million (about $5.5-million Canadian) to start a lab focusing on the applied ethics of artificial intelligence and, in particular, sustainable AI.
Technology, van Wynsberghe said, is anything but ethically neutral – and since robots have no intrinsic moral intelligence, it’s up to humans to bridge any gap between what is possible and what is responsible.
“Instead of ethics being an afterthought, like, ‘Oh, we needed this robot, and now we’re using it all across the world and maybe we should think about some ethical issues,’ it’s important to have someone like me asking about the ethical issues as we’re testing it – to be this intermediary, the step between innovation and policy.”
When ethics are an afterthought of design, things can go badly wrong. Consider, for example, a recruitment tool Amazon developed to sift through candidates’ résumés for jobs and promotions. The algorithm taught itself to favour key words drawn from the résumés of employees hired or promoted over the previous decade – a pool dominated by men – and so it filtered out female applicants for technical and supervisory jobs.
“A cultural bias that was a part of their daily practices became ingrained in the algorithm, and then was perpetuated the more you continued to use the algorithm,” van Wynsberghe said. (The company scrapped the machine-learning recruiter in 2018 and said it had never relied solely on its results.)
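A minimal sketch of how that kind of bias gets baked in – not Amazon’s actual system; every résumé and hiring label below is invented for illustration:

```python
# Toy illustration only: a classifier trained on (invented) historical hiring
# decisions. Because the history is skewed, the model learns to penalise
# words that merely signal gender, and keeps doing so every time it is used.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "men's rugby captain, java developer",           # hired
    "hackathon winner, java developer",              # hired
    "women's chess club president, java developer",  # not hired
    "women's coding society lead, java developer",   # not hired
]
hired = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# A new, equally qualified résumé that happens to contain the token "women"
# now scores lower: the past bias is reproduced, not corrected.
candidate = vectorizer.transform(["women's robotics club lead, java developer"])
print(model.predict_proba(candidate)[0, 1])  # probability of "hire"
```

The point is not the toy model itself but the pattern van Wynsberghe describes: whatever bias sits in the historical data becomes part of the tool’s decisions.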
Now, regulators are trying to create frameworks and tools that can pick up on possible sources of discrimination, along with ethical technology assessments, she said. Companies will have to show that the data sets they build on comply with those assessments, and log how each system was trained, tested and validated. That will both improve the technology and guide regulators as they draw do-not-cross lines, van Wynsberghe said. “For example, it may be, ‘no, you cannot use emotion recognition or facial recognition because the technology isn’t good enough.’”
CSTAR a major career influence
One of eight von Humboldt professors chosen in this latest round, van Wynsberghe earned two master’s degrees and her PhD in Europe and was a professor of AI at Delft University of Technology in the Netherlands.
But her interest in the field began while she was in her third year of undergraduate studies in cell biology at Western. That’s when she landed work as a research assistant at CSTAR (Canadian Surgical Technologies & Advanced Robotics), a collaborative centre of London Health Sciences Centre, Lawson Health Research Institute and Western to research and develop computer-assisted robotic surgery and simulation.
As part of a team testing the capabilities of a tele-surgery robot, she began to wonder if there was a parallel process for evaluating non-technical aspects of remote surgery: how doctors and nurses interacted with patients they had never met, for example.
This is where Dr. Chris Schlachta, CSTAR medical director, played a powerful part in van Wynsberghe’s life: “He said, ‘those are really cool questions but I don’t know how to answer them for you. I think it would be useful for you to go to study ethics.’ ”
Having support from the medical and technical side right at the start was important, she said. “It was a safe environment for me to be able to ask those questions and to feel empowered to go and study the things that I’ve studied.”
AI’s carbon footprint
Key principles that must be embedded into machine-learning development culture are security, safety, privacy, fairness, sustainability, accountability and transparency, said van Wynsberghe. The intent is not to stifle innovation or quash creativity, but to establish common ground on some of AI’s thorniest questions, including responsible use of resources.
Even a Google Maps search or a request for Alexa to play our favourite music carries a carbon cost that we rarely consider, van Wynsberghe said. “Some of the preliminary research is showing that just to train an AI system is the same carbon footprint of five North American cars over their lifetime. That’s a lot. Before we have passed the point of no return, I want to look at the different methodologies for developing algorithms and see if there are ones that are more environmentally friendly.”
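Estimates like the one she cites are typically built from simple arithmetic: the power drawn by the hardware, multiplied by training time and by the carbon intensity of the electricity grid. A back-of-envelope sketch, in which every number is an assumed placeholder rather than a figure from that research:

```python
# Rough estimate of the CO2 emitted by one training run.
# All inputs are illustrative assumptions, not measured values.
num_accelerators = 512           # GPUs/TPUs used for the run
power_per_accelerator_kw = 0.3   # average draw per device, in kilowatts
training_hours = 24 * 14         # two weeks of wall-clock training
pue = 1.5                        # data-centre overhead (power usage effectiveness)
grid_kg_co2_per_kwh = 0.4        # carbon intensity of the local grid

energy_kwh = num_accelerators * power_per_accelerator_kw * training_hours * pue
emissions_tonnes = energy_kwh * grid_kg_co2_per_kwh / 1000

print(f"{energy_kwh:,.0f} kWh of electricity, roughly {emissions_tonnes:.0f} tonnes of CO2")
```

Changing any one of those inputs – greener grids, more efficient hardware, or the leaner training methods she wants to study – changes the footprint.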
Her goal is to provide policymakers with data so they can decide, “okay, this algorithm has such a high carbon footprint that it can only be used by companies like DeepMind, who are going to use it for understanding how proteins fold, but you can’t use it to create a model that’s going to write poetry, for example.”
At Bonn, van Wynsberghe will also be director of the university’s Institute of Science and Ethics, where her doctoral and postdoctoral students and lab will be located.
Moving mountains
She is the first woman to receive a von Humboldt professorship, and one of the youngest women in Germany to have attained full professorship.
She is also co-director and co-founder of the Foundation for Responsible Robotics, an advisor to the European Commission on questions of artificial intelligence and a member of the World Economic Forum’s Global Futures Council on Artificial Intelligence and Humanity.
She made the move to Bonn with her husband, a postdoctoral researcher; their young son and daughter; and their dog. It’s difficult to be so far from her local cheerleaders, mother Catherine Woodburn (Dip’Ed’73) and brother Erinn, she said, but it’s been worth every sideroad she travelled to get here.
“I’m especially happy my daughter gets to see me do this, that we’re moving primarily for my career. I think that’s pretty rare and I think it’s pretty special. And I hope that if she’s ever in this position, the expectation will be, ‘of course, it’s an incredible opportunity, we will move mountains to make it happen.’ ”