Based in Brussels, Eline focuses on European technology policy issues and on how policymakers can promote digital innovation in the EU. Prior to joining the Center for Data Innovation, Eline worked for several years as a policy analyst at HCSS, a leading Dutch think tank, where her work included research projects on defence, security and economic policy issues. More recently, Eline worked at DigitalEurope, one of Brussels’ largest trade associations, where she managed its relations with representatives of the digital tech industry.
What challenge are you and your organisation working on?
We are critical optimists. We focus on how policies can enable the success of technologies, like AI, that benefit societal prosperity and economic competitiveness. We also raise awareness of how certain policies, despite their laudable goals, fixate on the harmful effects of technologies and end up hindering their development.
How does AI affect women?
More optimistically: Why should women care about AI?
Women and men alike interact with AI systems daily, from Netflix recommendations to Gmail. All of us should become more aware of their impact on our lives. Many sectors, like healthcare, are using applications that can accelerate medical diagnoses or help discover lifesaving drugs. The reason there is a particular discussion about women and AI is so-called algorithmic bias.
Human decision-making is imperfect. We’re influenced by the weather, our mood, hunger, and existing stereotypes. AI systems can be fairer because they’re more consistent and because we can now consciously decide which information we want them to use or not.
But AI is not a silver bullet. Often what leads to algorithmic bias isn't the mathematical model itself, but the data we use to train the AI system. This is where we run into issues like outdated data, or bias-riddled data generated by male techies.
Why is women’s leadership important in AI?
Women must be involved in this field to ensure that emerging AI technologies are accurate and reliable, and that we do not perpetuate existing biases when building these systems. Technology won't meet women's needs, or those of a diverse population in general, if only a small section of society with a specific worldview shapes it. Engaging women in AI is good news both for women and for everyone else!
More women in this field is also good for the economy. Our work identifies what can promote and accelerate AI adoption: prioritising inclusion and diversity, from policymakers and institutions to industry and individuals, and, for businesses, making these efforts part of a long-term strategy. Systems that perform worse for some groups of people mean missed business opportunities: a company that caters to only one section of a population limits the growth of its user base.
How are women leading on this issue?
Participation in organisations such as Women in Machine Learning is growing, and many similar platforms have mushroomed, such as Women in Tech or Women in AI. Having benefited from those networks myself, I can see how effective such support systems truly are. Furthermore, AI is fuelling sectors like education and healthcare, where many women work, which is a great opportunity for positive change.
But we're not there yet. A recent study by the innovation foundation Nesta showed that in the UK, the amount of research published by female AI professionals has dropped over the last decade, and that gender diversity in AI is little better than it was in the 1990s. The World Economic Forum's Global Gender Gap Report (2018) found that only 22% of AI professionals on LinkedIn globally are women, with no evidence of improvement in recent years.
The statistics are depressing, but societal progress typically takes time to become a robust reality.
How can more women lead?
First, aside from getting involved in supportive networks, women can draw on the many initiatives that create international connections between women as a basis for future collaboration.
Second, fixing this will require attracting, recruiting, and retaining more women in this field. Government, academia, and industry should do more to include women by inspiring them to take up relevant courses and by creating welcoming environments: providing child care, raising the visibility of their female experts at corporate events, and encouraging women's entrepreneurship. Schools can invite professionals or role models who work in tech to demystify it and explain what it has to offer. Tech is often seen as a boring, men-in-suits or men-in-sneakers-behind-screens kind of world. Let's show young people, and especially women, that it is anything but boring.
Third, companies need the right tools and concepts to ensure and measure progress. That's where standards come in. Many companies have led the charge on this, but you also need a culture where everyone owns the inclusive effort.
How do you think this issue will benefit from collaboration between companies and organisations, like the approach the Women’s Forum is taking with its partners?
It will consolidate awareness-raising efforts, create a sense of urgency, raise the stakes for decision-makers, increase interaction between different types of leaders, and perhaps generate more investment in these issues, both in substance (time, discussions) and financially. People who can effect change at higher levels, in the public and private sectors, in NGOs and companies, may need to meet more often.
This series of stories highlights women leaders and entrepreneurs who are driving positive impact on our most pressing global issues and demonstrating women’s unique contribution to inclusive solutions. It draws on the community of the Daring Circles – workgroups committed to positive impact in areas where women are most affected and where women are demonstrating outsize leadership. Share your stories with the hashtag #Women4AI, and submit your own stories to the Women’s Forum editorial team by emailing Sophie Lambin (firstname.lastname@example.org).