Statement from Hon. Gonul Eken, Chairperson of FICAC Women in Diplomacy Committee, with the assistance of Prof. Elif Vatanoglu, Advisor to the President of FICAC

April 20, 2020 | News

Dear Colleagues,

In these days of the COVID-19 pandemic, we find ourselves relying more and more on Artificial Intelligence (AI). This is an inevitable fact. We dream of a new age in which AI is free of gender bias.

We have started work on a new project together with the Advisor to the President of FICAC.

Prof. Elif Vatanoglu, a medical doctor, has prepared a summary of the many articles on AI for us.

Here is her point of view:

“New technologies give us a chance to start afresh – starting with AI – but I know that it is up to people, not machines, to remove bias.

It seems that without training human problem solvers to diversify AI, algorithms will always reflect our own biases.

AI appears neutral, but it is made by humans, which means it internalizes the same biases we have – including gender bias.

AI is a mirror of ourselves.

In my research and reading on the subject, I find it no surprise that AI is learning gender bias from humans. I have come across many articles that cite word embeddings as a source of bias in AI. Like a game of word association, these systems often associate ‘man’ with ‘doctor’ and ‘woman’ with ‘nurse.’ These associations do not reflect modern society, or at least how we want modern society to progress. They are outdated views.

Another example: natural language processing (NLP), a critical ingredient of conversational AI systems such as Amazon’s Alexa and Apple’s Siri, has been found to show gender bias – and this is not an isolated incident. There have been several high-profile cases of gender bias, including computer vision systems for gender recognition that reported higher error rates for women, specifically those with darker skin tones. To produce cleaner and more trustworthy technology, there must be a concerted effort from researchers and machine learning teams across the industry to correct this imbalance.

Gender bias could also cause problems for AI-based facial recognition software, such as security applications at concerts, airports, and sports arenas – a concern that also extends beyond the gender binary.

If AI sees gender as merely male and female, this does not align with modern perspectives on non-binary and transgender expression, causing potential harm to these communities.”

We want to open this issue – Artificial Intelligence and Gender Bias – to your views and opinions.

Let us observe the role of AI in our lives over the next three months and share what we find.

If you share your observations and thoughts with us, it will be a significant action of the Women in Diplomacy Committee.

At the end of these three months, we plan to write a declaration on the topic, informed by your opinions as well.

Fortunately, we are starting to see new work that looks at exactly how this can be accomplished.

So hopefully, women, together with men, will play a critical role in shaping the future of a bias-free AI world.

Yours sincerely,

Gonul Eken

Chairperson of the Women in Diplomacy Committee, FICAC