Does AI support homogeneous algorithms while supporting sustainability?

Updated: May 14

By Kubra Onat

Development and technology are two elements that cannot be considered separately. But has this wonderful technology, which supports development, become sustainable in every field? Not yet. By 2030, however, the demand for human labor is expected to decrease as AI replaces manpower in ever more fields, from healthcare to agriculture.

Artificial intelligence technologies help states achieve sustainable development goals, especially in industry, education, energy and health. While the use of artificial intelligence in industry increased by 40 percent in European countries over the last five years, it increased by 267 percent in China.


Artificial intelligence is reducing costs in energy production, which is exactly what we need most in terms of sustainability. It eases access to health and education systems. These are just a few examples of what AI can do. However, a technology sustainable across such a wide area unfortunately cannot yet overcome the problem of discrimination.



Computer operators with ENIAC, the world's first programmable general-purpose computer. Credit: Corbis/Getty Images1

Although the first computer algorithm, written in the 19th century, was the work of a woman, even today's artificial intelligence can still discriminate against women. Algorithms can even undermine progress against discrimination that has taken years to achieve.


AI becomes functional thanks to collected data, but considering that this data may carry a biased or discriminatory side, it is inevitable that the technology will discriminate between genders. By the 1980s, dominance of the programming industry had passed to men.

Even today, 59 percent of European scientists and engineers are male, according to data from Eurostat, the European statistical office. The sexist and discriminatory stance of algorithms can be addressed by including all segments of society from the beginning of software development through the testing stage.

For instance, women's menstrual cycles were not included when Apple HealthKit was created, because only men worked on the software. Even now, I cannot say it is very good at menstrual tracking.


Joy Lisi Rankin of the AI Now Institute in New York says artificial intelligence and algorithmic technologies are shaping every aspect of our daily lives. "We are not aware of this effect because the technology is not visible and how it works is not very clear," says Rankin, who studies gender, race, and the power of AI. "Algorithm systems also determine who has access to important resources."



Ridding AI and machine learning of bias involves taking their many uses into account. Image: British Medical Journal2

Amazon's attempt to automate its hiring process has become the most notorious example of AI-related gender discrimination. In 2018, the US-based giant abandoned the recruitment system it had used for four years because it produced 'sexist' results.

In Amazon's system, ten years' worth of CVs were used as the data for building the model, and because men dominated the industry, the artificial intelligence judged male candidates preferable, leading to sexist decisions.

According to Levy, resume selection is one of the most problematic areas: "Even if algorithms are instructed not to consider gender, it somehow comes up in other ways".

It turned out that Amazon's flawed algorithm eliminated resumes containing words associated with women.
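To make concrete how a screener can learn this kind of bias from skewed history alone, here is a minimal, hypothetical sketch. The CVs, words, and log-odds scoring below are invented for illustration and have nothing to do with Amazon's actual system; the point is that a model trained on male-dominated hiring outcomes assigns a negative weight to a word like "women's" without gender ever being an explicit feature.

```python
import math
from collections import Counter

# Historical CVs (as bags of words) with past hiring outcomes (1 = hired).
# Because the industry was male-dominated, the few CVs containing a
# female-associated word were historically rejected.
history = [
    (["engineer", "football", "captain"], 1),
    (["engineer", "rowing", "club"], 1),
    (["engineer", "womens", "chess", "captain"], 0),
    (["engineer", "womens", "volleyball"], 0),
]

hired, rejected = Counter(), Counter()
for words, label in history:
    (hired if label else rejected).update(words)

def weight(word):
    # Smoothed log-odds: positive if the word shows up more among hired CVs.
    return math.log((hired[word] + 1) / (rejected[word] + 1))

def score(cv):
    return sum(weight(w) for w in cv)

cv_plain = ["engineer", "chess", "captain"]
cv_womens = ["engineer", "womens", "chess", "captain"]  # identical plus one word

print(score(cv_plain) > score(cv_womens))  # True: the extra word lowers the score
```

Nothing in this toy model "knows" about gender; the bias arrives entirely through the training labels, which is exactly why removing the gender field from a resume does not solve the problem.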


Discrimination in AI is not limited to gender. Such algorithms can penalize all kinds of diversity and make choices in favor of the "white male" group that society presents as more privileged. Facial recognition systems are also among the problematic areas.

Research shows that if you are a dark-skinned woman, these systems perform worse for you than for others.


It is possible to see the effects of the sexist algorithm on women in social media and search engines.

For instance, even personalized advertisements are often the product of gender and racial discrimination. Furthermore, the fact that disadvantaged communities are represented by less data in artificial intelligence, especially in the field of health, raises discrimination to a higher level.

The data of communities that lack the opportunity to access areas where AI is used is also missing, and they are thus excluded entirely.

At the next stage, the algorithm does not detect them and may produce wrong judgments, such as misdiagnosis in healthcare. If we want AI to be sustainable in every field, we must strive to write inclusive algorithms.
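A small sketch of that failure mode: suppose a diagnostic cutoff is fitted only to data from a well-represented group. All biomarker values and group names below are invented for illustration; group B simply never appears in the training data, so the threshold that works for group A flags group B's healthy patients as sick.

```python
# Hypothetical sketch: a diagnostic cutoff fitted only to an overrepresented
# group misdiagnoses a group that was absent from the training data.
# All biomarker values and labels below are invented for illustration.

group_a = [(1.0, 0)] * 10 + [(3.0, 1)] * 10   # (biomarker, is_sick), group A only
training = group_a                             # group B contributes no data at all

def error_rate(threshold, data):
    # Fraction of patients whose "value >= threshold" call disagrees with truth.
    return sum((value >= threshold) != sick for value, sick in data) / len(data)

# Choose the cutoff that minimizes error on the (group-A-only) training set.
candidates = [value + 0.1 for value, _ in training]
best = min(candidates, key=lambda t: error_rate(t, training))

# Group B has a higher healthy baseline, so the learned cutoff flags its
# healthy patients as sick.
holdout_b = [(2.4, 0), (2.5, 0), (4.0, 1)]
print(round(error_rate(best, group_a), 2))    # 0.0  (works well for group A)
print(round(error_rate(best, holdout_b), 2))  # 0.67 (most of group B misdiagnosed)
```

The model is not malicious; it is simply blind to a population it never saw, which is the mechanism behind the misdiagnosis risk described above.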



Written by Kubra Onat

Edited by Noemi Nardi



Kübra Onat was born in a small multicultural and multi-religious city. In her early years, she had an active childhood in areas of social and environmental sensitivity. She continued her activities in social awareness projects while studying in the Political Science department at Bilkent University, one of the best universities in Turkey. Despite studying political science, she was especially sensitive to women's issues, technology and awareness projects. Her senior project, "Women and their impact on peace processes," was honored in the Transdisciplinary Senior Project course. She has always had a unifying nature in the face of the discriminatory nature of politics. She graduated from the university in 2013.

Unity and sensitivity have been her basic principles. In her professional life she has continued to emphasize that sustainability and women should be present in all areas of life, thinking not only of the present but also of the future. Even while running a cafe for two years in the small town where she was born, she carried out activities based on these principles. While working on European Union projects, she did not give up on honesty and sustainability. She continues to develop herself in the Department of European and Global Studies at the University of Padova, where she got the opportunity to study abroad, her long-delayed dream, with her daughter and for her daughter…

No matter how old you are, you can make an impact!



To find out more about 'SET Padova':

Follow us on Instagram (@set_padova)/YouTube (SET Padova)/LinkedIn (Student Engagement Team)/Facebook (SET Padova).



References


1. Boucher, P. (2020, June). Artificial intelligence: How does it work, why does it matter, and what can we do about it? Scientific Foresight Unit (STOA), European Parliamentary Research Service. https://www.europarl.europa.eu/RegData/etudes/STUD/2020/641547/EPRS_STU(2020)641547_EN.pdf


2. Rodriguez Martinez, M., & Gaubert, J. (2020, March 10). International Women's Day: How can algorithms be sexist? Euronews. https://www.euronews.com/2020/03/08/international-women-s-day-our-algorithms-are-sexist

