The Eticas Foundation has launched a directory for learning about the risks and challenges of the machine-learning systems that governments and large companies use and that directly affect us. Discover the project!

Algorithms

OASI is the first search engine for finding the algorithms that governments and companies use on citizens. Created by the Eticas Foundation, this observatory of algorithms with social impact collects information on dozens of algorithms used by public administrations and companies around the world in order to learn more about their social impact. The aim is to give the public access to information on both government and corporate algorithms, and to reveal who uses them, who develops them, what threats they pose and whether they have been audited, among other characteristics.

Today, both companies and public administrations automate decisions with algorithms. Yet their development and deployment follow no external quality controls, nor are they as transparent as they should be, which leaves the population unprotected. With this search engine, anyone can learn more about these algorithms: who developed them, who uses them, their scope of application, whether they have been audited, their objectives, their social impact and the threats they pose.

So far, OASI has collected 149 algorithms, 24 of which are already being applied in the US by the government and Big Tech companies. Examples include ShotSpotter, an algorithmic tool the Oakland Police Department deploys to combat and reduce gun violence using sound-monitoring microphones, and an algorithm used by Allegheny County, Pennsylvania, to predict possible child abuse and neglect. A corporate example is Rekognition, Amazon's facial recognition system, which the MIT Media Lab audited in early 2019 and found to perform substantially worse at identifying a person's gender if she was a woman or had darker skin.

The most common discrimination is based on age, gender, race or disability, and it is produced unintentionally by developers who lack the socio-economic expertise to understand the impact of this technology. These engineers design the algorithms on the basis of technical skills alone, and because there are no external checks and the system appears to work as expected, the algorithm keeps learning from deficient data.