Microsoft will kill its AI for ethical reasons

Worried about the ethical excesses of its artificial intelligence, Microsoft is preparing to retire it.

Artificial intelligence has been a source of concern for a few weeks. While Google's is convinced it has a soul and feelings, Microsoft's is headed for an untimely end. Specialized in facial recognition, the American giant's technology can reportedly “read” several pieces of sensitive information from a face, such as age, gender, ethnicity, and even emotions. A technical feat that is hard to accept for privacy advocates.

Microsoft will kill its AI… not quite

Aware that the technology raises many ethical questions, Microsoft announced this week that, given the impossibility of establishing a tangible link between a person’s facial expressions and the emotions they feel, it had taken the radical decision to shut down this artificial intelligence. In reality, the web giant undoubtedly has other concerns in mind, starting with the human discrimination that its system is potentially capable of causing.

For new users, it will now be impossible to access the facial recognition system in Microsoft’s Face API. Already-registered customers have until June 30, 2023 to say goodbye to the software, before it is no longer accessible to the general public. Note, however, that behind this announcement, Microsoft intends to continue integrating its technology into certain “controlled” tools, starting with Seeing AI, the software designed to make life easier for the visually impaired.

Is ethical AI possible?

This decision obviously does not come about by chance. A few days ago, Microsoft publicly shared its Responsible AI Standard, a sort of white paper in which the company sketches out the future of its decision-making process for AI. For the tech giant, AI will have to focus on inclusiveness, confidentiality, and transparency if it wants to win over the general public. This will happen in particular, Redmond believes, by excluding emotion detection.
