Microsoft apologises for offensive outburst by 'chatbot'

Microsoft has apologised for racist and sexist messages generated by its Twitter chatbot. The bot, called Tay, was launched as an experiment to learn more about how artificial intelligence programmes can engage with internet users in casual conversation. The programme had been designed to mimic the speech of a teenage girl, but it quickly learned to repeat the offensive language that Twitter users began feeding it. Microsoft was forced to take Tay offline just a day after its launch. In a blog post, the company said it takes "full responsibility for not seeing this possibility ahead of time."
