Microsoft apologises for offensive outburst by 'chatbot'

Microsoft has apologised for racist and sexist messages generated by its Twitter chatbot. The bot, called Tay, was launched as an experiment to learn more about how artificial intelligence programs can engage with internet users in casual conversation. The program had been designed to mimic the speech of a teenage girl, but it quickly learned to imitate the offensive language that Twitter users began feeding it. Microsoft was forced to take Tay offline just a day after it launched. In a blog post, the company said it takes "full responsibility for not seeing this possibility ahead of time."
