Microsoft has unveiled a new text-based filter to root out toxic messages on Xbox Live and revealed that it is working on a similar system for blocking actual voices.
The first system, revealed in a blog post from Xbox on Monday, will scan text-based messages on Xbox Live and block specific content based on a user's preferences.
Users can adjust the filter by choosing between a Friendly, Medium, Mature, or Unfiltered mode. The filter is also customizable, meaning users can opt to allow certain words or block those not already excluded by Xbox.
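A minimal sketch of how such a preference-based filter could work, assuming the four mode names from the article; the word lists, matching logic, and function names are illustrative assumptions, not Microsoft's actual implementation:

```python
# Hypothetical sketch of a preference-based message filter. Mode names mirror
# those described in the article; the blocklists themselves are placeholders.
DEFAULT_BLOCKLISTS = {
    "Friendly": {"insult", "slur", "profanity"},  # broadest filtering
    "Medium": {"slur", "profanity"},
    "Mature": {"slur"},                           # only the harshest content
    "Unfiltered": set(),                          # nothing hidden by default
}

def filter_message(text, mode="Friendly", allow=None, block=None):
    """Return (is_hidden, display_text) for a message under the user's settings."""
    allow = allow or set()   # words the user has explicitly permitted
    block = block or set()   # extra words the user has chosen to block
    words = (DEFAULT_BLOCKLISTS[mode] | block) - allow
    hidden = any(w in text.lower().split() for w in words)
    return hidden, "[potentially offensive hidden message]" if hidden else text

# Blocked by default in Friendly mode:
print(filter_message("that was pure profanity"))
# The same message passes once the user allows the word:
print(filter_message("that was pure profanity", allow={"profanity"}))
```

The customization described in the article maps onto the `allow` and `block` sets: user choices are applied on top of whichever default list the chosen mode selects.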
Messages that the system believes to be inappropriate will now come with a "potentially offensive hidden message" warning label. Adult Xbox Live account holders will be able to click and see what the actual message is if desired, while child accounts will not be able to see such messages by default.
The message filter will be made available to those enrolled in the Xbox Insiders program today. Microsoft hopes to make the system available to all Xbox users later this fall.
But that's not all the company is working on. Xbox is also aiming to develop a voice-based filter for its party chat feature.
Speaking with The Verge, Dave McCarthy, head of Microsoft's Xbox operations, said that his team is looking into combining Microsoft's speech-to-text technology with its new filtration system.
"What we've started to experiment with is, 'Hey, if we're real-time translating speech-to-text, and we've got these text filtering capabilities, what can we do in terms of blocking possible communications in a voice setting?'" McCarthy said. "It's early days there, and there are a myriad of other AI and technology that we're looking to stack around the voice problem, things like emotion detection and context detection that we can apply there. I think we're learning overall; we're taking our time with this to do it right."
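The pipeline McCarthy describes can be sketched as two stages: transcribe a chunk of voice chat, then run the transcript through the existing text filter. This is an assumed illustration; the transcription step is stubbed out, where a real system would call a speech recognition service:

```python
# Hypothetical sketch of speech-to-text feeding a text filter.
# The word list and all function names here are illustrative assumptions.
BLOCKED_WORDS = {"slur", "profanity"}

def transcribe(audio_chunk):
    """Stub for a speech-to-text engine; here the 'audio' is already text."""
    return audio_chunk

def voice_filter(audio_chunk):
    """Transcribe a chunk of voice chat, then apply the text filter to it."""
    text = transcribe(audio_chunk)
    if any(word in text.lower().split() for word in BLOCKED_WORDS):
        return None   # block (or bleep) this chunk of audio
    return text       # pass the chunk through unmodified

print(voice_filter("nice shot"))         # passes through
print(voice_filter("what a profanity"))  # blocked, returns None
```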
Rob Smith, a program manager on the Xbox Live engineering team, added that the ultimate goal would be to create a system that can detect and bleep out inappropriate comments, much as is done on TV.
"It's a great goal, but we're going to have to take steps towards that," Smith said.
Such a system would be difficult to build, since live TV shows often run on a delay while gamers communicate and coordinate in real time. But Smith says that, at a minimum, Xbox could detect a user's level of toxicity.
"In the meantime, we could do things like analyzing a person's speech and figuring out, overall, what's the level of toxicity they're using in this session?" Smith said. "And maybe doing things like automatically muting them."
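The per-session scoring Smith describes could be sketched as tracking what fraction of a player's transcribed lines get flagged, and auto-muting past a threshold. The threshold and word list below are illustrative assumptions, not anything Microsoft has published:

```python
# Hypothetical sketch of per-session toxicity scoring with auto-mute.
TOXIC_WORDS = {"slur", "profanity"}   # placeholder list
MUTE_THRESHOLD = 0.5                  # assumed cutoff for auto-muting

def session_toxicity(lines):
    """Fraction of transcribed lines in this session containing toxic words."""
    if not lines:
        return 0.0
    flagged = sum(
        any(w in line.lower().split() for w in TOXIC_WORDS) for line in lines
    )
    return flagged / len(lines)

def should_auto_mute(lines):
    """Mute the player once their session toxicity exceeds the threshold."""
    return session_toxicity(lines) > MUTE_THRESHOLD

session = ["nice shot", "you absolute profanity", "slur slur", "gg"]
print(session_toxicity(session))   # 2 of 4 lines flagged -> 0.5
print(should_auto_mute(session))   # 0.5 is not above the threshold -> False
```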
McCarthy went on to add that Microsoft is taking user privacy into consideration while creating such systems.
"We have to respect privacy requirements at the end of the day for our users, so we'll step into it in a thoughtful manner, and transparency will be our guiding principle to have us do the right thing for our gamers," McCarthy said.