Google is using its Artificial Intelligence systems to help people get access to critical information while avoiding potentially shocking or harmful content, so that they can stay safe, both online and offline.
Google shows contact information alongside the most relevant and helpful results when people search for topics such as suicide, sexual assault, substance abuse and domestic violence. But for people in personal crises, it relies on machine learning to understand their language.
The tech giant's latest AI model, Multitask Unified Model (MUM), can automatically and more accurately detect a wider range of personal crisis searches.
MUM can better understand the intent behind people's questions to detect when a person is in need, which helps Google more reliably show trustworthy and actionable information at the right time.
"MUM not only understands language, but also generates it. It's trained across 75 different languages and many different tasks at once, allowing it to develop a more comprehensive understanding of information and world knowledge than previous models," shared Pandu Nayak, Google Fellow and Vice President of Search, in a blogpost.
"And MUM is multimodal, so it understands information across text and images and, in the future, can expand to more modalities like video and audio," he added.
Another feature that keeps users safe on Search, while steering clear of unexpectedly shocking results, is SafeSearch mode, which lets users filter explicit results.
"This setting is on by default for Google accounts of people under 18. And even when users choose to have SafeSearch off, our systems still reduce unwanted racy results for searches that aren't seeking them out," Nayak said.
Further, Google uses advanced AI technologies like BERT to better understand what an individual is looking for.
BERT has improved Google's understanding of whether searches are truly seeking out explicit content, vastly reducing the chances of encountering surprising search results.
According to Nayak, BERT reduced unexpected shocking results by 30 per cent last year.
"It's been especially effective in reducing explicit content for searches related to ethnicity, sexual orientation and gender, which can disproportionately impact women and especially women of colour," he added.
Nayak stated that Google is also working with trusted local partners to better detect personal crisis queries all over the world, and show actionable information in several more countries.
"Whatever you're searching for, we're committed to helping you safely find it," Nayak said.
This story has been sourced from a third-party syndicated feed. Mid-day accepts no responsibility or liability for its reliability or accuracy. Mid-day management/mid-day.com reserves the sole right to alter, delete or remove (without notice) the content at its absolute discretion for any reason whatsoever.