Published on October 19th, 2020 | by Emergent Enterprise
Google Details How it’s Using AI and Machine Learning to Improve Search
In many ways, artificial intelligence is a "hidden technology": it touches our lives constantly, yet we rarely realize it is being used. Sending a query to Google Search is just one example. As Kyle Wiggers shows in this article from VentureBeat, AI, machine learning, AR, and other technologies spring into action every time we ask Google to help us solve a mystery. It also means this powerful AI can be incorporated into our own applications and solutions. The dark side of all this specific, customized knowledge is that others can learn more and more about us. As the AI becomes more precise, so does the picture of ourselves that we share. Are you ready to draw back the curtain on yourself in a very transparent way?
Image Credit: Reuters
During a livestreamed event this afternoon, Google detailed the ways it’s applying AI and machine learning to improve the Google Search experience.
Google says users will soon be able to see how busy places are in Google Maps without searching for specific beaches, grocery stores, pharmacies, or other locations, an expansion of Google’s existing busyness metrics. The company also says it’s adding COVID-19 safety information to businesses’ profiles across Search and Maps, revealing whether they’re using safety precautions like temperature checks, plexiglass shields, and more.
An algorithmic improvement to “Did you mean?” — Google’s spell-checking feature for Search — will enable more accurate and precise spelling suggestions. Google says the new underlying language model contains 680 million parameters (the variables that determine each prediction) and runs in less than three milliseconds. “This single change makes a greater improvement to spelling than all of our improvements over the last five years,” Google head of search Prabhakar Raghavan said in a blog post.
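Google's 680-million-parameter model is proprietary, but the basic idea behind "Did you mean?" can be illustrated with a much humbler stand-in: compare a misspelled word against a vocabulary of known words and suggest the closest match. This minimal sketch uses Python's standard-library `difflib` (the tiny `VOCABULARY` list and the `cutoff` threshold are illustrative assumptions, not anything from Google's system):

```python
import difflib

# Toy vocabulary standing in for the knowledge a real language model encodes.
VOCABULARY = ["search", "machine", "learning", "google", "equipment"]

def did_you_mean(query_word, vocabulary=VOCABULARY, cutoff=0.7):
    """Return the closest known word, or None if nothing is similar enough."""
    matches = difflib.get_close_matches(query_word, vocabulary, n=1, cutoff=cutoff)
    return matches[0] if matches else None

print(did_you_mean("serach"))  # suggests "search"
```

A production system replaces the string-similarity heuristic with a learned model that also weighs context (the surrounding query terms), which is why Google frames the improvement in terms of a language model rather than edit distance.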
Google says it can now index individual passages from webpages, as opposed to whole pages. When this rolls out fully, Google claims it will improve roughly 7% of search queries across all languages. A complementary AI component will help Search capture the nuances of webpage content, ostensibly leading to a wider range of results for search queries.
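The shift from page-level to passage-level indexing can be sketched in a few lines: break a page into word windows, score each window against the query, and rank the best window rather than the page as a whole. The splitting strategy and the term-overlap score below are deliberately naive placeholders for Google's actual (undisclosed) methods:

```python
def split_into_passages(text, size=30):
    """Naively split a page's text into fixed-size word windows ("passages")."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def score(passage, query):
    """Toy relevance score: fraction of query terms present in the passage."""
    terms = set(query.lower().split())
    return len(terms & set(passage.lower().split())) / len(terms)

def best_passage(page_text, query):
    """Rank individual passages instead of scoring the whole page at once."""
    return max(split_into_passages(page_text), key=lambda p: score(p, query))
```

The payoff is the one Google describes: a long page that buries one highly relevant paragraph can now surface on the strength of that paragraph alone.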
“We’ve applied neural nets to understand subtopics around an interest, which helps deliver a greater diversity of content when you search for something broad,” Raghavan continued. “As an example, if you search for ‘home exercise equipment,’ we can now understand relevant subtopics, such as budget equipment, premium picks, or small space ideas, and show a wider range of content for you on the search results page.”
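Identifying subtopics is the hard, neural-net part; once results are grouped by subtopic, showing "a wider range of content" is a diversification step. One simple strategy (an assumption for illustration, not Google's published ranking logic) is to interleave results round-robin across subtopic buckets:

```python
from itertools import chain, zip_longest

def diversify(results_by_subtopic):
    """Interleave ranked results round-robin across subtopic buckets so the
    top of the results page covers several subtopics of a broad query."""
    interleaved = chain.from_iterable(zip_longest(*results_by_subtopic.values()))
    return [result for result in interleaved if result is not None]
```

For the "home exercise equipment" example, buckets like budget equipment, premium picks, and small-space ideas would each contribute a result near the top instead of one subtopic dominating the page.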
Google is also bringing Data Commons — its open knowledge repository that combines data from public datasets (e.g., COVID-19 stats from the U.S. Centers for Disease Control and Prevention) using mapped common entities — to search results on the web and mobile. In the near future, users will be able to search for topics like “employment in Chicago” on Search to see information in context.
On the e-commerce and shopping front, Google says it has built cloud streaming technology that enables users to see products in augmented reality (AR). With cars from Volvo, Porsche, and other auto brands, for example, smartphone users can zoom in to view the vehicle’s steering wheel and other details — to scale. Separately, Google Lens on the Google app or Chrome on Android (and soon iOS) will let shoppers discover similar products by tapping on elements like vintage denim, ruffled sleeves, and more.