Here's everything Google announced at its big I/O conference

San Francisco: Google's Android operating system has reached a milestone by powering 2 billion monthly active devices around the world, the company said.

The push marks another step toward infusing almost all of Google's products with some semblance of artificial intelligence, the practice of writing software that enables computers to gradually learn to think more like humans. Think about Google Search: it was built on our ability to understand text in webpages.

Google has announced a new tool that will enable users to get information about any object just by pointing a phone camera at it. (Have you noticed how much Google loves the phrase "in the coming months"?) Google Home, meanwhile, is gaining hands-free phone calling, which means you might finally be able to ditch your house's landline - if you're one of those weird people who still has a landline. And the new Google for Jobs initiative will leverage Google's advanced machine learning capabilities to better match opportunities with candidates.

AI can also help with basic science, such as DNA sequencing.

Yes, Google's search engine may indeed be the world's most popular.

Abu Qader, a high school student in Chicago, taught himself how to use TensorFlow, Google's machine learning framework, by watching YouTube videos.
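For the curious, the core idea behind a framework like TensorFlow - fitting a model's parameters to example data by gradient descent - can be sketched in a few lines of plain Python. This is an illustration of the concept only, not actual TensorFlow code:

```python
# Fit y = w * x to example pairs by gradient descent, the basic idea
# behind machine learning frameworks like TensorFlow.
examples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # true relationship: y = 2x

w = 0.0    # model parameter, starts out knowing nothing
lr = 0.05  # learning rate: how far to step on each update

for _ in range(200):
    # gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in examples) / len(examples)
    w -= lr * grad  # step downhill

# After training, w ends up close to 2.0, the true slope.
```

Real frameworks do exactly this, just with millions of parameters and with the gradients computed automatically.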

The Assistant is a powerful example of these advances at work. (Amazon already offers similar features on the Echo and other devices that use its Alexa assistant.) I/O is first and foremost a developer conference, but consumers love to tune in as well, so Google usually makes a few announcements to get everyone excited.

Pichai has made AI the foundation of his strategy since becoming Google's CEO in late 2015, emphasizing that technology is rapidly evolving from a "mobile-first" world, where smartphones steer the services that companies are building, to an "AI-first" world, where the computers supplement the users' brains.

The camera tool will also recognize wireless network names and passwords printed on stickers on Wi-Fi routers, and automatically connect the smartphone to that network. Sameer Samat, a Google executive, said on stage that the new OS has a feature that easily tracks mobile data usage and ties in automatically with the billing systems of wireless carriers.

The tech giant also announced that Google Assistant is now available on the iPhone for the first time.

All of this requires the right computational architecture, and it is, of course, impossible to achieve without machine learning. A lighter-weight version of Android might not mean much to Samsung Galaxy S8 owners, but it could mean the world to anyone from TracFone users to people in developing countries.

Alphabet introduced Google Assistant at last year's I/O conference for Google Home and Allo, and this year announced a breadth of new integrations and use cases. "We've had significant breakthroughs." But today, there are still too many barriers to making this happen. Google's first-generation TPUs were designed to perform AI inference using 8-bit integers; it's not clear what math precision the second-generation units use. You have to feed your robot brains (aka neural networks) unfathomably large amounts of data to chew on before they can start to learn what they're doing: here's what a good photo looks like; now take this bad photo and make it look better.
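To give a sense of what "8-bit integer" inference means, here is a minimal sketch of symmetric int8 quantization in plain Python. This illustrates the general technique of squeezing floating-point weights into small integers, not Google's actual TPU pipeline:

```python
def quantize_int8(weights):
    # Symmetric linear quantization: map floats onto the int8 range [-127, 127].
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    # Recover approximate float values from the integer representation.
    return [v * scale for v in q]

weights = [0.12, -0.5, 0.33, 0.9]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
# Each 8-bit value takes a quarter of the memory of a 32-bit float,
# at the price of a small rounding error (at most half of `scale`).
```

The appeal for hardware like a TPU is that 8-bit integer multiplies are far cheaper in silicon and memory bandwidth than 32-bit floating-point ones, and neural networks usually tolerate the rounding error.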

I don't really know what Kotlin is besides the fact that it's a programming language, but the developers in the Google I/O crowd seemed STOKED that it will soon be officially supported on Android. Google has also used ML to improve the algorithm that detects the spread of breast cancer to adjacent lymph nodes. Machine learning is ideally suited to Google, with its vast reserves of visual information and growing cloud AI infrastructure.

Over the course of the keynote, Google gave us examples of how AI is fundamentally changing the way we interact with technology.