The first ingredient common to all of the applications of the superpowers is computing. Computing at its most basic is any task that requires a calculation. More specifically, computing is the ability to process information: to store, retrieve, compare, and analyze it, and to automate actions, most often nowadays with the support of a computer. In the beginning, giant computers owned by institutions filled entire rooms and took a long time to process small amounts of data. But two recent changes dramatically affected the computing landscape.

The first change brought computers from institutions to individuals and kicked off an exponential growth trajectory that doubled computing power roughly every two years. This growth rate is known as Moore's Law, and the increase in compute power dramatically changed what we could ask computers to do. The first version of Unix, in 1971, used approximately 10,000 lines of code. The first version of Photoshop used 100,000 lines of code, and at Google today, we use two billion lines of code and counting.

This first change then led to increased consumer demand for services. Moreover, it raised consumer expectations that services be available wherever they are and on whatever devices they're using, which led to a second evolution: the nature of the services consumers want to consume has changed. Services must be fast, convenient, and relevant to their context. Customers no longer expect to be served web-page-style content. Today's services involve natural language, immersive video, constant interaction, and instant visual representation. They are highly connected and very compute-intensive.

Google has invested heavily in its mission to help businesses meet these demands, which has led to the building of a multilayered, secure, and scalable IT infrastructure. That same infrastructure now supports all of our products: Search, Google Maps, G Suite, YouTube, and many others, with hundreds of machine learning models built into each of them.
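The doubling described by Moore's Law can be sketched as a simple formula. This is only a rough illustration (real chip progress is noisier, and the two-year doubling period is an approximation), with a hypothetical helper name:

```python
def moores_law_factor(years, doubling_period=2):
    """Growth multiplier after `years` of doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Doubling every two years compounds quickly:
print(round(moores_law_factor(10)))  # 32x after one decade
print(round(moores_law_factor(20)))  # 1024x after two decades
```

Compounded over a couple of decades, this is the growth that turned 10,000-line systems like early Unix into billion-line codebases.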
And this is only the beginning. Moore's Law is being disrupted twice by radical new designs in chips. The first disruption comes from processors built specifically for this type of application, which we call TPUs, for Tensor Processing Units. They are not merely twice as powerful, as Moore's Law would predict; they are 50 times more powerful than traditional chips. The second disruption comes from quantum computing, which is 100 million times more powerful. To give you a concrete sense of the impact, a machine learning model that would require a day of training with traditional processors requires only half an hour with TPUs. A model that would require 10,000 years of training with traditional processors will now require only a handful of seconds with quantum computing.

There is a constraint attached to this evolution, though. The cooling requirements for these new processors can only be met in large industrial environments. TPUs need to be cooled with pressurized water within the chip, while quantum computers require temperatures near absolute zero to operate. Cloud data centers are the only environment where we can create these conditions at scale, and they will be, for the foreseeable future, the only option to tap into this vast amount of computing power.

Think about electricity: in the first years after its discovery, most users had generators located where they lived or worked. As the industry matured, we created power plants, which we access through a grid. The same principle applies today to computing: it will mainly be generated in plants, which are the data centers, and accessed through a grid, which is the Internet. This means that even the huge amounts of information we are collecting with industrial sensors, embedded assistants, and smart devices will be easily addressed in the coming years with revolutionary computing technology. This brings us to the second ingredient common to all of these applications: data.
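The training-time arithmetic above follows directly from the speedup factor: a 50x faster processor turns a one-day run into roughly half an hour. A minimal sketch (the function name is illustrative, and the 50x figure is the claim quoted above; real workloads rarely scale this cleanly):

```python
def accelerated_time(baseline_hours, speedup):
    """Wall-clock time after applying an ideal end-to-end speedup factor."""
    return baseline_hours / speedup

# One day of training at a claimed 50x TPU speedup:
tpu_hours = accelerated_time(24, 50)      # 0.48 hours
print(f"{tpu_hours * 60:.0f} minutes")    # ~29 minutes, i.e. about half an hour
```

The same division, applied with a factor of 100 million, is what collapses a 10,000-year workload into something a person could wait for.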