Now, a lot of you may be wondering: is the BigQuery Web UI the only way to access BigQuery and get at your query results? Say, for example, we had this recent abandoned carts query that returns the most recently abandoned carts. Wouldn't it be awesome if we could use our own application to run this query programmatically, say through a RESTful API, access the unique session IDs, and do something with them in the programming language of our choice, Node.js, Python, or something else, without having to come to the Web UI, wait for an export, and then do something manually? Well, you can actually access the BigQuery service programmatically through its RESTful API, so let's go through a quick demo. This is largely based on a quickstart that I'll provide the link for, so if you want to use a language other than Python, which is what I'll be using in the demo, you're more than welcome to select one of the other languages there.

So we've got this query, and what we need to do is execute it inside a Python environment. How do we set that up, and what does the Python code look like? We're going to be using what's called an IPython Notebook. Inside of one, as we'll see in the demo, you can run arbitrary Python code alongside Markdown, which serves as documentation for your code. It's a nice place for data scientists to get together, collaborate, run a bunch of different queries in a particular notebook, and add documentation in Markdown as well. For hosting IPython Notebooks on Google Cloud Platform, one option is Google Cloud Datalab; another is Kaggle, if you've heard of it.
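To make "programmatic access through the RESTful API" concrete, under the hood it boils down to an HTTP request to BigQuery's `jobs.query` method, roughly like the sketch below. This is an illustrative shape, not the demo's exact call: `YOUR_PROJECT` and the SQL are placeholders, and in practice the client libraries build this request and handle the OAuth token for you.

```
POST https://bigquery.googleapis.com/bigquery/v2/projects/YOUR_PROJECT/queries
Authorization: Bearer <OAuth 2.0 access token>
Content-Type: application/json

{
  "query": "SELECT ... FROM ... ORDER BY date DESC LIMIT 10",
  "useLegacySql": false
}
```

The response comes back as JSON rows, which is what makes it easy for a Node.js or Python application to consume the results directly.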
Kaggle is famous for its data science competitions, and you might also be familiar with Kaggle Kernels, which are IPython Notebooks widely used in the machine learning world, where data scientists are constantly running new models and sharing the results. This is a very popular tool to get used to. For the purposes of this demo, don't worry about the technology behind it; just understand that we're running arbitrary Python code in a hosted environment.

So inside this demo notebook that I've created, what should look familiar is that we've pasted in this query, the recent abandoned carts query. Accessing the RESTful API through Python is done by simply importing the BigQuery library from google.cloud, and that library comes with a lot of different hooks into the service. In the notebook itself, you can see I've set up some description that looks a little friendlier than the green comments you're used to. We've imported the BigQuery library, we're defining our query, we're creating a BigQuery client, we're passing it a query job, and we're asking it to run the query, return the results, and format them a little, all outside of the Web UI. We can run this directly here. So let's just run this code and see if we can't get some results back. And boom, this actually returns pretty quickly. These are the 10 results, with a little formatting: I'm just adding prefixes for the date and session. You can imagine that instead of displaying these results, you could pass them through to your application, take a look at those sessions, and proactively market to the individuals who abandoned their carts, maybe even catching them the next time they're on the website by offering a special promotion or something like that.
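Here's a minimal sketch of what that notebook cell might look like. The table name, column names, and prefix formatting are placeholders, not the demo's actual dataset, and running the query itself requires a GCP project, credentials, and the `google-cloud-bigquery` package, so the API call is wrapped in a function you'd invoke yourself.

```python
def format_row(visit_date, session_id):
    """Add friendly prefixes to a raw result row (illustrative formatting)."""
    return f"date: {visit_date} | session: {session_id}"


def run_abandoned_carts_query(project_id):
    """Run a recent-abandoned-carts query through the BigQuery client library.

    Assumes `pip install google-cloud-bigquery` and application default
    credentials; the table below is a placeholder, not the demo's dataset.
    """
    from google.cloud import bigquery

    query = """
        SELECT date, fullVisitorId AS session_id
        FROM `your_project.your_dataset.abandoned_carts`  -- placeholder table
        ORDER BY date DESC
        LIMIT 10
    """
    client = bigquery.Client(project=project_id)
    query_job = client.query(query)      # submits the job to BigQuery
    for row in query_job.result():       # blocks until the job finishes
        print(format_row(row.date, row.session_id))


# run_abandoned_carts_query("your-project-id")  # needs a real project + credentials
```

This is the same flow as the demo: import the library, define the query, hand it to a client, and iterate over the returned rows in ordinary Python.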
Either way, you have programmatic access to the data, beyond just clicking Run Query inside the Google BigQuery Web UI. Now, beyond that, the great thing about something like a Cloud Datalab notebook is that, instead of writing all of that Python code yourself, it integrates really well with BigQuery and a lot of other tools, including machine learning tools like TensorFlow. You can write a lot of your TensorFlow code in these data science notebooks, but for our purposes we'll stick with BigQuery. Instead of writing all of that additional Python to wrap things and define the queries, Cloud Datalab automatically knows what to do if you specify this "magic": BigQuery, and I'm going to call this a query. It provides a lot of the rest for you, plus syntax highlighting, which is useful. Now, you don't get that handy check or X for a valid query, so generally I like to write the query in the BigQuery Web UI and copy and paste it in here. You still get all the benefits of running the queries within the notebook, and you can see we get our results here much like you would inside the BigQuery Web UI. So those are a couple of different ways: the Python way up here, the Datalab BigQuery magic here, or the BigQuery Web UI itself. Now, you're going to see this a lot more. These Cloud Datalab notebooks, or Kaggle Kernels, which are IPython Notebook based, come up often when you're using BigQuery alongside the many other tools available to you, particularly if you'll be working through the machine learning specialization. That's where you can intersperse creating or prepping a machine learning dataset with BigQuery, Cloud Dataproc, and Cloud Dataflow, then use a library like TensorFlow to run machine learning models, and then scale them out to the cloud with a tool like Cloud Machine Learning Engine.
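If you're curious what that "magic" cell looks like, here's a sketch. The exact syntax depends on the notebook environment, so treat this as illustrative: Cloud Datalab shipped a `%%bq query` magic, while the `google-cloud-bigquery` library provides an equivalent `%%bigquery` cell magic for Jupyter notebooks. Either way, the idea is the same: the cell body is just SQL, with no Python wrapper code. The table below is a placeholder.

```
%%bigquery
SELECT date, fullVisitorId
FROM `your_project.your_dataset.abandoned_carts`
ORDER BY date DESC
LIMIT 10
```

Running the cell executes the query and renders the result rows inline, much like the Web UI's results table.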
So this was just a quick primer on how you can run BigQuery outside of the Web UI, but if you stick around for future specializations, particularly data engineering or machine learning on Google Cloud Platform, this definitely won't be the last time you'll see Cloud Datalab notebooks. All right, that's it for the demo. If you're interested in more, I'll throw up some links where you can get a quick tutorial on how to do this yourself.