In this demonstration, I'll take a fairly large log file and load it into BigQuery. Using BigQuery, I'll perform SQL queries on the data to gain some insight into the patterns represented in the log.

We'll start by loading data from Cloud Storage into BigQuery. In the GCP Console's Products and Services menu, I'll scroll down to BigQuery. Inside my GCP project, I'll create a new dataset. I'll name my dataset logdata, and for its location I'll choose the one that's geographically closest to me.

Now that the dataset has been created, I'm going to add a table. For the source location, I'll give a URL that points to a CSV file I've staged in a Cloud Storage bucket. This CSV file contains web server access log data. For my destination table name, I'll specify accesslog. I'll tell BigQuery to automatically detect the schema, accept the rest of the defaults, and click "Create Table." It takes a moment for the load to occur.

Now the load is complete. I can click on the table name to view its schema, which was created automatically by examining the types of data in the fields. We can click "Preview" to look at some of the data.

Now let's perform a query on this table using the BigQuery web user interface. We click "Compose Query," paste in the SQL query, and run it. Now we can browse the query results.

We can also run a query using the bq command from within Cloud Shell. Because this is our first use, we're asked to confirm which GCP project we're using. Here are our query results.

In this demonstration, I loaded a log file into BigQuery, and I performed SQL queries to gain insight into what the log represented.
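For reference, the load step above can also be done from the command line. Here is a minimal sketch of the equivalent bq invocation; the bucket path gs://my-bucket/access_log.csv is a hypothetical stand-in, since the demo doesn't show the actual staged file's URL.

```sh
# Load a CSV from Cloud Storage into the logdata.accesslog table,
# letting BigQuery detect the schema automatically.
# NOTE: gs://my-bucket/access_log.csv is a hypothetical path;
# substitute the URL of the file you actually staged.
bq load --autodetect --source_format=CSV \
  logdata:accesslog \
  gs://my-bucket/access_log.csv
```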
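The demo doesn't show the query text itself. Here is a sketch of the kind of query that fits this scenario, assuming the autodetected schema includes an HTTP status-code column; the column name status is an assumption, since the actual names depend on the CSV's header row (headerless CSVs get generated names such as string_field_0).

```sql
-- Hypothetical example: count requests per HTTP status code.
-- The column name `status` is an assumption about the
-- autodetected schema; adjust to match the actual table.
SELECT status, COUNT(*) AS request_count
FROM `logdata.accesslog`
GROUP BY status
ORDER BY request_count DESC;
```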
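Similarly, the Cloud Shell step isn't shown verbatim. A sketch of running the same (assumed) query with the bq CLI, using standard SQL:

```sh
# Run the query from Cloud Shell with the bq CLI.
# On first use, bq prompts you to confirm the default GCP project.
bq query --use_legacy_sql=false \
'SELECT status, COUNT(*) AS request_count
 FROM `logdata.accesslog`
 GROUP BY status
 ORDER BY request_count DESC'
```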