Tutorial on Google IoT Core with Raspberry Pi to visualize sensor data

Mar 20, 2020 | IoT Cloud, Projects, Raspberry Pi projects

In this tutorial, we will learn how to build an IoT application using the Google Cloud Platform. The application will collect sensor data from a Raspberry Pi and send it to the cloud, where it will be stored in BigQuery and plotted as a graph.

Contents

  1. Hardware Requirements
  2. Initial Setup
  3. Creating A Project
  4. Enable Required APIs
  5. Setup IoT Core Service
  6. Download API credential JSON file
  7. Setup Google BigQuery
  8. Setting up cloud storage bucket
  9. Setting up Google Dataflow
  10. Setting up the Raspberry Pi
  11. Running the Code
  12. Visualize the data

1. Hardware Requirements

  • Raspberry Pi 3
  • HC-SR04 (or any sensor)
  • 330 Ω and 470 Ω resistors
  • Breadboard
  • Jumper cables

Note: You can use any sensor you like and send any data to the cloud. Here we will use the HC-SR04 distance sensor as an example.

2. Initial Setup

  1. First, create a Google account or use an existing one.
  2. Create your Google Cloud account. (This process requires your credit card details to verify that you are not a robot.)

Note: You can also use the free trial to follow this tutorial.

3. Creating A Project

A project organizes all your Google Cloud resources. A project consists of a set of users; a set of APIs; and billing, authentication, and monitoring settings for those APIs. Let’s begin.

  1. Go to the Manage Resources page in your Google Cloud console and select “Create Project” (https://console.cloud.google.com/cloud-resource-manager).
  2. Enter a name for the project (it can be anything you wish) and click “Create”.
Create new project

4. Enable Required APIs

  1. We need to make sure the following APIs are enabled:
    1. Dataflow API
    2. Cloud Pub/Sub API
    3. Cloud IoT API
  2. Find “APIs & Services” in the hamburger menu at the top left. Hover over it and click “Dashboard”.
APIs and services dashboard
  3. Next, click on “Enable APIs and Services”.
Enable APIs button
  4. In the search bar, type “Dataflow API”.
Dataflow search result
  5. Select the result and you should see a page like this.
Enable Dataflow API
  6. Click on “Enable”.
  7. Repeat steps 4 to 6 for the other two APIs mentioned.
Enable Pub/Sub API
Enable google cloud IoT API

5. Setup IoT Core Service

  1. In the search bar at the top of the page, search for “IoT Core” and select it.
  2. Once the IoT Core page opens, click the “Create Registry” button.
Create registry
  3. Enter a registry ID and choose the region closest to you geographically.
Enter registry ID
  4. Create a new Cloud Pub/Sub topic: click the drop-down menu and choose “Create a topic”.
  5. Give it a topic ID that you like. I am going to name it “my-topic”. Click “Create Topic”.
Create a Topic
  6. Finish creating the registry by clicking “Create”.
Create button

6. Download API credential JSON file

  1. Now we have to get credentials to authenticate our API requests. Go to https://console.cloud.google.com/apis/credentials/serviceaccountkey

Note : make sure you have the correct project selected.

  2. Under “Service account”, select “New service account”; the rest of the options will then appear. Enter a name, choose “Owner” as the role, keep the key type as JSON, and click “Create”.
Create Credential file
  3. A JSON key file will be downloaded. Send this file to the Raspberry Pi and store it safely, as it gives access to your cloud account.

7. Setup Google BigQuery

We are going to use Google’s BigQuery service as our database to store all the data that our sensor captures.

  1. Search for “BigQuery” in the search bar, or find it in the menu on the left side under the “Big Data” category, and select it.
BigQuery in menu
  2. Once the page opens, you will see a button labeled “Create Dataset”. Click it.
Create dataset button
  3. Name your new dataset. I have named it “DistanceSensorDataset”. Also choose your geographic location in the “Data location” drop-down.
create dataset info
  4. Click the “Create Table” button. All the data your sensor generates will be stored here.
create table button
  5. On this page, select “Empty table” in the “Create table from” drop-down. Enter a table name; it can be anything you wish.
  6. Click “Add field”. Enter a field name and the data type you wish to store in that field. You can add as many fields as you need.
Create table info
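The fields you add here must match the keys of the data you will publish later. As a sketch (the field names "time" and "distance" are the ones this tutorial uses; substitute your own), the schema can also be written out as JSON, which is the format BigQuery accepts when creating tables programmatically:

```python
import json

# Hypothetical schema mirroring the payload published later in this
# tutorial: a UNIX timestamp and a distance reading in centimetres.
schema = [
    {"name": "time", "type": "FLOAT", "mode": "NULLABLE"},
    {"name": "distance", "type": "FLOAT", "mode": "NULLABLE"},
]

print(json.dumps(schema, indent=2))
```

Writing the schema down like this makes it easy to check later that every key in your sensor payload has a matching field.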

8. Setting up cloud storage bucket

  1. Find “Storage” in the hamburger menu on the left side, or search for it in the search bar, and select it.
Storage button in menu
  2. Create a new bucket by clicking “Create Bucket”.
Create bucket button
  3. Name the bucket. I will call it “iotedu-tutorial-bucket”. Click “Continue”.
Bucket setup info
  4. Keep the location type as “Multi-region”, leave the rest of the options at their defaults, and click “Continue”.
  5. Finally, click the “Create” button.

9. Setting up Google Dataflow

  1. Find Google Dataflow in the hamburger menu on the left side or in the search bar at the top.
Dataflow button in menu
  2. In the menu at the top, click the “Create Job From Template” button.
Create job button position
  3. Enter a name for the job; let’s call it “send-to-bq”. Choose the region closest to you geographically, and choose “Cloud Pub/Sub Topic to BigQuery” under Cloud Dataflow template. More options will then appear.
Create job info
  4. Under required parameters, set the topic in the following format: projects/<project id>/topics/<topic name>
  5. Under the BigQuery output table, enter the table in the following format: <project_id>:<dataset_name>.<table_name>
  6. Enter the temporary location in the following format (remember the bucket we created in an earlier step?): gs://<bucket_name>/tmp/
  7. Click “Show optional parameters”, set the max workers to 2, and set the machine type to “n1-standard-1”.
  8. Leave the rest of the options as they are and click “Run Job”.
  9. You should now have a running job that moves data from your Pub/Sub topic to your BigQuery table.

Note: Remember to remove any spaces before or after the values you enter.
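The three parameter formats above are easy to get wrong, so here is a small sketch that builds them from the names used in this tutorial (the function and dictionary keys are illustrative labels of my own, not the template’s exact parameter names):

```python
def dataflow_params(project_id, topic, dataset, table, bucket):
    """Build the three strings the Pub/Sub-to-BigQuery template expects."""
    return {
        # Required parameter: the input Pub/Sub topic
        "topic": "projects/{}/topics/{}".format(project_id, topic),
        # BigQuery output table
        "output_table": "{}:{}.{}".format(project_id, dataset, table),
        # Temporary location in the storage bucket
        "temp_location": "gs://{}/tmp/".format(bucket),
    }

params = dataflow_params("my-project", "my-topic",
                         "DistanceSensorDataset", "distance_table",
                         "iotedu-tutorial-bucket")
print(params["topic"])  # projects/my-project/topics/my-topic
```

Note that none of the strings contain leading or trailing spaces, which matches the note above.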

10. Setting up the Raspberry Pi

  1. First, make all the connections required for the distance sensor (or any sensor of your choice) according to the diagram below.
Fritzing circuit diagram
circuit image IRL
  2. Create a directory called “cloud” and then cd into it.
mkdir cloud
cd cloud
  3. Now let’s write a Python function that returns data from the distance sensor; save it as “sensordata.py”.
import RPi.GPIO as GPIO
import time

GPIO.setmode(GPIO.BCM)

# BCM pin numbers for the HC-SR04 trigger and echo pins
GPIO_TRIGGER = 18
GPIO_ECHO = 24

GPIO.setwarnings(False)
GPIO.setup(GPIO_TRIGGER, GPIO.OUT)
GPIO.setup(GPIO_ECHO, GPIO.IN)

def FindDistance():
    # Send a 10 microsecond pulse on the trigger pin
    GPIO.output(GPIO_TRIGGER, True)
    time.sleep(0.00001)
    GPIO.output(GPIO_TRIGGER, False)

    StartTime = time.time()
    StopTime = time.time()

    # Record the time at which the echo pulse starts...
    while GPIO.input(GPIO_ECHO) == 0:
        StartTime = time.time()

    # ...and the time at which it ends
    while GPIO.input(GPIO_ECHO) == 1:
        StopTime = time.time()

    # Sound travels at roughly 34300 cm/s; halve the result because
    # the pulse travels to the object and back
    TimeElapsed = StopTime - StartTime
    distance = (TimeElapsed * 34300) / 2

    return distance
  4. Now let’s write the code to send messages to the Pub/Sub topic. Name the file “sendmsg.py”. Enter your project ID as the value of the project_id variable (remember to enter the project ID, not the project name, though they can be the same).
from google.cloud import pubsub_v1
import json
import sensordata
import time

project_id = "<your project id>" # enter your project ID here
topic_name = "<topic name>" # enter the name of the topic that you created

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project_id, topic_name)

# Messages that have been published but not yet acknowledged,
# keyed by the JSON string of the payload.
futures = dict()

def get_callback(f, message):
    def callback(f):
        try:
            f.result()  # raises if the publish failed
            futures.pop(message)
        except Exception:
            print("Please handle {} for {}.".format(f.exception(), message))

    return callback

try:
    while True:
        distance = round(float(sensordata.FindDistance()), 2)
        timenow = float(time.time())
        data = {"time": timenow, "distance": distance}
        print(data)
        message = json.dumps(data)
        # When you publish a message, the client returns a future.
        future = publisher.publish(
            topic_path, data=message.encode("utf-8")) # data must be a bytestring
        futures[message] = future
        # Publish failures are handled in the callback function.
        future.add_done_callback(get_callback(future, message))
        time.sleep(5)
except KeyboardInterrupt:
    # Wait for all pending publish futures to resolve before exiting.
    while futures:
        time.sleep(5)
    print("Published all pending messages.")
  5. Do you see the dictionary stored in the variable data? You can change this dictionary to store whatever data you wish to send to BigQuery. However, remember that each key of the dictionary must match a field name in the BigQuery table.
  6. Now create a virtual environment using the command:
virtualenv venv
  7. Activate the virtual environment with the command:
source venv/bin/activate
  8. Now let’s install the required pip packages:
pip install google-cloud-pubsub
pip install RPi.GPIO
  9. Remember the credentials JSON file that you downloaded from the GCP website? Move this JSON file to the directory named “cloud”.
  10. At this point, your working directory (cloud) on the Raspberry Pi should look something like this.
tree of directory
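Before running anything on the Pi, the payload that sendmsg.py publishes can be sanity-checked offline. This sketch (the sample values are hypothetical) mirrors how the script serializes a reading and confirms that the keys round-trip intact, which is exactly what must match your BigQuery field names:

```python
import json
import time

# Hypothetical sample reading, mirroring the payload built in sendmsg.py.
data = {"time": float(time.time()), "distance": 17.15}
payload = json.dumps(data).encode("utf-8")  # Pub/Sub messages are bytestrings

# Decoding the payload should reproduce the original keys and values.
decoded = json.loads(payload.decode("utf-8"))
print(sorted(decoded.keys()))  # ['distance', 'time']
```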

11. Running the Code

  1. Now set the path to your JSON file as the value of an environment variable called “GOOGLE_APPLICATION_CREDENTIALS”. You can do this with the following command:
export GOOGLE_APPLICATION_CREDENTIALS="(Path to JSON file)"

Remember, this environment variable lasts only for a single session; you will have to repeat this step in every new session.

Setup environment variable
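If you would rather not export the variable in every session, one option (a sketch; the path below is hypothetical, substitute your own) is to set it from inside Python, at the very top of sendmsg.py, before the Pub/Sub client is created:

```python
import os

# Hypothetical path to the downloaded service-account key file.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/home/pi/cloud/credentials.json"

# Any Google client library initialized after this point will pick it up.
print(os.environ["GOOGLE_APPLICATION_CREDENTIALS"])
```

Alternatively, you can add the export line to your ~/.bashrc so it is set automatically on login.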
  2. Now you can run your Python script from the terminal with:
python3 sendmsg.py
  3. Your output should look something like this.
Code output sample
  4. Your BigQuery table should look something like this:
BigQuery sample output

12. Visualize the data

To visualize the data graphically, we will use another Google product called Data Studio. It is an analytics tool that perfectly suits our need to graph sensor data, and it links seamlessly with our BigQuery database. Let’s begin.

  1. Open the Data Studio website via the following link: https://datastudio.google.com
  2. Click on the “Blank Report” button.
Blank report button
  3. Now click the “Create New Data Source” button in the bottom right corner of your screen.
Create new datasource button
  4. Hover your mouse over the BigQuery option and click “Select”.
Select button
  5. Select the correct project, dataset, and table, then click the blue “Connect” button at the top right of the screen.
choose project
  6. Change the data type of the time field to “Date Hour Minute” and the default aggregation of distance to “Average”, then click “Add To Report”.
Choose datatype
  7. Click “Insert” in the top menu and select “Time Series”. We get a chart like this:
output graph (no constraints)
  8. If you look carefully, you will notice that the graph plots the record count instead of the distance, so choose the “distance” field in the Metric category (instead of Record Count).
Change metric
  9. Once we do that, we get a graph like this:
required graph
  10. You can also change the aggregation from average distance to max, min, or many others.
change aggregation
  11. Here are a few examples:
  • Max :
Max graph
  • Min:
Min graph

There is a lot you can do with Data Studio; spend some time exploring the different chart types and data sources, as this will help you learn more.

Hopefully this tutorial gave you a clear insight into how to use Pub/Sub, BigQuery, and Data Studio in harmony with your Raspberry Pi to easily visualize sensor data.

Demo Video


Written By Sashreek Shankar

Hey reader! I am Sashreek, a 16 year old programmer who loves Robotics, IoT and Open Source. I am extremely enthusiastic about the Raspberry Pi and Arduino hardware as well. I also believe in sharing any knowledge I have so feel free to ask me anything 🙂
