Enterprise Android in a Hosted Cloud IoT Solution: Part 1: Google Cloud Platform (GCP)

diagrams.png

This is the first in a series of blog posts looking at integrating Zebra Android devices as IoT clients in a public cloud infrastructure.  For other posts in this series, please see the links below:

This post will walk through an example cloud IoT infrastructure using Google Cloud Platform, with Cloud IoT Core as the entry point.  A simulator will mimic device location, Android OS information and battery information, and this data will be stored for later retrieval.

This example is based on the community tutorial "Real time Data Processing with Cloud IoT Core" (https://cloud.google.com/community/tutorials/cloud-iot-rtdp), whose source code can be found here.

Disclaimers:

  • This is a very simple example but the aim is to show the principles of subscribing to an MQTT topic, taking action on data posted to that topic by an IoT client and storing the IoT data for later retrieval and analysis.
  • Any hosted cloud computing solution may incur costs.  Please ensure you are aware of the costs associated with the cloud components you are using and how to monitor and set spending limits.

Objectives:

  • Deploy a cloud function which will monitor the incoming battery level and log a warning to Stackdriver Logging when it goes below a certain threshold.
  • Deploy a streaming application to Cloud Dataflow that stores the device data in a BigQuery table
  • Run an MQTT client to provide simulated data
  • Data will be simulated and stored in BigQuery as follows:

| Field | Type | Example |
| --- | --- | --- |
| DeviceID | STRING | Test-device |
| DateTime | STRING | 2018-11-14T10:29:45 |
| Model | STRING | TC57 |
| Lat | STRING | 35.6602997 |
| Long | STRING | 139.7282743 |
| BattLevel | INT (percentage) | 20 |
| BattHealth | INT (health percentage) | 50 |
| OSVersion | STRING | 8.1.0 |
| PatchLevel | STRING | February 1, 2019 |
| ReleaseVersion | STRING | 01-10-09.00-OG-U00-STD |
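To make the table concrete: assuming the simulator publishes these fields as a single JSON document (the exact wire format is defined by the sample source code, and the key names below are illustrative assumptions), one telemetry message might look like this sketch:

```python
import json

# Hypothetical telemetry payload matching the fields in the table above.
# The real key names and wire format are defined by the sample Simulator code.
message = {
    "deviceId": "test-device",
    "dateTime": "2018-11-14T10:29:45",
    "model": "TC57",
    "lat": "35.6602997",
    "long": "139.7282743",
    "battLevel": 20,
    "battHealth": 50,
    "osVersion": "8.1.0",
    "patchLevel": "February 1, 2019",
    "releaseVersion": "01-10-09.00-OG-U00-STD",
}

# Serialise to the string that would be published over MQTT
payload = json.dumps(message)
print(payload)
```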

Source Code:

The source code discussed in this tutorial is available from GitHub.

Pre-requisites:

  • OpenSSL (not strictly needed for this tutorial but you will need it later to generate your own IoT client credentials)
  • Git

Configure a GCP project and enable APIs

  1. Navigate to https://console.cloud.google.com/ and sign in with a Google account
  2. Create a new project as described at https://cloud.google.com/resource-manager/docs/creating-managing-projects.  This demo will use the project name “ent-android-iot-server-gcp”

create project.png

  3. Switch to your newly created project and enable billing for your project as described at https://cloud.google.com/billing/docs/how-to/modify-project#enable_billing_for_a_project.  You can remove billing at any point without having to delete the associated project.
  4. Enable the following APIs for your project by selecting the hamburger icon and manually enabling each API:
  • Cloud IoT Core
  • Cloud Functions
  • Cloud Dataflow (Note: you will need to run through a tutorial and enable Google Cloud APIs, but don’t worry about installing the Cloud Dataflow samples on the Cloud Shell or setting up a cloud storage bucket etc.; we will do that later in this tutorial)

Create the cloud storage bucket

The cloud storage bucket will be used to hold the cloud function we define later

  1. Open the Cloud Storage console
  2. Create a Cloud Storage bucket.  The name must be unique across Cloud Storage; for consistency, this example will use “ent-android-iot-bucket-gcp” as the name of the storage bucket and will create it in the europe-west1 region.

create storage bucket.png

  3. Click ‘Create folder’ and enter a temporary folder name then click ‘Create’.  This tutorial will use 'temp'.

create storage bucket folder.png

Configure Cloud IoT Core

Cloud IoT Core is the entry point into the system for MQTT traffic sent from IoT devices (a simulator or Android device in this tutorial).  Create a topic in Cloud Pub/Sub and configure Cloud IoT Core to receive data from MQTT clients.
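For reference, the Cloud IoT Core MQTT bridge identifies each device by a structured client ID and forwards anything published to the device's telemetry topic on to the registry's Pub/Sub topic. A small sketch of how those paths are built, using the names chosen later in this tutorial:

```python
# Names used throughout this tutorial
project_id = "ent-android-iot-server-gcp"
cloud_region = "europe-west1"
registry_id = "ent-android-iot-registry-gcp"
device_id = "test-device"

# MQTT client ID format expected by the Cloud IoT Core MQTT bridge
client_id = (f"projects/{project_id}/locations/{cloud_region}"
             f"/registries/{registry_id}/devices/{device_id}")

# Telemetry published to this topic is forwarded to the registry's Pub/Sub topic
mqtt_topic = f"/devices/{device_id}/events"

print(client_id)
print(mqtt_topic)
```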

  1. Open the Cloud Pub/Sub console
  2. In the left navigation menu, click the “Topics” menu

pub sub topics.png

  3. Click 'Create a topic'.  In the Name box, enter a topic name and click 'Create'.  In this tutorial I am using device_telemetry.

pub sub topic creation.png

  4. Open the Cloud IoT Core console
  5. Click “Create a device registry”
  6. In the Registry ID box, specify a registry ID (this tutorial will use ‘ent-android-iot-registry-gcp’), select a nearby cloud region and the Pub/Sub topic which was previously created (device_telemetry in the case of this tutorial)

create registry.png

  7. Click Create

Generating a key pair to allow secure device communication

A sample key pair is provided at https://github.com/darryncampbell/enterprise-android-iot/tree/master/server-gcp/streaming for testing.  In production, you would need to generate your own public / private key pair as follows:

openssl req -x509 -newkey rsa:2048 -keyout rsa_private.pem -nodes -out rsa_cert.pem -subj "/CN=unused"

Convert your private key to pkcs8:

openssl pkcs8 -outform der -in rsa_private.pem -topk8 -nocrypt -out rsa_private_pkcs8

Delete the rsa_private.pem file that was generated; you will not need it.
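When a device later connects to the MQTT bridge, it authenticates with a JSON Web Token signed by this private key: the JWT carries an issue time, an expiry, and the GCP project ID as the audience. A stdlib-only sketch of the unsigned portion of such a token follows; a real client must append an RS256 signature over this string, made with rsa_private_pkcs8 using a crypto library (as the sample code does):

```python
import base64
import json
import time

def b64url(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

now = int(time.time())
header = {"alg": "RS256", "typ": "JWT"}
claims = {
    "iat": now,                            # issued-at time
    "exp": now + 3600,                     # token valid for one hour
    "aud": "ent-android-iot-server-gcp",   # audience = GCP project ID
}

# The signing input; the RS256 signature over this string (made with the
# private key) is appended after a '.' to form the complete JWT.
signing_input = b64url(json.dumps(header).encode()) + "." + \
                b64url(json.dumps(claims).encode())
print(signing_input)
```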

Create a Test Device

GCP needs to know which devices will be communicating with it.  In a production environment you would likely create these devices using a script, but in this tutorial we will just define a single device using the UI.  Each device requires a unique ID.

  1. From the Registry details screen, click ‘Create device’
  2. Give the device an ID, here ‘test-device’
  3. Select RS256_X509 for the public key format and paste the contents of rsa_cert.pem generated in the previous step
  4. Click ‘Create’

We now have a single device, ‘test-device’, under the previously created registry, ‘ent-android-iot-registry-gcp’.

registry details.png

Activate the Cloud shell and clone the github repository

Creating Cloud Functions and Dataflow jobs is best achieved via the command line.  Although it is possible to install the gcloud command line SDK and required tools locally, it is easier when getting started to use the cloud shell which Google provides.

  1. Open the Cloud shell using the icon on the top bar

Open cloud shell.png

  2. Clone the sample project from https://github.com/darryncampbell/enterprise-android-iot and navigate to the /server-gcp/ directory

cloud dataflow 1.png

Create a cloud function to alert whenever the battery drops below 15%

  1. Navigate to the /function folder in the cloned repository

cd function

  2. Deploy a function to Cloud Functions using a command of the form:

gcloud beta functions deploy iot --stage-bucket [BUCKET] --trigger-topic [TOPIC]

So, in our case the command would be:

gcloud beta functions deploy iot --stage-bucket ent-android-iot-bucket-gcp --trigger-topic device_telemetry

  3. Open the Cloud Functions console.

  4. Confirm that you created a function:

cloud function.png

Until we have actually sent data to the cloud function, nothing will show up in the logs.  The function that was deployed is here and compares the battery level with a hardcoded threshold (15%), logging an error if the battery level is below this threshold.
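The deployed function's core logic amounts to decoding the incoming Pub/Sub message and comparing the reported battery level against the 15% threshold. A Python sketch of that logic follows; the actual function in the repository may differ in language and structure, and the payload key names here are assumptions:

```python
import base64
import json

BATTERY_THRESHOLD = 15  # percent; hardcoded, as in the deployed function

def check_battery(event):
    """Pub/Sub-trigger handler sketch: event['data'] is base64-encoded JSON."""
    payload = json.loads(base64.b64decode(event["data"]))
    level = int(payload.get("battLevel", 100))
    if level < BATTERY_THRESHOLD:
        # In the real function this warning goes to Stackdriver Logging
        return f"Battery level too low on {payload.get('deviceId')}: {level}%"
    return None  # nothing to report

# Simulated low-battery Pub/Sub message
msg = {"data": base64.b64encode(
    json.dumps({"deviceId": "test-device", "battLevel": 10}).encode())}
print(check_battery(msg))
```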

Deploy a streaming application to Cloud Dataflow

This will store all the data sent from our devices into a BigQuery table.  Data is received from the previously defined Cloud Pub/Sub topic and streamed to BigQuery using a Dataflow job.

The format of the command to create the Dataflow job responsible for converting the received Pub/Sub content into a BigQuery table is as follows:

mvn compile exec:java -Dexec.mainClass=com.darryncampbell.enterprise.android.iot.Converter \
  -Dexec.args="--project=[PROJECT] --stagingLocation=gs://[BUCKET]/[BUCKET_FOLDER] \
  --topic=[TOPIC] --runner=DataflowRunner --streaming=true --numWorkers=1 \
  --zone=[ZONE] --workerMachineType=n1-standard-1"

  1. Navigate into the /streaming directory

cd ../streaming

  2. In the case of the tutorial the bucket name, bucket folder, topic name, and zone are specified as follows:

mvn compile exec:java -Dexec.mainClass=com.darryncampbell.enterprise.android.iot.Converter \
  -Dexec.args="--project=ent-android-iot-server-gcp \
  --stagingLocation=gs://ent-android-iot-bucket-gcp/temp --topic=device_telemetry \
  --runner=DataflowRunner --streaming=true --numWorkers=1 --zone=europe-west1-b \
  --workerMachineType=n1-standard-1"

cloud dataflow2.png

  3. Open the Cloud Dataflow console and confirm that a streaming job is running:

cloud dataflow job.png
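Conceptually, the Converter pipeline reads each Pub/Sub message, parses the JSON payload, and emits one BigQuery row per message into the device_data table. A Python sketch of that per-message transformation follows; the real implementation is the Java Converter class, and the payload key names here are assumptions based on the data table earlier in this post:

```python
import json

def message_to_row(payload):
    """Map one JSON telemetry message onto a flat BigQuery row (sketch)."""
    data = json.loads(payload)
    return {
        "DeviceID": data.get("deviceId"),
        "DateTime": data.get("dateTime"),
        "Model": data.get("model"),
        "Lat": data.get("lat"),
        "Long": data.get("long"),
        "BattLevel": data.get("battLevel"),
        "BattHealth": data.get("battHealth"),
        "OSVersion": data.get("osVersion"),
        "PatchLevel": data.get("patchLevel"),
        "ReleaseVersion": data.get("releaseVersion"),
    }

# One simulated message becomes one row
row = message_to_row(b'{"deviceId": "test-device", "battLevel": 20}')
print(row["DeviceID"], row["BattLevel"])
```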

Generate simulated IoT data (we will later replace this with a real device)

The server components should now all be configured successfully.  IoT Core is listening for MQTT clients connecting and sending data through Pub/Sub.  Cloud Functions and Dataflow are taking actions based on the content being published, logging to Stackdriver or storing in BigQuery respectively.

Before moving to a real device, we will simulate an MQTT client to check everything is working.  This simulator sends a single message and then exits; it will not continuously send messages.

  1. Back in the cloud shell, still in the cloned /server-gcp/streaming directory, type the following command:

mvn exec:java -Dexec.mainClass="com.darryncampbell.enterprise.android.iot.Simulator" \
  -Dexec.args="-project_id=[PROJECT] -registry_id=[REGISTRY] -device_id=[DEVICE_ID] \
  -private_key_file=[PRIVATE_KEY] -algorithm=[ALGORITHM] -cloud_region=[REGION] \
  -num_messages=10 -lat=35.660 -lng=139.728 ..."

(and so on for any other parameters the simulator needs)

Warning: The private key needs to correspond with the previously created public key.  As previously stated, test keys are available in the GitHub repository but in production you would generate your own keys.

In the case of this tutorial, the command is as follows (with some test data):

mvn exec:java -Dexec.mainClass="com.darryncampbell.enterprise.android.iot.Simulator" \
  -Dexec.args="-project_id=ent-android-iot-server-gcp -registry_id=ent-android-iot-registry-gcp \
  -device_id=test-device -private_key_file=./rsa_private_pkcs8 -algorithm=RS256 \
  -cloud_region=europe-west1 -num_messages=1 -model=TC57 -lat=35.660 -lng=139.728 -battLevel=50 \
  -battHealth=70 -osVersion=8.1.0 -patchLevel=September1-2019 -releaseVersion=00-85-99-44-22"

create simulator.png

Viewing simulated data in Cloud Functions

We expect our cloud function to trigger when it receives a battery level below 15%, so test that by invoking the simulator again, this time specifying a battery level of 10%.

  1. Send a simulated message indicating a low battery level

mvn exec:java -Dexec.mainClass="com.darryncampbell.enterprise.android.iot.Simulator" \
  -Dexec.args="-project_id=ent-android-iot-server-gcp -registry_id=ent-android-iot-registry-gcp \
  -device_id=test-device -private_key_file=./rsa_private_pkcs8 -algorithm=RS256 \
  -cloud_region=europe-west1 -num_messages=1 -model=TC57 -lat=35.660 -lng=139.728 -battLevel=10 \
  -battHealth=70 -osVersion=8.1.0 -patchLevel=September1-2019 -releaseVersion=00-85-99-44-22"

  2. Open the Cloud Functions console.
  3. To confirm that a function is processing data, click the More options icon on the right side of your function, and then click View logs:

cloud function view logs.png

  4. You should see a couple of logs to say the function was created
  5. You should also observe a log to indicate that the battery level was too low (it may take a minute or so to appear)

Cloud functions battery too low.png

You can also view and analyse these warnings in the Error console, including the ability to optionally receive Error reporting notifications.

  1. Open the Error console.
  2. Select the Battery level error for the test-device
  3. View the results dashboard; if you have simulated sufficient errors you should see a historic graph.

Error reporting.png

Verify that Cloud Dataflow is working:

  1. Open the Cloud Dataflow console.
  2. To confirm that a streaming Cloud Dataflow job is processing data, click the job ID:

Cloud Dataflow working.png

  3. Open BigQuery
  4. Click the ‘Compose Query’ button
  5. To confirm that the data is being stored, run the following query:

SELECT * FROM `ent-android-iot-server-gcp.iotds.device_data` LIMIT 10

Or (depending on the version of BigQuery in use):

SELECT * FROM [ent-android-iot-server-gcp.iotds.device_data] LIMIT 10

Note that the table is addressed by a combination of the project ID (ent-android-iot-server-gcp), the dataset defined by the Dataflow conversion function (iotds), and the table name, also defined by the Dataflow conversion function (device_data).

You should see an entry for each simulated message you sent (it may take a minute or so for these to appear).

BigQuery results.png

Conclusions / Next steps

Hopefully this tutorial has shown some of what you can do with IoT data using the Google Cloud Platform.  Referring back to the original tutorial on which this was based, Google provides some suggested next steps to learn more about GCP IoT, data processing and visualization:

Variables used:

For ease of reference, the values used in this tutorial are as follows:

| Variable | Value |
| --- | --- |
| Project ID | ent-android-iot-server-gcp |
| Cloud storage bucket name | ent-android-iot-bucket-gcp |
| Cloud storage bucket location | europe-west1 |
| Cloud storage bucket folder | temp |
| Pub/Sub topic | device_telemetry |
| Registry ID | ent-android-iot-registry-gcp |
| Registry region | europe-west1 |
| Device ID | test-device |
| Zone | europe-west1-b |