Where the air is crisp

Now that we have a working platform, working modules and a working network, it’s time to actually make something! We want people to see how simple it is both to integrate with us and to process the data they collect themselves. Let’s become our own customers!

A cool project

The climate of the world is obviously changing, and Trondheim Municipality has worked with us in Exploratory Engineering to make a new version (upgraded connectivity from LoRaWAN to NB-IoT) of the bus sensor placed on top of Tromsø buses a year back. The sensors report on temperature, humidity, CO2 equivalents, VOC equivalents, PM 10 and PM 2.5, as well as GPS coordinates. Some of these are more interesting than others, and while CO2 equivalents don’t really say much, it’s a cool metric to visualize.

APIs APIs APIs!

The IoT Platform has several well documented APIs which you, as a user of the platform, can use to your heart’s content. But we have something even better: client libraries. These libraries make it easy to integrate your own projects. Basically, you just configure the client with a URL (https://api.nbiot.engineering) and an API token and you’re good to go.

My own little backend

Thinking like a customer, I want my own backend where I store and process my data. This way I “own” my own data and don’t have to rely on external solutions. I looked through several options and ended up with a super simple solution written in Kotlin (Java, but fun), with the addition of the Javalin library to create the REST layer so I could expose the data to a frontend later.

Step 1 - Getting the data

Kotlin is interoperable with Java, meaning I could simply use the Java client and I’d be on my way.

// Configure the client with the API base URL and an API token
val NBIoTClient: Client = Client("https://api.nbiot.engineering", "awqfa443hahayouthoughtIpastedtherealtoken")

// Fetch the data points stored for the collection
val clientDataPoints = NBIoTClient.data("27dh1cf44jfi2f") // Not the real collectionId

That was it. I had my data! That was basically two lines of code. Fun!

Step 2 - Processing the data

The data the devices send is binary and encoded with Base64 (we talked a little bit about it in Making sense out of nonsense), so we need to massage the data for it to make sense. Fortunately, with Kotlin that’s quite easy.

// Fetch metadata about the device and wrap the payload bytes for reading at fixed offsets
val device = Device.fromDto(outputMessage.device())
val byteBuffer = ByteBuffer.wrap(outputMessage.payload())

// Map the raw bytes onto a processed data point
val dataPoint = TKAQDataPoint(
        collectionId = device.collectionId,
        deviceId = device.id,
        timestamp = outputMessage.received(),
        payload = outputMessage.payload(),
        time = getTime(byteBuffer),
        longitude = getLongitude(byteBuffer),
        latitude = getLatitude(byteBuffer),
        altitude = getAltitude(byteBuffer),
        relHumidity = getRelativeHumidity(byteBuffer),
        temperature = getTemperature(byteBuffer),
        co2Equivalents = getCO2Equivalents(byteBuffer),
        vocEquivalents = getVOCEquivalents(byteBuffer),
        pm10 = getPM10(byteBuffer),
        pm25 = getPM25(byteBuffer)
    )

private fun getTime(byteBuffer: ByteBuffer) = byteBuffer.getFloat(0)
// The GPS coordinates come in as radians, so we convert them to degrees
private fun getLongitude(byteBuffer: ByteBuffer) = (byteBuffer.getFloat(4) * (180 / Math.PI)).toFloat()
private fun getLatitude(byteBuffer: ByteBuffer) = (byteBuffer.getFloat(8) * (180 / Math.PI)).toFloat()
private fun getAltitude(byteBuffer: ByteBuffer) = byteBuffer.getFloat(12)
private fun getRelativeHumidity(byteBuffer: ByteBuffer) = byteBuffer.getFloat(16)
private fun getTemperature(byteBuffer: ByteBuffer) = byteBuffer.getFloat(20)
private fun getCO2Equivalents(byteBuffer: ByteBuffer) = byteBuffer.getShort(25).toInt()
private fun getVOCEquivalents(byteBuffer: ByteBuffer) = byteBuffer.getShort(27).toInt()
private fun getPM25(byteBuffer: ByteBuffer) = byteBuffer.getShort(33).toInt()
private fun getPM10(byteBuffer: ByteBuffer) = byteBuffer.getShort(35).toInt()

Here we take a single measurement, fetch some data about the device from it and interpret the bytes it sent us. Since we are the ones who programmed the device, we know what’s behind every byte, and we can map it into a new class which we’ve conveniently called TKAQDataPoint (Trondheim Kommune Air Quality, creative I know) to store the processed data.
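
For reference, TKAQDataPoint can be little more than a plain Kotlin data class mirroring the constructor call above. A minimal sketch (the timestamp type is an assumption; the real class in the project may differ slightly):

import java.time.Instant

// Sketch of the processed data point; the fields mirror the constructor call above
data class TKAQDataPoint(
    val collectionId: String,
    val deviceId: String,
    val timestamp: Instant,   // assuming outputMessage.received() maps to an Instant
    val payload: ByteArray,   // the raw bytes, kept around for reference
    val time: Float,
    val longitude: Float,
    val latitude: Float,
    val altitude: Float,
    val relHumidity: Float,
    val temperature: Float,
    val co2Equivalents: Int,
    val vocEquivalents: Int,
    val pm10: Int,
    val pm25: Int
)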

Step 3 - Store the processed data

Now that we have the processed data, it makes sense to persist it, since it would be quite an overhead to fetch the data from the IoT Platform and do the transformation every time someone requests it. I wanted a flexible framework that was also easily configurable, so I could conveniently transition from development to “production”.

JDBI is a simple, common interface for SQL databases: you just feed it a driver and a connection string and you’re on your way. This worked perfectly, as I’m using SQLite during development. For “production” purposes I generally lean towards PostgreSQL, which can now share the same queries as development.
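
A minimal sketch of what that wiring can look like, assuming the Jdbi v3 API, a hypothetical Config.dbConnectionString setting and a simplified table with only a few of the columns:

import org.jdbi.v3.core.Jdbi

// The connection string is the only thing that differs between
// development (SQLite) and "production" (PostgreSQL).
object DB {
    // e.g. "jdbc:sqlite:tkaq.db" locally, "jdbc:postgresql://host/tkaq" in production
    private val jdbi: Jdbi = Jdbi.create(Config.dbConnectionString)

    fun createTableIfNotExists() {
        jdbi.useHandle<Exception> { handle ->
            handle.execute(
                """CREATE TABLE IF NOT EXISTS tkaq_data_points (
                       device_id TEXT, timestamp BIGINT, temperature REAL,
                       co2_equivalents INTEGER, pm10 INTEGER, pm25 INTEGER)"""
            )
        }
    }

    fun addDataPoint(dataPoint: TKAQDataPoint) {
        jdbi.useHandle<Exception> { handle ->
            handle.createUpdate(
                "INSERT INTO tkaq_data_points (device_id, timestamp, temperature, co2_equivalents, pm10, pm25) " +
                    "VALUES (:deviceId, :timestamp, :temperature, :co2, :pm10, :pm25)"
            )
                .bind("deviceId", dataPoint.deviceId)
                .bind("timestamp", dataPoint.timestamp.toEpochMilli())
                .bind("temperature", dataPoint.temperature)
                .bind("co2", dataPoint.co2Equivalents)
                .bind("pm10", dataPoint.pm10)
                .bind("pm25", dataPoint.pm25)
                .execute()
        }
    }
}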

Step 4 - All together now

So this is what I came up with as a general happy-path gist of the application (roughly sketched in code after the list):

  1. Initiate a DB with given connection string if the DB doesn’t exist
  2. Fetch the latest stored TKAQDataPoint
    1. If none found, fetch data since time = 0
    2. If found, fetch since that TKAQDataPoint
  3. With the result from 2, process the data from a raw outputMessage to a TKAQDataPoint
  4. Store the processed data in the DB.
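
The same flow in code, where DB.createTableIfNotExists(), DB.latestDataPoint() and fetchOutputMessagesSince() are hypothetical helpers, and fromOutputMessage() is the same mapping used in the streaming example below:

// Happy-path sync flow, run once at start-up
fun syncDataPoints() {
    DB.createTableIfNotExists()                                   // 1. initiate the DB if it doesn't exist

    val since = DB.latestDataPoint()?.timestamp ?: Instant.EPOCH  // 2. latest stored point, or time = 0

    fetchOutputMessagesSince(since)                               // fetch the raw output messages since then
        .map { TKAQDataPoint.fromOutputMessage(it) }              // 3. process raw data into TKAQDataPoints
        .forEach { DB.addDataPoint(it) }                          // 4. store the processed data in the DB
}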

Step 5 - BONUS OBJECTIVE, LIVE DATA

One could run points 2-4 from the previous step on repeat with a delay, but that’s simply not cool. We want live data. Not only that, I wanted live data available in the frontend as well. Conveniently enough, you guessed it, the client has an offering for us here.

// Open a streaming connection to the platform and handle each message as it arrives
NBIoTClient.collectionOutput(collectionId) { handler ->
    handler.onConnect { session -> TKAQ_LOG.debug("Connected WebSocket session for $collectionId @${session.localAddress}") }
    handler.onClose { code, reason -> TKAQ_LOG.debug("Closed with code $code due to $reason") }
    handler.onMessage {
        try {
            // Points 2-4 from step 4, but for a single message and on the fly
            val tkaqDataPoint = TKAQDataPoint.fromOutputMessage(it)

            DB.addDataPoint(tkaqDataPoint)
            WebSocketHandler.broadcastTKAQDataPoint(tkaqDataPoint)

            TKAQ_LOG.debug("Added data point from device ${it.device().id()}")
        } catch (ex: IllegalArgumentException) {
            TKAQ_LOG.warn("Could not map stream data to TKAQ data point", ex)
        }
    }
    handler.onError { session, throwable ->
        TKAQ_LOG.error("Received error from websocket.", throwable)
        // Close the broken session and set up a fresh stream
        if (session.isOpen) {
            session.close(CloseStatus(1006, "Got error from WebSocket"))
            session.disconnect()
        }
        reconnectCollectionStream(collectionId)
    }
}

This makes the client set up a WebSocket connection to the IoT Platform and, for each point received, do points 2-4 from step 4 on the fly. If you look closely at the code example, you’ll also see the line WebSocketHandler.broadcastTKAQDataPoint(tkaqDataPoint). This line is part of a bigger mechanism: it lets us broadcast the new data point to WebSockets connected directly to our backend.
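
The handler on our side doesn’t need to be much more than a registry of connected sessions. A rough sketch, assuming Javalin 2.x WebSockets (WsHandler/WsSession) and a hypothetical toJson() helper for serializing the data point:

import io.javalin.websocket.WsHandler
import io.javalin.websocket.WsSession
import java.util.concurrent.ConcurrentHashMap

object WebSocketHandler {
    // Sessions from frontends connected to our /stream endpoint
    private val sessions = ConcurrentHashMap.newKeySet<WsSession>()

    // Wired into the route table in the APIs section below via ws(WebSocketHandler::handle)
    fun handle(ws: WsHandler) {
        ws.onConnect { session -> sessions.add(session) }
        ws.onClose { session, _, _ -> sessions.remove(session) }
    }

    // Push a freshly processed data point to every connected frontend
    fun broadcastTKAQDataPoint(dataPoint: TKAQDataPoint) {
        sessions.forEach { it.send(toJson(dataPoint)) }  // toJson() is a stand-in serializer
    }
}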

CO2 bad, let’s make it look awesome

Now for the fun part: let’s visualize the data we have. We will focus on CO2 for the heatmap visualization and view the other measurements as charts underneath. But before we can make cool visualizations, we need a way to get the data out.

APIs

Javalin makes it remarkably simple to create and expose APIs as well as WebSockets.

val app = Javalin.create().apply {
    port(Config.port.toInt())
    enableStaticFiles("/public")
    enableSinglePageMode("/", "/public/index.html")
    enableCorsForAllOrigins()
    defaultContentType("application/json")
}

app.routes {
    before("*") { ctx ->
        if (ctx.header("x-forwarded-proto") == "http") {
            val queryString = ctx.queryString()?.let { query -> "?$query" } ?: ""
            ctx.redirect("https://${ctx.header("host")}${ctx.path()}$queryString", 301)
        }
    }
    get("hc") {
        it.status(200).result("pong")
    }
    path("api/v1") {
        path("collections") {
            get(CollectionController::getCollections)
            path("/:collection-id") {
                get(CollectionController::getCollection)
                path("/stream") {
                    ws(WebSocketHandler::handle)
                }
                path("data") {
                    get(DataController::getDataForCollection)
                }
                path("/devices") {
                    get(DevicesController::getDevicesForCollection)
                    path("/:device-id") {
                        get(DevicesController::getDeviceInCollection)
                        path("/data") {
                            get(DataController::getDataForCollectionDevice)
                        }
                    }
                }
            }
        }
    }
}

This is the whole code enabling the API layer. We hierarchically define the endpoints and connect them to the corresponding code which runs when the endpoints are hit. The controller code has access to the storage layer and acts on the request. Clean and simple stuff.
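
To make it concrete, a controller function can be as small as this (a sketch; the real DataController may look different, and DB.dataPointsForCollection() is a hypothetical query helper on the storage layer):

import io.javalin.Context

object DataController {
    // GET api/v1/collections/:collection-id/data
    fun getDataForCollection(ctx: Context) {
        val collectionId = ctx.pathParam("collection-id")
        ctx.json(DB.dataPointsForCollection(collectionId))  // serialized with the default application/json content type
    }
}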

Frontend

We’re using Aurelia with TypeScript to create a simple frontend which consumes the exposed APIs and visualizes the data both as a heatmap (through Leaflet) and as simple charts (through Plotly). We chose these libraries for their performance when visualizing up to 100 000 data points.

The result

We now have a fully working solution, built as if by an external developer. We have sensors which use our NB-IoT network and send data to our IoT Platform. From there the data is routed to our own backend, where we process the payloads and store them “locally” in our own database. From there we expose our own APIs, which are consumed by a frontend. The frontend presents the data in all its visualization glory.

Phew. This is the result:

[dashboard_demo: screenshot of the live dashboard with the heatmap and charts]

To sum up the whole experience: it was surprisingly simple. There’s nothing stopping you from changing the nature of every step, be it how the data is processed, where and in what format it is stored, or how it is visualized. It’s all up to you, and everything is available right at your fingertips.

As always, happy hacking! :)

You can find the code for the project at Exploratory Engineering’s GitHub. The final result is hosted at Heroku. The only component that isn’t in Heroku’s free tier is the database, due to the number of data points available (over 10 000). The hosted solution is mainly to show how you can get something like this from the prototype stage to a PoC almost for free (it will break if put through heavy load).