Uendi Hoxha

Project Overview: Real-Time Smart Building Monitoring System with Amazon Kinesis

Architecture Overview

Components
IoT Sensors - High-fidelity sensors monitor environmental variables such as temperature, humidity, light levels, and occupancy.
Kinesis Data Stream - Collects real-time data from the various IoT sensors deployed in the building.
Amazon SQS - Acts as a buffer to handle traffic spikes by queuing incoming sensor data, ensuring reliable message delivery and smoothing out the data flow to the downstream Lambda function.
AWS Lambda - Processes the incoming data, applies transformations, and performs analytics.
DynamoDB - Stores processed data for structured queries and historical analysis.
Data Visualization Tools - Grafana or Amazon Athena for analyzing sensor metrics and insights.

(Architecture diagram)

Use Case Scenarios

1. Predictive Maintenance
Utilize real-time environmental data and historical trends to predict when equipment (like HVAC systems) may require maintenance. By analyzing temperature fluctuations and operational patterns, the system can forecast potential failures, allowing for proactive maintenance scheduling.

2. Energy Optimization
Collect data on occupancy and environmental conditions to dynamically adjust HVAC systems, optimizing energy consumption and reducing costs. For example, if sensors detect that a room is unoccupied, the HVAC system can be adjusted accordingly (a minimal rule sketch follows the use cases below).

3. Space Utilization
Monitor occupancy data in real-time to understand space utilization, enabling better planning and resource allocation within the building. Analyzing patterns over time can inform decisions about office layout or space reallocation.
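To make the energy optimization idea concrete, here is a minimal rule sketch. It is illustrative only: set_hvac_setpoint is a hypothetical helper standing in for whatever device-control integration (for example, AWS IoT Core) the building actually uses, and the setpoint values are assumptions.

OCCUPIED_SETPOINT_C = 22.0
UNOCCUPIED_SETPOINT_C = 26.0

def set_hvac_setpoint(room_id: str, setpoint_c: float) -> None:
    # Placeholder: in a real deployment this would publish a command
    # to the HVAC controller (for example, via AWS IoT Core).
    print(f"Setting {room_id} setpoint to {setpoint_c} C")

def adjust_hvac(room_id: str, occupancy: bool) -> None:
    # Relax the setpoint when the room is empty to save energy.
    if occupancy:
        set_hvac_setpoint(room_id, OCCUPIED_SETPOINT_C)
    else:
        set_hvac_setpoint(room_id, UNOCCUPIED_SETPOINT_C)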

Data Flow and Processing

Data Ingestion
IoT sensors send real-time data (temperature, humidity, light level, occupancy) to the Kinesis Data Stream.
Sensor Data Format:

{
  "sensor_id": "sensor_1",
  "temperature": 22.5,
  "humidity": 45.0,
  "light_level": 70,
  "occupancy": true,
  "timestamp": 1694658000
}
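For context, a sensor gateway could publish this payload to the stream with the Kinesis put_record API. A minimal sketch follows; the stream name is an assumption, and partitioning by sensor_id is one reasonable choice, not a requirement of the original setup.

import json
import time
import boto3

kinesis = boto3.client('kinesis')

# Hypothetical stream name; substitute the actual stream.
STREAM_NAME = 'smart-building-sensor-stream'

def publish_reading(reading):
    # Partitioning by sensor_id keeps each sensor's readings ordered
    # within a single shard.
    kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps(reading).encode('utf-8'),
        PartitionKey=reading['sensor_id']
    )

publish_reading({
    "sensor_id": "sensor_1",
    "temperature": 22.5,
    "humidity": 45.0,
    "light_level": 70,
    "occupancy": True,
    "timestamp": int(time.time())
})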

Data Processing with AWS SQS
The Kinesis Data Stream triggers a Lambda function, which sends the data to an SQS queue. Another Lambda function, triggered by the SQS queue, processes the messages, applying transformations such as unit conversions or data normalization.

import os
import json
import base64
import boto3

# Queue URL comes from an environment variable
sqs = boto3.client('sqs')
queue_url = os.environ['SQS_QUEUE_URL']

def lambda_handler(event, context):
    for record in event['Records']:
        # Kinesis record payloads arrive base64-encoded
        payload = json.loads(base64.b64decode(record['kinesis']['data']))

        # Data validation logic
        if validate_data(payload):
            transformed_data = transform_data(payload)
            # Send data to SQS for further processing
            send_message_to_sqs(transformed_data)
        else:
            print(f"Invalid data: {payload}")

    return {
        'statusCode': 200,
        'body': json.dumps('Data processed successfully')
    }

def validate_data(data):
    return 'sensor_id' in data and 'temperature' in data

def transform_data(data):
    # Keep values JSON-serializable here; conversion to DynamoDB's
    # Decimal type happens in the Lambda that writes the item.
    return {
        'sensor_id': data['sensor_id'],
        'temperature': float(data['temperature']),
        'humidity': float(data['humidity']),
        'light_level': int(data['light_level']),
        'occupancy': bool(data['occupancy']),
        'timestamp': int(data['timestamp'])
    }

def send_message_to_sqs(data):
    # Send transformed data to SQS
    try:
        response = sqs.send_message(
            QueueUrl=queue_url,
            MessageBody=json.dumps(data)
        )
        print(f"Message sent to SQS: {response['MessageId']}")
    except Exception as e:
        print(f"Error sending message to SQS: {e}")
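The second Lambda function in the chain, the one triggered by the SQS queue, is not shown above. A minimal sketch of what it might look like, assuming the same DYNAMODB_TABLE_NAME environment variable and the message body produced by the forwarder:

import os
import json
import boto3
from decimal import Decimal

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table(os.environ['DYNAMODB_TABLE_NAME'])

def lambda_handler(event, context):
    for record in event['Records']:
        # Each SQS message body is the JSON document produced by the
        # Kinesis-triggered forwarder; parse_float=Decimal converts floats
        # into the Decimal type DynamoDB expects for Number attributes.
        item = json.loads(record['body'], parse_float=Decimal)
        table.put_item(Item=item)

    return {
        'statusCode': 200,
        'body': json.dumps('Messages stored in DynamoDB')
    }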

Data Storage
Processed data is stored in DynamoDB for structured querying and historical analysis. The data structure allows efficient retrieval and aggregation of sensor data.
DynamoDB Table Schema:
Table Name: SensorData
Partition Key: sensor_id (String)
Sort Key: timestamp (Number)
Attributes: temperature (Number), humidity (Number), light_level (Number), occupancy (Boolean)
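For reference, a table with this schema could be created with boto3 along these lines; on-demand billing here is an assumption, not a requirement of the design.

import boto3

dynamodb = boto3.client('dynamodb')

# Create the SensorData table with sensor_id as partition key and
# timestamp as sort key. Non-key attributes (temperature, humidity,
# light_level, occupancy) are schemaless and need no declaration.
dynamodb.create_table(
    TableName='SensorData',
    KeySchema=[
        {'AttributeName': 'sensor_id', 'KeyType': 'HASH'},
        {'AttributeName': 'timestamp', 'KeyType': 'RANGE'}
    ],
    AttributeDefinitions=[
        {'AttributeName': 'sensor_id', 'AttributeType': 'S'},
        {'AttributeName': 'timestamp', 'AttributeType': 'N'}
    ],
    BillingMode='PAY_PER_REQUEST'
)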

DynamoDB's query API supports structured queries on the collected data, for example:

from boto3.dynamodb.conditions import Key, Attr

# Fetch readings from sensor_1 where the room was occupied
response = table.query(
    KeyConditionExpression=Key('sensor_id').eq('sensor_1'),
    FilterExpression=Attr('occupancy').eq(True)
)

Purposes of Analyzing Collected Data

Analyzing the collected data serves multiple purposes, enhancing the overall efficiency and management of the smart building system. Historical temperature and humidity data, along with occupancy patterns, enable dynamic adjustments to HVAC settings via AWS IoT, ensuring optimal comfort while conserving energy.

By correlating sensor data with equipment operational metrics, the system can identify trends that precede potential failures, facilitating proactive maintenance scheduling.

Implementing thresholds for temperature anomalies in DynamoDB allows for triggering alerts using AWS SNS when limits are exceeded, thus preventing equipment damage.
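As an illustration, the SQS-consumer Lambda could check each reading against a threshold and publish to an SNS topic before writing the item. The topic ARN environment variable and the 30 °C limit below are assumptions:

import os
import json
import boto3

sns = boto3.client('sns')

# Assumed values for illustration only.
TEMPERATURE_LIMIT_C = 30.0
ALERT_TOPIC_ARN = os.environ['ALERT_TOPIC_ARN']

def check_temperature_alert(item):
    # Publish an alert when a reading exceeds the configured limit.
    if float(item['temperature']) > TEMPERATURE_LIMIT_C:
        sns.publish(
            TopicArn=ALERT_TOPIC_ARN,
            Subject='Temperature threshold exceeded',
            Message=json.dumps({
                'sensor_id': item['sensor_id'],
                'temperature': float(item['temperature']),
                'timestamp': int(item['timestamp'])
            })
        )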

Additionally, monitoring energy usage patterns relative to occupancy levels drives energy-efficient upgrades, with reports created in Amazon QuickSight to visualize energy consumption against occupancy over time. This analysis also identifies under-utilized areas through aggregation queries in DynamoDB, informing decisions about office layout and resource allocation.
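DynamoDB has no server-side aggregation, so an occupancy-rate style report means querying a sensor's items for a time window and aggregating client-side. A rough sketch, assuming the table and schema above:

from boto3.dynamodb.conditions import Key

def occupancy_rate(table, sensor_id, start_ts, end_ts):
    # Pull readings for the sensor in the window and compute the fraction
    # of readings where the room was occupied.
    # (Pagination via LastEvaluatedKey omitted for brevity.)
    response = table.query(
        KeyConditionExpression=Key('sensor_id').eq(sensor_id)
        & Key('timestamp').between(start_ts, end_ts)
    )
    items = response['Items']
    if not items:
        return 0.0
    occupied = sum(1 for item in items if item['occupancy'])
    return occupied / len(items)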

Furthermore, historical data is stored for longitudinal studies, with AWS Glue used to periodically batch process data from DynamoDB into Amazon S3 for deeper analytical queries via Amazon Athena.
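Once Glue has landed the data in S3 as a table, the Athena side can be as simple as starting a query execution. The database, table, and output-location names below are assumptions for illustration:

import boto3

athena = boto3.client('athena')

# Hourly average temperature per sensor from the (hypothetical) Glue table.
query = """
    SELECT sensor_id,
           date_trunc('hour', from_unixtime("timestamp")) AS hour,
           avg(temperature) AS avg_temperature
    FROM smart_building.sensor_data
    GROUP BY 1, 2
    ORDER BY 1, 2
"""

athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={'Database': 'smart_building'},
    ResultConfiguration={'OutputLocation': 's3://smart-building-athena-results/'}
)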

Lastly, anomaly detection algorithms can be implemented using Amazon SageMaker, flagging unusual conditions based on historical data patterns to enhance safety and operational reliability.

Time for a demo...

(Screenshot: chatbot answering a real-time temperature query)
The real-time temperature is streamed via Kinesis and processed by AWS Lambda. The processed temperature data is then queried from DynamoDB by the chatbot, which provides the response.

The historical data for the conference room is stored in DynamoDB. AWS Lambda processed and stored this data when it was collected yesterday. The chatbot queries this stored data to provide the historical temperature.
This scenario aligns with the "Predictive Maintenance" and "Space Utilization" use cases from the architecture, where the system can analyze trends and historical patterns.


(Screenshot: chatbot fallback response for an out-of-scope question)
This question is outside the scope of the data being collected and analyzed by the system. The chatbot appropriately responds with a fallback message, indicating its primary focus is on sensor-related data.


(Screenshot: chatbot checking room availability from occupancy data)
While this question goes beyond the basic temperature or environmental monitoring capabilities, it can be tied to an extended use case where occupancy sensors (part of the IoT network) could detect whether a room is occupied. This information could then be used to check availability. In this scenario, the chatbot is querying the occupancy data stored in DynamoDB for a booking system.
