Light Gesture Recognizer With TinyML

DESCRIPTION:

This project further demonstrates the PicUNO board's capabilities by running a Machine Learning (ML) model that interprets gestures from light patterns. The system uses a Light Dependent Resistor (LDR) to detect dynamic changes in light intensity and classifies them in real time into distinct gestures such as cover, blink, and ambient.

TECHNOLOGY STACK:

Hardware: PicUNO board, HW-486 Light Dependent Resistor (LDR) Module.
ML Platform: Edge Impulse (a TinyML platform for data collection, training, and deployment).
Development Environment: Arduino® IDE (for C++ programming and deployment to the board).
Connectivity: Edge Impulse CLI for bridging the hardware with the cloud platform.

HARDWARE REQUIRED:

  • PICUNO Microcontroller board
  • 1 × HW-486 Light Dependent Resistor (LDR) module
  • Jumper wires
  • USB cable

CIRCUIT DIAGRAM:

(Circuit diagram image: Light Gesture Recognizer)

LDR MODULE:

  • Connect the VCC / + pin to the 5V pin on the board.
  • Connect the GND / - pin to a GND pin on the board.
  • Connect the Signal / S pin to analog pin A0 (GPIO 26).

SCHEMATIC:

LDR VCC / (+) → 5V

LDR GND / (-) → GND

LDR Signal / (S) → A0 (GPIO 26)

ENVIRONMENT AND SOFTWARE SETUP:

1. Flash PicUNO Firmware:

  • Download the special Edge Impulse firmware for the RP2040 from https://cdn.edgeimpulse.com/firmware/raspberry-rp2040.zip
  • Put the PicUNO into bootloader mode: hold down the BOOTSEL button, plug it into your computer, then release the button. It will appear as a USB drive.
  • Drag and drop the downloaded firmware file (.uf2) onto that drive. The board will automatically reboot.

2. Node.js Installation:

  • Install the Node.js LTS version from the official website: https://nodejs.org. Run the installer, accepting the default options.
  • After installation, close and reopen the Command Prompt and run these commands one after the other: node -v and npm -v.
  • If both print version numbers (e.g., v18.17.0), Node.js (LTS) is installed correctly, along with the npm package manager it provides.

3. Build Tools Installation:

  • The Edge Impulse CLI compiles native dependencies during installation, so it needs system build tools. On Windows this typically means installing Python 3 and the Visual Studio Build Tools (with the "Desktop development with C++" workload). Restart your computer after the installation completes.

4. Final Installation:

  • After restarting, open a new Command Prompt as Administrator and run the installation command:
    npm install -g edge-impulse-cli

5. Setting up Edge Impulse:

  • Create an Account: Go to edgeimpulse.com and sign up for a free account. Once you're in, create a new project and name it "Light Gesture Recognizer."
  • Set a Password in Edge Impulse Studio: From your project dashboard, open the menu on the right side and click "Account settings". In the "Set password" section, create a new password for your account and save it.
  • Connect the PicUNO to Edge Impulse: Ensure the PicUNO board is running the Edge Impulse firmware. Open a new Command Prompt (it does not need to be an administrator prompt) and run the daemon command: edge-impulse-daemon --clean
  • Follow the on-screen prompts to log in to your account and select your project. The daemon prints a link; opening it takes you to Edge Impulse Studio. Your PicUNO should now appear as a connected device under the Devices tab in Edge Impulse Studio.

ML DEVELOPMENT MODEL WORKFLOW (EDGE IMPULSE):

1. Data Collection:

  • Go to the Data acquisition tab in your Edge Impulse project.
  • In the "Collect data" box on the right:
    • Device: Select picuno.
    • Label: Type the name of the gesture you are about to perform.
    • Sample length: Set to 2000 (2 seconds).
    • Sensor: Select "ADC sensor". This will record data from all available analog pins, including the A0 you've connected.
  • Collect gesture samples (20-30 of each):
    • Ambient: The normal, steady light in the room.
    • Cover: Covering the LDR with a hand or placing it in a dark space.
    • Blink: Pulsing a flashlight on and off over the sensor.

2. Impulse Design:

  • A pipeline was designed to process the time-series data from the LDR:
  • Create Impulse: Go to "Create impulse".
    • Input block: Time series data.
    • Processing block: Spectral Analysis.
    • Learning block: Classification (Keras).
    • Click Save Impulse.
  • Processing Block: Go to the "Spectral features" tab and click Generate features. Spectral Analysis was chosen to extract key features (like frequency and power) from the light signal.
  • Learning Block: Go to the "Classifier" tab and click Start training. A Classification (Keras) block was used to create a neural network to learn the patterns from the extracted features.

3. Model Training:

After generating features, the neural network was trained. The model achieved 100% accuracy on the validation set, demonstrating a perfect ability to distinguish between the three distinct light patterns.

4. Testing the Model:

  • On the left menu, go to the "Live classification" tab.
  • Your PicUNO device should be connected. Click "Start sampling" and perform one of your gestures.
  • The model will analyze the light pattern in real time and show you its prediction.

DEPLOYMENT TO PICUNO:

1. Deploy Model:

Go to the "Deployment" tab, select "Arduino® library", and click Build. This will generate a .zip file containing the complete, optimized model and all necessary functions.

2. Arduino® Setup:

  • Open a New Sketch in Arduino® IDE.
  • Go to Sketch → Include Library → Add .zip library… and select the downloaded .zip file.
  • You can confirm the library was added under File → Examples → Light_gesture_Recognizer_inferencing.

FINAL CODE:

/* Includes ---------------------------------------------------------------- */
// IMPORTANT: Make sure this line matches the name of your downloaded library.
// It should be something like "Light_Gesture_Recognizer_inferencing.h"
#include <Light_gesture_Recognizer_inferencing.h>

/* Constant defines -------------------------------------------------------- */
// The generated library (model_metadata.h) normally defines these already;
// the fallbacks below only take effect if it does not.
#ifndef EI_CLASSIFIER_RAW_SAMPLES_PER_FRAME
// The model expects a single input value from the LDR on pin A0
#define EI_CLASSIFIER_RAW_SAMPLES_PER_FRAME   1
#endif

// The number of times we sample the sensor per second (must match your training)
#define FREQUENCY_HZ                            100
#ifndef EI_CLASSIFIER_INTERVAL_MS
#define EI_CLASSIFIER_INTERVAL_MS               (1000 / FREQUENCY_HZ)
#endif

/* Private variables ------------------------------------------------------- */
static const bool debug_nn = false; // Set to true to see features and timings

/**
 * @brief      Arduino® setup function
 */
void setup()
{
    // Initialize serial port for printing the results
    Serial.begin(115200);
    while (!Serial);
    Serial.println("Edge Impulse Light Gesture Recognizer");
}

/**
 * @brief      Get data from the sensor and run inferencing
 */
void loop()
{
    ei_printf("\nStarting inferencing in 2 seconds...\n");
    delay(2000);
    ei_printf("Perform a light gesture now...\n");

    // Create a buffer to hold the raw sensor data
    float buffer[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE] = { 0 };

    // Collect data for one full window (e.g., 2 seconds at 100Hz)
    for (size_t ix = 0; ix < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; ix += EI_CLASSIFIER_RAW_SAMPLES_PER_FRAME) {
        // Compute the deadline for the next sample: now + one sampling interval
        uint64_t next_tick = micros() + (uint64_t)(EI_CLASSIFIER_INTERVAL_MS * 1000);

        // Read the analog value from the LDR sensor connected to pin A0
        buffer[ix] = (float)analogRead(A0);

        // Wait until the next sampling interval is due to maintain the frequency.
        // Use a signed difference so the value cannot underflow if we run late.
        int64_t wait_time = (int64_t)next_tick - (int64_t)micros();
        if (wait_time > 0) {
            delayMicroseconds((unsigned int)wait_time);
        }
    }

    // Turn the raw buffer into a signal which we can classify
    signal_t signal;
    int err = numpy::signal_from_buffer(buffer, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);
    if (err != 0) {
        ei_printf("Failed to create signal from buffer (%d)\n", err);
        return;
    }

    // Run the classifier to get a prediction
    ei_impulse_result_t result = { 0 };
    err = run_classifier(&signal, &result, debug_nn);
    if (err != EI_IMPULSE_OK) {
        ei_printf("ERR: Failed to run classifier (%d)\n", err);
        return;
    }

    // Print the prediction results to the Serial Monitor
    ei_printf("Predictions:\n");
    for (uint16_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
        ei_printf("  %s: %.5f\n", result.classification[i].label, result.classification[i].value);
    }
}

Code Explanation:

#include <Light_gesture_Recognizer_inferencing.h> - Includes the machine learning model library generated by Edge Impulse.


FREQUENCY_HZ = 100 - Samples the LDR sensor 100 times per second to match the training data frequency.


buffer[ix] = (float)analogRead(A0) - Reads the analog value from the LDR sensor and stores it in the buffer for classification.


run_classifier(&signal, &result, debug_nn) - Runs the trained neural network model on the collected sensor data and stores predictions in the result structure.


result.classification[i].value - Contains the confidence score (0.0 to 1.0) for each gesture class (ambient, cover, blink).

RESULTS:

With the Arduino® Serial Monitor open at 115200 baud, the live classification results were displayed, and the board accurately predicted the performed gestures.

CONCLUSION:

This project successfully demonstrated that the PicUNO board is fully capable of running a modern, real-time machine learning model. The use of the Edge Impulse platform streamlined the development process, from data collection to deployment. The final system accurately classifies light gestures, proving the viability of the hardware for TinyML applications.
