Using Edge Impulse AI Inferences to Trigger Events in Arduino C++
By adding a few lines of code to an Edge Impulse Arduino library, you can use AI to interact with the world around you.
This tutorial assumes that you are familiar with Edge Impulse, and that you have successfully deployed an Arduino library. In this example I will be using an audio model that I trained to recognize the words “on” and “off”, running on an Arduino Nano 33 BLE Sense.
First things first — navigate to your Arduino .ino file and open it in the Arduino IDE (in my case I’ll be looking under nano_ble33_sense_microphone in the examples directory). Our primary area of interest will be the loop() function, around line 68. Let’s take a look:
Before the block of prints near the bottom, we can add a few lines of code to take advantage of the predictions just made.
How are the predictions stored?
After the recording, the predictions are stored in a struct named result (of type ei_impulse_result_t) and can be accessed in the following way:
// Label
result.classification[i].label

// Value
result.classification[i].value
If you’re not new to coding, you probably already see where this is going: what if we looped over all of the predictions and printed those that pass a certain confidence threshold?
Now that you have a confident inference, you can use its value to trigger a variety of events. In this case, I’ll be using it to turn a lamp on/off. You could just as easily use an LED (in fact, the setup would be the same).
Now for the results:
Here is the full loop() function — paste it into your code, adjust it to suit your needs, and share the results! Good luck!