Smart Trash Can
Building a Smarter Trash Can using ARTIK 710 and Amazon Rekognition

In this project you'll create a smart trash can system that distinguishes recyclables from trash by using an ARTIK 710 development kit and the Amazon Rekognition API.
There are three trash cans and a camera. When an object is presented to the camera, the system determines the object type and opens the corresponding trash can lid.
How it works

A USB camera is hooked up to the ARTIK 710 kit. Users place an item in front of the camera and press a button to photograph it. A Python application then uploads the picture to an S3 bucket in your AWS account.
When the upload is done, Amazon Rekognition is invoked to identify the objects in the uploaded picture. The results are returned in JSON format, and a parser application keeps the top 3 results by confidence. Receptacle lids open as follows:
- Receptacle #1: paper/document/diary
- Receptacle #2: bottles/cans/tin
- Receptacle #3: all other results
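As a sketch of what such a parser might do, the top-confidence labels can be mapped to a receptacle number. The keyword sets and function name below are illustrative, not taken from the original project:

```python
# Keyword sets are illustrative -- tune them to the labels
# Rekognition actually returns for your items.
PAPER_KEYWORDS = {"paper", "document", "diary"}
BOTTLE_KEYWORDS = {"bottle", "can", "tin"}

def choose_receptacle(labels):
    """Map top-confidence Rekognition labels to a receptacle number.

    `labels` is a list of (name, confidence) tuples, e.g. the top 3
    results parsed from the Rekognition JSON response.
    """
    for name, _confidence in labels:
        word = name.lower()
        if word in PAPER_KEYWORDS:
            return 1          # paper / document / diary
        if word in BOTTLE_KEYWORDS:
            return 2          # bottle / can / tin
    return 3                  # everything else

print(choose_receptacle([("Bottle", 97.2), ("Drink", 88.0)]))  # prints 2
```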
ARTIK 710 GPIO pins control the lid actions.
You can purchase trash cans with motion-detection capability from a store such as Home Depot and adapt them for this use case. The mechanical parts of each trash can are kept as-is; disable the motion detection by unplugging its connectors from the circuit. The motor is connected directly to the L293D IC on the breadboard as shown in the diagram.
Follow the AWS Services articles to get an AWS account, set up the AWS S3 photo bucket, and try out AWS Rekognition.
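Once the bucket exists, the captured photo can be pushed to S3 with boto3. This is a minimal sketch under assumptions of my own: the bucket name is a placeholder, and the timestamped `photo_key` helper is an illustrative choice for keeping captures unique rather than something the original project specifies:

```python
import time

def photo_key(prefix="photos/", ts=None):
    """Build an S3 object key for this capture (timestamped so
    successive captures don't overwrite each other)."""
    ts = ts if ts is not None else int(time.time())
    return "%simg_trashcan_%d.jpg" % (prefix, ts)

def upload_photo(path, bucket="my-trashcan-photos"):
    """Upload the captured photo to S3 so Rekognition can read it."""
    import boto3  # imported lazily so photo_key() works without boto3
    s3 = boto3.client("s3")
    key = photo_key()
    s3.upload_file(path, bucket, key)
    return key
```

Usage would be `upload_photo("img_trashcan.jpg")` after the camera script runs, with AWS credentials already configured on the board.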
A generic USB web camera is recommended for this project, connected to a USB port on the ARTIK 710. As long as the camera is UVC-compatible, the OS recognizes it automatically. The camera setup and a full view of the control section are shown.
Using Python to drive the camera, the code is pretty simple. Put the lines below into a file named `photo.py`; Node-RED will execute `python photo.py` every time you push the button to take a picture.

```python
import cv2

# Capture a single frame from the first USB camera and save it to disk.
cap = cv2.VideoCapture(0)
ret, frame = cap.read()
if ret:
    cv2.imwrite("img_trashcan.jpg", frame)
cap.release()
```
Refer to the OpenCV Python article if you'd like additional information.
Seeed interface board setup
The ARTIK 710 needs to work with an interface board, which breaks out the external interface signals to larger connectors. The GPIO pin mappings for ARTIK 710/530 are used as follows, with direction control as noted below.
|GPIO pins|Direction|Function|
|---|---|---|
|GPIO 0, GPIO 1|out|Motor (paper / cardboard)|
| |out|Motor (bottle / can)|
|GPIO 25|in|Big Red Button|
An ARTIK 710 kit with interface board, breadboard and battery pack are shown.
H-Bridge trash can motor control
The H-Bridge IC L293D, also called a motor driver, is a perfect choice for controlling the motor movement. The following diagram shows the wiring of the L293D. For example, with GPIO 0 and GPIO 1 controlling the motor for paper / cardboard:

|GPIO 0|GPIO 1|Motor direction|
|---|---|---|
|High|Low|Forward|
|Low|High|Reverse|
|Low|Low|Stop|
The motor control diagram is shown.
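The truth table above can be sketched in Python by toggling the two L293D inputs through the sysfs GPIO interface. The `pulse_motor` helper and its injectable `write` parameter are my own additions for illustration, and the pins must already be exported via sysfs (as done for GPIO 25 in the Node-RED section):

```python
import time

def set_pin(pin, value):
    """Write 0 or 1 to an exported sysfs GPIO pin."""
    with open("/sys/class/gpio/gpio%d/value" % pin, "w") as f:
        f.write(str(value))

def pulse_motor(pin_a, pin_b, seconds=1.0, write=set_pin):
    """Drive the two L293D inputs to run the motor one way, then stop.

    `write` is injectable so the sequence can be exercised off-target.
    """
    write(pin_a, 1)   # IN1 high, IN2 low -> motor turns one direction
    write(pin_b, 0)
    time.sleep(seconds)
    write(pin_a, 0)   # both low -> motor stops
    write(pin_b, 0)

def open_lid(write=set_pin):
    pulse_motor(0, 1, write=write)     # GPIO 0 / GPIO 1: paper motor

def close_lid(write=set_pin):
    pulse_motor(1, 0, write=write)     # swapped pins -> reverse direction
```

How long to run the motor (the `seconds` value) depends on your trash can's mechanism, so treat the 1-second default as a starting point.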
Configuring Amazon Technologies
The most interesting part of this Smart Trash Can system is the object recognition.
Follow the Amazon Rekognition API tutorial to set up and check out this functionality, then return here and continue below.
Setting Up Node-RED Workflow
Now we need to integrate all the parts together by using Node-RED.
Create a GPIO entry for the Big Red Button. Here we are using GPIO 25.
```shell
echo 25 > /sys/class/gpio/export
echo in > /sys/class/gpio/gpio25/direction
```
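Node-RED reads the button state through this same sysfs file. For reference, an equivalent Python poll loop would look like the sketch below; the polling interval and function names are my own choices:

```python
import time

GPIO_VALUE = "/sys/class/gpio/gpio25/value"

def read_button(path=GPIO_VALUE):
    """Return 1 while the Big Red Button is pressed, else 0."""
    with open(path) as f:
        return int(f.read().strip())

def wait_for_press(path=GPIO_VALUE, poll_s=0.05):
    """Block until a rising edge (released -> pressed) is seen."""
    prev = read_button(path)
    while True:
        cur = read_button(path)
        if cur == 1 and prev == 0:
            return
        prev = cur
        time.sleep(poll_s)
```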
Open the file SmarterTrashCan.txt and copy its contents to your clipboard.
Import the clipboard contents into your Node-RED canvas. You will see a workflow like this:
Deploy the workflow.
Trying It Out
Place an item of trash in front of the camera and press the Big Red Button. In around 5 seconds you will see a Rekognition result like the following.
In this case, the item is recognized as a piece of paper, so trash can #1 will open.
The integration of hardware and Web services is a clear trend in the IoT field. Amazon AWS, Microsoft Azure and other cloud-based tools provide handy APIs that can dramatically enhance the potential of ARTIK modules.