
March 2020



Tricerabot is a robotic arm that uses a model built with Google's Teachable Machine to distinguish between, and then sort, LEGO bricks of different colors and shapes.

The Bones

The large black background that the robot sports (and from which it got its name) helps the AI model predict the shape and color of a brick more reliably, independent of the room and lighting the robot is in. The robotic arm itself has three degrees of freedom: a 360° swivel at the base, opening and closing of the shovel, and up/down motion of the arm itself.
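The three degrees of freedom can be captured in a small sketch that clamps commands to each axis's range before they reach a motor. The axis names and angle ranges below are assumptions for illustration, not measurements from the actual build.

```python
# Hypothetical sketch of the arm's three degrees of freedom.
# The ranges are assumed, not taken from the real robot.
ARM_AXES = {
    "swivel": (0, 360),  # base rotation, degrees
    "shovel": (0, 90),   # shovel open/close angle (assumed range)
    "lift":   (0, 45),   # arm up/down angle (assumed range)
}

def clamp_command(axis, angle):
    """Keep a requested angle inside the axis's allowed range."""
    lo, hi = ARM_AXES[axis]
    return max(lo, min(hi, angle))
```

On real hardware, the clamped value would then be handed to the motor driver for that axis; keeping the limits in one table makes it easy to retune them after a rebuild.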

The Brains

The TensorFlow model generated with Teachable Machine was imported into LabVIEW. The LabVIEW code (available for download below, together with the model) uses the computer's camera to look at the brick being presented, uses the model to decide which brick it is being shown, and sends that classification to SystemLink. The EV3 brick, which controls the robotic arm, then reads that value from SystemLink and sorts the brick into the corresponding cup.
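The EV3 side of that hand-off boils down to a lookup from the predicted label to a cup position. Here is a minimal sketch of that step; the label names and swivel angles are hypothetical, and fetching the value from SystemLink is only described in a comment rather than implemented.

```python
# Hypothetical label -> swivel angle mapping (degrees) for the sorting cups.
# The real labels come from the Teachable Machine model and the cup angles
# from the physical build, so everything here is illustrative.
CUPS = {
    "red_2x4": 0,
    "blue_2x4": 90,
    "yellow_1x2": 180,
}
UNKNOWN_ANGLE = 270  # spare cup for anything the model doesn't recognize

def cup_angle(label):
    """Map a predicted brick label to the swivel angle of its cup."""
    return CUPS.get(label, UNKNOWN_ANGLE)

# On the EV3, the main loop would poll SystemLink for the latest label,
# swivel the arm to cup_angle(label), and tip the shovel to drop the brick.
```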

Code Overview

Below is a glimpse of what the LabVIEW code looks like, along with the full VS Code project for controlling the EV3.

Basic idea

Import the model, take a picture with the computer's camera, process it and run it through the model, then send the model's output to SystemLink.
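Those steps can be sketched in Python rather than LabVIEW. The runnable part below is just the "pick the winning class" helper; the camera capture, Teachable Machine preprocessing, and model call are shown as comments because they need a webcam and the exported model file. The class names are hypothetical placeholders.

```python
import numpy as np

# Hypothetical class names -- the real ones come from the Teachable Machine
# export (labels.txt).
LABELS = ["red_2x4", "blue_2x4", "yellow_1x2"]

def top_label(probs, labels=LABELS):
    """Return (label, confidence) for the highest-scoring class."""
    i = int(np.argmax(probs))
    return labels[i], float(probs[i])

# The full loop, assuming a Keras export from Teachable Machine plus OpenCV:
#   model = tf.keras.models.load_model("keras_model.h5")
#   ok, frame = cv2.VideoCapture(0).read()
#   img = cv2.resize(frame, (224, 224))[None].astype("float32") / 127.5 - 1
#   probs = model.predict(img)[0]
#   label, conf = top_label(probs)
#   ... then publish `label` to SystemLink for the EV3 to pick up.
```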


Full LabVIEW code here!

VS Code for robot control