Project Briefing


        If you ask us, PiLampBot ranks as the second coolest project after CRETA. Starting out as an elementary-school project used in our robotics demonstration classes, it now showcases the computer vision and artificial intelligence capabilities we apply in other work, including our self-driving car. It is, in essence, a lamp that moves like a robot, built around a Raspberry Pi (RPi). The idea was drawn from that sad little lamp in the logo of every Pixar movie!
        There have been three versions of the PiLampBot, each more interesting for the audience than the last. The first version, in its humble beginnings, was a simple tabletop lamp with two servos, giving the head two degrees of freedom under the control of an RPi 3. The RPi, however, lacks the dedicated real-time I/O of an Arduino, so this was no easy feat; in the end, we developed our own algorithm to gain finer control over the servos. We also simplified the mathematics to cover only the head's movement in a Cartesian plane, since the aim of the project is to showcase it for STEM education in schools. A glove with an embedded smartphone controls the movement of the lamp head, giving you that Force-wielding feeling, for all you Star Wars fans. The smartphone's sensors send their readings over Wi-Fi to the RPi, which moves the head accordingly.
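The sensor-to-servo mapping described above can be sketched as a simple linear transform. This is an illustrative assumption, not the project's actual code: it takes a phone tilt angle received over Wi-Fi, clamps it to a usable range, and maps it onto the 0–180° span of a hobby servo (all names and ranges here are hypothetical).

```python
def tilt_to_servo_angle(tilt_deg, tilt_range=(-45.0, 45.0), servo_range=(0.0, 180.0)):
    """Linearly map a phone tilt angle (degrees) onto a servo angle,
    clamping tilts that fall outside the usable range."""
    lo, hi = tilt_range
    tilt = max(lo, min(hi, tilt_deg))        # clamp to the usable tilt window
    s_lo, s_hi = servo_range
    return s_lo + (tilt - lo) / (hi - lo) * (s_hi - s_lo)

def glove_to_head(pitch_deg, roll_deg):
    """Phone pitch drives the vertical servo, roll drives the horizontal one."""
    return tilt_to_servo_angle(pitch_deg), tilt_to_servo_angle(roll_deg)
```

Holding the glove level (0° pitch, 0° roll) centres both servos at 90°; tilting past the 45° limit simply pins the corresponding servo at its end stop, which keeps extreme hand motions from slamming the mechanism.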
        The second version of the project was, admittedly, a small step backwards: it used two potentiometers, knobs that can be twisted, to control the head. One genuine upgrade at this point, however, was a new algorithm that lets the lamp appear to 'wake up' naturally, like a living thing.
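The 'waking up' effect can be approximated with an eased sweep of the head servo from a folded resting angle to an upright one. The actual algorithm is not published, so the following is a minimal sketch under that assumption, using a standard smoothstep curve so the motion starts and ends gently rather than snapping like a machine.

```python
def smoothstep(t):
    """Ease-in/ease-out curve on [0, 1]: slow start, slow finish."""
    return t * t * (3.0 - 2.0 * t)

def wake_up_sequence(start_angle=20.0, end_angle=150.0, steps=50):
    """Return the list of servo angles for one wake-up sweep,
    easing from a folded pose to an upright one."""
    frames = []
    for i in range(steps):
        t = i / (steps - 1)
        frames.append(start_angle + (end_angle - start_angle) * smoothstep(t))
    return frames
```

Playing these frames back at a fixed interval (say 20 ms apart) gives a roughly one-second stretch that reads as organic, because the acceleration is continuous at both ends of the sweep.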
        Today’s version of the PiLampBot ditches controllers altogether. Using a camera, video recognition software and artificial intelligence, the PiLampBot can now move on its own: the servos fluidly and seamlessly follow the faces it detects on camera.
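The face-following step can be sketched as simple proportional control: given a face bounding box from a detector (for example, OpenCV's Haar cascades), nudge the pan and tilt servos by a small fraction of the face's offset from the frame centre. The gain, frame size, and function names below are assumptions for illustration, not the project's actual implementation; a small gain is what keeps the motion fluid rather than jerky.

```python
FRAME_W, FRAME_H = 640, 480   # assumed camera resolution
GAIN = 0.05                   # degrees of correction per pixel of error

def track_face(face_box, pan_deg, tilt_deg):
    """face_box = (x, y, w, h) in pixels; return updated (pan, tilt) angles
    that move the face toward the centre of the frame."""
    x, y, w, h = face_box
    err_x = (x + w / 2) - FRAME_W / 2   # positive: face right of centre
    err_y = (y + h / 2) - FRAME_H / 2   # positive: face below centre
    pan = min(180.0, max(0.0, pan_deg - GAIN * err_x))
    tilt = min(180.0, max(0.0, tilt_deg + GAIN * err_y))
    return pan, tilt
```

Called once per video frame, this converges on the face in small steps; a perfectly centred face produces zero correction, so the head rests still instead of twitching.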

Research & Development Status


Teaching students about science and technology




Development Phases

Overall Readiness



Type                      Percentage
Artificial Intelligence   35%
Mechanical                15%
Computer Vision           20%
Robotics                  30%



  • Second version of the PiLampBot

    The second version of the PiLampBot

  • PiLampBot with smartphone embedded in glove

    Movement of PiLampBot controlled by sensors in a smartphone

  • PiLampBot with computer vision

    PiLampBot follows the movement of a subject via computer vision






  • Video Recognition Software

    Computer vision gives the PiLampBot video recognition, allowing the lamp head to follow the movements of a subject captured on camera.

  • Mechanical Components

    Servos and potentiometers control the movements of the lamp head, making the lamp look as if it is alive.

  • Improved Algorithm

    The second version of the lamp includes an upgraded algorithm that makes the lamp appear as if it is a living, breathing creature waking up when it is turned on.

  • Sensors

    Sensors in the glove-mounted smartphone report the phone's position and orientation, which are mapped onto the movement of the lamp head.



Name        Rank
Haziq       The.Ignition
Muzzammil   The.