Created in under 48 hours during Hack The North 2016.
AVA (Augmented Virtual Assistant) is a portable companion built around Pepper's Ghost, an optical illusion technique dating back more than a hundred years.
The assistant has three main components: a Raspberry Pi connected to three pressure sensors; an Android application that performs speech-to-text recognition so the user can talk to AVA; and a screen paired with a sheet of recycled plastic that gives the projection a three-dimensional appearance.
The Raspberry Pi is programmed in Python to read the sensors' pressure values, which are mapped onto a graded scale to give the user a more intuitive experience. The accumulated data is pushed to a Firebase real-time database every second so the rest of the system stays updated in near real time.
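The sensor side of the pipeline can be sketched as below. The thresholds, sensor count handling, and the `read_raw`/`push` callables are illustrative stand-ins for the ADC driver and Firebase client used on the actual device, which this document does not detail.

```python
# Hypothetical raw-value thresholds (illustrative, not the device's real ones).
LIGHT, FIRM = 200, 600

def scale_reading(raw):
    """Map a raw pressure value onto a coarse, intuitive scale."""
    if raw < LIGHT:
        return "none"
    if raw < FIRM:
        return "light"
    return "firm"

def poll_sensors(read_raw, push):
    """Read all three sensors, scale the values, and push the batch.

    `read_raw(i)` stands in for the ADC read of sensor i, and `push`
    stands in for the Firebase database write.
    """
    values = {f"sensor{i}": scale_reading(read_raw(i)) for i in range(3)}
    push(values)
    return values
```

On the device, `poll_sensors` would run in a loop with a one-second sleep between iterations to produce the per-second database updates described above.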
The Android application uses the Google speech-to-text API to recognize the user's spoken commands to AVA. The transcribed text is then analyzed: if it contains any known command, AVA executes that command, and the chosen command is pushed to the Firebase database. The main logic for all assistant tasks runs in this application.
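The command-matching step amounts to scanning the transcript for known keywords. A minimal Python sketch of that logic follows; the command vocabulary here is hypothetical, and the real app implements this on Android rather than in Python.

```python
# Hypothetical command vocabulary -- the actual command set is not
# listed in this document.
KNOWN_COMMANDS = ("wave", "dance", "greet")

def match_command(transcript):
    """Return the first known command found in the transcript, or None."""
    words = transcript.lower().split()
    for command in KNOWN_COMMANDS:
        if command in words:
            return command
    return None
```

A matched command would then be written to Firebase, where the display program picks it up.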
The screen display of AVA's holographic-like projection was built in Unity. The 3D models and animations are distributed for free by Mixamo and other sites such as TurboSquid and Autodesk. Acting as the viewer of the project, the program behind the screen projection checks the database every second for updated values or commands to decide which animation and scene to display.
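The display side follows a poll-and-dispatch pattern: fetch the latest command once per second, map it to an animation, and fall back to an idle clip when nothing is pending. The sketch below shows that structure in Python with hypothetical clip names and stand-in callables; the real implementation lives in Unity.

```python
# Hypothetical mapping from commands to animation clips (illustrative names).
ANIMATIONS = {"wave": "Wave.anim", "dance": "Dance.anim"}
IDLE = "Idle.anim"

def choose_animation(command):
    """Pick the clip for the latest command, falling back to idle."""
    return ANIMATIONS.get(command, IDLE)

def poll_loop(fetch_command, play, ticks):
    """Run the per-second display loop for `ticks` iterations.

    `fetch_command` stands in for the Firebase read and `play` for the
    Unity call that starts an animation; the real loop would also sleep
    one second between iterations.
    """
    played = []
    for _ in range(ticks):
        clip = choose_animation(fetch_command())
        play(clip)
        played.append(clip)
    return played
```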