See Holliday in Action

Watch Holliday in operation in this short video.

Watch on YouTube

Project Overview

RoboFace is an experimental project that explores the intersection of artificial intelligence, robotics, and human expression. The robotic face, named "Holliday", serves as our test subject. At its core, this project investigates: How does an AI language model behave when it is given control of a human-like face and explicitly told what it is controlling?

Core Research Question: Can an LLM develop meaningful, context-aware behaviors when controlling facial expressions, understanding the logical constraints of human facial anatomy (e.g., if eyelids are closed, moving pupils makes no sense)? Can Holliday, under AI control, exhibit human-like expression patterns, or will it reveal the limitations of AI reasoning?

This project consists of two main components: an Arduino-based control system that manages Holliday's physical servomotors, and a Python script that interfaces with a local LLM to make autonomous decisions about facial expressions and movements.

System Architecture

Data flow:

Local LLM (LlamaCPP) → Python Controller Script → Arduino Uno R4 WiFi → PCA9685 Servo Driver → Holliday (Robotic Face)

Hardware

  • Arduino Uno R4 WiFi
  • PCA9685 PWM Servo Driver
  • 16 Servomotors
  • WiFi Connectivity

Software

  • Arduino C++ Firmware
  • Python 3 Controller
  • REST API Interface
  • Web Interface

AI Integration

  • Local LLM (LlamaCPP)
  • Context Management
  • Action Planning
  • Logical Constraints

The Arduino Control Kernel

The Arduino Uno R4 WiFi serves as the control kernel, bridging AI decisions and the physical servomotors. The firmware exposes a REST API for manual and AI-driven operation.

Key Features

Servo Channel Mapping

Channel 0: Head (Rotation: 47°-180°, Center: 114°)
Channels 1-4: Eyelids (Rest: 10°, Closed: 90°)
Channel 5: Horizontal Pupils (60°-110°, Center: 85°)
Channel 6: Vertical Pupils (70°-120°, Center: 95°)
Channel 7: Mouth (Open: 47°, Closed: 80°)
Channels 8-15: Reserved
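The mapping above can be mirrored in a small table on the controller side. The sketch below uses the channel numbers and angle limits listed above; the dictionary layout and the helper name `clamp_angle` are illustrative assumptions, not the project's actual API (the firmware holds the authoritative limits).

```python
# Sketch of Holliday's servo channel map, mirroring the firmware table above.
# Channel numbers and angle limits come from the documented mapping; the
# data structure and helper names are illustrative only.
SERVO_MAP = {
    0: {"name": "head",     "min": 47, "max": 180, "center": 114},
    1: {"name": "eyelid_1", "min": 10, "max": 90,  "center": 10},  # rest=10°, closed=90°
    2: {"name": "eyelid_2", "min": 10, "max": 90,  "center": 10},
    3: {"name": "eyelid_3", "min": 10, "max": 90,  "center": 10},
    4: {"name": "eyelid_4", "min": 10, "max": 90,  "center": 10},
    5: {"name": "pupils_h", "min": 60, "max": 110, "center": 85},
    6: {"name": "pupils_v", "min": 70, "max": 120, "center": 95},
    7: {"name": "mouth",    "min": 47, "max": 80,  "center": 80},  # open=47°, closed=80°
}

def clamp_angle(channel: int, angle: int) -> int:
    """Clamp a requested angle into the safe range for the given channel."""
    limits = SERVO_MAP[channel]
    return max(limits["min"], min(limits["max"], angle))
```

Clamping on the controller side keeps an out-of-range LLM suggestion from ever reaching the servos.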

API Endpoints
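The firmware's exact endpoint paths are not reproduced here. Purely as an illustration, a controller-side call to a hypothetical `/servo` endpoint could be assembled like this; the host, path, and query parameter names are all assumptions, so consult the firmware source for the real REST API.

```python
from urllib.parse import urlencode

def build_servo_request(host: str, channel: int, angle: int) -> str:
    """Build the URL for a *hypothetical* GET /servo endpoint on the Arduino.

    The path and parameter names are illustrative; the actual firmware
    may expose different routes.
    """
    query = urlencode({"channel": channel, "angle": angle})
    return f"http://{host}/servo?{query}"
```

For example, `build_servo_request("192.168.1.50", 0, 114)` yields `http://192.168.1.50/servo?channel=0&angle=114`, which would recenter the head on channel 0.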

The Python LLM Controller

The Python script is the brain of the system. It queries a local LLM (LlamaCPP) to decide the next facial expression or movement based on context, in a continuous loop (about once per second), executing up to 3 simultaneous actions when logically consistent.

Key Innovation: The LLM is instructed that it is controlling a human face and must respect logical constraints (e.g., if eyelids are closed, moving pupils is pointless).
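One way to enforce such a constraint on the controller side is to filter the LLM's proposed actions before execution. The sketch below shows the idea; the action and state formats are invented for illustration and may differ from the real controller's data structures.

```python
# Sketch: drop pupil movements while the eyelids are closed.
# Channel roles and the closed angle come from the servo mapping;
# the action/state shapes are illustrative assumptions.
EYELID_CHANNELS = {1, 2, 3, 4}
PUPIL_CHANNELS = {5, 6}
EYELID_CLOSED_ANGLE = 90  # closed position per the channel mapping

def eyelids_closed(state: dict) -> bool:
    """True if every eyelid servo is at (or past) the closed angle."""
    return all(state.get(ch, 0) >= EYELID_CLOSED_ANGLE for ch in EYELID_CHANNELS)

def filter_actions(actions: list, state: dict) -> list:
    """Remove pupil actions that would be invisible behind closed eyelids."""
    if not eyelids_closed(state):
        return actions
    return [a for a in actions if a["channel"] not in PUPIL_CHANNELS]
```

This keeps the logical constraint deterministic even when the LLM ignores its instructions.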

How It Works

LLM Interaction Flow

1. The script sends the available actions, limits, context, and constraints to the LLM.
2. The LLM returns one or more actions (test or servo).
3. The script validates the actions, enforces limits, executes them via the REST API, and updates the history.
4. The process repeats every second.
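The interaction flow above can be sketched as a minimal loop. `query_llm` and `execute` are stubbed placeholders here; only the one-second cadence and the three-action cap come from the description above, and every other name is an assumption.

```python
import time

MAX_ACTIONS_PER_TICK = 3  # up to 3 simultaneous actions, per the description

def query_llm(context: list) -> list:
    """Placeholder for the LlamaCPP query; returns proposed actions."""
    return [{"channel": 0, "angle": 114}]  # stub: recenter the head

def execute(action: dict) -> None:
    """Placeholder for the REST call to the Arduino."""
    pass

def control_loop(ticks: int, delay: float = 1.0) -> list:
    """Run the query → validate → execute → record cycle `ticks` times."""
    history = []
    for _ in range(ticks):
        actions = query_llm(history)[:MAX_ACTIONS_PER_TICK]  # enforce the cap
        for action in actions:
            execute(action)       # send via the REST API
        history.append(actions)   # update the context for the next query
        time.sleep(delay)         # roughly one decision per second
    return history
```

In the real controller, `history` would feed back into the prompt so the LLM sees its previous choices.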

Experimental Nature

This project is deliberately experimental. Holliday is a research tool built to observe AI behavior under physical constraints.

Research Goal: Contribute to understanding how AI behaves when given autonomous control over physical systems with logical constraints, especially when those systems are anthropomorphized. Through Holliday, we observe whether an LLM can develop an intuitive sense of human facial expression logic.

Observations and Conclusions

Initial Findings

Key Conclusion

Primary Finding: LLMs can follow logical constraints and produce contextually appropriate expressions, but they do not demonstrate true understanding of human facial anatomy. They rely on pattern matching and instruction following. With clear constraints and context, they can still produce surprisingly human-like sequences that appear intentional to an observer.

Technical Specifications

Hardware

Software

Communication

Future Research Directions

Need More Information?

If you need additional information about Holliday / RoboFace, or want to discuss this project, please send an email.