California Advanced Lab for Human-Computer Interaction
Welcome to the GitHub organization for CALHCI – a research and development lab advancing Human-Computer Interaction (HCI) through cutting-edge technologies such as affective computing, brain-computer interfaces (BCIs), extended reality (XR), intelligent systems, and robotics.
At CALHCI, we aim to push the boundaries of how humans interact with machines by building systems that are:
- Emotionally and cognitively aware
- Adaptable in real time
- Grounded in user-centered design
- Engineered for real-world impact
Our work spans the following research areas:

- Affective Computing: Systems that sense and adapt to human emotions
- Brain-Computer Interfaces (BCI): Real-time interaction via neural signals
- XR & Spatial Computing: Mixed, augmented, and virtual reality environments
- Human-Robot Interaction: Real-time collaborative robotics and digital twins
- Accessible and Inclusive Design: Empowering all users through equitable technology
Featured projects include:

- VERO – Virtual Environment for Robotics Operations
- GazeOverlayApp – Real-time gaze visualization from Pupil Labs eye trackers (see the gaze-streaming sketch after this list)
- HoloMuse – Eye-tracking and spatial analysis of visitor behavior in cultural spaces
- Affect-Tutor – Emotion-aware intelligent tutoring systems
- TowerVision – Real-time Tower of Hanoi tracking using depth cameras and color segmentation (see the segmentation sketch after this list)
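
To give a flavor of the kind of streaming pipeline behind GazeOverlayApp, here is a minimal sketch of subscribing to live gaze data over Pupil Capture's network API (ZeroMQ + msgpack). The Pupil Remote port, `SUB_PORT` request, and `gaze.` topic follow the documented Pupil Core conventions; the loop body that prints gaze coordinates is an illustrative assumption, not the app's actual overlay code.

```python
import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote (default port 50020) which port publishes data.
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to gaze datums on the advertised SUB port.
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

while True:
    topic, payload = subscriber.recv_multipart()
    gaze = msgpack.unpackb(payload)
    # norm_pos is the gaze point in normalized scene-camera coordinates;
    # an overlay app would map this onto the scene video instead of printing.
    x, y = gaze["norm_pos"]
    print(f"gaze at ({x:.3f}, {y:.3f}), confidence {gaze['confidence']:.2f}")
```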
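And in the spirit of TowerVision, a minimal color-segmentation sketch with OpenCV: threshold one disk color in HSV space and report the blob's centroid. The HSV range and camera index are illustrative assumptions, and the depth-camera fusion used by the actual project is omitted here.

```python
import cv2
import numpy as np

# Hypothetical HSV range for a red disk (red wraps around hue 0/180,
# so a real system would combine two ranges).
LOWER_RED = np.array([0, 120, 80])
UPPER_RED = np.array([10, 255, 255])

cap = cv2.VideoCapture(0)  # assumed camera index
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_RED, UPPER_RED)
    # Take the largest connected blob as the disk.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        disk = max(contours, key=cv2.contourArea)
        m = cv2.moments(disk)
        if m["m00"] > 0:
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            cv2.circle(frame, (cx, cy), 6, (0, 255, 0), -1)
    cv2.imshow("TowerVision sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```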