CALHCI

California Advanced Lab for Human-Computer Interaction

Welcome to the GitHub organization for CALHCI – a research and development lab dedicated to advancing Human-Computer Interaction (HCI) by integrating cutting-edge technologies such as affective computing, brain-computer interfaces (BCIs), extended reality (XR), intelligent systems, and robotics.

🧭 Mission

At CALHCI, we aim to push the boundaries of how humans interact with machines by building systems that are:

  • Emotionally and cognitively aware
  • Adaptable in real time
  • Grounded in user-centered design
  • Engineered for real-world impact

🔬 Areas of Research

  • Affective Computing: Systems that sense and adapt to human emotions
  • Brain-Computer Interfaces (BCI): Real-time interaction via neural signals
  • XR & Spatial Computing: Mixed, augmented, and virtual reality environments
  • Human-Robot Interaction: Real-time collaborative robotics and digital twins
  • Accessible and Inclusive Design: Empowering all users through equitable technology

📁 Featured Projects

  • VERO – Virtual Environment for Robotics Operations
  • GazeOverlayApp – Real-time gaze visualization from Pupil Labs eye trackers
  • HoloMuse – Eye-tracking and spatial analysis of visitor behavior in cultural spaces
  • Affect-Tutor – Emotion-aware intelligent tutoring systems
  • TowerVision – Real-time Tower of Hanoi tracking using depth cameras and color segmentation (see the segmentation sketch after this list)

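To give a flavour of the color-segmentation step mentioned for TowerVision, the sketch below shows how disk centroids might be extracted from a camera frame with OpenCV. It is a minimal illustration only: the HSV ranges, color names, and function name are hypothetical placeholders, not values or code from the actual project.

```python
# Minimal color-segmentation sketch (hypothetical values, not TowerVision's own code).
import cv2
import numpy as np

# Hypothetical HSV ranges for two disk colors; real thresholds would be
# calibrated for the specific camera and lighting.
DISK_RANGES = {
    "red":  ((0, 120, 70), (10, 255, 255)),
    "blue": ((100, 150, 50), (130, 255, 255)),
}

def find_disk_centroids(frame_bgr):
    """Return the pixel centroid of each color-matched disk in a BGR frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    centroids = {}
    for name, (lo, hi) in DISK_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        # Remove small speckles before computing the centroid.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        m = cv2.moments(mask)
        if m["m00"] > 0:
            centroids[name] = (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
    return centroids
```
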
📌 Pinned

  • edu-datasets – A collection of sample datasets, including eye-tracking, brain signals, and other human-centered interaction data for student exploration and analysis (see the loading sketch below).

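The sketch below shows one way a student might start exploring one of these samples with pandas. The file name and columns are hypothetical; consult each dataset's own documentation in the repository for the actual schema.

```python
# Hypothetical exploration of an eye-tracking sample from edu-datasets.
# The file name is a placeholder, not a guaranteed path in the repository.
import pandas as pd

gaze = pd.read_csv("eye_tracking_sample.csv")  # placeholder file name

print(gaze.head())       # inspect the first few rows
print(gaze.describe())   # summary statistics for numeric columns
```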