This system converts EEG recordings into robot arm movements by analyzing brain-wave frequency patterns.
- **EEG Data File** (`.edf` format)
  - Emotiv headset recordings
  - Clinical EEG recordings
  - Any standard EDF file with EEG channels
- **UR Robot Simulator** (running in Docker)

  ```bash
  docker ps | grep ursim   # Check it's running
  ```

- **Python Environment** (already set up in `eeg_pipeline/venv`)
```bash
cd ursim_test_v1
./run_eeg_robot_test.sh
```

This will:
- ✅ Connect to the robot at 127.0.0.1
- ✅ Process your EEG data
- ✅ Generate movement commands in real-time
- ✅ Move the robot based on your brain waves!
The system analyzes 5 frequency bands and creates 3D movements:
| Brain Wave | Frequency | Mental State | Robot Movement |
|---|---|---|---|
| Delta δ | 0.5-4 Hz | Deep sleep/rest | ⬅️ Move Backward (−X) |
| Theta θ | 4-8 Hz | Drowsy/meditation | ⬇️ Move Down (−Z) |
| Alpha α | 8-13 Hz | Relaxed/calm | ➡️ Move Forward (+X) |
| Beta β | 13-30 Hz | Focused/alert | ⬆️ Move Up (+Z) |
| Gamma γ | 30-45 Hz | High concentration | ↪️ Move Right (+Y) |
- Primary movement: Strongest brain wave (>30% power)
- Secondary movement: 2nd strongest if >20% power
- Result: Diagonal/curved movements when multiple bands are active!
Example:
- 40% Alpha + 25% Beta = Move forward AND up (diagonal)
- 35% Gamma + 22% Theta = Move right AND down
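The primary/secondary mapping above can be sketched in Python. `BAND_DIRECTIONS`, `bands_to_velocity`, and the scaling-by-power-share detail are illustrative assumptions; the README only specifies the thresholds (30% / 20%) and the directions in the table.

```python
# Hypothetical sketch of the band-power -> velocity mapping described above.
# Thresholds and directions follow this README's table; names are assumed.

BAND_DIRECTIONS = {
    "delta": (-1, 0, 0),   # backward (-X)
    "theta": (0, 0, -1),   # down (-Z)
    "alpha": (1, 0, 0),    # forward (+X)
    "beta":  (0, 0, 1),    # up (+Z)
    "gamma": (0, 1, 0),    # right (+Y)
}

def bands_to_velocity(band_powers, velocity_scale=0.03):
    """Map relative band powers (fractions summing to ~1) to an (x, y, z) velocity."""
    vx = vy = vz = 0.0
    # Rank bands by relative power, strongest first.
    ranked = sorted(band_powers.items(), key=lambda kv: kv[1], reverse=True)
    primary, secondary = ranked[0], ranked[1]
    if primary[1] > 0.30:                      # primary movement threshold
        dx, dy, dz = BAND_DIRECTIONS[primary[0]]
        vx += dx * velocity_scale * primary[1]
        vy += dy * velocity_scale * primary[1]
        vz += dz * velocity_scale * primary[1]
    if secondary[1] > 0.20:                    # secondary movement threshold
        dx, dy, dz = BAND_DIRECTIONS[secondary[0]]
        vx += dx * velocity_scale * secondary[1]
        vy += dy * velocity_scale * secondary[1]
        vz += dz * velocity_scale * secondary[1]
    return (vx, vy, vz)

# 40% alpha + 25% beta -> forward AND up (diagonal), as in the first example.
print(bands_to_velocity({"alpha": 0.40, "beta": 0.25, "delta": 0.15,
                         "theta": 0.12, "gamma": 0.08}))
```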
Edit `run_eeg_robot_test.sh` and change the EDF file path:

```bash
EDF_FILE="/path/to/your/recording.edf"
```

Or run the converter directly:

```bash
cd eeg_pipeline
source venv/bin/activate
python eeg_to_movements.py \
    --edf-file "/path/to/your/eeg_data.edf" \
    --output ../ursim_test_v1/asynchronous_deltas.jsonl \
    --speed 2.0 \
    --velocity-scale 0.03
```

**Terminal 1 - Start Robot:**
```bash
cd ursim_test_v1
source ../eeg_pipeline/venv/bin/activate
python ur_asynchronous.py --robot-ip 127.0.0.1 --json-file asynchronous_deltas.jsonl
```

**Terminal 2 - Process EEG:**

```bash
cd eeg_pipeline
source venv/bin/activate
python eeg_to_movements.py \
    --edf-file "YOUR_FILE.edf" \
    --output ../ursim_test_v1/asynchronous_deltas.jsonl
```

**Command-line options:**

```
python eeg_to_movements.py \
    --edf-file <PATH>          # Your EEG file (required)
    --output <PATH>            # Output JSONL file (default: movements.jsonl)
    --speed <FLOAT>            # Playback speed multiplier (default: 1.0)
                               #   1.0 = real-time, 2.0 = 2x faster
    --velocity-scale <FLOAT>   # Movement speed in m/s (default: 0.05)
                               #   0.02 = 20 mm/s (slow & safe)
                               #   0.05 = 50 mm/s (medium)
                               #   0.10 = 100 mm/s (fast)
    --window-size <FLOAT>      # Analysis window in seconds (default: 2.0)
```

**Slow & Precise:**

```bash
python eeg_to_movements.py --edf-file data.edf --velocity-scale 0.02 --speed 1.0
```

**Fast Testing:**

```bash
python eeg_to_movements.py --edf-file data.edf --velocity-scale 0.05 --speed 5.0
```

**Maximum Detail:**

```bash
python eeg_to_movements.py --edf-file data.edf --window-size 1.0 --velocity-scale 0.03
```

The system includes automatic safety limits:
- X-axis: ±500mm (prevents forward/back crashes)
- Y-axis: ±500mm (prevents left/right crashes)
- Z-axis: 100-800mm (prevents hitting table or ceiling)
When approaching limits:

```
⚠️ X limit approaching (0.51m), stopping X motion
```
The robot will automatically stop that axis to prevent damage!
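A minimal sketch of that per-axis check, assuming the limit values quoted in this README; the function name, data layout, and exact stop behavior are illustrative, not the pipeline's actual implementation.

```python
# Hypothetical sketch of the per-axis safety check. The limit values match the
# ones stated in this README; function and variable names are assumptions.

SAFETY_LIMITS = {
    "x": (-0.5, 0.5),   # metres, forward/backward
    "y": (-0.5, 0.5),   # metres, left/right
    "z": (0.1, 0.8),    # metres, above the table, below the ceiling
}

def limit_velocity(position, velocity):
    """Zero out any velocity component that would push an axis past its limit."""
    safe = {}
    for axis, v in velocity.items():
        lo, hi = SAFETY_LIMITS[axis]
        p = position[axis]
        # Stop motion on this axis if it is at a limit and still heading outward.
        if (p >= hi and v > 0) or (p <= lo and v < 0):
            print(f"⚠️ {axis.upper()} limit approaching ({p:.2f}m), "
                  f"stopping {axis.upper()} motion")
            v = 0.0
        safe[axis] = v
    return safe

# X is past its +0.5 m limit and still moving outward, so X motion is stopped.
print(limit_velocity({"x": 0.51, "y": 0.0, "z": 0.30},
                     {"x": 0.03, "y": 0.0, "z": 0.01}))
```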
- Emotiv FLEX2 recordings (32 channels @ 256 Hz)
- Standard EDF files with EEG channels
- Clinical EEG recordings (PhysioNet, etc.)
- Any file that `mne.io.read_raw_edf()` can load
The system automatically:
- ✅ Detects EEG channels (excludes metadata like timestamps)
- ✅ Works with 14, 32, 64+ channel systems
- ✅ Filters out non-EEG channels automatically
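The channel filtering can be illustrated with a simple name-based filter. This is an assumption for illustration: the exclusion list and function name are invented here, and the real pipeline relies on MNE's channel typing rather than string matching.

```python
# Illustrative sketch of metadata-channel filtering. The real pipeline uses
# MNE's channel typing; this name-based exclusion list is an assumption.

NON_EEG_MARKERS = ("TIMESTAMP", "COUNTER", "MARKER", "BATTERY", "GYRO", "TRIGGER")

def pick_eeg_channels(channel_names):
    """Keep channels whose names do not look like metadata/auxiliary streams."""
    return [name for name in channel_names
            if not any(marker in name.upper() for marker in NON_EEG_MARKERS)]

# Mixed header: four EEG electrodes plus two metadata streams.
print(pick_eeg_channels(["AF3", "F7", "Timestamp", "O1", "O2", "MarkerIndex"]))
# → ['AF3', 'F7', 'O1', 'O2']
```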
The EEG processor shows:

```
Command #10: alpha(45%) + beta(23%) → vel=(0.030, 0.000, 0.015) m/s | pos=(0.15, 0.00, 0.31)m
```

- `alpha(45%)` - dominant band, `beta(23%)` - secondary band
- `vel=(…)` - forward (X), right (Y), and up (Z) velocity components in m/s
- `pos=(…)` - current position in metres
`asynchronous_deltas.jsonl` - movement commands, one JSON object per line:

```json
{"dx": 0.03, "dy": 0, "dz": 0, "drx": 0, "dry": 0, "drz": 0}
```
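One way a consumer could parse these lines is sketched below; `parse_command` is illustrative, not the actual `ur_asynchronous.py` logic.

```python
import json

# Illustrative reader for the JSONL command format: each line maps to one
# 6-component velocity command. Not the actual ur_asynchronous.py code.

def parse_command(line):
    """Turn one JSONL line into a (linear, angular) velocity pair."""
    cmd = json.loads(line)
    linear = (cmd["dx"], cmd["dy"], cmd["dz"])       # m/s
    angular = (cmd["drx"], cmd["dry"], cmd["drz"])   # rad/s
    return linear, angular

line = '{"dx": 0.03, "dy": 0.0, "dz": 0.015, "drx": 0.0, "dry": 0.0, "drz": 0.0}'
linear, angular = parse_command(line)
print(linear)   # (0.03, 0.0, 0.015)
```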
Open in your browser:
http://localhost:6080/vnc.html
You'll see the UR robot simulator executing your brain-controlled movements in real-time!
- Alpha-heavy EEG: Robot moves forward (relaxed state)
- Beta-heavy EEG: Robot moves up (focused state)
- Mixed patterns: Diagonal/curved movements
- Changing patterns: Robot adapts in real-time!
```bash
# Check if URSim is running
docker ps | grep ursim

# Check if ports are accessible
curl -v http://localhost:6080
```

- Robot might be at a safety limit
- Check the EEG processor output for warnings
- Verify the file is being updated:

  ```bash
  tail -f asynchronous_deltas.jsonl
  ```
Adjust `--velocity-scale`:
- Too fast: use `0.01` or `0.02`
- Too slow: use `0.05` or `0.10`
```
2025-10-08 12:10:20 - EEG → Movement converter initialized
2025-10-08 12:10:20 - Velocity scale: 0.03 m/s (30.0 mm/s)
2025-10-08 12:10:20 - Safety limits enabled: X=(-0.5, 0.5), Y=(-0.5, 0.5), Z=(0.1, 0.8)
2025-10-08 12:10:20 - Processing 512 samples per window
2025-10-08 12:10:26 - Cmd #10: alpha(42%) + delta(28%) → vel=(0.025, 0.000, 0.000) m/s | pos=(0.13, 0.00, 0.30)m
```
What this means:
- alpha(42%): Alpha waves are dominant at 42% of total power → move forward
- delta(28%): Delta is secondary at 28% → slight backward component
- vel=(0.025, 0.000, 0.000): Net velocity is 25mm/s forward (X-axis)
- pos=(0.13, 0.00, 0.30): Robot is at 130mm forward, 0mm left/right, 300mm height
The system currently uses band power analysis. You can train a deep learning model instead:

```bash
cd model
python train_eeg_model.py --data-dir /path/to/training/data
```

Record EEG while:
- 🧘 Meditating (high alpha) → smooth forward motion
- 🎯 Concentrating (high beta) → upward motion
- 😴 Drowsy (high theta) → downward motion
- 🔬 Problem-solving (mixed) → complex 3D paths
Replace file playback with a live LSL stream:

```bash
python producer/emotiv_producer.py --emotiv-stream
```

**Quick test with the included recording:**

```bash
cd ursim_test_v1
./run_eeg_robot_test.sh
# Uses included Emotiv FLEX2 recording
```

**Your own recording:**

```bash
cd eeg_pipeline
source venv/bin/activate
python eeg_to_movements.py \
    --edf-file ~/my_eeg_data/recording_2025.edf \
    --output ../ursim_test_v1/asynchronous_deltas.jsonl \
    --speed 1.5 \
    --velocity-scale 0.04
```

**Fast offline conversion:**

```bash
python eeg_to_movements.py \
    --edf-file data.edf \
    --output commands.jsonl \
    --speed 10.0   # 10x real-time
```

If you see:
- ✅ "Connected to UR at 127.0.0.1"
- ✅ "Cmd #10: alpha(45%) → vel=..."
- ✅ Robot moving in VNC viewer
Congratulations! Your brain is controlling the robot! 🧠🤖
1. **Load EEG** → MNE library reads the EDF file
2. **Window Analysis** → 2-second sliding windows (50% overlap)
3. **Power Spectral Density** → Welch's method computes frequency content
4. **Band Power Extraction** → integrate the PSD over each frequency band
5. **Movement Generation** → map band ratios to velocities
6. **Safety Check** → verify limits before applying
7. **Output** → append to the JSONL file
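The band-power steps can be sketched as follows. This uses a single FFT periodogram rather than the Welch averaging the pipeline uses, and all names here are illustrative.

```python
import numpy as np

# Simplified sketch of the PSD and band-power steps: estimate each band's
# share of total power for one analysis window. The real pipeline uses
# Welch's method (averaged, overlapping segments); a single periodogram is
# shown here to keep the idea visible.

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(window, fs):
    """Return each band's share of total in-band power for one window."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window)) ** 2        # unnormalized periodogram
    powers = {name: psd[(freqs >= lo) & (freqs < hi)].sum()
              for name, (lo, hi) in BANDS.items()}
    total = sum(powers.values())
    return {name: p / total for name, p in powers.items()}

# A 2-second window at 256 Hz dominated by a 10 Hz (alpha-band) oscillation.
fs = 256
t = np.arange(2 * fs) / fs
rng = np.random.default_rng(0)
window = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))
ratios = band_powers(window, fs)
print(max(ratios, key=ratios.get))  # alpha dominates
```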
Each command line in `asynchronous_deltas.jsonl`:

```
{
  "dx": 0.030,    // X velocity (m/s)
  "dy": 0.000,    // Y velocity (m/s)
  "dz": 0.000,    // Z velocity (m/s)
  "drx": 0.0,     // X rotation (rad/s)
  "dry": 0.0,     // Y rotation (rad/s)
  "drz": 0.0,     // Z rotation (rad/s)
  "timestamp": "2025-10-08T12:10:20",
  "command_number": 42,
  "dominant_band": "alpha"
}
```

Ready to control robots with your mind? Let's go! 🚀