a real-time slam implementation using line segment extraction from lidar data. I extract line features from 2d laser scans, track them as landmarks, and visualize everything as the robot moves around.
robot exploring the environment with line features being extracted in real-time
- extracts line segments from lidar point clouds using seeded region growing
- prevents bridging across doors/windows with gap detection
- tracks landmarks over time with data association
- lets you drive a robot around with keyboard controls
- visualizes everything: point cloud, line segments, landmarks, robot path
The current implementation uses a top-down cross-section of an indoor environment, shown below.

```bash
# install dependencies
pip install -r requirements.txt

# run it
python main.py
```

when it starts, click anywhere to position your robot. then use WASD to drive around.
click to place the robot before starting
movement:
- W/↑ - forward
- S/↓ - backward
- A/← - rotate left
- D/→ - rotate right
- Q - forward + left
- E - forward + right
visualization:
- 1 - point cloud only
- 2 - seed segments
- 3 - final line segments
- 4 - everything (default)
- 5 - toggle path on/off
other:
- P - reposition robot
- R - reset path
I use a seeded region growing algorithm that:
- finds seed segments (small groups of collinear points)
- grows them by adding nearby points that fit the line
- checks for gaps to avoid bridging across openings
- validates segments to filter out bad ones
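the seed-and-grow steps above can be sketched like this (a minimal sketch, not the actual code from this repo - function names, the SVD-based line fit, and the default thresholds are my own choices):

```python
import numpy as np

def fit_line(points):
    """Total-least-squares line fit via SVD; returns the max point-to-line residual."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    direction = vt[0]                                  # principal direction
    normal = np.array([-direction[1], direction[0]])   # perpendicular to it
    residuals = np.abs((points - centroid) @ normal)
    return centroid, direction, residuals.max()

def find_seeds(points, seed_size=5, epsilon=0.05):
    """Return index ranges [i, i + seed_size) whose points are nearly collinear."""
    seeds = []
    for i in range(len(points) - seed_size + 1):
        _, _, max_res = fit_line(points[i:i + seed_size])
        if max_res < epsilon:
            seeds.append((i, i + seed_size))
    return seeds

def grow_seed(points, start, end, epsilon=0.05):
    """Extend a seed forward and backward while the enlarged set still fits a line."""
    while end < len(points):
        _, _, max_res = fit_line(points[start:end + 1])
        if max_res >= epsilon:
            break
        end += 1
    while start > 0:
        _, _, max_res = fit_line(points[start - 1:end])
        if max_res >= epsilon:
            break
        start -= 1
    return start, end
```

growing stops as soon as one added point pushes the worst residual past epsilon, which is what keeps segments tight to actual walls.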
the key improvement here is gap detection - I check three things before connecting points:
- spatial gap (are they too far apart?)
- angular gap (sudden direction change?)
- range discontinuity (depth jump indicating an opening?)
this prevents the classic problem where line extractors connect points across doors and windows.
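the three gap checks can be collapsed into one predicate, sketched below (names and default thresholds are illustrative, not the real config values):

```python
import numpy as np

def should_connect(p_prev, p_curr, r_prev, r_curr, dir_prev, dir_curr,
                   gap_max=0.3, angle_gap_max=np.radians(30), jump_ratio_max=0.2):
    """Decide whether two consecutive scan points belong to the same segment."""
    # 1. spatial gap: euclidean distance between consecutive points
    if np.linalg.norm(p_curr - p_prev) > gap_max:
        return False
    # 2. angular gap: sudden change in the local segment direction
    cos_angle = np.clip(np.dot(dir_prev, dir_curr), -1.0, 1.0)
    if np.arccos(abs(cos_angle)) > angle_gap_max:
        return False
    # 3. range discontinuity: relative depth jump between adjacent beams
    if abs(r_curr - r_prev) / max(r_prev, r_curr) > jump_ratio_max:
        return False
    return True
```

a doorway typically trips check 1 and check 3 at the same time - the beam that passes through the opening returns a much larger range than its neighbor.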
line segments extracted from lidar scan - notice how they stop at doorways
extracted line segments become landmarks. I:
- associate new observations with existing landmarks
- track how many times I've seen each landmark
- only trust "strong" landmarks (seen multiple times)
- remove old landmarks that haven't been seen recently
landmarks that persist across multiple frames are shown in cyan.
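the tracking loop can be sketched like this (a minimal sketch - I'm assuming lines are matched by nearest-neighbor gating on their (rho, theta) parameters; class names and thresholds are mine, not from the repo):

```python
import numpy as np

class Landmark:
    def __init__(self, rho, theta, frame):
        self.rho = rho          # perpendicular distance of the line from the origin
        self.theta = theta      # orientation of the line normal
        self.hits = 1           # how many frames this landmark was observed in
        self.last_seen = frame  # frame index of the most recent observation

class LandmarkTracker:
    """Nearest-neighbor association in (rho, theta) line-parameter space."""
    def __init__(self, rho_tol=0.2, theta_tol=np.radians(10),
                 min_hits=3, max_age=30):
        self.landmarks = []
        self.rho_tol, self.theta_tol = rho_tol, theta_tol
        self.min_hits, self.max_age = min_hits, max_age
        self.frame = 0

    def update(self, observations):
        self.frame += 1
        for rho, theta in observations:
            match = None
            for lm in self.landmarks:
                d_theta = abs((theta - lm.theta + np.pi) % (2 * np.pi) - np.pi)
                if abs(rho - lm.rho) < self.rho_tol and d_theta < self.theta_tol:
                    match = lm
                    break
            if match:
                match.rho, match.theta = rho, theta  # simple overwrite update
                match.hits += 1
                match.last_seen = self.frame
            else:
                self.landmarks.append(Landmark(rho, theta, self.frame))
        # drop landmarks that haven't been seen recently
        self.landmarks = [lm for lm in self.landmarks
                          if self.frame - lm.last_seen <= self.max_age]

    def strong(self):
        return [lm for lm in self.landmarks if lm.hits >= self.min_hits]
```

`strong()` is what would feed the cyan rendering: only lines re-observed at least `min_hits` times count.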
- green dots - raw lidar points
- yellow lines - seed segments (initial line fits)
- red lines - final extracted line segments
- cyan lines - strong landmarks (seen multiple times)
- magenta - robot's path
- blue circle - robot (yellow line shows heading)
edit config.py to tune things - here are the important ones:
lidar settings:
- LIDAR_BEAMS - number of laser beams (420 default, try 720 for denser clouds)
- LIDAR_NOISE_SIGMA - sensor noise [distance, angle] (increase for testing robustness)
line extraction:
- LINE_EXTRACTION_EPSILON - how tight points must fit the line
- LINE_EXTRACTION_G_MAX - max gap between consecutive points
- LINE_EXTRACTION_L_MIN - minimum line length to keep
- LINE_EXTRACTION_ANGLE_GAP_MAX - max angular jump (prevents bridging)
- LINE_EXTRACTION_DISTANCE_JUMP_RATIO - max depth change ratio (prevents bridging)
robot:
- MAX_VELOCITY - forward/backward speed
- MAX_ANGULAR_VELOCITY - rotation speed
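put together, a tuned config.py might look like this - note these values are illustrative, not the shipped defaults (only LIDAR_BEAMS = 420 is documented above):

```python
# config.py - illustrative values; only LIDAR_BEAMS = 420 is a documented default
LIDAR_BEAMS = 420                        # try 720 for denser point clouds
LIDAR_NOISE_SIGMA = [0.02, 0.005]        # [distance, angle] noise (made-up values)

LINE_EXTRACTION_EPSILON = 0.05           # max point-to-line residual
LINE_EXTRACTION_G_MAX = 0.3              # max gap between consecutive points
LINE_EXTRACTION_L_MIN = 0.5              # minimum segment length to keep
LINE_EXTRACTION_ANGLE_GAP_MAX = 0.52     # ~30 degrees; prevents bridging
LINE_EXTRACTION_DISTANCE_JUMP_RATIO = 0.2

MAX_VELOCITY = 2.0                       # forward/backward speed
MAX_ANGULAR_VELOCITY = 1.5               # rotation speed
```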
I include a few test maps in world/maps/:
- indoor1.png - office-like environment
- cave1.png, cave2.png, cave3.png - cave environments
you can add your own maps - just use black for walls and white for free space.
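converting a custom map into something the robot can collide against is just thresholding the image into a boolean occupancy grid; a minimal sketch (load the PNG to a greyscale array with pygame or any image library first - function names here are mine, not the repo's):

```python
import numpy as np

def to_occupancy_grid(grey, wall_threshold=128):
    """Convert a greyscale map image (0 = black wall, 255 = white free space)
    into a boolean occupancy grid: True = wall, False = free."""
    grey = np.asarray(grey)
    return grey < wall_threshold

def is_free(grid, x, y):
    """True if pixel (x, y) is inside the map bounds and not a wall."""
    h, w = grid.shape
    return 0 <= x < w and 0 <= y < h and not grid[y, x]
```

anti-aliased edges end up on one side or the other of the threshold, so hard black/white maps behave most predictably.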
runs at 60 fps with:
- 420 lidar beams
- real-time line extraction
- collision detection
- landmark tracking
- full visualization
if it's slow, try reducing LIDAR_BEAMS in config.py.
- robot can get stuck in tight corners (collision detection is conservative)
- path grows forever (could add max length)
- no loop closure yet
- ekf-slam is just a stub for now
things I could add:
- full ekf-slam implementation
- uncertainty visualization (covariance ellipses)
- loop closure detection
- map saving/loading
- multiple robots
- path planning
line extraction algorithm from: "A line segment extraction algorithm using laser data based on seeded region growing" by Xiao Zhang, Guokun Lai, Xianqiang Yang (2018) Read the paper
I implemented algorithms 1-3 from the paper with additional gap detection to prevent bridging across openings.
built with pygame and numpy.
do whatever you want with it. 😁