A collection of Python scripts for capturing point cloud data from Intel RealSense cameras and integrating it into Grasshopper (Rhino 8) and Blender.
This project explores real-time and near–real-time point cloud acquisition using Python-based pipelines built around Open3D, enabling integration with computational design and 3D modeling environments.
Two parallel workflows are provided:
- Grasshopper Python components
- A Blender Python script with a custom UI panel
- Camera: Intel RealSense D435 (tested)
- Other RealSense cameras may work but have not been tested.
- CAD Files
Important:
The geometric accuracy of the generated point clouds has not been formally verified.
The point clouds produced by these scripts appear deformed compared to those generated directly in the RealSense Viewer. This may be related to camera intrinsics handling or projection logic, but the cause has not been conclusively confirmed.
If accuracy is critical, consider the verified Grasshopper plugin below.
If you need a Grasshopper solution that produces point clouds identical to the RealSense Viewer output, use Radii Capture – RealSense, which directly accesses the RealSense SDK DLLs.
Before using any script, you must install the official RealSense SDK 2.0.
The installation includes the RealSense Viewer and the Depth Quality Tool. Use these tools first to verify that your camera is working correctly.
Notes:
- Open3D is the main library used for point cloud acquisition and processing.
- pyrealsense2 is used to retrieve camera intrinsics, which do not appear to be accessible through Open3D alone (used only in the Camera Intrinsics Inspection GH definition).
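As a sketch of how the intrinsics lookup works with pyrealsense2 (assuming the SDK is installed and a camera is attached; the function name and stream defaults below are illustrative, not taken from the repository):

```python
def get_realsense_intrinsics(width=640, height=480, fps=30):
    """Start a short pipeline and read the depth stream's intrinsics.

    Requires pyrealsense2 and a connected RealSense camera; the
    function name and defaults are illustrative.
    """
    import pyrealsense2 as rs  # imported lazily so the module loads without the SDK

    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, width, height, rs.format.z16, fps)
    profile = pipeline.start(config)
    try:
        stream = profile.get_stream(rs.stream.depth).as_video_stream_profile()
        intr = stream.get_intrinsics()
        # fx, fy are focal lengths in pixels; (ppx, ppy) is the principal point
        return intr.width, intr.height, intr.fx, intr.fy, intr.ppx, intr.ppy
    finally:
        pipeline.stop()
```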
Install the required Python libraries listed above using the Grasshopper Python Script Component Package Installer.
Install the required Python libraries listed above using Luigi Pacheco's BlenderPipInstaller.
File: RealSenseCapture_Examples.gh
Lists available RealSense devices and supported configurations.
Library: Open3D
Retrieves camera intrinsics related to calibration and point cloud projection.
Library: pyrealsense2
Computes the point cloud directly inside the Grasshopper Python component.
Points are generated individually and converted into a point cloud object.
This method is slower but fully internal and can be triggered continuously.
Inputs:
- Color Mode: 0 = no color, 1 = RGB, 2 = depth gradient
- Min Depth (meters)
- Max Depth (meters)
- Voxel Size (meters)
Libraries: Open3D, NumPy
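The per-point computation follows the standard pinhole deprojection: for a pixel (u, v) with depth z, X = (u − cx)·z/fx and Y = (v − cy)·z/fy. A minimal NumPy sketch with synthetic data (the intrinsics values here are placeholders, not the D435's calibration):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy, min_depth=0.1, max_depth=2.0):
    """Deproject a depth image (meters) into an N x 3 point array,
    discarding pixels outside [min_depth, max_depth]."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    mask = (z >= min_depth) & (z <= max_depth)
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x[mask], y[mask], z[mask]], axis=1)

# Synthetic 2x2 depth image: the 3.0 m pixel falls outside the depth range.
depth = np.array([[0.5, 1.0], [3.0, 1.5]])
pts = depth_to_points(depth, fx=600.0, fy=600.0, cx=1.0, cy=1.0)
# 3 of the 4 pixels survive the [0.1, 2.0] m clipping
```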
Saves the point cloud as a .ply file in a temporary folder, then loads it using a C# script.
This method is significantly faster and suitable for real-time capture. It can be triggered continuously.
Inputs:
- Color Mode: 0 = no color, 1 = RGB, 2 = depth gradient
- Min Depth (meters)
- Max Depth (meters)
- Voxel Size (meters)
Libraries: Open3D, NumPy, Matplotlib
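The file-based handoff can be approximated without the camera: write the points to a `.ply` file in the system temp folder, then have the C# script read it back. A minimal ASCII PLY writer sketch (in the actual capture pipeline Open3D's `o3d.io.write_point_cloud` handles this step; the helper below is only illustrative):

```python
import os
import tempfile

def write_ascii_ply(points, path):
    """Write an iterable of (x, y, z) tuples as a minimal ASCII PLY file."""
    points = list(points)
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")

# Write two points to the temp folder, where the C# side would pick them up.
path = os.path.join(tempfile.gettempdir(), "capture_example.ply")
write_ascii_ply([(0.0, 0.0, 0.5), (0.1, 0.0, 0.6)], path)
```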
File: PointcloudUtilityScripts.gh
Scripts for cleaning and preprocessing point cloud geometry to prepare it for computational design and robotic fabrication workflows. Features include:
Point Cloud Operations:
- Cull points based on a specific Z value
- Cluster points by distance to isolate key regions
- Merge multiple point clouds
- Record point cloud data
- Remove duplicate points based on proximity
Mesh Operations:
- Mesh point clouds while preserving color
- Clean meshes by deleting abnormal faces
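As an illustration of the proximity-based duplicate removal, points can be snapped to a voxel grid and deduplicated per cell. This is a simplified stand-in for Open3D's `remove_duplicated_points` / `voxel_down_sample`; the tolerance value is arbitrary:

```python
import numpy as np

def dedupe_by_proximity(points, tol=0.005):
    """Keep one point per tol-sized voxel cell (first occurrence wins)."""
    keys = np.floor(points / tol).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]

pts = np.array([[0.0,   0.0, 0.0],
                [0.001, 0.0, 0.0],   # within 5 mm of the first point
                [0.1,   0.0, 0.0]])
deduped = dedupe_by_proximity(pts)
# 2 points remain: the near-duplicate lands in the same voxel cell and is dropped
```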
File: RealSenseCapture.py
Load the script in Blender’s Script Editor.
Access parameters from the RealSense Capture tab in the viewport side panel (open with hotkey N).
- Color Mode: 0 = no color, 1 = RGB, 2 = depth gradient
- Min Depth (meters)
- Max Depth (meters)
- Voxel Size (meters)
Libraries: Open3D, NumPy, Matplotlib
The script currently creates mesh vertices instead of a native point cloud object.
As a result, point color is not displayed, although color data is still present internally.
If you know how to convert this into a proper point cloud object in Blender, contributions are welcome.
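For reference, the mesh-vertices approach amounts to something like the following (the function name is illustrative; `bpy` is imported lazily so the snippet only executes inside Blender):

```python
def points_to_mesh_object(points, name="PointCloud"):
    """Create a Blender mesh object whose vertices are the given points
    (no edges or faces) and link it to the active collection."""
    import bpy  # only available inside Blender

    mesh = bpy.data.meshes.new(name)
    mesh.from_pydata(points, [], [])  # vertices only, no edges/faces
    mesh.update()
    obj = bpy.data.objects.new(name, mesh)
    bpy.context.collection.objects.link(obj)
    return obj
```

Because the result is a plain mesh, vertex colors are not shown in the viewport, which matches the limitation described above.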
Grasshopper.mp4
Blender.mp4
Example_RobotCapture.mp4
Conceptual and technical background: Obtaining Point Cloud from Depth Images with Intel RealSense D-435 Camera.
Contributions and suggestions are welcome.
Please submit a pull request or open an issue to discuss improvements.
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Special thanks to Luigi Pacheco, director of the Interactive Machines Lab at the School of Architecture of Florida Atlantic University, for lending me his RealSense camera.