Gyroflow accurately stabilizes video using motion data. It supports gyroscope data recorded internally by a wide range of modern cameras (e.g., GoPro, Sony, Insta360) as well as gyroscope data from external devices (e.g., Betaflight blackbox logs).
Key features include real-time preview, parameter adjustment, GPU-accelerated processing and rendering, rolling shutter correction, and more, allowing stabilization to be applied directly inside video editing software without transcoding.
1. What is the project?
- Full name: Gyroflow — “Video stabilization using gyroscope data”.
- Function: It reads gyroscope (and sometimes accelerometer) data recorded in the video file or supplied externally, then uses that data to correct camera shake, producing smoother, more stable footage.
- Open source and free: The project is licensed under GPL-3.0. You are free to use, modify, and distribute it under the terms of that license.
2. Main features/advantages
Here are some of its key features (i.e., advantages over software that relies solely on image-based stabilization algorithms):
| Characteristics | Role / Advantage |
|---|---|
| Gyroscope/accelerometer + video | Physical motion data compensates for image shake more accurately than image analysis alone: gyro data records exactly how the camera rotated, so stabilization is more natural and precise. |
| Lens correction | Different lenses distort the image differently, which affects correction quality; Gyroflow includes a database of lens profiles to correct this. |
| Rolling shutter correction | Fast movement with a CMOS-sensor camera causes rolling shutter skew/wobble artifacts, which Gyroflow can correct. |
| Supports a wide range of hardware/formats | Supports many brands/models of action cameras, drones, RAW video formats, external gyroscope logs, etc. |
| User interface + live preview + GPU acceleration | User-friendly interface, with real-time preview, hardware-accelerated rendering. This allows you to see the effect quickly when adjusting the parameters. |
| Plugin support | It can run as a plug-in inside video editing software (DaVinci Resolve, Adobe Premiere Pro/After Effects, Final Cut Pro, etc.), so stabilization does not have to be rendered before importing footage into the editor. |
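The core idea behind gyro-based stabilization in the table above — turning raw angular-velocity samples into per-sample camera orientations — can be sketched with simple quaternion integration. This is a minimal illustration with hypothetical function names, not Gyroflow's actual implementation:

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def integrate_gyro(samples, dt):
    """Integrate angular-velocity samples (rad/s, body frame, sampled at
    interval dt) into an orientation quaternion per sample."""
    q = (1.0, 0.0, 0.0, 0.0)  # identity orientation
    orientations = [q]
    for wx, wy, wz in samples:
        mag = math.sqrt(wx*wx + wy*wy + wz*wz)
        if mag > 1e-12:
            half = 0.5 * mag * dt
            s = math.sin(half) / mag
            # incremental rotation over this sample interval
            q = quat_mul(q, (math.cos(half), wx*s, wy*s, wz*s))
        orientations.append(q)
    return orientations
```

With the camera path expressed as orientations like these, stabilization amounts to applying the inverse of a smoothed orientation to each frame.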
3. Architecture/implementation
Here’s how it was designed and what technical points are worth noting:
- Core Library (Gyroflow Core): Handles all stabilization algorithms + pixel processing logic. GUI/front-end interfaces, plugins, decoding and encoding, etc. are built on top of this core.
- Modular design: The core library does not assume the specific way the video source, UI, decoder/encoder is made. That is, you can use it on different platforms and different frontends.
- Multithreading + hardware acceleration: Many computations (image processing, synchronization, filtering, etc.) are multithreaded, and GPUs are used to speed up rendering/output.
- Synchronization: Video frames and gyroscope data must be synchronized (including timestamps, offsets, etc.) to compensate for motion correctly. This synchronization process is important.
- Optional optical-flow assistance: Optical-flow algorithms can supplement stabilization when gyroscope/accelerometer data is insufficient or when finer correction is needed.
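The synchronization step above can be illustrated with a toy offset search: compare a per-frame image-motion magnitude signal against the gyro motion magnitude signal at many candidate offsets and keep the best match. This is a simplified sketch (integer offsets, mean dot-product score), not Gyroflow's actual synchronization code:

```python
def best_offset(gyro, video, max_shift):
    """Find the integer sample offset that best aligns two motion-magnitude
    signals by maximizing their mean dot-product correlation."""
    best, best_score = 0, float('-inf')
    for shift in range(-max_shift, max_shift + 1):
        score, n = 0.0, 0
        for i, v in enumerate(video):
            j = i + shift
            if 0 <= j < len(gyro):
                score += v * gyro[j]
                n += 1
        if n and score / n > best_score:
            best_score, best = score / n, shift
    return best
```

Real synchronization works on continuous timestamps and sub-sample offsets, but the principle is the same: the offset that makes the two motion signals agree is the one used to align gyro data with video frames.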
4. Limitations/challenges
While Gyroflow is strong, it’s not a panacea, and here are some issues you might encounter or what to look out for:
- Gyroscope data required: Without gyroscope data embedded in the video, or a proper external log, the full functionality cannot be used; only estimation-based or auxiliary methods remain.
- Data synchronization problems: Even with gyroscope data, out-of-sync timestamps, large delays, or inaccurate recording lead to incorrect correction, which may cause the image to "drift" or continue to shake.
- Cropping / zooming: In order to stabilize the frame, it is usually necessary to crop the edges of the frame or zoom in to compensate for the movement, which means that the effective frame will shrink/lose information at the edges of the frame.
- Processing performance requirements: High-resolution, high-frame-rate video combined with optical flow and heavy lens distortion correction demands significant CPU/GPU performance; real-time preview or batch processing may require better hardware.
- Residual distortion: If the lens profile is inaccurate, or the lens distorts strongly (e.g., a fisheye wide-angle lens), correction quality is limited and can sometimes introduce unnatural warping.
- Learning curve: While the interface and documentation try to be easy to use, there is some trial and error with adjusting parameters (such as lens correction strength, smoothing factor, horizon lock, etc.) to make the picture stable and natural.
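The crop/zoom trade-off above can be quantified for the simplest case, a pure in-plane roll correction: the viewport must be zoomed so that a rectangle of the original aspect ratio still fits inside the rotated frame. A minimal sketch (roll only, ignoring lens distortion and translation; not Gyroflow's actual cropping logic):

```python
import math

def required_zoom(width, height, max_roll_deg):
    """Zoom factor needed so that after counter-rotating a width x height
    frame by up to max_roll_deg (pure roll), no empty borders appear in a
    viewport of the same aspect ratio."""
    a = math.radians(abs(max_roll_deg))
    w, h = float(width), float(height)
    # scale of the largest same-aspect rectangle inscribed in the rotated frame
    s = min(h / (h * math.cos(a) + w * math.sin(a)),
            w / (w * math.cos(a) + h * math.sin(a)))
    return 1.0 / s
```

For example, correcting up to 45 degrees of roll on a square frame requires roughly 1.41x zoom, which is why large corrections visibly shrink the usable field of view.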
5. Typical usage process
A common process for using Gyroflow might look like this:
- Shoot a video: Shoot with a camera that supports gyroscope data logging. Make sure logging is enabled and, for action cameras or drones, disable in-camera electronic stabilization to avoid conflicts from overlapping stabilization.
- Get gyroscope logs/data (if external): If the camera has no usable built-in gyroscope data, record it externally, e.g., from a drone flight controller's blackbox or a dedicated motion logger.
- Import video + gyro data into Gyroflow: Open the video in Gyroflow and import the matching gyroscope data. If the camera/lens is supported, a lens profile may be loaded automatically or selected from the database.
- Synchronize video frames + motion data: Adjust the time offset so the motion data aligns with the video frames; this is a critical step.
- Adjust the stabilization parameters: For example smoothness, horizon lock, lens distortion correction strength, rolling shutter correction, and per-axis rotation limits (pitch, yaw, roll). Optical flow can also assist here.
- Preview the result in real time: Check in the preview window whether the image is smooth and natural, whether cropping is excessive, and whether residual jitter or distortion remains.
- Export: Export the video when satisfied, or process directly inside the editor if using the plug-in. Choose an appropriate format/resolution/bitrate when exporting.
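The "smoothness" parameter in the workflow above can be pictured as a low-pass filter on the camera's orientation path: the stabilized output follows a smoothed version of the measured path, and the per-frame correction is the difference between the two. A toy one-axis sketch using an exponential moving average (an illustration only, not Gyroflow's actual smoothing algorithm):

```python
def smooth_corrections(angles, smoothness):
    """One-axis toy model: smooth the measured camera-angle path with an
    exponential moving average, then return the per-frame correction
    (smoothed angle minus measured angle) to apply to each frame.
    smoothness in [0, 1): higher means a smoother virtual camera path."""
    smoothed = angles[0]
    corrections = []
    for a in angles:
        smoothed = smoothness * smoothed + (1.0 - smoothness) * a
        corrections.append(smoothed - a)
    return corrections
```

A perfectly steady shot needs no correction, while a sudden jolt produces a correction that opposes it; higher smoothness absorbs more of the jolt but also requires more crop/zoom headroom, which is the trade-off the parameter exposes.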
6. Usage scenarios
- FPV aerial photography/drone photography: Flight introduces heavy shake, and Gyroflow can make aerial videos smoother and more "cinematic".
- Action/sports cameras: Running, cycling, and similar activities produce shaky footage, and gyroscope-assisted stabilization helps considerably.
- Photographers/creators who want to avoid purely image-based algorithms over-warping or distorting footage in post; gyroscope data provides a physical reference.
- Scenes that require high video quality, such as documentaries, video art, natural landscape shooting, etc.
GitHub: https://github.com/gyroflow/gyroflow
YouTube: