The fusion2sphere repository has these pre-generated for each camera mode on the GoPro Fusion camera. Note: I've also included files for photo mode that can be used if you want to turn GoPro Fusion fisheye images into a video, as described in this post. You also need two blend mask files, one for the front and one for the back. To generate them you need to know the 1/4 and 3/4 positions, horizontally, of the output equirectangular video. I then create a centered rectangular selection of the blend width I want, which must be less than the blend width used in fusion2sphere (currently 5 degrees).
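To make the mask step concrete, here is a minimal Python sketch that writes a front blend mask as a binary PGM, white over the front hemisphere with a linear ramp around the 1/4 and 3/4 seam positions. The output size (3840×1920) and the 4-degree ramp are my own assumptions, not values from fusion2sphere; the only constraint from the text is that the ramp stays under the 5-degree blend width.

```python
# Hypothetical sketch: generate a front blend mask as a binary PGM (P5).
# Assumed values (not from the post): 3840x1920 output, 4-degree ramp.

WIDTH, HEIGHT = 3840, 1920               # assumed equirectangular output size
BLEND_DEG = 4.0                          # assumed ramp width, < 5 degrees
seams = [WIDTH // 4, 3 * WIDTH // 4]     # 1/4 and 3/4 seam positions
half = int(BLEND_DEG / 360.0 * WIDTH / 2)  # ramp half-width in pixels

def front_weight(x):
    """255 between the seams (front hemisphere), 0 outside,
    with a linear ramp of +/- `half` pixels around each seam."""
    d = min(abs(x - seams[0]), abs(x - seams[1]))  # distance to nearest seam
    inside = seams[0] <= x < seams[1]
    if d >= half:
        return 255 if inside else 0
    t = (d / half + 1) / 2 if inside else (1 - d / half) / 2
    return round(255 * t)

row = bytes(front_weight(x) for x in range(WIDTH))  # mask is constant per row
with open("frontmask.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n255\n" % (WIDTH, HEIGHT))
    f.write(row * HEIGHT)
```

The back mask is simply the inverse (255 minus each value), so the two weights sum to full opacity everywhere.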
Method 2: Using Fusion-specific ffmpeg filters (recommended). It is possible to use ffmpeg filters to achieve the blend. To do this you need pgm (Portable Gray Map) map files for each fisheye. These files tell ffmpeg how to map the fisheye onto an equirectangular projection.
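To show the shape of those map files, here is a hedged Python sketch that generates xmap/ymap PGM files for ffmpeg's remap filter: for every output (equirectangular) pixel, the maps store which source (fisheye) pixel to sample. It assumes an idealized equidistant fisheye with a 190-degree field of view and made-up frame sizes; the real Fusion maps are built from calibrated lens parameters, so treat this purely as an illustration of the data format, not the fusion2sphere math.

```python
import math
import struct

# Hypothetical sketch (not the fusion2sphere implementation): build 16-bit
# xmap/ymap PGM files for ffmpeg's remap filter, assuming an ideal
# equidistant fisheye with a 190-degree FOV centred in a square frame.

W, H = 1024, 512            # assumed equirectangular output size
SRC = 512                   # assumed square fisheye source size
FOV = math.radians(190.0)   # assumed fisheye field of view

def fisheye_xy(lon, lat):
    """Map a view direction (lon, lat) to fisheye pixel coordinates,
    camera looking down +X; returns None if outside the FOV."""
    vx = math.cos(lat) * math.cos(lon)
    vy = math.cos(lat) * math.sin(lon)
    vz = math.sin(lat)
    theta = math.acos(max(-1.0, min(1.0, vx)))   # angle off optical axis
    if theta > FOV / 2:
        return None
    r = theta / (FOV / 2) * (SRC / 2)            # equidistant projection
    phi = math.atan2(vz, vy)
    return SRC / 2 + r * math.cos(phi), SRC / 2 + r * math.sin(phi)

def write_pgm16(name, vals):
    """Binary PGM with 16-bit (big-endian) samples, as remap expects."""
    with open(name, "wb") as f:
        f.write(b"P5\n%d %d\n65535\n" % (W, H))
        f.write(b"".join(struct.pack(">H", v) for v in vals))

xs, ys = [], []
for j in range(H):
    lat = math.pi / 2 - math.pi * (j + 0.5) / H
    for i in range(W):
        lon = 2 * math.pi * (i + 0.5) / W - math.pi
        p = fisheye_xy(lon, lat)
        # out-of-FOV pixels point outside the source; remap paints them black
        xs.append(min(65535, int(p[0])) if p else 65535)
        ys.append(min(65535, int(p[1])) if p else 65535)

write_pgm16("xmap.pgm", xs)
write_pgm16("ymap.pgm", ys)
# One possible invocation (file names are placeholders):
#   ffmpeg -i front.mp4 -i xmap.pgm -i ymap.pgm \
#          -lavfi "[0:v][1:v][2:v]remap" front_equirect.mp4
```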
Looking closely, you can see the stitch lines as well as some duplicate pixels visible in both outputs (this one taken from the first stitched video). Some of this is due to overlapping pixels and slight variations in the field of view. It is at these points that we need to provide a blend. The blend zone is around 5 degrees (the full photo width is 360 degrees); you can see why it is 5 degrees in our previous series on converting dual fisheye frames from the Fusion here.
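For a sense of scale, this is what the 5-degree blend zone works out to in output pixels, assuming a 3840-pixel-wide equirectangular frame (a figure of my own choosing, not stated in the post):

```python
# Quick arithmetic: seam positions and 5-degree blend width in pixels,
# for an assumed 3840-pixel-wide equirectangular output.
WIDTH = 3840
seam_left, seam_right = WIDTH // 4, 3 * WIDTH // 4  # stitch-line positions
blend_px = round(5 / 360 * WIDTH)                   # width of the blend zone
print(seam_left, seam_right, blend_px)              # → 960 2880 53
```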
Re: Re: Can we correct for a fisheye lens? - Jul 3 2:21PM UTC
The primary advantage of the GoPro is its small size and high resolution. For physics experiments this makes it possible to do some cool things with relative motion by placing the camera on the moving object or on some other reference frame. The GoPro also comes with an array of mounting hardware to easily stick it anywhere. However, the wide angle will make capturing some experiments more difficult. Some people have suggested using video editing software to take out the fisheye effect.