Encoding FB360 video without FB360 Encoder

There is an issue with the FB360 Encoder installer. Many people have been asking whether there is any workaround to encode FB360 2V1H spatial audio content without using this app. I think I found a way to do it using FFmpeg.

If you already have your equirectangular video, it’s time to prepare the audio in the TBE 8-channel format. To do so, you can mix your audio in 2nd-order Ambisonics and convert it into the TBE format using the formula proposed by Angelo Farina, which is:

TBE(1) = 0.488603 * Ambix(0); W
TBE(2) = -0.488603 * Ambix(1); Y
TBE(3) = 0.488603 * Ambix(3); X
TBE(4) = 0.488603 * Ambix(2); Z
TBE(5) = -0.630783 * Ambix(8); U
TBE(6) = -0.630783 * Ambix(4); V
TBE(7) = -0.630783 * Ambix(5); T
TBE(8) = 0.630783 * Ambix(7); S

Note that the source above is in the AmbiX format. You can do the conversion in Reaper with a set of JSFX plugins written by Bruce Wiggins (WigWare), available here >>>.
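The conversion above is just a per-sample linear combination of channels, so it is easy to sanity-check in code. Here is a minimal sketch in plain Python (the function name and the list-of-channels representation are my own; the coefficients are Farina’s from the table above):

```python
# Convert one sample of 2nd-order AmbiX (ACN channel order, channels 0-8)
# into the 8-channel TBE layout, using Angelo Farina's coefficients.
# Note that AmbiX channel 6 (R) is not used by TBE at all.
def ambix_to_tbe(a):
    return [
         0.488603 * a[0],  # TBE(1) <- W
        -0.488603 * a[1],  # TBE(2) <- Y
         0.488603 * a[3],  # TBE(3) <- X
         0.488603 * a[2],  # TBE(4) <- Z
        -0.630783 * a[8],  # TBE(5) <- U
        -0.630783 * a[4],  # TBE(6) <- V
        -0.630783 * a[5],  # TBE(7) <- T
         0.630783 * a[7],  # TBE(8) <- S
    ]

# Example: an omnidirectional (W-only) signal lands only in TBE(1).
tbe = ambix_to_tbe([1.0, 0, 0, 0, 0, 0, 0, 0, 0])
```

In practice you would apply this function sample by sample (or as a matrix multiply over whole buffers), but the Reaper route above is the more convenient workflow.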

If you use an MP4 container, you will need AAC audio. The TBE format uses three audio streams: the first two carry four channels each (1–4 and 5–8), and the last carries two channels of head-locked stereo.

For simplicity, I use three separate WAV files: two with four channels each and one with two channels.
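If your DAW renders the TBE mix as a single 8-channel WAV instead, you can split it into the two 4-channel files with a short script. A sketch using only Python’s standard library (the helper name and the file names are my own; it assumes 16-bit PCM):

```python
import wave

def split_wav_channels(src_path, dst_path, channels):
    """Copy the given zero-based channels of a 16-bit PCM WAV into a new file."""
    with wave.open(src_path, "rb") as src:
        assert src.getsampwidth() == 2, "this sketch assumes 16-bit samples"
        nch, rate = src.getnchannels(), src.getframerate()
        frames = src.readframes(src.getnframes())
    out = bytearray()
    # Interleaved frames: pick the requested 2-byte samples out of each frame.
    for off in range(0, len(frames), nch * 2):
        for ch in channels:
            out += frames[off + ch * 2 : off + ch * 2 + 2]
    with wave.open(dst_path, "wb") as dst:
        dst.setnchannels(len(channels))
        dst.setsampwidth(2)
        dst.setframerate(rate)
        dst.writeframes(bytes(out))

# Split a hypothetical 8-channel TBE render into the two 4-channel files:
# split_wav_channels("tbe_8ch.wav", "1_4_audio_INPUT.wav", [0, 1, 2, 3])
# split_wav_channels("tbe_8ch.wav", "5_8_audio_INPUT.wav", [4, 5, 6, 7])
```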

To mux the video and encode the audio files, I use this FFmpeg command:

ffmpeg -i INPUT_VIDEO.mp4 -i 1_4_audio_INPUT.wav -i 5_8_audio_INPUT.wav -i stereo_audio_INPUT.wav -c:v copy -c:a aac -b:a 128k -map 0:v -map 1 -map 2 -map 3 OUTPUT_VIDEO.mp4

There is a problem with the quality of the native FFmpeg AAC encoder, even at higher bitrates (the -b:a 128k option sets the bitrate; you can raise it to a higher value such as 384k). It is especially noticeable on high-frequency content such as hi-hats. For better results I used the FDK-AAC codec by Fraunhofer IIS. It is possible to build a custom version of FFmpeg with FDK-AAC support, but it was faster and simpler for me to use the aac-enc tool, which is available on macOS via Homebrew. The command to encode the WAVs into AACs is:

aac-enc -r 800000 -t 2 -a 1 input.wav output.aac

where 800000 is the maximum available stream bitrate, i.e. 800 kbps. With these three AAC files in hand, I ran FFmpeg again:

ffmpeg -i INPUT_VIDEO.mp4 -i 1_4_audio_INPUT.aac -i 5_8_audio_INPUT.aac -i stereo_audio_INPUT.aac -c:v copy -c:a copy -map 0:v -map 1 -map 2 -map 3 OUTPUT_VIDEO.mp4
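The encode-then-mux steps can be scripted. A minimal sketch in Python (the helper names are my own; it assumes aac-enc and ffmpeg are on your PATH and simply builds and runs the same commands shown above):

```python
import subprocess

# Build the aac-enc command used above
# (-r bitrate, -t 2 = AAC-LC, -a 1 = afterburner on).
def aacenc_cmd(src, dst, bitrate=800000):
    return ["aac-enc", "-r", str(bitrate), "-t", "2", "-a", "1", src, dst]

# Build the final mux command: copy the video stream, copy the three
# pre-encoded AAC streams in TBE order (1-4, 5-8, head-locked stereo).
def mux_cmd(video, a14, a58, stereo, output):
    return ["ffmpeg", "-i", video, "-i", a14, "-i", a58, "-i", stereo,
            "-c:v", "copy", "-c:a", "copy",
            "-map", "0:v", "-map", "1", "-map", "2", "-map", "3", output]

def encode_and_mux(video, wavs, output):
    aacs = [w.rsplit(".", 1)[0] + ".aac" for w in wavs]
    for w, a in zip(wavs, aacs):
        subprocess.run(aacenc_cmd(w, a), check=True)
    subprocess.run(mux_cmd(video, *aacs, output), check=True)
```

Calling encode_and_mux("INPUT_VIDEO.mp4", ["1_4_audio_INPUT.wav", "5_8_audio_INPUT.wav", "stereo_audio_INPUT.wav"], "OUTPUT_VIDEO.mp4") reproduces the workflow above in one step.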

The file plays nicely through Oculus TV on Quest 2 (the only headset/player I have tested so far).

The only caveat is that there is no metadata for the player to recognize the video format, so you need to set it manually on first play; writing the metadata automatically is still a work in progress. I have also had no success yet muxing an MKV file with Opus audio streams: the result is not recognized as spatial audio by the player. If you make progress on either of these issues, let me know so I can update this tutorial.

Good luck with your encoding!