Android NDK: How to integrate pre-built libraries in the case of FFmpeg.

The Android SDK provides an API that allows interacting with the Application and Application Framework layers of the Android platform architecture. This API is enough for the most common cases a developer faces every day. However, sometimes we need more low-level functionality, which can be provided by C/C++ libraries. Thanks to the Android NDK, we can use JNI to invoke native code from Java/Kotlin and vice versa. So, we can reuse existing C/C++ code in the form of pre-built libraries in our Android apps.
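For example, a minimal JNI bridge between Kotlin/Java and C++ looks like this (the class and function names here are illustrative, not from the project described below):

#include <jni.h>

// Matches a Kotlin declaration such as:
//   external fun stringFromNative(): String   // in class com.example.app.MainActivity
extern "C" JNIEXPORT jstring JNICALL
Java_com_example_app_MainActivity_stringFromNative(JNIEnv *env, jobject /* this */) {
    // Build a Java string from a C string and return it across the JNI boundary.
    return env->NewStringUTF("Hello from C++");
}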
This article covers the following topics:
- FFmpeg software libraries
- Building C/C++ libraries, in the case of FFmpeg
- Integrating pre-built C/C++ libraries into an Android project using Gradle and CMake
- Creating a C/C++ wrapper to make the pre-built libraries easier to use
The material requires some basic knowledge about the NDK, which you can get from my previous articles:
FFmpeg software libraries
In general, to process video, we have to be familiar with the remuxing and transcoding processes.
What is remuxing?

Remuxing is a process that consists of two others: demuxing and muxing, which are short for demultiplexing and multiplexing. Multiplexing means combining different types of data (for instance, audio and video) into a container. Demultiplexing means splitting the container back into separate video and audio streams.
What is a container?
A container is a specification that describes how different elements of data and metadata coexist in a file. In simple words, *.mp4, *.avi, or *.flv files are containers, because they hold different types of data. The extension of the file is the name of the container.
Containers hold encoded data. The most popular codec for video is H.264; for audio, it is MP3. So, the files with the *.mp3 extension that we use every day contain audio data encoded with the MP3 codec. I explain below why we need to encode data.
What is raw audio data?
After audio recording, we have an analog signal. Its value varies continuously, and the signal exists at every point in time. So we have to convert the analog signal into a series of numbers representing its amplitude. This process is known as analog-to-digital conversion, and its result is audio in pulse-code modulation (PCM) format. For example, CD-quality audio is sampled 44,100 times per second with 16 bits per sample.
What is raw video data?
Raw video data is a sequence of pictures. It is called “raw” because the frames contain unprocessed (or minimally processed) data from an image sensor. You can find more info about it below.
What is transcoding?

Transcoding is a process that includes four steps: demuxing, decoding, encoding, and muxing.
What is video processing?
Video processing is the manipulation of encoded packets or decoded frames/samples; which of the two you work with depends on your task. For instance, if all you need is to trim a video, you only have to remux it and edit FFmpeg’s AVPackets after demuxing. But if you want to change the video resolution, you have to edit the video frames that you retrieve after decoding.
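As a rough illustration of the packet-level approach (a sketch under my own assumptions, not the article’s code): when trimming, you read packets, drop the ones outside the desired range, shift timestamps, and write the rest without re-encoding:

extern "C" {
#include <libavformat/avformat.h>
}

// Sketch: copy only the packets between startSec and endSec, without re-encoding.
// Assumes the output streams use the same time base as the input ones and the
// output header has already been written.
static void copyRange(AVFormatContext *inCtx, AVFormatContext *outCtx,
                      double startSec, double endSec) {
    AVPacket pkt;
    while (av_read_frame(inCtx, &pkt) >= 0) {
        AVRational tb = inCtx->streams[pkt.stream_index]->time_base;
        // Convert the cut points (seconds) into this stream's time base.
        int64_t start = av_rescale_q((int64_t)(startSec * AV_TIME_BASE), AV_TIME_BASE_Q, tb);
        int64_t end = av_rescale_q((int64_t)(endSec * AV_TIME_BASE), AV_TIME_BASE_Q, tb);
        if (pkt.pts != AV_NOPTS_VALUE && pkt.pts >= start && pkt.pts <= end) {
            pkt.pts -= start; // shift timestamps so the output starts at zero
            pkt.dts -= start;
            av_interleaved_write_frame(outCtx, &pkt); // mux without re-encoding
        }
        av_packet_unref(&pkt);
    }
}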
FFmpeg structure
FFmpeg contains a set of libraries:
- libavcodec is a library of codecs for decoding and encoding multimedia.
- libavformat is a library containing demuxers and muxers for audio/video container formats.
- libavfilter is a library of filters that provides the media filtering layer.
- libavdevice is a library for muxing/demuxing from special devices; it complements libavformat.
- libavutil is a helper library containing routines for different parts of FFmpeg.
- libpostproc is a library containing video postprocessing routines.
- libswresample is a library for audio resampling that can handle different sample formats, sample rates, numbers of channels, and channel layouts.
- libswscale is a library containing video image scaling routines and provides a fast and modular scaling interface.
So, if you want to do remuxing, you have to use libavformat, which in turn depends on other libraries:

It is important to know which libraries you need, to avoid including unneeded dependencies in your project and to decrease the size of the APK.
Compile FFmpeg for Android
Generating libraries from the FFmpeg source code is not an easy task; I understood this fact when I faced the challenge myself. Fortunately, I found a handy tutorial. In simple words, to compile FFmpeg for Android we have to do two steps: set up a cross-compilation toolchain from the NDK, and then run FFmpeg’s configure and make for each target ABI.
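Very roughly, the per-ABI build step looks like the sketch below (paths, versions, and options here are illustrative, not the tutorial’s exact script):

# Cross-compile FFmpeg for one ABI using an NDK toolchain (illustrative paths).
NDK=/path/to/android-ndk
SYSROOT=$NDK/platforms/android-21/arch-arm
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64

./configure \
    --prefix=./android/armeabi-v7a \
    --target-os=linux \
    --arch=arm \
    --enable-cross-compile \
    --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
    --sysroot=$SYSROOT \
    --enable-shared \
    --disable-static \
    --disable-doc \
    --disable-programs
make && make install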
After these steps, you will have a result as follows:

What is H.264 and why do we need it?
H.264 is a video compression standard. It is the most popular codec for video on Android. Raw video is a sequence of frames, and each frame is a matrix that holds info about every pixel. Each pixel takes 3 bytes, because RGB has 3 color channels of 1 byte each. So, the size of one 1080p frame is 1920 x 1080 x 3 = ~5.9 megabytes. Moreover, if we want to show 24 frames per second (FPS): 5.9 MB * 24 = ~141.6 MB for one second of video. An encoded frame, in contrast, can take about several kilobytes instead of 5.9 megabytes. So, we need to compress raw video to decrease its size. x264 is an encoder that does this according to the H.264 standard. The result of such video encoding is a file with the *.h264 extension.
Application Binary Interfaces (ABIs)
As you can see, the ffmpeg-3.3.2 folder now contains an android folder, which, in turn, contains folders for different ABIs. An ABI defines the CPU instruction set(s) that the machine code should use. This build covers only ARM CPU architectures, but that is not a big problem because the majority of smartphones use ARM. Each ABI folder contains two folders that interest us: include and lib. include contains the header files, and lib contains the shared libraries.
C/C++ libraries
There are two types of libraries: shared and static.
Shared libraries have the *.so extension. The code stays in the library: a program using a shared library only refers, at run time, to the code that it uses in the shared library.
Static libraries have the *.a extension. All code is linked directly into the program at compile time: a program using a static library takes copies of the code that it uses from the static library and makes them part of the program.
Integrating pre-built C/C++ libraries into an Android project
To include FFmpeg’s headers and *.so files in the project, I use the following folder structure:
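Roughly, each ABI gets its own folder with headers and shared libraries (a sketch; the exact layout may differ):

src/main/cpp/ffmpeg
├── armeabi
│   ├── include
│   └── lib
├── armeabi-v7a
│   ├── include
│   └── lib
└── arm64-v8a
    ├── include
    └── lib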

However, you can organize it in another way. This project only does remuxing, so I added only the libraries it needs.
To link the *.so files to our project, we have to configure Gradle as follows:
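A sketch of such a configuration, hooking CMake into the module’s build.gradle (the exact arguments in the project may differ):

android {
    defaultConfig {
        externalNativeBuild {
            cmake {
                arguments "-DANDROID_STL=c++_shared"
            }
        }
    }
    externalNativeBuild {
        cmake {
            path "CMakeLists.txt"
        }
    }
}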
We also have to add the following lines to CMakeLists.txt:
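Judging by the explanation below, the lines look roughly like this (shown for avformat; avcodec and avutil follow the same pattern, and the exact file names are my assumption):

# Root folder that holds the per-ABI pre-built FFmpeg libraries and headers.
set(ffmpeg_DIR ${CMAKE_SOURCE_DIR}/src/main/cpp/ffmpeg)

# Declare the pre-built shared library so CMake can link against it.
add_library(avformat-57 SHARED IMPORTED)
set_target_properties(avformat-57 PROPERTIES IMPORTED_LOCATION
        ${ffmpeg_DIR}/${ANDROID_ABI}/lib/libavformat-57.so)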
I use the set function to define a new variable named ffmpeg_DIR with the value ${CMAKE_SOURCE_DIR}/src/main/cpp/ffmpeg. CMAKE_SOURCE_DIR is a built-in CMake variable that contains the path to the directory with the CMakeLists.txt file. ANDROID_ABI is another CMake variable that contains the name of the current ABI. It changes during the build and can take the following values: armeabi, armeabi-v7a, arm64-v8a, x86, x86_64, mips, mips64. We can restrict the build to certain ABIs using Gradle:
ndk {
abiFilters "armeabi", "armeabi-v7a", "arm64-v8a"
}
Now the build process runs only for these ABIs.
The add_library function adds a library to our project. With the SHARED keyword we specify that we use a *.so file, and with IMPORTED we indicate that the library file is located outside the project, which is why we use set_target_properties to define the path to the library file.
To add FFmpeg’s headers, we can use this function:
include_directories(${ffmpeg_DIR}/lib/${ANDROID_ABI}/include)
After this, we can link the pre-built libraries avutil-55, avformat-57, and avcodec-57 to our own library:
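A sketch of this step, assuming the wrapper target built from vpl.cpp is called vpl (the target name is my assumption):

# Our own native library: the C/C++ wrapper around FFmpeg.
add_library(vpl SHARED src/main/cpp/vpl.cpp)

# Link the imported FFmpeg libraries (and Android's log library) to the wrapper.
target_link_libraries(vpl
        avformat-57
        avcodec-57
        avutil-55
        log)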
Creating a C/C++ wrapper to make the pre-built libraries easier to use
So now we are ready to use FFmpeg’s libraries in our project. The structure of the cpp folder’s subfolders looks as follows:
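Roughly (assuming the wrapper file is vpl.cpp, as referenced below):

cpp
├── ffmpeg
│   ├── armeabi
│   ├── armeabi-v7a
│   └── arm64-v8a
└── vpl.cpp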

We have already linked FFmpeg’s libraries to our vpl.cpp wrapper using the target_link_libraries function in CMakeLists.txt. To use functions from the libraries, we have to include their headers in our *.cpp file:
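For the remuxing-related libraries linked above, the includes look like this:

extern "C" {
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libavutil/avutil.h>
}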
Please pay attention to the fact that we have to use the extern "C" {} block: FFmpeg is a C library, and without this block the C++ compiler would mangle the names and linking would fail.
Now we can use these samples, or examine the ffmpeg.c file, the source code of FFmpeg’s command-line tool, to create new functions.
In this way, I created several functions that can do demuxing, rotate the display matrix, trim video, and merge audio and video streams from different files. Here is a sample of a function that returns the duration:
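A minimal sketch of such a function (the JNI class and method names are illustrative, and FFmpeg 3.x still requires av_register_all()):

#include <jni.h>
extern "C" {
#include <libavformat/avformat.h>
}

// Returns the duration of a media file in milliseconds, or -1 on error.
extern "C" JNIEXPORT jlong JNICALL
Java_com_example_vpl_VideoProcessor_getDuration(JNIEnv *env, jobject /* this */,
                                                jstring jPath) {
    const char *path = env->GetStringUTFChars(jPath, nullptr);
    AVFormatContext *fmtCtx = nullptr;
    jlong durationMs = -1;

    av_register_all(); // required once in FFmpeg 3.x before using the API

    // Open the file and read its header (the demuxing step).
    if (avformat_open_input(&fmtCtx, path, nullptr, nullptr) == 0) {
        // Probe the streams so that fmtCtx->duration gets filled in.
        if (avformat_find_stream_info(fmtCtx, nullptr) >= 0 &&
            fmtCtx->duration != AV_NOPTS_VALUE) {
            // fmtCtx->duration is expressed in AV_TIME_BASE units (microseconds).
            durationMs = fmtCtx->duration / (AV_TIME_BASE / 1000);
        }
        avformat_close_input(&fmtCtx);
    }

    env->ReleaseStringUTFChars(jPath, path);
    return durationMs;
}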
I created a small library with these functions, so, if you need it, you can add it to your project.
You can use it as follows:
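Assuming the hypothetical Java class from the sketch above, usage could look like this (the library name vpl matches vpl.cpp and is my assumption):

public class VideoProcessor {
    static {
        System.loadLibrary("vpl"); // the wrapper library built from vpl.cpp
    }

    // Declared in Java, implemented in the C++ wrapper via JNI.
    public native long getDuration(String path);
}

// Somewhere in your code:
long durationMs = new VideoProcessor().getDuration(file.getAbsolutePath());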
I am going to extend this library, and I will be glad to merge your pull requests.
Conclusions
FFmpeg is a handy library for video and audio processing, and a lot of open-source projects use it. You can check out my sample to see that FFmpeg is a fast and powerful tool. FFmpeg is a low-level library, so you need basic knowledge of video and audio processing to use it. Also, keep in mind that transcoding takes a lot of time, so don’t transcode when it is unnecessary.