GPUImage source code analysis: the heavyweight of beauty-filter technology

When it comes to GPU-based image processing and real-time filters, the famous GPUImage naturally comes to mind. The project provides a great deal of convenience for downstream development, with all the basic image-processing tools readily available. Beyond that, studying GPUImage's project structure can teach us a lot.

GPUImage project structure

GPUImage's project structure is actually very simple, and the Android version is even simpler:

  • A pile of filters (shaders plus the code that sets their parameters)
  • FilterGroup (applies multiple filters to the same image using FBOs)
  • An EGL management class (mainly used for off-screen rendering)

Although the main value of GPUImage lies in that pile of filters, we will mainly analyze the latter two. They form the framework of GPUImage, while the filters are like plug-ins: plug one in whenever you want :D. Following the same pattern, we can also write our own custom filters.

Why off-screen rendering?

The main purpose of off-screen rendering is to process data in the background. Anyone who has written a camera app knows that if SurfaceView is used for preview, the camera data has to be displayed; to avoid displaying it you would have to shrink the SurfaceView to a tiny size, which is both troublesome and wasteful. Since Android 3.0 we have SurfaceTexture and GLSurfaceView, and later TextureView, which let you process camera data without necessarily displaying it, but a display-and-draw step is still involved. In other words, TextureView and GLSurfaceView are not flexible enough to meet all our requirements.

What if we just want to process a picture on the GPU without displaying it at all?

Here is an example. Take a look at the filter-selection interface of Camera360 Lite: the filter preview thumbnails appear as soon as you open the filter picker, even without a network connection. Are they bundled in the APK? And why do they all show the same person? Yet after searching the APK, only the original image can be found. Where did all the differently colored versions come from? It turns out that these filter previews are generated on the user's phone the first time the app runs (you can check Camera360's data folder yourself). This has many advantages, such as greatly reducing the APK size and letting the same code serve different purposes. Of course, that is just one use of off-screen rendering.

Previously, when we used GLSurfaceView, it configured the EGL environment for us. Without GLSurfaceView, we have to manage EGL ourselves. GPUImage does exactly that: it borrows from GLSurfaceView's implementation and configures the OpenGL environment on its own.

Later, when we analyze the code along the lines of GLSurfaceView, we will cover how to do off-screen rendering (after all, environment configuration is fairly routine).

Filter groups and framebuffer objects (FBO)

GPUImage's filter group is arguably the best way to reuse these filters. With the help of framebuffer objects (FBO), we can apply different combinations of filters to an image to get the desired result.

Another example: suppose I wrote a grayscale filter that turns a picture black-and-white. The fragment shader is as follows:

precision mediump float;
varying vec2 vTextureCoord;
uniform sampler2D sTexture;
void main() {
    vec3 centralColor = texture2D(sTexture, vTextureCoord).rgb;
    float gray = 0.299*centralColor.r + 0.587*centralColor.g + 0.114*centralColor.b;
    gl_FragColor = vec4(vec3(gray), 1.0); // vec4(scalar) would also set alpha to gray
}
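
As an aside, the 0.299/0.587/0.114 weights are the standard BT.601 luma coefficients; they sum to 1.0, so pure white stays white. A pure-Java sketch of the same per-pixel math (a hypothetical helper for illustration, not part of GPUImage):

```java
public class GrayScale {
    // BT.601 luma weights, matching the shader above; they sum to 1.0
    static int toGray(int r, int g, int b) {
        return (int) Math.round(0.299 * r + 0.587 * g + 0.114 * b);
    }
}
```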

One day, with some free time, I also wrote an invert-color filter:

precision mediump float;
varying vec2 vTextureCoord;
uniform sampler2D sTexture;
void main() {
    vec4 centralColor = texture2D(sTexture, vTextureCoord);
    gl_FragColor = vec4((1.0 - centralColor.rgb), centralColor.w);
}

Now the boss asks me to process the video stream first in black-and-white and then inverted. How hard could that be? Ten minutes later I had the following:

precision mediump float;
varying vec2 vTextureCoord;
uniform sampler2D sTexture;
void main() {
    vec4 centralColor = texture2D(sTexture, vTextureCoord);
    float gray = 0.299*centralColor.r + 0.587*centralColor.g + 0.114*centralColor.b;
    gl_FragColor = vec4(1.0 - vec3(gray), centralColor.a);
}
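
Per pixel, running the grayscale pass and then the invert pass is exactly equivalent to this combined shader. A small pure-Java model of that composition (all names hypothetical, for illustration only):

```java
public class FilterChain {
    interface PixelFilter { float[] apply(float[] rgba); }

    // Same math as the grayscale shader (alpha preserved)
    static final PixelFilter GRAY = c -> {
        float y = 0.299f * c[0] + 0.587f * c[1] + 0.114f * c[2];
        return new float[]{y, y, y, c[3]};
    };

    // Same math as the invert shader
    static final PixelFilter INVERT = c ->
            new float[]{1f - c[0], 1f - c[1], 1f - c[2], c[3]};

    // Applying filters in order models chaining them in a filter group
    static float[] chain(float[] rgba, PixelFilter... filters) {
        for (PixelFilter f : filters) rgba = f.apply(rgba);
        return rgba;
    }
}
```

Because composition per pixel gives the same result, each filter can stay single-purpose and be combined at runtime.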

These two filters are quite simple (essentially one line each). But what if every filter were complex? And what if there were many combinations?

Writing the two effects into the same filter means modifying the shader every time, which is not elegant at all and hardly reflects the OO principles our university teachers worked so hard to instill.

GPUImage solves this problem with framebuffer objects. Previously we drew the result to the screen after a single pass; now we can render the intermediate result into a framebuffer and feed that result as the input of the next pass. So my code becomes:

filterGroup.addFilter(new GrayScaleShaderFilter(context));
filterGroup.addFilter(new InvertColorFilter(context));

What if there is a third step? Just add another filter. Convenient, isn't it?

FBO creation and drawing process

First we need two arrays to hold the FBO IDs and the IDs of the textures that receive the results.

protected int[] frameBuffers = null;
protected int[] frameBufferTextures = null;

Yes, like a texture, an FBO is identified by an integer ID.

if (frameBuffers == null) {
    frameBuffers = new int[size-1];
    frameBufferTextures = new int[size-1];

    for (int i = 0; i < size-1; i++) {
        GLES20.glGenFramebuffers(1, frameBuffers, i);

        GLES20.glGenTextures(1, frameBufferTextures, i);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, frameBufferTextures[i]);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
                filters.get(i).surfaceWidth, filters.get(i).surfaceHeight, 0,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frameBuffers[i]);
        GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
                GLES20.GL_TEXTURE_2D, frameBufferTextures[i], 0);

        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
    }
}

The code here is relatively long, but it is very similar to the usual texture-generation code (students without an OpenGL ES background may want to review basic texture creation first).

  • GLES20.glGenFramebuffers generates the framebuffer objects
  • The long block that follows generates a texture, sizes it to the current drawing width and height, and specifies the wrapping and min/mag filtering strategies
  • Here is the key: GLES20.glFramebufferTexture2D attaches a texture image to a framebuffer object, telling OpenGL that this FBO renders into a 2D texture; frameBufferTextures[i] is the texture attached to FBO i
  • Why size-1? Because the output of the last filter is drawn directly to the screen~

Draw

After the FBOs are generated, we can rewrite our drawing code like this:

if (i < size - 1) {
    GLES20.glViewport(0, 0, filter.surfaceWidth, filter.surfaceHeight);
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frameBuffers[i]);
    GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    filter.onDrawFrame(previousTexture);
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
    previousTexture = frameBufferTextures[i];
}else{
    GLES20.glViewport(0, 0 ,filter.surfaceWidth, filter.surfaceHeight);
    filter.onDrawFrame(previousTexture);
}
  • Before each draw, bind the FBO with glBindFramebuffer; the result is then rendered not to the screen but into the texture attached to that FBO, which is passed to the next filter as its input
  • The input of the first filter is the texture from our camera or player
  • The last filter does not render into an FBO; it draws directly to the screen.
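
The texture hand-off in the loop above can be modeled without any GL calls. This sketch (hypothetical names, not part of GPUImage) records, for each of the size filters, which texture it reads and which target it writes, with size-1 FBO textures and -1 standing in for the screen:

```java
import java.util.ArrayList;
import java.util.List;

public class FboChainSim {
    // Models the read/write plan of the drawing loop: filter i < size-1
    // renders into fboTextures[i]; the last filter renders to the screen (-1).
    static List<int[]> plan(int inputTexture, int[] fboTextures) {
        int size = fboTextures.length + 1;  // size-1 FBOs serve size filters
        List<int[]> passes = new ArrayList<>();
        int previous = inputTexture;        // first filter reads the camera/player texture
        for (int i = 0; i < size; i++) {
            int target = (i < size - 1) ? fboTextures[i] : -1;
            passes.add(new int[]{previous, target});
            if (i < size - 1) previous = fboTextures[i];
        }
        return passes;
    }
}
```

For an input texture 10 and FBO textures {20, 21}, the plan is 10→20, 20→21, 21→screen: each pass reads exactly what the previous pass wrote.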

Full code of the filter group

package com.martin.ads.omoshiroilib.filter.base;

import android.opengl.GLES20;
import android.util.Log;

import java.util.ArrayList;
import java.util.List;

/**
 * Created by Ads on 2016/11/19.
 */

public class FilterGroup extends AbsFilter {
    private static final String TAG = "FilterGroup";
    protected int[] frameBuffers = null;
    protected int[] frameBufferTextures = null;
    protected List<AbsFilter> filters;
    protected boolean isRunning;

    public FilterGroup() {
        super("FilterGroup");
        filters=new ArrayList<AbsFilter>();
    }

    @Override
    public void init() {
        for (AbsFilter filter : filters) {
            filter.init();
        }
        isRunning=true;
    }

    @Override
    public void onPreDrawElements() {
    }

    @Override
    public void destroy() {
        destroyFrameBuffers();
        for (AbsFilter filter : filters) {
            filter.destroy();
        }
        isRunning=false;
    }

    @Override
    public void onDrawFrame(int textureId) {
        runPreDrawTasks();
        if (frameBuffers == null || frameBufferTextures == null) {
            return ;
        }
        int size = filters.size();
        int previousTexture = textureId;
        for (int i = 0; i < size; i++) {
            AbsFilter filter = filters.get(i);
            Log.d(TAG, "onDrawFrame: "+i+" / "+size +" "+filter.getClass().getSimpleName()+" "+
                    filter.surfaceWidth+" "+filter.surfaceHeight);
            if (i < size - 1) {
                GLES20.glViewport(0, 0, filter.surfaceWidth, filter.surfaceHeight);
                GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frameBuffers[i]);
                GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
                filter.onDrawFrame(previousTexture);
                GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
                previousTexture = frameBufferTextures[i];
            }else{
                GLES20.glViewport(0, 0 ,filter.surfaceWidth, filter.surfaceHeight);
                filter.onDrawFrame(previousTexture);
            }
        }
    }

    @Override
    public void onFilterChanged(int surfaceWidth, int surfaceHeight) {
        super.onFilterChanged(surfaceWidth, surfaceHeight);
        int size = filters.size();
        for (int i = 0; i < size; i++) {
            filters.get(i).onFilterChanged(surfaceWidth, surfaceHeight);
        }
        if(frameBuffers != null){
            destroyFrameBuffers();
        }
        if (frameBuffers == null) {
            frameBuffers = new int[size-1];
            frameBufferTextures = new int[size-1];

            for (int i = 0; i < size-1; i++) {
                GLES20.glGenFramebuffers(1, frameBuffers, i);

                GLES20.glGenTextures(1, frameBufferTextures, i);
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, frameBufferTextures[i]);
                GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
                        filters.get(i).surfaceWidth, filters.get(i).surfaceHeight, 0,
                        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
                GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                        GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
                GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                        GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
                GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                        GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
                GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                        GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

                GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frameBuffers[i]);
                GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
                        GLES20.GL_TEXTURE_2D, frameBufferTextures[i], 0);

                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
                GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
            }
        }
    }

    private void destroyFrameBuffers() {
        if (frameBufferTextures != null) {
            GLES20.glDeleteTextures(frameBufferTextures.length, frameBufferTextures, 0);
            frameBufferTextures = null;
        }
        if (frameBuffers != null) {
            GLES20.glDeleteFramebuffers(frameBuffers.length, frameBuffers, 0);
            frameBuffers = null;
        }
    }

    public void addFilter(final AbsFilter filter){
        if (filter==null) return;
        //If one filter is added multiple times,
        //it will execute the same times
        //BTW: Pay attention to the order of execution
        if (!isRunning){
            filters.add(filter);
        }
        else
            addPreDrawTask(new Runnable() {
            @Override
            public void run() {
                filter.init();
                filters.add(filter);
                onFilterChanged(surfaceWidth,surfaceHeight);
            }
        });
    }

    public void addFilterList(final List<AbsFilter> filterList){
        if (filterList==null) return;
        //If one filter is added multiple times,
        //it will execute the same times
        //BTW: Pay attention to the order of execution
        if (!isRunning){
            for(AbsFilter filter:filterList){
                filters.add(filter);
            }
        }
        else
            addPreDrawTask(new Runnable() {
                @Override
                public void run() {
                    for(AbsFilter filter:filterList){
                        filter.init();
                        filters.add(filter);
                    }
                    onFilterChanged(surfaceWidth,surfaceHeight);
                }
            });
    }
}

Original author: Martin. Original link: https://blog.csdn.net/Martin20150405/article/details/55520358



Posted on Mon, 30 Mar 2020 06:57:16 -0700 by scottshane