
Flutter Meets Unity: Blending Flutter and Unity UIs via OpenGL Texture Sharing

Author: Internet

I have recently been learning the Unity game engine, and along the way a thought occurred to me: both Unity and Flutter can draw their UIs through OpenGL or Vulkan, so is there a way to blend the two, that is, to render the Flutter UI onto a Unity object, or to render the Unity view into a Flutter widget? Since the two directions work largely the same way, this article focuses on rendering the Flutter UI into Unity.
The first idea that comes to mind is to screenshot the Flutter UI into a bitmap, hand that bitmap to Unity through the interop layer, and use it there; Flutter even ships a screenshot API. But the drawbacks are immediately obvious: the Flutter frame must first be downloaded from the GPU into main memory, the bitmap then has to cross the Unity/Java bridge, and finally Unity must upload it back to the GPU as a texture. The data gets shuffled through memory several times, and bitmaps are typically large, so passing them back and forth every frame is far too slow. Is there a better way? There is:
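To put rough numbers on why the round trip hurts (a sketch with assumed values, not a measurement): one full-screen RGBA bitmap at a typical 1080×2340 resolution is about 10 MB, and copying it every frame at 60 fps means moving hundreds of megabytes per second between GPU and CPU memory, several times over:

```java
public class ReadbackCost {
    // Bytes for one RGBA8888 frame of the given size
    static long bytesPerFrame(int width, int height) {
        return (long) width * height * 4;
    }

    public static void main(String[] args) {
        long frame = bytesPerFrame(1080, 2340); // 10,108,800 bytes, roughly 9.6 MB
        long perSecond = frame * 60;            // roughly 578 MB/s at 60 fps
        System.out.println(frame + " bytes/frame, " + perSecond + " bytes/s");
    }
}
```

And that is just one direction of one copy; the naive approach pays it on readback, on the Java bridge, and again on upload.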

Texture Sharing

An OpenGL context is normally bound to a single thread, and different contexts are largely isolated from one another: they cannot share resources directly. To make multi-threaded rendering practical, OpenGL (via EGL) offers context sharing to work around this limitation and improve multi-threaded efficiency. Usage is straightforward: when creating a new OpenGL context, pass in an existing context (and a compatible config), and the two contexts can then use the same textures.
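The heart of the mechanism is the third argument of eglCreateContext: passing an existing context there, instead of EGL_NO_CONTEXT, places the new context in the same share group. A minimal sketch of the attribute list involved (the hex values are the standard EGL constants, written out so the snippet stands alone; real code should use the named EGL14 constants):

```java
import java.util.Arrays;

public class ShareAttribs {
    // Standard EGL constant values (EGL14.EGL_CONTEXT_CLIENT_VERSION / EGL14.EGL_NONE)
    static final int EGL_CONTEXT_CLIENT_VERSION = 0x3098;
    static final int EGL_NONE = 0x3038;

    // Attribute list for eglCreateContext; the GLES version must match
    // the one the sharing (Unity) context was created with
    static int[] contextAttribs(int glesVersion) {
        return new int[]{EGL_CONTEXT_CLIENT_VERSION, glesVersion, EGL_NONE};
    }

    public static void main(String[] args) {
        // In real Android code the call looks like:
        // mEglContext = EGL14.eglCreateContext(display, sharedConfig,
        //         sharedContext /* the share group */, contextAttribs(3), 0);
        System.out.println(Arrays.toString(contextAttribs(3)));
    }
}
```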
For how to implement texture sharing with Unity on Android, see https://blog.csdn.net/jnlws/article/details/113176253
That article explains the topic in detail and its code is fairly complete; it is worth reading carefully and reproducing yourself. Below is a rough outline of the camera texture-sharing flow it describes (the Flutter sharing flow is very similar).

  1. From the Unity render thread, call back into a Java method through the usual Unity/Android interop.
  2. In that Java method, grab the Unity thread's OpenGL context and configuration, create a dedicated Java render thread, and on it create a new OpenGL context, passing Unity's context in as the share context. This is what makes texture sharing possible.
  3. Route the Android camera output into a SurfaceTexture bound to a newly created texture ID, then use FBO off-screen rendering to draw each camera frame into a second texture ID. (A SurfaceTexture produces Android's special GL_TEXTURE_EXTERNAL_OES texture target, which Unity cannot sample directly, so the off-screen pass converts it into a texture format Unity can use.) Return the new texture ID to Unity.
  4. Once Unity receives the texture ID, it renders the texture onto a GameObject.
    The key pieces of code are: Unity calling into the Java method, creating the OpenGL context, starting the camera preview, and binding the camera texture (see the article linked above for the listings).
    Texture sharing is an OpenGL mechanism; APIs such as Vulkan and Metal are built for multi-threading from the start and do not need this workaround. I am not yet familiar enough with Vulkan, so it is not covered here.
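The OES-to-2D conversion mentioned in step 3 boils down to one off-screen draw that samples the external texture through a samplerExternalOES uniform and writes into an ordinary GL_TEXTURE_2D color attachment. The shader pair might look like this (a sketch; the actual helper classes used later encapsulate the same idea):

```java
public class OesBlitShaders {
    // Vertex shader: pass-through quad with texture coordinates
    static final String VERTEX =
            "attribute vec4 aPosition;\n" +
            "attribute vec2 aTexCoord;\n" +
            "uniform mat4 uMVPMatrix;\n" +
            "varying vec2 vTexCoord;\n" +
            "void main() {\n" +
            "  gl_Position = uMVPMatrix * aPosition;\n" +
            "  vTexCoord = aTexCoord;\n" +
            "}\n";

    // Fragment shader: sampling a SurfaceTexture requires the
    // GL_OES_EGL_image_external extension and a samplerExternalOES uniform
    static final String FRAGMENT =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "varying vec2 vTexCoord;\n" +
            "uniform samplerExternalOES sTexture;\n" +
            "void main() {\n" +
            "  gl_FragColor = texture2D(sTexture, vTexCoord);\n" +
            "}\n";

    public static void main(String[] args) {
        System.out.println(FRAGMENT);
    }
}
```

Drawing a full-screen quad with this program while an FBO is bound leaves a plain 2D texture that Unity can sample like any other.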

Rendering the Flutter UI into a Texture

Having analyzed how camera frames reach Unity through texture sharing, we know that a SurfaceTexture plus FBO off-screen rendering can hand a texture to Unity. All that remains is to work out how to render the Flutter UI into a SurfaceTexture, and the two UIs can be blended. Next we turn to the Flutter source; everything below is based on Flutter 1.16.
First, recent Flutter versions have factored out the Flutter engine to make hybrid integration easier, and since we do not need the Flutter UI drawn on screen directly, it is enough to create a view-less Flutter fragment and add it to the Unity activity. We copy the FlutterFragment source and modify it, along with the corresponding FlutterActivityAndFragmentDelegate. Following the code, the Flutter UI reaches the Android screen through a FlutterView.
FlutterView supports two modes; we only care about the surface mode, where a FlutterSurfaceView bridges to Flutter: when the SurfaceView's surface is created, it is handed to Flutter through FlutterRenderer.
The picture is now clear: create a Surface, hand it to Flutter through FlutterRenderer, route its output into a SurfaceTexture, and use FBO off-screen rendering to copy it into a texture ID Unity can use. The code is as follows:

 public void attachToFlutterEngine(FlutterEngine flutterEngine) {
        this.flutterEngine = flutterEngine;
        FlutterRenderer flutterRenderer = flutterEngine.getRenderer();

        flutterRenderer.startRenderingToSurface(GLTexture.instance.surface);
        flutterRenderer.surfaceChanged(GLTexture.instance.getStreamTextureWidth(), GLTexture.instance.getStreamTextureHeight());
        FlutterRenderer.ViewportMetrics viewportMetrics = new FlutterRenderer.ViewportMetrics();
        viewportMetrics.width = GLTexture.instance.getStreamTextureWidth();
        viewportMetrics.height = GLTexture.instance.getStreamTextureHeight();
        viewportMetrics.devicePixelRatio = GLTexture.instance.context.getResources().getDisplayMetrics().density;
        flutterRenderer.setViewportMetrics(viewportMetrics);
        flutterRenderer.addIsDisplayingFlutterUiListener(new FlutterUiDisplayListener() {
            @Override
            public void onFlutterUiDisplayed() {
                GLTexture.instance.setNeedUpdate(true);
                GLTexture.instance.updateTexture();
            }

            @Override
            public void onFlutterUiNoLongerDisplayed() {

            }
        });
        GLTexture.instance.attachFlutterSurface(this);
    }

Note that the width and height must be passed to Flutter here; without them the Flutter UI may never appear.
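Flutter lays the UI out in logical pixels, so the surface size and the devicePixelRatio in the ViewportMetrics must be consistent: logical size = physical size / devicePixelRatio. A quick sanity check with illustrative values:

```java
public class ViewportMath {
    // Logical size Flutter will lay out, for a given physical size and density
    static double logicalPx(int physicalPx, double devicePixelRatio) {
        return physicalPx / devicePixelRatio;
    }

    public static void main(String[] args) {
        // A 1080-physical-pixel-wide surface at density 2.0
        // gives Flutter 540 logical pixels to lay out in
        System.out.println(logicalPx(1080, 2.0));
    }
}
```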
The GLTexture class follows; it creates the Surface and handles the communication with Unity.

package com.example.unitylibrary;

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.SurfaceTexture;
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLSurface;
import android.opengl.GLES20;
import android.opengl.Matrix;
import android.os.Handler;
import android.os.Looper;
import android.util.Log;
import android.view.Surface;

import com.unity3d.player.UnityPlayer;

import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class GLTexture {
    public static GLTexture instance = null;
    private static final String TAG = "GLTexture";

    private int mTextureID = 0;
    private int mTextureWidth = 0;
    private int mTextureHeight = 0;

    SurfaceTexture mCameraInputSurface;
    SurfaceTexture mOutputSurfaceTexture;
    int mOutputTex[];

    private ByteBuffer mBuffer;

    private volatile EGLContext mSharedEglContext;
    private volatile EGLConfig mSharedEglConfig;

    private EGLDisplay mEGLDisplay;
    private EGLContext mEglContext;
    private EGLSurface mEglSurface;
    private FBO fbo;
    private float[] mMVPMatrix = new float[16];
    GLTextureOES glTextureOES;
    Surface surface;
    SurfaceTexture surfaceTexture;

    private boolean needUpdate = true;

    // Single-thread executor that owns the Java-side OpenGL work
    private final ExecutorService mRenderThread = Executors.newSingleThreadExecutor();
    // Handler on the Unity thread's Looper, for GL work that must run on the Unity thread
    private Handler mUnityRenderHandler;

    public int getStreamTextureWidth() {
        //Log.d(TAG,"mTextureWidth = "+ mTextureWidth);
        return mTextureWidth;
    }

    public int getStreamTextureHeight() {
        //Log.d(TAG,"mTextureHeight = "+ mTextureHeight);
        return mTextureHeight;
    }

    public int getStreamTextureID() {
        Log.d(TAG, "getStreamTextureID success = " + mTextureID);
        return mTextureID;
    }

    public Context context;

    public GLTexture() {
        this(UnityPlayer.currentActivity);
    }

    public GLTexture(Context context) {
        instance = this;
        this.context = context;
    }

    private void glLogE(String msg) {
        Log.e(TAG, msg + ", err=" + GLES20.glGetError());
    }

    public boolean isNeedUpdate() {
        return needUpdate;
    }

    public void setNeedUpdate(boolean needUpdate) {
        this.needUpdate = needUpdate;
    }

    // Called from Unity
    public void setupOpenGL() {
        Log.d(TAG, "setupOpenGL called by Unity ");

        // Note: this must be invoked from Unity's render thread
        if (Looper.myLooper() == null) {
            Looper.prepare();
        }
        mUnityRenderHandler = new Handler(Looper.myLooper());

        // Grab the EGLContext current on Unity's render thread
        mSharedEglContext = EGL14.eglGetCurrentContext();
        if (mSharedEglContext == EGL14.EGL_NO_CONTEXT) {
            glLogE("eglGetCurrentContext failed");
            return;
        }
        glLogE("eglGetCurrentContext success");

        EGLDisplay sharedEglDisplay = EGL14.eglGetCurrentDisplay();
        if (sharedEglDisplay == EGL14.EGL_NO_DISPLAY) {
            glLogE("sharedEglDisplay failed");
            return;
        }
        glLogE("sharedEglDisplay success");

        // Grab the EGLConfig used by Unity's render thread
        int[] numEglConfigs = new int[1];
        EGLConfig[] eglConfigs = new EGLConfig[1];
        if (!EGL14.eglGetConfigs(sharedEglDisplay, eglConfigs, 0, eglConfigs.length,
                numEglConfigs, 0)) {
            glLogE("eglGetConfigs failed");
            return;
        }
        mSharedEglConfig = eglConfigs[0];
        mRenderThread.execute(new Runnable() {
            @Override
            public void run() {
                // Initialize the OpenGL environment on this render thread
                initOpenGL();
                // Hard-coded texture size (sample values; match your target resolution)
                mTextureWidth = 2140;
                mTextureHeight = 1080;
                glTextureOES = new GLTextureOES(context, mTextureWidth, mTextureHeight);
                surfaceTexture = new SurfaceTexture(glTextureOES.getTextureID());
                

                surfaceTexture.setDefaultBufferSize(mTextureWidth, mTextureHeight);
                surface = new Surface(surfaceTexture);
                surfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
                    @Override
                    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
                        needUpdate = true;
                    }
                });
                fbo = new FBO(mTextureWidth, mTextureHeight);
                mTextureID = fbo.textureID;

                mBuffer = ByteBuffer.allocate(mTextureWidth * mTextureHeight * 4);


            }
        });


    }

    private void initOpenGL() {
        mEGLDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
        if (mEGLDisplay == EGL14.EGL_NO_DISPLAY) {
            glLogE("eglGetDisplay failed");
            return;
        }
        glLogE("eglGetDisplay success");

        int[] version = new int[2];
        if (!EGL14.eglInitialize(mEGLDisplay, version, 0, version, 1)) {
            mEGLDisplay = null;
            glLogE("eglInitialize failed");
            return;
        }
        glLogE("eglInitialize success");

        int[] eglContextAttribList = new int[]{
                EGL14.EGL_CONTEXT_CLIENT_VERSION, 3, // must match the GLES version used by Unity's render thread
                EGL14.EGL_NONE
        };
        // Pass the Unity thread's EGLContext and EGLConfig into eglCreateContext
        // so that the two threads' contexts share the same texture namespace
        mEglContext = EGL14.eglCreateContext(mEGLDisplay, mSharedEglConfig, mSharedEglContext,
                eglContextAttribList, 0);
        if (mEglContext == EGL14.EGL_NO_CONTEXT) {
            glLogE("eglCreateContext failed");
            return;
        }
        glLogE("eglCreateContext success");

        int[] surfaceAttribList = {
                EGL14.EGL_WIDTH, mTextureWidth,
                EGL14.EGL_HEIGHT, mTextureHeight,
                EGL14.EGL_NONE
        };
        // The Java thread never draws on screen, so a PbufferSurface (not a
        // WindowSurface) is enough; again use the Unity thread's EGLConfig
        mEglSurface = EGL14.eglCreatePbufferSurface(mEGLDisplay, mSharedEglConfig, surfaceAttribList, 0);
        if (mEglSurface == EGL14.EGL_NO_SURFACE) {
            glLogE("eglCreatePbufferSurface failed");
            return;
        }
        glLogE("eglCreatePbufferSurface success");

        if (!EGL14.eglMakeCurrent(mEGLDisplay, mEglSurface, mEglSurface, mEglContext)) {
            glLogE("eglMakeCurrent failed");
            return;
        }
        glLogE("eglMakeCurrent success");

        GLES20.glFlush();
    }

    public void updateTexture() {
        //Log.d(TAG,"updateTexture called by unity");
        mRenderThread.execute(new Runnable() {
            @Override
            public void run() {
                if (!needUpdate) return;
                needUpdate = false;
                surfaceTexture.updateTexImage();
                Matrix.setIdentityM(mMVPMatrix, 0);
                fbo.FBOBegin();
                GLES20.glViewport(0, 0, mTextureWidth, mTextureHeight);
                glTextureOES.draw(mMVPMatrix);

                fbo.FBOEnd();


            }
        });
    }

    public static void saveBitmap(Context context, final Bitmap b, String name) {


        String path = context.getExternalCacheDir().getPath();
        long dataTake = System.currentTimeMillis();
        final String jpegName = path + "/" + name + ".jpg";
        try {
            FileOutputStream fout = new FileOutputStream(jpegName);
            BufferedOutputStream bos = new BufferedOutputStream(fout);
            b.compress(Bitmap.CompressFormat.JPEG, 100, bos);
            bos.flush();
            bos.close();
        } catch (IOException e) {
            e.printStackTrace();
        }


    }

    public void destroy() {
        mRenderThread.shutdownNow();
    }

    public Surface getSurface() {
        return surface;
    }

    private FlutterSurface flutterSurface;

    public void attachFlutterSurface(FlutterSurface surface) {
        flutterSurface = surface;
    }

    public void onTouchEvent(final int type,final double x,final double y) {
        UnityPlayer.currentActivity.runOnUiThread(new Runnable() {
            @Override
            public void run() {
                if (flutterSurface != null)
                    flutterSurface.onTouchEvent(type, x, y);
            }
        });

    }
}

Next, exactly as with the camera, all that remains is to receive the texture in Unity and render it onto a GameObject.
The end result is shown below: the Flutter UI renders into Unity perfectly.
(Figure: Flutter rendered into Unity)

Touch Events

Next we need to handle touch events on the Flutter UI: capture the touch in Unity, pass it to Android, and from there forward it on to Flutter.
The Unity-side touch handling is as follows:

    void Update()
    {
        #if UNITY_ANDROID
        if (mGLTexCtrl.Call<bool>("isNeedUpdate"))
            mGLTexCtrl.Call("updateTexture");
        if (Input.touches.Length > 0) {
            if (haveStartFlutter == 1) {
                Debug.Log(Input.touches[0].position);
                double x = Input.touches[0].position.x;
                double y = Input.touches[0].position.y;
                // Translate into the quad's local frame, flip y, normalize to [0,1]
                x = (x - pos_zero.x) / w;
                y = -(y - pos_zero.y) / h;
                Debug.Log("x:" + x + "&y:" + y + "&type:" + Input.touches[0].phase);
                if (Input.touches[0].phase == TouchPhase.Began) {
                    mGLTexCtrl.Call("onTouchEvent", 0, x, y);
                } else if (Input.touches[0].phase == TouchPhase.Moved) {
                    mGLTexCtrl.Call("onTouchEvent", 1, x, y);
                } else if (Input.touches[0].phase == TouchPhase.Ended) {
                    mGLTexCtrl.Call("onTouchEvent", 2, x, y);
                }
            } else {
                mFlutterApp.Call("startFlutter");
                haveStartFlutter = 1;
            }
        }
        #endif

        if (Input.GetMouseButtonDown(1)) {
            Debug.Log(Input.mousePosition);
        }
    }
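Unity reports touches in screen pixels with the y axis pointing up, while Flutter expects physical pixels on the shared texture with y pointing down; hence the translate, flip, and normalize steps on the Unity side, followed by a rescale on the Java side. The same math as standalone functions (a sketch; the quad origin and size values are made up for illustration):

```java
public class TouchMapping {
    // Unity side: translate into the quad's local frame, flip y, normalize to [0,1]
    static double[] normalize(double x, double y,
                              double originX, double originY,
                              double quadW, double quadH) {
        return new double[]{(x - originX) / quadW, -(y - originY) / quadH};
    }

    // Android side: scale the normalized point to the shared texture's physical pixels
    static double[] toPhysical(double nx, double ny, int texW, int texH) {
        return new double[]{nx * texW, ny * texH};
    }

    public static void main(String[] args) {
        double[] n = normalize(500, 300, 100, 500, 800, 400); // {0.5, 0.5}
        double[] p = toPhysical(n[0], n[1], 2140, 1080);      // {1070.0, 540.0}
        System.out.println(p[0] + ", " + p[1]);
    }
}
```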

The Android-to-Flutter forwarding code is below. The Flutter source shows how pointer events are packed and dispatched, so this borrows directly from it:

 public void onTouchEvent(int type, double x, double y) {
        ByteBuffer packet =
                ByteBuffer.allocateDirect(1 * POINTER_DATA_FIELD_COUNT * BYTES_PER_FIELD);
        packet.order(ByteOrder.LITTLE_ENDIAN);
        double x1, y1;
        x1 = GLTexture.instance.getStreamTextureWidth() * x;
        y1 = GLTexture.instance.getStreamTextureHeight() * y;
        Log.i("myyf", "x:" + x1 + "&y:" + y1 + "&type:" + type);
        addPointerForIndex(x1, y1, type + 4, 0, packet);
        if (packet.position() % (POINTER_DATA_FIELD_COUNT * BYTES_PER_FIELD) != 0) {
            throw new AssertionError("Packet position is not on field boundary");
        }

        flutterEngine.getRenderer().dispatchPointerDataPacket(packet,packet.position());

    }


    // TODO(mattcarroll): consider creating a PointerPacket class instead of using a procedure that
    // mutates inputs.
    private void addPointerForIndex(
            double x, double y, int pointerChange, int pointerData, ByteBuffer packet) {
        if (pointerChange == -1) {
            return;
        }

        int pointerKind = 0;

        int signalKind =
                0;

        long timeStamp = System.currentTimeMillis() * 1000; // Convert from milliseconds to microseconds.

        packet.putLong(timeStamp); // time_stamp
        packet.putLong(pointerChange); // change
        packet.putLong(pointerKind); // kind
        packet.putLong(signalKind); // signal_kind
        packet.putLong(pointerId); // device
        packet.putLong(0); // pointer_identifier, will be generated in pointer_data_packet_converter.cc.
        packet.putDouble(x); // physical_x
        packet.putDouble(y); // physical_y
        packet.putDouble(
                0.0); // physical_delta_x, will be generated in pointer_data_packet_converter.cc.
        packet.putDouble(
                0.0); // physical_delta_y, will be generated in pointer_data_packet_converter.cc.

        long buttons = 0;

        packet.putLong(buttons); // buttons

        packet.putLong(0); // obscured

        packet.putLong(0); // synthesized

        packet.putDouble(1.0); // pressure
        double pressureMin = 0.0;
        double pressureMax = 1.0;
//        if (event.getDevice() != null) {
//            InputDevice.MotionRange pressureRange =
//                    event.getDevice().getMotionRange(MotionEvent.AXIS_PRESSURE);
//            if (pressureRange != null) {
//                pressureMin = pressureRange.getMin();
//                pressureMax = pressureRange.getMax();
//            }
//        }
        packet.putDouble(pressureMin); // pressure_min
        packet.putDouble(pressureMax); // pressure_max


        packet.putDouble(0.0); // distance
        packet.putDouble(0.0); // distance_max


        packet.putDouble(0.5); // size

        packet.putDouble(6); // radius_major
        packet.putDouble(7); // radius_minor

        packet.putDouble(0.0); // radius_min
        packet.putDouble(0.0); // radius_max

        packet.putDouble(0); // orientation


        packet.putDouble(0.0); // tilt


        packet.putLong(pointerData); // platformData


        packet.putDouble(0.0); // scroll_delta_x
        packet.putDouble(0.0); // scroll_delta_y

    }
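The packet is a fixed layout of 8-byte little-endian fields; counting the putLong/putDouble calls in addPointerForIndex gives 28 fields, so POINTER_DATA_FIELD_COUNT * BYTES_PER_FIELD should come to 28 × 8 = 224 bytes per pointer (this count is version-specific and must match the Flutter engine you build against). A sketch of the boundary check the code asserts:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class PacketLayout {
    // Field count inferred by counting the putLong/putDouble calls;
    // must match the engine's POINTER_DATA_FIELD_COUNT for your Flutter version
    static final int POINTER_DATA_FIELD_COUNT = 28;
    static final int BYTES_PER_FIELD = 8;

    static int packetBytes(int pointerCount) {
        return pointerCount * POINTER_DATA_FIELD_COUNT * BYTES_PER_FIELD;
    }

    public static void main(String[] args) {
        ByteBuffer packet = ByteBuffer.allocateDirect(packetBytes(1))
                .order(ByteOrder.LITTLE_ENDIAN);
        // After writing all 28 fields, position() lands exactly on a field boundary
        for (int i = 0; i < POINTER_DATA_FIELD_COUNT; i++) {
            packet.putLong(0L); // putDouble advances the same 8 bytes
        }
        System.out.println(packet.position() + "/" + packet.capacity());
    }
}
```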

With that, taps on the Flutter UI work from inside Unity. The final result is shown below.

Summary

We have analyzed how to render the Flutter UI into Unity: OpenGL texture sharing plus an Android Surface is all it takes. Rendering Unity into Flutter works the same way in reverse: customize UnityPlayer so that it outputs into a texture, and consume that texture in Flutter. The technique generalizes even further; the same approach can render native Android views into Flutter or Unity.

Source: https://blog.csdn.net/a582816317/article/details/115425715