How to Stream GLSurfaceView Output from FaceUnity Using RootEncoder?
Hello,
I am using the RootEncoder library to stream content, typically from the phone's camera. Now, I need to apply some filters and effects, so I am integrating FaceUnity (FULiveDemoDroid).
FaceUnity renders using android.opengl.GLSurfaceView, and I would like to use this GLSurfaceView output for streaming with RootEncoder.
Implementation Details:
FaceUnity has a CameraRenderer class that extends BaseFURenderer and implements ICameraRenderer.
CameraRenderer uses a GLSurfaceView for rendering camera input with effects.
The camera is configured using FUCameraConfig, which sets properties like resolution, frame rate, and camera type.
The rendering process is managed by OnGlRendererListener, which provides callbacks such as onRenderBefore (for raw input data) and onRenderAfter (for processed data).
My Goal: I want to take the output from GLSurfaceView (after FaceUnity has processed the camera feed) and stream it using RootEncoder.
Question:
- How can I access the processed frame data from FaceUnity's GLSurfaceView?
- What is the best way to pass this data to RootEncoder for streaming?
Any guidance or sample code would be greatly appreciated! Thank you.
Hello,
If you have a way to get frame data from that library as a buffer, you can use BufferVideoSource to handle it.
Ideally, you should find a way to render to a Surface or SurfaceTexture with that library and create a new VideoSource, because this provides better performance. If you have a code example working with SurfaceView or TextureView, we can try that approach.
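For the buffer case, the idea would be something like this (a minimal sketch: BufferVideoSource is from RootEncoder, while onProcessedFrame is a hypothetical stand-in for whatever frame callback your filter library exposes):

import com.pedro.encoder.input.sources.video.BufferVideoSource
import java.nio.ByteBuffer

// Sketch: feed raw YUV frames from the filter library into RootEncoder.
// Format and bitrate are example values, not recommendations.
val videoSource = BufferVideoSource(BufferVideoSource.Format.NV12, 1200 * 1000)

// Hypothetical callback invoked by the filter library once per processed frame.
fun onProcessedFrame(frameBuffer: ByteBuffer) {
    videoSource.setBuffer(frameBuffer)
}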
Yep, I can call getSurfaceTexture. How do I create a VideoSource with a SurfaceTexture? Could you give me an example? Thanks.
This is an example of the code using the SurfaceView:
mCameraRenderer = new CameraRenderer(mSurfaceView, getCameraConfig(), mOnGlRendererListener);
CameraRenderer has a function:
override fun updateTexImage() {
    val surfaceTexture = fUCamera.getSurfaceTexture()
    try {
        surfaceTexture?.updateTexImage()
    } catch (e: Exception) {
        e.printStackTrace()
    }
}
And I can get the SurfaceTexture from FUCamera:
override fun getSurfaceTexture(): SurfaceTexture? {
    return mFaceUnityCamera?.mSurfaceTexture
}
Thanks @pedroSG94
Hello,
After checking the library, I can't find a way to render into a SurfaceTexture properly, but maybe we can use the onRenderAfter callback like this:
//We are using BufferVideoSource to send data to the RootEncoder library as YUV images.
//The bitrate depends on your resolution; the value is equivalent to the one in the prepareVideo method.
private val bufferVideoSource = BufferVideoSource(format = BufferVideoSource.Format.NV12, bitrate = 1200 * 1000)

//pack Y, U and V planes into NV12
private fun toNv12(y: ByteArray, u: ByteArray, v: ByteArray): ByteBuffer {
    //NV12 is the Y plane followed by U and V interleaved
    val nv12 = ByteBuffer.allocate(y.size + u.size + v.size)
    nv12.put(y)
    //U and V planes must have the same size in 4:2:0 YUV
    for (i in u.indices) {
        nv12.put(u[i])
        nv12.put(v[i])
    }
    nv12.rewind() //reset position so the whole buffer can be read
    return nv12
}

override fun onRenderAfter(
    outputData: FURenderOutputData,
    frameData: FURenderFrameData
) {
    val dataY = outputData.image?.buffer ?: return
    val dataU = outputData.image?.buffer1 ?: return
    val dataV = outputData.image?.buffer2 ?: return
    bufferVideoSource.setBuffer(toNv12(dataY, dataU, dataV))
}
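To wire that source into a stream, it would look roughly like this (a sketch: GenericStream and MicrophoneSource are RootEncoder classes, context and connectChecker are assumed to exist in your Activity, and the URL is a placeholder):

import com.pedro.encoder.input.sources.audio.MicrophoneSource
import com.pedro.library.generic.GenericStream

// Sketch: pass the BufferVideoSource explicitly instead of the default camera source.
val genericStream = GenericStream(context, connectChecker, bufferVideoSource, MicrophoneSource())
if (genericStream.prepareVideo(1280, 720, 1200 * 1000) &&
    genericStream.prepareAudio(44100, true, 128 * 1000)
) {
    genericStream.startStream("rtmp://example.com/live/key") // placeholder URL
}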
This could have limited performance because you need to do this conversion for each frame.
To do it the other way, we need to find a way to make the FaceUnity library output the image into a Surface or a SurfaceTexture, similar to playing a video into a Surface from a SurfaceView using the MediaPlayer class. This has better performance because you skip the buffer conversion and extra processing used in the BufferVideoSource class. Basically, we want to receive the image into the SurfaceTexture provided by the start method of the VideoSource class.
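For reference, the skeleton of that custom VideoSource could look like this (only a sketch: the overridden signatures may differ between library versions, and the FaceUnity call that would render into the SurfaceTexture does not exist yet; finding it is the missing piece):

import android.graphics.SurfaceTexture
import com.pedro.encoder.input.sources.video.VideoSource

// Sketch of a custom VideoSource. The open problem is the body of start():
// FaceUnity would need a way to render its output into this SurfaceTexture.
class FaceUnityVideoSource : VideoSource() {
    private var running = false

    override fun create(width: Int, height: Int, fps: Int, rotation: Int): Boolean {
        return true // accept any configuration for this sketch
    }

    override fun start(surfaceTexture: SurfaceTexture) {
        // Hypothetical: point FaceUnity's GL output at this SurfaceTexture here.
        running = true
    }

    override fun stop() {
        running = false
    }

    override fun release() {
        // nothing to free in this sketch
    }

    override fun isRunning(): Boolean = running
}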
How about gpupixel? https://github.com/pixpark/gpupixel/releases/tag/v1.3.0-beta5
This is my code, but why does it always push the raw camera (without the SurfaceView output) to the RTMP server?
import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;
import android.Manifest;
import android.content.pm.PackageManager;
import android.media.MediaRecorder;
import android.opengl.GLSurfaceView;
import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.WindowManager;
import android.widget.SeekBar;
import android.widget.Toast;
import com.pedro.common.ConnectChecker;
import com.pedro.encoder.input.sources.audio.MicrophoneSource;
import com.pedro.encoder.input.sources.video.BufferVideoSource;
import com.pedro.encoder.input.sources.video.Camera2Source;
import com.pedro.encoder.input.sources.video.ScreenSource;
import com.pedro.encoder.input.sources.video.VideoSource;
import com.pedro.encoder.input.video.CameraHelper;
import com.pedro.library.generic.GenericStream;
import com.pedro.library.rtmp.RtmpCamera2;
import com.pedro.library.rtmp.RtmpDisplay;
import com.pedro.library.rtmp.RtmpStream;
import com.pixpark.GPUPixelApp.databinding.ActivityMainBinding;
import com.pixpark.gpupixel.GPUPixel;
import com.pixpark.gpupixel.filter.BeautyFaceFilter;
import com.pixpark.gpupixel.filter.FaceReshapeFilter;
import com.pixpark.gpupixel.GPUPixelSourceCamera;
import com.pixpark.gpupixel.GPUPixelView;
import com.pixpark.gpupixel.filter.LipstickFilter;
public class MainActivity extends AppCompatActivity implements SurfaceHolder.Callback, ConnectChecker {
    private static final int CAMERA_PERMISSION_REQUEST_CODE = 200;
    private static final String TAG = "GPUPixelDemo";
    private GPUPixelSourceCamera sourceCamera;
    private GPUPixelView surfaceView;
    private BeautyFaceFilter beautyFaceFilter;
    private FaceReshapeFilter faceReshapFilter;
    private LipstickFilter lipstickFilter;
    private SeekBar smooth_seekbar;
    private SeekBar whiteness_seekbar;
    private SeekBar face_reshap_seekbar;
    private SeekBar big_eye_seekbar;
    private SeekBar lipstick_seekbar;
    // private RtmpCamera2 rtmpCamera2;
    private RtmpStream rtmpStream;
    private ActivityMainBinding binding;
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        binding = ActivityMainBinding.inflate(getLayoutInflater());
        setContentView(binding.getRoot());
        // get log path
        String path = getExternalFilesDir("gpupixel").getAbsolutePath();
        Log.i(TAG, path);
        GPUPixel.setContext(this);
        // keep the screen on
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        // preview
        surfaceView = binding.surfaceView;
        surfaceView.setMirror(true);
        smooth_seekbar = binding.smoothSeekbar;
        smooth_seekbar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
            @Override
            public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
                beautyFaceFilter.setSmoothLevel(progress / 10.0f);
            }
            @Override
            public void onStartTrackingTouch(SeekBar seekBar) {
            }
            @Override
            public void onStopTrackingTouch(SeekBar seekBar) {
            }
        });
        whiteness_seekbar = binding.whitenessSeekbar;
        whiteness_seekbar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
            @Override
            public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
                beautyFaceFilter.setWhiteLevel(progress / 10.0f);
            }
            @Override
            public void onStartTrackingTouch(SeekBar seekBar) {
            }
            @Override
            public void onStopTrackingTouch(SeekBar seekBar) {
            }
        });
        face_reshap_seekbar = binding.thinfaceSeekbar;
        face_reshap_seekbar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
            @Override
            public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
                faceReshapFilter.setThinLevel(progress / 200.0f);
            }
            @Override
            public void onStartTrackingTouch(SeekBar seekBar) {
            }
            @Override
            public void onStopTrackingTouch(SeekBar seekBar) {
            }
        });
        big_eye_seekbar = binding.bigeyeSeekbar;
        big_eye_seekbar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
            @Override
            public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
                faceReshapFilter.setBigeyeLevel(progress / 100.0f);
            }
            @Override
            public void onStartTrackingTouch(SeekBar seekBar) {
            }
            @Override
            public void onStopTrackingTouch(SeekBar seekBar) {
            }
        });
        lipstick_seekbar = binding.lipstickSeekbar;
        lipstick_seekbar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
            @Override
            public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
                lipstickFilter.setBlendLevel(progress / 10.0f);
            }
            @Override
            public void onStartTrackingTouch(SeekBar seekBar) {
            }
            @Override
            public void onStopTrackingTouch(SeekBar seekBar) {
            }
        });
        //
        this.checkCameraPermission();
        // startStreaming("rtmp://livetw2.test.com/live/test666888?secret=abc123");
    }
    public void startCameraFilter() {
        // beauty filters
        beautyFaceFilter = new BeautyFaceFilter();
        faceReshapFilter = new FaceReshapeFilter();
        lipstickFilter = new LipstickFilter();
        // camera
        sourceCamera = new GPUPixelSourceCamera(this.getApplicationContext());
        //
        sourceCamera.addSink(lipstickFilter);
        lipstickFilter.addSink(faceReshapFilter);
        faceReshapFilter.addSink(beautyFaceFilter);
        beautyFaceFilter.addSink(surfaceView);
        sourceCamera.setLandmarkCallbck(new GPUPixel.GPUPixelLandmarkCallback() {
            @Override
            public void onFaceLandmark(float[] landmarks) {
                faceReshapFilter.setFaceLandmark(landmarks);
                lipstickFilter.setFaceLandmark(landmarks);
            }
        });
        // set default value
        beautyFaceFilter.setSmoothLevel(0.5f);
        beautyFaceFilter.setWhiteLevel(0.4f);
        startStreaming("rtmp://livetw2.test.com/live/test666888?secret=abc123");
    }
    public void checkCameraPermission() {
        // check whether the camera permission has been granted
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
            // if not, request the camera permission from the user
            ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.CAMERA}, CAMERA_PERMISSION_REQUEST_CODE);
        } else if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
            // if not, request the record audio permission from the user
            ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.RECORD_AUDIO}, CAMERA_PERMISSION_REQUEST_CODE);
        } else {
            startCameraFilter();
        }
    }
    @Override
    public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        if (requestCode == CAMERA_PERMISSION_REQUEST_CODE) {
            if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
                startCameraFilter();
            } else {
                Toast.makeText(this, "No Camera permission!", Toast.LENGTH_LONG).show();
            }
        }
    }
    @Override
    public void surfaceCreated(@NonNull SurfaceHolder holder) {
        sourceCamera.setPreviewHolder(holder);
    }

    @Override
    public void surfaceChanged(@NonNull SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(@NonNull SurfaceHolder holder) {
    }
    // start streaming
    private void startStreaming(String rtmpUrl) {
        rtmpStream = new RtmpStream(GPUPixel.getInstance().getGLSurfaceView().getContext(), this);
        rtmpStream.prepareVideo(1280, 720, 128 * 1024, 30);
        rtmpStream.prepareAudio(44100, true, 128 * 1024);
        rtmpStream.startStream(rtmpUrl);
    }

    // stop streaming
    private void stopStreaming() {
        // if (rtmpCamera2.isStreaming()) {
        //     rtmpCamera2.stopStream();
        // }
    }
    @Override
    public void onConnectionStarted(@NonNull String s) {
    }

    @Override
    public void onConnectionSuccess() {
    }

    @Override
    public void onConnectionFailed(@NonNull String s) {
    }

    @Override
    public void onDisconnect() {
    }

    @Override
    public void onAuthError() {
    }

    @Override
    public void onAuthSuccess() {
    }
}
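One thing that stands out in the code above (an observation, not a tested fix): RtmpStream(context, connectChecker) uses RootEncoder's default video source, Camera2Source, which would explain why the raw camera is pushed instead of the filtered SurfaceView output. A sketch of the direction discussed earlier in this thread, in Kotlin like the previous examples (the GPUPixel raw-output hook is hypothetical; whether gpupixel can expose processed buffers or render into a Surface is the open question):

import com.pedro.encoder.input.sources.audio.MicrophoneSource
import com.pedro.encoder.input.sources.video.BufferVideoSource
import com.pedro.library.rtmp.RtmpStream

// Sketch: pass an explicit video source so the default Camera2Source is not used.
val bufferVideoSource = BufferVideoSource(BufferVideoSource.Format.NV12, 1200 * 1000)
val rtmpStream = RtmpStream(context, connectChecker, bufferVideoSource, MicrophoneSource())
if (rtmpStream.prepareVideo(1280, 720, 1200 * 1000, 30) &&
    rtmpStream.prepareAudio(44100, true, 128 * 1024)
) {
    rtmpStream.startStream(rtmpUrl) // rtmpUrl as in the code above
}
// Then call bufferVideoSource.setBuffer(...) once per processed frame from GPUPixel
// (this requires a raw-output callback from gpupixel, which is the part to verify).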