Streaming a CustomView over Twilio Video with ARCore

This article looks at how to stream a CustomView (an ARCore scene) over Twilio Video. The question and answers below may be a useful reference if you run into the same problem.

Problem description

I have a problem when I want to stream a custom view with the Twilio Video API along with ARCore: basically, it streams a black screen. I used the ViewCapturer class from the example at this link https://github.com/twilio/video-quickstart-android/tree/master/exampleCustomVideoCapturer from the official documentation, but it doesn't work with ARCore, probably because of the SurfaceView inside the ArFragment.

Thanks for your support.

activity_camera.xml

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:id="@+id/container"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".CameraARActivity">

    <fragment
        android:id="@+id/ux_fragment"
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <android.support.v7.widget.RecyclerView
        android:id="@+id/recycler_view"
        android:layout_width="match_parent"
        android:layout_height="100dp"
        android:layout_alignParentBottom="true"
        android:background="#c100a5a0"
        android:visibility="gone" />

    <ImageButton
        android:id="@+id/btnCloseChat"
        android:layout_width="24dp"
        android:layout_height="24dp"
        android:layout_alignParentBottom="true"
        android:layout_alignParentEnd="true"
        android:layout_marginBottom="86dp"
        android:layout_marginEnd="13dp"
        android:background="@android:color/transparent"
        android:contentDescription="Close chat button"
        android:src="@drawable/ic_close_black_24dp"
        android:visibility="gone" />

</RelativeLayout>

The line that creates the local video track:

screenVideoTrack = LocalVideoTrack.create(CameraARActivity.this, true, new ViewCapturer(mArFragment.getArSceneView()));
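
For context, frames only reach remote participants once this track is shared with the Room. Below is a minimal sketch (not from the original question) of how the track might be attached when connecting with the Twilio Video Android SDK; accessToken, roomName and roomListener are placeholders assumed to exist elsewhere in the activity, and exact builder methods may vary slightly by SDK version.

// Requires com.twilio.video.ConnectOptions, com.twilio.video.Video,
// com.twilio.video.Room and java.util.Collections.
ConnectOptions connectOptions = new ConnectOptions.Builder(accessToken)
        .roomName(roomName)
        .videoTracks(Collections.singletonList(screenVideoTrack))
        .build();
Room room = Video.connect(CameraARActivity.this, connectOptions, roomListener);

// Alternatively, when already connected to a Room:
// room.getLocalParticipant().publishTrack(screenVideoTrack);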

And the ViewCapturer class:

import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.os.Handler;
import android.os.Looper;
import android.os.SystemClock;
import android.view.View;

import com.twilio.video.VideoCapturer;
import com.twilio.video.VideoDimensions;
import com.twilio.video.VideoFormat;
import com.twilio.video.VideoFrame;
import com.twilio.video.VideoPixelFormat;

import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;

/**
 * ViewCapturer demonstrates how to implement a custom {@link VideoCapturer}. This class
 * captures the contents of a provided view and signals the {@link VideoCapturer.Listener} when
 * the frame is available.
 */
public class ViewCapturer implements VideoCapturer {
    private static final int VIEW_CAPTURER_FRAMERATE_MS = 100;

    private final View view;
    private Handler handler = new Handler(Looper.getMainLooper());
    private VideoCapturer.Listener videoCapturerListener;
    private AtomicBoolean started = new AtomicBoolean(false);
    private final Runnable viewCapturer = new Runnable() {
        @Override
        public void run() {
            boolean dropFrame = view.getWidth() == 0 || view.getHeight() == 0;

            // Only capture the view if the dimensions have been established
            if (!dropFrame) {
                // Draw view into bitmap backed canvas
                int measuredWidth = View.MeasureSpec.makeMeasureSpec(view.getWidth(),
                        View.MeasureSpec.EXACTLY);
                int measuredHeight = View.MeasureSpec.makeMeasureSpec(view.getHeight(),
                        View.MeasureSpec.EXACTLY);
                view.measure(measuredWidth, measuredHeight);
                view.layout(0, 0, view.getMeasuredWidth(), view.getMeasuredHeight());

                Bitmap viewBitmap = Bitmap.createBitmap(view.getWidth(), view.getHeight(),
                        Bitmap.Config.ARGB_8888);
                Canvas viewCanvas = new Canvas(viewBitmap);
                view.draw(viewCanvas);

                // Extract the frame from the bitmap
                int bytes = viewBitmap.getByteCount();
                ByteBuffer buffer = ByteBuffer.allocate(bytes);
                viewBitmap.copyPixelsToBuffer(buffer);
                byte[] array = buffer.array();
                final long captureTimeNs =
                        TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());

                // Create video frame
                VideoDimensions dimensions = new VideoDimensions(view.getWidth(), view.getHeight());
                VideoFrame videoFrame = new VideoFrame(array,
                        dimensions, VideoFrame.RotationAngle.ROTATION_0, captureTimeNs);

                // Notify the listener
                if (started.get()) {
                    videoCapturerListener.onFrameCaptured(videoFrame);
                }
            }

            // Schedule the next capture
            if (started.get()) {
                handler.postDelayed(this, VIEW_CAPTURER_FRAMERATE_MS);
            }
        }
    };

    public ViewCapturer(View view) {
        this.view = view;
    }

    /**
     * Returns the list of supported formats for this view capturer. Currently, only supports
     * capturing to RGBA_8888 bitmaps.
     *
     * @return list of supported formats.
     */
    @Override
    public List<VideoFormat> getSupportedFormats() {
        List<VideoFormat> videoFormats = new ArrayList<>();
        VideoDimensions videoDimensions = new VideoDimensions(view.getWidth(), view.getHeight());
        VideoFormat videoFormat = new VideoFormat(videoDimensions, 30, VideoPixelFormat.RGBA_8888);

        videoFormats.add(videoFormat);

        return videoFormats;
    }

    /**
     * Returns true because we are capturing screen content.
     */
    @Override
    public boolean isScreencast() {
        return true;
    }

    /**
     * This will be invoked when it is time to start capturing frames.
     *
     * @param videoFormat the video format of the frames to be captured.
     * @param listener capturer listener.
     */
    @Override
    public void startCapture(VideoFormat videoFormat, Listener listener) {
        // Store the capturer listener
        this.videoCapturerListener = listener;
        this.started.set(true);

        // Notify capturer API that the capturer has started
        boolean capturerStarted = handler.postDelayed(viewCapturer,
                VIEW_CAPTURER_FRAMERATE_MS);
        this.videoCapturerListener.onCapturerStarted(capturerStarted);
    }

    /**
     * Stop capturing frames. Note that the SDK cannot receive frames once this has been invoked.
     */
    @Override
    public void stopCapture() {
        this.started.set(false);
        handler.removeCallbacks(viewCapturer);
    }
}

Solution

Drawing a SurfaceView through a Canvas (as the original ViewCapturer does with view.draw()) only captures the view hierarchy, not the surface the AR content is rendered into, which is why the stream is black. The version below uses PixelCopy to read the pixels directly from the SurfaceView's surface instead:

package com.bitdrome.dionigi.eragle.utils;

import android.graphics.Bitmap;
import android.os.Handler;
import android.os.Looper;
import android.os.SystemClock;
import android.view.PixelCopy;
import android.view.SurfaceView;
import android.view.View;

import com.twilio.video.VideoCapturer;
import com.twilio.video.VideoDimensions;
import com.twilio.video.VideoFormat;
import com.twilio.video.VideoFrame;
import com.twilio.video.VideoPixelFormat;

import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;

/**
 * ViewCapturer demonstrates how to implement a custom {@link VideoCapturer}. This class
 * captures the contents of a provided view and signals the {@link VideoCapturer.Listener} when
 * the frame is available.
 */
public class ViewCapturer implements VideoCapturer, PixelCopy.OnPixelCopyFinishedListener {
    private static int VIEW_CAPTURER_FRAMERATE_MS = 10;

    private final View view;
    private Bitmap viewBitmap;
    private Handler handler = new Handler(Looper.getMainLooper());
    private Handler handlerPixelCopy = new Handler(Looper.getMainLooper());
    private VideoCapturer.Listener videoCapturerListener;
    private AtomicBoolean started = new AtomicBoolean(false);

    public ViewCapturer(View view) {
        this(view, 24);
    }

    public ViewCapturer(View view, int framePerSecond) {
        if (framePerSecond <= 0)
            throw new IllegalArgumentException("framePerSecond must be greater than 0");
        this.view = view;
        float tmp = (1f / framePerSecond) * 1000;
        VIEW_CAPTURER_FRAMERATE_MS = Math.round(tmp);
    }

    private final Runnable viewCapturer = new Runnable() {
        @Override
        public void run() {
            boolean dropFrame = view.getWidth() == 0 || view.getHeight() == 0;

            // Only capture the view if the dimensions have been established
            if (!dropFrame) {
                // Draw view into bitmap backed canvas
                int measuredWidth = View.MeasureSpec.makeMeasureSpec(view.getWidth(),
                        View.MeasureSpec.EXACTLY);
                int measuredHeight = View.MeasureSpec.makeMeasureSpec(view.getHeight(),
                        View.MeasureSpec.EXACTLY);
                view.measure(measuredWidth, measuredHeight);
                view.layout(0, 0, view.getMeasuredWidth(), view.getMeasuredHeight());

                viewBitmap = Bitmap.createBitmap(view.getWidth(), view.getHeight(), Bitmap.Config.ARGB_8888);
                try {
                    PixelCopy.request((SurfaceView) view, viewBitmap, ViewCapturer.this, handlerPixelCopy);
                } catch (IllegalArgumentException e) {
                }
            }
        }
    };

    /**
     * Returns the list of supported formats for this view capturer. Currently, only supports
     * capturing to RGBA_8888 bitmaps.
     *
     * @return list of supported formats.
     */
    @Override
    public List<VideoFormat> getSupportedFormats() {
        List<VideoFormat> videoFormats = new ArrayList<>();
        VideoDimensions videoDimensions = new VideoDimensions(view.getWidth(), view.getHeight());
        VideoFormat videoFormat = new VideoFormat(videoDimensions, 30, VideoPixelFormat.RGBA_8888);

        videoFormats.add(videoFormat);

        return videoFormats;
    }

    /**
     * Returns true because we are capturing screen content.
     */
    @Override
    public boolean isScreencast() {
        return true;
    }

    /**
     * This will be invoked when it is time to start capturing frames.
     *
     * @param videoFormat the video format of the frames to be captured.
     * @param listener    capturer listener.
     */
    @Override
    public void startCapture(VideoFormat videoFormat, Listener listener) {
        // Store the capturer listener
        this.videoCapturerListener = listener;
        this.started.set(true);

        // Notify capturer API that the capturer has started
        boolean capturerStarted = handler.postDelayed(viewCapturer,
                VIEW_CAPTURER_FRAMERATE_MS);
        this.videoCapturerListener.onCapturerStarted(capturerStarted);
    }

    /**
     * Stop capturing frames. Note that the SDK cannot receive frames once this has been invoked.
     */
    @Override
    public void stopCapture() {
        this.started.set(false);
        handler.removeCallbacks(viewCapturer);
    }

    @Override
    public void onPixelCopyFinished(int i) {
        // Extract the frame from the bitmap
        int bytes = viewBitmap.getByteCount();
        ByteBuffer buffer = ByteBuffer.allocate(bytes);
        viewBitmap.copyPixelsToBuffer(buffer);
        byte[] array = buffer.array();
        final long captureTimeNs = TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime());

        // Create video frame
        VideoDimensions dimensions = new VideoDimensions(view.getWidth(), view.getHeight());
        VideoFrame videoFrame = new VideoFrame(array,
                dimensions, VideoFrame.RotationAngle.ROTATION_0, captureTimeNs);

        // Notify the listener
        if (started.get()) {
            videoCapturerListener.onFrameCaptured(videoFrame);
        }
        if (started.get()) {
            handler.postDelayed(viewCapturer, VIEW_CAPTURER_FRAMERATE_MS);
        }
    }
}
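
Because Sceneform's ArSceneView extends SurfaceView, the PixelCopy-based capturer above can be pointed directly at the AR view. A minimal usage sketch follows (not part of the original answer; the 24 fps value is just an example):

// Hypothetical wiring inside CameraARActivity, assuming mArFragment is the ArFragment from the layout.
// ArSceneView is a SurfaceView subclass, so the (SurfaceView) cast inside the capturer succeeds.
ViewCapturer arCapturer = new ViewCapturer(mArFragment.getArSceneView(), 24);
LocalVideoTrack arVideoTrack = LocalVideoTrack.create(CameraARActivity.this, true, arCapturer);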

Recommended answer

The code above was modified to stream at the closest supported resolution to VGA, per the Twilio docs, and to add the ability to swap surfaces (including setting the surface to null).

package com.company.app

import android.graphics.Bitmap
import android.os.Handler
import android.os.Looper
import android.os.SystemClock
import android.view.PixelCopy
import android.view.SurfaceView
import com.twilio.video.*
import java.nio.ByteBuffer
import java.util.concurrent.TimeUnit
import java.util.concurrent.locks.ReentrantLock
import kotlin.math.roundToLong

class SurfaceViewCapturer : VideoCapturer {
    private var surfaceView: SurfaceView? = null

    private lateinit var viewBitmap: Bitmap
    private lateinit var videoCapturerListener: VideoCapturer.Listener

    private val handler = Handler(Looper.getMainLooper())
    private val handlerPixelCopy = Handler(Looper.getMainLooper())
    private var started: Boolean = false

    // Twilio selects closest supported VideoFormat to 640x480 at 30 frames per second.
    // https://media.twiliocdn.com/sdk/android/video/releases/1.0.0-beta17/docs/com/twilio/video/LocalVideoTrack.html

    private val framesPerSecond: Int = 30
    private val streamWidth: Int = VideoDimensions.VGA_VIDEO_WIDTH
    private val streamHeight: Int = VideoDimensions.VGA_VIDEO_HEIGHT

    private val viewCapturerFrameRateMs: Long =
        (TimeUnit.SECONDS.toMillis(1).toFloat() / framesPerSecond.toFloat()).roundToLong()

    private val reentrantLock = ReentrantLock()

    fun changeSurfaceView(surfaceView: SurfaceView?) {
        reentrantLock.lock()
        this.surfaceView = surfaceView
        reentrantLock.unlock()
    }

    private val viewCapturer: Runnable = object : Runnable {
        override fun run() {
            reentrantLock.lock()

            val surfaceView = surfaceView

            if (started.not()) {
                reentrantLock.unlock()
                return
            }

            if (surfaceView == null ||
                surfaceView.width == 0 ||
                surfaceView.height == 0 ||
                surfaceView.holder.surface.isValid.not()
            ) {
                handler.postDelayed(this, viewCapturerFrameRateMs)
                reentrantLock.unlock()
                return
            }

            // calculate frame width with fixed stream height while maintaining aspect ratio
            val frameWidthFixedHeight: Int = (surfaceView.width * streamHeight) / surfaceView.height

            // calculate frame height with fixed stream width while maintaining aspect ratio
            val frameHeightFixedWidth: Int = (surfaceView.height * streamWidth) / surfaceView.width

            // choose ratio that has more pixels
            val (frameWidth, frameHeight) =
                if (frameWidthFixedHeight * streamHeight >= frameHeightFixedWidth * streamWidth) {
                    Pair(frameWidthFixedHeight, streamHeight)
                } else {
                    Pair(streamWidth, frameHeightFixedWidth)
                }

            viewBitmap = Bitmap.createBitmap(frameWidth, frameHeight, Bitmap.Config.ARGB_8888)

            // reentrantLock.unlock() happens in the PixelCopy callback below
            PixelCopy.request(
                surfaceView,
                viewBitmap,
                {
                    val buffer = ByteBuffer.allocate(viewBitmap.byteCount)
                    viewBitmap.copyPixelsToBuffer(buffer)

                    // Create video frame
                    val dimensions = VideoDimensions(frameWidth, frameHeight)

                    val videoFrame =
                        VideoFrame(
                            buffer.array(),
                            dimensions,
                            VideoFrame.RotationAngle.ROTATION_0,
                            TimeUnit.MILLISECONDS.toNanos(SystemClock.elapsedRealtime())
                        )

                    // Notify the listener
                    videoCapturerListener.onFrameCaptured(videoFrame)
                    handler.postDelayed(this, viewCapturerFrameRateMs)
                    reentrantLock.unlock()
                },
                handlerPixelCopy
            )
        }
    }

    override fun getSupportedFormats(): List<VideoFormat> =
        listOf(
            VideoFormat(
                VideoDimensions(streamWidth, streamHeight),
                framesPerSecond,
                VideoPixelFormat.RGBA_8888
            )
        )

    override fun isScreencast(): Boolean {
        return true
    }

    override fun startCapture(
        captureFormat: VideoFormat,
        capturerListener: VideoCapturer.Listener
    ) {
        reentrantLock.lock()
        // Store the capturer listener
        videoCapturerListener = capturerListener
        started = true

        // Notify capturer API that the capturer has started
        val capturerStarted = handler.postDelayed(viewCapturer, viewCapturerFrameRateMs)
        videoCapturerListener.onCapturerStarted(capturerStarted)
        reentrantLock.unlock()
    }

    /**
     * Stop capturing frames. Note that the SDK cannot receive frames once this has been invoked.
     */
    override fun stopCapture() {
        reentrantLock.lock()
        started = false
        handler.removeCallbacks(viewCapturer)
        reentrantLock.unlock()
    }
}
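
For completeness, here is a minimal usage sketch of SurfaceViewCapturer, written in Java to match the question's activity code (an illustration, not part of the original answer, assuming the same mArFragment). Passing null to changeSurfaceView stops frames from being produced until a new surface is supplied.

// Hypothetical usage from the Java activity: swap the AR surface in, then create the track.
SurfaceViewCapturer surfaceCapturer = new SurfaceViewCapturer();
surfaceCapturer.changeSurfaceView(mArFragment.getArSceneView());
LocalVideoTrack localVideoTrack = LocalVideoTrack.create(CameraARActivity.this, true, surfaceCapturer);

// Later, e.g. when the AR fragment is torn down:
// surfaceCapturer.changeSurfaceView(null);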

This concludes this article on streaming a CustomView (ARCore) over Twilio Video; hopefully the answers above are helpful.
