The overall design was described in the previous article, Android -> Windows Diversified Screen Projection. This post records the implementation.
(1) Screen capture
MediaProjection/VirtualDisplay
A mirror-type VirtualDisplay (VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR) cannot be created directly because of permission restrictions; the user has to be asked for authorization through MediaProjection.
MediaProjectionManager mediaManager =
        (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
startActivityForResult(mediaManager.createScreenCaptureIntent(), 100, null);
Once the user confirms, create a MediaProjection in onActivityResult and keep it for later use:
@Override
protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
    if (requestCode == 100 && resultCode == RESULT_OK) {
        mMediaProjection = mMediaManager.getMediaProjection(resultCode, data);
    }
}
VirtualDisplay can be created multiple times from the same MediaProjection:
mMirrorDisplay = mMediaProjection.createVirtualDisplay("Mirror",
        REQUEST_DISPLAY_WIDTH, REQUEST_DISPLAY_HEIGHT, mMetrics.densityDpi,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
        null /* surface */, null /* callback */, null /* handler */);
Presentation/VirtualDisplay
Presentation is a Dialog that renders onto a VirtualDisplay, providing the extended-screen function. The dialog itself is not visible on the Android device.
Create the VirtualDisplay first, with almost the same parameters as the MediaProjection version, but with the VIRTUAL_DISPLAY_FLAG_PRESENTATION flag:
mPresentationDisplay = mDisplayManager.createVirtualDisplay("presentation",
        REQUEST_DISPLAY_WIDTH, REQUEST_DISPLAY_HEIGHT, mMetrics.densityDpi,
        null /* surface */, DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION,
        null /* callback */, null /* handler */);
Then create the Presentation dialog on that display and show it:
Presentation mPresentation = new Presentation(mContext, mPresentationDisplay.getDisplay());
mPresentation.setContentView(dialogView);
mPresentation.show();
(2) OpenGL dual-screen composition
The OpenGL code follows a fixed framework (GLThread + GLRenderer + GLFilter + RenderScript) that is not detailed here; only the parts related to screen composition are extracted.
GLThread
Android does not provide a ready-made GLThread class, so GLSurfaceView is used to implement the GL rendering thread.
public final class GLThread implements Renderer {

    public GLThread(Context context, SurfaceHolder holder) {
        mHolder = holder;
        mGLView = new GLSurfaceView(context) {
            @Override
            public SurfaceHolder getHolder() {
                // Hand the external SurfaceHolder to GLSurfaceView instead of its own
                return mHolder;
            }
        };
        mGLView.setEGLContextClientVersion(2);
        mGLView.setRenderer(this);
        mGLView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
    }
}
GLSurfaceView allows its Renderer to be set only once. To be able to switch renderers, GLThread manages the actual Renderer internally and registers itself as the GLSurfaceView's Renderer, forwarding the draw callbacks.
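A minimal sketch of that forwarding, assuming an mRenderer field holds the currently active renderer (the field and method names here are not from the article):

private volatile GLSurfaceView.Renderer mRenderer;

public void switchRenderer(GLSurfaceView.Renderer renderer) {
    mRenderer = renderer;
    mGLView.requestRender();    // RENDERMODE_WHEN_DIRTY: request a new frame
}

@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    if (mRenderer != null) mRenderer.onSurfaceCreated(gl, config);
}

@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
    if (mRenderer != null) mRenderer.onSurfaceChanged(gl, width, height);
}

@Override
public void onDrawFrame(GL10 gl) {
    if (mRenderer != null) mRenderer.onDrawFrame(gl);
}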
An external SurfaceHolder is also used instead of the GLSurfaceView's own SurfaceHolder, because the rendering output has to go to an external Surface. This SurfaceHolder has to be implemented by hand, and the hidden Surface.transferFrom method has to be called via Java reflection.
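The reflection call might look roughly like this; it is only a sketch and assumes the hidden method in question is transferFrom(Surface), which is not part of the public SDK and may be restricted on newer Android versions:

// Move the native surface of `src` into `dst` (hidden API, use with care).
static void transferSurface(Surface dst, Surface src) {
    try {
        Method transferFrom = Surface.class.getMethod("transferFrom", Surface.class);
        transferFrom.invoke(dst, src);
    } catch (ReflectiveOperationException e) {
        throw new RuntimeException("Surface.transferFrom not available", e);
    }
}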
GLDisplayRenderer
Two GLTextures are created inside GLDisplayRenderer, to be used by the two VirtualDisplays.
mPresentTexture = new GLTexture();
mPresentTexture.setOESImage();
mMirrorTexture = new GLTexture();
mMirrorTexture.setOESImage();

// stand-alone work thread
Runnable work = new Runnable() {
    @Override
    public synchronized void run() {
        mMirrorSurfaceTexture = new SurfaceTexture(mMirrorTexture.id());
        mPresentSurfaceTexture = new SurfaceTexture(mPresentTexture.id());
        notify();
    }
};
synchronized (work) {
    sWorkThread.post(work);
    try {
        work.wait();
    } catch (InterruptedException e) {
    }
}
mMirrorSurfaceTexture.setOnFrameAvailableListener(this);
mPresentSurfaceTexture.setOnFrameAvailableListener(this);
onTextureReady(mMirrorSurfaceTexture, mPresentSurfaceTexture);
Once SurfaceTexture is ready, bind it to VirtualDisplay:
protected void onTextureReady(SurfaceTexture mirrorTexture, SurfaceTexture presentTexture) {
    mirrorTexture.setDefaultBufferSize(REQUEST_DISPLAY_WIDTH, REQUEST_DISPLAY_HEIGHT);
    presentTexture.setDefaultBufferSize(REQUEST_DISPLAY_WIDTH, REQUEST_DISPLAY_HEIGHT);
    mMirrorDisplay.setSurface(new Surface(mirrorTexture));
    mPresentationDisplay.setSurface(new Surface(presentTexture));
}
setDefaultBufferSize is required here; otherwise the VirtualDisplay produces no output.
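The composition itself then happens in the GL draw callback: both SurfaceTextures are updated and the two textures are drawn into the shared output surface. The following is only a sketch under assumptions not stated in the article (the side-by-side layout, the mOutputWidth/mOutputHeight fields, and the drawTexture helper standing in for the GLFilter-based drawing):

@Override
public void onDrawFrame(GL10 gl) {
    // Must run on the GL thread that owns the texture ids.
    mMirrorSurfaceTexture.updateTexImage();
    mPresentSurfaceTexture.updateTexImage();

    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

    // Mirror on the left half, Presentation content on the right half.
    GLES20.glViewport(0, 0, mOutputWidth / 2, mOutputHeight);
    drawTexture(mMirrorTexture);      // hypothetical GLFilter-style draw helper
    GLES20.glViewport(mOutputWidth / 2, 0, mOutputWidth / 2, mOutputHeight);
    drawTexture(mPresentTexture);
}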
(3) Video encoding
Select Encoder
The encoder selection follows the approach in libstreaming; it is used as-is for now and the details are not analyzed here.
https://github.com/fyhertz/libstreaming/tree/master/src/net/majorkernelpanic/streaming/hw
All three of these classes are needed; the entry point is used as follows:
mEncoder = MediaCodec.createByCodecName(
        EncoderDebugger.debug(mContext, 1024, 768).getEncoderName());
Configure Encoder
int bitrate = width * height * (int) frameRate / 8;
format = MediaFormat.createVideoFormat("video/avc", width, height);
format.setFloat(MediaFormat.KEY_FRAME_RATE, frameRate);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,     // TODO: derive from MIME type
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
format.setInteger(MediaFormat.KEY_LATENCY, 0);
mEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Connect with OpenGL
The encoder's input Surface is handed to the wrapper SurfaceHolder on the OpenGL side; this triggers onSurfaceCreated and completes the connection to OpenGL.
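A sketch of that hand-off: createInputSurface is the standard MediaCodec call (valid between configure and start), while setSurface on the wrapper holder is an assumed name for whatever the custom SurfaceHolder actually exposes:

// After configure() and before start(): obtain the encoder's input Surface
// and give it to the wrapper SurfaceHolder used by GLThread.
Surface inputSurface = mEncoder.createInputSurface();
mSurfaceHolder.setSurface(inputSurface);   // hypothetical method on the custom holder
// The wrapper then notifies its SurfaceHolder.Callbacks, GLSurfaceView creates an
// EGL window surface on the encoder Surface, and the renderer's onSurfaceCreated fires.
mEncoder.start();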
Encoded Output
The output is driven by a dedicated thread:
WritableByteChannel c = Channels.newChannel(os);
while (!Thread.interrupted()) {
    if (popSample(c)) {
        os.flush();
        ++numTotal;
    }
}
The popSample implementation is as follows:
int index = mEncoder.dequeueOutputBuffer(mBufferInfo, timeout * 1000);
if (index >= 0) {
    ByteBuffer bytes = mEncoder.getOutputBuffer(index);
    channel.write(bytes);
    mEncoder.releaseOutputBuffer(index, false);
}
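For reference, a slightly fuller sketch of popSample with the other dequeueOutputBuffer return codes handled; the boolean return value and field names are assumptions, not taken from the article:

// Returns true when one encoded sample was written to the channel.
private boolean popSample(WritableByteChannel channel) throws IOException {
    int index = mEncoder.dequeueOutputBuffer(mBufferInfo, timeout * 1000);
    if (index >= 0) {
        // getOutputBuffer() already limits the buffer to the valid data range.
        ByteBuffer bytes = mEncoder.getOutputBuffer(index);
        channel.write(bytes);
        mEncoder.releaseOutputBuffer(index, false);
        return true;
    }
    if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        // New output format; for H.264 it carries SPS/PPS as csd-0/csd-1.
        MediaFormat newFormat = mEncoder.getOutputFormat();
    }
    // INFO_TRY_AGAIN_LATER: nothing available within the timeout.
    return false;
}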
(4) HTTP output