I'm running into a strange problem in my OpenGL/Skia Android Camera2 app.
My camera renders its frames into a SurfaceTexture, which is a GL_TEXTURE_EXTERNAL_OES texture in OpenGL.
I can then simply render this OpenGL texture to all of my outputs (the 1920x1080 Preview EGLSurface and the 4000x2000 Video Recorder EGLSurface) using a simple pass-through shader.
Camera --> GL_TEXTURE_EXTERNAL_OES
GL_TEXTURE_EXTERNAL_OES --> PassThroughShader --> Preview Output EGLSurface
GL_TEXTURE_EXTERNAL_OES --> PassThroughShader --> Video Recorder Output EGLSurface
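The pass-through step can be sketched as a single GLES draw call. This is a minimal sketch, not my actual code: it assumes an EGL context is already current on the target EGLSurface, that the pass-through program and the fullscreen-quad vertex attributes were set up elsewhere, and the function name drawPassThrough is made up:

```kotlin
import android.opengl.GLES11Ext
import android.opengl.GLES20

// Hypothetical sketch: draw the external camera texture into whatever
// surface/framebuffer is currently bound, using a pass-through program.
fun drawPassThrough(program: Int, oesTextureId: Int, width: Int, height: Int) {
    GLES20.glViewport(0, 0, width, height)
    GLES20.glUseProgram(program)
    // Bind the camera frame as an external OES texture on unit 0.
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0)
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, oesTextureId)
    GLES20.glUniform1i(GLES20.glGetUniformLocation(program, "uTexture"), 0)
    // Fullscreen quad; vertex attributes are assumed to be set up elsewhere.
    GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4)
}
```

The same call works for both outputs; only the viewport size (and the EGLSurface that eglSwapBuffers is called on) changes.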
Now I want to bring Skia into the mix, which allows me to render onto the camera frame before passing it to the outputs (e.g. drawing a red box onto the frame). Since I can't directly render onto the same GL_TEXTURE_EXTERNAL_OES again, I created a separate offscreen texture (GL_TEXTURE_2D) and a separate offscreen framebuffer (FBO1), and attached them.
Now when I render into FBO1, the offscreen GL_TEXTURE_2D texture gets updated, and I then want to pass that GL_TEXTURE_2D to my outputs:
Camera --> GL_TEXTURE_EXTERNAL_OES
GL_TEXTURE_EXTERNAL_OES --> Skia to FBO1 + drawing a red box --> GL_TEXTURE_2D
GL_TEXTURE_2D --> PassThroughShader --> Preview Output EGLSurface
GL_TEXTURE_2D …

I'm trying to create a Camera2 CameraCaptureSession with four outputs:
1. Preview (SurfaceView, up to 1080p)
2. Photo capture (ImageReader, up to 8k photos)
3. Video capture (MediaRecorder/MediaCodec, up to 4k videos)
4. Frame processing (ImageReader, up to 4k video frames)

Unfortunately, Camera2 does not support attaching all four of these outputs (surfaces) at the same time, so I have to make a compromise.
The most logical compromise for me would be to merge the two video-capture pipelines into one, so that the frame-processing output (#4, the ImageReader) redirects its frames into the video-capture output (#3, the MediaRecorder).
How do I write the Images from the ImageReader:
val imageReader = ImageReader.newInstance(4000, 2256, ImageFormat.YUV_420_888, 3)
imageReader.setOnImageAvailableListener({ reader ->
    val image = reader.acquireNextImage() ?: return@setOnImageAvailableListener
    callback.onVideoFrameCaptured(image)
}, queue.handler)
val captureSession = device.createCaptureSession(.., imageReader.surface)
..into the Surface of the MediaRecorder?
val surface = MediaCodec.createPersistentInputSurface()
val recorder = MediaRecorder(context)
..
recorder.setInputSurface(surface)
I think I might need an OpenGL pipeline with a pass-through shader - but I don't know how to get from the ImageReader …
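One possible OpenGL-free route might be an ImageWriter targeting the recorder's persistent input surface. This is only a sketch: whether the recorder surface actually accepts YUV_420_888 frames queued this way depends on the device and codec, and imageReader/callback/queue are the objects from the snippets above:

```kotlin
import android.media.ImageWriter
import android.media.MediaCodec

// Sketch: forward frames from the processing ImageReader into the
// recorder's persistent input surface via an ImageWriter.
val recorderSurface = MediaCodec.createPersistentInputSurface()
val imageWriter = ImageWriter.newInstance(recorderSurface, 3)

imageReader.setOnImageAvailableListener({ reader ->
    val image = reader.acquireNextImage() ?: return@setOnImageAvailableListener
    callback.onVideoFrameCaptured(image)
    // Hand the same Image over to the recorder surface.
    // queueInputImage() takes ownership of the Image and closes it.
    imageWriter.queueInputImage(image)
}, queue.handler)
```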
android opengl-es surfaceview android-camera android-camera2
I have created a camera with AVFoundation that is able to record video and audio using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput. I create my capture session, attach all the inputs as well as the video and audio data outputs, and the camera then sits idle. The user can now start a video recording.
The problem is that the moment I start the capture session (captureSession.startRunning()), background music begins to stutter. I believe this is because as soon as the capture session starts running, the AVCaptureAudioDataOutput internally activates the AVAudioSession (AVAudioSession.setActive(...)), which I don't want it to do. I want it to stay idle (and not deliver any audio output buffers) until I explicitly activate the audio session (once the user starts recording).
This is really annoying, because the camera is the start screen of our app, so every time the user opens or closes the app, his music stutters.
I know this must somehow be possible, because Snapchat works this way - you open the app and the background audio keeps playing smoothly. Once you start a recording there is a small stutter in the background music, but the camera keeps running smoothly and starts recording after the short stutter.
My code:
func configureSession() {
  captureSession.beginConfiguration()

  // Video, Photo and Audio Inputs
  ...

  // Video Output
  ...

  // Audio Output
  audioOutput = AVCaptureAudioDataOutput()
  guard captureSession.canAddOutput(audioOutput!) else {
    throw CameraError.parameter(.unsupportedOutput(outputDescriptor: "audio-output"))
  }
  audioOutput!.setSampleBufferDelegate(self, queue: audioQueue)
  captureSession.addOutput(audioOutput!)

  try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.playAndRecord,
                                                  options: [.mixWithOthers,
                                                            .allowBluetoothA2DP,
                                                            .defaultToSpeaker,
                                                            .allowAirPlay])

  captureSession.commitConfiguration()
}
I tried configuring the AVAudioSession.sharedInstance() with the category AVAudioSession.Category.playback first, and then switching to .playAndRecord when I want to start recording audio.
That didn't work, and the AVCaptureSessionRuntimeError event gets invoked with an error code right after starting the camera …
I'm trying to define custom types for a navigation library using TypeScript, but I haven't yet managed to create a navigate function that takes the name of a screen as its first parameter and the screen's props as its second parameter while still being type-safe.
I'm trying to:

- accept only valid screen names (keyof Stack) as the first parameter (navigate<LoginStack>('HomeScreen') should never work; only screens from the LoginStack should be possible)
- type the second parameter as the props of the screen selected via keyof Stack

My current approach:
ScreenDefinitions.ts:
export interface LoginStackScreens {
  LoginScreen: BaseProps & { isHeadless?: boolean };
  ForgotPasswordScreen: BaseProps & { email: string };
}

export interface HomeStackScreens {
  HomeScreen: BaseProps & { isHeadless?: boolean };
  SavedScreen: BaseProps;
  ChatsListScreen: BaseProps;
  MyProfileScreen: BaseProps;
  PostDetailsScreen: BaseProps & {
    post: Post;
  };
  // ... some more
}
Navigation.ts:
type ValueOf<T> = …

I'm working on a native module for React Native which wraps the CameraX API. The CameraX API is lifecycle-aware, so it requires me to pass an androidx Lifecycle (or androidx LifecycleOwner) into its constructor.
Since those are androidx classes, there is no way to get a Lifecycle (or LifecycleOwner) from the React Native context.
React Native does however implement its own custom lifecycle event listener (ReactContext::addLifecycleEventListener with a LifecycleEventListener), which I'm now trying to "convert"/map to an androidx Lifecycle (or LifecycleOwner), but I can't figure out how.
val lifecycle: Lifecycle = ???

reactContext.addLifecycleEventListener(object : LifecycleEventListener {
  override fun onHostResume() {
    TODO("Not yet implemented")
  }

  override fun onHostPause() {
    TODO("Not yet implemented")
  }

  override fun onHostDestroy() {
    TODO("Not yet implemented")
  }
})

cameraProvider.bindToLifecycle(lifecycle, cameraSelector, preview)
My question now is: how can I, from my React …
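A sketch of the direction I'm considering (untested): back the androidx Lifecycle with a LifecycleRegistry and forward React Native's lifecycle events into it. Whether getLifecycle() is an override fun or the lifecycle property depends on the androidx version in use:

```kotlin
import androidx.lifecycle.Lifecycle
import androidx.lifecycle.LifecycleOwner
import androidx.lifecycle.LifecycleRegistry
import com.facebook.react.bridge.LifecycleEventListener

// An owner whose LifecycleRegistry is driven by React Native's
// LifecycleEventListener callbacks.
class ReactLifecycleOwner : LifecycleOwner, LifecycleEventListener {
  private val registry = LifecycleRegistry(this)

  override fun getLifecycle(): Lifecycle = registry

  override fun onHostResume() {
    registry.handleLifecycleEvent(Lifecycle.Event.ON_RESUME)
  }

  override fun onHostPause() {
    registry.handleLifecycleEvent(Lifecycle.Event.ON_PAUSE)
  }

  override fun onHostDestroy() {
    registry.handleLifecycleEvent(Lifecycle.Event.ON_DESTROY)
  }
}
```

The owner would be registered via reactContext.addLifecycleEventListener(owner) and then passed to cameraProvider.bindToLifecycle(owner, cameraSelector, preview).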
I'm working on a camera app that uses the presentationDimensions(...) API:
if #available(iOS 13.0, *) {
  let leftVideo = self.formatDescription.presentationDimensions()
  let rightVideo = other.formatDescription.presentationDimensions()
  // ...
}
Now when I try to build the project, I get the following error:
Undefined symbols: (extension in CoreMedia):__C.CMFormatDescriptionRef.presentationDimensions(usePixelAspectRatio: Swift.Bool, useCleanAperture: Swift.Bool) -> __C.CGSize
Here are the final Xcode logs (the error):
Ld /Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/VisionCameraExample.app/VisionCameraExample normal (in target 'VisionCameraExample' from project 'VisionCameraExample')
cd /Users/mrousavy/Projects/react-native-vision-camera/example/ios
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -target arm64-apple-ios12.0 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS14.4.sdk -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos -L/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib/swift/iphoneos -L/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib/swift-5.0/iphoneos -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/CocoaAsyncSocket -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/DoubleConversion -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/FBReactNativeSpec -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/Flipper -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/Flipper-DoubleConversion -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/Flipper-Folly -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/Flipper-Glog -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/Flipper-PeerTalk -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/Flipper-RSocket 
-L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/FlipperKit -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/Folly -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/RCTTypeSafety -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/React-Core -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/React-CoreModules -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/React-RCTAnimation -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/React-RCTBlob -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/React-RCTImage -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/React-RCTLinking -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/React-RCTNetwork -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/React-RCTSettings -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/React-RCTText -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/React-RCTVibration 
-L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/React-cxxreact -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/React-jsi -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/React-jsiexecutor -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/React-jsinspector -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/ReactCommon -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/Yoga -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/YogaKit -L/Users/mrousavy/Library/Developer/Xcode/DerivedData/VisionCameraExample-aeiaddcztqvxgygupinrzjkkdodr/Build/Products/Debug-iphoneos/glog …