Kul*_*lur · 5 · Tags: screenshot, opengl-es, ios, wkwebview, swift3
I'm having serious trouble taking a screenshot of WKWebView content when it contains hardware-accelerated content (certain casino games running inside an iframe). So far I have used the standard screenshot approach that everyone suggests:
UIGraphicsBeginImageContextWithOptions(containerView.frame.size, true, 0.0)
containerView.layer.render(in: UIGraphicsGetCurrentContext()!)
//This line helps to fix view rendering for taking screenshot on older iOS devices
containerView.drawHierarchy(in: containerView.bounds, afterScreenUpdates: true)
let image = UIGraphicsGetImageFromCurrentImageContext()!
UIGraphicsEndImageContext()
This approach worked perfectly until my WKWebView started showing content rendered by the GPU. GPU-rendered content appears black in the screenshot. I've tried every variation of this approach I could think of, but nothing helped. Even the Xcode view hierarchy debugger cannot display the hardware-accelerated content. So, as on Android, I need a different way to take the screenshot. On Android I solved a similar problem by starting a recording of everything happening on screen and stopping it as soon as I had captured the first frame.
I've gone through many Stack Overflow questions and answers, but most of them are in Obj-C (which I'm not good at at all), outdated, or not specific enough for my needs.
Now I've found that I can read pixels out of OpenGL with glReadPixels (if my content is hardware accelerated, it makes sense that I could read those pixels back from the graphics card, right??).
So far I've managed to put together a Swift snippet that does something like renderBuffer -> frameBuffer -> glReadPixels -> image:
let width = Int(containerView.frame.size.width)
let height = Int(containerView.frame.size.height)

// Create a dedicated OpenGL ES 3 context and make it current.
let context2 = EAGLContext(api: .openGLES3)
EAGLContext.setCurrent(context2)

// Set up a renderbuffer to hold the pixels.
// Generate exactly 1 buffer; the original code passed 10, which writes
// past the single GLuint variable.
var renderBuffer: GLuint = 0
glGenRenderbuffers(1, &renderBuffer)
glBindRenderbuffer(GLenum(GL_RENDERBUFFER), renderBuffer)
glRenderbufferStorage(GLenum(GL_RENDERBUFFER), GLenum(GL_RGBA8), GLsizei(width), GLsizei(height))

// Set up a framebuffer and attach the renderbuffer as its color attachment.
var frameBuffer: GLuint = 0
glGenFramebuffers(1, &frameBuffer)
glBindFramebuffer(GLenum(GL_FRAMEBUFFER), frameBuffer)
glFramebufferRenderbuffer(GLenum(GL_FRAMEBUFFER), GLenum(GL_COLOR_ATTACHMENT0), GLenum(GL_RENDERBUFFER), renderBuffer)

// Read from the color attachment; GL_RENDERBUFFER is not a valid
// argument to glReadBuffer.
glReadBuffer(GLenum(GL_COLOR_ATTACHMENT0))

// "Draw" -- nothing ever renders into this framebuffer except the clear color.
glClearColor(0.1, 0.2, 0.3, 0.2)
glClear(GLbitfield(GL_COLOR_BUFFER_BIT))

// Read the pixels back and wrap them in a CGImage.
let bytes = malloc(width * height * 4)
glReadPixels(0, 0, GLsizei(width), GLsizei(height), GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), bytes)
let data = NSData(bytes: bytes, length: width * height * 4) // copies the buffer
free(bytes)
let dataProvider = CGDataProvider(data: data)!
let colorspace = CGColorSpaceCreateDeviceRGB()
let bitmapInfo: CGBitmapInfo = [.byteOrder32Little, CGBitmapInfo(rawValue: CGImageAlphaInfo.last.rawValue)]
let aCGImage = CGImage(
    width: width,
    height: height,
    bitsPerComponent: 8,
    bitsPerPixel: 32,
    bytesPerRow: 4 * width,
    space: colorspace,
    bitmapInfo: bitmapInfo,
    provider: dataProvider,
    decode: nil,
    shouldInterpolate: true,
    intent: .defaultIntent
)!
let image = UIImage(cgImage: aCGImage)
// I get an image that is a solid fill of the color defined at clearColor
My question now is: am I on the right track? Is it somehow possible to get my WKWebView (or a snapshot of it placed in a UIView) to render into the renderbuffer/framebuffer, so that glReadPixels actually reads what is on screen?
P.S.! I've seen plenty of questions about getting a UIImage out of a CAEAGLLayer, but in my case it's not a CAEAGLLayer, it's a WKWebView. Much appreciated.
小智 0
I have a simple way to take the snapshot; here is the code.
- (void)screenLoogShoot:(void (^)(UIImage *fullImage))snapShotHandler {
    WKWebView *webView = self.webView;
    UIScrollView *scrollView = webView.scrollView;
    CGFloat boundsWidth = scrollView.bounds.size.width;
    CGFloat contentHeight = scrollView.contentSize.height;
    CGFloat scale = [UIScreen mainScreen].scale;
    CGRect oldFrame = self.webView.frame;

    // Resize the web view so the full content is laid out, wait briefly
    // for rendering, then snapshot and restore the original frame.
    self.webView.frame = CGRectMake(0, 0, boundsWidth, contentHeight);
    if (@available(iOS 11.0, *)) {
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.3 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
            // WKSnapshotConfiguration + takeSnapshot captures the web view's
            // composited output, including GPU-rendered content.
            WKSnapshotConfiguration *configuration = [WKSnapshotConfiguration new];
            configuration.rect = CGRectMake(0, 0, boundsWidth, contentHeight);
            configuration.snapshotWidth = @(boundsWidth);
            [self.webView takeSnapshotWithConfiguration:configuration completionHandler:^(UIImage * _Nullable snapshotImage, NSError * _Nullable error) {
                UIGraphicsBeginImageContextWithOptions(CGSizeMake(boundsWidth, contentHeight), NO, scale);
                [snapshotImage drawInRect:CGRectMake(0, 0, boundsWidth, contentHeight)];
                UIImage *fullImage = UIGraphicsGetImageFromCurrentImageContext();
                UIGraphicsEndImageContext();
                self.webView.frame = oldFrame;
                !snapShotHandler ? : snapShotHandler(fullImage);
            }];
        });
    } else {
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.3 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
            // Fallback for older systems; drawViewHierarchyInRect may still
            // miss hardware-accelerated layers.
            UIGraphicsBeginImageContextWithOptions(CGSizeMake(webView.bounds.size.width, webView.bounds.size.height), NO, scale);
            [webView drawViewHierarchyInRect:CGRectMake(0, 0, boundsWidth, contentHeight) afterScreenUpdates:YES];
            UIImage *fullImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            self.webView.frame = oldFrame;
            !snapShotHandler ? : snapShotHandler(fullImage);
        });
    }
}
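Since the question is tagged swift3, the iOS 11+ branch above maps roughly to the following Swift sketch. `WKSnapshotConfiguration` and `takeSnapshot(with:completionHandler:)` are the real WebKit APIs; the `takeFullSnapshot` function name and the 0.3-second render delay are illustrative choices, not anything mandated by the framework:

```swift
import WebKit

/// Snapshots the full scrollable content of a WKWebView (iOS 11+).
func takeFullSnapshot(of webView: WKWebView,
                      completion: @escaping (UIImage?) -> Void) {
    let width = webView.scrollView.bounds.width
    let height = webView.scrollView.contentSize.height
    let oldFrame = webView.frame

    // Resize so the whole content is laid out, then give WebKit a
    // moment to render before snapshotting.
    webView.frame = CGRect(x: 0, y: 0, width: width, height: height)
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.3) {
        let configuration = WKSnapshotConfiguration()
        configuration.rect = CGRect(x: 0, y: 0, width: width, height: height)
        configuration.snapshotWidth = NSNumber(value: Double(width))
        webView.takeSnapshot(with: configuration) { image, _ in
            webView.frame = oldFrame
            completion(image)
        }
    }
}
```

Because the snapshot is produced by WebKit itself rather than by rendering the layer tree into a CGContext, it should include the composited (hardware-accelerated) content that `layer.render(in:)` leaves black.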
Viewed 994 times.