I have an SCNView that renders my scene. However, when saving, I want to capture several screenshots of the scene at different points in time.
My best guess is to create an SCNRenderer and render the scene at the specified times.
I have tried this, but my images come out blank. Here is my code; any ideas?

- (void)test
{
    // First create a new OpenGL context to render to.
    NSOpenGLPixelFormatAttribute pixelFormatAttributes[] = {
        NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersionLegacy,
        NSOpenGLPFADoubleBuffer,
        NSOpenGLPFANoRecovery,
        NSOpenGLPFAAccelerated,
        NSOpenGLPFADepthSize, 24,
        0
    };

    NSOpenGLPixelFormat *pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:pixelFormatAttributes];
    if (pixelFormat == nil)
    {
        NSLog(@"Error: No appropriate pixel format found");
    }

    NSOpenGLContext *context = [[NSOpenGLContext alloc] initWithFormat:pixelFormat shareContext:nil];

    // Set the renderer to render to that context.
    SCNRenderer *lRenderer = [SCNRenderer rendererWithContext:context.CGLContextObj options:nil];
    lRenderer.scene = myscnview.scene;
    lRenderer.pointOfView = [myscnview.pointOfView clone];

    // Render the scene.
    [lRenderer render];

    // I think I should now have the scene rendered into the context?
    // So I could just do:
    NSImage *image = [self imageFromSceneKitView:controller.docView fromWindow:window ctx:context];
}

- (NSImage *)imageFromSceneKitView:(SCNView *)sceneKitView fromWindow:(NSWindow *)window ctx:(NSOpenGLContext *)ctx
{
    NSInteger width = sceneKitView.bounds.size.width * window.backingScaleFactor;
    NSInteger height = sceneKitView.bounds.size.height * window.backingScaleFactor;
    width = width - (width % 32);
    height = height - (height % 32);
    NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL
                                                                         pixelsWide:width
                                                                         pixelsHigh:height
                                                                      bitsPerSample:8
                                                                    samplesPerPixel:4
                                                                           hasAlpha:YES
                                                                           isPlanar:NO
                                                                     colorSpaceName:NSCalibratedRGBColorSpace
                                                                        bytesPerRow:width * 4
                                                                       bitsPerPixel:4 * 8];

    CGLLockContext((CGLContextObj)[ctx CGLContextObj]);
    [ctx makeCurrentContext];
    glReadPixels(0, 0, (int)width, (int)height, GL_RGBA, GL_UNSIGNED_BYTE, [imageRep bitmapData]);
    [NSOpenGLContext clearCurrentContext];
    CGLUnlockContext((CGLContextObj)[ctx CGLContextObj]);
    NSImage *outputImage = [[NSImage alloc] initWithSize:NSMakeSize(width, height)];
    [outputImage addRepresentation:imageRep];

    NSImage *flippedImage = [NSImage imageWithSize:NSMakeSize(width, height) flipped:YES drawingHandler:^BOOL(NSRect dstRect) {
        [imageRep drawInRect:dstRect];
        return YES;
    }];
    return flippedImage;
}

Best Answer

This is a fairly old question, but let me mention the relatively new SCNView.snapshot() and SCNRenderer.snapshot(atTime:with:antialiasingMode:) APIs.
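As a minimal sketch of that approach in Objective-C (assuming macOS 10.11+ and a Metal-backed offscreen renderer; the method and view names here are placeholders for your own, and `-snapshotAtTime:withSize:antialiasingMode:` is the Objective-C counterpart of the Swift API above):

```objectivec
#import <SceneKit/SceneKit.h>

// Capture the scene shown in an existing SCNView at several timestamps,
// without any manual OpenGL context or glReadPixels work.
- (NSArray<NSImage *> *)snapshotsOfView:(SCNView *)myView
                                atTimes:(NSArray<NSNumber *> *)times
{
    // An offscreen renderer; passing nil for the device picks the system default.
    SCNRenderer *renderer = [SCNRenderer rendererWithDevice:nil options:nil];
    renderer.scene = myView.scene;
    renderer.pointOfView = myView.pointOfView;

    NSMutableArray<NSImage *> *images = [NSMutableArray array];
    for (NSNumber *t in times) {
        // Renders the scene at the given scene time and returns it as an image.
        NSImage *image = [renderer snapshotAtTime:t.doubleValue
                                         withSize:myView.bounds.size
                                 antialiasingMode:SCNAntialiasingModeMultisampling4X];
        [images addObject:image];
    }
    return images;
}
```

For a single screenshot of what the view is currently displaying, `[myView snapshot]` (macOS 10.10+) is even simpler.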

Regarding "objective-c - Render SCNView to an offscreen buffer to generate an image," we found a similar question on Stack Overflow: https://stackoverflow.com/questions/25874982/

10-09 16:12