In my project, textures are procedurally generated by methods provided by PaintCode (paint-code).
I then create an SKTextureAtlas from a dictionary filled with the UIImage objects generated by these methods:
myAtlas = SKTextureAtlas(dictionary: myTextures)
Finally, textures are retrieved from the atlas using textureNamed:
var sprite1 = SKSpriteNode(texture: myAtlas.textureNamed("texture1"))
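Put together, the setup looks roughly like this. This is only a sketch: it assumes the PaintCode-generated imageOfCell(frame:) method is exposed on StyleKit, like the drawCell method shown further below.

import UIKit
import SpriteKit

// Fill a dictionary with procedurally generated images, build the atlas,
// then create a sprite from one of its textures.
var myTextures = [String: UIImage]()
myTextures["texture1"] = StyleKit.imageOfCell(frame: CGRect(x: 0, y: 0, width: 52, height: 52))

let myAtlas = SKTextureAtlas(dictionary: myTextures)
let sprite1 = SKSpriteNode(texture: myAtlas.textureNamed("texture1"))
// sprite1 is the node that ends up oversized, as described below.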
But the displayed nodes are double-sized on the iPhone 4S simulator, and triple-sized on the iPhone 6 Plus simulator.
It seems that at init, the atlas renders the images at the device resolution. But the generated images already have the correct size and do not need to be changed. See the drawing method below.
Here is the description of the generated image:
<UIImage: 0x7f86cae56cd0>, {52, 52}
And the description of the corresponding texture in the atlas:
<SKTexture> 'image1' (156 x 156)
This is for the iPhone 6 Plus, which uses @3x images; that is why the size is multiplied by 3.
And for the iPhone 4S, which uses @2x images, as expected:
<UIImage: 0x7d55dde0>, {52, 52}
<SKTexture> 'image1' (104 x 104)
Finally, the scale property of the generated UIImage is set to the right device resolution: 2.0 for @2x (iPhone 4S) and 3.0 for @3x (iPhone 6 Plus).
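You can see the mismatch directly in code; the values in the comments are the ones reported above (a sketch reusing myTextures and myAtlas):

let image = myTextures["texture1"]!            // the generated UIImage
let texture = myAtlas.textureNamed("texture1") // the texture built by the atlas
// image.size     -> {52, 52} points
// image.scale    -> 2.0 on @2x, 3.0 on @3x
// texture.size() -> 104 x 104 or 156 x 156: the pixel dimensions, with the scale information lost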
The Question
So what can I do to keep the atlas from resizing the pictures?
Drawing method
PaintCode generates drawing methods like the following:
public class func imageOfCell(#frame: CGRect) -> UIImage {
    // Scale argument 0 = use the device's main screen scale (2.0 on @2x, 3.0 on @3x)
    UIGraphicsBeginImageContextWithOptions(frame.size, false, 0)
    StyleKit.drawCell(frame: frame)
    let imageOfCell = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()
    return imageOfCell
}
Update 1
Comparing two ways of creating an SKTexture from the same image:
// Some test image
let testImage: UIImage = ...
// Atlas creation
var myTextures = [String: UIImage]()
myTextures["texture1"] = testImage
myAtlas = SKTextureAtlas(dictionary: myTextures)
// Create two textures from the same image
let texture1 = myAtlas.textureNamed("texture1")
let texture2 = SKTexture(image: testImage)
// Wrong display: node is oversized
var sprite1 = SKSpriteNode(texture: texture1)
// Correct display
var sprite2 = SKSpriteNode(texture: texture2)
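To obtain the descriptions quoted below, you can simply print both textures (a minimal sketch):

println(texture1)   // description of the texture coming from the atlas
println(texture2)   // description of the texture built directly from the UIImage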
It seems that the problem lies in the SKTextureAtlas built from a dictionary, as the SKSpriteNode initialization does not use the scale property of the UIImage to correctly size the node.
Here are the descriptions from the console:
- texture1: '' (84 x 84)
- texture2: 'texture1' (84 x 84)
texture1 is missing some data! That could explain the lack of scale information needed to properly size the node, since:
node's size = texture's size / texture's scale.
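If you want to keep the dictionary-based atlas, one possible workaround is to apply this formula yourself after creating the node. A sketch, reusing testImage, texture1 and sprite1 from the snippet above:

// Undo the oversizing: divide the texture's reported size by the image's scale
let pointSize = CGSize(width: texture1.size().width / testImage.scale,
                       height: texture1.size().height / testImage.scale)
sprite1.size = pointSize
// Equivalently, since the UIImage size is already expressed in points:
// sprite1.size = testImage.size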
Update 2
The problem occurs when the scale property of the UIImage is different from 1.
So you can use the following method to generate a picture:
func imageOfCell(frame: CGRect, color: SKColor) -> UIImage {
    // Scale argument 0 = device screen scale, so the returned image's scale is 2.0 or 3.0
    UIGraphicsBeginImageContextWithOptions(frame.size, false, 0)
    let bezierPath = UIBezierPath(rect: frame)
    color.setFill()
    bezierPath.fill()
    let imageOfCell = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()
    return imageOfCell
}
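Conversely, forcing the bitmap context to scale 1 gives a UIImage with scale == 1.0, which, per the observation above, avoids the resizing. This is only a sketch of a possible workaround; it renders at 1x, so the texture may look blurry on Retina screens:

func imageOfCellAt1x(frame: CGRect, color: SKColor) -> UIImage {
    // Passing 1 instead of 0 forces a scale-1 bitmap, so the returned image has scale == 1.0
    UIGraphicsBeginImageContextWithOptions(frame.size, false, 1)
    let bezierPath = UIBezierPath(rect: frame)
    color.setFill()
    bezierPath.fill()
    let imageOfCell = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()
    return imageOfCell
}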
The Answer
The problem comes from using SKTextureAtlas(dictionary:) to initialize the atlas.
An SKTexture created this way does not embed the data related to the image's scale property. So during the creation of an SKSpriteNode with init(texture:), the lack of scale information in the texture leads to the texture's size being used in place of the image's size.
One way to correct it is to provide the node's size during SKSpriteNode creation: init(texture:size:).
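For example, reusing the names from Update 1, a sketch:

// Give the node its intended size (the image's point size) explicitly,
// instead of letting it fall back to the texture's pixel size.
let fixedSprite = SKSpriteNode(texture: myAtlas.textureNamed("texture1"),
                               size: testImage.size)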