Hi!
I’m getting some strange behaviour in my game that seems to be related to Mac Retina displays. I’m not getting bug reports about this from any other OS.
Basically, rather than using game objects/tiles for the bottom layer (the floor) of my levels, I load one large PNG file as a texture into a GUI that tracks the camera. The script looks like this (posted for reference only, as it does work 99% of the time):
-- decode the pre-rendered floor PNG into an image buffer
local img = image.load(ground_png_file, true)
-- create a box node sized to the image, positioned from the level's row/column counts
self.ground_node = gui.new_box_node(vmath.vector3(current_num_rows * 20, current_num_cols * 10 - 4, 0), vmath.vector3(img.width, img.height, 0))
-- create a runtime GUI texture from the image data and apply it to the node
gui.new_texture("ground_texture", img.width, img.height, "rgba", img.buffer, false)
gui.set_texture(self.ground_node, "ground_texture")
gui.set_adjust_mode(self.ground_node, gui.ADJUST_STRETCH)
The scene should look like this:
But what actually happens is that the floor is rendered at the wrong scale, so it doesn’t sit where it should:
I have tried to detect whether or not a display is Retina using DefOS, but I’m not sure of the right way to do it. I tried using “scaling_factor”, which is 2 for one user who is having the issue. However, that alone would give false positives: on my Windows machine, a high-resolution monitor with a scaling factor of 2 displays the floor perfectly.
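For clarity, this is roughly how I’m reading the value (the DefOS calls and field names here are from memory, so treat them as an assumption rather than verified API):

local display = defos.get_displays()[defos.get_current_display_id()]
local scale = display.mode.scaling_factor -- reports 2 for the affected Mac user, but also 2 on my Windows machine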
So I have two questions:
1 - do you have any idea what causes Retina displays to behave like this?
I’m not very hopeful for an answer here, but any leads would be great.
2 - how would you go about detecting whether a screen is Retina or not?
The closest thing I can think of is combining sys.get_sys_info() to detect a Mac with a check for scaling_factor ~= 1. Is there a better way? (I ask because I could fall back to rendering the floor tiles individually; it’s a performance hit, but better than the game being unplayable.)
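In other words, something like the sketch below, assuming sys.get_sys_info() reports “Darwin” as the system_name on macOS and reusing the scaling factor read above:

local function is_probably_retina(scale)
    -- only treat a non-1 scaling factor as Retina when actually running on macOS
    return sys.get_sys_info().system_name == "Darwin" and scale ~= 1
end

If is_probably_retina(scale) is true I would switch to the individual-tile fallback, but this feels fragile, hence the question.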