I’ve pushed fixes to beta now, and they should be available in ~50-60 minutes.
Yes, it’s mostly about RAM. And, since we don’t support pages in that texture, the maximum texture dimensions supported by your graphics card as well.
What about a game.project parameter for maximum glyph size, which could then complain when exceeded, similar to spawning game objects in excess of the instance cap?
I think it’s even harder for a developer to anticipate the actual glyph size (they vary quite a lot).
Besides, that type of runtime warning is what the “Entire font glyph cache (1024 x 2048) is filled in a single frame … Consider increasing the cache for ‘path/to/resource.font’” warning is doing.
That’s not exactly what I suggested. The warning you are referring to relates to exceeding a given glyph cache size (whether it’s automatic or manual). What I am suggesting is a cap on the automatically calculated glyph cache size, and warnings when that is exceeded.
As it stands, though, that is precisely what I have to do because of the arbitrary cap at 1024x2048. I know that size is not enough, but I don’t know what is enough. It’s hard to know whether 2048x2048 is appropriate or whether 2048x4096 is required; I’ll probably only find out if end users start complaining about flickering. Since I’m targeting desktop platforms, I’m tempted to just stick 4096x4096 in there and call it a day.
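For reference, this is roughly what pinning the cache manually looks like in the .font resource (a sketch — the file paths are illustrative, and I’m assuming the cache_width/cache_height fields, where 0 means automatic):

```
font: "/assets/fonts/my_font.ttf"
material: "/builtins/fonts/font.material"
size: 32
cache_width: 4096
cache_height: 4096
```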
My current workflow:
I know that 1024x2048 is not enough, so I must either identify the peak amount of on-screen text in my game (technically difficult), or create a test scene that renders every unique character of each font, which is basically what my latest repro project does (time-consuming). Since the amount of text in a game grows as it develops, in practice I will either create that test scene and have it delete itself after init(), or save myself the effort and go for a large safe value (e.g. 4096x4096).
Potential workflow:
The automatic calculation is uncapped, but warnings are given for extreme values. I can specify that I am comfortable with glyph cache sizes up to 4096x4096, which means I can forget about it unless I ever do reach the cap at which point I probably need to save space by reducing the font size or something.
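A sketch of what that opt-in could look like in game.project (this setting is entirely hypothetical — it does not exist today; it is just the shape of what I’m suggesting):

```
[font]
; hypothetical: cap for the automatically calculated glyph cache
max_glyph_cache_width = 4096
max_glyph_cache_height = 4096
```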
I don’t know; I don’t want to badger you into a big change, since it seems to affect only a tiny fraction of your users. It’s just that creating a custom scene to test glyph cache sizes seems extremely hacky and weird (I’ll need to do *.io stuff to automatically populate the labels/text nodes with the unique characters in the font files).
Adding a warning to bob/editor is along the lines we’re thinking as well.
I know that size is not enough, but I don’t know what is enough
Yeah, I think we should show (in the editor) the number of glyphs that will fit into the cache.
And that’s an actual target you can use, since you need to decide up front how many unique glyphs you will show in any given frame.
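That per-frame budget can also be eyeballed with simple arithmetic; a rough sketch in Lua (the fixed cell size is my assumption — real glyph dimensions vary with font, size, outline and shadow settings, so treat this as a back-of-the-envelope estimate, not an exact figure):

```lua
-- Rough estimate of how many glyphs fit in a glyph cache,
-- assuming every glyph occupies a fixed cell_w x cell_h cell.
local function glyphs_that_fit(cache_w, cache_h, cell_w, cell_h)
    return math.floor(cache_w / cell_w) * math.floor(cache_h / cell_h)
end

-- e.g. the default 1024x2048 cache with ~40x48 px cells:
print(glyphs_that_fit(1024, 2048, 40, 48))  -- 1050
```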
The automatic calculation is uncapped, but warnings are given for extreme values
That would be inconsistent with how we usually deal with budgets.
We start low for a reason, namely to make devs think about budgets and also to help them ship the game.
I’m not sure I understand how to opt in to SIMD support in WASM. Is it done by manually editing the app manifest? Is it supposed to look something like this?
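My guess — and everything here is an assumption on my part: the platform key, the field names, and that the web context accepts Emscripten compile/link flags at all (Emscripten’s own SIMD switch is -msimd128):

```yaml
# entirely a guess at the app manifest shape; only -msimd128 is
# a real Emscripten flag, the keys around it are assumptions
platforms:
    wasm-web:
        context:
            flags: ["-msimd128"]
            linkFlags: ["-msimd128"]
```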
Could you please write more information about AABBs?
AABB information is, afaik, in the model’s local space, correct?
I’m trying to use the new API in the Basic 3D example, but I get the same data for all models (cube and sphere), and regardless of rotation (there is no scaling on the model, only on the parent game object, so I expected the game object’s scaling not to affect the model’s AABB).
DEBUG:SCRIPT:
{ --[[0x777aa0041e00]]
min = vmath.vector3(-0.5, -0.5, -0.5),
max = vmath.vector3(0.5, 0.5, 0.5)
}
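The dump above comes from something like this (a sketch — the component URL is mine, and I’m assuming the API returns a table with min/max vector3 as shown):

```lua
function init(self)
    -- Query the model's local-space AABB and dump it
    -- ("/cube#model" is an illustrative address from the Basic 3D example)
    local aabb = model.get_aabb("/cube#model")
    pprint(aabb)
end
```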
I also added a simple function to draw the lines joining corners of aabb:
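A minimal sketch of that function (the names are mine): it draws the 12 edges of an AABB with the built-in "draw_line" render message. Note the corners are in the model’s local space, so for a world-space overlay you’d first transform them by the instance’s world transform. Since draw_line only lives for one frame, call this from update().

```lua
-- Draw the 12 edges of an AABB ({ min = vector3, max = vector3 }).
local function draw_aabb(aabb, color)
    local mn, mx = aabb.min, aabb.max
    -- the 8 corners: 1-4 on the near (min z) face, 5-8 on the far face
    local c = {
        vmath.vector3(mn.x, mn.y, mn.z), vmath.vector3(mx.x, mn.y, mn.z),
        vmath.vector3(mx.x, mx.y, mn.z), vmath.vector3(mn.x, mx.y, mn.z),
        vmath.vector3(mn.x, mn.y, mx.z), vmath.vector3(mx.x, mn.y, mx.z),
        vmath.vector3(mx.x, mx.y, mx.z), vmath.vector3(mn.x, mx.y, mx.z),
    }
    local edges = {
        {1,2},{2,3},{3,4},{4,1},  -- near face
        {5,6},{6,7},{7,8},{8,5},  -- far face
        {1,5},{2,6},{3,7},{4,8},  -- connecting edges
    }
    for _, e in ipairs(edges) do
        msg.post("@render:", "draw_line", {
            start_point = c[e[1]], end_point = c[e[2]], color = color
        })
    end
end
```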
I only got a different result when testing the quad (as its height is 0) and, of course, a 2x2 quad (the built-in quad_2x2.dae). Below is the AABB visualisation for quad_2x2, but at the cube’s position: