How to save a huge table with subtables?

Hello everyone!

Well, I have a map where each map[x][y] holds all sorts of data.
Something like:
map[x][y].obj = hash: [/instance622]
or
map[x][y].event = { }
plus all sorts of simpler variables.

Now I’m trying to io.save all of that, but it seems I can’t save a table directly: table.concat simply ignores my map or reports “invalid value (table) at index 1 in table for ‘concat’”.

Is there any way to save my table as it is, or should I strip out all those subtables, factory references and other stuff?

Have you tried the sys.save() function? (It does have an upper size limit, though.)
Hashes are userdata values, and they cannot be stored directly using io.save.
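For reference, a minimal sketch of typical sys.save()/sys.load() usage in Defold (“mygame” and “map.dat” are placeholder names, not from this thread):

```lua
-- Build a platform-appropriate save path.
local path = sys.get_save_file("mygame", "map.dat")

-- Save: the table must contain only serialisable values
-- (no userdata such as hashes), and there is an upper size limit.
local ok = sys.save(path, { map = my_map })
if not ok then
    print("save failed (table too large or not serialisable?)")
end

-- Load: returns an empty table if the file does not exist yet.
local data = sys.load(path)
my_map = data.map
```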

Sure. It seems its size is a bit over the limit for sys.save. 8(
Trying to make it smaller.

If you can deal with the hashes, you can also use the cjson asset, which lets you encode a table into JSON.

Thanks! Is it possible to io.save my table as JSON? Will it make a string out of it?

Oh I see… it makes a decodable json string from it! Awesome!

What is the purpose of saving something like hash: [/instance622]? It looks like a reference to a factory created object. Is that of any use once you have loaded the table in the next play session?

Exactly… there’s no need for those references, since the loaded map will be reinitialized. Still, it might exceed the limit.

You might be helped by a (de)serialisation module I created a while back: GitHub - britzl/desert: (de)sert is a Lua table (de)serialiser for the Defold game engine

It doesn’t handle hashes, but it can serialise game objects (single instances or factory-created).

You could also write an iterator for your Lua table that removes any hashes, vectors etc., and then use json.encode() and io.write(). JSON encoding is available from either cjson or here: GitHub - rxi/json.lua: A lightweight JSON library for Lua
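A sketch of such a cleanup pass, assuming the rxi json.lua module is on the require path as “json” (the type() check drops hashes and vectors, since both are userdata):

```lua
local json = require("json")  -- rxi/json.lua; the cjson asset works similarly

-- Recursively copy a table, keeping only values JSON can represent.
-- Userdata (hash, vector3, ...) and functions are silently dropped.
local function strip(t)
    local out = {}
    for k, v in pairs(t) do
        local tv = type(v)
        if tv == "table" then
            out[k] = strip(v)
        elseif tv == "number" or tv == "string" or tv == "boolean" then
            out[k] = v
        end
    end
    return out
end

local path = "map.json"  -- placeholder path
local f = io.open(path, "w")
f:write(json.encode(strip(map)))
f:close()
```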

Thanks! Checking…

It seems cjson won’t decode its own encoded files, possibly because of negative array indices like map[-10][-10].

And there are no vectors or matrices, i.e. all the other data is quite simple. There are still subtables and hash references present, and those decode correctly (or as strings), so the issue is the 2D table structure with negative indices.

SOLVED! Thanks everyone! Someone might have similar questions later, so…

In my case (a huge table), the save/load procedure works best with sys.save() after some minor manipulation of the table: deep-cloning it and cutting out (= nil) all spare variables and object references. Keep it as small as possible, since sys.save() won’t work with files larger than 512 kB. Sticking with Defold tables seems like the only way to keep your data intact and avoid converting it to JSON or other formats that might damage the table structure.

Some minor advice:

  • keep the variable names in your table as short as possible, since every letter increases the save size
  • remove all factory references, since you’ll have to recreate them on load anyway, as Alex mentioned
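The deep-clone-and-trim step described above might look something like this (the “obj” field name comes from the example at the top of the thread; adapt the skip condition to your own data):

```lua
-- Deep-clone the map, dropping runtime-only fields before sys.save().
local function clone_for_save(t)
    local out = {}
    for k, v in pairs(t) do
        if k ~= "obj" then          -- "obj" holds a factory instance hash; recreated on load
            if type(v) == "table" then
                out[k] = clone_for_save(v)
            else
                out[k] = v
            end
        end
    end
    return out
end

-- "mygame" / "map.dat" are placeholder names.
sys.save(sys.get_save_file("mygame", "map.dat"), { map = clone_for_save(map) })
```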
I’m in a similar situation (saving data per grid square). In my case I want to transition from a working implementation using sys.save() to io.save() so that I can apply some kind of obfuscation or encryption to the serialised data, but hashes are causing a problem.

Some replies above talk about “solving hashes” or “removing hashes”, but that’s not easily done.

Honestly, it makes me regret using them so heavily. Maintaining a lookup table of hashes to strings (in reality, multiple lookup tables) is going to be pretty inconvenient, and I’d prefer the option of enabling such a thing natively in the engine. For some infrequently used systems, I’m tempted to replace hashes with strings to avoid complicating the save/load code.

Unless there’s another way?

so that I can apply some kind of obfuscation or encryption to the serialised data

As far as the data we save goes, it’s in a binary format.
But if that’s not enough obfuscation, you could read the file back using the io module, apply your encryption to it, and then save it out to disk again. And vice versa when reading it back.
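A sketch of that post-processing step. The XOR cipher here is only a stand-in for whatever real encryption you’d use, and it assumes the BitOp module (bit.bxor), which Defold’s LuaJIT builds ship:

```lua
-- Toy obfuscation: XOR every byte with a single-byte key.
-- Replace with a real cipher for anything beyond casual obfuscation.
local function xor_crypt(s, key)
    local out = {}
    for i = 1, #s do
        out[i] = string.char(bit.bxor(s:byte(i), key))
    end
    return table.concat(out)
end

local path = sys.get_save_file("mygame", "map.dat")  -- placeholder names

-- Read the file sys.save() produced, transform it, write it back.
local f = io.open(path, "rb")
local plain = f:read("*a")
f:close()

f = io.open(path, "wb")
f:write(xor_crypt(plain, 0x5A))
f:close()
-- Reading it back: apply the same xor_crypt before handing the bytes to the loader.
```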

We should add a function to go from a hash to something serialisable and back again. We already have hash_to_hex(), so we need a hex_to_hash().
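Until something like that exists natively, one workaround is to store hash_to_hex() strings and rebuild the hashes on load from a set of known names (the names below are illustrative, not from this thread):

```lua
-- Build a reverse lookup from hex digest to hash for a known set of names.
local known_names = { "/instance622", "grass", "water" }  -- illustrative
local hex_lut = {}
for _, name in ipairs(known_names) do
    local h = hash(name)
    hex_lut[hash_to_hex(h)] = h
end

-- On save: store hash_to_hex(some_hash) in the table.
-- On load: map the stored hex string back to a hash.
local function lookup_hash(hex)
    return hex_lut[hex]  -- nil if the name wasn't registered
end
```

This only works for hashes whose source strings your code knows about, which is exactly the inconvenience described above.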

That’s a good idea, and it would work. I worry a little about the 3x I/O cost and related overhead causing gameplay stutters when autosaving on lower-end devices, but I guess over time I could implement it as a native extension.

A hex_to_hash() function would be very neat. Is this on the roadmap? Maybe it can be done in Lua?
