11/19/17, 03:39 PM   #1
Solinur
AddOn Author
Join Date: Aug 2014
Posts: 78
[implemented] Issues when saving/opening a lot of data to/from saved variables

Hi,

I'm wondering what the exact limits for saving data into saved variables are.

I played around a bit with a simple setup where I directly accessed the global variable defined in the manifest file.
I managed to create a 2.7 MB saved variable that gets corrupted upon loading simply by doing the following:

Lua Code:
CombatMetricsFightData_Save = {}

for i = 1, entries do
    CombatMetricsFightData_Save[i] = true
end

If entries is bigger than 131072, the first and last entries become weird:

Lua Code:
[true] = true,
...
["CombatMetricsFightData_Save"] = true,

It uses its own handle and value as table keys. This usually happens on load; when I check the file before loading, I can see in a text editor that everything is still fine.
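
You can also load the file with a standalone Lua interpreter to inspect it outside the game. A sketch (the path is an example; SavedVariables files are plain Lua assignments, so dofile works on them):

Lua Code:
-- Sketch: inspect a SavedVariables file with standalone Lua.
-- The path is an example; adjust it to your own live folder.
dofile("Documents/Elder Scrolls Online/live/SavedVariables/CombatMetrics.lua")
print(#CombatMetricsFightData_Save)  -- number of array entries loaded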

If I save longer values, e.g.

Lua Code:
CombatMetricsFightData_Save[i] = tostring(math.random())

then the maximum number of entries decreases to ~99.5k.

Lua Code:
CombatMetricsFightData_Save[i] = {i, "ABC"..i, math.random()}

can take about 52k entries.

Then again,

Lua Code:
CombatMetricsFightData_Save[i] = {1,2,3,4,5,6,7,8,9,10}
or a fixed string with 999 chars allows for 131k values again (in the latter case the file is 133 MB, so it's not the file size...).

I would like to somehow make sure that I don't create corrupted saved variables (but still be able to save a lot of data).

It somehow feels like a bug to me, but it might be something that cannot be changed.

So if the restriction cannot be lifted, I would at least like to have a way to tell whether my file will be fine or not.
 
11/19/17, 06:08 PM   #2
Rhyono
AddOn Author
Join Date: Sep 2016
Posts: 659
Through personal experience and that of other devs: even 10k table entries can have unexpected results.
 
11/19/17, 06:58 PM   #3
Uesp
AddOn Author
Join Date: Mar 2014
Posts: 15
I've been testing this too, and while I've found results similar to yours, I'm now more confused about exactly what the limit is and how it changes depending on the content. I've made an empty addon that just populates the saved variable file with a structure like:


Code:
TestNoneLogSavedVars =
{
    ["Default"] = 
    {
        ["@Reorx"] = 
        {
            ["$AccountWide"] = 
            {
                ["test1"] = 
                {
                },
                ["version"] = 3,
            },
        },
    },
}
  • The file is saved fine but it breaks on load.
  • File corruption can result in "random" data being added, missing data, sibling data being added one level up, or complete removal of the data and a reset of the saved variable file.
  • If I save "true" values the file breaks at 131066 entries. This is suspiciously close to 2^17 = 131072.
  • If all inserted data is identical, it doesn't seem to matter how long it is. The break size of an array of trues is the same as one of 100-byte strings.
  • The limit only seems to affect the number of entries in a single variable. For example, 2/3/4 different arrays with 131000 entries each save and load fine.
  • The depth of the field seems to have some effect on the maximum length of the array. For example, adding one level to the saved variable data like:
    Code:
                ["$AccountWide"] = 
                {
                    ["test1"] = 
                    {
                       ["test2"] = 
                       {
                           -- Lots of entries
                       },
                    },
                },
    changes the break size from 131065 to 131064.
  • Variation in the data seems to affect the break size. If we have:
    Code:
    savedVars[i] = i + 101713    -- arbitrary constant here
    then it breaks at 65533 (close to 2^16). If we have:
    Code:
    savedVars[i] = (i % 10) + 101713
    then it breaks at 131066, one more than the constant case.
  • Files that break when loaded into ESO seem to load fine with command-line Lua 5.2. I believe ESO uses 5.1, but I didn't see any bug or changelog entry that might affect this.

So, more data, but I'm not sure I'm any closer to understanding the exact issue or exactly when it occurs. In our uespLog addon I've limited the logging data array to 60k elements to prevent issues, although I do run into random corruption during various data mining operations involving large amounts of data.

It sort of looks like someone is using a fixed buffer with a size around 131072 elements which is somehow overflowed during file loading. I assume the ESO code somewhere just uses a dofile()/loadfile()/dostring() Lua API call, and if so the issue would be in the Lua library somewhere. Or perhaps the ESO code uses a custom file loader which has a bug in it.

It would be nice if a ZOS dev (Chip?) could step through the saved variable loading with a known bad file to see where the issue is, confirm it, and hopefully fix it. For the record, all you need to do is create an addon with saved variables that outputs a single large array like:

Code:
function testNone.SetVar1(count)

	if (count == nil) then count = 131072 end
	
	testNone.savedVars.test1 = {}
		
	for i = 1, count do
		testNone.savedVars.test1[i] = true
	end
end
 
11/20/17, 06:58 AM   #4
Solinur
AddOn Author
Join Date: Aug 2014
Posts: 78
I might have found a clue.

Shinni pointed out that Lua stores each string only once, preventing a duplicate string from being stored again. Maybe it's the same with values. This would make it very memory efficient and might explain how certain limits come about. It would probably mean that there is a fixed number of different keys + values (my guess: 131072) that a table can have, be they strings or numbers.

For example:

Lua Code:
function CMX.MakeData(N, N2)
    CombatMetricsFightData_Save = {}

    for i = 1, N do
        CombatMetricsFightData_Save[i] = {}
        local logdata = CombatMetricsFightData_Save[i]

        for k = 1, N2 do
            logdata[k] = i.."|"..k
        end
    end
end

Calling it with N = 131 and N2 = 1000 leads to a corrupted file, since the numbers 1 to 1000 are used as keys and 131000 strings of the form "i|k" are created. This means we have 132000 different constants in use, causing the load to fail upon recreation from the file and to start addressing them from the beginning again.

Calling said function with N = 130 and N2 = 1000 seems to be fine (that would be 131000 different keys + values).

Edit: Just saw that Uesp came to the same conclusion at the end. I didn't read it in detail, as I already had an idea in mind that I wanted to note down.
 
11/20/17, 10:15 AM   #5
ZOS_ChipHilseberg
ZOS Staff!
Join Date: Oct 2014
Posts: 551
There is a limit on the number of constants (unique strings in this case) which comes from using 32-bit bytecodes. We could switch to 64-bit, but there is a memory and performance cost to doing that (for all addons and our UI).
 
11/20/17, 10:40 AM   #6
Ayantir
 
AddOn Author
Join Date: Jul 2014
Posts: 1,019
Before changing this, I would first ask why. ESO is now 3 years old and we've managed to live with this, so maybe we could try together to work around that limit by doing things differently.

If it's for the bull**** of https://forums.elderscrollsonline.co...le-size-limits

I would simply add: LEARN TO CODE.

If you have an example of an addon storing more than 130k keys, please show us your addon, how you save data, etc.

And I can already say that saving an entire Combat Metrics log for each combat event in saved vars is not a good idea. There is simply too much data.

I had pChat tables of 90k lines and lorebook tables of 95k datamined entries, and I managed to make it work. Not alone, not without pain, not quickly, but it has been done.
 
11/20/17, 11:02 AM   #7
sirinsidiator
 
AddOn Author
Join Date: Apr 2014
Posts: 1,566
@Chip How much of a performance impact are we talking about? Would it be noticeable?

@Ayantir Even if it is an old topic, I have to agree that corrupting save data on load is a bug and should be fixed.

Maybe the game should stop writing data before it reaches the limit, or skip saving a new version of a file if it has too many strings? Losing the data of one session is IMO still better than losing all data. The ZO_SavedVariables class could also offer some way for addons to determine how much "space" is still left, to give them a way to decide whether they should remove old entries in whatever way they need.
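
For illustration, the suggestion could look something like this (entirely hypothetical; GetConstantUsage is not an existing ZO_SavedVars method, and the variable and field names are made up):

Lua Code:
-- Entirely hypothetical sketch of the suggested API; GetConstantUsage is
-- not an existing ZO_SavedVars method, and the field names are made up.
local sv = ZO_SavedVars:NewAccountWide("MyAddonSavedVars", 1)

local used, limit = sv:GetConstantUsage()  -- hypothetical
if used / limit > 0.9 then
    table.remove(sv.log, 1)  -- prune the oldest entry before the next save
end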
 
11/20/17, 11:06 AM   #8
Rhyono
AddOn Author
Join Date: Sep 2016
Posts: 659
I can understand saving massive quantities of data to the saved vars while doing dev stuff, but for actual users to be doing so? Loading all of that data/writing to it would be a significant performance drain. Every time I go on the PTS, it's amazing how fast this game loads when you don't have addons. I don't even have that many addons, but I do have a few big ones. More addons doing things with even larger data sets just seems like a bad idea, especially when the layman won't know (beforehand) what a massive hit they are going to take when using such an addon.
 
11/20/17, 11:58 AM   #9
Uesp
AddOn Author
Join Date: Mar 2014
Posts: 15
Originally Posted by ZOS_ChipHilseberg
There is a limit on the number of constants (unique strings in this case) which comes from using 32-bit bytecodes. We could switch to 64-bit, but there is a memory and performance cost to doing that (for all addons and our UI).
Could you explain this a little more? I don't see how this would explain an array of 131072 "true" values becoming corrupted on load, or why it doesn't get corrupted on save or while in memory, but only when loading the file. I also can't replicate it using the raw Lua C API, either 5.1 or 5.2; they both seem to load files fine that get corrupted in ESO.

Is there any way that corruption and loss of data can be prevented when loading files with "too many" entries? I'd rather that entries past the "too many" point just be ignored than have them corrupt the entire file.

As for Ayantir's question on why: I think that as long as it is not a huge amount of time/effort to at least prevent data corruption, it is worth it, even if only a fraction of users ever encounter it. It is also worth pointing out that *now* it only affects some number of users, but depending on the nature of the bug it may well start affecting more and more users at some point. I run into data corruption all the time, as I deal with very large amounts of logged data from the game, but my case is definitely unique. I also don't believe it is as simple as limiting arrays to <50k elements, as even with smaller arrays I've encountered data corruption.
 
11/20/17, 12:18 PM   #10
Uesp
AddOn Author
Join Date: Mar 2014
Posts: 15
Couple of references I've found in case anyone else happens to be interested in the technical details of this issue:

From the first link, the limit seems to be 2^18 literal constants per source function (i.e., a single saved variable file). So it's not nearly as simple as just limiting array sizes to some arbitrary value, and this would explain why I've run into data corruption issues with relatively small arrays of deeply nested data.

Edit: This also explains why the issue is at load time and not at save or run time. At run time a table can have as many elements as memory allows. Saving is fine, as you are just serializing the data to a string format. At load time, however, you are converting the string table format to VM byte-code, which is where the 2^18 issue lies.
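
As background, and assuming ESO's Lua matches the stock 5.1 sources (my reading of lopcodes.h, not confirmed for ESO's build): each VM instruction is a single 32-bit word, and LOADK addresses the constant table through the instruction's 18-bit Bx field, which is where the 2^18 figure comes from. A small illustration of that layout:

Code:
-- Unpacking the three fields of a 32-bit Lua 5.1 instruction word
-- (per lopcodes.h: opcode 6 bits, A 8 bits, Bx 18 bits). The 18-bit
-- Bx field is what indexes the constant table in LOADK.
local function decodeABx(word)
    local opcode = word % 2^6
    local a = math.floor(word / 2^6) % 2^8
    local bx = math.floor(word / 2^14)  -- the remaining 18 bits
    return opcode, a, bx
end

print(decodeABx(2^32 - 1))  --> 63  255  262143 (max Bx = 2^18 - 1)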

I'm surprised this sort of thing doesn't result in a Lua runtime error, as the library is pretty good about detecting these.

Last edited by Uesp : 11/20/17 at 12:43 PM. Reason: addt
 
11/20/17, 01:29 PM   #11
Uesp
AddOn Author
Join Date: Mar 2014
Posts: 15
So with some more digging I have some answers, but also more questions. I'm not entirely sure the issue is due to the overflow of the constant table, but it may be related to it somehow.

The following assumes a minimal saved variable file with N entries of true, as used previously.
  • ESO (64-bit, if it matters) repeatably corrupts data after 131065 entries.
  • Command-line Lua 5.1 and 5.2 have no problem loading files larger than 131065 entries.
  • Using luac on the file with 131065 entries shows that the constant table is only 131072 in size.
  • Increasing the size up to 262136 works fine with command-line luac. Past that it aborts with a "constant table overflow" error message. This works out to the expected 2^18 limit (or exactly 2^18 - 1).
  • ESO crashes to the desktop when trying to load a file with 262150 entries (it seems to save it fine).

So there are two probably related issues here:
  1. A crash due to exceeding 2^18 constants.
  2. Data corruption starting around 2^17 constants (maybe; still not exactly sure of the trigger).

Hopefully the crash can be avoided by adding some error checking; the cause of the data corruption still needs to be confirmed.
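
For anyone who wants to reproduce the compile-time limit outside the game, here is a minimal sketch (assuming stock command-line Lua 5.1 or 5.2, not ESO): it builds a chunk whose table constructor holds n distinct number constants and asks Lua to compile it.

Code:
-- Minimal repro sketch for stock Lua 5.1/5.2 (not ESO): build a chunk with
-- n distinct number constants and try to compile it. Past the bytecode's
-- constant-index range this fails with "constant table overflow".
local function buildChunk(n)
    local parts = {"return {"}
    for i = 1, n do
        parts[#parts + 1] = i .. ","
    end
    parts[#parts + 1] = "}"
    return table.concat(parts)
end

local chunk, err = (loadstring or load)(buildChunk(300000))
print(chunk and "compiled fine" or ("compile failed: " .. err))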
 
11/20/17, 07:21 PM   #12
Solinur
AddOn Author
Join Date: Aug 2014
Posts: 78
Originally Posted by Uesp
ESO (64-bit, if it matters) repeatably corrupts data after 131065 entries.
If you use ZO_SavedVariables, a few constants are used up for keys ("Default", "@Reorx", ... in your example). Then you have true as a value; the rest goes into the numbered keys of your table (1 to 131065). That way you fill the number of constants up to 2^17.

Also, thanks for all the replies!

If there is a noticeable performance cost I wouldn't want the 64-bit bytecodes system either.

For me it is enough to know how to prevent data loss in the future, since it's not hard to go through the table and find out whether the limit is reached.

Also, knowing how the problem occurs tells me how I can improve my data format in a way that minimizes usage of disk space but also limits the number of unique values and keys. I'll probably discuss a few ideas in the chat over the next few days.

On the question of why I want to store this much data: having a full combat log is useful for theorycrafting or for finding bugs in the combat system. Usually I cannot stop during a raid, so looking and filtering through the log at a later point gives me valuable insights into what is happening. Having an option to analyze it after the raid is important for me. Of course the general user doesn't need that; that's why by default the log won't be saved, and saving it requires Shift+clicking the save button.

I'll probably work on a way to save only selected entries (e.g. only damage events) to improve this situation. Having played around, I also got a good feel for the increase in loading time when a 130 MB file is loaded. Because of this I'll also add a limiter that interacts with the user once the saved variables reach a certain size.
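
Such a limiter could look roughly like this (a sketch with invented names, not the actual Combat Metrics code):

Lua Code:
-- Rough sketch only; MAX_LOG_ENTRIES and AppendLogLine are invented names,
-- not the actual Combat Metrics implementation.
local MAX_LOG_ENTRIES = 50000  -- conservative cap, well below the limits above

local function AppendLogLine(line)
    local log = CombatMetricsFightData_Save
    if #log >= MAX_LOG_ENTRIES then
        d("CombatMetrics: saved log is full, discarding the oldest entry")
        table.remove(log, 1)  -- O(n), but only paid once the cap is hit
    end
    log[#log + 1] = line
end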
 
11/20/17, 07:54 PM   #13
Uesp
AddOn Author
Join Date: Mar 2014
Posts: 15
Originally Posted by decay2
If you use ZO_SavedVariables, a few constants are used up for keys ("Default", "@Reorx", ... in your example). Then you have true as a value; the rest goes into the numbered keys of your table (1 to 131065). That way you fill the number of constants up to 2^17.
Yes... so as I understand it now, there is a limit of 2^17 (131072) unique constants (numbers, strings, true/false/nil) that a saved variable file can have in ESO before you run into corruption issues. Note that this includes constants in the entire file, not just in one array or nesting level, which makes it a little more complicated to determine.

It's also non-obvious from just looking at file sizes. For example, I have a 140 MB file that only uses 50k constants, while a 12 MB file uses 120k constants. What matters is how many unique constants you have, not the actual file size.

It should be possible to count the number of constants in a saved variable from within the Lua API in ESO. Just find the root of the saved variable, iterate through it, save all strings/numbers/true/false values into a table as keys, then count the number of keys in that table. At the very least it would let you know if you are getting close to the problem size.
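
A sketch of that idea (countConstants is a made-up name; the root table is the one from earlier in the thread):

Code:
-- Sketch of the counting idea above; countConstants is a made-up name.
-- Collects every key and non-table value (string/number/boolean) reachable
-- from the root into one set, since each unique one costs a constant slot.
local function countConstants(tbl, seen)
    seen = seen or {}
    for key, value in pairs(tbl) do
        seen[key] = true
        if type(value) == "table" then
            countConstants(value, seen)
        else
            seen[value] = true
        end
    end
    return seen
end

local count = 0
for _ in pairs(countConstants(CombatMetricsFightData_Save)) do
    count = count + 1
end
d("unique constants: " .. count)  -- d() prints to the ESO chat window

Run against the N = 131, N2 = 1000 example from post #4, this should report roughly the 132000 constants computed there.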

I also have no idea why the problem occurs at 2^17 instead of 2^18 like it should according to the technical details. Offhand I'd guess a signed/unsigned issue, but I don't see any such issue in the code itself, and the Lua API code works fine up to 2^18, indicating some issue relating to ESO or the version of Lua it uses.
 
11/21/17, 05:12 PM   #14
ZOS_ChipHilseberg
ZOS Staff!
Join Date: Oct 2014
Posts: 551
We'd have to do profiling to see what the impact is. We haven't explored it yet.
 
01/17/18, 10:38 PM   #15
Slick_007
Join Date: Nov 2017
Posts: 1
So with the upcoming changes, namely no longer supporting the 32-bit client, will that have any impact on this situation?
 
04/02/18, 09:34 AM   #16
ZOS_ChipHilseberg
ZOS Staff!
Join Date: Oct 2014
Posts: 551
With Summerset we've changed to using 64-bit byte codes, which should fix this.
 
03/14/19, 05:45 PM   #17
Solinur
AddOn Author
Join Date: Aug 2014
Posts: 78
OK, this is probably overdue, but this is now implemented, and the game can easily load more than 10 million constants (at which point loading times become very noticeable, so don't do it). This is "fixed", or rather much improved. This thread can be closed.
 
