When I first started working on save/load, the primary concern was code size, because players had to copy the code and then paste it back into the map later on. Another limitation was that a code could only ever be up to 120 chars long (the max number of chars a player can type into chat). Speed was only a secondary concern: it just needed to be fast enough not to cause an fps drop when the -save/-load commands were used.
With local saving (file i/o) now being a very viable option, the concern is no longer size but speed. Using the max compression approach, save/load is only fast enough to handle around 240 chars. With local saving, 20,000-100,000 char codes could easily be reached, so the old approach is not fast enough. The save/load process itself is also different. The encrypted data should not be synced, as it is bloated; only the final data, the stuff actually used to create units etc, should be synced. This means that decryption is done locally. Furthermore, a save should happen every time the character changes: getting an item, gaining xp, etc. Finally, a player's various characters should be linked together on one profile so that a player can't keep old save codes without losing all of their other chars.
saving steps-
1. serialize the data into a stream of characters (this used to be done with BigInt; now it should be done using the old approach).
2. generate a checksum, key, or crc value for that data (use a 1-way hash algorithm or something that gives roughly a 50-100 char key; generating a checksum over a 10,000 digit number would simply take way too long, and a hashtable would also have to be used to manipulate that amount of data).
3. encrypt the data (the scrambler can't be used, as changing the base would take way too long; use AES instead: http://www.cs.bc.edu/~straubin/cs381-05/blockciphers/rijndael_ingles2004.swf)
4. write the data (120 chars per line, 12 lines per file)
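To make the four saving steps concrete, here is a minimal Python sketch of the pipeline (illustration only; the real thing would be written in the map script). The save_character helper, the char_N.txt file names, and the fixed key/iv are assumptions, SHA-256 stands in for the 50-100 char one-way hash, and AES-CBC via the cryptography package stands in for whichever AES implementation ends up being used.

```python
import hashlib
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

LINE_LEN = 120        # 120 chars per line
LINES_PER_FILE = 12   # 12 lines per file

def save_character(stats: dict, key: bytes, iv: bytes) -> list[str]:
    # 1. serialize the data into a stream of characters
    stream = ",".join(f"{k}={v}" for k, v in sorted(stats.items()))

    # 2. generate a one-way hash (64 hex chars) instead of a slow big-number checksum
    digest = hashlib.sha256(stream.encode()).hexdigest()

    # 3. encrypt hash + data with AES (CBC mode here, purely as an example)
    padder = padding.PKCS7(128).padder()
    padded = padder.update((digest + "|" + stream).encode()) + padder.finalize()
    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    ciphertext = (enc.update(padded) + enc.finalize()).hex()

    # 4. write the data: 120 chars per line, 12 lines per file
    lines = [ciphertext[i:i + LINE_LEN] for i in range(0, len(ciphertext), LINE_LEN)]
    files = [lines[i:i + LINES_PER_FILE] for i in range(0, len(lines), LINES_PER_FILE)]
    for n, chunk in enumerate(files):
        with open(f"char_{n}.txt", "w") as f:
            f.write("\n".join(chunk))
    return lines
```

For example, save_character({"xp": 1200, "gold": 87}, key=b"k" * 32, iv=b"i" * 16) would write the encrypted lines to char_0.txt and return them.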
loading init steps-
1. read the data (local)
2. decrypt the data (local)
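A matching sketch for the two local init steps, reversing the saving sketch above. The load_character_local name and the paths argument are made up; the point is that the bloated ciphertext is read and decrypted on the local machine only.

```python
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def load_character_local(paths: list[str], key: bytes, iv: bytes) -> tuple[str, str]:
    # 1. read the data (local): rejoin the 120-char lines from each file
    ciphertext = ""
    for p in paths:
        with open(p) as f:
            ciphertext += f.read().replace("\n", "")

    # 2. decrypt the data (local): the bloated ciphertext never gets synced
    dec = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
    padded = dec.update(bytes.fromhex(ciphertext)) + dec.finalize()
    unpadder = padding.PKCS7(128).unpadder()
    plaintext = (unpadder.update(padded) + unpadder.finalize()).decode()

    # the hash was stored alongside the data for later validation
    digest, stream = plaintext.split("|", 1)
    return digest, stream
```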
upon char select, load-
1. validate the data (local)
2. interpret the data (local, break it up)
3. sync the data (compress into blocks of 32 bits)
4. implement the data (create hero, etc)
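And a sketch of the per-character load. Validation and interpretation reuse the hash and field format from the saving sketch; pack_32bit shows one way the final values could be squeezed into 32-bit blocks before syncing. The actual sync natives and the hero creation in step 4 are game-specific, so they are left out, and bits_per_value is an assumed per-field width.

```python
import hashlib

def validate(digest: str, stream: str) -> bool:
    # 1. validate: recompute the one-way hash and compare with the stored one
    return hashlib.sha256(stream.encode()).hexdigest() == digest

def interpret(stream: str) -> dict:
    # 2. interpret: break the character stream back up into individual fields
    return dict(pair.split("=", 1) for pair in stream.split(","))

def pack_32bit(values: list[int], bits_per_value: int) -> list[int]:
    # 3. sync: compress the final values into 32-bit blocks so that only this
    #    compact payload has to be sent to the other players
    blocks, current, used = [], 0, 0
    for v in values:
        current |= (v & ((1 << bits_per_value) - 1)) << used
        used += bits_per_value
        if used >= 32:
            blocks.append(current & 0xFFFFFFFF)
            current >>= 32
            used -= 32
    if used:
        blocks.append(current)
    return blocks
```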
Essentially, around 10 different resources need to be made, one for each of the steps above, and more are required once you consider the substeps.