Objective-C – Using NSCoding and updating your app to the App Store

I'm using NSCoding to encode my objects and save them to disk as a "caching" feature, so the app doesn't have to download data every time it starts. Right now I'm saving this data in the app's Documents folder, which I have read is not deleted when the app is updated.
My concern is this: I make some update to my class, like adding an instance variable, and then upload the new build to the App Store. When the user updates to the new version, the old objects saved in the Documents folder were encoded without the instance variable I added. So when my app tries to decode the saved objects during startup, will it fail because the "old" objects in the Documents folder were not encoded with this new variable?
How would I deal with this problem? Make sure I write my classes "right" from the start? I'm sure I will eventually need to modify one of my classes and then break the old saved objects on disk.

Change either the filename or encoding key of the objects when you decide on a new version.
For example, if you are now saving your objects in a collection to 'myObjectsFile', use the filename 'myObjectsFile2' for the new version. When your application launches, check for 'myObjectsFile'; if it is there, load your old objects, migrate them to the new object version, save the migrated objects to 'myObjectsFile2', and delete 'myObjectsFile'.
On the next launch you are all set, since 'myObjectsFile' has been deleted.
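The launch-time migration described above might be sketched like this (the file names and the -migrateOldObjects: helper are hypothetical, and you would adapt the root object type to your own model):

```objectivec
// On launch: migrate the old archive if it exists, then load the new one.
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                     NSUserDomainMask, YES);
NSString *docs = [paths objectAtIndex:0];
NSString *oldPath = [docs stringByAppendingPathComponent:@"myObjectsFile"];
NSString *newPath = [docs stringByAppendingPathComponent:@"myObjectsFile2"];

NSFileManager *fm = [NSFileManager defaultManager];
if ([fm fileExistsAtPath:oldPath]) {
    NSArray *oldObjects = [NSKeyedUnarchiver unarchiveObjectWithFile:oldPath];
    // -migrateOldObjects: is a hypothetical helper that fills in the
    // new instance variables with sensible defaults.
    NSArray *migrated = [self migrateOldObjects:oldObjects];
    [NSKeyedArchiver archiveRootObject:migrated toFile:newPath];
    [fm removeItemAtPath:oldPath error:NULL];
}
NSArray *objects = [NSKeyedUnarchiver unarchiveObjectWithFile:newPath];
```

After the first launch of the new version, only 'myObjectsFile2' exists, so the migration branch is skipped.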

You can version your objects by having a version property that you guarantee will always be there.
After loading an object from disk, don't do anything except check the version property. (You could also check for the existence of the version property first.) If your current code base does not support the version of your object, simply discard it.
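In NSCoding terms, that version check could look like the following sketch (assuming ARC; the key names, the "title" property, and the version constant are illustrative):

```objectivec
static const NSInteger kCurrentVersion = 2; // illustrative

- (void)encodeWithCoder:(NSCoder *)coder {
    [coder encodeInteger:kCurrentVersion forKey:@"version"];
    [coder encodeObject:self.title forKey:@"title"];
}

- (id)initWithCoder:(NSCoder *)coder {
    if ((self = [super init])) {
        // If the key is missing, this is a pre-versioning archive.
        NSInteger version = [coder containsValueForKey:@"version"]
                          ? [coder decodeIntegerForKey:@"version"] : 0;
        if (version < kCurrentVersion) {
            // Unsupported version: discard the object by failing init.
            return nil;
        }
        _title = [coder decodeObjectForKey:@"title"];
    }
    return self;
}
```

Returning nil from initWithCoder: lets the caller treat an outdated archive exactly like missing data.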

Related

Multiple instances of application using Lucene.Net

I'm developing a WPF application that uses Lucene.Net to index data from files being generated by a third-party process. It's low volume with new files being created no more than once a minute.
My application uses a singleton IndexWriter instance that is created at startup. Similarly an IndexSearcher is also created at startup, but is recreated whenever an IndexWriter.Commit() occurs, ensuring that the newly added documents will appear in search results.
Anyway, some users need to run two instances of the application, but the problem is that newly added documents don't show up when searching within the second instance. I guess it's because the first instance is doing the commits, and there needs to be a way to tell the second instance to recreate its IndexSearcher.
One way would be to signal this using a file create/update in conjunction with a FileSystemWatcher, but first wondered if there was anything in Lucene.Net that I could utilise?
The only thing I can think of that might be helpful for you is IndexReader.Reopen(). This will refresh the IndexReader, but only if the index has changed since the reader was originally opened. It should cause minimal disk access in the case where the index hasn't been updated, and in the case where it has, it tries to only load segments that were changed or added.
One thing to note about the API: Reopen returns an IndexReader. In the case where the index hasn't changed, it returns the same instance; otherwise it returns a new one. The original index reader is not disposed, so you'll need to do it manually:
IndexReader reader = /* ... */;
IndexReader newReader = reader.Reopen();
if (newReader != reader)
{
    // Reopen returned a new instance; dispose the old reader.
    reader.Dispose();
}
reader = newReader;
I can't find the .NET docs right now, but here are the Java docs for Lucene 3.0.3 that explain the API.
If both instances have their own IndexWriter opened on the same directory, you're in for a world of pain and intermittent bad behaviour.
An IndexWriter expects and requires exclusive control of the index directory; this is the reason for the lock file.
If the second instance can detect that there is an existing instance, then you might be able to just open an IndexReader/IndexSearcher on the folder and reopen it when the directory changes.
But then what happens if the first instance closes? The index will no longer be updated, so the second instance would need to reinitialise, this time with an IndexWriter of its own. Perhaps it could do this when the lock file is removed as the first instance closes.
The "better" approach would be to spin up a "service" (just a background process, maybe in the system tray). All instances of the app would then query this service. If the app is started and the service is not detected then spin it up.

Archiving object to a read-only file

I have my app saving some objects into .sav files using NSKeyedArchiver's archiveRootObject:toFile:; however, I realized that if a user were to open one of the .sav files in TextEdit and change it at all, the app would fail to unarchive the objects the next time it opens, and would stop working.
Is there any way I can archive the root objects to read-only file or otherwise stop users from editing them? They're buried in application support, so not super accessible, but I'd like to play it safe.
Your application should be able to handle that kind of error.
Also, suppose you did archive the data and then set the file to be read-only. What would stop a determined user from making it read-write again?
You could use some kind of checksum to verify file integrity, but you would probably have to roll your own in that case.
I don't think there is a way for you to avoid potentially losing the saved state (in the end the user could simply delete the file), however if you are worried about the user manipulating the data, you should look at NSSecureCoding.
I believe it is a way to avoid unarchiving "corrupt" data and to guarantee integrity. I have not explored the topic further, so I can't say for sure whether it covers the scenario in which the contents of specific fields are changed (i.e. the object type is the same, but the value is different).
In conclusion, I think it is better/safer to build your system with the idea of someone trying to circumvent your security in mind. Instead of trying to stop the user from manipulating/deleting the data, just make sure invalid data is never loaded: in case of invalid/corrupt/missing data, revert to the default values (i.e. as if the app were launching for the first time).
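A defensive load along those lines might look like this sketch (the GameState class and savePath variable are made up; NSKeyedUnarchiver can raise an exception when handed a tampered or corrupt archive):

```objectivec
GameState *state = nil;
@try {
    // Can raise (e.g. NSInvalidArgumentException) on a corrupt archive.
    state = [NSKeyedUnarchiver unarchiveObjectWithFile:savePath];
}
@catch (NSException *exception) {
    state = nil; // Treat tampered/corrupt data as missing.
}
if (state == nil) {
    // Fall back to first-launch defaults.
    state = [[GameState alloc] init];
}
```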

iOS Core Data: When is data recreatable?

My iOS application has been in review but was rejected regarding the iOS Data Storage Guidelines: specifically, because my Core Data database (SQLite) was located in the Documents folder. I was aware that this folder should only be used if the data cannot be recreated by my application. The reason I chose to put it there anyway is that one of the entities in my database contains an attribute telling whether a given news item has been read, and this information cannot be recreated. Is this not enough to justify putting the database in the Documents folder?
Another thing: the current version of my application does not yet use this value to show whether a news item has been read. So should I tell the review team about this attribute and argue why I think the database belongs in the Documents folder, or should I just move it to /Library/Caches/?
The app review team wants you to split your data apart. Store the re-creatable parts in the Cache folder and the stuff that can't be re-created in the Documents folder. It's okay if there's a little bit of stuff in Documents that could theoretically be re-created—nobody will even notice a title or datestamp—but long text documents, video, audio, or images should be kept in the Cache folder if they can be downloaded again later.
There are a couple different ways you could do this:
Store the downloaded content in the Cache folder and only put the content's filename in your Core Data database (or calculate the filename from something else, like the SHA-1 hash of the URL it was downloaded from). Make sure your code will re-download any content that's not in the cache.
Use two Core Data stores with a single store coordinator. Note that you can't split an entity's attributes across two stores, so you may have to break some of your entities in half. Nor can you create a relationship from an object in one store to an object in another, so you'll have to store the object ID URI instead. See "Cross-Store Relationships" in the "Relationships and Fetched Properties" section of the Core Data Programming Guide for more details.
Whatever you do, keep in mind that iOS may purge your Cache folder at any time. Be prepared for files in your Cache folder to have disappeared; if that happens, you should re-download the files as the user requests them.
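The filename-from-URL idea in the first option can be sketched with CommonCrypto (the -cachedPathForURL: method name is made up):

```objectivec
#import <CommonCrypto/CommonDigest.h>

// Derive a stable Caches-folder filename from the download URL,
// using the hex SHA-1 of the URL string as the file name.
- (NSString *)cachedPathForURL:(NSURL *)url {
    const char *bytes = [[url absoluteString] UTF8String];
    unsigned char digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1(bytes, (CC_LONG)strlen(bytes), digest);

    NSMutableString *name = [NSMutableString string];
    for (int i = 0; i < CC_SHA1_DIGEST_LENGTH; i++) {
        [name appendFormat:@"%02x", digest[i]];
    }
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                         NSUserDomainMask, YES);
    return [[paths objectAtIndex:0] stringByAppendingPathComponent:name];
}
```

If the file at the returned path is missing (because iOS purged the Caches folder), re-download the content and write it back to the same path.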

Can I have multiple Core Data handlers for one iPhone app?

I want to build an app with two Core Data stores.
The app will be a sports game with pre-filled information; let's call this store prefilledDB for reference. prefilledDB will be read-only: I do not want the user to add/edit/delete or change anything in it.
The second store ("gameDB") would have the same Core Data relationships/models/entities and structure.
When a user selects "New game", the app will empty gameDB and fill it with the contents of prefilledDB; "Continue game" would just keep using gameDB, assuming it is not empty.
However, I'm not sure this is the right way to do it. My question, therefore, is what is the best way to handle this kind of processing? Would a built-in migration system be better than dropping/recreating databases, or perhaps just using SQLite as the prefilledDB and then filling the gameDB with its contents?
Any help on this would be great.
The prefilled persistent store will have to be read-only if it ships in the app bundle, as everything in the app bundle is read-only. To make the data read-write, you will need to copy it to a persistent store outside the app bundle, e.g. in the app's Documents directory.
You have two ways of doing this:
1) Simplest: Create a new persistent store for each game. In this case you would just copy the prefilled persistent store file from the app bundle to the Documents directory, renaming it to the current game's name in the process. Then you would open that file as the gameDB, and it would be automatically populated with the preexisting data. The downside of this approach is that you end up with a large number of files, one per game, and the data cannot easily be shared between them.
2) Best: Use two persistent stores simultaneously in the same context. This is more complicated to set up but gives you greater flexibility. The trick is to create a single abstract entity, e.g. Opponent, and then two concrete subentities that are identical save for their names, e.g. PreOpponent and GameOpponent. Using the configurations option in the data model, assign PreOpponent to the prefilled persistent store and GameOpponent to the gameDB persistent store. Write some code in the Opponent class to clone itself into a new GameOpponent instance. When a new game starts, clone all the instances from prefilled to gameDB. All the newly cloned GameOpponent instances will automatically be written to the gameDB persistent store.
The advantage of the latter system is that you have all your active data in one readwrite persistent store which you can then manipulate as needed. E.g. you could check previous game or even clone a previous game to create a novel starting point for a new game.
Marcus Zarra has some code on his site showing how to set up automatic cloning for managed objects.
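The first option, copying the shipped store out of the bundle on "New game", might look like this (the prefilled.sqlite resource and CurrentGame.sqlite name are placeholders):

```objectivec
// Copy the read-only store from the bundle into Documents as the game DB.
NSString *bundleStore = [[NSBundle mainBundle] pathForResource:@"prefilled"
                                                        ofType:@"sqlite"];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                     NSUserDomainMask, YES);
NSString *gameStore = [[paths objectAtIndex:0]
                       stringByAppendingPathComponent:@"CurrentGame.sqlite"];

NSFileManager *fm = [NSFileManager defaultManager];
NSError *error = nil;
// "New game": replace any existing game DB with a fresh copy.
[fm removeItemAtPath:gameStore error:NULL];
if (![fm copyItemAtPath:bundleStore toPath:gameStore error:&error]) {
    NSLog(@"Could not create game store: %@", error);
}
// Then add gameStore (not the bundle copy) to the persistent store coordinator.
```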

Testing on blackberry device - adding and removing app multiple times

It would be useful for many people to know how to completely remove an application from your device when testing.
I have downloaded my app many times now, and likewise have deleted it many times. The problem is when deleting the app, it does not remove things like the persistent object related to my app, or the images downloaded through the app. So, when I download the next build, I have no idea if something broke that is related to building the persistent object or fetching the images since those elements already exist from the last build.
I don't know if this is a cache thing. I don't know if this is expected and I have to use some utility to wipe this data after deleting the app. I can't really find much info through basic web searches.
Any information would be appreciated.
Blackberry Bold 9000. 4.6 OS. tested with both SD card and no SD card.
Objects stored in the PersistentStore are automatically deleted on uninstall if their interfaces were defined in your project; if they are instances of standard BlackBerry API classes, they will stick around until something deletes them. E.g., if you save a String in the PersistentStore, it will stay there, but if you save a class you created, it will be deleted on uninstall. So if you want those objects to be deleted automatically, just create a wrapper class and save that.
Images stored on the filesystem will not be deleted until you or some application deletes them. However, it should be easy for you to write an app that clears everything out.
Another solution you could implement is making your app somewhat self-aware of its data.
Create a simple String value that you persist (or optionally, persist it in a Hashtable so you can store many properties this way) that includes "Version".
At startup of the GUI app, compare the stored "Version" against the application's current version. If the stored version doesn't exist, or if it exists and matches, take no action.
If it exists and does not match, automatically clean up the old persisted data, or alternatively prompt the user to ask whether they want that data deleted (which approach is better will depend on your implementation).
You can also use CodeModuleListener to listen for an uninstall event -- when that happens, you can clean up at that time as well or instead.
(As an aside and a bit of shameless self promotion, I am actually currently working on a shareable library for Blackberry that makes managing persistence much easier, as well as desktop data backup/restore. I'm doing this as part of the BBSSH project, but I'll be splitting it off into a separate library of core components and publishing it under a dual GPL/optional commercial license. It will contain hooks for data cleanup and data versioning. )