In Rails tests, changes to objects not reflected in the database? - ruby-on-rails-3

I have a Rails 3 application in which I am trying to write a test case. When the test case loads, I can see that it creates the objects corresponding to the fixtures in the database. However, if I change an object within the test case or create another object, the changes do not show up in the database. Yet if I call find on the newly created object within the test, it is found correctly.
What is happening here? Also, how can I get the changes to show up in the database?

In your test_helper.rb, the use_transactional_fixtures option is probably set to true. This means that every test is wrapped in a database transaction. A transaction (in MySQL InnoDB, for example) is only persisted with a COMMIT. The Rails testing framework never issues that COMMIT, so your database is left in a fresh state after each test; as a consequence, your changes are rolled back.
Changes within a transaction can only be seen by the database connection that started it; from the outside (in your case the rails db command) they won't show up.
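If you really need the rows to be visible from another connection (for example, to inspect them with rails db while a test is paused), you can switch off the wrapping transaction for that test case. A minimal sketch, assuming the Rails 3 option name and a hypothetical Login model:

# test/unit/login_test.rb -- hypothetical test case
class LoginTest < ActiveSupport::TestCase
  # Don't wrap each test in a transaction; saves are actually committed.
  # You then become responsible for cleaning up the data yourself.
  self.use_transactional_fixtures = false

  test "creating a login writes a committed row" do
    Login.create!(:name => "example")
    # The row is now visible from rails db or any other connection.
  end
end

The trade-off is speed and isolation: without the wrapping transaction, each test has to clean up after itself (or you use something like the database_cleaner gem with a truncation strategy).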
I hope this helps.

Related

What is the additional functionality provided by a transaction that isn't already provided by SaveChanges?

In Entity Framework, when I make changes to multiple models and at the end call SaveChanges, either all changes get stored into the database, or no changes get stored if any of them result in an error.
This already works implicitly like a transaction, so what exactly is the purpose of creating explicit transactions?

How to explicitly call TIBDataSet.RefreshSQL

I have a list of records in a TIBDataSet (Embarcadero Delphi) and I need to locate and modify one record in this list. There is a chance that the underlying database record has been changed by other queries and operations since the TIBDataSet was opened. Therefore I would like to call RefreshSQL for this one record (to get the latest data) before making any changes and before posting. Is it possible to do so, and how?
I am not concerned about the state of other records, and I am sure that the record under consideration will always be updated and those updates will be committed before I need to change this record from the TIBDataSet.
As far as I understand, RefreshSQL is used for the automatic retrieval of changes after the TIBDataSet has posted updates to the database. But I need manual (explicit) retrieval of the latest state before doing updates.
Try adding a TButton to your form and add the following code to its OnClick handler:
procedure TForm1.btnRefreshClick(Sender: TObject);
begin
IBQuery1.Refresh; // or whatever your IBX dataset is called
end;
and set a breakpoint on it.
Then run your app and another one (e.g. a second instance of it), change a row in the second app, and commit it back to the db.
Navigate to the changed row in your app and click btnRefresh and use the debugger to trace execution.
You'll find that TDataSet.Refresh calls its InternalRefresh which in turn calls TIBCustomDataSet.InternalRefresh. That calls inherited InternalRefresh, which does nothing, followed by TIBCustomDataSet.InternalRefreshRow. If you trace into that, you'll find that it constructs a temporary IB query to retrieve the current row from the server, which should give you what you want before making changes yourself.
So that should do what you want. The problem is, it can be thoroughly confusing trying to monitor the data in two applications because they may be in different transaction states. So you are rather dependent on other users' apps "playing the transactional game" with you, so everyone sees a consistent view of the data.

Rails testing - no rows returned when querying the database during tests

I've made fixtures for some (not all) of my models, and have written tests in which objects of other models are created and saved. I'm using the standard testing system built into Rails (rake test). In one of my tests I've called debugger.
In the debugger, calling Login.all returns two logins which were created in the test. However, if I connect to the database (a PostgreSQL database, I'm connecting using pgAdmin 3) and do a select * from logins, no rows are returned.
I know this is the right database because my fixtures are all there (e.g. if I run select * from users, all of my fixture users appear).
I assumed that when I called #save on an ActiveRecord object during a test, it would actually insert it into the database. Is this assumption incorrect, or is there something else I'm doing wrong?
Turns out each test runs in a transaction. When the debugger is hit, the transaction hasn't been committed yet so nothing shows up in the database. The transaction can be manually committed (but then the data can't be rolled back after the test completes, so it could clash with other tests).
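If you only want to peek at the data from pgAdmin while the debugger is paused, one option is to force the open test transaction to commit, roughly like this (a sketch using ActiveRecord's connection API; anything committed this way is no longer cleaned up automatically and may clash with other tests):

# At the debugger prompt, inside the paused test:
ActiveRecord::Base.connection.commit_db_transaction  # flush the test's wrapping transaction to the database
ActiveRecord::Base.connection.begin_db_transaction   # open a new one so the teardown rollback still has something to roll back

After the first line, select * from logins in pgAdmin should show the rows created in the test. A cleaner long-term approach is to disable transactional fixtures for the test cases that genuinely need committed data.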

Test seeds.rb in Rails

Before updating production, I need to ensure that all manipulations performed by the seeds work correctly.
How do you test seeds.rb with RSpec?
A seed is intended to be run first on an empty database, to set up the correct initial state. If that initial state needs to change (e.g. domain tables), you have to adapt the seed accordingly so that it can add non-existing elements or change existing elements. A good way to achieve this is to do something like:
admin = Operator.find_or_create_by_login!(:admin) do |adm|
  adm.name = 'admin'
  adm.is_administrator = true
end
Before running the tests, we also load the seeds, which makes it easy to create a spec that verifies the needed data is there (in case you don't trust it).
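A minimal sketch of that setup, assuming RSpec 2 and that the Operator example above lives in db/seeds.rb (the file names and matcher style are assumptions):

# spec/spec_helper.rb
RSpec.configure do |config|
  config.before(:suite) do
    Rails.application.load_seed  # runs db/seeds.rb against the test database
  end
end

# spec/models/seeds_spec.rb -- verify the seeded data is really there
describe "seeds" do
  it "creates the admin operator" do
    admin = Operator.find_by_login('admin')
    admin.should_not be_nil
    admin.is_administrator.should be_true
  end
end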
If you need to manipulate existing data, the preferred way is to use a migration. I generally do not write specs for migrations, but test them on my development database and on a copy of my production database (before running them on top of actual production).
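For example, such a data manipulation could live in a migration along these lines (a hypothetical sketch reusing the Operator model from the seed above; the exact migration syntax depends on your Rails version):

class BackfillOperatorAdminFlag < ActiveRecord::Migration
  def up
    # One-off data change, run exactly once per environment by rake db:migrate
    Operator.where(:is_administrator => nil).update_all(:is_administrator => false)
  end

  def down
    # A data backfill like this is usually not meaningfully reversible
  end
end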
Hope this helps.

iOS Rolling out app updates. Keeping user data intact when DB update required

I have just done a quick search and nothing too relevant came up so here goes.
I have released the first version of an app. I have made a few changes to the SQLite db since then; in the next release I will need to update the DB structure but retain the user's data.
What's the best approach for this? I'm currently thinking that on app update I will never replace the user's database file (in the documents folder, not in the bundle) but rather alter its structure using SQL queries.
This would involve tracking the changes made to the database since the previous release, scripting them as SQL queries, and running those to bring the DB up to the latest revision. I will also need to keep a field in the database to track the version number (kept in line with the app version for simplicity).
Unless there are specific hooks or delegate methods that are fired on the first run after an update, I will put the calls to this logic at the very beginning of the app delegate, before anything else is run.
While doing this I will display "Updating app" or something to the user.
Next thing: what happens if there is an error somewhere along the line and the update fails? The DB will be out of date, and the app won't function properly because it expects a newer version.
Should I take it upon myself to just delete the user's DB file and replace it with the new version from the app bundle? Or should I just test, test, test until everything is solid on my side, and if an error occurs on the user's side it's something else, in which case I can't do anything about it except discard the data?
Any ideas on this would be greatly appreciated. :)
Thanks!
First of all, the approach you are considering is the correct one. This is known as database migration. Whenever you modify the database on your end, you should collect the appropriate ALTER TABLE... etc. statements into a migration script.
Then the next release of your app should run this code once (as you described) to migrate all the user's data.
As for handling errors, that's a tough one. I would be very wary of discarding the user's data. Better would be to display an error message and perhaps let the user contact you with a bug report. Then you can release an update to your app which hopefully can do the migration with no problems. But ideally you should test the process well enough that there shouldn't be any problems like this. Of course it all depends on the complexity of the migration process.