Transfer MySQL to SQLite

I want to start using Core Data on iPhone with pre-existing MySQL databases. What's the easiest way to transfer a MySQL database to SQLite?
I've tried using SQLite Migrator, but I don't know where to find the ODBC drivers for Mac (Snow Leopard). I found http://www.ch-werner.de/sqliteodbc/ which seems to have drivers, but they are for PowerPC.
If someone could give me a walkthrough, or tell me what the best tools for this are, I'd be grateful.
Thanks.

Perhaps the simplest would be to use mysqldump to dump the raw SQL from your MySQL database into a text file and then use the sqlite3_exec() function to execute that SQL in order to populate the SQLite database.
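For illustration, here is a rough Python sketch of that idea, using the sqlite3 module's executescript() in place of a direct sqlite3_exec() call. The database name and credentials are placeholders, and a real MySQL dump usually needs some massaging before SQLite will accept it (the --compatible=ansi and --skip-extended-insert dump options help):

# Hypothetical sketch: replay a MySQL dump into SQLite.
# Credentials and database names are placeholders.
import sqlite3
import subprocess

dump = subprocess.run(
    ["mysqldump", "--compatible=ansi", "--skip-extended-insert",
     "-u", "user", "-ppassword", "mydatabase"],
    capture_output=True, text=True, check=True,
).stdout

conn = sqlite3.connect("mydatabase.sqlite")
conn.executescript(dump)   # the C-level equivalent of sqlite3_exec()
conn.commit()
conn.close()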

Have you looked at this Perl script? I haven't used it - just did a quick search for MySQL to SQLite migration and it popped right up.
Edit (after you replied to my comment):
The reverse direction is dealt with here.
If you are going to do it repeatedly and the data structure is going to change, maybe you would be better off using something like Django (albeit in a very hackish way). With it I would:
# These three lines are done once
django-admin.py startproject mymigrationproject
cd mymigrationproject
./manage.py startapp migration
# The following lines you repeat each time you want to migrate the data
edit settings.py and make the changes to connect to MySQL
./manage.py inspectdb > ./migration/models.py
edit ./migration/models.py to reorder tables (tables that other tables depend on go on top)
mkdir fixtures
./manage.py dumpdata migration > ./fixtures/data.json
edit settings.py and make the changes to connect to SQLite
./manage.py syncdb
./manage.py loaddata ./fixtures/data.json
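For reference, the two "edit settings.py" steps above amount to switching the database connection. A minimal sketch, assuming a Django version that uses the DATABASES setting (names, paths, and credentials are placeholders):

# settings.py while exporting: point Django at the MySQL source.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "mydb",
        "USER": "user",
        "PASSWORD": "password",
        "HOST": "localhost",
    }
}

# settings.py while importing: point Django at the SQLite target instead.
# DATABASES = {
#     "default": {
#         "ENGINE": "django.db.backends.sqlite3",
#         "NAME": "/path/to/mydb.sqlite3",
#     }
# }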

Here is a list of converters:
http://www.sqlite.org/cvstrac/wiki?p=ConverterTools
An alternative method that would work nicely but is rarely mentioned is: use an ORM class that abstracts the specific database differences away for you. E.g. you get these in PHP (RedBean), Python (Django's ORM layer, Storm, SQLAlchemy), Ruby on Rails (ActiveRecord), and Cocoa (Core Data).
i.e. you could do this:
Load data from the source database using the ORM class.
Store the data in memory or serialize it to disk.
Store the data into the destination database using the ORM class.
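As a concrete illustration of that recipe, here is a rough sketch using SQLAlchemy in Python. The connection URLs and driver are assumptions, and column type differences between the two engines may still need manual attention:

# Hypothetical sketch: copy every table from MySQL to SQLite via the ORM layer.
from sqlalchemy import create_engine, MetaData

src = create_engine("mysql+mysqldb://user:password@localhost/mydb")
dst = create_engine("sqlite:///mydb.sqlite")

meta = MetaData()
meta.reflect(bind=src)        # read the schema from MySQL
meta.create_all(bind=dst)     # recreate the same tables in SQLite

with src.connect() as sconn, dst.begin() as dconn:
    for table in meta.sorted_tables:   # parents before children
        rows = [dict(r._mapping) for r in sconn.execute(table.select())]
        if rows:
            dconn.execute(table.insert(), rows)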

You can use a trial version of SQLite Data Wizard from http://www.sqlmaestro.com/products/sqlite/datawizard/
It is completely functional for 30 days.

You can get ODBC drivers for Mac OS X from Actual Technologies.
http://www.actualtech.com/
To connect to MySQL you need their ODBC Driver for Open Source Databases:
http://www.actualtech.com/product_opensourcedatabases.php
(Disclaimer: I am the author of SQLite Migrator)

There is a free ETL product that can be used to migrate data from one database to another. Have a look: http://www.talend.com/index.php
Good luck!

To do my conversions, I ended up using an ODBC driver from Actual Technologies. I think I used it in combination with SQLite Migrator. I never liked this approach, though; it was always clunky. It was expensive, too: it ended up costing about $80 for those two pieces of software.
If I had to do this again, I'd buy SQLiteConverter by SQLabs. I use their SQLite Manager, and although it has a lot of interface problems, for database software it's not bad.
http://www.sqlabs.net/sqliteconverter.php

Related

How do I add a (SQL) database to my Sinatra application?

When I started to code my Sinatra application I had never used it before. Note that I had, and still have, no experience with RoR. I had one .rb file and one .haml and was happy. Now I have had to split the .rb file into about 10 'library' files as the whole application gets more and more complex.
I store some application logs/info in CSV files and now I am getting conflicts when accessing the CSV file. So I think I need to introduce a "proper" database solution. I want it to be part of my Ruby (Sinatra) application.
How can I introduce a 'light' SQL database into my Sinatra application?
I am on ruby 1.8.7 (2010-08-16 patchlevel 302) [i386-mingw32] soon upgrading to 1.9 (hopefully)
I'd recommend looking at Sequel. It's very flexible and powerful, and works well with SQLite, MySQL, Postgres, Oracle and other DBMSs. It's not opinionated about how you talk to the database; you can use it as an ORM or with simple datasets, and it allows embedded SQL or more programmatic approaches.
For an ORM, both ActiveRecord and Sequel are recommended. As for the database, I guess SQLite3 will be good enough for your needs, but you can also choose MySQL or PostgreSQL.
If you want to use active_record, you'll find this article very useful.
And if Sequel is the choice, just read Sequel documents here.
After the gems are installed, you can start writing some code to connect to the db, then maybe some migration tasks to build the database tables (and don't forget to build some corresponding models). Both gems have similar syntax for migrations. After that, import your CSV data and you're done.
I have had no trouble using either Active Record or DataMapper to add object persistence to my Sinatra apps. People also tell me Sequel is very good, but philosophically it is not worlds apart from Active Record imho.
Active Record and Sequel favour a more database-centric approach, whereby you spell out your tables as a set of database and table definitions in a collection of migration files and merge them into a schema which is then used to build or update your database tables. If you really care about the underlying SQL database then one of these is for you. I find them to be six of one, half a dozen of the other.
DataMapper is more object-centric and lets you define the properties and object relationships you need in your object's own class definition; then, when your app launches, you make sure you call DataMapper.auto_upgrade! and it upgrades the database to suit your object graph. The upside is that you only have one place to look to find what properties your object might have. The downside is you have less control over the specifics of the underlying databases, though it's not impossible to tightly define the mappings. DataMapper works well when you care about object graphs more than database tables.
The good news is they pretty much all work in the same way once you have your mappings from object graph to SQL database tables defined. All support lazy or pre-emptive loading of related collections of objects, many_to_many relationships, polymorphism, etc, and tend to vary only in configuration and seeding details.
I often start projects using DataMapper just for its speed of throwing up and tearing down database schemas, as the app's object graph is still in flux; I refactor quickly to use Active Record when the schema has settled down. Next project I think I'll give Sequel a go though, as people do seem to rave about it.
I have had success using DataMapper with Sinatra, but like the other posts said, you can also use Sequel and Active Record. One advantage to using Active Record, though, is that if you ever want to use/learn RoR, Active Record is the default ORM there, so that might be something you want to consider.
If you don't want to go the ORM route you can always use the sqlite-ruby gem, which will allow you to create and run SQL queries directly. Here is some example code from the website http://sqlite-ruby.rubyforge.org/
require 'sqlite'
db = SQLite::Database.new( "data.db" )
# execute yields each result row to the block
db.execute( "select * from table" ) do |row|
  p row
end
db.close

Data Migrations on Production Database

Is there any way to write data migrations for a production database in something other than SQL?
We are using MigratorDotNet, and when we build new functionality that changes the schema of the database and we need to do some data updates, we have to write complex and troublesome SQL statements so that the data stays consistent in production.
I was wondering if there is another way to do this. What are the best practices? Any ideas on other possible solutions?
We cannot use something like NHibernate, because then we would have to keep fixing old migrations when the schema changes, and that can be error-prone.
The trick is to use your migration tool and fold said data manipulation statements into the migrations. We use an extended version of the same tool on a few projects, and it can definitely handle that.
If you're already using a migration tool like Migrator.NET then I'd say you're most of the way there. Complex schema/data changes are just a fact of life in the RDBMS world.
Try mite. It lets you do anything you can do with SQL, and use SQL to do it, but with the ability to ensure your database is on the desired version without the risk of executing a script that has already run (or missing a script), leaving your database in a consistent state.
If your developers adopt this, deployments are a simple mite update, and then you know problems are product-related or data-related (but not schema-related).
https://github.com/soitgoes/mite
Let me know what you think. I developed this and have been using it with my team for years with great success.

What is the most used and efficient method to copy a database schema to other servers (not the data)?

The servers are all SQL Server 2008, on Windows XP.
I have the following task:
Create a huge database. (DONE)
Distribute it to the 20 waiting servers.
If there were two or three, I would have taken the trouble of creating the DBs on all of them using SQL Server Management Studio,
but I am guessing that there is a more efficient way.
Please note:
only a copy of the database structure, the schema, is needed; not the values within the cells!
Thank you.

Or of course you should have been creating the scripts as you went along and putting them in source control. Then you would have exactly the scripts you needed for this version of the software, and you would be used to doing the same thing for later modifications. You would also script the data inserts for any lookup tables you need to build.
Not having that, you can script the entire database, or use a SQL compare tool. But I strongly urge you to start treating database code like all other code: scripting it, storing it in source control, and versioning it. Life is so much better when you do that.
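Once you have the schema as a script, pushing it to all 20 servers can itself be scripted. A hypothetical Python sketch (the server names and the use of trusted connections are assumptions):

# Run the schema script against every server with sqlcmd.
import subprocess

servers = ["SERVER%02d" % n for n in range(1, 21)]   # placeholder names

for server in servers:
    # -S server, -E trusted connection, -i input script
    subprocess.run(["sqlcmd", "-S", server, "-E", "-i", "create_schema.sql"],
                   check=True)
    print("deployed schema to", server)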
What Gabriel McAdams has shown works, or Red Gate SQL Compare does this very nicely also.
If you can spare the moolah, a tool like Red Gate's SQL Packager is an option I have used in the past, and it works well!
The tool can do a lot more as well, and may not be worth the spend if you do not need the other features!
In that case, Gabriel's option above is definitely the easiest one to go with!

What's the best build system for building a database?

This is a problem that I come to on occasion and have yet to work out an answer that I'm happy with. I'm looking for a build system that works well for building a database - that is running all of the SQL files in the correct database instance as the correct user and in the correct order, and handling dependencies and the like properly.
I have a system that I hacked together using Gnu Make and it works, but it's not especially flexible and frankly can be a bit of a pain to work with in some situations. I've considered looking at things like SCons and CMake too, but I don't know how much better they are likely to be, or if there's a better system out there that already exists...
Just a shell script that runs all the create statements and imports in the proper order works. You may also find migrations (which come with Rails) interesting. It provides a make-like infrastructure that lets you maintain a database whose structure evolves over time.
Say you add a new column to some table. In migrations you'd write a snippet of code which describes the requirements for adding the column, and also how to roll back the change, so you can switch between different versions of your schema automatically.
I'm not a big fan of the tight integration with Rails, though; the principles behind it are very interesting.
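To make the principle concrete outside of Rails, here is a minimal, hypothetical sketch in Python: each migration carries both an "up" and a "down" step, and SQLite's user_version pragma records where the schema currently stands:

import sqlite3

# (name, up, down) triples; the rollback for the column addition has to
# rebuild the table, since SQLite cannot drop columns directly.
MIGRATIONS = [
    ("create users",
     "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)",
     "DROP TABLE users"),
    ("add email column",
     "ALTER TABLE users ADD COLUMN email TEXT",
     "CREATE TABLE tmp (id INTEGER PRIMARY KEY, name TEXT);"
     "INSERT INTO tmp SELECT id, name FROM users;"
     "DROP TABLE users;"
     "ALTER TABLE tmp RENAME TO users"),
]

def migrate(db, target):
    (current,) = db.execute("PRAGMA user_version").fetchone()
    while current < target:                      # apply pending "up" steps
        db.executescript(MIGRATIONS[current][1])
        current += 1
    while current > target:                      # undo with "down" steps
        current -= 1
        db.executescript(MIGRATIONS[current][2])
    db.execute("PRAGMA user_version = %d" % current)

db = sqlite3.connect("app.db")
migrate(db, target=len(MIGRATIONS))              # or any earlier version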
For SQL Server, I just use a batch file with SQLCMD.EXE and a bunch of .SQL files. It's not perfect, but it seems to work.
For my database, I use Migrator.NET.
This is a .NET framework which allows you to create classes in which you define your DDL statements.
The framework comes with a command-line tool with which you can execute your 'migrations' in the correct order.
It also has an MSBuild task, so you can integrate it into a continuous integration build as well.
First, export full DDL files describing all tables, views, source code (procedures, functions, packages), sequences, and grants of a DB schema. See:
Is there a tool to generate a full database DDL for SQL Server? What about Postgres and MySQL?
I created a database build system (part SQL parser, part makefile) to put these files together into a DB creation script, using Python.
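The glue can be quite small. A hypothetical sketch (the directory layout and ordering are assumptions for illustration, not a description of the system above):

# Concatenate exported DDL files into one creation script, in an order
# that respects dependencies between object types.
from pathlib import Path

ORDER = ["sequences", "tables", "views", "procedures", "grants"]

with open("create_db.sql", "w") as out:
    for kind in ORDER:
        for ddl in sorted(Path("ddl", kind).glob("*.sql")):
            out.write("-- %s\n" % ddl)
            out.write(ddl.read_text() + "\n")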

What's the best way of converting a MySQL database to a SQLite one?

I currently have a relatively small (4 or 5 tables, 5000 rows) MySQL database that I would like to convert to an SQLite database. As I'd potentially have to do this more than once, I'd be grateful if anyone could recommend any useful tools, or at least any easily-replicated method.
(I have complete admin access to the database/machines involved.)
I've had to do similar things a few times. The easiest approach for me has been to write a script that pulls from one data source and produces an output for the new data source. Just do a SELECT * query for each table in your current database, and then dump all the rows into an INSERT INTO query for your new database. You can either dump this into a file or pipe it straight into the database frontend.
It's not pretty, but honestly, pretty hardly seems to be a major concern for things like this. This technique is quick to write, and it works. Those are my primary criteria.
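A rough sketch of that technique in Python (the driver, credentials, and table names are placeholders, and it assumes matching tables already exist in the SQLite file):

import sqlite3
import MySQLdb   # any MySQL driver would do

src = MySQLdb.connect(user="user", passwd="password", db="mydb")
dst = sqlite3.connect("mydb.sqlite")

for table in ("customers", "orders", "order_items"):   # placeholder tables
    cur = src.cursor()
    cur.execute("SELECT * FROM %s" % table)
    rows = cur.fetchall()
    if rows:
        marks = ", ".join("?" * len(rows[0]))
        dst.executemany("INSERT INTO %s VALUES (%s)" % (table, marks), rows)

dst.commit()
dst.close()
src.close()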
You might want to check out this thread, too. It looks like a couple of people have already put together basically what you need. I didn't look that far into it, though, so no guarantees.
As long as a MySQL dump file doesn't use anything beyond the SQLite query language, you should be able to migrate fairly easily:
tgl@moto:~$ mysqldump old-database > old-database-dump.sql
tgl@moto:~$ sqlite3 -init old-database-dump.sql new-database
I haven't tried this myself.
UPDATE:
Looks like you'll need to do a couple of edits to the MySQL dump. I'd use sed, or Google for a ready-made script.
Just the comment syntax, the auto_increment and TYPE= declarations, and the escape characters differ.
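For instance, a hypothetical cleanup pass over the dump in Python might look like this; the substitutions are illustrative rather than exhaustive, and what you actually need depends on your MySQL version and dump options:

import re

with open("old-database-dump.sql") as f:
    sql = f.read()

sql = re.sub(r"/\*!\d+.*?\*/;?", "", sql, flags=re.DOTALL)       # conditional comments
sql = re.sub(r"^(LOCK|UNLOCK) TABLES.*$", "", sql, flags=re.MULTILINE)
sql = re.sub(r"AUTO_INCREMENT(=\d+)?", "", sql)                  # not SQLite syntax
sql = re.sub(r"(ENGINE|TYPE)\s*=\s*\w+", "", sql)                # table options
sql = sql.replace("\\'", "''")                                   # string escape style

with open("sqlite-ready-dump.sql", "w") as f:
    f.write(sql)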
If it's just a few tables you could probably script this in your preferred scripting language and have it all done in the time it'd take to read all the replies or track down a suitable tool. I would, anyway. :)