I have a database in a format which can be accessed via ODBC. I'm looking for a command-line tool to generate an SQL file with DROP/CREATE statements from it, preferably with all the information, including table/field comments and table relations. (Possibly also a tool to parse the file and import the schema, but I guess that would be easier to find.) I need this to automate my workflow: to be able to design the database visually but store it in SVN in code form.
Which tool should I use?
If this helps, the database in question is MS Access, but I guess there's a higher chance of finding a generic ODBC tool...
Okay, I wrote a tool to export the Access schema and parse SQL files myself; it's available here:
https://bitbucket.org/himselfv/jet-tool
Feel free to use it if anyone needs it.
Adding this because I wanted to search an ODBC schema and came across this post. This tool lets you dump the schema itself in CSV format:
http://sagedataobjects.blogspot.co.uk/2008/05/exploring-sage-data-schema.html
And then you can grep away...
This script may work for you with some modifications. Access (the application) is required though.
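If Access itself isn't available, a generic ODBC route is doable in a few lines of Python with pyodbc. A minimal sketch (the DSN name is hypothetical; type names come back in the driver's dialect, so the generated DDL usually needs touching up, and comments/relations need extra metadata calls):
import pyodbc  # assumes the pyodbc package and a DSN configured for the database

conn = pyodbc.connect("DSN=MyAccessDb")    # hypothetical DSN name
cur = conn.cursor()

# Collect the table names first so the cursor can be reused for column metadata.
tables = [row.table_name for row in cur.tables(tableType="TABLE")]

for table in tables:
    cols = [f"  [{c.column_name}] {c.type_name}" for c in cur.columns(table=table)]
    print(f"DROP TABLE [{table}];")
    print(f"CREATE TABLE [{table}] (\n" + ",\n".join(cols) + "\n);")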
I would like to know if anyone has a good solution for importing BAIv2 banking files into SQL Server. First of all, the files have "continuation records" which have to be considered along with the parent records. Also, T-SQL doesn't have a pleasant way of parsing comma-separated strings. Finally, one hierarchy in the file has a varying number of elements, which makes pasting directly into a table difficult because the columns would not line up.
This is the file from hell. If anyone has any insights into how to import and parse BAIv2 banking files I would be most appreciative.
Thank you,
You're best off handling this with a dedicated application server and a real (general-purpose) programming language. T-SQL is ill-suited for this task.
When that's not an option, you can use C# for a SQL CLR stored procedure to parse the files. I did something similar for banking flat-files when I didn't have the option of an application server.
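For the parsing itself, the key step is folding the type-88 continuation records into their parent records before interpreting anything. A minimal Python sketch of just that folding step (field handling is simplified, and the file name is hypothetical; real BAIv2 records also carry a trailing "/" terminator):
# Fold BAIv2 "88" continuation records into their parent record so each
# logical record can be parsed as one flat list of fields.
def logical_records(path):
    current = None
    with open(path) as f:
        for line in f:
            fields = line.rstrip("\n").rstrip("/").split(",")
            if fields[0] == "88" and current is not None:
                current.extend(fields[1:])   # continuation: append to parent
            else:
                if current is not None:
                    yield current
                current = fields
    if current is not None:
        yield current

for rec in logical_records("lockbox.bai"):   # hypothetical file name
    record_type = rec[0]                     # 01, 02, 03, 16, 49, 98, 99, ...
    print(record_type, rec[1:])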
Can we export the schema of a table and its data from Qt code?
Or can we do it with an SQL script, i.e. a query that returns a table's schema?
You can probably manage to export simple tables in a generic manner using only the functions provided by QSqlDatabase (via the tables() and record() functions as a starting point), but as far as I know, you'll need to use database-specific queries to get complete schema information.
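As a starting point, here is a minimal sketch of that generic approach, shown with PySide6's QtSql bindings (the C++ calls are the same; the file name is hypothetical). It recovers only table and column names, which is exactly the limitation described above:
# Minimal sketch using PySide6's QtSql bindings; constraints, indexes and
# triggers still require database-specific queries.
from PySide6.QtSql import QSqlDatabase

db = QSqlDatabase.addDatabase("QSQLITE")   # driver name varies per backend
db.setDatabaseName("example.db")           # hypothetical database file
if not db.open():
    raise RuntimeError(db.lastError().text())

for table in db.tables():
    rec = db.record(table)                 # QSqlRecord describing the table
    cols = [rec.field(i).name() for i in range(rec.count())]
    print(table, cols)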
This is best done, in my opinion, with your specific database implementation's tools. For instance, SQLite has a .dump command that does just that. MySQL has a dedicated mysqldump utility. PostgreSQL has pg_dump, etc...
It's safer to use pre-built tools for your specific engine. Getting all the DDL statements correct, plugging in the keys and triggers at the right time, worrying about encoding, and so on is quite a task.
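As a concrete example, SQLite's .dump behaviour is even exposed programmatically: Python's built-in sqlite3 module can produce the same DDL-plus-data script (file names here are hypothetical):
import sqlite3

# Write the same DDL+INSERT script that sqlite3's .dump command produces.
conn = sqlite3.connect("example.db")        # hypothetical database file
with open("dump.sql", "w") as f:
    for line in conn.iterdump():
        f.write(line + "\n")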
I want to start using Core Data on the iPhone with pre-existing MySQL databases. What's the easiest way to transfer a MySQL database to SQLite?
I've tried using SQLite Migrator, but I don't know where to find the ODBC drivers for Mac (Snow Leopard). I found http://www.ch-werner.de/sqliteodbc/ which seems to have drivers, but they are for PowerPC.
If someone could give me a walkthrough, or tell me what the best tools for this are, I'd be grateful.
Thanks.
Perhaps the simplest approach would be to use mysqldump to dump the raw SQL from your MySQL database into a text file, and then use the sqlite3_exec() function to execute that SQL to populate the SQLite database.
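In Python rather than the C API, the same idea looks roughly like this. Note that mysqldump output usually needs massaging first (strip backticks, ENGINE clauses, and MySQL-only types) before SQLite will accept it; file names are hypothetical:
import sqlite3

# executescript() is the Python analogue of sqlite3_exec() for whole scripts.
sql = open("mysqldump_output.sql").read()   # hypothetical, already cleaned up
conn = sqlite3.connect("target.db")
conn.executescript(sql)
conn.commit()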
Have you looked at this Perl script? I haven't used it - just did a quick search for mysql to sqlite migration and it popped right up.
Edit (after you replied to my comment):
The reverse direction is dealt with here.
If you are going to do it repeatedly and if data structure changes are to happen, maybe you would be better off using something like Django (albeit in a very hackish way). With it I would:
# These three lines are done once
django-admin.py startproject mymigrationproject
cd mymigrationproject
./manage.py startapp migration
# The following lines you repeat each time you want to migrate the data
# (edit settings.py and make the changes to connect to MySQL)
./manage.py inspectdb > ./migration/models.py
# (edit ./migration/models.py to reorder the tables: tables that others depend on go on top)
mkdir fixtures
./manage.py dumpdata migration > ./fixtures/data.json
# (edit settings.py and make the changes to connect to SQLite)
./manage.py syncdb
./manage.py loaddata ./fixtures/data.json
Here is a list of converters:
http://www.sqlite.org/cvstrac/wiki?p=ConverterTools
An alternative method that works nicely but is rarely mentioned: use an ORM class that abstracts the specific database differences away for you. You get these in PHP (RedBean), Python (Django's ORM layer, Storm, SQLAlchemy), Ruby on Rails (ActiveRecord), and Cocoa (Core Data).
i.e. you could do this:
Load the data from the source database using the ORM class.
Store the data in memory or serialize it to disk.
Store the data into the target database using the ORM class.
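For instance, with Python and SQLAlchemy the whole round trip fits in a few lines. This is only a sketch: the connection strings are hypothetical, and reflected MySQL types don't always map cleanly onto SQLite, so check the result:
from sqlalchemy import create_engine, MetaData, select

src = create_engine("mysql+pymysql://user:pass@localhost/mydb")  # hypothetical
dst = create_engine("sqlite:///mydb.sqlite")

meta = MetaData()
meta.reflect(bind=src)        # read the schema from MySQL
meta.create_all(bind=dst)     # recreate the tables in SQLite

with src.connect() as s, dst.begin() as d:
    for table in meta.sorted_tables:      # parents before children
        rows = [dict(r._mapping) for r in s.execute(select(table))]
        if rows:
            d.execute(table.insert(), rows)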
You can use a trial from http://www.sqlmaestro.com/products/sqlite/datawizard/
It is completely functional for 30 days.
You can get ODBC drivers for Mac OS X from Actual Technologies.
http://www.actualtech.com/
To connect to MySQL you need their ODBC Driver for Open Source Databases:
http://www.actualtech.com/product_opensourcedatabases.php
(Disclaimer: I am the author of SQLite Migrator)
There is a free ETL product that can be used to migrate data from one db to another. Have a look: http://www.talend.com/index.php
Good luck!
To do my conversions, I ended up using an ODBC driver from Actual Technologies. I think I used it in combination with SQLite Migrator. I never liked this approach, though; it was always clunky. It was expensive too: it ended up costing about $80 for those two pieces of software.
If I had to do this again, I'd buy SQLiteConverter by SQLabs. I use their SQLite Manager, and although it has a lot of interface problems, it's not bad for database software.
http://www.sqlabs.net/sqliteconverter.php
I have a database "Product" in SQL Server 2008. I have another database "ORDER" in SQL Server 2008.
Both exist on different servers.
Now the requirement is to merge both databases, and to test pointing the applications to this new DB.
Can anyone suggest the best way to accomplish this without losing the information?
I have two options:
1) Script the DB objects (script both DBs and run the scripts in the new DB).
2) Export the DBs.
Which of these is best, or should I use other methods to avoid errors?
I am new to SQL, so please guide me toward the correct option.
Thanks
SNA
In my opinion, the best way to achieve what you want is by just exporting the database.
I think this is the best option because it's a lot safer than scripting the DBs into a new one (a recipe for a lot of frustration and errors).
Just try exporting your database first before trying to do anything with scripting (which obviously also takes a lot more time). So try your fast solution first and see if it works.
(I see you are using SQL Server 2008.) Are you also using Management Studio? If so, you can go into the tables in edit mode and try to copy/paste rows into the new tables. I don't know how big your tables/DBs are, but this could also be an option.
Greetings,
Younes
As you say, the two options are scripting or using the SQL Server export/import wizard.
I've used both (for the same database, as it happens).
A third option is to use Visual Studio Team System 2008 Database Edition GDR.
For a one-time export and import, I'd recommend going with the wizard. It is very safe and also very straightforward. Particularly as you are new to SQL Server, you want to take the approach that minimizes the risk.
The only downside to doing it this way is that it is perhaps a little less transparent than the other methods.
On the project where I merged databases, I ended up using the scripting method, but that was mainly because I had a project that was already using GDR to merge incremental database updates, so adding a data merge script to that was a simple task. All changes needed to go through DBAs who unfortunately weren't very SQL literate (I know!), so keeping all the processes similar was a must.
I also took some of my learnings from scripting the data and applied them to setting up my reference data scripts, so the effort of scripting was not a one-time cost.
Either way, the most important tip I can give is to back up the databases before doing any work on them.
This is a problem that I run into on occasion and have yet to work out an answer that I'm happy with. I'm looking for a build system that works well for building a database: that is, running all of the SQL files in the correct database instance, as the correct user, and in the correct order, handling dependencies and the like properly.
I have a system that I hacked together using GNU Make and it works, but it's not especially flexible and frankly can be a bit of a pain to work with in some situations. I've considered looking at things like SCons and CMake too, but I don't know how much better they are likely to be, or whether a better system already exists...
Just a shell script that runs all the create statements and imports in the proper order. You may also find migrations (which come with Rails) interesting. They provide a make-like infrastructure that lets you maintain a database whose structure evolves over time.
Say you add a new column to some table. In migrations, you'd write a snippet of code that describes the requirements for adding the column and also how to roll back the change, so you can switch between different versions of your schema automatically.
I'm not a big fan of the tight integration with Rails, but the principles behind it are very interesting.
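The core idea is small enough to carry anywhere, though. A toy Python sketch of it (the column change is hypothetical, and a real tool would keep each migration in its own file with proper error handling):
import sqlite3

# Each migration is an (up, down) pair; a version table records where
# the schema currently stands, so we can move in either direction.
MIGRATIONS = [
    ("ALTER TABLE users ADD COLUMN email TEXT",   # up   (hypothetical change)
     "ALTER TABLE users DROP COLUMN email"),      # down
]

def migrate(conn, target):
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (v INTEGER)")
    row = conn.execute("SELECT v FROM schema_version").fetchone()
    current = row[0] if row else 0
    while current < target:                 # apply pending migrations
        conn.execute(MIGRATIONS[current][0])
        current += 1
    while current > target:                 # roll back past migrations
        current -= 1
        conn.execute(MIGRATIONS[current][1])
    conn.execute("DELETE FROM schema_version")
    conn.execute("INSERT INTO schema_version VALUES (?)", (current,))
    conn.commit()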
For SQL Server, I just use a batch file with SQLCMD.EXE and a bunch of .SQL files. It's not perfect, but it seems to work.
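The same batch-file idea, sketched in Python for anyone who wants explicit ordering and fail-fast behaviour. The server, database, and directory names are hypothetical; -E uses a trusted connection and -b makes SQLCMD exit nonzero on SQL errors:
import glob
import subprocess

# Run every .sql file in lexical order (name them 001_..., 002_... to
# control the order); abort the whole build on the first failure.
for script in sorted(glob.glob("sql/*.sql")):
    subprocess.run(
        ["sqlcmd", "-S", "myserver", "-d", "mydb", "-E", "-b", "-i", script],
        check=True)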
For my database, I use Migrator.NET
This is a .NET framework that lets you create classes in which you define your DDL statements.
The framework comes with a command-line tool with which you can execute your 'migrations' in the correct order.
It also has an MSBuild task, so you can integrate it into a continuous integration build as well.
First, export full DDL files describing all tables, views, source code (procedures, functions, packages), sequences, and grants of a DB schema. See:
Is there a tool to generate a full database DDL for SQL Server? What about Postgres and MySQL?
I created a database build system (part SQL parser, part makefile) in Python to put these files together into a DB creation script.
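A stripped-down Python sketch of that "put the files together" step; the directory layout and section order here are hypothetical:
from pathlib import Path

# Concatenate the DDL files in dependency order into one creation script.
ORDER = ["sequences", "tables", "views", "procedures", "grants"]  # hypothetical

with open("create_db.sql", "w") as out:
    for section in ORDER:
        for path in sorted(Path(section).glob("*.sql")):
            out.write(f"-- {path}\n")           # mark where each file starts
            out.write(path.read_text() + "\n")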