I am looking into replicating data in real time from an Oracle database to Vertica.
So far I cannot find anything that is able to do this.
But I have found Tungsten Replicator, which seems to work well for MySQL to Vertica (I haven't tested it yet).
My question is:
Are there any tools or ways of doing this (Oracle => Vertica)?
And if so, how would updates be handled?
WisdomForce (acquired by Informatica) does data replication between Oracle and Vertica: http://www.wisdomforce.com/.
Look at Tungsten Replicator. It replicates from MySQL/Oracle to Vertica and is fully open source (GPL V2). You can find the Tungsten Replicator project at code.google.com. (Disclaimer: I work for Continuent and wrote a good chunk of the replicator code.)
Is there any hack or way to make the V language's default ORM support MongoDB?
Currently, I'm able to connect to PostgreSQL, since it is supported as a default DB. That covers the RDBMS part, but what if we want to connect to a NoSQL database?
Also, I'm not able to connect to MySQL or other RDBMS databases. I'll share my own thoughts in the answer section if I come up with any. If you have any hacks or ideas, please post them below. As this language is new and the community is growing, your answers and thoughts will be helpful to others as well.
Since your answer, Vlang now has an official driver for MongoDB.
As of now (Dec 2020) there is no way to connect to a MongoDB database, according to the project's page on GitHub. Support for other RDBMSs is planned, but there is no plan to add NoSQL databases. Still, where there is source code there is a way, and maybe somebody can contribute this feature ;-).
To be honest, I am not a J2EE developer, but I do some coding/scripting and work mostly on the database side. My current requirement is to run Liquibase's generateChangeLog successfully against DB2 z/OS.
But I see that Liquibase does not query the SYSIBM catalog tables in certain places, for example here -> https://github.com/liquibase/liquibase/blob/master/liquibase-core/src/main/java/liquibase/snapshot/jvm/UniqueConstraintSnapshotGenerator.java
My plan is to replace the INFORMATION_SCHEMA query with an equivalent SYSIBM query (lines 309-327), because lines 235 to 253 are probably of no use for DB2 z/OS.
I am not sure whether the code at the following link helps identify if the database is DB2 z/OS -> https://github.com/liquibase/liquibase/blob/master/liquibase-core/src/main/java/liquibase/database/core/Db2zDatabase.java (but it does check the product string, which I think is "DSN" for DB2 z/OS and "SQL" for DB2 LUW).
I am not sure I am looking at the correct Liquibase code on GitHub, but to the best of my knowledge this is the link to the Liquibase source code.
I need your help to know whether I am free to modify this source code and compile it for use in our environment (I mean, are there any legal issues in modifying the source code?).
I would also appreciate your thoughts on whether I am looking in the right place to make Liquibase work for DB2 z/OS.
Please let me know if you have any questions.
Sorry, I can provide only general details.
You are looking in the right place in general, and there are no legal issues with modifying the source code for your own use, other than the terms set out in the Liquibase license: https://github.com/liquibase/liquibase/blob/master/LICENSE.txt
It sounds like you are new to the open source world, so I would spend a little time researching how projects like Liquibase work. A good article to start with would be something like this one: https://opensource.guide/how-to-contribute/
The way the Liquibase project is designed, you would not make changes directly to the code you found. Liquibase is designed to be extensible and to work with many different database engines, so you would instead create a new set of classes that do the 'right' things for DB2/Z. You should definitely get an experienced Java developer working on this with you.
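For illustration, here is a rough sketch of what such an extension class could look like, assuming the standard Liquibase snapshot-generator extension pattern. The package, class name, and the SYSIBM catalog tables mentioned in the comments are assumptions for the example, and the exact methods to override may differ between Liquibase versions.

// A hypothetical sketch: a DB2 z/OS-specific unique-constraint snapshot generator
// that takes over from the stock generator only when the database is Db2zDatabase.
package com.example.liquibase.ext.db2z;

import liquibase.database.Database;
import liquibase.database.core.Db2zDatabase;
import liquibase.snapshot.SnapshotGenerator;
import liquibase.snapshot.jvm.UniqueConstraintSnapshotGenerator;
import liquibase.structure.DatabaseObject;

public class Db2zUniqueConstraintSnapshotGenerator extends UniqueConstraintSnapshotGenerator {

    @Override
    public int getPriority(Class<? extends DatabaseObject> objectType, Database database) {
        // Win over the default generator only for DB2 z/OS; step aside otherwise.
        if (database instanceof Db2zDatabase) {
            return PRIORITY_DATABASE;
        }
        return PRIORITY_NONE;
    }

    @Override
    @SuppressWarnings("unchecked")
    public Class<? extends SnapshotGenerator>[] replaces() {
        // Declare which stock generator this class supersedes when it wins on priority.
        return new Class[]{UniqueConstraintSnapshotGenerator.class};
    }

    // Here you would also override the method that builds the catalog query (the
    // INFORMATION_SCHEMA lookup around lines 309-327 of the stock class) and issue an
    // equivalent query against the z/OS catalog, e.g. SYSIBM.SYSTABCONST joined to
    // SYSIBM.SYSKEYCOLUSE (an assumption; verify the columns against your DB2 version).
}

Depending on your Liquibase version, you register such a class either by placing it under the liquibase.ext package (which Liquibase scans automatically) or by listing it in a META-INF/services/liquibase.snapshot.SnapshotGenerator file, and then ship it as its own jar next to the Liquibase jar instead of patching the core.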
Alternatively, you might look at Liquibase Pro or Datical DB. I work for the company that produces Liquibase Pro and Datical and we have already started work to add support for DB2/Z.
How can I see different versions of HBase data in Hive?
As per my understanding, with HBaseStorageHandler only the latest version of the HBase data is available in Hive. Is my understanding correct/up to date?
Is there any way to access different versions of HBase data using Hive?
Thanks in advance :)
(New to HBase-Hive integration)
That depends on the version of Hive you are using.
Prior to Hive 1.1, HBase timestamps were not accessible through the Hive-HBase integration [1] (related: [2]).
So the answer is: you require Hive 1.1 or higher (see the sketch after the references below).
Hope it helps.
[1] https://issues.apache.org/jira/browse/HIVE-2828
[2] https://issues.apache.org/jira/browse/HIVE-8267
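To make that concrete, here is a minimal sketch (Java, over Hive JDBC) of mapping an HBase table into Hive with the cell timestamp exposed, which is what Hive 1.1 made possible via the ':timestamp' column mapping. The HiveServer2 URL, table, and column names are made up for the example; verify the mapping syntax against your Hive version.

// Minimal sketch: create a Hive table over an existing HBase table and expose
// the HBase cell timestamp as a column (possible from Hive 1.1 onward).
// Assumes a HiveServer2 instance at the URL below and the Hive JDBC driver on the classpath.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HBaseTimestampsInHive {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:hive2://localhost:10000/default");
             Statement stmt = conn.createStatement()) {

            // ':timestamp' in hbase.columns.mapping surfaces the write time of each returned
            // cell; note that a plain SELECT still sees only the latest version of a cell.
            stmt.execute(
                "CREATE EXTERNAL TABLE IF NOT EXISTS hbase_events (rowkey STRING, payload STRING, ts BIGINT) "
                + "STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' "
                + "WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:payload,:timestamp') "
                + "TBLPROPERTIES ('hbase.table.name' = 'events')");

            try (ResultSet rs = stmt.executeQuery("SELECT rowkey, payload, ts FROM hbase_events LIMIT 10")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + "\t" + rs.getString(2) + "\t" + rs.getLong(3));
                }
            }
        }
    }
}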
Not a 100% answer, but some directions. In real life, HBase is always about special cases.
Here is a slightly outdated but really simple article for understanding the approach:
http://hortonworks.com/blog/hbase-via-hive-part-1/
So in practice you can implement any InputFormat or OutputFormat you need.
But this is tied to the MapReduce machinery.
In principle Spark can always rely on an InputFormat too, so the question is only about your special case.
Another good idea is depicted here: http://www.slideshare.net/HBaseCon/ecosystem-session-3a
So snapshots could help you capture the state of the tables you really need, and then you are free to use any machinery to connect Hive with HBase, as long as it follows the standards.
In general, the basic idea is to tune the machinery that connects Hive to your HBase data so that it applies the version filters you need. This does not depend much on software versions, as this interface is pretty stable. For example, see the sketch below.
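As an illustration, the version filtering such a connector would push down looks like this when expressed directly against the HBase client API. This is only a sketch; the table and column family names are made up, and on HBase 2.x setMaxVersions is deprecated in favour of readVersions.

// Sketch of the kind of version filter a Hive/HBase connector could push down,
// written directly against the HBase client API. Table/family names are made up.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;

public class VersionedScanSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("events"))) {

            Scan scan = new Scan();
            scan.setMaxVersions(3);                              // return up to 3 versions per cell
            scan.setTimeRange(0L, System.currentTimeMillis());   // or narrow to the window you care about

            try (ResultScanner scanner = table.getScanner(scan)) {
                for (Result row : scanner) {
                    // rawCells() exposes every returned version, each carrying its own timestamp
                    System.out.println(row.rawCells().length + " cell versions for row " + row);
                }
            }
        }
    }
}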
Hope this will help you.
Does Mondrian support NoSQL databases like MongoDB in the current version? I have read some blogs and bug reports related to this.
Any help is appreciated.
thanks
Lokesh
Please read the following blog post from Julian Hyde, creator of Mondrian:
http://julianhyde.blogspot.com.es/2014/03/improvements-to-optiqs-mongodb-adapter.html
Here you can see that Julian has been working on a new approach that converts even complex SQL queries into MongoDB queries behind the scenes.
Mondrian does not directly support MongoDB at the moment. MongoDB does not have a JDBC implementation.
There are a few options. One of them can be set up if you have access to a Pentaho Data Integration server: you can use a thin JDBC implementation that allows Mondrian to reach a SQL-to-Mongo bridge.
There are certainly other ways to set this up, since there are a lot of data federation engines out there.
Not directly, as far as I know. Maybe someone is working on a dialect; is that even possible? Interesting question though... It may be worth linking the blogs you found so far.
One solution, however, could be to use the Kettle JDBC driver; this driver works with Mondrian, and the other end can be any ETL process. So you could use a MongoDB input step, etc.
There is Apache Drill. You can query MongoDB through standard SQL, and Drill has a JDBC driver, so maybe Mondrian could use that driver (see the sketch below).
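If you want to experiment with that idea, here is a minimal sketch of querying MongoDB through Drill's JDBC driver. The connection URL assumes a local Drillbit, and the storage plugin name ('mongo') plus the database/collection ('sales.orders') and column names are placeholders for the example.

// Minimal sketch: query a MongoDB collection through Apache Drill's JDBC driver.
// Assumes a local Drillbit with a 'mongo' storage plugin pointing at your MongoDB;
// the database/collection and column names below are made up.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DrillMongoQuery {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.drill.jdbc.Driver"); // Drill's JDBC driver class

        try (Connection conn = DriverManager.getConnection("jdbc:drill:drillbit=localhost");
             Statement stmt = conn.createStatement();
             // Drill exposes collections as tables named <plugin>.<database>.<collection>
             ResultSet rs = stmt.executeQuery("SELECT name, amount FROM mongo.sales.orders LIMIT 10")) {
            while (rs.next()) {
                System.out.println(rs.getString("name") + " -> " + rs.getDouble("amount"));
            }
        }
    }
}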
Uwe
I want to start using Core Data on iPhone with pre-existing MySQL databases. What's the easiest way to transfer a MySQL database to SQLite?
I've tried using SQLite Migrator, but I don't know where to find the ODBC drivers for Mac (Snow Leopard). I found http://www.ch-werner.de/sqliteodbc/ which seems to have drivers, but they are for PowerPC.
If someone could give me a walkthrough, or tell me what the best tools for this are, I'd be grateful.
Thanks.
Perhaps the simplest would be to use mysqldump to dump the raw SQL from your MySQL database into a text file and then use the sqlite3_exec() function to execute that SQL in order to populate the SQLite database.
Have you looked at this Perl script? I haven't used it - just did a quick search for mysql to sqlite migration and it popped right up.
Edit (after you replied to my comment):
The reverse direction is dealt with here.
If you are going to do it repeatedly and the data structure is going to change, maybe you would be better off using something like Django (albeit in a very hackish way). With it, I would:
# These three lines are done once
django-admin.py startproject mymigrationproject
cd mymigrationproject
./manage.py startapp migration

# The following steps you repeat each time you want to migrate the data
# Edit settings.py and make the changes to connect to MySQL
./manage.py inspectdb > ./migration/models.py
# Edit ./migration/models.py to reorder tables (tables that other tables depend on go on top)
mkdir fixtures
./manage.py dumpdata migration > ./fixtures/data.json
# Edit settings.py and make the changes to connect to SQLite
./manage.py syncdb
./manage.py loaddata ./fixtures/data.json
Here is a list of converters:
http://www.sqlite.org/cvstrac/wiki?p=ConverterTools
An alternative method that works nicely but is rarely mentioned: use an ORM class that abstracts the specific database differences away for you. You get these in PHP (RedBean), Python (Django's ORM layer, Storm, SQLAlchemy), Ruby on Rails (ActiveRecord), and Cocoa (Core Data).
i.e. you could do this:
Load data from the source database using the ORM class.
Store the data in memory or serialize it to disk.
Store the data into the target database using the ORM class.
You can use a trial from http://www.sqlmaestro.com/products/sqlite/datawizard/
It is completely functional for 30 days.
You can get ODBC drivers for Mac OS X from Actual Technologies.
http://www.actualtech.com/
To connect to MySQL you need their ODBC Driver for Open Source Databases:
http://www.actualtech.com/product_opensourcedatabases.php
(Disclaimer: I am the author of SQLite Migrator)
There is a free ETL product that can be used to migrate data from one db to another. Have a look: http://www.talend.com/index.php
Good luck!
To do my conversions, I ended up using an ODBC driver from Actual Technologies. I think I used it in combination with SQLite Migrator. I never liked this approach, though; it was always clunky. It was expensive too; it ended up costing about $80 for those two pieces of software.
If I had to do this again, I'd buy SQLiteConverter by SQLabs. I use their SQLite Manager, and although it has a lot of interface problems, for database software it's not bad.
http://www.sqlabs.net/sqliteconverter.php