I have an existing MongoDB database that was created and is managed using PyMongo. I would like to switch to MongoEngine and use it from now on with the existing data.
What would be the best way to approach this? Is it possible to move the current DB structure to an ODM-based approach using MongoEngine?
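MongoEngine can generally sit on top of data that PyMongo already wrote, since documents can be mapped onto existing collections rather than migrated. A minimal sketch of the idea, assuming a hypothetical users collection with name and email fields (the connection URI, collection name, and fields are illustrative, not taken from your schema):

```python
from mongoengine import Document, StringField, connect

# Connect to the existing database (same URI PyMongo has been using).
connect(host="mongodb://localhost:27017/mydb")  # hypothetical URI

class User(Document):
    # Field names must match the keys already stored by PyMongo.
    name = StringField()
    email = StringField()

    meta = {
        "collection": "users",  # map onto the existing collection
        "strict": False,        # tolerate extra keys the model doesn't define yet
    }

# Reads and writes now go through the ODM, against the same data.
for user in User.objects(name="Alice"):
    print(user.email)
```

The main work is writing Document classes whose fields and collection names line up with what PyMongo stored; setting strict to False lets MongoEngine load documents that carry keys you haven't modelled yet.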
I managed database sharding in a Sequelize/Express app as follows:
Create the connections on boot-up.
Use a hash-key-based (hash ring) algorithm to pick the partition when inserting or querying data,
e.g. model.getPartition(userId).get({...})
Looking for suggestions on how to handle this using NestJS/TypeScript/TypeORM.
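Not a NestJS-specific answer, but the partition lookup itself is framework-independent; in NestJS/TypeORM you would typically register one connection per shard at bootstrap and route to it the same way. A rough, language-agnostic sketch of the hash-based routing in Python (the shard names and the simple hash-mod scheme are assumptions; a real hash ring adds virtual nodes so that resharding moves fewer keys):

```python
import hashlib

# Hypothetical stand-ins for per-shard connections; in a NestJS/TypeORM setup
# these would be the per-shard DataSource/repository objects created at bootstrap.
SHARDS = {
    0: "shard0-connection",
    1: "shard1-connection",
    2: "shard2-connection",
}

def get_partition(user_id: str):
    """Pick a shard deterministically from the user id."""
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    index = int(digest, 16) % len(SHARDS)
    return SHARDS[index]

# Usage, mirroring model.getPartition(userId).get({...}) from the question:
connection = get_partition("user-42")
print(connection)  # the connection/repository to run the query against
```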
I'm using a tool called Teamwork to manage my team's projects.
They have an online API that exposes JSON resources, accessible with authorisation:
https://developer.teamwork.com/projects/introduction/welcome-to-the-teamwork-projects-api
I would like to be able to convert this online data into an SQL database so I can create custom reports for my management.
I can't seem to find anything ready-made to do that.
I need a strategy for doing this.
If you know how to program, this should be pretty straightforward.
In Python, for example, you could:
Come up with a SQL schema that maps to the JSON data objects you want to store. Create it in a database of your choice.
Use the Requests library to download the JSON resources, if you don't already have them on your system.
Convert each JSON resource to a Python data structure using json.loads.
Connect to your database server using the appropriate Python library for your database. e.g., PyMySQL.
Iterate over the Python data, inserting rows into the database as appropriate. This is essentially the JSON-to-tables mapping from step 1 made procedural.
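As a concrete illustration of steps 2 to 5, here is a minimal sketch using Requests, json, and PyMySQL. The endpoint, authentication scheme, table and column names, and JSON field names are assumptions for illustration only; adapt them to the actual Teamwork resources and the schema you design in step 1:

```python
import json

import pymysql
import requests

# Hypothetical endpoint and credentials; adjust to the real Teamwork
# resource and whatever auth scheme your account uses.
API_URL = "https://yourcompany.teamwork.com/projects.json"
API_KEY = "your-api-key"

# Step 2: download the JSON resource.
response = requests.get(API_URL, auth=(API_KEY, "x"))
response.raise_for_status()

# Step 3: convert it to Python data structures.
data = json.loads(response.text)

# Step 4: connect to the database (schema from step 1 assumed to exist).
conn = pymysql.connect(host="localhost", user="reports",
                       password="secret", database="teamwork_reports")

# Step 5: iterate and insert, mapping JSON keys to columns.
with conn.cursor() as cursor:
    for project in data.get("projects", []):  # field names are assumptions
        cursor.execute(
            "INSERT INTO projects (id, name, status) VALUES (%s, %s, %s)",
            (project["id"], project["name"], project["status"]),
        )
conn.commit()
conn.close()
```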
If you are not looking to do this in code, you should be able to use an open-source ETL tool to do the transformation. A coworker of mine at LinkedIn used Talend Data Integration for solid ETL work of a very similar nature (JSON to SQL). He was very fond of it and I respected his opinion, so I figured I should mention it, although I have no experience with it myself.
Is there any extension/tool/script available to import data from an eXist database into a PostgreSQL database automatically?
From the tag description it's pretty clear that you're going to need to use an ETL tool or some custom code. Which is easier depends on the nature of the data and how you want to migrate it.
I'd start by looking at Talend Studio and Pentaho Kettle. See if either of them can meet your needs.
If you can turn the eXist data into structured CSV exports, then you can probably just hand-define tables for it in PostgreSQL and COPY the data in, or use pgloader.
If not, then I'd suggest picking the language you're most familiar with (Python, Java, whatever) and using the eXist and PostgreSQL data connectors for that language. Write a script that fetches data from eXist and feeds it to PostgreSQL. If using Python, I'd use the psycopg2 database connector, as it's fast and supports COPY for bulk data loading.
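For the script route, here is a rough sketch of what that could look like in Python: pulling XML out of eXist (over its REST interface, assuming that is enabled on your instance) and bulk-loading rows into PostgreSQL with psycopg2's COPY support. The eXist URL, credentials, element names, and target table are all illustrative assumptions:

```python
import csv
import io
import xml.etree.ElementTree as ET

import psycopg2
import requests

# Hypothetical eXist REST endpoint and collection; adjust to your instance.
EXIST_URL = "http://localhost:8080/exist/rest/db/mycollection/records.xml"

# Fetch the XML from eXist.
response = requests.get(EXIST_URL, auth=("admin", "password"))
response.raise_for_status()
root = ET.fromstring(response.content)

# Flatten the XML into CSV rows in memory (element names are assumptions).
buffer = io.StringIO()
writer = csv.writer(buffer)
for record in root.findall(".//record"):
    writer.writerow([record.findtext("id"), record.findtext("title")])
buffer.seek(0)

# Bulk-load into PostgreSQL using COPY via psycopg2.
conn = psycopg2.connect(dbname="target", user="postgres",
                        password="secret", host="localhost")
with conn, conn.cursor() as cur:
    cur.copy_expert(
        "COPY records (id, title) FROM STDIN WITH (FORMAT csv)", buffer)
conn.close()
```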
If I use SQLAlchemy's ORM to create objects and store them, does that mean I can pretty much only retrieve the data from the DB via SQLAlchemy? Will the underlying tables created by the SQLAlchemy ORM still be sane? Can I still query the DB directly and get useful results?
The ORM will only create and modify the database records as they're defined. You'll be able to query them just as you normally would; using SQLAlchemy does not limit your normal database access in any way. SQLAlchemy can also log the queries it issues, so you can see exactly what it's doing. It's nothing like HTML generation, where you don't want to look at the HTML that was produced afterwards.
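To make that concrete, here is a small sketch (the table and field names are just for illustration): a table defined through the ORM, written to via a session, then read back with plain SQL; echo=True makes SQLAlchemy log every statement it emits:

```python
from sqlalchemy import Column, Integer, String, create_engine, text
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = "users"  # a perfectly ordinary table
    id = Column(Integer, primary_key=True)
    name = Column(String(50))

# echo=True logs every SQL statement SQLAlchemy issues.
engine = create_engine("sqlite:///example.db", echo=True)
Base.metadata.create_all(engine)

# Store a record through the ORM...
with Session(engine) as session:
    session.add(User(name="Alice"))
    session.commit()

# ...and read it back with raw SQL, bypassing the ORM entirely.
with engine.connect() as conn:
    rows = conn.execute(text("SELECT id, name FROM users")).fetchall()
    print(rows)
```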
I have the following scenario: our desktop application talks to a SQL Server instance on another machine. We are using NHibernate 2.1.2. Now, we want to use SQLite on the client machine to store data that could not be uploaded. For example, if the Order table has not been updated on SQL Server, we want to save it to SQLite and then later try to upload it to SQL Server. So, we are thinking of using NHibernate to store data in SQLite as well. How do I configure NHibernate to achieve this?
Thanks
You will need to create a whole new session/session source. NHibernate cannot simply switch contexts at the push of a button. Your best bet is to spin up a separate repository and session that point at that specific second database.