Is it possible to transfer data to SQL Server from an unconventional database

My company has a SQL Server database which they would like to populate with data from a hierarchical database (as opposed to a relational one). I have already written a .NET application to map its schema to a relational database, and have done that successfully. However, my problem is that the technology being used here is so old that I see no obvious way to transfer the data.
However, I have some ideas about how I could do this. One is to write file scans in my unconventional database and dump the data out as CSV files, then do a bulk upload into SQL Server. I am not keen on this approach because invalid data terminates the bulk upload quite often.
I was hoping to explore options around Service Broker: could I dump out live transactions whenever a record changes in my database and have them picked up somehow?
Secondly, if I dump out live or changed records to a file (I can format the file however it needs to be), is there something that can pull it into SQL Server?
Any help would be greatly appreciated.
Regards,
Waqar

Service Broker is a very powerful queue/messaging management system. I am not sure why you want to use it for this.
You can set up an SSIS job that keeps checking a folder for CSV files and, when it detects a new one, reads it into SQL Server, then zips it and archives it somewhere else. This is very common. SSIS can then either process the data (it's a wonderful ETL tool) or invoke procedures in SQL Server to process it. SSIS is very fast and is rarely overwhelmed, so why would you use Service Broker?
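Whether the load happens in an SSIS data flow or with plain T-SQL, something like the following BULK INSERT is the kind of step such a job could run; the file path, table name, and delimiters are hypothetical, and the error options are there so that bad rows are diverted to an error file instead of terminating the whole upload (the problem mentioned in the question).

    -- Hypothetical staging table and CSV path; adjust delimiters to the file format.
    BULK INSERT dbo.StagingRecord
    FROM 'C:\ImportDrop\records.csv'
    WITH (
        FIRSTROW        = 2,              -- skip the header row
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        MAXERRORS       = 1000,           -- tolerate bad rows rather than aborting
        ERRORFILE       = 'C:\ImportDrop\records.err'
    );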
If it's IMS (mainframe)-type data, you have to convert it to flat tables and then to CSV-type text files for SQL Server to read.
SQL Server is very good at processing XML and, as of SQL Server 2016, JSON-shaped data, so if that is your data format you can import it directly into SQL Server.
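For example, here is a minimal sketch of a JSON import using OPENJSON (available from SQL Server 2016 on); the dbo.Customer table and the JSON shape are hypothetical.

    -- Hypothetical target table and JSON shape; adjust to your schema.
    DECLARE @json NVARCHAR(MAX) =
        N'[{"Id":1,"Name":"Alice"},{"Id":2,"Name":"Bob"}]';

    INSERT INTO dbo.Customer (Id, Name)
    SELECT Id, Name
    FROM OPENJSON(@json)
    WITH (
        Id   INT           '$.Id',
        Name NVARCHAR(100) '$.Name'
    );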

Skip bulk insert. The SQL Server xml data type lends itself to doing what you're doing. If you can output data from your other system into an XML format, you can push that XML directly into an argument for a stored procedure.
After that, you can use the functions for the XML type to iterate through the data as needed.
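A minimal sketch of that pattern, assuming a hypothetical procedure name, target table, and XML shape:

    -- Hypothetical target table dbo.ImportedRecord(Id, Name) and XML shaped like
    -- <records><record id="1"><name>Alice</name></record></records>.
    CREATE PROCEDURE dbo.ImportRecords
        @payload XML
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Shred each <record> element into a relational row.
        INSERT INTO dbo.ImportedRecord (Id, Name)
        SELECT
            r.value('@id', 'INT'),
            r.value('(name)[1]', 'NVARCHAR(100)')
        FROM @payload.nodes('/records/record') AS t(r);
    END;

The other system then only has to produce one XML document per batch and call EXEC dbo.ImportRecords @payload = N'<records>...</records>';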

Related

erasing all data and populating with dummy data

What is the best way to transfer a copy of the database as a backup file for outside maintenance on the application? That copy should not contain any sensitive data; it can only have dummy data. What is the most efficient, best-practice way to erase all data in the tables and populate them with dummy data? (SQL Server 2019)
This is not a trivial task. A 3rd party solution would probably be easiest.
There are several answers available here that discuss copying objects in SQL Server Management Studio. Example: Backup SQL Schema Only?.
If you have access to SQL Server Integration Services, you can copy selected objects using the Transfer SQL Server Objects Task. I tried this once, a long time ago, so I have very little experience to describe how it works.
Another option is to create a job that runs a copy-only backup, restores the database, and then runs a manual series of SQL queries to clear or mask sensitive data.
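A rough sketch of that last option; the database name, file paths, logical file names, and the dbo.Customer table are all hypothetical and need to match your environment.

    -- 1. Take a copy-only backup so the regular backup chain is untouched.
    BACKUP DATABASE SalesDb
        TO DISK = N'D:\Backups\SalesDb_copy.bak'
        WITH COPY_ONLY, INIT;

    -- 2. Restore it under a different name.
    RESTORE DATABASE SalesDb_Scrubbed
        FROM DISK = N'D:\Backups\SalesDb_copy.bak'
        WITH MOVE N'SalesDb'     TO N'D:\Data\SalesDb_Scrubbed.mdf',
             MOVE N'SalesDb_log' TO N'D:\Data\SalesDb_Scrubbed_log.ldf';

    -- 3. Overwrite sensitive columns with dummy values before handing the copy over.
    UPDATE SalesDb_Scrubbed.dbo.Customer
    SET Name  = CONCAT('Customer ', CustomerId),
        Email = CONCAT('customer', CustomerId, '@example.com');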

Use a script to export database Schema and Data into SQL file

I am looking for a means to programmatically pass in a database name and have it return the schema and data as a SQL file. We work with several clients who have complex database configurations, and we would like to be able to back up those configurations for our records. The people doing the backups would not know how to use a tool such as SQL Server Management Studio, and the chance of producing erroneous SQL files would be higher.

Performance moving data from Postgres to SQL Server via SSIS

I have several large SQL queries that I need to run against a Postgres data source. I am using SSIS on SQL Server 2008 R2 to move the data. Because of the way our system is set up, I have to use a tunnel via PuTTY and set up local port redirection.
In the SSIS package, I am using ADO.NET source and destination. I have PostgreSQL drivers installed, and we were able to get the 32-bit version working. My package runs, I am getting the data, but the data transformation tasks run painfully slow ... about 2,000 records per second.
Does anyone have experience pulling data from Postgres with static queries and dumping the results into SQL Server? Any tips / best practices?
You should try to pull the data and store it in an SSIS raw file.
Then do your transformations, and whatever else you need, on the raw file data.
After that, send it on to the destination database.
In general, try not to make many calls to the database.

how to create a SQL database from a strongly typed dataset

I'm looking for an easy way to transfer a database schema I have developed inside Visual Studio as a strongly typed dataset (xsd file) into a corresponding SQL Server database. Silly me, I assumed the process would be straightforward, but I can't find out how to do it. I assume I could duplicate the tables column by column, but that seems error prone. Does anyone know of a way to perform a schema transfer like this? Maybe a tool to translate the xsd file into a corresponding SQL Server DDL file?
Final thought: once I have the schema transferred, moving data around between the two data stores will be straightforward; it's just getting the schemas synced that has me stumped...
Thanks,
Keith
Why didn't you implement your data model directly in SQL Server?! That is the more common, more engineered approach, and I think this is why Microsoft has not provided any wizard or tool for this case. You can also keep your data model as scripts or .sql files, manage them via SVN, and use them whenever you need to implement the model.
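A minimal sketch of that approach, with a hypothetical table: keep each object as a version-controlled .sql file and run it whenever the database needs to be (re)built.

    -- Customer.sql -- hypothetical table definition kept under SVN.
    IF OBJECT_ID(N'dbo.Customer', N'U') IS NULL
    BEGIN
        CREATE TABLE dbo.Customer
        (
            CustomerId INT IDENTITY(1, 1) PRIMARY KEY,
            Name       NVARCHAR(100) NOT NULL,
            Email      NVARCHAR(256) NULL
        );
    END;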

Will SSIS work well for importing to multiple tables?

I won't have access to SSIS until tomorrow so I thought I'd ask for advice before I start work on this project.
We currently use Access to store our data. It's not stored in a relational format so it's an awful mess. We want to move to a centralized database (SQL Server 2008 R2), which would require rewriting much of our codebase (which, incidentally, is also an awful mess.) Due to a time constraint, well before that can be done we are going to need to get a centralized database set up solely for the purpose of on-demand report generation for a client. So, our applications will still be running on Access. Instead of:
Receive data -> Import to Access initial file with one table -> Data processing -> Access result file with one table -> Report generation
The goal is:
Receive data -> Import to Access initial file with one table -> Import initial data to multiple tables in SQL Server -> Export Access working file with one table -> Data processing -> Access result file -> Import result to multiple tables in SQL Server -> Report generation whenever
We're going to use SSRS for the reporting component, which seems like it'll be straightforward enough. I'm not sure if SSIS alone would work well for splitting the Access data up into numerous tables, or if everything should be imported into a staging table with SSIS and then split up with stored procedures, or if I'll need to be writing a standalone application for this.
Haven't done much of any work with SQL Server before, so any advice is appreciated.
In an SSIS package, you can write code (e.g., C#) to do your own custom data transformations. However, SSIS comes with built-in transformations that may be good enough for your needs. SSIS is very powerful and flexible; you can do pretty much anything you want with the data in it.
The high-level workflow for your task could look like the following:
1. Connect to the data source and pull the data
2. Transform the data
3. Output data to the destination data source
You certainly can split a data flow into two separate branches and send it to two destinations. All you need to do is put a Multicast transformation in the data flow, and the bulk of the transformations will happen after that.
From what you've said, however, a better solution might be to use the Access tables as a staging database and then grab the data from there and send it to SQL Server. That would mean two data flows, but it would be a cleaner implementation.
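If you go the staging-table route mentioned in the question (load everything into one wide staging table with SSIS, then split it with stored procedures), a minimal sketch of such a procedure could look like this; the staging and target tables are hypothetical.

    -- Hypothetical staging table loaded by SSIS and two normalized targets.
    CREATE PROCEDURE dbo.SplitStagingData
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Customers: one row per distinct customer found in the staging data.
        INSERT INTO dbo.Customer (CustomerCode, CustomerName)
        SELECT DISTINCT s.CustomerCode, s.CustomerName
        FROM dbo.Staging AS s
        WHERE NOT EXISTS (SELECT 1 FROM dbo.Customer AS c
                          WHERE c.CustomerCode = s.CustomerCode);

        -- Orders: reference the customer rows inserted above.
        INSERT INTO dbo.[Order] (CustomerId, OrderDate, Amount)
        SELECT c.CustomerId, s.OrderDate, s.Amount
        FROM dbo.Staging AS s
        JOIN dbo.Customer AS c ON c.CustomerCode = s.CustomerCode;

        TRUNCATE TABLE dbo.Staging;
    END;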