Is There a Way to Append Deleted and Updated Data to a History Table - SQL

Alright, so I am working on a project at work, and I need to append data to a new history table every time the data in our other table is updated or deleted. However, we get access to our SQL tables from another company; they only gave us read-only privileges, and we can only view the tables through Microsoft Power BI and Excel.
So I wanted to see if there was any way of creating a trigger of some sort.
Thank You

From your question, you are trying to do an incremental load of data, so you can append new data to a table, and you are also looking for some sort of archive process that moves data to a history table via a trigger. Incremental loads are a Power BI Premium-only feature, and moving data based on a trigger is not supported in Power BI at all.
I would recommend trying to get better access to the SQL Server. Failing that, use Excel to get the data, dump it into Excel/CSV files, then create a process in some other database/ETL tool to load the new file(s) and figure out the changes, and finally output the results to a file/table that Power BI can read from.
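As a rough sketch of the "figure out the changes" step, assuming each extract is bulk-loaded into staging tables (all table and column names below are hypothetical), a diff like this could append updated and deleted rows to the history table:

    -- Compare the previous snapshot against the current extract and
    -- append anything that changed or disappeared to the history table.
    INSERT INTO dbo.OrderHistory (OrderId, Amount, ChangeType, CapturedAt)
    SELECT p.OrderId,
           p.Amount,
           CASE WHEN c.OrderId IS NULL THEN 'DELETED' ELSE 'UPDATED' END,
           GETDATE()
    FROM   dbo.snapshot_previous AS p
    LEFT JOIN dbo.stage_current AS c
           ON c.OrderId = p.OrderId
    WHERE  c.OrderId IS NULL        -- row gone from the new extract: deleted
       OR  c.Amount <> p.Amount;    -- row still there but changed: updated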
Hope that helps

Related

What's the optimal way of updating data using an Excel file

This is a question about how you would go about tackling this kind of task.
Every week or so, a client delivers an Excel file whose contents need to be uploaded to their CRM package. It's always something different. For instance, right now it's a list of all of their product barcodes and the current stock; they want us to update the stock of all of their products this one time.
Since it's always something different that a client requires, we haven't taken the time to automate this yet (there are other priorities), so we've been doing it by hand. We have already automated the most frequently received requests.
What we do now when such a request comes in is find the table the data belongs to in the database, and then use Excel to create INSERT or UPDATE SQL scripts that we can copy-paste into SSMS to execute.
The way I do it is by first writing my INSERT statement in one cell, and then using Excel functions on each row of data to concatenate that statement with the values in that row.
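For example, for the barcode/stock case each spreadsheet row ends up generating a line like this (table name and values are hypothetical):

    INSERT INTO Products (Barcode, Stock) VALUES ('8710123456789', 14);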
This is quite error-prone and time-consuming, and I was wondering if anyone can offer tips on what they would do. How would you handle a request like that? Is there a quicker way of doing it that you can think of?
Mind you: it's always a different request. Today it has to do with products; tomorrow it could be a list of VAT numbers that they want uploaded so all of their clients have the correct VAT number.
I'm very curious how you would handle this.
Since the request is not about automation, I can suggest an alternative solution which is still manual but requires less work.
If you are using a database access tool like TOAD or SQL Developer, there is a facility to import data directly from Excel.
What you can do is import the data into a staging table in a separate schema, in production or any other database, and then use SQL queries for any data massaging and for updating the target table.
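For instance, once the spreadsheet is sitting in a staging table, the barcode/stock request above boils down to a single UPDATE, here in SQL Server syntax (all names hypothetical):

    -- Update live stock figures from the rows imported out of Excel.
    UPDATE p
    SET    p.Stock = s.Stock
    FROM   dbo.Products AS p
    JOIN   staging.ProductStock AS s
           ON s.Barcode = p.Barcode;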
Here are two sample threads:
How to import excel data into Toad 9.5 table
SQL Developer for importing from Excel
Note: the threads refer to Oracle databases, but it's no different for MS SQL; the ability of the tool remains the same.

Loading data regularly from ServiceNow to Pentaho Kettle

I'm working on a BI project and I want to retrieve data from ServiceNow and load it into Pentaho Data Integration so I can record it in my data warehouse. I want to do this regularly; in other words, I want to retrieve only the new records from ServiceNow, the ones that haven't been loaded to the data warehouse yet. Does anyone know how I can achieve this? Help me please.
The question is too vague.
You need to set up an ETL job that incrementally loads data. That will require you to define a timestamp or incremental key to identify which records are more recent than the ones already loaded.
You will need to schedule that job, e.g., using crontab and calling kitchen from the command line.
Your question pretty much translates to "please develop my ETL project". Too wide in scope.
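That said, the extract step usually boils down to a query like the one below, assuming the source table exposes ServiceNow's standard sys_updated_on timestamp and you store the last-loaded value yourself (the ${LAST_LOAD} variable is hypothetical and would be filled in by the job):

    -- Pull only the rows changed since the last successful load.
    SELECT *
    FROM   incident
    WHERE  sys_updated_on > '${LAST_LOAD}'
    ORDER  BY sys_updated_on;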

Copy datarow from relational database along with its relational data

Using SQL Server 2005/2008.
Is there a way to copy a particular data row from one relational database to another similar database, along with its relational data? That is, all the other rows (FK targets) from this database that need to exist for this particular row to be valid in the destination database should also be copied.
I basically need to copy a particular scenario from the production box to the test box, without copying the entire tables/databases over to the test systems.
I can definitely find all the FKs for this table and copy the data manually, but that is too time-consuming when the base tables keep changing.
Are there any tools/generic queries that can help me move a data scenario from one relational database to another?
I would use SSIS (SQL Server Integration Services) to transfer data from one database to another. It's scriptable and can be automated and scheduled if necessary.
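If you end up hand-rolling it instead, the core pattern is simply to insert parent rows before child rows so the FK constraints are satisfied, e.g. across a linked server (all server, database, table, and column names here are hypothetical):

    -- Copy the referenced parent row first (skip it if already present)...
    INSERT INTO TestBox.Sales.dbo.Customers (CustomerId, Name)
    SELECT c.CustomerId, c.Name
    FROM   ProdBox.Sales.dbo.Customers AS c
    WHERE  c.CustomerId = 42
      AND NOT EXISTS (SELECT 1
                      FROM   TestBox.Sales.dbo.Customers AS t
                      WHERE  t.CustomerId = c.CustomerId);

    -- ...then the row that makes up the scenario itself.
    INSERT INTO TestBox.Sales.dbo.Orders (OrderId, CustomerId, Amount)
    SELECT o.OrderId, o.CustomerId, o.Amount
    FROM   ProdBox.Sales.dbo.Orders AS o
    WHERE  o.OrderId = 1001;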
Here is my attempt to solve this problem, which I have open-sourced:
Data Transfer Application

Queries for migrating data in a live database?

I am writing code to migrate data from our live Access database to a new SQL Server database which has a different schema with a reorganized structure. This SQL Server database will be used with a new version of our application that is in development.
I've been writing migration code in C# that calls SQL Server and Access and transforms the data as required. The first time I migrated a table whose entries are related to new entries in another table that I had not migrated recently, it caused an error because the corresponding record in SQL Server could not be found.
So, my SQL Server production table has data only up to 1/14/09, and I'm continuing to migrate more tables from Access. I want to write an update method that can figure out what is new in Access and hasn't been reflected in SQL Server yet.
My current idea is to write a query on the SQL Server side that does SELECT MAX(RunDate) FROM ProductionRuns, to give me the latest date in that field in the table. On the Access side, I would write a query that does SELECT * FROM ProductionRuns WHERE RunDate > ?, where the parameter is that max date found in SQL Server; then I'd perform my translation step in code and insert the new data into SQL Server.
What I'm wondering is, do I have the syntax right for getting the latest date in that Sql Server table? And is there a better way to do this kind of migration of a live database?
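In code, the plan above looks something like this (the first query runs against SQL Server, the second against Access with the parameter supplied from C#):

    -- SQL Server side: find the high-water mark.
    SELECT MAX(RunDate) FROM ProductionRuns;

    -- Access side: fetch everything newer than that mark.
    SELECT * FROM ProductionRuns WHERE RunDate > ?;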
Edit: What I've done is make a copy of the current live database, which I can migrate without worrying about changes and use for testing during development; then I can migrate the latest data whenever the new database and application go live.
I personally would divide the process into two steps:
1. Create an exact copy of the Access DB in SQL Server and copy all the data.
2. Copy the data from this temporary SQL Server DB to your destination database.
That way, you can write a set of SQL code to accomplish the second step.
Alternatively, use SSIS.
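A minimal sketch of step 2, assuming the temporary copy lives in a database called AccessMirror and the transformation can be expressed in SQL (all names hypothetical):

    -- Move only the rows not yet migrated, transforming along the way.
    INSERT INTO NewDb.dbo.ProductionRuns (RunId, RunDate, OperatorName)
    SELECT r.Id,
           r.RunDate,
           LTRIM(RTRIM(r.Operator))   -- example transformation step
    FROM   AccessMirror.dbo.ProductionRuns AS r
    LEFT JOIN NewDb.dbo.ProductionRuns AS existing
           ON existing.RunId = r.Id
    WHERE  existing.RunId IS NULL;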
Generally, when you convert data to a new database that will take its place in production, you shut all users out of the database for a period of time, run the migration, and turn on the new database. This ensures no changes are made to the data while doing the conversion. Of course, I never would have done this using C# either; data migration is a database task and should have been done in SSIS (or DTS if you have an older version of SQL Server).
If the database you are converting to is just in development, I would create a backup of the Access database and load the data from there, to test the data-loading process and to get the data in so you can do the application development. Then, when it is time to do the real load, you just close down the real database to users and load from it. If you are trying to keep both in sync while you develop, well, I wouldn't do that, but if you must, make a nightly backup of the file and load it first thing in the morning using your process.
You may want to look at investing in a tool like SQL Data Compare.
I believe it has support for Access databases too, and you can download a trial.
If you are happy with your C# code but it fails because of the constraints in your destination database, you can temporarily disable them and then re-enable them after you copy the whole lot.
I am assuming that your destination database is a brand-new DB with no data, not used by anyone while the transfer happens.
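For reference, disabling and re-enabling constraints on a table looks like this in T-SQL (dbo.Orders is a placeholder; repeat per table or loop over the tables):

    -- Disable foreign key and check constraints before the bulk copy...
    ALTER TABLE dbo.Orders NOCHECK CONSTRAINT ALL;

    -- ...and re-enable them afterwards, re-validating the copied rows.
    ALTER TABLE dbo.Orders WITH CHECK CHECK CONSTRAINT ALL;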
It sounds like you have two problems:
You're migrating data from one database to another.
You're changing your schema.
Doing either of these things is tricky if you are trying to migrate the data while people are using the data.
The simplest approach is to migrate the data based on a static copy of the data, and to queue updates to that data from the moment you capture the static copy. I don't know how easy this is in Access, but in SQL Server or Oracle you can use the redo logs for this, or a manual solution using triggers. The poor man's way of doing this is to create triggers on all the relevant tables that log the primary key of each record that changes. Then, after the old database is shut off, you can iterate over those keys, get those records from the old database, and put them into the new database. Just copy the whole record; if the record was deleted, delete it from the new database.
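A poor man's change-log trigger along those lines might look like this in T-SQL, for a database that supports triggers (table, column, and trigger names are hypothetical):

    -- One log table shared by all audited tables.
    CREATE TABLE dbo.ChangeLog (
        TableName sysname  NOT NULL,
        RecordId  int      NOT NULL,
        ChangedAt datetime NOT NULL DEFAULT GETDATE()
    );
    GO

    -- Record the primary key of every inserted, updated, or deleted row.
    CREATE TRIGGER trg_Orders_LogChanges
    ON dbo.Orders
    AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- inserted covers INSERT/UPDATE; deleted covers UPDATE/DELETE.
        INSERT INTO dbo.ChangeLog (TableName, RecordId)
        SELECT 'Orders', OrderId FROM inserted
        UNION
        SELECT 'Orders', OrderId FROM deleted;
    END;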
Your problem is compounded by the fact that you can't simply copy the data, you have to transform it. This means you probably have to shut down both databases and re-migrate the records based on the change list. It will take a lot of planning to ensure you get things right and I'd recommend writing a testing script that can validate that the resulting data is correct.
Also I'd ensure that the code for the migration runs inside one of the databases if possible. Otherwise you are copying the data twice and this will significantly harm the performance.

Create a database from another database?

Is there an automatic way in SQL Server 2005 to create a database from several tables in another database? I need to work on a project and I only need a few tables to run it locally, and I don't want to make a backup of a 50 gig DB.
UPDATE
I tried Tasks -> Export Data in Management Studio, and while it created a new database with the subset of tables I wanted, it did not copy over any table metadata, i.e., no PK/FK constraints and no identity data (even with Preserve Identity checked).
I obviously need these for it to work, so I'm open to other suggestions. I'll try that database publishing tool.
I don't have Integration Services available, and the two SQL Servers cannot directly connect to each other, so those are out.
Update of the Update
The Database Publishing Tool worked. The SQL it generated was slightly buggy, so a little hand editing was needed (it tried to reference nonexistent triggers), but once I fixed that I was good to go.
You can use the Database Publishing Wizard for this. It will let you select a set of tables with or without the data and export it into a .sql script file that you can then run against your other db to recreate the tables and/or the data.
Create your new database first. Then right-click on it and go to the Tasks sub-menu in the context menu. You should have some kind of import/export functionality in there. I can't remember exactly since I'm not at work right now! :)
From there, you will get to choose your origin and destination data sources and which tables you want to transfer. When you select your tables, click on the advanced (or options) button and select the check box called "preserve primary keys". Otherwise, new primary key values will be created for you.
I know this method can hardly be called automatic, but why don't you use a few simple SELECT INTO statements?
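For example (database/table names hypothetical):

    -- Creates NewDb.dbo.Products from scratch and copies all rows into it;
    -- note it brings over no keys, constraints, or indexes.
    SELECT *
    INTO   NewDb.dbo.Products
    FROM   BigDb.dbo.Products;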
Because I'd have to reconstruct the schema, constraints, and indexes first. That's the part I want to automate; getting the data is the easy part.
Thanks for your suggestions everyone, looks like this is easy.
Integration Services can help accomplish this task. The tool provides advanced data transformation capabilities, so you will be able to get the exact subset of data that you need from the large database.
Assuming the data is needed for testing/debugging, you may consider applying Row Sampling to reduce the amount of data exported.
Create the new database.
Right-click on it.
Tasks -> Import Data.
Follow the instructions.