I've seen a few questions like this asked but can't seem to find a good solution. Let's say I have a DataSet with 20 tables (all with the same schema, just different row data). I have a stored procedure to populate a single SQL table with the data from these tables, but so far the only way I've been able to do this is to iterate through the tables in the DataSet and send each one to SQL via the stored procedure, one at a time, until all of them have made it into the target table.
Is there a way to do this without iterating through each table in the DataSet but still using a stored procedure?
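For reference, one commonly suggested way to avoid the per-table calls is a table-valued parameter, so that a single stored procedure call carries all the rows; the client side would merge the DataSet's 20 tables into one DataTable and pass it as the parameter. A minimal T-SQL sketch, with the type, column, and table names all hypothetical:

-- Hypothetical table type matching the shared schema of the 20 tables.
CREATE TYPE dbo.RowBatch AS TABLE (
    Id    int            NOT NULL,
    Name  varchar(100)   NULL,
    Value decimal(18, 2) NULL
);
GO

CREATE PROCEDURE dbo.InsertRowBatch
    @Rows dbo.RowBatch READONLY
AS
BEGIN
    SET NOCOUNT ON;

    -- One call moves every row from all 20 tables into the target.
    INSERT INTO dbo.TargetTable (Id, Name, Value)
    SELECT Id, Name, Value
    FROM   @Rows;
END;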
I'm looking for a way to speed up the following process: I have an SSIS package that loads data from Excel files to SQL Server on a weekly basis. There are 3 fields: Brand, Date, Value.
In the data flow, I check for existing combinations of Brand+Date; new combinations go to the table directly, while the existing ones go to a Recordset destination for updates.
The next step is to update the Value of the existing combinations.
There are thousands of records to update, and it takes too long. The number of records tends to grow week by week. Please suggest a faster approach.
The fastest way will be to do this inside a stored procedure using an ELT (Extract, Load, Transform) approach.
Push all the data from Excel as-is into a table (in ELT terms, loading to a staging table). Since you do not seem to be concerned with data validation steps, this table can be a replica of the final destination table's columns.
The next step is to call a stored procedure using an Execute SQL Task. Inside this procedure you can put all your business logic. Since this step manipulates data natively on SQL Server entities, it is the fastest alternative.
As a last step, delete all entries from the staging table.
You can use indexes on the staging table to make the stored procedure part even faster.
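A minimal sketch of such a procedure, assuming a staging table dbo.BrandValueStaging and a destination table dbo.BrandValue (both names are placeholders for your actual schema):

-- Hypothetical names; adjust to your actual staging and destination tables.
CREATE PROCEDURE dbo.MergeBrandValues
AS
BEGIN
    SET NOCOUNT ON;

    -- Update Value for Brand+Date combinations that already exist.
    UPDATE d
    SET    d.Value = s.Value
    FROM   dbo.BrandValue AS d
    JOIN   dbo.BrandValueStaging AS s
           ON s.Brand = d.Brand AND s.[Date] = d.[Date];

    -- Insert the combinations that are new.
    INSERT INTO dbo.BrandValue (Brand, [Date], Value)
    SELECT s.Brand, s.[Date], s.Value
    FROM   dbo.BrandValueStaging AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dbo.BrandValue AS d
                       WHERE  d.Brand = s.Brand AND d.[Date] = s.[Date]);

    -- Clear the staging table for the next weekly load.
    TRUNCATE TABLE dbo.BrandValueStaging;
END;

An index on BrandValueStaging (Brand, Date) would speed up both the join and the NOT EXISTS check.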
I am trying to create a framework in SQL (tables, mainly) which could help me segregate SQL data dynamically based on user roles.
e.g. a user with role A should have access to data from country XYZ.
I have a bunch of stored procedures which fetch different attributes of the data, and now I am trying to update the stored procedures in such a way that they need to be modified only once.
I might get different filter criteria in the future, so I am trying to create a matrix of filter conditions which could be read in the stored procedures dynamically to filter the data.
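A minimal sketch of such a filter matrix, with every table, column, and role name invented for illustration:

-- Hypothetical filter matrix: one row per (role, column, allowed value).
CREATE TABLE dbo.RoleFilter (
    RoleName     varchar(50)  NOT NULL,
    FilterColumn varchar(50)  NOT NULL,   -- e.g. 'Country'
    FilterValue  varchar(100) NOT NULL    -- e.g. 'XYZ'
);

INSERT INTO dbo.RoleFilter (RoleName, FilterColumn, FilterValue)
VALUES ('A', 'Country', 'XYZ');

-- Each stored procedure then filters against the matrix instead of
-- hard-coding the criteria:
CREATE PROCEDURE dbo.GetCustomerData
    @RoleName varchar(50)
AS
BEGIN
    SET NOCOUNT ON;

    SELECT c.*
    FROM   dbo.Customer AS c              -- hypothetical data table
    WHERE  c.Country IN (SELECT f.FilterValue
                         FROM   dbo.RoleFilter AS f
                         WHERE  f.RoleName = @RoleName
                           AND  f.FilterColumn = 'Country');
END;

With this shape, a new allowed country for a role is just a new row in RoleFilter; only a genuinely new kind of filter column would require touching the procedures again (or generating the predicate with dynamic SQL).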
I am developing a report in SSRS.
My report has around 50 row headers. The data for each row header is the result of a complex query to the database.
Two row headers may or may not have data that relates to each other.
In this case what would be the best way to create the report?
-- Do we create a procedure that gets all the data into a temporary table and then generate the report using this temp table?
-- Or do we create multiple datasets for this report?
Please advise on what would be the best way to proceed.
I read somewhere about using a linked server, wherein data is retrieved from the PostgreSQL database (the project uses a PostgreSQL db) into the local SQL Server that SSRS provides.
The report then retrieves data from the local SQL Server to generate the report.
Thoughts?
You are best off using a stored procedure as the source. It is easy to optimize a stored procedure to get the best performance, so the report runs fast.
Assembling all your data so that you can use a single dataset to represent it would be the way to go.
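A minimal sketch of that pattern, with all object names hypothetical: the procedure gathers each row header's complex query into one temp table and returns a single result set, which becomes the report's single dataset.

CREATE PROCEDURE dbo.GetReportData
AS
BEGIN
    SET NOCOUNT ON;

    -- One common shape for all ~50 row headers.
    CREATE TABLE #Report (
        RowHeader varchar(100)   NOT NULL,
        Metric    varchar(100)   NOT NULL,
        Value     decimal(18, 2) NULL
    );

    -- One INSERT per row header; each wraps its own complex query.
    INSERT INTO #Report (RowHeader, Metric, Value)
    SELECT 'Row header 1', MetricName, MetricValue
    FROM   dbo.ComplexSource1;            -- hypothetical source query

    INSERT INTO #Report (RowHeader, Metric, Value)
    SELECT 'Row header 2', MetricName, MetricValue
    FROM   dbo.ComplexSource2;            -- hypothetical source query

    -- ...and so on for the remaining row headers.

    -- Single result set -> single SSRS dataset.
    SELECT RowHeader, Metric, Value
    FROM   #Report;
END;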
I have a table with 103 columns and more than 100,000 records. I was told that the Oracle database has bad data in a few of the columns (four columns) and in a few of the records (approx. 20,000), and I was told to fix that data. I was also given a query that identifies the bad data. With the help of that query, I exported all the bad data to Excel and corrected it using macros.
How do I replace the existing bad data in the Oracle database with the good data I now have in Excel? Is it possible to modify only part of the data using a SQL query? I mean that 4 out of 103 columns and 20k out of 100k records have to be modified without affecting the good data already in the Oracle database.
I am using SQL Developer and Oracle 11g.
My query to retrieve the bad data:
select e.id_number
     , e.gender_code
     , e.pref_mail_name
     , e.prefix
     , e.first_name
     , e.last_name
     , e.spouse_name
     , e.spouse_id_number
     , e.pref_jnt_mail_name1
     , e.pref_jnt_mail_name2
from advance.entity e
where e.person_or_org = 'P'
  and (ascii(e.spouse_name) <> 32
       or ascii(e.spouse_id_number) <> 32)
Note: I am not changing any primary or secondary key data; the bad data is in other columns.
You can start by importing all the corrected data (which now resides in your Excel file) into a temporary table. See Load Excel data sheet to Oracle database for more details.
Then it should be a simple task to write one or more UPDATE statements. If you are new to Oracle syntax, Tech on the Net has some nice examples.
Also, do not forget to back up the original table to a temporary table before you make any changes. That way, if you mess up the data repair, you can start over.
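A minimal sketch of that sequence, assuming the corrected rows were loaded into a staging table named entity_fix keyed on id_number, and that the four bad columns are the spouse/joint-name fields from the query above (all of this is an assumption to adapt):

-- Back up the original table first, so a botched repair can be undone.
CREATE TABLE entity_backup AS
SELECT * FROM advance.entity;

-- Overwrite only the four bad columns, and only for rows that have a
-- corrected counterpart in the staging table.
UPDATE advance.entity e
SET    (e.spouse_name, e.spouse_id_number,
        e.pref_jnt_mail_name1, e.pref_jnt_mail_name2) =
       (SELECT f.spouse_name, f.spouse_id_number,
               f.pref_jnt_mail_name1, f.pref_jnt_mail_name2
        FROM   entity_fix f
        WHERE  f.id_number = e.id_number)
WHERE  EXISTS (SELECT 1
               FROM   entity_fix f
               WHERE  f.id_number = e.id_number);

COMMIT;

Because the UPDATE is keyed on id_number and touches only the four listed columns, the other 99 columns and the untouched rows are left exactly as they were.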
I would like to have a stored procedure that returns 3 result sets. These results are all the data I have about a person: the orders he made, his favorite products, and his personal details. I would like to take these 3 result sets and save them in 3 different tables in the destination database. Is it possible to do this, or do I have to do it one by one?
I was thinking of maybe using a Recordset destination, but I don't know how to read from each table in the recordset into a different table in the destination database.
Thanks
You'll need to use a script task to add a layer between your SP and your tables.
http://www.codeproject.com/Articles/32151/How-to-Use-a-Multi-Result-Set-Stored-Procedure-in
If the tables are in the same DB as your SP, you might just want to write directly to the tables.
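If that is an option, a minimal T-SQL sketch (all table and column names hypothetical): instead of returning three result sets to SSIS, the procedure writes each one straight into its destination table.

CREATE PROCEDURE dbo.SavePersonData
    @PersonId int
AS
BEGIN
    SET NOCOUNT ON;

    -- Result set 1: the person's orders.
    INSERT INTO dbo.DestOrders (PersonId, OrderId, OrderDate)
    SELECT o.PersonId, o.OrderId, o.OrderDate
    FROM   dbo.Orders AS o
    WHERE  o.PersonId = @PersonId;

    -- Result set 2: the person's favorite products.
    INSERT INTO dbo.DestFavoriteProducts (PersonId, ProductId)
    SELECT f.PersonId, f.ProductId
    FROM   dbo.FavoriteProducts AS f
    WHERE  f.PersonId = @PersonId;

    -- Result set 3: the person's details.
    INSERT INTO dbo.DestPersonDetails (PersonId, Name, Email)
    SELECT p.PersonId, p.Name, p.Email
    FROM   dbo.Persons AS p
    WHERE  p.PersonId = @PersonId;
END;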