I am trying to create a framework in SQL (mainly tables) that would help me segregate SQL data dynamically based on user roles.
e.g. a user with role A should only have access to data from country XYZ.
I have a bunch of stored procedures that fetch different attributes of the data, and I am trying to update them in such a way that each stored procedure needs to be modified only once.
I might get different filter criteria in the future, so I am trying to create a matrix of filter conditions that the stored procedures can read dynamically to filter the data.
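A minimal sketch of such a filter matrix, assuming hypothetical table and column names (`dbo.RoleDataFilter`, `dbo.Customer`, `Country`) that would need to match your actual schema:

```sql
-- Hypothetical filter matrix: one row per (role, column, allowed value).
CREATE TABLE dbo.RoleDataFilter (
    RoleName     sysname       NOT NULL,
    FilterColumn sysname       NOT NULL,  -- column of the target table to filter on
    FilterValue  nvarchar(100) NOT NULL   -- allowed value for that column
);

-- e.g. role A may only see country XYZ
INSERT INTO dbo.RoleDataFilter VALUES (N'RoleA', N'Country', N'XYZ');

-- Sketch of a procedure that assembles its WHERE clause from the matrix,
-- so adding a new filter criterion needs no code change.
CREATE PROCEDURE dbo.GetCustomerData
    @RoleName sysname
AS
BEGIN
    DECLARE @where nvarchar(max) = N'';

    SELECT @where = @where
        + CASE WHEN @where = N'' THEN N'' ELSE N' AND ' END
        + QUOTENAME(FilterColumn) + N' = '
        + QUOTENAME(FilterValue, '''')      -- quote the literal safely
    FROM dbo.RoleDataFilter
    WHERE RoleName = @RoleName;

    DECLARE @sql nvarchar(max) =
        N'SELECT * FROM dbo.Customer'
        + CASE WHEN @where = N'' THEN N'' ELSE N' WHERE ' + @where END;

    EXEC sys.sp_executesql @sql;
END;
```

Using `QUOTENAME` on both the column name and the value keeps the dynamic SQL safe against injection from the matrix contents.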
I'm looking for an efficient way to control row-level and column-level access, given that user and user-group metadata is stored in the database. Our application uses Entity Framework, and we have to ensure that all code access to a record and its columns is filtered based on the user's access to the requested data.
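One building block worth considering here is SQL Server Row-Level Security (SQL Server 2016+ and Azure SQL Database), which filters rows inside the engine so Entity Framework queries are restricted transparently. A minimal sketch, assuming a hypothetical `dbo.UserAccess` mapping table and an `Orders` table with a `CountryCode` column:

```sql
-- Predicate function: a row is visible only if the current database user
-- is mapped to the row's CountryCode in the (hypothetical) UserAccess table.
CREATE FUNCTION dbo.fn_RowAccessPredicate (@CountryCode nvarchar(10))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS AccessGranted
    FROM dbo.UserAccess ua
    WHERE ua.UserName = USER_NAME()
      AND ua.CountryCode = @CountryCode;
GO

-- Attach the predicate to the table; every query, including those issued
-- by Entity Framework, is filtered automatically.
CREATE SECURITY POLICY dbo.RowAccessPolicy
    ADD FILTER PREDICATE dbo.fn_RowAccessPredicate(CountryCode)
    ON dbo.Orders
    WITH (STATE = ON);
```

Row-Level Security only covers rows; column-level access would still need to be handled separately, e.g. by granting `SELECT` on explicit column lists or exposing views.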
Our challenge is the following:
In an Azure SQL database, we have multiple tables named table_num, where num is just an integer. These tables are created dynamically, so the number of tables can vary (from table_1 and table_2 up to table_N). All tables have the same columns.
As part of a U-SQL script, we would like to execute the same query on all of these tables and generate an output CSV file with the combined results of all these queries.
We tried several things :
U-SQL does not allow looping, so we considered creating a view in our Azure SQL database that would combine all the tables using a cursor of some sort. The U-SQL script would then query this view (as an external source). However, a view in Azure SQL Database can only be created over a function, and a function cannot execute dynamic SQL or even call a stored procedure...
We did not find a way to call a stored procedure of the external data source directly from U-SQL.
We don't want to update our U-SQL job each time a new table is added...
Is there a way to do this in U-SQL, through a custom extractor for instance? Any other ideas?
One solution I can think of is to use Azure Data Factory (v2) to assist in this.
You could create a pipeline with the following activities:
Lookup activity configured to execute the stored procedure
For Each activity that uses the output of the lookup activity as a source
As a child item, use a U-SQL activity that executes your U-SQL script, writing the output of a single table (the item of the For Each activity) to blob storage or Data Lake
Add a Copy activity that merges the blobs from the previous step into one final blob.
If you have little or no experience working with ADF v2, bear in mind that it takes some time to get to know it, but once you do, you won't regret it. Having a GUI to create the pipeline is a nice bonus.
Edit: as @wBob mentions, another (far easier) solution is to somehow create a single table with all the rows, since all dynamically generated tables have the same schema. You could create a stored procedure to populate this table, for example.
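A sketch of such a stored procedure, built on the system catalog so newly added tables are picked up automatically (the target table name `dbo.CombinedTable` is a placeholder; `STRING_AGG` requires SQL Server 2017+ or Azure SQL Database):

```sql
-- Build "SELECT * FROM table_1 UNION ALL SELECT * FROM table_2 ..." from
-- sys.tables, so the procedure never needs editing when table_N+1 appears.
CREATE PROCEDURE dbo.PopulateCombined
AS
BEGIN
    DECLARE @sql nvarchar(max);

    SELECT @sql = STRING_AGG(
        CAST(N'SELECT * FROM ' + QUOTENAME(name) AS nvarchar(max)),
        N' UNION ALL ')
    FROM sys.tables
    WHERE name LIKE N'table[_]%';   -- [_] escapes the underscore wildcard

    SET @sql = N'INSERT INTO dbo.CombinedTable ' + @sql;
    EXEC sys.sp_executesql @sql;
END;
```

The U-SQL job (or any other consumer) can then read `dbo.CombinedTable` as a single, stable external source.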
I've seen a few questions like this asked but can't seem to find a good solution. Let's say I have a dataset with 20 tables (all with the same schema, just different row data). I have a stored procedure to populate a single SQL table with the data from these tables, but currently the only way I've been able to do this is to iterate through the tables in the dataset and send them to SQL via the stored procedure one at a time, until all of them have made it to the target table.
Is there a way to do this without iterating through each table in the dataset but still using a stored procedure?
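One common approach is a table-valued parameter, which lets a single stored-procedure call carry an entire table of rows. A sketch, with placeholder column names that would need to match the shared schema:

```sql
-- Table type matching the shared schema of the 20 DataTables (hypothetical columns).
CREATE TYPE dbo.RowBatch AS TABLE (
    Id    int           NOT NULL,
    Name  nvarchar(100) NULL,
    Value decimal(18,2) NULL
);
GO

-- One call inserts a whole batch. From .NET, pass a DataTable as a
-- SqlParameter with SqlDbType.Structured and TypeName = "dbo.RowBatch".
CREATE PROCEDURE dbo.InsertRowBatch
    @Rows dbo.RowBatch READONLY
AS
BEGIN
    INSERT INTO dbo.TargetTable (Id, Name, Value)
    SELECT Id, Name, Value FROM @Rows;
END;
```

On the client side, the 20 DataTables can be merged into one DataTable first (e.g. with `DataTable.Merge`), so the whole dataset goes to SQL in a single procedure call.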
I want to use a dynamic table as my data source in SSAS to create my Time Series model, but only views and static tables from my database are shown in the data source wizard. Is there any method to access the available dynamic tables in the database as a data source?
I need to create around 20-25 models by changing the parameter of the dynamic table. If I can't access the dynamic table, I will have to create 20-25 static tables or views, which is very inefficient.
I am developing a report in SSRS.
My report has around 50 row headers. The data for each row header is the result of a complex query against the database.
Two row headers may or may not have data that relates to one another.
In this case what would be the best way to create the report?
-- Do we create a procedure that loads all the data into a temporary table and then generate the report from this temp table?
-- Or do we create multiple datasets for this report?
Please advise on what would be the best way to proceed.
I read somewhere about using a linked server, whereby data is retrieved from the PostgreSQL database (the project uses a PostgreSQL DB) into the local SQL Server that SSRS uses.
Report then retrieves data from the local sql server to generate the report.
Thoughts?
You are best off using a stored procedure as the source. A stored procedure is easy to optimize for performance, so the report runs fast.
Assembling all your data so that you can use a single dataset to represent it would be the way to go.
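A sketch of what such a procedure could look like, with placeholder source tables and one `INSERT` per row header, so the report binds to a single dataset:

```sql
-- Assemble all ~50 row headers into one result set for the SSRS dataset.
-- Table and column names below are placeholders for the real complex queries.
CREATE PROCEDURE dbo.GetReportData
AS
BEGIN
    CREATE TABLE #report (
        RowHeader nvarchar(100) NOT NULL,
        Metric    decimal(18,2) NULL
    );

    INSERT INTO #report (RowHeader, Metric)
    SELECT N'Header 1', SUM(Amount) FROM dbo.SourceA;

    INSERT INTO #report (RowHeader, Metric)
    SELECT N'Header 2', COUNT(*) FROM dbo.SourceB;

    -- ... one INSERT per remaining row header ...

    SELECT RowHeader, Metric FROM #report;
END;
```

Each row header's complex query becomes one `INSERT`, which keeps every query individually tunable while the report itself only ever sees the final `SELECT`.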