Masking an entire table with one line of script - Microsoft Dynamic Data Masking - sql-server-2016

Is there a way in Microsoft Dynamic Data Masking (DDM) to mask an entire table in SQL Server with just one line of script, using the same function for all columns, i.e. defining the masking rule on the table as a whole?
Something like: ALTER TableName ADD MASKED WITH (FUNCTION = 'default()')
I am trying to mask an entire table in a SQL Server database using Dynamic Data Masking and would like a fast and easy way to do it in one line of script, since I need all the columns masked, and masked with the same function, 'default()'.
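For reference, DDM only accepts MASKED at the column level, so a single table-level ALTER is not available. A minimal workaround sketch that generates one ALTER COLUMN statement per column; the table name dbo.Customers is just a placeholder, and columns that cannot carry a mask may need further filtering:
-- build one ALTER per column of the (hypothetical) table dbo.Customers
DECLARE @sql nvarchar(max) = N'';

SELECT @sql += N'ALTER TABLE dbo.Customers ALTER COLUMN '
             + QUOTENAME(c.name)
             + N' ADD MASKED WITH (FUNCTION = ''default()'');'
FROM sys.columns AS c
WHERE c.object_id = OBJECT_ID(N'dbo.Customers')
  AND c.is_computed = 0;   -- computed columns cannot be masked

EXEC sys.sp_executesql @sql;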

Related

Using SSMA to convert from Access to SQL, scripting the fixes

I am using SSMA to convert from an Access DB to a SQL Server 2019 DB.
There are some things I need to fix in the Access DB first, so I am trying to figure out whether these things can be done via a query in Access or whether you have to use the goofy UI and do everything manually.
So I have a couple of questions about queries in Microsoft Access:
Can you modify the 'required' attribute on a column within a table by using a query?
Can you configure Index (dupes) on a column by using a query?
Can you change validation rules using a query?
Can you create/delete relationships using a query?
Can you change the field length of a column by using a query?
Any examples of any of these would be helpful; when I google for MS Access related things, all of the content is either related to Access 2007/2010 or is very UI-heavy rather than query-heavy.
I am trying to script this because I may have to do this migration several times.
Update: I was able to get most of what I needed figured out.
ALTER TABLE Users ALTER COLUMN Type CHECK(In ("I","U","") Or Is Null);
Still haven't found a way to change the 'ValidationRule'. I am trying to change it to
In ("I","U","") Or Is Null
Look into the Data Definition Language section of the MS Access SQL Reference, specifically the ALTER TABLE statement, which will cover the majority of your questions.
For example, in response to:
Can you change the field length of a column by using a query?
ALTER TABLE Table1 ALTER COLUMN Field1 TEXT(100)
The above will change the data type of the field Field1 within table Table1 to a text field accommodating 100 characters.
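To sketch the other bullet points as well (table and field names here are hypothetical, and some of these clauses are only accepted when the DDL is executed through ADO, e.g. CurrentProject.Connection.Execute, rather than the query designer):
-- Required = Yes (NOT NULL), set together with the field length:
ALTER TABLE Users ALTER COLUMN LastName TEXT(100) NOT NULL;

-- Indexed (No Duplicates):
CREATE UNIQUE INDEX idx_Users_Email ON Users (Email);

-- Create a relationship, and delete it again:
ALTER TABLE Orders ADD CONSTRAINT fk_Orders_Users
    FOREIGN KEY (UserID) REFERENCES Users (ID);
ALTER TABLE Orders DROP CONSTRAINT fk_Orders_Users;

-- The ValidationRule property itself is not reachable from DDL (DAO only),
-- but an ADO-executed CHECK constraint is the closest DDL equivalent:
ALTER TABLE Users ADD CONSTRAINT chk_Users_Type
    CHECK (Type IN ('I','U','') OR Type IS NULL);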

How do I deal with identically named fields in the source database, differentiated only by label name?

The database setup at my organisation is SQL tables copied onto our SAS server. The SQL tables were set up to run pre-programmed SQL queries; now SAS is the tool used. This, however, creates an issue: some tables have variable names that work in SQL but are too long for SAS, which truncates them to its 32-character limit. The label for the source variable is correct and not shortened.
The source table (in SQL Server) names:
Consolidated_Arrears_Vs_Portfolio_Balance_Ltd
Consolidated_Arrears_Vs_Portfolio_Balance_Pure
In SAS:
Consolidated_Arrears_Vs_Portfoli
Consolidated_Arrears_Vs_Portfoli
SAS Labels:
Consolidated_Arrears_Vs_Portfolio_Balance_Ltd
Consolidated_Arrears_Vs_Portfolio_Balance_Pure
So, how do I tell the difference in code between these two?
Thanks in advance.
To use the data as native SAS, one approach would be to write a macro that maps the original SQL names (per the label) to the corresponding new SAS names. If the original table names got mangled as well, you have a lot more issues.
Original SQL
select Abracadabra_Magical_Unity_Formation_SequenceId from AMUF_Master
Replace with
select %nameFor(Abracadabra_Magical_Unity_Formation_SequenceId) from AMUF_Master
The macro %nameFor would either do a dynamic lookup against the tables in the library or, perhaps better when the table design is static, use a fixed mapping table created from a one-time lookup:
* presume SQL data now in libref MIGRATED;
* do once to get the variable metadata that includes LABEL and NAME;
proc sql;
  create table static.nameFor as
  select * from sashelp.vcolumn
  where libname = 'MIGRATED';
quit;

* use as needed;
%macro nameFor(SQL_Name);
  %local sas_name rc;
  %let rc = %sysfunc(dosubl(%str(
    proc sql noprint;
      select NAME into :sas_name trimmed
      from static.nameFor
      where LABEL = "&SQL_Name";
    quit;
  )));
  &sas_name
%mend;
You could also use static.nameFor to discover all the SQL names that were changed during migration: those would be the rows where name ne label.
An automated approach would be to create a search-and-replace program that makes changes to a copy of the original SQL queries on hand.
The search and replace would be either
find <long-named column>, replace with %nameFor(<long-named column>), or
find <long-named column>, replace with <migrated-to-SAS column name>
The first replacement way adds noise; the second loses some of the original queries' 'true flavor'.

How to run a SELECT SQL statement stored in a field in Pentaho?

I have a table with a 'query' field containing a SELECT SQL statement and a 'parameters' field containing the SQL parameters. I have merged these two fields into a new field containing a complete SELECT statement. Now I need to execute the SQL in this new field, get the result of the SELECT (the output fields), and generate an Excel file.
Use Table-Input if you are interested in a query result set. Table-Input supports SQL parameters, so there is no need to build the statement yourself using e.g. Replace-In-String and trip over escapes on your way. There is also variable substitution, just in case you can't live with a single template.
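For instance, a minimal Table-Input sketch (table and field names are made up) where the two ? placeholders are filled, in order, from the fields of the step selected under "Insert data from step":
SELECT order_id, amount
FROM orders
WHERE order_date BETWEEN ? AND ?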
Update 21:14 GMT
I'm not very fond of the way you try to prepare the SELECT statement, but here we go, assuming it's a single statement we have:
Create a job with a Start entry and 2 Transformation entries (T1, T2). Let T1 produce the field containing your SELECT statement and use a Set-Variables step to make the statement available to T2 as variable SELECT. In T2 use a Table-Input step referencing ${SELECT} in the SQL statement text area. Don't forget to enable option "Replace variables in script".
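So T2's Table-Input SQL area would contain nothing but the variable reference, which expands to the statement T1 produced:
-- expanded at runtime because "Replace variables in script" is enabled
${SELECT}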
From now on it's a matter of taste. I would prefer to create a CSV file using Text-File-Output. With the right field separator, Excel will open the file after a double-click. The advantage of Text-File-Output is that you don't have to specify the fields you don't know at design time anyway: an empty field list will just handle all incoming fields. That is comparable to the total projection in a Table-Input, which creates the necessary fields from the retrieved columns downstream.
If you must produce an Excel workbook, you'll have to learn about metadata injection. That would be a separate project for a beginner, though. There are samples in your Kettle installation folder. And there is a very active community if you find yourself in trouble.

Capture executed SQL from Table Input in Pentaho PDI

I am using Pentaho for data migration testing. I have set up a Table Input step where many parts of the query inside the Table Input are variables. I have been looking for a way to capture that query after it gets executed at runtime.
I was wondering if there is a specific system log variable for SQL, or if it is to do with metadata. Need help! Thanks
Maybe the following approach will help:
We assume a transformation reading a CSV file to get the dynamic portion of the SELECT statement (e.g. the columns) and setting the variable columns with it.
The second transformation uses this variable to generate the SELECT statement and stores it into the variable sql_statement.
In the main transformation we use ${sql_statement} as the SELECT statement of the Table Input and write the data to an output file (that's the business process, so to say). From the same input we copy the output to another path. There we add the current time as a field (using the "Get system data" step) and the generated SQL statement, join them as a cartesian product, and group the result by sql_statement. That way we can compute the first and the last time the statement was used. These results are written to a text file.
The last thing we need is a job calling the three transformations sequentially.
This is a sample output:
sql_statement;min_time;max_time
SELECT my_column FROM test_table;2014/05/08 00:41:21.143;2014/05/08 00:41:21.144
Thank you Marcus! I did something similar.
It works. Awesome.
I gathered parts of queries from the table field where they were kept and formed the full query in JavaScript. After that, the full query is sent as a parameter to a transformation that runs and logs it.

Create delimited string from a row in stored procedure with unknown number of elements

Using SQL Server 2000, is there a way to create a delimited string based upon an unknown number of columns per row?
I'm pulling one row at a time from different tables and am going to store them in a column in another table.
A simple SQL query can't do anything like that. You need to specify the fields you are concatenating.
The only method that I'm aware of is to dynamically build a query for each table.
I don't recall the structure of MSSQL 2000, so I won't try to give an exact example; maybe someone else can. But there -are- system tables that contain table definitions. By parsing the contents of those system tables you can dynamically build the necessary query for each source data table.
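For illustration, a hedged sketch against the INFORMATION_SCHEMA.COLUMNS view (available in SQL Server 2000), assuming a source table dbo.SourceTable and a pipe delimiter; every column is cast to varchar and NULLs are blanked before concatenating:
DECLARE @sql varchar(8000)
SET @sql = ''

-- build the concatenation expression, one term per column
SELECT @sql = @sql
            + CASE WHEN @sql = '' THEN '' ELSE ' + ''|'' + ' END
            + 'ISNULL(CAST(' + QUOTENAME(COLUMN_NAME) + ' AS varchar(255)), '''')'
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'SourceTable'
ORDER BY ORDINAL_POSITION

SET @sql = 'SELECT ' + @sql + ' AS delimited_row FROM dbo.SourceTable'
EXEC (@sql)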
TSQL that writes TSQL, however, can be a bit tricky to debug and maintain :) So be careful how you structure everything...
Dems.
EDIT:
Or just do it in your client application.