Create SQL trigger if data exists in table

I am new to SQL.
What is the best way to create a TXT file if a table has records > 0?
The code already exists to remove or add records to this table.
I am looking for ways to create a trigger file (with no content in the file) in a specific network folder.
Preferably, I would want this TXT file to be removed at the end of the day, so the process could repeat itself every morning.

On an AFTER DELETE trigger, do a SELECT COUNT(*) on the table or query one of the system catalog views. If it's zero, then call a stored proc that drops a file onto your share drive.
To write the file out, you could create a small package, call PowerShell or bcp (after enabling xp_cmdshell, though), or you could create a CLR function (after enabling CLR). Since the latter two require changing a server setting, you could just create a package.
And since there is no data you actually need to export, you just create a blank file!
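A minimal sketch of that approach as a T-SQL trigger; the table dbo.WorkQueue and the share path \\server\share are hypothetical, the file is written inline rather than from a separate proc, and xp_cmdshell must already be enabled:

CREATE TRIGGER trg_WorkQueue_AfterDelete
ON dbo.WorkQueue
AFTER DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- once the table is empty, drop a zero-byte trigger file on the share
    IF NOT EXISTS (SELECT 1 FROM dbo.WorkQueue)
        EXEC master.dbo.xp_cmdshell 'type NUL > \\server\share\trigger.txt', no_output;
END;

Removing the file at the end of the day could then be a SQL Server Agent job or a Windows scheduled task.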

Related

loading data from external stage - only truncate + load when there's a new file

I'm loading data from a named external stage (S3) by using COPY INTO, and this S3 bucket keeps all old files.
Here's what I want:
When a new file comes in, truncate the table and load the new file only; if there's no new file coming in, just keep the old data without truncation.
I understand that I can use an option like FORCE = FALSE to avoid loading old files again, but how do I truncate the table only when a new file comes in?
I would likely do this a bit differently, since there isn't a way to truncate/delete records in the target table from the COPY command. This will be a multi-step process, but can be automated via Snowflake:
Create a transient table. For the sake of description, I'll just call this STG_TABLE. You will also maintain your existing target table, called TABLE.
Modify your COPY command to load to STG_TABLE.
Create a STREAM called STR_STG_TABLE over STG_TABLE.
Create a TASK called TSK_TABLE with the following statement.
This statement will execute only if your COPY command actually loaded any new data:
CREATE OR REPLACE TASK TSK_TABLE
WAREHOUSE = warehouse_name
SCHEDULE = '5 MINUTE' -- a standalone task needs a schedule (or a parent task); adjust to your load cadence
WHEN SYSTEM$STREAM_HAS_DATA('STR_STG_TABLE')
AS
INSERT OVERWRITE INTO TABLE (fields)
SELECT fields FROM STR_STG_TABLE;
-- note: tasks are created suspended; start this one with ALTER TASK TSK_TABLE RESUME;
The other benefit of using this method is that your transient table will have the full history of your files, which can be nice for debugging issues.
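For reference, a sketch of the supporting objects from the steps above; the stage name @my_s3_stage is a placeholder, and "TABLE" stands in for your real target table name:

CREATE TRANSIENT TABLE STG_TABLE LIKE "TABLE"; -- same columns as the target
CREATE STREAM STR_STG_TABLE ON TABLE STG_TABLE; -- tracks rows landed by COPY
COPY INTO STG_TABLE FROM @my_s3_stage FORCE = FALSE; -- FALSE skips files already loaded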

how to read a tab delimited .txt file and insert into oracle table

I want to read a tab-delimited file using PL/SQL and insert the file data into a table.
Every day a new file will be generated.
I am not sure if an external table will help here, because the filename will change based on the date.
Filename: SPRReadResponse_YYYYMMDD.txt
An option that works on your own PC is to use SQL*Loader. As the file name changes every day, you'd use your operating system's batch scripting (on MS Windows, these are .BAT files) to pass a different name when calling sqlldr (along with the control file).
An external table requires you to have access to the database server and to have (at least) read privilege on the directory which contains those .TXT files. Unless you're a DBA, you'll have to ask them to provide that environment. As for the changing file name, you could use ALTER TABLE ... LOCATION, which is rather inconvenient.
If you want to have control over it, use UTL_FILE; yes, you still need to have access to that directory on the database server, but, writing a PL/SQL script, you can modify whatever you want, including the file name (see the sketch below).
Or, a simpler option: first rename the input file to SPRReadResponse.txt, then load it and save yourself all that trouble.
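If you go the UTL_FILE route, a minimal sketch might look like the following; the directory object DATA_DIR and the target table spr_read_response are assumptions, and the file is taken to have two tab-separated columns:

DECLARE
  l_file UTL_FILE.FILE_TYPE;
  l_line VARCHAR2(4000);
BEGIN
  -- open today's file; the name follows the SPRReadResponse_YYYYMMDD.txt pattern
  l_file := UTL_FILE.FOPEN('DATA_DIR',
                           'SPRReadResponse_' || TO_CHAR(SYSDATE, 'YYYYMMDD') || '.txt',
                           'r');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(l_file, l_line);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT; -- end of file reached
    END;
    -- split the line on tabs (CHR(9)); two columns assumed for illustration
    INSERT INTO spr_read_response (col1, col2)
    VALUES (REGEXP_SUBSTR(l_line, '[^' || CHR(9) || ']+', 1, 1),
            REGEXP_SUBSTR(l_line, '[^' || CHR(9) || ']+', 1, 2));
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
  COMMIT;
END;
/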

SSIS: How do I create tables using Foreach Loop Container?

I have a bunch of .csv files, each containing a script that would create a certain table.
I want to create tables using these scripts in said files (each table to be created using one file).
I have a Foreach Loop Container that specifies the path and which files to use.
I don't know how to configure the Execute SQL Task to execute the script in each one of these files in order to create a table.
You can use the Execute SQL Task with an input parameter of the table name (I would use the table name that the Foreach Loop Container provides). I would first drop the table if it exists and then recreate it with a CREATE TABLE command (in the Execute SQL Task).
As other people have noted, you may want to be careful with tasks that drop tables, but I have created plenty of SSIS packages that involve truncating and/or creating tables.
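A sketch of the statement such an Execute SQL Task could run, assuming an OLE DB connection where ? is mapped to the Foreach Loop variable holding the table name (DROP TABLE IF EXISTS needs SQL Server 2016 or later):

DECLARE @tbl sysname = ?; -- mapped from the Foreach Loop variable
DECLARE @sql nvarchar(max) =
    N'DROP TABLE IF EXISTS ' + QUOTENAME(@tbl) + N';';
EXEC sp_executesql @sql; -- dynamic SQL, since DDL cannot take the table name as a parameter
-- the CREATE TABLE script read from the .csv file would then run as a second statement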

Backup a single table with data in SQL Server

I want to get a backup of a single table with its data from a database in SQL Server using a script.
How can I do that?
SELECT * INTO mytable_backup FROM mytable
This makes a copy of the table mytable, and every row in it, called mytable_backup. It will not copy any indexes, constraints, etc., just the structure and data.
Note that this will not work if you already have a table named mytable_backup, so if you want to use this code regularly (for example, to back up daily or monthly), you'll need to run DROP TABLE mytable_backup first.
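A sketch of the repeatable version (DROP TABLE IF EXISTS needs SQL Server 2016 or later; on older versions, test with OBJECT_ID first):

-- drop last run's copy, then snapshot the structure and data again
DROP TABLE IF EXISTS mytable_backup;
SELECT * INTO mytable_backup FROM mytable;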
You can use the "Generate script for database objects" feature on SSMS.
Right click on the target database
Select Tasks > Generate Scripts
Choose desired table or specific object
Hit the Advanced button
Under General, choose value on the Types of data to script. You can select Data only, Schema only, and Schema and data. Schema and data includes both table creation and actual data on the generated script.
Click Next until wizard is done
There are many ways you can take a backup of a table.
BCP (BULK COPY PROGRAM)
Generate Table Script with data
Make a copy of the table using SELECT INTO, example here
SAVE Table Data Directly in a Flat file
Export Data using SSIS to any destination
You can create table script along with its data using following steps:
Right click on the database.
Select Tasks > Generate scripts ...
Click next.
Click next.
In Table/View Options, set Script Data to True; then click next.
Select the Tables checkbox and click next.
Select your table name and click next.
Click next until the wizard is done.
For more information, see Eric Johnson's blog.
Put the table in its own filegroup. You can then use the regular SQL Server built-in backup to back up the filegroup, which in effect backs up the table.
To back up a filegroup, see:
https://learn.microsoft.com/en-us/sql/relational-databases/backup-restore/back-up-files-and-filegroups-sql-server
To create a table on a non-default filegroup (it's easy), see:
Create a table on a filegroup other than the default
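The backup itself is then a plain filegroup backup; a sketch, assuming a database MyDb with the table living in a filegroup named TableFG (both names hypothetical):

BACKUP DATABASE MyDb
  FILEGROUP = 'TableFG' -- back up only the filegroup holding the table
  TO DISK = 'C:\Backups\MyDb_TableFG.bak';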
Another approach you can take if you need to back up a single table out of multiple tables in a database is:
Generate a script of the specific table(s) from the database (right-click the database, then click Tasks > Generate Scripts...).
Run the script in the query editor. You must change/add the first line (USE DatabaseName) in the script to point at a new database, to avoid getting the "Database already exists" error.
Right-click on the newly created database, and click Tasks > Back Up...
The backup will contain the selected table(s) from the original database.
To get a copy in a file on the local file-system, this rickety utility from the Windows start button menu worked:
"C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\DTSWizard.exe"

Dynamically populate external table locations

I'm trying to use Oracle external tables to load flat files into a database, but I'm having a bit of an issue with the LOCATION clause. The files we receive are appended with several pieces of information, including the date, so I was hoping to use wildcards in the LOCATION clause, but it doesn't look like I'm able to.
I think I'm right in assuming I'm unable to use wildcards; does anyone have a suggestion on how I can accomplish this without writing large amounts of code per external table?
Current thoughts:
The only way I can think of doing it at the moment is to have a shell watcher script and a parameter table. The user can specify: input directory, file mask, external table, etc. Then, when a file is found in the directory, the shell script generates a list of files matching the file mask. For each file found, issue an ALTER TABLE command to change the LOCATION of the given external table to that file and launch the rest of the PL/SQL associated with that file. This can be repeated for each file found with the file mask. I guess the benefit to this is I could also add the date to the end of the log and bad files after each run.
I'll post the solution I went with in the end which appears to be the only way.
I have a file watcher that looks for files in a given input dir with a certain file mask. The lookup table also includes the name of the external table. I then simply issue an ALTER TABLE on the external table with the list of new file names.
For me this wasn't much of an issue as I'm already using shell for most of the file watching and file manipulation. Hopefully this saves someone searching for ages for a solution.
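For illustration, the statement the watcher issues could look like this; the external table and file names are hypothetical:

-- repoint the external table at the newly arrived file(s)
ALTER TABLE my_ext_table
  LOCATION ('data_file_20240115.txt');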