In the data folder I have over 250 .sql files that must be deployed to the DB when the Java application starts.
All the files are read by Liquibase, but they are not all executed:
it loads the 016-*.sql and 022-*.sql files,
but executes only 022-*.sql.
Can anyone help? What's the issue?
The file permissions and file encoding are the same (CL).
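For context, a setup like this usually points a master changelog at the folder (for example via includeAll) and relies on each .sql file declaring itself as a changeset. As an illustration only, a formatted SQL file would start with a header like this (the author and id values are made up):

    --liquibase formatted sql
    --changeset myname:016-create-example
    CREATE TABLE example (
        id INT PRIMARY KEY
    );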
Related
Let me know what additional info would help. I have an SSIS package that imports a csv file to SQL Server and then moves that csv into a subfolder of the folder where the csv resides. I can run the package from the Integration Services Catalog with no problem, but when it runs from the Agent job it says it can't read the .csv file. I've tried different versions of the file and tried recreating the job; no luck. Also, the C# script task is not moving the csv to the archive folder, even when I run the package from the ISC. The Agent error tells me to look at the execution of the package, which basically says it can't open the csv.
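For reference, the move step in a C# script task usually comes down to a few System.IO calls. A minimal sketch, inside the script task's Main(), where the variable names User::SourceFile and User::ArchiveFolder are assumptions:

    // Variable names below are placeholders; adjust to your package.
    string source = Dts.Variables["User::SourceFile"].Value.ToString();
    string archiveDir = Dts.Variables["User::ArchiveFolder"].Value.ToString();

    // Make sure the archive subfolder exists, then move the csv into it.
    System.IO.Directory.CreateDirectory(archiveDir);
    string target = System.IO.Path.Combine(archiveDir, System.IO.Path.GetFileName(source));
    if (System.IO.File.Exists(target))
        System.IO.File.Delete(target);   // File.Move throws if the target already exists
    System.IO.File.Move(source, target);

    Dts.TaskResult = (int)ScriptResults.Success;

If code like this works from the catalog but fails from the Agent job, the usual suspect is the account the job step runs under rather than the code itself.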
I have created an SSIS package which uses a For Each Loop container and an Excel connection string that I build from a variable so I can loop through multiple files. My package works without issue: if I have a number of files in my source folder and I simply execute the package, it works perfectly, looping through all the files and doing what I want it to do.
The issue I have is when I deploy the package. If I have files in my source folder, it executes without error, but when you look at the source folder the files are still there. Digging a bit deeper into the package reports, it looks like it is reporting that no files were found. If I manually execute the dtsx file, it runs without issue and imports everything as it should.
Is there any reason why, after deploying the package, it is unable to recognise the files or the variable that I store the file name in?
Sounds like it could be permissions related. Does the SQL Server Service account have permissions to the directory where the files are stored?
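One quick way to check is to log what the executing account actually sees from inside the package. A rough C# script-task sketch (the variable name and file pattern are assumptions):

    // Log how many files the account running this package can see.
    string folder = Dts.Variables["User::SourceFolder"].Value.ToString();
    string[] files = System.IO.Directory.GetFiles(folder, "*.xlsx");
    bool fireAgain = true;
    Dts.Events.FireInformation(0, "FileCheck",
        string.Format("Found {0} file(s) in {1}", files.Length, folder),
        string.Empty, 0, ref fireAgain);

If this reports zero files when run from the catalog but finds them when run manually, the likely culprit is the service account's permissions, or a mapped-drive path that does not exist for that account.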
I have 3 XML files to be written to a folder for a client. While writing, 2 of the files were written perfectly but the 3rd file failed. What are some ways I can prevent the client from opening any of the files, or have all the files deleted or locked, if anything fails?
If your application has delete privileges on the system, keep a record of the filenames of the files you're writing. If a file fails for whatever reason, go through the list of file names and delete the files from the directory. A simple string list with a for loop should do it.
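A minimal C# sketch of that idea (the method name and input shape are made up for illustration):

    using System;
    using System.Collections.Generic;
    using System.IO;

    static void WriteAllOrNothing(IDictionary<string, string> xmlByPath)
    {
        // Track every file we write so we can roll them all back on failure.
        var written = new List<string>();
        try
        {
            foreach (var pair in xmlByPath)
            {
                File.WriteAllText(pair.Key, pair.Value);   // key = path, value = xml
                written.Add(pair.Key);
            }
        }
        catch
        {
            // Something failed: delete everything written so far so the
            // client never sees a partial set of files, then rethrow.
            foreach (var path in written)
            {
                if (File.Exists(path))
                    File.Delete(path);
            }
            throw;
        }
    }

Another way to get the same all-or-nothing effect is to write to a temporary folder first and only move the files into the client-visible folder once every file has been written successfully.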
I am creating a Flowgear workflow that needs to process a raft of XML data.
I have the XML data contained in a set of .xml files (approximately 400 files) in a folder on my local machine's hard drive, and I want to read them into a workflow, run an XSLT transform, and then write the resultant XML out to another folder on the same drive.
How do I get the flowgear workflow to read these files?
It depends on the use case. The File Enumerator works exceptionally well to loop (as in for-each) through each file. Sometimes one wants to get a list of files in a particular folder and check whether a file has been found or not. For this, I would recommend a C# script that gets a list of files with code like:
// {FilePath} and {extension} are placeholders for your values (requires System.IO).
string[] files = Directory.GetFiles(@"{FilePath}", "*.{extension}", SearchOption.TopDirectoryOnly);
Further on, use the File node to read, write, or delete files from a file directory.
NB! You will need to install a DropPoint on the PC/server to allow access to the files. For more information regarding DropPoints, see the Flowgear documentation.
You can use a File Enumerator or a File Watcher to read the files in. The difference is that a File Enumerator enumerates all the files in a folder once, while a File Watcher watches a folder indefinitely and provides new files to the workflow as they are copied into the folder.
You can then use the File node to write the files back to the file system.
I am new to the SSIS configuration side. I have created one package with its config file. My project is placed in my account folder on the server, but I placed the config file in a shared drive folder and also copied the mypackage.dtsx file into another shared folder.
Now I have run the package with dtexec.exe /f "mypackage.dtsx" without using the config file, and even so it ran successfully.
I have even changed some of the properties in the config file and run the package with the dtexec.exe command mentioned above, and it executed successfully.
So my question is: do I need the config file on the dtexec.exe command line, given that I can run my package with dtexec.exe /f "mypackage.dtsx" alone?
I have seen the syntax dtexec.exe /f "package.dtsx" /config "myconfig.dtsconfig".
Please guide me. Does the package contain the config file and its changes?
The package will remember its saved settings. The benefit of a config file is that if you need to override/change the settings it contains, you can do so without needing to open, fix, and redeploy your package. A config file is never necessary; it is just a convenience to you, the developer, especially if your environment has a strict change management policy. It is usually easier to change values in a config than to edit and redeploy a package under strict change management.
CLARIFICATION
It appears from your question that you may be thinking that when you change the config, it will change your package regardless of whether you include the config in your execution. All of the information from the config will be in the package at the time you save it, but it may differ from what is in the config. If you run without the config, you are running exactly what is saved in the package. Package executions work like this:
1. Load the package, with all configuration values as saved in the .dtsx file.
2. Check for configurations to load.
3. Load the configuration into memory and overwrite the values loaded from the .dtsx package.
4. Execute.
This is simplified, and there are other things going on, but at the basic level this is accurate.
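To make that concrete, the two command lines below differ only where the config supplies an override (file names taken from the question; /ConfigFile is the documented spelling of the option):

    rem Runs exactly what is saved in the package:
    dtexec.exe /f "mypackage.dtsx"

    rem Loads the package, then overwrites values supplied by the config:
    dtexec.exe /f "mypackage.dtsx" /ConfigFile "myconfig.dtsConfig"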