How to fix the "source is empty" error in an XML Source inside a Foreach Loop Container in SSIS 2012?

I have an issue with a very simple task in SSIS 2012.
I have a Foreach Loop Container that runs in Foreach File Enumerator mode. I want to read a target folder with XML files. The path to the folder is correctly configured, and the Files field is set to *.xml.
The variable mapping is defined with the following variable: User::FileVar, Index 0.
Now I add a simple Data Flow Task inside the container. The Data Flow Task only has an XML Source, that's it. For the XML Source, the XSD location is set. When I click Choose Columns, I can see the columns from the XSD schema.
BUT: When I save the XML Source, I always get the error message: "The property XMLDataVariable is empty." I tried both data access modes, "XML file from variable" and "XML data from variable". The error message remains, and I cannot run the package.
I don't use any expressions, either on the Foreach Loop Container or on the Data Flow Task.
I don't know what's wrong here; I did the steps exactly as shown in some tutorials for older versions of SSIS.
Do you have any ideas?

The issue is that the XML Source tries to validate the existence of the given file at design time. However, you will know the file name only at runtime, when the Foreach Loop Container executes and loops through every XML file available in the given folder.
I recreated an SSIS 2012 package based on my answer to another SO question:
SSIS reading multiple xml files from folder
I was able to reproduce the error The property "XMLDataVariable" on the XML Source was empty.
On the XML Source, I set the property ValidateExternalMetadata to False. Setting this to False forces the package not to verify the existence of the XML file path at design time.
I was successfully able to execute the package.
Hope that helps.

Related

SSIS Connection Error - File name not valid

I'm seeing an issue with an SSIS (SQL Server 2005) job where I'm getting the following error:
The file name "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\UNC\FOLDERS\filename.xls;Extended Properties="EXCEL 8.0;HDR=YES";" specified in the connection was not valid.
My searching around this site and others indicates that the most common cause of this is a permissions error but I don't believe that's the case in this situation since any number of files have successfully been processed through this implementation.
Here's an overview of the setup:
Vendors FTP files to us on a daily basis; a Windows service picks them up, copies them to a temporary directory and then calls SSIS jobs on those files. There are two SSIS jobs for each vendor: one for a snapshot data feed and one for a transaction listing.
There are currently over 50 different SSIS jobs in the overall process. All of them work except for one specific transaction job, which fails with the above error in a Script Task step. Files come in at least daily with unique file names, so I grab the file, determine the vendor from the source directory and the file type from indicators in the file name, and use those to determine which SSIS job to call. Since file names change every day, when the service calls the SSIS job I pass in a series of parameters, including the vendor file name, so it can properly connect to the file.
Each job begins with a Script Task that sets the necessary variable values for the rest of the job. For example, since the vendor file name changes with each run, I pass in the vendor file name through the SSIS variables collection and then set the connection string of a data source using that file name as the Data Source in the string. It is at that point of the Script Task that the above error occurs. Here's the task script code where the error occurs:
' Swap the placeholder in the template connection string for the actual
' UNC path of the current vendor file, then point the connection at it.
Dts.Connections("Transactions File").ConnectionString = _
    Dts.Variables("ConnectionString").Value.ToString().Replace("##FILE_PATH##", sourceFilePath)
The ConnectionString value is: Provider=Microsoft.Jet.OLEDB.4.0;Data Source=##FILE_PATH##;Extended Properties="EXCEL 8.0;HDR=YES";
The sourceFilePath is the full UNC path to the vendor file in the processing directory.
I don't believe it's a permissions error, since all the other files going through this process (using the same holding directory for processing) are working. It shouldn't be an issue of the file not existing since, again, it follows the same process as every other file and I have verified that the file ends up in the correct directory. I also considered that the connection string might be too long, but the file path ends up at 109 characters, and even with a shorter (<90 character) full path the same error occurs.
Is there anything else you can you think of for me to look at? Thanks for any help.
Based on the information presented, you are doing everything correctly. If you're new to SSIS, one thing I'd suggest is that you get a copy of the excellent add-in BIDSHelper. It has great features that can really save you time, especially with regard to configurations and expressions.
I created a reference package that had an Excel Connection Manager pointing to C:\ssisdata\so_paulsmithjr.xls and wired everything up.
At this point, I know things are working, so it was time to make the package move. I created the following variables and their values:
CurrentFile - C:\ssisdata\so_paulsmithjr.xls
PlaceHolder - ##FILE_PATH##
TemplateConnection - Provider=Microsoft.Jet.OLEDB.4.0;Data Source=##FILE_PATH##;Extended Properties="Excel 8.0;HDR=YES";
A fourth variable is set to be an expression (right-click the variable, open the Properties window, set EvaluateAsExpression = True, and use the expression below):
CurrentConnection - REPLACE(@[User::TemplateConnection], @[User::PlaceHolder], @[User::CurrentFile])
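For illustration, with the values above, the CurrentConnection expression evaluates to:
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\ssisdata\so_paulsmithjr.xls;Extended Properties="Excel 8.0;HDR=YES";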
I compared the CurrentConnection value to the ReferenceConnection (which is the original value of the Excel Connection Manager's connection string) and things were a match. At this point, if I were to change the value of CurrentFile to C:\ssisdata\so_paulsmithjr - Copy.xls, that would automatically be reflected in the value of CurrentConnection.
The final trick is to use an Expression on the Excel Connection Manager. Again, right-click the connection manager and under Properties you will find Expressions. It won't expand, as there is nothing under it yet. Instead, click the ellipsis, select the ConnectionString property, click the ellipsis again, and this time drag down the @[User::CurrentConnection] variable. Click OK twice, and now your connection manager points at whatever file the CurrentFile variable specifies.
Does that work any better?

MsTest, DataSourceAttribute - how to get it working with a runtime generated file?

For one of my tests I need to run a data-driven test with a configuration that is generated (via reflection) in the ClassInitialize method. I have tried everything, but I just cannot get the data source properly set up.
The test takes a list of classes in a csv file (one line per class) and then will test that the mappings to the database work out well (i.e. try to get one item from the database for every entity, which will throw an exception when the table structure does not match).
The testmethod is:
[DataSource(
    "Microsoft.VisualStudio.TestTools.DataSource.CSV",
    "|DataDirectory|\\EntityMappingsTests.Types.csv",
    "EntityMappingsTests.Types#csv",
    DataAccessMethod.Sequential)]
[TestMethod()]
public void TestMappings() {
Obviously the file is EntityMappingsTests.Types.csv. It should be in the DataDirectory.
Now, in the Initialize method (marked with ClassInitialize) I put that together and then try to write it.
WHERE should I write it to? WHERE IS THE DataDirectory?
I tried:
File.WriteAllText(context.TestDeploymentDir + "\\EntityMappingsTests.Types.csv", types.ToString());
File.WriteAllText("EntityMappingsTests.Types.csv", types.ToString());
Both result in "the unit test adapter failed to connect to the data source or read the data". More precisely:
Error details: The Microsoft Jet database engine could not find the object 'EntityMappingsTests.Types.csv'. Make sure the object exists and that you spell its name and the path name correctly.
So where should I put that file?
I also tried just writing it to the current directory and taking out the DataDirectory part - same result. Sadly, there is limited debugging support here.
Please use the Process Monitor tool from technet.microsoft.com/en-us/sysinternals/bb896645. Put a filter on MSTest.exe or the associated QTAgent32.exe and find out which locations it is trying to load from, and at what point in the test loading process. Then please provide an update with those details here.
After you add the CSV file to your VS project, open its properties and set "Copy To Output Directory" to "Copy Always". DataDirectory defaults to the location of the compiled test assembly, which is the output directory, so the file will be found there.
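If the file really has to be generated at run time (as in the question), a minimal sketch of combining that with the default DataDirectory location might look like the following; BuildTypesCsv and the sample rows are illustrative placeholders, and the assumption is that DataDirectory resolves to the test assembly's folder as described above:
using System;
using System.IO;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class EntityMappingsTests
{
    [ClassInitialize]
    public static void Initialize(TestContext context)
    {
        // Generate the CSV content; stands in for the reflection-based logic.
        string csv = BuildTypesCsv();

        // Write it next to the compiled test assembly, which is where
        // |DataDirectory| points by default per the answer above.
        string target = Path.Combine(AppDomain.CurrentDomain.BaseDirectory,
                                     "EntityMappingsTests.Types.csv");
        File.WriteAllText(target, csv);
    }

    private static string BuildTypesCsv()
    {
        // Hypothetical helper: one entity type name per line, header first.
        return "TypeName" + Environment.NewLine + "Customer" + Environment.NewLine + "Order";
    }
}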

SSIS XML Task with a For Each Loop

I have transformed an XML file
See this question for a complete description
What I want to do now is take this XML Task and have it performed in a Foreach Loop. I want all XML files in a specific directory to be transformed and the resulting files to be moved to a separate directory.
It's not working. I'm getting the following error messages:
Error: 0xC002F304 at XML Task, XML Task: An error occurred with the following error message: "Data at the root level is invalid. Line 1, position 1.".
Error: 0xC002928F at XML Task, XML Task: Property "New Source" has no source Xml text; Xml Text is either invalid, null or empty string.
Can you perform an xml transformation within a for each loop?
The problem is that the XML Task was expecting the variable to contain the XML data that I was trying to transform. In fact the variable contained the file name and path pointing to the XML data.
My coworker showed me the fix.
Inside the XML Task select SourceType = File connection.
Then set the source to point to your file.
Here is the trick. At the bottom, in the Connection Managers area, you will see the file connection you pointed your XML Task to. Click on that connection manager and notice that its Properties window displays.
Change ConnectionString to any string value (I used "placeholder").
Click on the plus sign next to Expressions and add an expression on the ConnectionString property like this:
@[User::FileName]
In this case FileName is a Package variable that contains the path to the xml file.
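If you would rather keep the task reading from a variable, another option (just a sketch, assuming SSIS 2008 or later where C# Script Tasks are available, and with illustrative variable names) is to load the file's contents into a string variable in a Script Task placed before the XML Task, so that a variable source type really does receive XML text rather than a path:
// Inside the Script Task's Main(). ReadOnly: User::FileName; ReadWrite: User::XmlText.
string path = Dts.Variables["User::FileName"].Value.ToString();
Dts.Variables["User::XmlText"].Value = System.IO.File.ReadAllText(path);
Dts.TaskResult = (int)ScriptResults.Success;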
You should be able to execute the XML task within a Foreach Loop. The error messages seem to indicate that the XML task has a problem with the source file. You probably need to look at how the file names from the Foreach Loop are applied to the appropriate property in the XML task. Of course, you should ensure that the Foreach Loop is grabbing the correct file list.

SSIS 2005 flat file source - partial row which isn't actually a partial row

I'm currently working on an SSIS package to load mainframe logs from multiple server/file sources into a database.
As it stands at the moment I'm using a foreach loop container to loop through a recordset containing filenames and load the files using a Data Flow task from a Flat File Source and File connection to an OLE DB Destination through a Derived column.
I've built error handling into the Data Flow Task to allow for the fact that there won't always be a log file in the location specified (e.g. because the server was down for maintenance during a specific period, as the files are generated on an hourly basis), but the problems start after it finishes handling these errors.
If the file immediately following an attempt to load a missing file does exist, the package begins to load it but then throws the following warning message: [Message Log File Source (NORDXSL) [57]] Warning: There is a partial row at the end of the file., and doesn't load all of the records in that file.
However, when I remove the files I know won't exist from the recordset (so that it only attempts to load files that do exist, including the one with the alleged "partial row"), everything works fine and all files/rows are loaded without a problem. It just seems not to want to load the first file after it has failed on a missing file, and I can't for the life of me work out why.
I've tried calling Dispose() and ReleaseConnection() on the file connection after the Data Flow task has finished processing but this makes no difference and I'm now completely out of ideas.
Any help would be really appreciated as this is the last bug in this project and I want to get it out the door. PLEASE!!
Thanks,
James
I've now found a workaround for this problem...
I've added a Script Task before the Data Flow Task to load the files that checks to see if the file I want to read exists:
' Succeed only if the expected log file actually exists; otherwise fail
' this iteration so the Foreach Loop moves on to the next file.
If (System.IO.File.Exists(Dts.Variables("MQLogMessagePath").Value.ToString)) Then
    Dts.TaskResult = Dts.Results.Success
Else
    Dts.TaskResult = Dts.Results.Failure
End If
If it doesn't exist it fails the iteration of the Foreach Loop container and continues onto the next file.
BINGO!

SSIS - Skip Missing Files

I have a SSIS 2008 package that calls about 25 other SSIS packages.
Each of those child packages loads a specific file into a table. But sometimes one or more of these input files will be missing.
How can I let a child package fail (because a file is missing) but let the rest of the parent package keep on running?
I've tried increasing the maximum error count on the parent package, the tasks in the parent package that call each child, and in the child package itself. None of that seemed to make any difference. I still get this error when I run it with a file missing:
SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
Edit:
FailPackageOnFailure and FailParentOnFailure are already set to False everywhere.
I haven't tried this, but this is how I would approach it.
Create a variable for the file name and the child package name.
Use a For Each Loop container. Have it go through the location of the files and pull the file names one at a time. Use the file name to change the child package name variable. In the container have the task to run the child package and have the name dynamically set based on the values of the child package name variable.
Then it should only try to run the child packages which have appropriate files.
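A rough sketch of that file-name-to-package-name step, assuming a C# Script Task inside the loop and a one-to-one naming convention between files and child packages (all names here are illustrative):
// Inside the Script Task's Main(). ReadOnly: User::CurrentFileName; ReadWrite: User::ChildPackagePath.
string fileName = Dts.Variables["User::CurrentFileName"].Value.ToString();
string packageName = System.IO.Path.GetFileNameWithoutExtension(fileName) + ".dtsx";
Dts.Variables["User::ChildPackagePath"].Value = System.IO.Path.Combine(@"C:\SSIS\Packages", packageName);
Dts.TaskResult = (int)ScriptResults.Success;
The Execute Package Task's file connection manager can then take its ConnectionString from @[User::ChildPackagePath] via an expression, so only the packages whose files actually arrived get invoked.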
In the properties of the Execute Package Task, you can set FailPackageOnFailure and FailParentOnFailure. I haven't worked with these, but you can probably play with them to get your desired results.
Side note: for simplicity, I'd set these settings on the parent SSIS package.
There are MaximumErrorCount values at the Sequence Container and package levels. If you're using these, be sure the values are in sync, because the package-level setting takes precedence.
Another option is the ForcedExecutionValue.
To set this up, load the Properties window for each container and:
1) Set ForceExecutionValue to TRUE.
This will cause the container to return whatever value you put in ForcedExecutionValue (see step 2), regardless of the outcome of the task(s).
2) Set ForcedExecutionValue to 0.
This acts as a return value for that container and sets it to 0 (success, think "return 0" as in C++).
I hope that helps.
I have developed this kind of scenario before. First, plan the package execution so that whenever a file arrives, its package is processed; if the file is missing, either fail that branch or skip it. Ultimately, the target is to process the packages for all of the files that exist. Take a variable for each child package and set it to "Y" or "N" in the parent package based on whether the file exists (using a Script Task, as sketched below), and then make execution of the child package conditional on the value of that variable.
This method gave us the desired result of processing multiple files with varying occurrences of source files.
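A minimal sketch of the file-existence check, assuming a C# Script Task in the parent package (variable names are illustrative):
// Inside the Script Task's Main(). ReadOnly: User::SourceFilePath; ReadWrite: User::FileExists.
string path = Dts.Variables["User::SourceFilePath"].Value.ToString();
Dts.Variables["User::FileExists"].Value = System.IO.File.Exists(path) ? "Y" : "N";
Dts.TaskResult = (int)ScriptResults.Success;
The precedence constraint leading to each Execute Package Task can then use an expression such as @[User::FileExists] == "Y", so a missing file simply skips that child package instead of failing the parent.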
thanks
prav