I have an SSIS package which exports data from Excel files and dumps it into a SQL table. To process the files, I am using a Foreach Loop, and a Data Flow opens an Excel source and dumps the data into an OLE DB destination. If any file does not contain the required tab, I want the SSIS package to log the error and move on to the next iteration. I have tried the following, but the package still fails:
Propagation = false
ForceExecutionResult = Success
How can I handle this?
Attached images are screenshots of the control flow, data flow and progress.
Try setting the MaximumErrorCount property on the Foreach Loop to 0 to ignore all errors.
Inside your Foreach Loop, add an OnError event handler with a SQL script to write the errors out so they are not lost.
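If a Script Task is easier for you than SQL, here is a minimal sketch of what the OnError handler's task body could look like in VB.NET. The log-file path is only an illustration, and the two System:: variables must be listed in the task's ReadOnlyVariables:

Imports System.IO

' Script Task body inside the Foreach Loop's OnError event handler.
' Dts and ScriptResults come from the SSIS Script Task template.
Public Sub Main()
    Dim source As String = CStr(Dts.Variables("System::SourceName").Value)
    Dim message As String = CStr(Dts.Variables("System::ErrorDescription").Value)

    ' Illustrative log target; an Execute SQL Task writing to a table works just as well.
    Dim line As String = String.Format("{0:u}  {1}: {2}", Date.Now, source, message)
    File.AppendAllText("C:\Logs\ssis_errors.log", line & Environment.NewLine)

    Dts.TaskResult = ScriptResults.Success
End Sub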
I am saving a large Word document, and before the saving process has completed, I am uploading the active document's content to the server using our VB project. Because the save is not yet complete, my code gives an error saying:
Unexpected Error 70 - Permission Denied
Commands.TPS_UplodDocument
Is there any way to determine, using Visual Basic code, whether the file-saving process has finished?
I tried this:
Word.ActiveDocument.Saved
But it returns True just after the saving process has started.
Please help me on this.
We can use the following VB code to check whether a Word document's save is still in progress:
word.Application.BackgroundSavingStatus
This returns 0 if no document save is running, and a count greater than zero while a save is in progress.
Do
    Select Case True
        Case mobjApplication.wordApp.BackgroundSavingStatus = 0
            ' No background save is pending, so it is safe to continue.
            Exit Do
        Case Else
            ' Yield to the message loop while Word finishes saving.
            DoEvents
    End Select
Loop
This will block the project's execution until the document's saving process has completed.
I am trying to create a project that will produce a separate Excel file for every account in a specific table. The first step is to pull the data into a temp file and load an array variable with the accounts for the loop; this is working fine.
I am getting the two errors below, and when I execute the package it fails on the export-to-Excel Data Flow task.
Error 1: Error loading RAW_DATA_EXPORT.dtsx: The connection string format is not valid. It must consist of one or more components of the form X=Y, separated by semicolons. This error occurs when a connection string with zero components is set on database connection manager.

Error 2: Error loading RAW_DATA_EXPORT.dtsx: The result of the expression ""\\server\DATA\Status\Testing\Filename" + #[User::Accts] + ".xls"" on property "ConnectionString" cannot be written to the property. The expression was evaluated, but cannot be set on the property.
This is what my dynamic connection string looks like when evaluated:
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\\server\DATA\Status\Testing\Filename03500.xls;Extended Properties="EXCEL 8.0;HDR=YES";
I think the issue is that since the Excel files are non-existent, validation fails when it gets to that step. From what I found online, I need to either create the sheets through a Script Task (which I'm not fond of) or use a variable value to create the sheet in the file. I have not been successful with either.
(...) since the Excel files are non-existent, validation fails when it gets to that step.
Have you tried delaying validation on the Data Flow task that exports to Excel? Check the DelayValidation property in the Data Flow's Properties window.
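If the file genuinely has to exist before the Data Flow runs, the Script Task approach you mentioned can be fairly small. A rough sketch, assuming the evaluated path sits in a hypothetical variable named User::ExcelPath and a sheet called Sheet1 with made-up columns (adjust all of these to your layout); it uses the same Jet provider as your connection string:

Imports System.Data.OleDb

Public Sub Main()
    ' Hypothetical variable name; add it to the Script Task's ReadOnlyVariables.
    Dim path As String = CStr(Dts.Variables("User::ExcelPath").Value)

    Dim connStr As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & path & _
        ";Extended Properties=""Excel 8.0;HDR=YES"""

    ' Running CREATE TABLE against a path that does not exist yet makes the Jet
    ' provider create both the workbook and the worksheet.
    Using conn As New OleDbConnection(connStr)
        conn.Open()
        Using cmd As New OleDbCommand("CREATE TABLE Sheet1 (Account NVARCHAR(50), Amount FLOAT)", conn)
            cmd.ExecuteNonQuery()
        End Using
    End Using

    Dts.TaskResult = ScriptResults.Success
End Sub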
I have an application that checks to see if it's the most recent version. If not, it updates itself by using File.Copy to replace the DB attached to the application with a fresh one (that may or may not have had changes made to it). In an attempt to keep the data from being deleted, I created a backup system that writes all the data to an XML file before the database is deleted and restores the data once the database has been copied.
I am having a problem with the File.Copy method, however, in that an error pops up telling me the .MDF is being used by another process.
I was told that stopping SQL Server would work, but it hasn't. I've also been told I can use SMO, but also have not been able to make that work. With this seeming so close to complete, SMO also seems like it won't be necessary.
My code is this:
'This is the backup. I make sure to close the SQL Connection when the process is complete.
Dim db As String = "C:\ACE DB\localACETest.mdf"
Dim dbLog As String = "C:\ACE DB\localACETest_log.ldf"
If File.Exists(db) = True Then
    'Backup process
    '...
End If
'"Data/localACETest.mdf" referenced below is the file located inside of my application that is used to overwrite the other MDF; it is NOT the .MDF I'm looking to replace.
Directory.CreateDirectory("C:\Random Directory\")
File.Copy("Data/localACETest.mdf", db, True) 'This is the line where I get the error
File.Copy("Data/localACETest_log.ldf", dbLog, True)
success = False
...
EDITS:
I have narrowed the issue down to the method that backs up my data. I'm using the following connection string:
Private Const _sqlDB As String = "Data Source=(localdb)\v11.0;Initial Catalog=localACETest;Integrated Security=True;" & _
    "AttachDbFileName=C:\ACE DB\localACETest.mdf"
I open SQL, run a command, and then close it:
Using connection = New SqlConnection(_sqlDB)
    connection.Open()
    ...
    connection.Close()
End Using
Why does this not release the MDF from the process? (When I don't run it, I have no problems)
You are better off sending a command to SQL Server telling it to make a backup for you. This SO post has a great (command-line) script that you can copy/paste:
SQL Server command line backup statement
Put that into a batch file and launch it like this:
System.Diagnostics.Process.Start("C:\Ace Db\MakeADBBackup.bat")
If you would prefer to make your program wait until the backup finishes, read about launching processes: http://support.microsoft.com/kb/305368
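For example, something along these lines (a sketch, not production code):

' Launch the backup batch and block until it exits.
' The batch path matches the example above; the exit-code handling is illustrative.
Dim proc As System.Diagnostics.Process = System.Diagnostics.Process.Start("C:\Ace Db\MakeADBBackup.bat")
proc.WaitForExit()

If proc.ExitCode <> 0 Then
    ' What a non-zero exit code means depends entirely on your batch script.
    Console.WriteLine("Backup batch returned exit code " & proc.ExitCode.ToString())
End If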
If you are REALLY insistent on making a copy of the .mdf (which is not a good idea), then you need to stop the SQL Server service before you make the copy. You could run a batch file that says:
NET STOP MSSQLSERVER
This assumes that your SQL Server is running under the service name "MSSQLSERVER". To check the names of running services, open a command prompt and type "NET START". It will list the services that are running; one of them will be the service name of your SQL Server instance.
Better still, here is an article (for VB.NET) that shows the source code for starting/stopping SQL Server. http://msdn.microsoft.com/en-us/library/ms162139(v=SQL.90).aspx
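If you'd rather stay in VB.NET than shell out to NET STOP, the framework's System.ServiceProcess.ServiceController can do the same thing. This is a sketch using that class rather than the SMO approach from the linked article, and it assumes the default service name MSSQLSERVER:

Imports System.ServiceProcess   ' add a reference to System.ServiceProcess.dll

Sub StopSqlServer()
    ' Stopping a service requires administrative rights.
    Using sc As New ServiceController("MSSQLSERVER")
        If sc.Status <> ServiceControllerStatus.Stopped Then
            sc.Stop()
            sc.WaitForStatus(ServiceControllerStatus.Stopped, TimeSpan.FromSeconds(30))
        End If
    End Using
End Sub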
I strongly recommend that you try the first approach that I suggested.
I am using SSIS 2012, and I am trying to import about 25 Excel files (each containing about 70 (variable) sheets) into SQL Server 2008.
I have built the package so that it loops through all the Excel files and imports the first sheet of each, but this is useless; how can I loop through all the Excel files and import every sheet into SQL?
I have set up a Script Task to get the sheet name into a variable, but I don't know what to do from there.
Is my question clear enough?
I am much more fluent in VB than C#, so if you're using a Script Task, please paste VB.NET code.
Thanks,
James.
You can Loop through Excel Files and Tables by Using a Foreach Loop Container
Here you use nested Foreach Loops in the Control Flow: the outer loop iterates over the files, and the inner loop iterates over the tables (worksheets) within each file. Inside the loops you have a Data Flow with an Excel Source.
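Since you are already using a Script Task to get sheet names, here is a rough VB.NET sketch of reading all worksheet names of the current file into an SSIS object variable, which the inner Foreach Loop (for example a Foreach From Variable Enumerator) can then iterate. The variable names User::ExcelFilePath and User::SheetNames, and the ACE provider, are assumptions; adjust to your setup:

Imports System.Collections.Generic
Imports System.Data
Imports System.Data.OleDb

Public Sub Main()
    Dim filePath As String = CStr(Dts.Variables("User::ExcelFilePath").Value)

    ' ACE provider assumed; use Extended Properties "Excel 8.0" for .xls files.
    Dim connStr As String = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & filePath & _
        ";Extended Properties=""Excel 12.0;HDR=YES"""

    Dim sheetNames As New List(Of String)

    Using conn As New OleDbConnection(connStr)
        conn.Open()
        ' The Tables schema rowset lists every worksheet as "SheetName$".
        Dim schema As DataTable = conn.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, Nothing)
        For Each row As DataRow In schema.Rows
            Dim name As String = CStr(row("TABLE_NAME"))
            If name.EndsWith("$") Then sheetNames.Add(name)
        Next
    End Using

    ' Hand the list back to the package for the inner loop to enumerate.
    Dts.Variables("User::SheetNames").Value = sheetNames.ToArray()
    Dts.TaskResult = ScriptResults.Success
End Sub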
I've done a similar thing. What I did was add a Foreach Loop Container, set its enumerator to the Foreach File Enumerator, retrieve the file path into a variable, and then use that variable to set the file connection dynamically via the Property Expressions editor.
Finally, put your data flow inside the Foreach Loop Container.
Doing this I was able to import data for each Excel file found in the directory specified.
I have created an SSIS package to generate an Excel file from SQL, with the file name set dynamically based on today's date.
I have not yet deployed the package because it is still being tested.
When I changed the system date to test whether it could create a file for that date, it gave me the following error:
Error at Data Flow Task [Excel Destination [34]]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E37.
Error at Data Flow Task [Excel Destination [34]]: Opening a rowset failed. Check that the object exists in the database.
Error at Data Flow Task [SSIS.Pipeline]: "component "Excel Destination" (34)" failed validation and returned validation status "VS_ISBROKEN".
Error at Data Flow Task [SSIS.Pipeline]: One or more component failed validation.
Error at Data Flow Task : There were errors during task validation.
(Microsoft.DataTransformationServices.VsIntegration)
In the Excel connection manager I have set the ExcelFilePath property with the following expression:
@[User::ExcelFileName] + (DT_WSTR, 20) (DT_DBDATE) GETDATE() + ".xls"
which evaluates to a value like C:\2013-05-24.xls.
How do I resolve it?
To fix this issue, set the DelayValidation property to TRUE in the Excel connection manager's properties.