Upload a file using FTP with SSIS

I need to upload a file using the FTP Task in SQL Server 2005 SSIS. While uploading, I am unable to get the required folders created on the FTP server.
Is there any other solution in SSIS for uploading a file to the FTP server?

Quite often when I need to use FTP in a DTS or SSIS package, I make a text file with the commands I want executed and then use an "Execute Process" task to run "ftp.exe -s:mycommands.txt". This can give you a little more control over what happens during the FTP session.
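For example, a command file that creates the missing folder and uploads into it might look like this (host, credentials, and paths are placeholders):

open ftp.example.com
myuser
mypassword
mkdir upload
cd upload
binary
put C:\data\myfile.txt
bye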

Related

CSV transfer from SFTP server to S3 via WinSCP

My work uses WinSCP for SFTP transfers. We have some data coming in this way each week, and I would like to get it into an S3 bucket. We want to automate this transfer with a cron job or something similar.
I know there are AWS tools for this, but they cost money and money can't be spent. We also do not have an ETL tool like Alteryx, otherwise I would use it. Nothing on the internet gives much detail about transferring files from an SFTP server to another server; mostly I find guides on transferring from a server to a local machine.
Below is the code I have found.
Can these WinSCP commands be used to transfer to an S3 bucket somehow at the put step? (I cannot use the script generator like other posts suggest, because I do not have access to our AWS account or any buckets yet.) This is all about proving a concept.
# Connect to the SFTP server using a password
open sftp://user:password@example.com/ -hostkey="ssh-rsa 2048 xxxxxxxxxxx...="
# Upload file (THIS IS WHERE I WOULD WANT S3 PATH SYNTAX)
put d:\examplefile.txt /home/user/
# Exit WinSCP
exit
Once I have this command working, we can create a Windows scheduled task, from what I have read. This would automate moving the file somewhere we can do more with it than the SFTP server allows.
If I understand the question correctly, you are asking how to transfer files directly from an SFTP server to S3 using a script running on yet another machine.
That's not possible (unless AWS has a feature for it, but then it won't be free). You have to download the files from the SFTP server and then upload them to S3.
With WinSCP scripting, you can do it with a script like:
# download from the SFTP server into the local working directory
open sftp://username:password@sftp.example.com/
get /sftp/path/*
close
# upload from the local working directory to S3 (WinSCP 5.13 and newer support the S3 protocol)
open s3://accesskey:secretkey@s3.amazonaws.com/
put * /bucket/
exit
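You would save the script as, say, transfer.txt (a placeholder name) and run it with the winscp.com console client, for example:

winscp.com /script=transfer.txt /log=transfer.log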
You can mount your bucket on your server with s3fs (a FUSE filesystem) and use it like a normal hard disk.
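As a rough sketch, with placeholder bucket name, mount point, and credentials, an s3fs mount might look like:

# store credentials where s3fs expects them, then mount the bucket
echo ACCESS_KEY_ID:SECRET_ACCESS_KEY > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs
# the bucket now appears under /mnt/s3 like a normal disk
s3fs mybucket /mnt/s3 -o passwd_file=~/.passwd-s3fs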

SSIS package to run script file on Azure

I want to create an SSIS package which will first get a .sql script file from Azure Blob storage and then execute it against Azure SQL. I am able to do it when using local storage, that is, when executing a local file rather than one from Azure storage.
Please help me!!!
First, you're going to want to set up your Azure storage with Azure File Storage. That way the machine running your SSIS package will be able to use the file share like a mapped network drive.
Once you've got that set up, the first step in your package will be an Execute Process Task. This task should be configured to run a simple batch file (.bat) that reconnects to the file share you set up in Azure File Storage. Here is a basic example of what would be in this batch file:
rem map the Azure file share as drive e: if it is not already available
if not exist e:\ (
net use e: \\<storageAccountName>.file.core.windows.net\<fileContainerName> /u:<storageAccountName> <primary access key for storage account>
)
After this is run, your SSIS package will be able to access the .sql files stored on this share.
Simply pick up the file, read its contents, and execute the statements contained in the .sql file.
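As a minimal sketch of that last step inside an SSIS Script Task (C#), assuming the share is mapped to e: and using placeholder file and connection details:

// Read the .sql file from the mapped Azure file share (path is a placeholder)
string sql = System.IO.File.ReadAllText(@"e:\scripts\deploy.sql");
// Run it against Azure SQL; connection details are placeholders
using (var conn = new System.Data.SqlClient.SqlConnection(
    "Server=tcp:<server>.database.windows.net;Database=<db>;User ID=<user>;Password=<password>;Encrypt=True;"))
{
    conn.Open();
    using (var cmd = new System.Data.SqlClient.SqlCommand(sql, conn))
    {
        // Note: this assumes the script is a single batch (no GO separators)
        cmd.ExecuteNonQuery();
    }
}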

How can I download a .sql file from a website

I have been given an assignment to fix the bugs in a website.
They gave me the username and password to access the files on the hosting.
I'm using FileZilla and got all the source, but the database .sql file is nowhere to be found.
Any idea where the .sql went and how I can get it from the website using FileZilla?
Thanks
You have to dump your database to a .sql file first. Only then can you download it.
Assuming MySQL, see How to dump mysql database?
If you have phpMyAdmin (or a similar tool) available, you can dump and download the .sql file in one step, directly from the web interface.
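For example, with shell access and MySQL, a minimal dump command would be (user and database names are placeholders):

mysqldump -u myuser -p mydatabase > mydatabase.sql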

How to use a Post deployment script to bulk insert a CSV on remote server

I am using a post-deployment script in my SQL Server database project to bulk insert data. This works great if I publish to a local database, but if I publish to a remote database, obviously the CSV I am bulk inserting from will not exist on the remote server. I have considered using a command-line copy of the CSV to a shared folder, but this raises security concerns, as anyone with access to this folder could possibly tamper with a deployment.
How should I be using post-deployment scripts to copy data to a remote server? A CSV is easier to maintain than a bunch of inserts, so I would prefer using a CSV. Has anyone run into and solved this issue?
The best way is the one you mentioned: command-line copy the CSV to a secured shared folder and BCP from there.
From that point on, the security of the folder depends on network services.
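As a rough sketch, assuming Windows authentication (-T) and placeholder server, share, and table names, the BCP import could look like:

bcp MyDatabase.dbo.MyTable in \\fileserver\secureshare\data.csv -S myremoteserver -T -c -t,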

Error opening .mdf file through SQL Server Management Studio Express

I am doing a project on a web-enabled database. I have created the database file on my PC.
Now when I just want to open the .mdf file of the database I created, I cannot open it on another PC. I even copied the .ldf file, i.e. the log file, to that PC.
Since I need to transfer the database to the server later, I don't know how I will deploy the database on that server from my PC so that the company can use it.
The basics of using an .mdf file are as follows:
Create a new database using SQL Server (set the path for the file as you wish).
If you wish to move the file elsewhere:
detach the database from your server
copy/move the files to wherever you wish
attach the files as a database in SQL Server (see the T-SQL sketch below)
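A minimal T-SQL sketch of that detach/attach cycle, with placeholder database name and file paths:

-- on the source server: detach the database so its files can be copied
USE master;
EXEC sp_detach_db @dbname = N'MyDatabase';

-- after copying the .mdf and .ldf to the target machine, attach them there:
CREATE DATABASE MyDatabase
ON (FILENAME = N'C:\Data\MyDatabase.mdf'),
   (FILENAME = N'C:\Data\MyDatabase_log.ldf')
FOR ATTACH;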
You are not supposed to open the file by double-clicking, as the MIME settings or file-type associations might not be present on the target machine.
Why don't you use the proper method to copy/move the database?