I am new to Apache Airflow and so far, I have been able to work my way through problems I have encountered.
I have hit a wall now. I need to transfer files to a remote server via SFTP. I have not had any luck doing this. So far, I have gotten S3 and Postgres/Redshift connections via their respective hooks to work in various DAGs. I have been able to use the FTPHook successfully against my local FTP server, but have not been able to figure out how to use SFTP to connect to a remote host.
I am able to connect to the remote host via SFTP with FileZilla, so I know my credentials are correct.
Through Google searching I have found the SFTPOperator, but I am not able to figure out how to use it. I have also found the FTPSHook, but I still have not been able to get it to work.
I keep getting the error "nodename nor servname provided, or not known" or a general "Operation timed out" in my Airflow logs.
Can someone point me in the right direction? Should I be using the FTPSHook with the SSH or FTP Airflow Conn Type? Or do I need to use the SFTPOperator? I am also confused as to how I am supposed to set up the credentials in my Airflow connections. Do I use the SSH profile or FTP?
If I can provide any more additional info that may help, please let me know.
Cheers!
SFTPOperator uses ssh_hook under the hood to open an SFTP transport channel that serves as the basis for the file transfer. You can either configure the ssh_hook yourself or provide a connection id via ssh_conn_id.
from airflow.contrib.operators.sftp_operator import SFTPOperator, SFTPOperation

op = SFTPOperator(
    task_id="test_sftp",
    ssh_conn_id="my_ssh_connection",
    local_filepath="",
    remote_filepath="",
    operation=SFTPOperation.PUT,
    dag=dag,
)
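If you would rather build the hook yourself than rely on ssh_conn_id, you can pass an SSHHook explicitly. Here is a minimal sketch, assuming the contrib modules shipped with Airflow 1.x; the host, username, and key path are placeholders, not values from your setup:

from airflow.contrib.hooks.ssh_hook import SSHHook
from airflow.contrib.operators.sftp_operator import SFTPOperator, SFTPOperation

# Placeholder connection details -- replace with your own host, user, and key or password.
sftp_hook = SSHHook(
    remote_host="sftp.example.com",
    username="my_user",
    key_file="/path/to/id_rsa",  # or password="..."
    port=22,
)

upload = SFTPOperator(
    task_id="upload_file",
    ssh_hook=sftp_hook,              # takes the place of ssh_conn_id
    local_filepath="/tmp/report.csv",
    remote_filepath="/upload/report.csv",
    operation=SFTPOperation.PUT,
    dag=dag,
)

If you go the ssh_conn_id route instead, define the connection in the Airflow UI with the SSH conn type (host, username, and a password or private key); the FTP conn type is what FTPHook/FTPSHook expect, not SFTPOperator.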
I have some standalone NiFi instances running in secured mode. Connecting them to each other via site-to-site works fine, but I am not able to get an instance to connect to itself.
Let's assume the following NiFi instances:
- https://my-nifi-1.local:9443/nifi
- https://my-nifi-2.local:9443/nifi
- https://my-nifi-3.local:9443/nifi
Creating a remote connection between https://my-nifi-1.local:9443/nifi, https://my-nifi-2.local:9443/nifi, and https://my-nifi-3.local:9443/nifi works properly.
But if my flow sends data via a remote connection's output port to a Remote Process Group at the top level of the flow, it does not work.
ErrorMessage: forbidden - Site-to-Site is not secure.
I tried configuring my RPG with https://my-nifi-1.local:9443/nifi as well as https://localhost:9443/nifi.
Thanks for any help.
Found the solution: each instance's certificate CN has to be added as a user, and there are also some policies to apply.
Thanks to Bryan Bende I could figure it out: https://bryanbende.com/development/2016/08/30/apache-nifi-1.0.0-secure-site-to-site
Before asking this question I searched Google and tried different alternatives, none of which were successful for me, sadly. I'm a little above the noob level. What I want is basically to host a WordPress site on a Google Cloud Debian machine.
I was doing well installing services through their SSH access until I got to the point where I installed an FTP service and wanted to access it from a remote computer (my own). I only got as far as this:
Status: Waiting to retry...
Status: Connecting to 104.197.183.19...
Response: fzSftp started
Command: open "root#104.197.183.19" 22
Error: Connection timed out
Error: Could not connect to server
I kept on looking and trying new approaches until I found the gcloud documentation for FTP, but it is not aimed at newcomers, so my questions are:
Where do I input the gcloud commands, on my computer or in the SSH console (the Google Cloud machine)?
Do I need to use gcloud for remote FTP access, or can I do it entirely from my computer against their SSH machine?
Do I really need to add an SSH authorization file to FileZilla, or is there a way I can disable that check on my VPS so it lets me sign in with just a username and password?
What I already tried that didn't work for me:
The gcloud documentation for SSH and FTP
The Google Cloud documentation for setting up a WordPress site
Many others
Basically, what I need in short is to access the VPS through FTP so I can continue with my learning. I've been stuck there for two days.
To get access to a user's public area, i.e. public_html:
Go to the account's cPanel area; under Security > SSH Access you can import a key file.
You can use PuTTYgen to make one; you will need both a private and a public key.
Paste the keys into the boxes.
You may get a warning message about the private key; this is OK.
Go to Manage under the public key and authorize it.
Or
Make one using the interface in cPanel and download both keys.
Then, in FileZilla:
Host: IP of server
Protocol: SFTP
Logon Type: Key File
Key File: the PPK you made.
(If you asked cPanel to make the file, select the one that does not end in .pub and FileZilla will convert it to a .ppk file for you.)
After clicking Connect you should be in.
If you still get an error, make sure the SSH port (22) is open in your firewalls, both in Google Cloud (cloud.google.com > Networks) and in the WHM > LFD/CSF plugin.
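If FileZilla still will not connect with the key in place, it can help to test the key outside FileZilla first. Here is a minimal sketch using paramiko; the IP, username, and key path are placeholders, and it assumes an RSA key in OpenSSH format (paramiko cannot read the PuTTY .ppk file directly):

import paramiko

# Placeholder values -- use your server IP, cPanel user, and the OpenSSH-format
# private key (the file without a .pub or .ppk extension).
key = paramiko.RSAKey.from_private_key_file("/path/to/private_key")

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("203.0.113.10", port=22, username="cpanel_user", pkey=key)

sftp = client.open_sftp()
print(sftp.listdir("public_html"))  # should list the user's public area

sftp.close()
client.close()

If this works but FileZilla does not, the problem is the .ppk conversion or the FileZilla site settings rather than the server.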
Use the SSH File Transfer Protocol; there is no need to install an FTP service.
Use WinSCP to connect over SFTP.
The recommended way of transferring files to a Unix-based Google Compute Engine VM is via the gcloud compute copy-files command. For this, please install the Google Cloud SDK. Then, run a command such as the following:
gcloud compute copy-files --zone=<Compute Engine zone> /path/to/local/file.txt <Compute Engine instance name>:/path/to/destination/file.txt
If you'd like to use FileZilla, you'll have to configure it for access. The SSH daemon on Compute Engine VMs is set up for key-based authentication. This forum post indicates how this is possible in FileZilla. The catch is that you need to put your public key on the VM, which can be a little tricky. gcloud compute copy-files and gcloud compute ssh take care of this for you, which is why they are the recommended method.
I am trying to connect to an SFTP server using the Notepad++ plugin NppFTP. However, while connecting to the remote server I always get the error below:
Connecting
[SFTP] Connection failed : Timeout connecting to <IP address>
Unable to connect
Disconnected
Here are the important configuration details set in the NppFTP window, for reference:
Port: 22
Connection Type: SFTP
Authentication: Try password authentication
I tried some of the solutions given in a few other Stack Overflow questions, but to no avail.
I would really appreciate suggestions/pointers to resolve this. If it is related to an SSH private key, I would appreciate the steps for that as well.
Edit: I am able to access the server using another tool, FileZilla, which rules out a problem with access itself.
You need to find out more about where the issue is:
Can you ping the server you are trying to connect to?
ping 166.178.233.70
Did you try connecting with another ftp tool, like FileZilla, that gives more detailed log information?
The possible issues are numerous:
- The FTP server config: set up to exclude connections from certain IP addresses or domains, requiring public/private keys...
- Your local connection/VPN is not passing the connection.
- It may require a
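When the only symptom is a timeout, a quick check of whether port 22 is reachable from your machine at all can narrow things down, independent of NppFTP or FileZilla. A minimal sketch using only the Python standard library; the host is a placeholder:

import socket

HOST = "203.0.113.10"  # placeholder -- put your SFTP server's IP or hostname here
PORT = 22

try:
    # Raises an OSError (timeout, connection refused, ...) if the port is unreachable.
    with socket.create_connection((HOST, PORT), timeout=10) as sock:
        banner = sock.recv(64)  # an SSH server announces itself immediately
        print("Reachable:", banner.decode(errors="replace").strip())
except OSError as exc:
    print("Cannot reach {}:{} -> {}".format(HOST, PORT, exc))

If this times out while FileZilla on the same machine connects fine, check whether FileZilla is configured to use a proxy that NppFTP is not.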
I had the same issue. The problem was solved after I updated NppFTP.
In my case, I changed the Wi-Fi connection and then it worked.
I have an Apache SVN server running on a shared hosting Linux account.
The symptom is that I can connect to the server that is hosting SVN with PuTTY just fine,
but the TortoiseSVN Repository Browser can't connect.
I have tried svn:// and svn+ssh://.
If I try just svn:// I get:
E730060: Unable to connect to a repository at URL
'svn://50.97.138.99:36901 /test' svn: E730060: Can't connect to host
'50.97.138.99': A connection attempt failed because the connected
party did not properly respond after a period of time, or established
connection failed because connected host has failed to respond.
If I try svn+ssh://,
I get repeatedly prompted for the password and end up having to cancel the password dialog several times, and the error is:
Unable to connect to a repository at URL 'svn+ssh://wren.arvixe.com'
To better debug connection problems, remove the -q option from ssh in
the [tunnels] section of your Subversion configuration file. Network
connection closed unexpectedly
I've been working on this for two whole days now, googled my heart out, and am starting to get delirious.
Thanks for any help. I'll be happy to provide more details / do experiments / etc.
I didn't know what details anyone would want to get started, so sorry if there aren't enough details initially.
There seems to be some confusion about how your repository is accessible. You can't simply browse via svn+ssh:// without someone setting it up. Did someone set this up?
I'm surprised you can log onto the server via SSH (using PuTTY) since it's a shared hosting server. That's usually not allowed. Most shared hosting sites don't allow shell access.
You mention an Apache SVN server in your opening statement. Then you try both svn:// and svn+ssh://. There are FOUR separate ways of setting up a Subversion server and accessing Subversion:
Use Apache httpd as the server. To do this, you need Apache httpd to be configured and compiled correctly. You need several Apache plugins such as mod_dav_svn.so and mod_dav.so. Do you have this setup? If you do, you need to access your repository with http:// and not svn:// as you show.
Use svnserve as the server. This is simple to setup. You access your repository with svn:// as you show. Did someone configure svnserve and have it running?
Use svnserve over ssh. This uses svnserve, but integrates with ssh and can use Unix file access. This is very, very tricky to set up, and I have seen extremely few instances of people actually using this. Each user, when they access the repository, fires off their own svnserve process and has direct access to the repository, so you must set up the individual accounts to prevent any shell access. Otherwise users could directly manipulate the repository.
Use direct file access. This is highly discouraged when sharing a repository, since all users must be granted direct read/write access to the repository.
The problem is that it becomes impossible to help you without knowing how Subversion was set up on your system. Did you create the repository? If so, did you start an svnserve process to access it? Did someone else set up your Subversion repository on that system? If someone else set up the repository, you'll have to get the exact directions from them on how to access it.
If you are the one who set up the repository, did you set up Apache or did you set up svnserve? If you set up svnserve, did you use the default port 3690, or another port? Can you determine whether this port is blocked by your firewall? You can use the telnet command to try accessing the port directly:
C> telnet 50.97.138.99 3690
If you can't connect, you'll see an "Unable to connect to remote host" message, and that means either svnserve isn't running, or your firewall is blocking that port. Since you can log into the server, try logging into the server and accessing that repository directly:
$ svn log svn://localhost
If you can access the repo, you have svnserve running but your firewall is blocking it. If you can't access it, svnserve might not be running.
Try that and see what you get.
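One more option: if the telnet client is not available on your machine (it is off by default on recent versions of Windows), the same port check can be done with a few lines of Python. This is just a reachability test; the port below is the svnserve default mentioned above, so substitute your own if you configured a different one (for example the 36901 in your URL):

import socket

SVN_HOST = "50.97.138.99"
SVN_PORT = 3690  # default svnserve port; use 36901 if that is what your URL points at

try:
    with socket.create_connection((SVN_HOST, SVN_PORT), timeout=10) as sock:
        # svnserve sends its protocol greeting as soon as the connection is open.
        print("svnserve answered:", sock.recv(128))
except OSError as exc:
    print("No svnserve reachable on {}:{} -> {}".format(SVN_HOST, SVN_PORT, exc))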
I have been looking around for a solution that implements this, but Google always gives me tutorials on establishing a live chat over an SSH tunnel, not the other way around.
I suspect this can be implemented just using tunnels (if it is possible at all), but I am not sure how.
I am sorry if this has been asked, but after looking through the related questions I have not been able to find one that I can be sure will work for my particular needs (i.e. I cannot create an SSH session directly with gmail.com, etc.). If I am wrong, please just post a link to the applicable question.
If you can establish connections between peers via your IRC channel, then there is a solution.
Don't try to fiddle with IRC itself, but build a solution on top of it.
Use ssh yourself on top of IRC.
I mean: create an SSH/SSL connection to a dummy socket that you use to intercept the data sent by SSH. Transform this data (if necessary) to make it transportable via IRC, and send it to the remote peer via IRC.
On the remote peer, intercept the data and un-transform it before handing it to your SSH/SSL connection listener, then proceed the same way to send the response back.
If the connection is successful, ssh will tell you, and you can start pouring your data through this secure 'channel'.
Your data going via IRC will be safe, because ssh is.
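To make the dummy-socket idea concrete, here is a minimal sketch in Python. The irc_send and irc_recv functions are hypothetical placeholders for whatever IRC client library you use (they are not a real API), and the local port is arbitrary; the point is only to show where SSH's bytes get intercepted and re-encoded:

import base64
import socket
import threading

LISTEN_ADDR = ("127.0.0.1", 2222)  # point the ssh client here: ssh -p 2222 user@127.0.0.1

def irc_send(line):
    # Hypothetical placeholder: send one text line to the remote peer over IRC.
    raise NotImplementedError

def irc_recv():
    # Hypothetical placeholder: block until one text line arrives from the remote peer.
    raise NotImplementedError

def pump_to_irc(conn):
    # Read raw SSH bytes from the local socket and ship them out as base64 lines.
    while True:
        chunk = conn.recv(4096)
        if not chunk:
            break
        irc_send(base64.b64encode(chunk).decode("ascii"))

def pump_from_irc(conn):
    # Decode base64 lines coming back over IRC and feed them to the ssh client.
    while True:
        conn.sendall(base64.b64decode(irc_recv()))

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(LISTEN_ADDR)
server.listen(1)
conn, _ = server.accept()  # the local ssh client has connected
threading.Thread(target=pump_to_irc, args=(conn,), daemon=True).start()
pump_from_irc(conn)        # the mirror image of this script runs on the remote peer

Keep in mind that IRC messages are limited to roughly 512 bytes per line and servers apply flood control, so the chunks have to be split small and the resulting 'tunnel' will be slow; it works as a proof of concept rather than a fast transport.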