Connect to Cassandra via Kundera using SSL

We are using Kundera to connect to Cassandra, which is working just fine. Now we have had to move Cassandra to a distant server, which required us to encrypt the connections. There is no problem doing this with the native Cassandra driver; however, we have no clue how to do it with Kundera.
There is a complete lack of documentation. Could anyone help?
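For reference, this is roughly how we connect over SSL with the native DataStax Java driver today (just a sketch: the contact point, port, and trust-store path are placeholders):

// Sketch of our current SSL setup with the native Java driver (3.x).
// The trust store holding the node's certificate is configured through
// standard JSSE system properties; paths and host names are placeholders.
import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Row;
import com.datastax.driver.core.Session;

public class NativeDriverSslExample {
    public static void main(String[] args) {
        System.setProperty("javax.net.ssl.trustStore", "/path/to/truststore.jks");
        System.setProperty("javax.net.ssl.trustStorePassword", "changeit");

        Cluster cluster = Cluster.builder()
                .addContactPoint("cassandra.example.com")
                .withPort(9042)
                .withSSL()  // default SSLOptions built from the JSSE properties above
                .build();
        try {
            Session session = cluster.connect();
            Row row = session.execute("SELECT release_version FROM system.local").one();
            System.out.println("Connected over SSL, Cassandra " + row.getString("release_version"));
        } finally {
            cluster.close();
        }
    }
}

What we are missing is the equivalent set of Kundera persistence-unit properties.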

Related

Could anyone connect Cloud SQL with cloud sql proxy pod

I'm trying to set up a very basic WordPress installation as explained in this document: https://cloud.google.com/kubernetes-engine/docs/tutorials/persistent-disk
And the Cloud SQL proxy is giving me certificate errors:
esonika@cloudshell:~ (esonika)$ k logs wordpress-8d7998ccd-xnfn9 -c cloudsql-proxy
2022/12/30 10:43:38 using credential file for authentication; email=cloudsql-proxy@esonika.iam.gserviceaccount.com
2022/12/30 10:43:38 Listening on 127.0.0.1:3306 for esonika:europe-west9:mysql-wordpress-instance
2022/12/30 10:43:38 Ready for new connections
2022/12/30 10:44:01 New connection for "esonika:europe-west9:mysql-wordpress-instance"
2022/12/30 10:44:02 couldn't connect to "esonika:europe-west9:mysql-wordpress-instance": x509: certificate is valid for 38-968d77ed-a928-4b25-97d3-5451b5f3c670.europe-west9.sql.goog, not esonika:mysql-wordpress-instance
I don't know why a certificate such as "38-968d77ed-a928-4b25-97d3-5451b5f3c670.europe-west9.sql.goog" is created, or where.
I tried resetting the SSL configuration, and it didn't work.
Usually, if you don't explicitly set up an SSL connection on your Cloud SQL instance, the communication with the database is in plain text.
EXCEPT when you create a tunnel with the Cloud SQL proxy. In that case, a secure connection is created and the data is encrypted. The encryption is ensured by an ephemeral certificate created automatically by the proxy.
Here is a doc which might help you in connecting to Cloud SQL from GKE using sidecar pods.
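In other words, the application talks the plain MySQL protocol to the proxy's local listener (127.0.0.1:3306 in your log) and the proxy wraps that traffic in TLS toward the instance. A minimal JVM-side sketch of that idea (the database name and credentials are placeholders, and the tutorial app itself is WordPress/PHP rather than Java):

// Hypothetical JDBC client going through the Cloud SQL proxy sidecar.
// The proxy listens on 127.0.0.1:3306 and handles the encrypted tunnel,
// so the application-side URL needs no SSL options. Requires MySQL
// Connector/J on the classpath; database name and credentials are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;

public class ProxyConnectSketch {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://127.0.0.1:3306/wordpress";
        try (Connection conn = DriverManager.getConnection(url, "wordpress", "secret")) {
            System.out.println("Connected through the local proxy: " + conn.isValid(2));
        }
    }
}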
Thanks. The document doesn't list anything that I haven't tried. I think there is an internal issue with cloud_sql_proxy, which is why I decided to switch Cloud SQL to a private network only; the WordPress pod now connects directly to the Cloud SQL private IP.
I was running into the same issue around the time you posted this question. I also reset the SSL configuration on the DB like you did. My solution was upgrading the proxy from version 1.11 to 1.33.2. It resolved all of the x509 errors. No clue why it suddenly stopped working.

Does snowflake support ssl?

Hi, I want to be able to connect to Snowflake from the JDBC driver with SSL. I have done this many times with other databases by just adding ssl=true (and sometimes other properties) to the connection URL.
Unfortunately, I didn't find this option in the Snowflake documentation. I found that Snowflake supports something that looks like SSL, which they call Key Pair Authentication.
Is that the SSL mode for Snowflake?
Does Snowflake support SSL?
You can set a JDBC connection property ssl to on or off, as seen here.
That's what determines whether the connection will be made via HTTPS or HTTP, going by the implementation here.
I tried it and it worked for me.
Note that if ssl is not set, the value is on by default.
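For illustration, a minimal JDBC sketch that sets the property explicitly (the account host, credentials, and warehouse are placeholders):

// Minimal Snowflake JDBC connection sketch with ssl set explicitly.
// Account host, credentials, and warehouse below are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class SnowflakeSslSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("user", "MY_USER");          // placeholder
        props.put("password", "MY_PASSWORD");  // placeholder
        props.put("warehouse", "MY_WH");       // placeholder
        props.put("ssl", "on");                // "on" is already the default

        String url = "jdbc:snowflake://myaccount.snowflakecomputing.com";
        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}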
I believe Snowflake only allows connections over HTTPS, so it always uses TLS/SSL.
SSL encryption cannot be switched off when connecting to Snowflake. Full Stop.
There is no option in the JDBC or ODBC driver to disable (or enable) SSL. That is why this parameter is not mentioned on Snowflake's documentation pages (neither the generic connection-parameter pages nor the ODBC ones).
Snowflake connections use SSL by default.
And Snowflake uses only HTTPS connections, regardless of whether it is a driver or the GUI.
That is also the reason you do not need to add https:// to your server connection string.
More on this can be found here.
The HTTP connections you might see Snowflake drivers make are there to satisfy OCSP (certificate revocation checking).
If you are trying to test whether Snowflake supports unencrypted connections, and you get the information back that Snowflake does support unencrypted connections, or that it supports TLS 1.0 or TLS 1.1, then you have been testing your proxy server's settings and not snowflakecomputing.com or snowflake.com.
BTW: currently only TLS 1.2 is supported by Snowflake for HTTPS connections.
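If you want to see what is actually negotiated end to end, bypassing whatever proxy the driver is configured with, a small sketch like this (the account host name is a placeholder) prints the protocol the server agrees to:

// Open a raw TLS connection to the Snowflake endpoint and print the negotiated
// protocol and cipher suite. The account host name is a placeholder; if a
// corporate proxy intercepts TLS, you will see its settings instead.
import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;

public class TlsCheck {
    public static void main(String[] args) throws Exception {
        SSLSocketFactory factory = (SSLSocketFactory) SSLSocketFactory.getDefault();
        try (SSLSocket socket =
                 (SSLSocket) factory.createSocket("myaccount.snowflakecomputing.com", 443)) {
            socket.startHandshake();
            System.out.println("Protocol: " + socket.getSession().getProtocol());
            System.out.println("Cipher suite: " + socket.getSession().getCipherSuite());
        }
    }
}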

Apache Airflow unable to establish a connection to a remote host via FTP/SFTP

I am new to Apache Airflow and so far, I have been able to work my way through problems I have encountered.
I have hit a wall now. I need to transfer files to a remote server via sftp. I have not had any luck doing this. So far, I have gotten S3 and Postgres/Redshift connections via their respective hooks to work in various DAGs. I have been able to use the FTPHook with success testing on my local FTP server, but have not been able to figure out how to use SFTP to connect to a remote host.
I am able to connect to the remote host via SFTP with FileZilla, so I know my credentials are correct.
Through Google searching I have found the SFTPOperator, but am not able to figure out how to use it. I have also found FTPSHook, but still I have not been able to get it to work.
I keep getting the error "nodename nor servname provided, or not known" or a general "Operation timed out" in my Airflow logs.
Can someone point me in the right direction? Should I be using the FTPSHook with the SSH or FTP Airflow Conn Type? Or do I need to use the SFTPOperator? I am also confused as to how I am supposed to set up the credentials in my Airflow connections. Do I use the SSH profile or FTP?
If I can provide any more additional info that may help, please let me know.
Cheers!
SFTPOperator uses ssh_hook under the hood to open an SFTP transport channel that serves as the basis for the file transfer. You can either configure ssh_hook yourself or provide a connection id via ssh_conn_id.
# Airflow 1.x import path; in newer Airflow the operator lives in
# airflow.providers.sftp.operators.sftp
from airflow.contrib.operators.sftp_operator import SFTPOperator, SFTPOperation

op = SFTPOperator(
    task_id="test_sftp",
    ssh_conn_id="my_ssh_connection",  # Airflow connection of type SSH
    local_filepath="",                # path to the file on the Airflow worker
    remote_filepath="",               # destination path on the remote host
    operation=SFTPOperation.PUT,      # PUT uploads; use GET to download
    dag=dag,
)

Microsoft Azure VPN WebApp not communicating with external SQL

The problem I have is that we're trying to use our WebApp in Microsoft Azure to connect to an external SQL database (not our own) through a VPN. The SQL database only allows connections from our local IP addresses, which we registered as a network range (for example 176.0.0.0/24).
We are now connected to the same virtual private network, and through our Azure-VM we can now connect to the SQL-Server through SQL Server Management Studio.
Now we want to do the same with a WebApp, but we're not getting through to the server. It gets "Not authenticated" before reaching the SQL-Server (probably the server isn't accepting our IP from the WebApp).
The possible problems I have tried to look into are:
Do we only try to connect through our Outbound IPs?
Is the WebApp not connected to the VPN?
Unfortunately, I have not found any real answers, nor any solutions to my problem. If you have any ideas of how to solve our problem, or maybe know how I could tunnel all of the SQL calls through the VM, the help would be very much appreciated!
A hybrid connection is one option. What you can also do is enable point-to-site in your VPN. Once you do that, you can integrate your web app directly with the VNet and your connections will work (go to your web app -> Settings -> Networking -> VNet Integration).
If your VNet is V1 (the older way of creating virtual networks), enabling point-to-site is very straightforward and can be done through the portal. For V2 VNets you have to do it through PowerShell commands.
Here is a link for the documentation which explains both the options.
https://azure.microsoft.com/en-in/documentation/articles/web-sites-integrate-with-vnet/
There's a way to "tunnel all of the SQL-calls through the VM". You may want to use hybrid connections (cf https://azure.microsoft.com/en-us/documentation/articles/integration-hybrid-connection-overview/).
The principle is to have an agent installed on the VM that can access the database with the correct IP address.
Suppose you can access the SQL DB as mysqldbsrv from the VM. You add a hybrid connection associated with your web app and install the agent on the VM. Then, when you connect to mysqldbsrv from the Web App, you go through the VM.

How to configure openfire to connect mysql cluster

I want a highly available database for my Openfire server, so I have successfully created a MySQL cluster. But I am not able to configure Openfire to use that MySQL cluster.
I have searched for this but was not able to find much, except the following connection string: "jdbc:mysql:loadbalance://" + hosts + "/test".
But this does not seem to work.
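For reference, a rough sketch of the kind of load-balanced URL I mean, which could be tested standalone outside Openfire (SQL-node host names, database name, and credentials are placeholders):

// Standalone check of the Connector/J load-balancing URL, outside Openfire.
// SQL-node host names, database name, and credentials are placeholders;
// requires MySQL Connector/J on the classpath.
import java.sql.Connection;
import java.sql.DriverManager;

public class LoadBalanceCheck {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql:loadbalance://sqlnode1:3306,sqlnode2:3306/openfire";
        try (Connection conn = DriverManager.getConnection(url, "openfireuser", "secret")) {
            System.out.println("Connected: " + conn.isValid(2));
        }
    }
}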
Please help me configure my Openfire server to use the MySQL cluster. Any kind of help or suggestions will be appreciated. Thanks in advance.