SSL_CTX_load_verify_locations error while trying to connect to Impala from Tableau Desktop - impala

I need help with this error on Cloudera Impala:
[Cloudera][ImpalaODBC] (100) Error from the Impala Thrift API: SSL_CTX_load_verify_locations: error code: 0.
Could anyone help me solve it and explain what the error means?

You need to save a copy of the .pem certificate from the Impala server to the computer running Tableau Desktop.
Download and edit the TDC file to specify the file path to the trusted certificates, and then add the .tdc file to:
Tableau Desktop: the My Tableau Repository\Datasources folder.
Tableau Server for Windows: the Tableau Server data directory under tabsvc\vizqlserver\Datasources. The default path is C:\ProgramData\Tableau\Tableau Server\data\tabsvc\vizqlserver\Datasources
Tableau Server for Linux: the Tableau Server data directory under tabsvc/vizqlserver/Datasources. The default path is /var/opt/tableau/tableau_server/data/tabsvc/vizqlserver/Datasources/
For Tableau Server, any changes must be applied to all nodes running processes that make data source connections (Backgrounder, Data Server, Vizportal, VizQL Server).
The TDC file must be an exact match to its counterpart on Tableau Desktop: the same drive letter, file path, and name for the .pem file.
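For reference, a TDC file for Impala might look like the following sketch. The odbc-connect-string-extras keys (SSL, TrustedCerts) come from the Simba-based Impala ODBC driver, and the certificate path is a placeholder you must replace with the actual location of your saved .pem file:

```xml
<?xml version='1.0' encoding='utf-8' ?>
<connection-customization class='impala' enabled='true' version='8.10'>
  <vendor name='impala' />
  <driver name='impala' />
  <customizations>
    <!-- TrustedCerts path is a placeholder: point it at your saved .pem file -->
    <customization name='odbc-connect-string-extras'
                   value='SSL=1;TrustedCerts=C:\certs\impala_cert.pem' />
  </customizations>
</connection-customization>
```

Remember that this same file, with the same path, must be present on both Tableau Desktop and every relevant Tableau Server node.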

Related

pgAdmin4 web "Utility Not Found"

I'm running pgAdmin4 on an Apache web server, and I'm having this problem when trying to import/export data from a CSV uploaded to the Storage Manager.
The Storage Manager itself works; I can upload and download files.
Utility not found. Could not find the specified server
/pgadmin4/import_export/utility_exists/1 returns this JSON
Things I've tried:
Restarting PostgreSQL and Apache
Setting Apache to allow all CORS origins ("*")
Setting the PostgreSQL 14 binary path in pgAdmin preferences:
/usr/lib/postgresql/14/bin
I can't find any solution to this specific problem; all the solutions I've seen were fixed simply by setting the binary path.
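One common cause of this message, beyond a wrong binary path, is that the user the web server runs as cannot see or execute the utilities at the configured path. As a quick sanity check, a small script like the following (a sketch; the path is the one from the question above) reports whether the binaries pgAdmin shells out to are present and executable:

```python
import os

def check_utilities(bindir, utils=("psql", "pg_dump", "pg_restore")):
    """Return a dict mapping each utility name to whether it exists
    and is executable at the given binary path."""
    return {u: os.access(os.path.join(bindir, u), os.X_OK) for u in utils}

# The binary path configured in pgAdmin's Preferences (from the question above)
print(check_utilities("/usr/lib/postgresql/14/bin"))
```

If any of these come back False when run as the Apache user (e.g. with sudo -u www-data), pgAdmin's import/export can fail with "Utility not found" even though the path in Preferences looks correct.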

-bash: imp: command not found oracle

I have a CentOS Linux machine, and an Oracle server installed on a server at a remote location.
I installed the Oracle client on my CentOS machine using the link:
How to install SQL * PLUS client in linux
Note that after installing the client there was no /network/admin directory and hence no tnsnames.ora file. I have since created the directories and a tnsnames.ora file manually, and I am able to connect to the remote server.
Now when I look in the bin folder I see just three executables:
adrci, genezi and sqlplus.
I can't find imp.
Hence when I try to import the dump file from CentOS to Oracle, I get the error:
-bash: imp: command not found
I am using the following command to import the dump on the Oracle server:
imp 'rdsuser@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(Host=oracledbrds.cwuabchlhlu.us-east-2.rds.amazonaws.com)(Port=1521))(CONNECT_DATA=(SID=oracledb)))'
Kindly help.
The instant client does not include many of the tools from the full client, including imp/exp, their newer Data Pump equivalents, SQL*Loader, etc. See the instant client FAQ, which highlights that it is largely for distribution with your own applications, but can include SQL*Plus - the only tool mentioned.
If you need to use export/import or any other tools then you will need to install the full client, or run them on the server; which might be an issue with AWS. Amazon have an article on importing data into Oracle.
Incidentally, you can put your tnsnames.ora file anywhere as long as you set TNS_ADMIN to point to that location, but you aren't referring to it in your imp command anyway - you're specifying all the connection data inline. If you know the service name, which may be different from the SID (you can run lsnrctl services on the server to find the right value), you can use the 'easy connect' syntax:
sqlplus rdsuser@//oracledbrds.cwuabchlhlu.us-east-2.rds.amazonaws.com:1521/your_service_name
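For completeness, a tnsnames.ora entry matching that connection might look like the following sketch (the ORACLEDB alias is arbitrary, and your_service_name is the placeholder from above):

```
ORACLEDB =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = oracledbrds.cwuabchlhlu.us-east-2.rds.amazonaws.com)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = your_service_name))
  )
```

With TNS_ADMIN pointing at the directory containing this file, you could then connect with just sqlplus rdsuser@ORACLEDB.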

How to get the remote file attributes through FTP using ConnectionKit

I need to get the file attributes of a remote file using the ConnectionKit framework.
What I am trying to do is compare the modification date of the file on the server with the local copy, and download it if the server copy is newer. I've got the CKTransferRecord for the transfer, but this only gives me the remote file path, not the remote file's attributes.
What is the best way to get the remote file's attributes?
Thanks for your help!

Connecting to a File share with a flat file source in SSIS2005

I have created an SSIS package in BIDS 2005 that uses a flat file source as the input. The file I want to use doesn't exist on my local machine, or on the server where the package will be executed. The file exists on a file share on another server; however, when I try to use the server path, for example:
\\servername\fileshare$\filename.csv
I get an error message saying:
A valid filename must be selected
Any ideas why this is happening?
It could be a permissions problem. Do you have a valid connection to the file, with the user ID and password needed to connect to the share?
Is it a problem in the development environment, or only when deployed?

Error opening .mdf file through SQL Server Management Studio Express

I am doing a project on a web-enabled database. I created the database file on my PC.
Now when I try to open the .mdf file of the database I created, I cannot open it on another PC, even though I also copied the .ldf (log) file to that PC.
Since I need to transfer the database to the server later, I don't know how I will load the database onto that server from my PC so that the company can use it.
The basics of using an .mdf file are as follows:
Create a new database using SQL Server (set the path for the file as you wish)
If you wish to move the file elsewhere:
detach the database from your server
copy/move the file to wherever you wish
attach the file as a database in SQL Server
You are not supposed to open the file by double-clicking, as the MIME settings or file-extension associations might not be present on the target machine.
Why don't you use the proper method to copy/move the database?
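The detach/copy/attach steps above can be sketched in T-SQL; the database name and file paths below are placeholders for your own:

```sql
-- On the source machine: detach the database, then copy the .mdf/.ldf files
USE master;
EXEC sp_detach_db @dbname = N'MyWebDb';  -- placeholder database name

-- On the target server: attach the copied files as a database
CREATE DATABASE MyWebDb
    ON (FILENAME = N'C:\Data\MyWebDb.mdf'),
       (FILENAME = N'C:\Data\MyWebDb_log.ldf')
    FOR ATTACH;
```

Alternatively, a backup/restore (BACKUP DATABASE ... / RESTORE DATABASE ...) avoids taking the source database offline while you copy the files.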