I am not able to import a CSV file into pgAdmin. When I click the Import/Export Data button, an error saying 'Utility not found' appears and tells me to correct the Binary Path in the Preferences dialog.
I am using PostgreSQL 15, and my OS is Windows 11.
I went to the Preferences dialog and entered "C:\Program Files\PosgresSQL\15\bin" as the binary path so that I would be able to import the CSV file, but I got the error message: "'C:\Program Files\PosgresSQL\15\bin\psql.exe' file not found. Please correct the Binary Path in the Preferences dialog."
How can I address the issue?
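As a quick diagnostic (a minimal sketch in Python; the second path is the installer's default location and is an assumption here), you can check which directory actually contains psql.exe. Note that the path entered above spells "PosgresSQL", while the default install directory is spelled "PostgreSQL":

from pathlib import Path

# Minimal check: which of these directories actually contains psql.exe?
# The first is the path entered above; the second is the default
# installer location (an assumption; adjust if installed elsewhere).
for base in (r"C:\Program Files\PosgresSQL\15\bin",
             r"C:\Program Files\PostgreSQL\15\bin"):
    exe = Path(base) / "psql.exe"
    print(exe, "->", "found" if exe.exists() else "not found")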
I want to upload a file to automate a process with Selenium IDE from Chrome. I already gave the extension the permissions; however, I still get the following error.
The way I'm loading the file is as follows:
Command: type
Target: id=txtFile
Value: C:\fakepath\factura.xls
Failed:
Unable to upload files due to cross origin frames in the page
https://i.stack.imgur.com/LNUqH.png
It's because of the fakepath.
You need to insert the direct path.
You can look it up by left-clicking the document in Windows Explorer; the path should then be visible.
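For example, the command would then look like this (the folder below is hypothetical; use the real location of factura.xls on your machine):

Command: type
Target: id=txtFile
Value: C:\Users\yourname\Documents\factura.xls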
With the above I get the error message below in the console:
Can't open file 'ini=false': the system cannot find the file specified.
It's trying to open the filename 'ini=false'!?
ini=false is the default value. The documentation you referenced provides several examples: ini='file.ini' will create file.ini in the default system-dependent location, ini='./file.ini' will create file.ini in the ZeroBrane Studio folder, and ini='d:/file.ini' will create file.ini in the root directory of the D: drive.
I don't get any error messages in the console when I run it with zbstudio.exe -cfg "ini=false".
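For reference, the documented file variants from above can be passed on the command line in the same quoted form (the drive and file name are illustrative):

zbstudio.exe -cfg "ini='d:/file.ini'"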
I'm on a Windows 10 box using Oracle Essbase Admin Services console 11.1.2.4. When I load data I get an error. My problem is that the dataload.err file will not write to my c:\ drive or any network drive to which I have access. If I manually create the dataload.err file in Windows Explorer and run the load data process, I can watch the dataload.err file disappear. I've tried a different file name and different directories, but it won't save.
Why won't the dataload.err file save? Any help would be greatly appreciated.
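One thing worth ruling out is a plain filesystem permission problem. Here is a minimal sketch (in Python, with an illustrative path) to confirm that the account running the console can create a file in the target directory:

import os

# Create a probe file where dataload.err should land and report its
# size; the directory is illustrative, use the one you load into.
target_dir = r"C:\temp"
probe = os.path.join(target_dir, "dataload_probe.err")
with open(probe, "w") as f:
    f.write("write test\n")
print("wrote", probe, "size:", os.path.getsize(probe), "bytes")
os.remove(probe)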
I need to:
1) Let the user select a file from his local PC
2) Upload that file to the Pentaho server
3) Process the file using a Kettle transformation
I tried with a csv data source in Pentaho User Console (PUC) 5.0 but found no way to access it from a .ktr file uploaded to the PUC repository. I also tried to upload the csv file to a folder and was still not able to access it from a .ktr file.
I think this requirement is valid:
Upload a csv data file and a .ktr file to a PUC folder. The .ktr should be able to read the uploaded csv file when it is executed from PUC.
Imagine a simple user with a csv: will he be able to upload the csv file to a Linux host using WinSCP, FileZilla, or another FTP tool?
We need to give our users an easy upload functionality, so after several hours of research (in the Pentaho source code, without a single line of Pentaho documentation) I found this test:
https://github.com/pentaho/pentaho-platform/blob/master/extensions/src/test/java/org/pentaho/platform/plugin/services/importer/PlatformImporterTest.java, which showed me that a mimetype list should exist somewhere.
So after grepping for a few keywords across the whole Pentaho folder, I found this file:
/my_apps/pentaho-server-ce-7.1.0.0-12/pentaho-server/pentaho-solutions/system/ImportHandlerMimeTypeDefinitions.xml
With some intuition, I added this XML:
<ImportHandler class="org.pentaho.platform.plugin.services.importer.RepositoryFileImportFileHandler">
    <MimeTypeDefinitions>
        <MimeTypeDefinition mimeType="text/plain">
            <extension>csv</extension>
        </MimeTypeDefinition>
    </MimeTypeDefinitions>
</ImportHandler>
At the bottom of the file:
<tns:ImportHandlerMimeTypeDefinitions xmlns:tns="http://www.pentaho.com/schema/" .....
    <ImportHandler ../>
    <ImportHandler ../>
    <!-- PUT CSV CONFIG HERE -->
</tns:ImportHandlerMimeTypeDefinitions>
Finally, I restarted my pentaho-server-ce-7.1.0.0-12 server and was able to upload my csv file with these steps:
go to http://localhost:8080/pentaho
click on Browse Files
select some folder
click Upload (right side)
select the csv file and click OK
Reading this csv file from a .ktr is still pending...
I hope this helps
When I try to import a .DMP file via SQL Developer I get this error:
Exception: ORA-31640: unable to open dump file "/home/oracle/Desktop/dump/vahe.DMP" for read
The dump directory and the vahe.dmp file have read and write permissions.
I am using the Database App Development VM.
How can I fix this issue?
Thanks.
Well, I found the problem: it was actually a typo. I had "vahe.DMP" instead of "vahe.dmp" (in lowercase). I think the error message is not a good one; it should clearly say that the file does not exist instead of saying "unable to open dump file for read" (IMHO).
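For anyone hitting the same thing: the Database App Development VM runs Linux, where file names are case-sensitive, so "vahe.DMP" and "vahe.dmp" are different names. A minimal sketch (Python) to list what is actually in the dump directory, using the path from the error above:

from pathlib import Path

# File names on Linux are case-sensitive; list the directory to see
# the exact name of the dump file before passing it to the import.
for f in Path("/home/oracle/Desktop/dump").iterdir():
    print(f.name)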
Thanks to everybody who tried to help me.
I was having the same error while importing a DMP file shared by a colleague:
error "ora-31640 unable to open dump file for read"
Creating a new user with the same name and password that were used while creating the DMP file, and connecting and importing as that user, resolved the error.
I was importing the data using the "Data Pump Import Wizard" on an Oracle 11g R2 server.