cannot write sqlite3 file to new directory - sql

I am attempting to move my learn.db file to a new folder. However, when I enter the necessary command:
.output ./iivri.andre/lxdb/teenager.sql
I received this error message:
sqlite> .output ./iivri.andre/lxdb/teenager.sql
Error: cannot open "./iivri.andre/lxdb/teenager.sql"
Error: cannot write to "./iivri.andre/lxdb/teenager.sql"
How can I fix this?
Cheers,
Andre
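
These two messages usually mean SQLite could not open the file for writing, most often because the target directory does not exist relative to where sqlite3 was started; .output will not create missing directories. A minimal sketch of that case (create the directory first, then dump):

# create the target directory before pointing .output at it
mkdir -p ./iivri.andre/lxdb

sqlite3 learn.db
sqlite> .output ./iivri.andre/lxdb/teenager.sql
sqlite> .dump
sqlite> .output stdout
sqlite> .quit

If the goal is only to move the database file itself rather than export it as SQL, a plain mv learn.db ./iivri.andre/lxdb/ from the shell may be all that is needed.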

Related

SQL Loader Unix - Giving error SQL Loader-500 Unable to open file (ABC_CTL.dat)

We are executing a SQL*Loader command from a shell script on RH Linux. The command passes both the control file and the data file as command-line parameters. The syntax used is given below (sensitive data masked):
sqlldr userid=$connstring control=/local/abc-1.2.3/instances/www.abc.com/apps/int/script/bin/ABC_CONTROL.ctl data=$f log=/local/abc-1.2.3/instances/www.abc.com/apps/int/script/logs/ABC.log bad=/local/abc-1.2.3/instances/www.abc.com/apps/int/script/logs/ABC.bad
The data file name is passed as a dynamic variable in a for loop to process multiple files. The data file extension is *.app and the path is /local/abc-1.2.3/instances/www.abc.com/apps/int/script/input.
We have verified that the $f variable correctly points to the data file, and we have checked the file permissions. We also tried changing the directory paths.
The script still fails with the errors below: **SQL Loader-500 Unable to open file (ABC_CTL.dat)**, SQL Loader-553 File not found, SQL Loader-509 System error: No such file or directory
The same script runs with the exact same syntax on another server. Please suggest any solutions.
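
One thing that stands out is that the error names ABC_CTL.dat, which does not appear on the command line, so it is worth printing and testing every path the loop actually hands to sqlldr on the failing server. A debugging sketch, not the original script; BASE and the *.app glob are assumptions based on the paths in the question:

# print and verify every path before calling sqlldr
BASE=/local/abc-1.2.3/instances/www.abc.com/apps/int/script
CTL=$BASE/bin/ABC_CONTROL.ctl

for f in "$BASE"/input/*.app; do
    echo "control: $CTL"
    echo "data   : $f"
    [ -r "$CTL" ] || { echo "control file not readable on this server"; exit 1; }
    [ -r "$f" ]   || { echo "data file not readable: $f"; continue; }

    sqlldr userid="$connstring" control="$CTL" data="$f" \
           log="$BASE/logs/ABC.log" bad="$BASE/logs/ABC.bad"
done

If these checks pass and sqlldr still reports SQL Loader-500, the name in the error may well come from inside the control file (for example an INFILE clause with a relative path), which could also explain why the same script behaves differently on another server.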

Beeline CLI create txt file command?

I have recently started to use the Beeline CLI to interface with a Hive server.
The problem is that the add file command is failing for me.
I have tried the following:
add FILE[S] 'example.txt';
Which returns this error:
Error: Error while processing statement: null (state=,code=1)
You should remove the quotes from the path, i.e.:
beeline> add file example.txt;
Also be sure that the files you are adding are on the server that Hive is running on.
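
For example, with the quotes removed and a file that actually exists on the HiveServer2 host (the path below is only an illustration):

beeline> add file /tmp/example.txt;
beeline> list files;

If the add succeeded, list files should echo the resource back; if the path only exists on your local machine, the add will still fail because it is resolved on the server side.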

Why can't I attach an MDF file to SQL Server 2012 Express

I've tried using SSMS, but when I click Attach I get a File Not Found error. The file is AdventureWorks2012_Data.mdf and there is no LDF file. I tried executing this as well:
EXEC sp_attach_single_file_db @dbname = 'AdventureWorks2012',
    @physname = N'R:\SqlServer\AdventureWorks2012_Data.mdf'
GO
but I get the same error:
failed with the operating system error 2 (The system cannot find the file specified.).
Try the snippet below and you will be able to attach the database without an .ldf file.
CREATE DATABASE AdventureWorks2012_Data
ON (FILENAME = 'D:\AdventureWorks2012_Data.mdf')
FOR ATTACH_REBUILD_LOG
GO
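
Operating system error 2 means the database engine itself cannot find the file at that path: the path is resolved on the machine where the SQL Server instance runs (and under its service account), not on the machine running SSMS. As a quick check from the server, something like the following sketch can help; .\SQLEXPRESS is an assumed instance name and xp_fileexist is undocumented but widely used:

sqlcmd -S .\SQLEXPRESS -E -Q "EXEC master.dbo.xp_fileexist N'R:\SqlServer\AdventureWorks2012_Data.mdf'"

The "File Exists" column should come back as 1 before either sp_attach_single_file_db or the CREATE DATABASE ... FOR ATTACH_REBUILD_LOG above can work.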

How to execute a Zim Database file from command line?

I have a Zim database file named myfile with the content:
output "hello world"
I want to execute the file from the command line, but when I call zimmu myfile from the shell, I see the following error:
*** Error *** "myfile" is not a know name.
For this to work, 'myfile' must have been created in the 'zim' directory and must be located in zim0001.ws (if compiled).
If it is not compiled, 'myfile' must be located in the database directory.
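
A minimal sketch of the uncompiled case described above, reusing the invocation from the question; /path/to/zimdb is only a placeholder for your actual database directory:

cp myfile /path/to/zimdb/    # the program source has to sit in the database directory
cd /path/to/zimdb
zimmu myfile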

Exception: ORA-31640: unable to open dump file "..." for read

When I try to import a .DMP file via SQL Developer, I get this error:
Exception: ORA-31640: unable to open dump file "/home/oracle/Desktop/dump/vahe.DMP" for read
The dump directory and the vahe.dmp file have read and write permissions.
I am using the Database App Development VM.
How can I fix this issue?
Thanks.
Well, I found the problem: it was a typo. I had typed "vahe.DMP" instead of "vahe.dmp" (lower case). I think the error message is not a good one, because it should clearly say that the file does not exist instead of saying "unable to open dump file '' for read" (IMHO).
Thanks to everybody who tried to help me.
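
Since the server is what reads the dump file, a quick case-sensitive listing on the database host would have caught this; a sketch using the path from the question:

# Linux file names are case-sensitive, so compare the exact case of the
# listed file with what was typed in the import dialog
ls -l /home/oracle/Desktop/dump/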
I was having the same error while importing a DMP file shared by a colleague:
error “ora-31640 unable to open dump file for read”
Creating a new user with the same name and password that was used when creating the DMP file, and then connecting and importing as this user, resolved the error.
I was importing the data using the "Data Pump Import Wizard" on an Oracle 11g R2 server.