Using an IP address in PROC EXPORT in SAS - file-io

I'm using PROC EXPORT in SAS to export data in xlsx format to a shared folder on my network. When I use the server name in the path of the output file, everything works well. When I switch to the server's IP address, I get the following error:
ERROR: Connect: The Microsoft Access database engine cannot open or write to the file '\\123.12.12.12\PUBLIC\TEST1.xlsx'. It is already opened exclusively by another user, or you need permission to view and write its data.
Here is my code:
proc export
data=WORK.TABLE1
DBMS=EXCEL
outfile="\\123.12.12.12\PUBLIC\TEST1.xlsx"
REPLACE;
SHEET='A';
run;
Do you know if the IP address format is supported by PROC EXPORT? If not, is there another method to export from SAS using an IP address in the path of the output file? I have to use the IP address because the server name changes from time to time and I have a bunch of scheduled SAS projects.
Thank you,
Dan

That error is coming from the file system. SAS is using the file system to get to whatever path you specify, so in Windows, what you have should work.
Possible problems:
1. Someone else has the file open.
2. Another process has locked the file.
3. You (the user SAS runs as) do not have write permission to the file or directory.
Test #3 by running:
data _null_;
file "\\123.12.12.12\PUBLIC\TEST1.txt";
put "Hi";
run;
This will confirm that you have write permission in the directory. If SAS is running on the server, invoke this the same way you invoke your other programs.
If that is successful, then try deleting the XLSX file from Windows. If that fails, you don't have permission or someone has the file open. You will need to debug that.
If it succeeds, then rerun your program. Hopefully it will create the file.

Related

Store Neomutt Mails in External Disk

I have installed Neomutt on Arch Linux using Luke Smith's Mutt-Wizard. It's working fine. I am storing all my emails in my local laptop's ~/.config/mutt/accounts folder which is mentioned in my .muttrc file.
But I have thousands of emails, so I wanted to change the location where the mail is stored. I intend to store it on an external hard disk. But when I put the location of the external disk in my .muttrc, Neomutt gives me this error:
Maildir error: cannot read UIDVALIDITY.
Error: channel joy_deep#gmx.com: near side box INBOX cannot be opened.
Is there any way to configure this?
I got it figured out. I copied the mw file to a new file, mymw, and changed the bash script to point the maildir location at my Nextcloud folder. I made the same change in the .mbsyncrc file. Now it works.
Thanks.

Access denied to run exe file in stored procedure

I want to run an exe file from a stored procedure, but the output is 'Access is denied'.
How can I fix this? My SQL query is below; it is just a test query:
EXEC master..xp_CMDShell 'C:\Users\myo.minlin\Downloads\Firefox Setup Stub 33.1.1.exe'
The reason is that your SQL Server process, unless configured otherwise, does not run under your myo.minlin account. The account that started the process does not have permission on the file you provided. This is a good thing, because the database should not be able to access your private files. There are three (or even more) possible solutions:
1. Move the file to a location the SQL Server account has access to.
2. Make sure the account SQL Server runs under has the right permissions to execute your file.
3. Launch the SQL Server process under the myo.minlin account.
I do not recommend solution 3. Solution 2 is also not recommended if the file is in your home directory.
See Configure Windows Service Accounts and Permissions.
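To see which account each SQL Server service actually runs under, you can query the server itself (this DMV is available from SQL Server 2008 R2 SP1 onward; on older versions, check the Services console instead):

```sql
-- Lists each SQL Server service on this instance and its logon account
SELECT servicename, service_account
FROM sys.dm_server_services;
```

The account shown there is the one that needs permission on the exe, not your own Windows login.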

Error using BCP and hidden path to export data from SQL

When I use an administrative (hidden) share path like \\twn-a110093\s$\SNData1.csv, it does not work.
Error message:
[Microsoft][SQL Native Client] Unable to open BCP host data-file
Perhaps the '$' char is not recognized? How do I fix this problem?
Exec master..xp_cmdshell 'bcp " select * from sfm.dbo.tblSNDataDetail " queryout "\\twn-a110093\s$\SNData1.csv" -c -t, -T -S TWN-SQL-01'
Simply put, you can't BCP data directly to a UNC path: there is a double-hop issue, and you would have to set up constrained delegation between the two servers. What you need to do is first BCP to your local drive, then move/copy the file to the UNC path, which is actually faster than BCP-ing directly to the UNC path even when delegation is set up correctly. Believe it or not, try it.
More likely a security issue:
"The Windows process spawned by xp_cmdshell has the same security rights as the SQL Server service account"
Be sure the SQL Server service account user has access to that path/file.
You are using a UNC path in the BCP command, and you are creating a file directly in the root of the S$ drive, which involves extra security checks. What you can try:
1. Change the path from the drive root to a subfolder, e.g. \\twn-a110093\s$\Test\SNData1.csv.
2. Give the SQL Server service account write permission on the 'Test' folder; if SQL Server runs under NT Service, give the Network Service account write permission instead.
3. If "twn-a110093" is not the server SQL Server runs on, you probably need to give 'Everyone' full permission on the shared folder.
4. If none of that works, first BCP out to a local folder, then robocopy the file to the UNC path; as @Guna said, that is better than writing the file directly to the UNC path.
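A minimal sketch of option 4, assuming a local D:\Temp folder exists and the SQL Server service account can write to it (the local path is illustrative; the bcp command itself is the one from the question):

```sql
-- BCP to a local folder first...
EXEC master..xp_cmdshell 'bcp "select * from sfm.dbo.tblSNDataDetail" queryout "D:\Temp\SNData1.csv" -c -t, -T -S TWN-SQL-01';

-- ...then copy the file to the UNC path
EXEC master..xp_cmdshell 'robocopy D:\Temp \\twn-a110093\s$\Test SNData1.csv';
```

The robocopy step runs under the SQL Server service account, so that account still needs write permission on the share.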

Check who has logged in using SQL Server 2000 trc files

I'm trying to go through multiple .trc files to find out who has been logging into SQL Server over the last few months. I didn't set up the trace, but what I have is a bunch of .trc files, e.g.:
C:\SQLAuditFile2012322132923.trc,
C:\SQLAuditFile201232131931.trc
etc.
I can load these files into SQL Profiler and look at them individually, but I was hoping for a way to load them all up, so that I can quickly scan them for logins. Either using a filter, or better yet, load them into a SQL Server table and query them.
I tried loading the files into a table using:
use <databasename>
GO
SELECT * INTO trc_table
FROM ::fn_trace_gettable('C:\SQLAuditFile2012322132923.trc', 10);
GO
But when I do this, I get the error message:
File 'C:\SQLAuditFile2012322132923.trc' either does not exist or is not a recognizable trace file. Or there was an error opening the file.
However, I know the file exists, and I have the correct name. Also they appear to be recognizable because I can load them up into SQL Profiler and view them fine.
Anybody have an idea why I'm getting this error message, and if this won't work, perhaps another way of analyzing these multiple .trc files more easily?
Thanks!
You may be having permissions issues on the root of C:. Try placing the file into a subfolder, e.g. c:\tracefiles\, and ensuring that the SQL Server account has at least explicit read permissions on that folder.
Also try starting simpler, e.g.
SELECT * FROM ::fn_trace_gettable('C:\SQLAuditFile2012322132923.trc', default);
Anyway unless you were explicitly capturing successful login events, I don't know that these trace files are going to contain the information you're looking for... this isn't something SQL Server tracks by default.
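If the simpler query does succeed, one way to combine several trace files into a single table for querying is to load the first file with SELECT ... INTO and append the rest (the file names are the ones from the question, moved into a c:\tracefiles\ subfolder as suggested above):

```sql
-- Load the first trace file into a new table
SELECT * INTO trc_table
FROM ::fn_trace_gettable('C:\tracefiles\SQLAuditFile2012322132923.trc', default);

-- Append each additional trace file
INSERT INTO trc_table
SELECT * FROM ::fn_trace_gettable('C:\tracefiles\SQLAuditFile201232131931.trc', default);

-- Then scan for login events (EventClass 14 = Audit Login)
SELECT LoginName, HostName, StartTime
FROM trc_table
WHERE EventClass = 14;
```

As noted, this only helps if the trace was actually capturing login events in the first place.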
I had pretty much the same issue and thought I'd copy my solution from
Database Administrators.
I ran an SQL trace on a remote server and transferred the trace files to a
local directory on my workstation so that I load the data into a table on my
local SQL Server instance for running queries against.
At first I thought the error might be related to permissions, but I ruled this
out since I had no problem loading the .trc files directly into SQL Profiler
or as a file into SSMS.
After trying a few other ideas, I thought about it a bit more and realised
that it was due to permissions after all: the query was being run by the SQL
Server process (sqlservr.exe) as the user NT AUTHORITY\NETWORK SERVICE,
not my own Windows account.
The solution was to grant Read and Execute permissions to NETWORK
SERVICE on the directory that the trace files were stored in and the trace
files themselves.
You can do this by right-clicking on the directory, go to the Security
tab, add NETWORK SERVICE as a user and then select Read & Execute for
its Permissions (this should automatically also select Read and
List folder contents). These file permissions (ACLs) should automatically
propagate to the directory contents.
If you prefer to use the command line, you can grant the necessary permissions to
the directory – and its contents – by running the following:
icacls C:\Users\anthony\Documents\SQL_traces /t /grant "Network Service:(RX)"

Connecting to a File share with a flat file source in SSIS2005

I have created an SSIS package in BIDS 2005 that uses a flat file source as input. The file I want to use doesn't exist on my local machine or on the server where the package will be executed. The file is on a file share on another server; however, when I try to use the server path, for example:
\\servername\fileshare$\filename.csv
I get an error message saying:
A valid filename must be selected
Any ideas why this is happening?
It could be a permissions problem. Do you have a valid connection that includes the user ID and password needed to connect to the share?
Is it a problem in the development environment, or only when deployed?