I have created a directory in Oracle via the CREATE DIRECTORY command. The directory is called temp. I have a file located at:
G:\Documents\SO\Content\102010 Stack Overflow\test.xml
that I want to put in this directory. How can I do this via a SQL statement?
Oracle's directory feature is not meant to enable file transfer from a client to the server. If you want to put a file there for Oracle's use (e.g. as an external table), then the standard way that is done is for the DBA or system administrator to move the file into the appropriate directory on the server, or for them to grant you access via standard file-sharing means.
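For reference, once the administrator has copied the file onto the database server, wiring it up is only a couple of statements. A minimal sketch, assuming a server-side path of /u01/app/oracle/temp and a grantee called app_user (both placeholders):
-- run by a privileged user; assumes the administrator has already copied test.xml
-- to /u01/app/oracle/temp on the *server* (path is illustrative)
CREATE OR REPLACE DIRECTORY temp AS '/u01/app/oracle/temp';
GRANT READ, WRITE ON DIRECTORY temp TO app_user;   -- app_user is a placeholder
-- the file can then be referenced from SQL, e.g. as a BFILE locator
SELECT BFILENAME('TEMP', 'test.xml') FROM dual;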
Related
We have an on-prem server which is also in our backup solution.
We were trying to create a new delegated LDAP permission for a different team by using the Add Directory option within User Directories in the Administration section. We were able to create the directory, but when we deleted it after testing, the configuration in the original directory was removed as well, specifically from the ldap.group.filter.
Now we are trying to retrieve the old config file, but I don't know where it is stored. Is it stored on the server, or does it also reside in the DB? Is there a way to restore it without restoring the entire server?
I am attempting to give access to parquet files on a Gen2 Data Lake container. I have owner RBAC on the container but would prefer to limit access in the container for other users.
My Query is very simple:
SELECT
TOP 100 *
FROM
OPENROWSET(
BULK 'https://aztsworddataaipocacldl.dfs.core.windows.net/pocacl/Top/Sub/part-00006-c62926ba-c530-4ad8-87d1-cf38c67a2da3-c000.snappy.parquet',
FORMAT='PARQUET'
) AS [result]
When I run this I have no problems connecting. I have attempted to add ACL rights onto the files (and of course the containing folders 'Top' and 'Sub').
I've given RWX on the 'Top' folder using Storage Explorer, set as default so that it cascades to the 'Sub' folder and the parquet files as I add them.
When my colleague attempts to run the SQL script, they get the error message: Failed to execute query. Error: File 'https://aztsworddataaipocacldl.dfs.core.windows.net/pocacl/Top/Sub/part-00006-c62926ba-c530-4ad8-87d1-cf38c67a2da3-c000.snappy.parquet' cannot be opened because it does not exist or it is used by another process.
NB: similar results are also experienced in Spark, but with a 403 instead.
SQL on-demand provides a link to the following help file after the error; it suggests:
If your query fails with the error saying 'File cannot be opened because it does not exist or it is used by another process' and you're sure the file exists and is not used by another process, it means SQL on-demand can't access the file. This problem usually happens because your Azure Active Directory identity doesn't have rights to access the file. By default, SQL on-demand tries to access the file using your Azure Active Directory identity. To resolve this issue, you need the proper rights to access the file. The easiest way is to grant yourself the 'Storage Blob Data Contributor' role on the storage account you're trying to query.
I don't wish to grant Storage Blob Data Contributor or Storage Blob Data Reader as this gives access to every file on the container and not just those I want end users to be able to query. We have found the same experience occurs for SSMS connecting to parquet external tables.
So then in parts:
Is this the correct pattern using ACL to grant access, or should I use another method?
Are there settings on the Storage Account or within my query/notebook that I should be enabling to support ACL?*
Has ACL been implemented on Synapse Workspace to date given that we're still in preview?
*I have resisted pasting my entire settings as I really have no idea what is relevant and what entirely irrelevant to this issue but of course can supply.
It would appear that the ACL feature was not working correctly in Preview for Azure Synapse Analytics.
I have now managed to get it to work. At present I see that once Read|Execute is provided on a folder, it allows access to the files contained within that folder and its sub-folders, even when no specific ACL access is provided on a file in a sub-folder. This is not quite what I expected, but it provides enough for me to proceed: giving access only to the Gold folder separates the files I want to let users query from the working files that I want to keep hidden.
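To illustrate that pattern, once a user has Read and Execute (plus default) ACLs on the Gold folder, a wildcard query over it should work for them with no per-file ACL entries. The Gold path below is hypothetical:
SELECT TOP 100 *
FROM OPENROWSET(
    BULK 'https://aztsworddataaipocacldl.dfs.core.windows.net/pocacl/Gold/*.parquet',
    FORMAT = 'PARQUET'
) AS [result];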
When you assign ACL to a folder it's not propagated recursively to all files inside the folder. Only new files inherit from the folder.
You can see this here
Go to Azure Storage Explorer, change the ACL permissions on the root folder, then right-click on your storage and click on "Propagate access control lists".
I want to check all the file and folder permissions in T-SQL.
For example:
Folder name: Root
Items inside the root are File1, File2, and Folder1.
I want the list of users who have permission on these files and folders, retrieved via T-SQL.
To answer your question: yes, it can be done; however, it will require you to open up permissions that are so awful I'll not tell you how.
If you absolutely must do this, then creating an External Access assembly using .NET and calling that is your answer. If you traverse this road, do NOT go the 'Trustworthy' route and bypass security. Create an asymmetric key and a user, and sign your code accordingly.
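If you do go down that road, the signing side looks roughly like the sketch below. Everything here is a placeholder: the DLL path, key, login and database names are assumptions, and the CLR assembly itself (which you would write in .NET to enumerate the folder permissions) is assumed to exist already.
-- sketch only: FileAccessChecker.dll is an assumed, strong-name-signed CLR assembly
USE master;
CREATE ASYMMETRIC KEY FileAccessKey
    FROM EXECUTABLE FILE = 'C:\clr\FileAccessChecker.dll';
CREATE LOGIN FileAccessLogin FROM ASYMMETRIC KEY FileAccessKey;
GRANT EXTERNAL ACCESS ASSEMBLY TO FileAccessLogin;
USE YourDatabase;   -- placeholder database name
CREATE ASSEMBLY FileAccessChecker
    FROM 'C:\clr\FileAccessChecker.dll'
    WITH PERMISSION_SET = EXTERNAL_ACCESS;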
Although NOT recommended, you can use xp_cmdshell to query the underlying OS/file system from within SSMS (SQL Server Management Studio).
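For example, you could shell out to icacls to list who has rights on a folder. Note that xp_cmdshell is disabled by default and needs sysadmin rights to enable, and 'C:\Root' below is just a placeholder path:
-- enable xp_cmdshell (consider disabling it again afterwards)
EXEC sp_configure 'show advanced options', 1;  RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;  RECONFIGURE;
-- list NTFS permissions for the folder and everything beneath it
EXEC master..xp_cmdshell 'icacls "C:\Root" /t';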
If you need to check whether a folder/UNC path is accessible from within SSMS, place a small database backup file (.bak) there, then use a FILELISTONLY restore to simply read it, e.g.:
RESTORE FILELISTONLY FROM DISK = '\\folder_to_check\db.bak' --this only reads the file (without performing the restore operation)
If the above succeeds in reading the .bak file from your <folder_to_check> folder, it means the folder in question is accessible (via T-SQL / from within SSMS).
If not, grant access (such as READ/WRITE on that folder) to the service account that runs your SQL Server instance, which is normally a local system account or an AD service account.
To obtain this service account's name, view the Properties of the SQL Server service in "Windows Services" (services.msc) or "SQL Server Configuration Manager" (SQLServerManager<your_SQLServer_Version_number>.msc); alternatively, you can run the following query:
select * from sys.dm_server_services --lists the account names that run the SQL Server engine service, SQL Agent service, Full-Text Search service, etc.
HTH.
I'm trying to go through multiple .trc files to find out who has been logging into SQL Server over the last few months. I didn't set up the trace, but what I've got are a bunch of .trc files, e.g.:
C:\SQLAuditFile2012322132923.trc,
C:\SQLAuditFile201232131931.trc
etc.
I can load these files into SQL Profiler and look at them individually, but I was hoping for a way to load them all up, so that I can quickly scan them for logins. Either using a filter, or better yet, load them into a SQL Server table and query them.
I tried loading the files into a table using:
use <databasename>
GO
SELECT * INTO trc_table
FROM ::fn_trace_gettable('C:\SQLAuditFile2012322132923.trc', 10);
GO
But when I do this, I get the error message:
File 'C:\SQLAuditFile2012322132923.trc' either does not exist or is not a recognizable trace file. Or there was an error opening the file.
However, I know the file exists and I have the correct name. They also appear to be recognizable, because I can load them into SQL Profiler and view them fine.
Does anybody have an idea why I'm getting this error message? And if this won't work, is there perhaps another way of analyzing these multiple .trc files more easily?
Thanks!
You may be having permissions issues on the root of C:. Try placing the file into a subfolder, e.g. c:\tracefiles\, and ensuring that the SQL Server account has at least explicit read permissions on that folder.
Also try starting simpler, e.g.
SELECT * FROM ::fn_trace_gettable('C:\SQLAuditFile2012322132923.trc', default);
Anyway unless you were explicitly capturing successful login events, I don't know that these trace files are going to contain the information you're looking for... this isn't something SQL Server tracks by default.
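If the traces do turn out to contain login audit events, something along these lines would pull them into a table and keep only the logins; the default second argument tells fn_trace_gettable to read the initial file plus its rollover files, and the dbo.trc_logins table name and c:\tracefiles\ path are just placeholders:
SELECT e.name AS EventName, t.LoginName, t.HostName, t.ApplicationName, t.StartTime
INTO dbo.trc_logins
FROM sys.fn_trace_gettable('C:\tracefiles\SQLAuditFile2012322132923.trc', default) AS t
JOIN sys.trace_events AS e ON t.EventClass = e.trace_event_id
WHERE e.name LIKE 'Audit Login%';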
I had pretty much the same issue and thought I'd copy my solution from Database Administrators.
I ran a SQL trace on a remote server and transferred the trace files to a local directory on my workstation so that I could load the data into a table on my local SQL Server instance for running queries against.
At first I thought the error might be related to permissions, but I ruled this out since I had no problem loading the .trc files directly into SQL Profiler or as a file into SSMS.
After trying a few other ideas, I thought about it a bit more and realised that it was due to permissions after all: the query was being run by the SQL Server process (sqlservr.exe) as the user NT AUTHORITY\NETWORK SERVICE, not my own Windows account.
The solution was to grant Read and Execute permissions to NETWORK SERVICE on the directory that the trace files were stored in, and on the trace files themselves.
You can do this by right-clicking on the directory, going to the Security tab, adding NETWORK SERVICE as a user and then selecting Read & Execute for its permissions (this should automatically also select Read and List folder contents). These file permissions (ACLs) should automatically propagate to the directory contents.
If you prefer to use the command line, you can grant the necessary permissions to the directory, and its contents, by running the following:
icacls C:\Users\anthony\Documents\SQL_traces /t /grant "Network Service:(RX)"
This may be a really simple question, but I'm trying to create the database tables for ACL in CakePHP. I don't have shell access, so I want to simply upload the SQL file through phpMyAdmin.
The CakePHP instruction manual says to use app/config/sql/db_acl.sql, but the most recent download of CakePHP does not have that file. Instead it has app/config/schema/db_acl.php, which obviously can't be uploaded to create the tables.
Is the sql file still available? Is there another way to create those tables without hand typing it all?
Thanks!
The file should be available if you created the project folder using bake.
If it's not, it should be in /cake/console/templates/skel/config/schema/db_acl.sql.
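If you can't find the file anywhere, the schema is small enough to recreate by hand. The following is from memory, so treat the column list as an approximation and verify it against the db_acl schema shipped with your CakePHP version:
-- ACO and ARO trees (same structure); column list recalled from memory, verify before use
CREATE TABLE acos (
  id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
  parent_id INT DEFAULT NULL,
  model VARCHAR(255) DEFAULT NULL,
  foreign_key INT UNSIGNED DEFAULT NULL,
  alias VARCHAR(255) DEFAULT NULL,
  lft INT DEFAULT NULL,
  rght INT DEFAULT NULL
);
CREATE TABLE aros LIKE acos;   -- aros has the same columns as acos
-- join table holding the actual permissions
CREATE TABLE aros_acos (
  id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
  aro_id INT UNSIGNED NOT NULL,
  aco_id INT UNSIGNED NOT NULL,
  _create CHAR(2) NOT NULL DEFAULT '0',
  _read CHAR(2) NOT NULL DEFAULT '0',
  _update CHAR(2) NOT NULL DEFAULT '0',
  _delete CHAR(2) NOT NULL DEFAULT '0'
);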