We have an on-prem server which is also in our backup solution.
We were trying to create a new delegated LDAP directory for a different team by using the Add Directory option under User Directories in the Administration section. We were able to create the directory, but when we deleted it after testing, the configuration in the original directory was also removed, specifically the ldap.group.filter setting.
Now we are trying to retrieve the old configuration, but I don't know where it is stored. Is it stored on the server, or does it also reside in the DB? Is there a way to restore it without restoring the entire server?
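If this is an Atlassian application (e.g. Jira or Confluence), the user directory settings are held in the application database rather than in a configuration file on disk, so it may be possible to pull just those values out of a database backup instead of restoring the whole server. A hedged sketch, assuming the embedded Crowd schema (the cwd_directory / cwd_directory_attribute table and column names may vary by product and version):

SELECT d.directory_name, a.attribute_name, a.attribute_value
FROM cwd_directory d
JOIN cwd_directory_attribute a ON a.directory_id = d.id
WHERE a.attribute_name = 'ldap.group.filter';

Running this against a restored copy of the database backup would let you read back the old filter value without touching the live server.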
I am attempting to give access to parquet files on a Gen2 Data Lake container. I have the Owner RBAC role on the container but would prefer to limit access within the container for other users.
My query is very simple:
SELECT TOP 100 *
FROM OPENROWSET(
    BULK 'https://aztsworddataaipocacldl.dfs.core.windows.net/pocacl/Top/Sub/part-00006-c62926ba-c530-4ad8-87d1-cf38c67a2da3-c000.snappy.parquet',
    FORMAT = 'PARQUET'
) AS [result]
When I run this I have no problems connecting. I have attempted to add ACL rights onto the files (and of course the containing folders 'Top' and 'Sub').
I've given RWX on the 'Top' folder using Storage Explorer, set as a default ACL so that it cascades to the 'Sub' folder and to the parquet files as I add them.
When my colleague attempts to run the SQL script, they get the error message: Failed to execute query. Error: File 'https://aztsworddataaipocacldl.dfs.core.windows.net/pocacl/Top/Sub/part-00006-c62926ba-c530-4ad8-87d1-cf38c67a2da3-c000.snappy.parquet' cannot be opened because it does not exist or it is used by another process.
NB: similar results are also experienced in Spark, but with a 403 instead.
SQL on-demand provides a link to the following help file after the error; it suggests:
If your query fails with the error saying 'File cannot be opened because it does not exist or it is used by another process' and you're sure both files exist and are not used by another process, it means SQL on-demand can't access the file. This problem usually happens because your Azure Active Directory identity doesn't have rights to access the file. By default, SQL on-demand tries to access the file using your Azure Active Directory identity. To resolve this issue, you need proper rights to access the file. The easiest way is to grant yourself the 'Storage Blob Data Contributor' role on the storage account you're trying to query.
I don't wish to grant Storage Blob Data Contributor or Storage Blob Data Reader, as this gives access to every file in the container and not just those I want end users to be able to query. We have found the same experience occurs for SSMS connecting to parquet external tables.
So then in parts:
Is using ACLs the correct pattern for granting access, or should I use another method?
Are there settings on the Storage Account or within my query/notebook that I should be enabling to support ACL?*
Has ACL been implemented on Synapse Workspace to date given that we're still in preview?
*I have resisted pasting my entire settings, as I really have no idea what is relevant and what is entirely irrelevant to this issue, but of course I can supply them.
It would appear that the ACL feature was not working correctly in Preview for Azure Synapse Analytics.
I have now managed to get it to work. At present I see that once Read|Execute is provided on a folder, it allows access to the files contained within that folder and its subfolders. Access is available even when no specific ACL is set on a file in a subfolder. This is not quite what I expected; however, it provides enough for me to proceed: giving access only to the Gold folder separates the files I want to let users query from the working files that I want to keep hidden.
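For illustration, a minimal sketch (the Gold folder comes from my layout; the wildcard path is an assumption): with Read|Execute on Gold, plus Execute on the container and any parent folders, a colleague's AAD identity can query just that folder:

SELECT TOP 100 *
FROM OPENROWSET(
    BULK 'https://aztsworddataaipocacldl.dfs.core.windows.net/pocacl/Gold/*.parquet',
    FORMAT = 'PARQUET'
) AS [result];

Files outside Gold (the working files) remain inaccessible to that identity because they are not covered by the ACL.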
When you assign an ACL to a folder, it is not propagated recursively to the files already inside the folder. Only new files inherit from the folder's default ACL.
You can fix this in Azure Storage Explorer: change the ACL permissions on the root folder, then right-click the folder and click "Propagate Access Control Lists".
I'm new to Gluu and have a question regarding import/export sync with LDAP (Active Directory). I have a server set up locally and am able to successfully import/sync users from my Active Directory into Gluu locally via LDAP Cache just fine.
My question is: how can I configure Gluu so that any new users I create locally within Gluu, and any AD-imported users whose attributes I update, also get exported/synced back to my Active Directory?
Thanks in advance
You could use Apache Directory Studio and export/import any entry you want to manage. Take a look at this section of the documentation:
https://gluu.org/docs/gluu-server/user-management/local-user-management/#import-people-in-oxtrust
Also, if it's still not clear, consider opening a question on the Gluu support platform.
https://support.gluu.org/
I want to check all file and folder permissions in T-SQL.
For example:
Folder name: Root
Items inside the root are File1, file2, folder1
I want the list of users who have permission on these files and folders, in T-SQL.
To answer your question: yes, it can be done; however, that would require you to open up permissions so awful that I'll not tell you how.
If you absolutely must do this, then creating an External Access assembly using .NET and calling that is your answer. If you traverse this road, do NOT go the TRUSTWORTHY route and bypass security. Create an asymmetric key and a login from it, and sign your code accordingly.
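If it helps, a minimal sketch of the signing route (the assembly path, key, and login names are hypothetical, and it assumes the assembly was built with a strong-name key):

USE master;
-- Extract the public key from the strong-name-signed DLL (hypothetical path)
CREATE ASYMMETRIC KEY FilePermsKey
    FROM EXECUTABLE FILE = 'C:\Assemblies\FilePermissions.dll';
-- Create a login from that key and grant it the external-access permission
CREATE LOGIN FilePermsLogin FROM ASYMMETRIC KEY FilePermsKey;
GRANT EXTERNAL ACCESS ASSEMBLY TO FilePermsLogin;
-- Then CREATE ASSEMBLY ... WITH PERMISSION_SET = EXTERNAL_ACCESS in your user database.

This keeps the database out of TRUSTWORTHY while still letting the CLR code reach the file system.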
Although NOT recommended, you can use xp_cmdshell to query the underlying OS/file system from within SSMS (SQL Server Management Studio).
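For example (a minimal sketch; the UNC path is a placeholder), after enabling xp_cmdshell you can shell out to icacls to list which accounts have which NTFS permissions:

-- Enable xp_cmdshell (requires sysadmin; consider turning it back off afterwards)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;

-- List NTFS permissions for the folder and everything under it
EXEC xp_cmdshell 'icacls "\\server\share\Root" /T';

The output comes back as rows of text, one per file/folder, showing the users/groups and their rights. Note that this runs under the SQL Server service account, so it reflects what that account can see.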
If you need to check whether a folder/UNC path is accessible from within SSMS, place a small database backup file (.bak) there and then use a FILELISTONLY restore to simply read it, e.g.:
RESTORE FILELISTONLY FROM DISK = '\\folder_to_check\db.bak' --this only reads the file (without performing a restore operation)
If the above succeeds in reading the .bak file from your <folder_to_check> folder, it means the folder in question is accessible (via T-SQL / from within SSMS).
If not, grant access (such as READ/WRITE access on that folder) to the service account that runs your SQL Server instance, which is normally a local system account or an AD service account.
To obtain this service account's name, view the properties of the SQL Server service in "Windows Services" (services.msc) or "SQL Server Configuration Manager" (SQLServerManager<your_SQLServer_Version_number>.msc); alternatively, you can run the following query:
SELECT * FROM sys.dm_server_services --This lists the account names that run the SQL Server engine service, SQL Agent service, Full-Text Search service, etc.
HTH.
I don't want to save the file containing the database password in the Apache webroot, so I moved it to /var/www (the server root) and include it from there.
Is this creating new security issues that weren't there before, or can I leave it there?
The security issue it creates is that anyone who has access to the server can get your DB credentials. Also, depending on the file and folder, there is a possibility that the file can be downloaded (it really depends on the file and the security settings of your web app). The industry-standard solution is to encrypt the credentials in that file.
I have a Windows Forms application that requires users to log in to access the information. I have created a local compact database file for the credentials to be stored. I added the database file to my application folder, but when I open my application and try to log in, it tells me that it cannot find the database file.
Should the file be stored in a different folder, or do I need to install an instance of SQL Server on the user's computer?
This is my first deployment, so I am not sure how to go about it. I have done some research on the subject, but it does not seem related to my issue. The help section of InstallShield was not clear either.
I am looking for some resources on how to accomplish this.
I figured out the issue: in order for it to work, all files, including the database file, need to be placed under the user profile folder.