Via Documentum Administrator, how can I list the docbrokers configured for the docbase currently being accessed?
Using the DQL tools, execute the following command:
EXECUTE list_targets;
I'm working on Teradata FastLoad scripts, and I'm using the LOGON command to establish a connection with the database.
LOGON DBC_ip/username,password;
But for security purposes I would like to get the password from a vault-like application using a shell script.
Initially, I was trying to create a wrapper shell script that would get the password from the vault and use it in the FastLoad script:
wrapper_shell_script --> fetch_password($password) --> execute FastLoad script using $password.
Example: LOGON DBC_ip/username,$password;
My question:
Is it possible to use external variables in FastLoad scripts? If yes, can it be done using this process?
Could anyone please tell me whether this is possible, or whether there is a better way to implement this.
Let me know if you need more details.
Thank you in advance!!
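One possible approach, since FastLoad itself does not expand shell or environment variables inside a script file: have the wrapper generate the script at run time, so the shell substitutes the password before FastLoad ever sees it. A minimal sketch, assuming a hypothetical fetch_password helper for the vault lookup:

#!/usr/bin/env bash
# Wrapper sketch: pull the password from the vault, then feed fastload a
# script built from a here-document. fetch_password is a hypothetical
# placeholder for your vault client; host/user values come from the question.
set -euo pipefail

DB_PASSWORD="$(fetch_password)"   # hypothetical vault lookup

# The shell expands ${DB_PASSWORD} before fastload reads the script,
# so the LOGON line contains the literal password at run time.
fastload <<EOF
LOGON DBC_ip/username,${DB_PASSWORD};
/* ... the rest of the FastLoad script: DEFINE, BEGIN LOADING, INSERT ... */
LOGOFF;
EOF

Feeding the script via stdin has the side benefit that the password never lands on disk or in the process argument list, only in the wrapper's memory.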
Setup: A pretty standard data export SSIS package (SQL Server 2016 compatible), created in VS2019/Data Tools and deployed using the SSIS Project Deployment model to the Integration Services Catalog of a SQL Server 2016 instance. The package creates files in a network folder before sending the file out via FTP and putting a copy of the file in a Sent folder.
The project requirements include having the package running on a schedule using "default" parameter values, as well as allowing users to manually run the package using "non-default" parameter values from within a stand-alone application.
Current behavior: the package behaves correctly when run from a SQL Server Agent Job that is configured with a SQL proxy and credentials mapped to a domain login with the proper permissions for the network folder.
Problem: the Data Flow task fails to create the file with a "Cannot open the datafile" error when running the package directly using any of the following methods (even when the "current" session is using the same credentials as the SQL Server Credentials/Proxy used by the SQL Server Agent Job):
Using SSMS to right-click on the package and selecting Execute
Using the DTEXEC command-line utility
Using the SSISDB.catalog.start_execution SQL Server stored procedure
As far as I'm aware, these are the only methods capable of starting an SSIS package and changing the package's parameter values. I either need to get one of the latter two methods to work, find another option that allows changing the parameter values while launching the package, or use one of the two techniques I'm aware of (detailed below), each of which would add yet another failure point to the process as well as other potential issues.
Note: If the process is changed to initially create the file on the SQL Server's local hard drive, the Data Flow task succeeds, but the subsequent copy-to-Sent-folder task fails with a very similar permissions error.
Alternative #1: this technique requires creating a new table, loading the parameter values into the table, and changing the package to check the table and potentially set its parameters/variables based on what it finds. The package can then be launched using a SQL Server Agent Job (for which there are multiple methods of manual launch), and if the calling object has correctly populated the table, the package will behave as if its parameters were changed at runtime; otherwise it will run with the default values.
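A minimal sketch of what Alternative #1 could look like, with hypothetical table, package, and parameter names (the package would read and clear this table from an Execute SQL Task at startup):

-- Hypothetical staging table for run-time parameter overrides.
CREATE TABLE dbo.PackageRunParameters (
    PackageName    SYSNAME        NOT NULL,
    ParameterName  SYSNAME        NOT NULL,
    ParameterValue NVARCHAR(4000) NULL,
    RequestedAt    DATETIME2      NOT NULL DEFAULT SYSUTCDATETIME(),
    Consumed       BIT            NOT NULL DEFAULT 0
);

-- The stand-alone application stages non-default values, then starts the job...
INSERT dbo.PackageRunParameters (PackageName, ParameterName, ParameterValue)
VALUES (N'ExportPackage', N'OutputFolder', N'\\fileserver\export\adhoc');

-- ...and the package's startup task claims any pending overrides in one step:
UPDATE dbo.PackageRunParameters
SET Consumed = 1
OUTPUT inserted.ParameterName, inserted.ParameterValue
WHERE PackageName = N'ExportPackage' AND Consumed = 0;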
Alternative #2: Change all folders used by the package to point to folders local to the SQL Server instance and then create a separate scheduled task/application/whatever, with the valid credentials, that would synchronize or move the files to their proper network folders.
"even when the 'current' session is using the same credentials as the SQL Server Credentials/Proxy used by the SQL Server Agent Job"
This is probably because the account is not logged on locally at the SQL Server, so this is a double-hop impersonation scenario and would require Kerberos Constrained Delegation to be configured.
And you are correct in assessing the options. The general solution is to invoke catalog.start_execution from a session running on the SQL Server, and an Agent Job is the simplest built-in way to do this (the others being xp_cmdshell, Service Broker Activation, or SQL CLR).
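For reference, a minimal sketch of the catalog.start_execution route with a parameter override, run from a session on the SQL Server itself (for example, inside an Agent job step); the folder, project, package, and parameter names here are hypothetical:

DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
    @folder_name     = N'ExportFolder',      -- hypothetical
    @project_name    = N'ExportProject',     -- hypothetical
    @package_name    = N'ExportPackage.dtsx',
    @use32bitruntime = 0,
    @reference_id    = NULL,
    @execution_id    = @execution_id OUTPUT;

-- @object_type 30 targets a package parameter; 20 would target a project parameter.
EXEC SSISDB.catalog.set_execution_parameter_value
    @execution_id,
    @object_type     = 30,
    @parameter_name  = N'OutputFolder',      -- hypothetical
    @parameter_value = N'\\fileserver\export';

EXEC SSISDB.catalog.start_execution @execution_id;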
On my Linux server, I have a 'load_app' user account. My plan was to write a generic shell script to load data into BigQuery using the bq tool. gcloud is installed, and I have multiple configurations in my environment. These configurations point to different service and user accounts for different projects. I am trying to set things up such that I can run multiple executions of this script at the same time for different configurations.
I tried setting CLOUDSDK_ACTIVE_CONFIG_NAME= in my script (the config-name value is passed in to the script). This setting should switch the active configuration; however, I am getting the following error:
ERROR: (bq) There was a problem refreshing your current auth tokens: invalid_grant: Bad Request
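That invalid_grant error generally means the stored credentials for that configuration can no longer be refreshed (for example, a revoked key or expired token), so it is worth re-authorizing each configuration first. Separately, for concurrent runs it can help to give each execution fully isolated gcloud state rather than only switching the active configuration name. A hedged sketch, assuming each config directory has already been authorized:

#!/usr/bin/env bash
# Sketch: run one bq load per configuration, isolating each invocation so
# parallel runs do not share cached token/config state. Paths and names
# are hypothetical placeholders.
set -euo pipefail

CONFIG_NAME="$1"      # e.g. "proj-a"
DATASET_TABLE="$2"    # e.g. "mydataset.mytable"
DATA_FILE="$3"

# Select the named configuration for this process only.
export CLOUDSDK_ACTIVE_CONFIG_NAME="$CONFIG_NAME"

# Stronger isolation for concurrent runs: a private config directory per
# configuration, prepared beforehand with e.g.
#   CLOUDSDK_CONFIG=/home/load_app/gcloud/$CONFIG_NAME \
#     gcloud auth activate-service-account --key-file=/path/to/key.json
export CLOUDSDK_CONFIG="/home/load_app/gcloud/$CONFIG_NAME"

bq load --source_format=CSV "$DATASET_TABLE" "$DATA_FILE"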
I am trying to move all the .zip files in a specific folder to another folder. The source folder is located on another server; currently I am using:
EXECUTE xp_cmdshell 'copy \\server1\e$\ETL\*.zip \\server2\e$\ETL\'
GO
This works if I am logged into both servers, but the goal is to automate this process via a SQL Server Agent job. I have tried:
EXECUTE sp_xp_cmdshell_proxy_account 'domain\useracc','pass'
GO
EXECUTE xp_cmdshell 'copy \\server1\e$\ETL\*.zip \\server2\e$\ETL\'
GO
but I am receiving the following error:
An error occurred during the execution of sp_xp_cmdshell_proxy_account. Possible reasons: the provided account was invalid or the '##xp_cmdshell_proxy_account##' credential could not be created. Error code: '0'.
I am also not sure this is the right solution. Please help with how I can achieve this; the files on server1 change in name and quantity every day.
I would strongly advise: do not use xp_cmdshell. It opens up large security holes in your surface area and makes you vulnerable to attack. xp_cmdshell should be disabled!
Instead, if you want to automate this with SQL Server Agent, you have two options. My preference would be to write a simple SSIS package with a File System Task and schedule that package with SQL Server Agent. SSIS is underutilized for this kind of task but is actually pretty good at it.
Alternatively, rewrite your script to use SQL Server Agent CmdExec job steps. This does not require xp_cmdshell to be enabled and reduces the attack surface.
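A hedged sketch of the CmdExec route (the job name is a placeholder; the paths come from the question, and robocopy's /MOV flag makes it a move rather than a copy):

USE msdb;
GO
-- Create a job whose single CmdExec step runs under the Agent service
-- account (or a CmdExec proxy), so that account needs rights on both shares.
EXEC dbo.sp_add_job @job_name = N'Move ETL zips';

EXEC dbo.sp_add_jobstep
    @job_name  = N'Move ETL zips',
    @step_name = N'Move zip files',
    @subsystem = N'CmdExec',
    @command   = N'robocopy \\server1\e$\ETL \\server2\e$\ETL *.zip /MOV';

-- Target the local server so the job can be scheduled and run.
EXEC dbo.sp_add_jobserver @job_name = N'Move ETL zips';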
I found that the following worked for me:
In the command prompt, type services.msc; this opens the list of all services on the server.
In the list of services, look for SQL Server Agent, right-click -> Properties, and go to the Log On tab.
Change the logon to a user with access on both servers, then rewrite your script to use SQL Server Agent CmdExec job steps (thank you, Pete Carter).
I have a deployment package that needs to run against about three different environments.
I want to specify a SQL script to run (source) against each environment's database (destination).
I don't want to specify the connection string in the deploy script because it contains SQL login info.
I would like to be able to read a setting from the destination for the connection string.
Can I mark this as a parameter to be specified when unpackaging the deployment package on the server? If so, how do I use the parameter in the dest:sql="connection string"?
Any suggestions would be great.
Scott Guthrie has a pretty good write-up on this sort of thing here. He specifically mentions changing parameters both in prompts for the admin and in an automated fashion, via the command line, within deployment and/or automation scripts.
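In Web Deploy terms, this is what declareParam (when building the package) and setParam (when installing it) are for. A hedged sketch with placeholder names and connection strings, assuming the dbFullSql provider for the SQL script; the exact parameter kind/scope may need adjusting for your package:

rem Build the package, declaring the destination connection string as a
rem parameter instead of baking it into the package.
msdeploy -verb:sync ^
  -source:dbFullSql="scripts\deploy.sql" ^
  -dest:package="MyApp.Db.zip" ^
  -declareParam:name="DbConnectionString",kind=ProviderPath,scope=dbFullSql,defaultValue="Data Source=.;Integrated Security=True"

rem On each environment, supply that environment's connection string at
rem unpackage time, keeping credentials out of the checked-in deploy script.
msdeploy -verb:sync ^
  -source:package="MyApp.Db.zip" ^
  -dest:auto ^
  -setParam:name="DbConnectionString",value="Data Source=prod-sql;Initial Catalog=AppDb;User Id=app;Password=..."

Keeping the per-environment values in a SetParameters.xml file passed via -setParamFile is the file-based equivalent of -setParam.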