Processing cubes automatically and daily on Microsoft Analysis Services - ssas

I've followed the steps at the site http://www.dotnetspider.com/resources/24960-How-Process-SSAS-Cubes-Automatically.aspx
It works in the development phase, but I need to change the target of the cube in the deployment environment.
I opened the package file and edited it manually, but it doesn't work...
I don't know whether it's an authentication problem. My question is: how do I parameterize the target of the cube that I want to process?
Thanks.
Note: I'm not an expert in Analysis Services, but I need to get this job working.

The best way is to base your Analysis Services connection in SSIS on an expression:
Create a variable (e.g. Server, referenced in expressions as @[User::Server]) to hold the name of your Analysis Services server.
Add an expression to your Analysis Services connection, pointing the ServerName property at that variable.
Add a Package Configuration to your package, so you can have different configurations according to where you want to deploy the package.
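For example, if you store that Package Configuration in SQL Server, the configuration wizard creates a table (dbo.[SSIS Configurations] by default) and the per-environment server name is just a row you can update. A minimal sketch, assuming the variable is called Server and the configuration filter is CubeProcessing; adjust the names to whatever your package actually uses:

    -- Hypothetical: point the package's User::Server variable at the
    -- production SSAS instance via a SQL Server package configuration.
    -- Table, filter and server names are illustrative defaults.
    UPDATE dbo.[SSIS Configurations]
    SET    ConfiguredValue     = N'PRODSSAS01'
    WHERE  ConfigurationFilter = N'CubeProcessing'
      AND  PackagePath         = N'\Package.Variables[User::Server].Properties[Value]';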

I did this:
Make a DOMAIN\USER an administrator of SQL Server Analysis Services.
Give the same DOMAIN\USER the fixed server role "sysadmin" on the SQL Server.
Create a new credential in SQL Server with the login data of this DOMAIN\USER.
Create a proxy in SQL Server Agent using the newly created credential, and allow it to run SQL Server Analysis Services Command and SQL Server Analysis Services Query steps.
Create your SQL Server Agent job with a step that executes the cube-processing command, and select the created proxy as the "Run as" account for that step.
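A rough T-SQL sketch of those steps, where the DOMAIN\USER, proxy, job and SSAS database names are placeholders (the XMLA in the job step simply issues a full process of one Analysis Services database):

    USE msdb;

    -- 1. Credential for the DOMAIN\USER that is an SSAS administrator.
    CREATE CREDENTIAL SsasProcessingCred
        WITH IDENTITY = N'DOMAIN\USER', SECRET = N'<password>';

    -- 2. Proxy based on that credential, allowed to run Analysis Services steps.
    EXEC dbo.sp_add_proxy
        @proxy_name = N'SsasProcessingProxy', @credential_name = N'SsasProcessingCred';
    EXEC dbo.sp_grant_proxy_to_subsystem
        @proxy_name = N'SsasProcessingProxy', @subsystem_name = N'ANALYSISCOMMAND';
    EXEC dbo.sp_grant_proxy_to_subsystem
        @proxy_name = N'SsasProcessingProxy', @subsystem_name = N'ANALYSISQUERY';

    -- 3. Daily job with a single SQL Server Analysis Services Command step
    --    that processes the cube database, running under the proxy.
    EXEC dbo.sp_add_job @job_name = N'Process SSAS cube (daily)';
    EXEC dbo.sp_add_jobstep
        @job_name   = N'Process SSAS cube (daily)',
        @step_name  = N'Process cube database',
        @subsystem  = N'ANALYSISCOMMAND',
        @server     = N'PRODSSAS01',              -- target SSAS instance
        @proxy_name = N'SsasProcessingProxy',     -- the "Run as" account
        @command    = N'<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
                          <Process>
                            <Object><DatabaseID>MyCubeDatabase</DatabaseID></Object>
                            <Type>ProcessFull</Type>
                          </Process>
                        </Batch>';
    EXEC dbo.sp_add_jobschedule
        @job_name = N'Process SSAS cube (daily)', @name = N'Daily 2am',
        @freq_type = 4, @freq_interval = 1, @active_start_time = 020000;
    EXEC dbo.sp_add_jobserver @job_name = N'Process SSAS cube (daily)';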

Related

What is the meaning of "SQL server agent service account" in SQL-job?

Could anyone please help me understand this code under SQL Server job steps? One of the steps has this process, and I don't understand its behavior with Type - "Operating System (CmdExec)" and Run as - "SQL Server Agent Service Account".
Also, what is the actual role of the Type and Run as options?
Run as defines the proxy account used to run this step. A proxy account defines the security context in which the job step runs. Each proxy corresponds to a security credential. For example, if you try to execute a copy command with the CmdExec type, you must use a credential (e.g. a Windows user account) that has rights to read the source file and to write to the destination folder (a T-SQL sketch of such a step follows the list below).
Job steps can be different types:
Executable programs and operating system commands.
Transact-SQL statements, including stored procedures and extended stored procedures.
PowerShell scripts.
Microsoft ActiveX scripts.
Replication tasks.
Analysis Services tasks.
Integration Services packages.
Each type is executed differently. T-SQL scripts are sent to the database engine, executable programs (CmdExec) start external programs (e.g. copy to copy files, or DTSRun to run a DTS package outside of SQL Server, as in your example), and so on.
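To make the Run as part concrete, here is a hedged sketch of adding a CmdExec step that copies a file and runs under a proxy. The job name, proxy name and paths are illustrative, and the proxy is assumed to exist already (created from a credential, as in the earlier sketch):

    USE msdb;

    -- The copy runs in the security context of the Windows account behind
    -- FileCopyProxy's credential, not of the SQL Server Agent service account.
    EXEC dbo.sp_add_jobstep
        @job_name   = N'Nightly file transfer',    -- existing job (illustrative)
        @step_name  = N'Copy extract file',
        @subsystem  = N'CmdExec',
        @command    = N'copy "\\SourceServer\Extracts\sales.csv" "D:\Staging\"',
        @proxy_name = N'FileCopyProxy';            -- shown as "Run as" in the step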

SQL Server Service Broker and Linked Servers

I've set up Service Broker on one of our databases to automatically pick up stored procedures in a specific schema (Build) and run them on a daily basis. So far everything has been running fine; however, we now need to access a remote SQL Server that is running SQL Server 2017 and has the latest Machine Learning Services installed.
I've given the service account that runs the primary SQL Server (SQL 2016) access on the remote server, and ensured that Service Broker is executing under the service account rather than the local SQL account (sa).
Whenever we try to access the remote server, we are getting the following error:
Linked servers cannot be used under impersonation without a mapping
for the impersonated login.
I've tried adding an EXECUTE AS to the process, but this doesn't seem to make a difference. I've also verified that it is running under the service account, and it is.
I can get it to work using a mapped login on the linked server, but this isn't ideal, as we don't want to run the SQL Server in mixed authentication mode; that has been flagged as an IT risk.
I've run out of ideas for what I can do here and can't find any other help articles on this same problem. I wanted to roll this process out to more of our warehouse builds, but this is a deal breaker at this stage.
Any help?
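For reference, the error is asking for a login mapping on the linked server for the login being impersonated. A hedged sketch of adding a self-mapping (no SQL login or mixed authentication involved) is below; the server and account names are illustrative, and forwarding Windows credentials across servers this way generally also requires Kerberos delegation to be configured:

    -- Map the impersonated login on the linked server so the call no longer
    -- fails with "no mapping". @useself = 'TRUE' forwards the login's own
    -- Windows credentials instead of a stored SQL login.
    EXEC sp_addlinkedsrvlogin
        @rmtsrvname = N'REMOTE2017',                -- linked server name (illustrative)
        @useself    = N'TRUE',
        @locallogin = N'DOMAIN\SqlServiceAccount';  -- the login being impersonated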

Credentials to deploy SSAS Cube

I am working on a DW with an SSAS cube. During development, my cube is hosted on SQL Server 2008 R2 on a development server (Windows Server 2003).
Now, after the development phase, I need to host the cube on a test server at a remote location to which I do not have access.
What are the possible ways I can host it on that server, keeping in mind that, if needed, I have to re-deploy to the server from BIDS (when a bug arises)?
What credentials will I need (will SQL sysadmin rights do, or is a Windows account in that domain a must)?
Thanks in advance!
In theory you shouldn't worry about it, assuming, of course, that you have someone with access to the destination server.
You should never use BIDS to deploy your cube; it can't deal with partitions or security, for example. Every deployment would overwrite these management settings on the target server.
Instead, you should use the Deployment Wizard to create your script and then send it to the person responsible for the deployment.
If you need more info about the Deployment Wizard, check my answer on this post.

OLAP Cube deployment issues

I'm really new to this, so I am probably making a simple mistake.
I need to make an OLAP cube using a remote database.
After I set up the dimensions and measures and create the cube, I cannot get it to deploy to the local server.
I keep getting the error,
"The project could not be deployed to the 'localhost' server because of the following connectivity problems : A connection cannot be made. Ensure that the server is running. To verify or update the name of the target server, right-click on the project in Solution Explorer, select Project Properties, click on the Deployment tab, and then enter the name of the server."
However, the local SQL Server is running (as far as I can tell), and I have no idea how to go about fixing this. I've tried replacing "localhost" with "." and the IP address, but that hasn't worked either.
Here's the guide I was following:
http://www.mssqltips.com/sqlservertip/1532/build-a-cube-from-an-existing-data-source-using-sql-server-analysis-services/
Maybe the SQL Server isn't really running? How can I check?
Or am I skipping over something important when I try to process the cube?
You need to deploy the cube to an SSAS instance. See here: I have both the SQL Server instance and the SSAS instance (check the icon to see the difference):
You can check whether it is running in the Services console:
If you don't have it, you can install it from the SQL Server installation media.
You need to change the target server name to the remote server's name or IP.
In BIDS, right-click your project in Solution Explorer and open the Properties window. There you can set the server name on the Deployment tab.
Hope it helps...
OLAP cubes will not deploy to a SQL server instance. They must be deployed to an Analysis Services instance. It will be listed as 'SQL Server Analysis Services' in your service list.
This is what solved it for me:
Copy the connection (server name) from SQL Server Management Studio:
Project > Properties > Deployment > Paste:
"localhost" refers to the machine you are running the Business Intelligence Studio from.
You say you want to deploy to a remote server, so replace localhost with the server name or the ip address (e.g. "74.125.237.146")
To deploy a cube, you need a SQL Server instance as the source and an SSAS instance as the destination.
regards,
I got the same error while following that tutorial. I solved it by:
Double Clicking the Project Data Source
In the Impersonation Information Tab, I entered the credentials for the Windows user instead of using the service account.
Make sure you have the proper Server\Instance name instead of localhost as the project's deployment server.
One more reason can be an Express edition of SQL Server: Express does not include Analysis Services; you need Standard edition or higher.
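If you are not sure which edition you have installed, a quick check against the database engine (both are standard server properties):

    -- Returns e.g. "Express Edition" or "Standard Edition (64-bit)".
    SELECT SERVERPROPERTY('Edition')        AS Edition,
           SERVERPROPERTY('ProductVersion') AS ProductVersion;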
In SQL Server Management Studio, go to Registered Servers and click:
the Analysis Services icon -> Local Server Groups
Right-click your server entry and select
Service Control -> Start
It will work ;)

SQL Server agent job account issue

I am using SQL Server 2008. I am confused about which account is used when a SQL Server Agent job runs. My points of confusion are:
SQL Server Agent is a Windows service that we can control from the Windows Services management console; from there we can set the account that runs SQL Server Agent (LocalSystem on my computer);
Can I set an account at the SQL Server Agent job level?
Can I set, for each step, the account that the SQL Server Agent job step will run under?
I have these confusions because three different account levels may be involved, and my concern is what account each step actually runs under; I want to avoid any permission issues (i.e. I want to make sure the account has enough permissions). Any comments or advice? I'd appreciate it if anyone could clarify the three levels of accounts, which have me quite confused.
Thanks in advance,
George
I would typically run the SQL Server Agent jobs under the same account your app uses to access the database.
If that account is too limited in its permissions (which might be a good thing!), I would create a single account for that app and all its SQL jobs (if that's possible) and run all SQL jobs under that account.
You could potentially run each step under a different account, but I wouldn't use that in general (it just makes it really hard to know and understand what is run under which account). Only use it if you have to run a particularly sensitive step that needs a bunch of extra permissions and those permissions are only available to a particular system account or something.
The account under which the SQL Server Agent Windows service runs doesn't have to be the account your job steps run under (a quick way to check which account the service uses is sketched at the end of this answer).
So it boils down to really just two accounts:
one account is needed to run the SQL Server Agent Windows service - this is a Windows account on your machine / server which needs to have enough permissions to run the service, start and stop it - either use LocalSystem, Network Service, or whatever other Windows account you have to run services with
The other account would be the account to run your SQL Server Agent steps under - that's typically a SQL Server account (which could be based on a Windows account), and it needs enough privileges inside SQL Server to do its job, e.g. it needs access to the database objects and all. I would strive to have just one account for each app that runs the SQL Server jobs - makes life a whole lot easier!
Marc
PS: To set the user to run a step under, you need to use the "Advanced" page on the Job step property dialog and select the user from a popup window:
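As an aside, on more recent versions of SQL Server you can check which account the Agent service itself runs under straight from T-SQL; sys.dm_server_services was added after SQL Server 2008, so on 2008 use SQL Server Configuration Manager instead:

    -- Windows service accounts for the Database Engine and SQL Server Agent
    -- services of this instance (requires VIEW SERVER STATE permission).
    SELECT servicename, service_account, status_desc
    FROM   sys.dm_server_services;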
You can create credentials in SQL Server (use Management Studio, under Security). Then create a proxy in SQL Server Agent that uses those credentials, telling it which kinds of job steps the proxy may be used for. You then get the choice to use that proxy in the job step itself (a sketch of how to review which proxies and subsystems are already configured follows below).
So I make accounts for the various SSIS packages to run under, so that I can keep the SQL Server Agent service account low-privilege and use a proxied credential with slightly higher privileges (not admin, though; just enough permission to connect to other systems, including the file system).
Rob
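A hedged sketch of how to review what is already configured, i.e. which proxies exist and which subsystems (CmdExec, SSIS, PowerShell, ...) each one may run; the queries use the msdb catalog tables that back SQL Server Agent proxies:

    USE msdb;

    -- Summary of all Agent proxies and the credentials behind them.
    EXEC dbo.sp_help_proxy;

    -- Which subsystems each proxy is allowed to run.
    SELECT p.name      AS proxy_name,
           s.subsystem AS subsystem_name
    FROM   dbo.sysproxies        AS p
    JOIN   dbo.sysproxysubsystem AS ps ON ps.proxy_id    = p.proxy_id
    JOIN   dbo.syssubsystems     AS s  ON s.subsystem_id = ps.subsystem_id
    ORDER BY p.name, s.subsystem;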