I am new to Analysis Services and data cubes. I inherited someone else's project, and I am using BIDS 2005. The company I work for recently relocated my Analysis Services database to another server, let's say from "Server1\tst1" to "Server2\tst1". Now, every time I reopen BIDS and want to deploy my data cube to the new server, I have to go to Project -> Properties -> Deployment and modify the target server value so it deploys to the new location.
How do I change the default deployment target so that I don't have to repeat this every time I open BIDS and deploy?
The read/write attribute on the dwproj.user file had been set to read-only by source control. Once the file was writable again, the target server change made in BIDS was retained each time the application was reopened.
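If anyone else runs into this, a quick way to check and clear the read-only flag from PowerShell is below (the file path is just an example; point it at your own project's .dwproj.user file):

```powershell
# Example path only; substitute your own project's .dwproj.user file.
$userFile = 'C:\Projects\MyCube\MyCube.dwproj.user'

# Check whether source control left the file read-only.
(Get-Item $userFile).IsReadOnly

# Clear the read-only attribute so BIDS can persist the new target server.
Set-ItemProperty -Path $userFile -Name IsReadOnly -Value $false
```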
Using SQL Server Analysis Services 2019 running in Tabular mode, I get this error every time I open an existing Tabular project solution in Visual Studio 2017 (version 15.9.3, though I don't think the version is the issue). Even creating a new Analysis Services Tabular project, closing it, and opening it a second time causes the same error.
An error occurred while opening the model on the workspace database.
Reason: The operation cannot be executed since the database with the name of 'Data Warehouse Tabular_5a21b9d1-2c2e-43e3-9174-981ccddf6f66', ID of 'Data Warehouse Tabular_5a21b9d1-2c2e-43e3-9174-981ccddf6f66' already exists in the detached state in folder '\\?\C:\Program Files\Microsoft SQL Server\MSAS15.MSSQLSERVER\OLAP\Data\Data Warehouse Tabular_5a21b9d1-2c2e-43e3-9174-981ccddf6f66.0.db'. Either attach the database or delete the folder and retry the operation.
This error is described very well here: https://blogs.msdn.microsoft.com/jason_howell/2013/07/22/cannot-reopen-an-analysis-services-tabular-project-the-second-time-error-database-already-exists-in-the-detached-state/
Unfortunately, implementing the suggested fix of making sure that my DataDir was referenced in my AllowedBrowsingFolders setting did not make a difference. Here are my current settings:
Running SystemGetSubdirs 'C:\Program Files\Microsoft SQL Server\MSAS15.MSSQLSERVER\OLAP\Data\' in an MDX connection returns no results. However, running SystemGetSubdirs on the parent OLAP folder does return three of the six folders in that directory (including the Data folder). I have forced the Data folder to inherit the permissions of the OLAP folder and pushed those permissions down to all child objects, and I have tried granting Full Control on the Data folder to 'Everyone', to my own user, and to the SSAS service account. I have also tried creating a new data folder on the root of my C: drive and holding the databases there, but none of this has made a difference. I've restarted the SSAS service after these changes.
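For completeness, the directory check described above can also be run from PowerShell; this is just a sketch using Invoke-ASCmd from the SqlServer module, with the server name as a placeholder:

```powershell
# Sketch only: run the same SystemGetSubdirs check via the SqlServer module.
# Server name and path are placeholders.
Import-Module SqlServer

$dataDir = 'C:\Program Files\Microsoft SQL Server\MSAS15.MSSQLSERVER\OLAP\Data\'

# Empty output here means SSAS still cannot browse the Data folder
# (AllowedBrowsingFolders and/or NTFS permissions are still in the way).
Invoke-ASCmd -Server 'localhost' -Query "CALL SystemGetSubdirs('$dataDir')"
```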
I'm using SSDT to create my Tabular model, and I'm creating a table that I'm partitioning (partitions of two weeks of data, 24 partitions per year); see below.
Usually I'm preparing two years of data partitioned this way (meaning 48 partitions).
When I deploy the model to Analysis Services, I can access it from SSMS by connecting to my Analysis Services instance.
My question is:
I've managed to create an automated script that generates the XMLA for creating the partitions. I execute it in SSMS and can see the partitions being created. However, when I return to SSDT and open the solution, these partitions are not reflected there. Is there a way to "force" SSDT to read the metadata from the Analysis Services instance when the solution is opened again?
Additionally, if I continue developing the model in SSDT, all the changes I made via SSMS will be overwritten the next time I deploy. Is there a way to avoid that?
Creating partitions manually in SSDT can be very painful...
I've managed to create a script that automates it, but not in SSDT.
Any suggestions?
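To give an idea, a partition-creating command of this kind looks roughly like the following sketch (database, table, data source, and query names are placeholders); it can be pasted into an XMLA window in SSMS or, as shown here, sent via Invoke-ASCmd from the SqlServer PowerShell module:

```powershell
# Rough sketch: create one partition via a TMSL "create" command.
# Database, table, data source, query, and server names are placeholders.
$tmsl = @'
{
  "create": {
    "parentObject": { "database": "SalesTabular", "table": "FactSales" },
    "partition": {
      "name": "FactSales 2019-01-01 to 2019-01-14",
      "source": {
        "query": "SELECT * FROM dbo.FactSales WHERE SaleDate >= '2019-01-01' AND SaleDate < '2019-01-15'",
        "dataSource": "SqlDataSource"
      }
    }
  }
}
'@

Import-Module SqlServer
Invoke-ASCmd -Server 'localhost\tabular' -Query $tmsl
```

The automation simply emits one such command per two-week date range.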
As userfl89 already pointed out, any partitions that you create in SSMS need to be "backported" into your SSDT project, for example by using the "Import From Server (Tabular)" option when creating a new project. Otherwise, you risk losing the partitions (and the data contained in them) when deploying from SSDT.
Alternatively, you can use BISM Normalizer - a plugin for Visual Studio - to merge changes (such as partitions) back and forth between SSDT and the deployed database.
There's also the Analysis Services Deployment Wizard, which takes the contents of your project's \bin\ folder and lets you deploy to a database, specifying that you don't want to overwrite existing partitions.
Lastly, if you haven't already, I would recommend taking a look at Tabular Editor. It's an alternative to SSDT for developing the model, so there will be some learning involved of course, but the good news is that you can do partial deployments, in order to avoid affecting the partitions on the already deployed database.
The database that you're accessing in SSDT is your workspace database. The workspace database is essentially a local copy of the tabular model. The partitions you added to the model in SSMS were created; the workspace database is just out of sync. You can overwrite your workspace database with the current version of the model by deleting/moving the files used by your local SSAS project, then creating a new Analysis Services project in SSDT using the "Import From Server (Tabular)" option and selecting the current version of the tabular model. This will create a new workspace database from the current version of the model. When doing this, make sure that the files you delete or move are the ones for your local project, not for the actual model. If you need to verify the location of the files used by the model, the DataDir property of the SSAS instance in SSMS shows this file path.
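If you want to check DataDir without opening SSMS, a rough AMO-based sketch would be something like this (the server name is a placeholder, and how the assembly gets loaded may differ depending on which client tools are installed):

```powershell
# Rough sketch: read the instance's DataDir property without opening SSMS.
# Requires the AMO assembly (installed with SSMS / SQL Server client tools);
# the server name is a placeholder.
Add-Type -AssemblyName 'Microsoft.AnalysisServices'
$srv = New-Object Microsoft.AnalysisServices.Server
$srv.Connect('localhost\tabular')

# Folder that holds the deployed databases (including workspace databases).
$srv.ServerProperties['DataDir'].Value

$srv.Disconnect()
```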
I installed Microsoft SQL Server Analysis Services because I need it to run a forecast analysis from Excel using the Data Mining Plug-in.
When I open SQL Server Management Studio and connect to SSAS, I don't know how to create a new database.
When I right-click Databases, there is nothing like Create Database or New Database.
This is the image of my problem:
Well... I solved my problem by reinstalling the suite and selecting all of its options.
I still don't know which installation option, when left unselected, causes this problem, but now I have the New Database option and could finish my job.
Thanks.
I managed to solve this problem by switching Analysis Services from Tabular to Multidimensional mode (a scripted version follows these steps):
Stop SQL Server Analysis Services service
Go to the config folder, e.g. C:\Program Files\Microsoft SQL Server\MSAS15.SQL2019\OLAP\Config
Copy the file msmdsrv.ini to another folder (you can't edit it in place there)
Open the copied file and search for the DeploymentMode XML tag
Set the value to 0 (0 = Multidimensional mode)
Save the file, copy it back to the Config folder
Start the service
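For convenience, here is the same procedure as a rough PowerShell sketch; the service name, paths, and file encoding are assumptions for a named instance called SQL2019, so check yours before running it:

```powershell
# Rough sketch of the steps above for a named instance called "SQL2019".
# The service name, paths, and encoding are assumptions; verify them first.
$service = 'MSOLAP$SQL2019'
$config  = 'C:\Program Files\Microsoft SQL Server\MSAS15.SQL2019\OLAP\Config\msmdsrv.ini'
$work    = Join-Path $env:TEMP 'msmdsrv.ini'

Stop-Service $service

# Edit a copy, since the file can't be edited in place in the Config folder.
Copy-Item $config $work
(Get-Content $work) -replace '<DeploymentMode>\d</DeploymentMode>', '<DeploymentMode>0</DeploymentMode>' |
    Set-Content $work -Encoding Unicode   # msmdsrv.ini is normally Unicode encoded

# Put the modified file back and restart the service.
Copy-Item $work $config -Force
Start-Service $service
```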
You can use SQL Server Management Studio to create a new, empty database on an instance of SQL Server Analysis Services; a scripted XMLA alternative follows the steps below.
To create an Analysis Services database
Connect to an Analysis Services instance.
In Object Explorer, expand the node for the connected Analysis Services instance.
Right-click the Databases node of the Analysis Services instance and select New Database.
In the New Database dialog box, in Database name, type the name of the new database.
In Impersonation, provide impersonation information for the new database.
In Description, type the optional description for the new database.
Click OK.
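Alternatively, if you would rather script it than use the dialog, the same thing can be done by sending an XMLA Create command; the sketch below (database name and server are placeholders) uses Invoke-ASCmd from the SqlServer module:

```powershell
# Rough sketch: create an empty Analysis Services database via XMLA
# instead of the New Database dialog. Database name and server are placeholders.
$xmla = @'
<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ObjectDefinition>
    <Database>
      <ID>ForecastDB</ID>
      <Name>ForecastDB</Name>
    </Database>
  </ObjectDefinition>
</Create>
'@

Import-Module SqlServer
Invoke-ASCmd -Server 'localhost' -Query $xmla
```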
In my cube, partition creation and processing are automated. Now, if I make any change to the cube structure through BIDS, deploying the changes will delete all the partitions that are not defined in the cube project. Is there any way to avoid this?
Could you create a new solution in BIDS by importing from your server? That way any partition definitions will be present along with any new changes you make.
BIDS normally works with local files. Each deployment overwrites the structure on the server with the version you have in your local files.
If you make structural changes on the server independently of BIDS that you want to keep, while also making changes in BIDS, you can bring the current server structure back into local files by selecting File/New/Project/Business Intelligence Projects/Import Analysis Services Database. Make sure you set the project settings as required at the bottom of the dialog before clicking OK.
Another possibility is to work with BIDS in online mode: in this mode, BIDS does not work on local files, but directly on the structure as it is on the server. To use this mode, select File/Open/Analysis Services Database, and select the server and database you want to open. Some menu entries in BIDS change, and each time you hit the "Save" toolbar icon or its menu counterpart, the changes are written directly to the server structure. Note, however, that you will not have a local copy of the database structure in this case, which means that, for example, version controlling your Analysis Services database structure is impossible.
Thanks for reading. I'll try to explain my issue in a detailed format, as the question I'm asking is a bit high-level for my experience level.
I'm using VS2005 and SQL Server 2005 with Reporting Services. All of my reports are built in VS2005. The reports are deployed to folders named "Amort" or "Amort_Test" on the report server, depending on the configuration I choose when I deploy (Production deploys to "Amort", Test deploys to "Amort_Test").
In Reporting Services Report Manager, I have a data source set up called AMORT (and that is the data source in my VS2005 reports). The data source is of type Microsoft SQL Server and the connection string is "Data Source=uslibsql310;Initial Catalog=AMORT_P".
What I'd like to do is have the reports in the "Amort" folder point to a database called AMORT_P on my server (uslibsql310), while the reports in the "Amort_Test" folder point to a database called AMORT_T on the same server (uslibsql310). Obviously, in my current configuration the reports in both folders point to the same AMORT data source, which currently points to AMORT_P.
My initial thought was that I could create a new data source, call it AMORT_Test, and have its connection string be "Data Source=uslibsql310;Initial Catalog=AMORT_T". However, every time I deployed my reports, I'd have to change the data source in VS2005 to read AMORT_Test instead of AMORT and then deploy, which would be a bit of a hassle.
Can anyone think of a more user-friendly solution to this? I'm someone who normally finds the quickest solution and goes with it, but in this case I think there must be a way to set this up so that the reports in one folder know to pick one DB and the reports in another folder know to pick a different DB; my current setup doesn't allow that. I'm not sure where to start in trying to figure this out, as I'm a bit of an RS novice.
You're almost there, I think. If I understood correctly, here's your current setup:
One shared datasource
Reports all use that shared datasource for datasets
Two configurations: test and production, each with its own target folder
What you can do now is set OverwriteDataSources to False. Manual labor is required to set the connection string for deployed reports only:
For initial deployment of reports
When you want/choose to change the connection for deployed reports
This manual labor can be either:
Changing the connection string in the project, temporarily enabling OverwriteDataSources, and re-deploying
Going to the Report Manager web front end and changing the connection string there
However, your default setup would be to deploy reports to both configurations, without having to worry about connecting test reports to production databases and vice versa.
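As a side note, the "change the connection string on the deployed data source" step can itself be scripted against the SSRS 2005 web service. This is only a rough sketch; the report server URL, data source path, and catalog name are assumptions:

```powershell
# Rough sketch: repoint the deployed test data source at the test catalog
# via the SSRS 2005 web service. URL, path, and catalog name are assumptions.
$rs = New-WebServiceProxy -Uri 'http://uslibsql310/ReportServer/ReportService2005.asmx' -UseDefaultCredential

# Build a data source definition in the proxy's generated namespace.
$ns  = $rs.GetType().Namespace
$def = New-Object ("$ns.DataSourceDefinition")
$def.Extension           = 'SQL'
$def.ConnectString       = 'Data Source=uslibsql310;Initial Catalog=AMORT_T'
$def.CredentialRetrieval = 'Integrated'

# '/Amort_Test/AMORT' is the assumed path of the shared data source used by the test reports.
# Note: SetDataSourceContents replaces the whole definition, so set any other fields you rely on.
$rs.SetDataSourceContents('/Amort_Test/AMORT', $def)
```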