I am a beginner user of SQL Reporting Services. I wrote a few reports. They all use stored procedures on my development SQL server. Now I want to deploy them into production, and repoint the datasets and datasources to my production SQL db.
How best to do this? Can I just do a global change from development_server_name to production_server_name?
Thank you.
You set up your reports to use a shared data source. The data source is published along with the reports. You deploy your data source, then go into the SSRS management URL and modify the newly deployed data source to point to the production server. All subsequent deployments will preserve this change, so you can now deploy modified and new reports at leisure; they will automatically pick up the redirect to the production server once they are on the SSRS site.
Of course, if you used stored procedures, you need to deploy those as well on the production server.
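If it helps, here is a minimal sketch of the kind of re-runnable script you could use for that step; the procedure and table names below are placeholders, not objects from your reports:

-- Drop and recreate so the script can be run repeatedly (works on versions
-- that predate CREATE OR ALTER). dbo.usp_SalesByRegion and dbo.Sales are hypothetical.
IF OBJECT_ID(N'dbo.usp_SalesByRegion', N'P') IS NOT NULL
    DROP PROCEDURE dbo.usp_SalesByRegion;
GO

CREATE PROCEDURE dbo.usp_SalesByRegion
    @RegionID int
AS
BEGIN
    SET NOCOUNT ON;

    SELECT RegionID, SaleDate, Amount
    FROM dbo.Sales
    WHERE RegionID = @RegionID;
END
GO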
If you didn't use shared data sources in the reports, then learn your lesson: go back and change them to use a shared data source, then deploy.
How to generate 1 click deployment for SQL Scripts across environments?
We have a bunch of scripts (ALTER PROCEDURE scripts, INSERT scripts for data configuration) which are deployed periodically. At the moment we deploy these scripts by executing them manually.
Right now we are planning a one-click deployment that will automatically deploy the scripts across the environments.
To my mind, use a SQL Server Database Project in Microsoft Visual Studio. You can develop all database objects (tables, views, functions, stored procedures) and every script, and you can also import the schema from an existing database. You can provide code for post-deployment steps and all of your data configuration (see the sketch below).
some benefits:
version and source control
develop all database objects
easy and automated deployment
and more benefits...
How to: Create a New Database Project
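As a small illustration of the post-deployment data point: a database project lets you add a post-deployment script (for example Script.PostDeployment.sql) that runs after every publish, so it should be written to be safely re-runnable. A minimal sketch, assuming a hypothetical configuration table:

-- dbo.AppSetting and its columns are placeholders for whatever configuration
-- data you need to seed; the IF NOT EXISTS guard keeps the script re-runnable.
IF NOT EXISTS (SELECT 1 FROM dbo.AppSetting WHERE SettingName = N'FiscalYearStart')
BEGIN
    INSERT INTO dbo.AppSetting (SettingName, SettingValue)
    VALUES (N'FiscalYearStart', N'07-01');
END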
I have a SQL Server Reporting Services production environment. I need to duplicate this environment on a totally different machine, including the source data; let's call it the Dev environment. I have backed up and restored both the source database (sourceDataDB) and the report databases (ReportServer and ReportServerTempDB) to this new machine, and reconfigured the new Reporting Services instance to point to the new report database. Everything works, except that when I run a report it pulls data from the original source database instance instead of the newly created one. Of course, I can manually modify the data source information from Report Manager on the new report server. The challenge is that every time ReportServer and ReportServerTempDB get refreshed from the production database, the modified data source gets replaced with the one from the production environment.
I wonder if there is a way to automate the process of modifying the data source information after each database refresh, either from the database end or from Report Manager. I only have one data source that is shared by all the reports. This is SQL Server 2008 R2.
Thanks in advance.
Thanks for reading. I'll try to explain my issue in a detailed format, as the question I'm asking is a bit high-level for my experience level.
I'm using VS2005 and SQL Server 2005 with Reporting Services. All of my reports are built in VS2005. The reports are deployed to folders named "Amort" or "Amort_Test" on the report server depending on the configuration I choose when I deploy (Production deploys to "Amort", Test deploys to "Amort_Test").
In Reporting Services Report Manager, I have a data source set up called AMORT (and that is the data source in my VS2005 reports). The data source is of type Microsoft SQL Server and the connection string is "Data Source=uslibsql310;Initial Catalog=AMORT_P".
What I'd like to do is have the reports in the "Amort" folder point to a database called AMORT_P on my server (uslibsql310), while the reports in the "Amort_Test" folder point to a database called AMORT_T on the same server (uslibsql310). Obviously, in my current configuration the reports in both folders use the same AMORT data source, which currently points to AMORT_P.
My initial thought was that I could create a new data source, call it AMORT_Test, and have its connection string be "Data Source=uslibsql310;Initial Catalog=AMORT_T". However, every time I deploy my reports I'd have to change the data source in VS2005 to read AMORT_Test instead of AMORT and then deploy, which would be a bit of a hassle.
Can anyone think of a more user-friendly solution to this? I'm one who normally finds the quickest solution and goes with it, but in this case I think there must be a way to set this up so that the reports in one folder know to pick one DB and the reports in another folder know to pick a different DB, but my current setup doesn't allow that. I'm not sure where to start in trying to figure this out as I'm a bit of an RS novice.
You're almost there, I think. If I understood correctly, here's your current setup:
One shared datasource
Reports all use that shared datasource for datasets
Two configurations: test and production, each with its own target folder
What you can do now is set OverwriteDataSources to False. Manual work is then only required to set the connection string for deployed reports:
For initial deployment of reports
When you want/choose to change the connection for deployed reports
This manual labor can be either:
Changing the connection string, temporarily enabling OverwriteDataSources, and re-deploying
Going to the report manager web frontend to change the connection string
However, your default workflow would then simply be deploying reports to either configuration, without having to worry about connecting test reports to production databases and vice versa.
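If you want to verify the result after deploying, one option is a read-only query against the ReportServer catalog to see which shared data source each deployed report is bound to. Querying the catalog directly is not officially supported, so treat this purely as a diagnostic; the table and column names below are those of SQL Server 2005/2008-era ReportServer databases:

-- Lists each deployed report and the shared data source it links to.
SELECT r.Path      AS ReportPath,
       ds.Name     AS DataSourceName,
       shared.Path AS SharedDataSourcePath
FROM dbo.Catalog AS r
JOIN dbo.DataSource AS ds
    ON ds.ItemID = r.ItemID
LEFT JOIN dbo.Catalog AS shared
    ON shared.ItemID = ds.Link
WHERE r.Type = 2;  -- 2 = report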
Friends, I have a problem with SQL Server Reporting Services!
I have some reports that show no data. The reports are on a separate server from the server that contains the database, and the data sources of the reports are configured correctly.
The stored procedures do return data when executed directly on the server.
The users have the necessary permissions.
What could the problem be? Thanks.
Also, look at the Reporting Services logs. They all end in .log and begin with Report. I forget where they are on the server, but I'm pretty sure they are under the Program Files directory for MSSQL, just as you would find the logs for SQL Server.
It sounds as though the definition of your data sources in production may be different from that in development. Have you tried redeploying your data sources?
You should be able to access the database on a server separate from the report server, so that all sounds good. When you run the reports in Visual Studio preview mode, do you see any data?
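One way to check the permissions angle from Management Studio is to impersonate the account the report data source uses and run one of the report's stored procedures directly; if it returns no rows that way, the problem is on the database side rather than in SSRS. The login, procedure name, and parameters here are hypothetical placeholders:

-- Requires IMPERSONATE permission on the target login.
EXECUTE AS LOGIN = N'DOMAIN\ReportUser';

EXEC dbo.usp_ReportData @FromDate = '20130101', @ToDate = '20130131';

REVERT;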
Hope that everybody here is OK.
We are using VS 2008 as our development tool and TFS 2008 for version control as well as build automation. Some of our developers use DBPro for database changes and some use SQL Server Management Studio.
I am trying to automate the build for a web application built using C# and VB.NET.
Our scenario is such that we have a central database to which our web application connects.
Whenever we supply our clients with new functionality or a bug fix, we supply them incremental builds.
The SQL script is checked into source control for every incremental build once the developers have made and tested their changes on our central DB server.
I want to generate a differential script that can be run at the client as an incremental update script. Producing it is the problem: sometimes our developers forget the database change sets, and the script in source control is missing a stored procedure or two.
Also, sometimes we need to insert default data into some of the tables that must contain strictly defined values and not test values. For example, in a table that contains the services provided by the panel (ServiceTable), we add a new service name, signature, credentials, service address, and so on. Besides this, many other tables may contain test data that is not needed.
If we use Data Compare, it will generate a change set containing both the required data (important for the client to enable certain services) and the test data that was added to the database as a result of our testing of the functionality or bug fix.
Currently I am using SQLSchemaCompareTask (from the Visual Studio 2008 Team Database Professional Power Tools API) in the TFSBuild.proj file of the build definition for TFS 2008.
The script generated by SQLSchemaCompareTask contains qualifiers like [dbo]., which are not desired because the script fails when run against SQL Server 2000 databases (some of our clients still use SQL Server 2000 as the back end of the application).
Also, default data can't be generated by this process.
To overcome this problem, I have to come up with a solution that can compare databases and automatically generate a script that does not have to be manually reviewed again before being sent to the client.
Please suggest an effective methodology for this kind of SQL script generation, and whether two different databases should be used, or something else. Is there any toolkit or API that can enable build automation for SQL Server databases?
Thank you all.
Regards
Steve
Try to use SQL Examiner Suite for this:
http://www.sqlaccessories.com/SQL_Examiner_Suite/
The tool compares both schema and data and produces synchronization scripts (or differential scripts). You can automate script creation with the supplied command-line tool.
Rather than collating many individual change-set scripts (and therefore occasionally missing objects), why not use Schema Compare and Data Compare to create a single script from your database project, using a database equivalent to your client's as the target? This should create a script tailored to their requirements.
In data compare you can exclude test data records that you don't want pushed to your client by unchecking them in the lower grid.
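Whichever tool you use, the reference data itself can also live in a small hand-maintained script that is safe to re-run and deliberately contains only the rows the client needs. Since some clients are still on SQL Server 2000, the sketch below avoids MERGE; the ServiceTable column names are assumptions for illustration:

-- Update the service row if it already exists, otherwise insert it.
UPDATE dbo.ServiceTable
SET Signature = 'sig-001',
    Credentials = 'svc-account',
    ServiceAddress = 'http://panel/services/invoice'
WHERE ServiceName = 'InvoiceLookup';

IF @@ROWCOUNT = 0
BEGIN
    INSERT INTO dbo.ServiceTable (ServiceName, Signature, Credentials, ServiceAddress)
    VALUES ('InvoiceLookup', 'sig-001', 'svc-account', 'http://panel/services/invoice');
END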