MS Project data in SQL Server

I have been given the task of loading all of our company's project data from MS Project into SQL Server so that we can create reports and dashboards from it.
I know you can export a specific project's data into an Access database, but every project must have its own Access database. So my first thought was to create some kind of dynamic Access SSIS connector and ETL, so that the project managers export their projects to Access and the SSIS package takes care of the rest.
Is there a simpler way of doing this?
My company also has a SharePoint infrastructure as well as a SAP BusinessObjects infrastructure. Is there a way I can accomplish this using SharePoint or BusinessObjects?

The simplest but costly way is MS Project Server - it already stores all its data in a SQL Server database, has a precalculated reporting database, and so on. Here is a link: http://technet.microsoft.com/en-us/evalcenter/hh973404.aspx. The latest version can either be installed locally or hosted by Microsoft.
Another option is to build some automation around MS Project itself that loads each project plan, extracts the information you are interested in, and uploads it to your database. There is no big magic in this solution.
The third option is to export projects to XML and then use the XML to upload the data to your SQL Server database. This is also doable through Project automation.
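For the XML route, the load on the SQL Server side can be as simple as shredding the exported file with OPENROWSET and nodes(). Below is a minimal sketch; the file path, the dbo.ProjectTasks staging table and the element names are assumptions, so check them against an actual Project XML export:

    -- Load one exported project file and shred its tasks into a staging table.
    -- File path, table name and element names are assumptions to adapt.
    DECLARE @px XML;

    SELECT @px = CAST(BulkColumn AS XML)
    FROM OPENROWSET(BULK 'C:\Exports\ProjectA.xml', SINGLE_BLOB) AS src;

    WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/project')
    INSERT INTO dbo.ProjectTasks (TaskUid, TaskName, StartDate, FinishDate)
    SELECT
        t.value('(UID)[1]',    'INT'),
        t.value('(Name)[1]',   'NVARCHAR(255)'),
        t.value('(Start)[1]',  'DATETIME'),
        t.value('(Finish)[1]', 'DATETIME')
    FROM @px.nodes('/Project/Tasks/Task') AS x(t);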

After defining your database and column structure in SQL Server, just use Project VBA to A) collect the project and task data into an array, B) set a connection string to your database, then C) send it. I have created several applications around this procedure and it works very well.
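To make that concrete, the SQL Server side might be nothing more than a pair of staging tables for the VBA to write into; the table and column names below are just an illustrative assumption, not part of the procedure described above:

    -- Hypothetical target tables for the VBA upload; trim or extend the
    -- columns to match the Project fields you actually report on.
    CREATE TABLE dbo.Projects (
        ProjectId   INT IDENTITY(1,1) PRIMARY KEY,
        ProjectName NVARCHAR(255) NOT NULL,
        StartDate   DATETIME NULL,
        FinishDate  DATETIME NULL
    );

    CREATE TABLE dbo.Tasks (
        TaskId          INT IDENTITY(1,1) PRIMARY KEY,
        ProjectId       INT NOT NULL REFERENCES dbo.Projects (ProjectId),
        TaskUid         INT NOT NULL,
        TaskName        NVARCHAR(255) NOT NULL,
        StartDate       DATETIME NULL,
        FinishDate      DATETIME NULL,
        PercentComplete TINYINT NULL
    );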

Related

How to write a query procedure that can be sent to other local machines? (Beginner)

I have the database, tables, PKs and FKs, and I need to send it to the other local machine. I tried writing some queries, but it failed.
(Please be polite, I just started :) )
You need to be clearer about your actual goal; the question is very vague. If you are trying to transfer data from one database to another, which is what I assume you mean, I would highly recommend using SQL Server Integration Services. You create SSIS packages in Visual Studio, and they let you write simple queries against your data source and insert the results into the tables of the destination database. You will need to download Visual Studio and install the Integration Services add-on. Hope this helps.
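Conceptually, each data flow in such a package just pairs a source query with a destination insert. A hand-written T-SQL equivalent, assuming a linked server named SOURCESRV and matching table definitions (both of which are illustrative assumptions), would be:

    -- Rough hand-written equivalent of one SSIS data flow: read from the
    -- source server, insert into the destination. SOURCESRV is an assumed
    -- linked server name; the Customers table is only an example.
    INSERT INTO DestDb.dbo.Customers (CustomerId, CustomerName, City)
    SELECT CustomerId, CustomerName, City
    FROM SOURCESRV.SourceDb.dbo.Customers;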

Client reports from a web app

I have a web app with some static reports that runs on SQL Server 2012 Web Edition, and I don't want the report processing to impact the performance of the app.
So I decided to split the database:
Database Master - with all the data. Used for all web app processes, except the reports.
Database X - with all the data for client X. Used only for reports and updated every day.
Database Z - with all the data for client Z. Used only for reports and updated every day.
Database Y - with all the data for client Y. Used only for reports and updated every day.
There is no limit on the number of databases we could have.
What would be the best way?
- Cubes with Analysis Services? I think I can't use this solution with Web Edition.
- Snapshot databases? Maybe on another server?
- Another solution?
Thanks.
I solved my problem using SSIS (Integration Services) with Visual Studio.
I created the Azure SQL databases with the tables that I want to publish to my clients.
In SSIS I created a package for each client, each with its own connection to that client's Azure database. I also created a variable holding the client ID, which is used by the filters (see the sketch after these steps).
I published the project to the Integration Services Catalog and added jobs to execute the packages every midnight.
For my clients I provide read-only Azure credentials, which they use in Excel PowerPivot.
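As a rough sketch, the source query inside each client's data flow filters on that client ID; the "?" placeholder is mapped to the package variable in the OLE DB source, and the table and column names here are assumptions:

    -- Source query for one client's package; the ? parameter is mapped to
    -- the client-ID package variable. Table/column names are assumptions.
    SELECT OrderId, OrderDate, Amount
    FROM dbo.Orders
    WHERE ClientId = ?;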
Here is a link to a good SSIS tutorial. It is worthwhile for anyone who has a similar problem or needs to migrate/extract/transform data. You can also generate Excel sheets with it.
SSIS tutorial with SQL Azure

Automating the Export Data Task

I have a database on one server that I need to copy to another server. I can do this manually using the Export Data task, which is fine for a one-time export, but I would like to speed this up as it is going to be repeated.
The database will always contain the same set of tables; I just need to get a copy of this database, with its tables and their data, from one server to another.
I'd like to create some sort of reusable tool that allows you to specify the source and target database servers and then copies this specific database from one to another. Is this possible?
The Export Data task in SQL 2005 and later uses SQL Server Integration Services (SSIS) under the hood. You can save the package you're already using and run it on a schedule or on demand. You can also edit it (once it is saved) using the Business Intelligence Development Studio (BIDS).
At the end of the Export wizard (on the "Save and Run Package" screen), you can tick the "Save SSIS Package" check-box to store the package either within SQL server or on the file system. The file system is probably simpler.
Once you have the package, you can execute it from the command line using the dtexec utility, or from a SQL Agent job step of the SSIS type.
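For the SQL Agent route, a minimal job definition might look like the sketch below; the job name, package path and schedule are assumptions to adapt:

    -- Minimal SQL Agent job that runs the saved package nightly.
    -- Job name, package path and start time are assumptions.
    USE msdb;

    EXEC dbo.sp_add_job
        @job_name = N'Copy database to reporting server';

    EXEC dbo.sp_add_jobstep
        @job_name  = N'Copy database to reporting server',
        @step_name = N'Run export package',
        @subsystem = N'SSIS',
        @command   = N'/FILE "C:\Packages\ExportData.dtsx"';

    EXEC dbo.sp_add_jobschedule
        @job_name          = N'Copy database to reporting server',
        @name              = N'Nightly',
        @freq_type         = 4,        -- daily
        @freq_interval     = 1,
        @active_start_time = 010000;   -- 01:00

    EXEC dbo.sp_add_jobserver
        @job_name = N'Copy database to reporting server';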
SSIS is too big a subject to cover in full here - there are decent tutorials in SQL Server Books Online if you need more details - alternatively, ask another SO question if you get stuck.

Offline database solution for SQL Server

Here's the task: we have a SQL Server database hosted on our server. What we need to do is create an interface for non-technical users (basically insert/edit forms) and let them install this database locally, since they are located in areas without an internet connection. Then, when they're done using the database, we get the data from them and insert it into our database.
The biggest concern is that it is not trivial for non-IT people to install SQL Server. Can you please advise what solution I should choose? Plain Access would probably work, but I really do not want to mess with it and have data conversion back and forth between engines.
Sync Framework for SQL Server: your application uses a lightweight, embedded SQL Server CE database (no installation, just a couple of DLLs deployed along with your app) and the Sync Framework manages the synchronization with the 'mother ship' SQL Server.
Out of interest, why do they need their own installation? Can't you create a new database on your existing instance?
If you're looking for an easy way to create insert/edit forms on your database, have you considered looking at Microsoft's new LightSwitch product (currently in Beta) or Microsoft's Dynamic Data?

Automatic incremental SQL Script generation for incremental, nightly builds when using Team Build in TFS 2008 and Visual Studio 2008?

Hope that everybody here is OK.
We are using VS 2008 as our development tool and TFS 2008 for version control as well as build automation. Some of our developers use DBPro for database changes and some use SQL Server Management Studio.
I am trying to automate the build for a web application built using C# and VB.NET.
Our scenario is such that we have a central database to which our web application connects.
Whenever we supply our clients with new functionality or a bug fix, we supply them with incremental builds.
The SQL script for each incremental build is checked into source control once the developers have made and tested their changes on our central DB server.
I want to generate a differential script that can be run at the client as an incremental update script. How to go about this is the problem: sometimes our developers forget the database change-sets, and the script in source control is missing an SP or two.
Also, sometimes we need to insert default data into some of the tables - data with strict, fixed values rather than test values. For example, in a table that contains the services provided by the panel, we add a new service name, signature, credentials, service address and so on to the ServiceTable. Besides this, many other tables may hold test data that is not needed.
If we use Data Compare, it will generate a changeset containing both the required data (important for the client to enable certain services) and the test data that was added to the database as a result of our testing of the functionality or bug fix.
Currently I am using SQLSchemaCompareTask (from the Visual Studio 2008 Team Database Professional Power Tools API) in the TFSBuild.proj file of the build definition for TFS 2008.
With SQLSchemaCompareTask, the generated script contains owner/schema prefixes such as [dbo]., which are not desired, because the script fails when run against SQL Server 2000 databases (some of our clients still use SQL Server 2000 as the backend of the application).
Also, default data can't be generated by this process.
To overcome this, I need a solution that can compare databases and generate a script automatically, without the script having to be manually reviewed again before being sent to the client.
Please suggest an effective methodology for this kind of SQL script generation, and whether two different databases should be used, or something else. Is there any toolkit or API that can enable build automation for SQL Server databases?
Thank you all.
Regards
Steve
Try to use SQL Examiner Suite for this:
http://www.sqlaccessories.com/SQL_Examiner_Suite/
The tool compares both schema and data and produces synchronization scripts (or differential scripts). You can automate script creation with the supplied command-line tool.
Rather than collating many individual change-set scripts (and therefore occasionally missing objects), why not use schema compare and data compare to create a single script from your database project, using a database equivalent to your client's as the target? This should create a script tailored to their requirements.
In data compare you can exclude test data records that you don't want pushed to your client by unchecking them in the lower grid.