Which of my two approaches is right to update a SQL table from a TFS WorkItemCollection object - sql

I have developed a small application that fetches data from a TFS project WorkItemCollection object and stores it in a SQL table.
Now I want to update the database regularly. What I have done is create a service that builds a new TFS project WorkItemCollection object and updates the database based on two checks: if a defect id is already present in the table, it updates the SQL row with the new values; if not, it inserts a new row.
ISSUE: In one scenario a defect has been moved from one TFS project to another, but the SQL table still holds that id under the old project.
Ways I can think of: add a check in the opposite direction, so that if a defectId in the table is no longer present in the collection, it gets deleted.
OR,
Every time the service runs, delete the old table contents and repopulate the whole table with new, up-to-date data.
Which of these is the better option, or is there a third one?

You can consider creating a server-side plugin for TFS that listens to the work item changed event. Once the event fires, create/update/delete the corresponding record in your SQL table.
If you use TFS 2013 or an earlier version, check "TFS 2013 event handling on work item change" for how to create the server plugin.
If you use TFS 2015 or later, in addition to the TFS 2013 solution above, you can also use Web Hooks to send a JSON representation of an event to any service.
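For the polling-service approach from the question, the upsert plus the reverse delete check can be collapsed into a single T-SQL MERGE statement. This is only a minimal sketch: it assumes the service has first bulk-loaded the current work item collection into a staging table, and dbo.Defects, dbo.DefectStaging and all column names are illustrative.

-- Assumes dbo.DefectStaging has just been loaded with the current
-- work item collection (illustrative schema).
MERGE dbo.Defects AS target
USING dbo.DefectStaging AS source
    ON target.DefectId = source.DefectId
WHEN MATCHED THEN
    UPDATE SET target.Title       = source.Title,
               target.State       = source.State,
               target.ProjectName = source.ProjectName,
               target.ChangedDate = source.ChangedDate
WHEN NOT MATCHED BY TARGET THEN
    INSERT (DefectId, Title, State, ProjectName, ChangedDate)
    VALUES (source.DefectId, source.Title, source.State,
            source.ProjectName, source.ChangedDate)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE; -- removes ids no longer in the collection, e.g. defects moved to another project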

Related

Auto-update reports generated in Report Studio when new rows are added in the database

I am using Cognos 10. I have created a model in Framework Manager (FM) using some tables in DB2, then created a package and published it. I have also built reports on it in Report Studio. Everything went well and the report looks fine.
Now the actual problem: I added some rows to the tables used in this project, and the existing report does not pick them up. I tried refreshing the data source in FM, but it didn't update. So I have had to create a new model, publish a new package, and generate a new report each time I add data to my DB2 tables.
I have seen Tableau reports auto-update when database changes occur. Is this possible with Cognos?
Is there any other way in which I can just do something in Report Studio so that the report automatically reflects the new data in the DB2 tables? That way I wouldn't have to create a new model, package, and report for each DB2 insert.
I ran into the same issue when using a Framework Manager model. The only solution I have found so far is to write SQL directly in Report Studio.
It can be a problem with the local cache. I had this issue before: the new data row was not shown in the report after the insert. I needed to disable the local cache in the Framework Manager model to make it work.

Use SQL Server Management Studio to update a Code First project

I have created a project and added an Entity Data Model using Code First from Existing Database.
Now, I like using SSMS (2008) to make changes such as adding new columns or tables, or changing table or column properties.
Is there a way I can make my changes in SSMS and then have the models and migrations updated accordingly? Or, once you have chosen Code First and created your models, are you stuck with making changes only in code and running Add-Migration and Update-Database from the Package Manager Console?
I also need to make sure that, when I publish the solution, the server copy of the database is updated accordingly.
Cheers in advance,
Kevin.

Adding SharePoint 2010 list items to a list with external items

I have a list in SharePoint 2010 that has external items.
It is easiest to explain the specific scenario.
We have sales orders that have information that is being pulled from our MRP system.
The unique key is the Sales order number.
There are several columns in the list that do not come from MRP that show the status of the order on our production floor.
The way it works now: every time a new sales order is created, the user must go to the SharePoint list, click New Item, type in the SO number, click the check-external-item button, and click OK, which then populates several fields in the list. Then someone out on the floor fills in the rest of the info.
The first part, manually creating the item and resolving the external SO number, is what I would like to automate.
I understand it may need to be a stored procedure or some PowerShell script.
The issue is that, because of the external content type in the list, none of the canned SharePoint tools will let me feed data back into the list.
One option is to create a SQL Server trigger. This trigger would fire whenever a new sales order is created in your source database. You could make it a CLR trigger and, in the trigger function, make use of the SharePoint client interface.
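As a rough sketch of the trigger hook-up only (dbo.SalesOrders is an assumed source table, and dbo.clr_AddSharePointItem stands in for a hypothetical CLR stored procedure that would wrap the SharePoint client API):

-- Illustrative only: table, column, and procedure names are assumptions.
CREATE TRIGGER dbo.trg_SalesOrders_Insert
ON dbo.SalesOrders
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @SalesOrderNumber nvarchar(50);

    -- inserted can hold multiple rows; a production trigger would loop or
    -- use set-based logic. This sketch covers the single-row case.
    SELECT TOP (1) @SalesOrderNumber = SalesOrderNumber FROM inserted;

    -- Hypothetical CLR stored procedure that creates the SharePoint list item.
    EXEC dbo.clr_AddSharePointItem @SalesOrderNumber;
END;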
Now that you have provided a clearer description of the problem: I believe you're looking for something like a list event handler. It will run on the events you care about, and you can pull from the database at the appropriate time.
Essentially, you need to create a Visual Studio SharePoint project (in my past experience this method requires running VS directly on the SharePoint server, or else manually copying a lot of DLLs from the server) and add an event receiver.

Refreshing a table with Visual Studio 2010, Entity Framework 5.0, .Net Framework 4.0 forces you to delete the table and re-add it

I wanted to make a change to a table in the edmx designer since I added a column in our SQL Server DB (Update Model from Database). When the wizard came up to refresh a table, the table that was to be refreshed WAS NOT IN THE LIST (in the Refresh tab), even though it was sitting on the designer.
I had to delete the table, go through the wizard again to find the table under the "Add" tab, and was then able to re-add it with the new column.
The problem with this approach is it's fine if you have one or two tables, but what if you have many tables in your model? You'd have to delete the entire model and re-select each table again to re-create your model.
Am I missing a step here or what?
As far as I can tell, the same thing happens using Visual Studio 2012 with the 4.5 framework.
I had to apply SP 1 to both VS 2010 & 2012 in order to get it to work.

Branching strategy for release based db project

We have an ASP.NET VB.NET 2008 project on TFS 2010. The project has one main branch, and for each release we create a new feature branch, which is what finally gets deployed. After the production deployment we merge the branch back into the main branch.
We are now also adding a database project to manage our SQL. The question is how to version-control the differential scripts. The db project contains all the create scripts, which would be fine if we had to deploy the project from scratch, but the project is already live. So in practice any new release or hotfix will normally contain alter or change scripts.
Any ideas on how best to manage both the create scripts and the per-release change scripts?
The way we have been doing this for years is with database update scripts that can move the database from one specific version to another.
There are two types of update scripts that we apply: table changes and data changes.
The table changes are recorded by hand as they are made and are designed in such a way that the script can be safely run multiple times against the same database without error. For example, we only add a column if it doesn't already exist in the table. This approach allows this version-specific script to be used for applying hotfixes as well as upgrading from one version to the next. The hotfixes are simply applied as additional entries at the end of the file.
This approach requires developer discipline, but when implemented correctly, we have been able to update databases that are 4 major revisions and 4 years out of date to the current version.
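A minimal sketch of one idempotent entry in such a table-change script (table and column names are illustrative):

-- Add the new column only if it is not already there, so the script can be
-- re-run safely against the same database without error.
IF NOT EXISTS (
    SELECT 1
    FROM sys.columns
    WHERE object_id = OBJECT_ID(N'dbo.Orders')
      AND name = N'ShippedDate'
)
BEGIN
    ALTER TABLE dbo.Orders ADD ShippedDate datetime NULL;
END;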
For the data changes, we use tools from Red Gate, specifically SQL Data Compare.
As far as the database programmability (stored procedures, triggers, etc), we keep one script that, when executed, drops all of the current items and then re-adds the current versions. This process is enabled by using a strict naming convention for all programmability elements (stored procedures are named starting with s_prefix_, functions with fn_prefix_, etc).
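A sketch of that drop-and-recreate pattern for a single item, following the naming convention described above (the procedure itself is illustrative):

-- stored_procs.sql: every programmability item is dropped if present and
-- then recreated, so running the script always leaves the current version.
IF OBJECT_ID(N'dbo.s_app_GetOpenOrders', N'P') IS NOT NULL
    DROP PROCEDURE dbo.s_app_GetOpenOrders;
GO

CREATE PROCEDURE dbo.s_app_GetOpenOrders
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderId, CustomerId, ShippedDate
    FROM dbo.Orders
    WHERE ShippedDate IS NULL;
END;
GO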
To ensure the correct script versions are applied, we added a small versions table (usually 1 row) that is stored in the database to record the current version of the database. This table is updated by the table update script when it is applied. We also update this table in the script that is applied to create the database from scratch.
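A sketch of such a versions table, plus the bookkeeping statement an update script would run at its end (names and version numbers are illustrative):

-- Single-row table recording the schema version currently applied.
CREATE TABLE dbo.SchemaVersion (
    MajorVersion int      NOT NULL,
    MinorVersion int      NOT NULL,
    AppliedOn    datetime NOT NULL DEFAULT (GETDATE())
);

-- Last statement of update_v4.sql: record that the database is now at v4.
UPDATE dbo.SchemaVersion
SET MajorVersion = 4,
    MinorVersion = 0,
    AppliedOn    = GETDATE();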
Finally, in order to apply the scripts, we created a small tool that reads the current version of the database and a manifest that specifies which scripts to apply based on the current version of the database.
As an example:
Assuming that we have issued two major versions, 3 and 4, there are two update scripts, update_v3.sql and update_v4.sql. We have one initial structure script, tables.sql, and one programmability script, stored_procs.sql. Given those assumptions, the manifest would look something like:
tables.sql > when version = 0
update_v3.sql > when major_version <= 3
update_v4.sql > when major_version <= 4
stored_procs.sql > always
The tool evaluates the current version and applies the scripts in the order specified in the manifest to ensure that the database is always updated in a known manner.
Hopefully this helps give you some ideas.