TFS2010 - Track Merges - tfs-sdk

Given a changeset c and given that c contains merge operations, I would like to get a list of all changesets that have been merged and resulted in c.
For example:
Changeset 1 contains some edits for some files.
Changeset 2 contains some edits for some other files.
Changeset 3 is a merge of changesets 1+2 to a parent branch.
Now I would like to get changesets 1 and 2 by asking changeset 3 which merged changesets it contains.
I want to do this using the TFS API. I came across the versionControlServer.TrackMerges method, but I do not understand what the ItemIdentifiers that the method expects should be. Unfortunately, I cannot find an example of how to use this method, and I'm not sure whether it is really the correct one.

Okay, it took me really long, but I think I found out how to do this. This is the code that will find all the "parent" changesets:
/// <summary>
/// Gets the changesets which have resulted in the given changeset due
/// to a merge operation.
/// </summary>
/// <param name="changeset">The changeset.</param>
/// <param name="versionControlServer">The version control server.</param>
/// <returns>
/// A list of all changesets that have resulted in the given changeset.
/// </returns>
public static List<Changeset> GetMergedChangesets(Changeset changeset, VersionControlServer versionControlServer)
{
    // remember the already covered changeset ids
    Dictionary<int, bool> alreadyCoveredChangesets = new Dictionary<int, bool>();

    // initialize the list of parent changesets
    List<Changeset> parentChangesets = new List<Changeset>();

    // go through each change inside the changeset
    foreach (Change change in changeset.Changes)
    {
        // query the merge history of the item
        var queryResults = versionControlServer.QueryMergesExtended(
            new ItemSpec(change.Item.ServerItem, RecursionType.Full),
            new ChangesetVersionSpec(changeset.ChangesetId),
            null,
            null);

        // go through each merge in the history
        foreach (var result in queryResults)
        {
            // only if the target changeset is the given changeset do we have a hit
            if (result.TargetChangeset.ChangesetId == changeset.ChangesetId)
            {
                // if that hit has already been processed elsewhere, just skip it
                if (!alreadyCoveredChangesets.ContainsKey(result.SourceChangeset.ChangesetId))
                {
                    // otherwise add it
                    alreadyCoveredChangesets.Add(result.SourceChangeset.ChangesetId, true);
                    parentChangesets.Add(versionControlServer.GetChangeset(result.SourceChangeset.ChangesetId));
                }
            }
        }
    }

    return parentChangesets;
}
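As an aside, the Dictionary<int, bool> above is used purely as a set of already-seen ids; a HashSet<int> expresses the same intent more directly, since HashSet<T>.Add returns false for duplicates. A minimal, self-contained sketch of that pattern (DedupDemo and DistinctIds are illustrative names, not part of the TFS API):

```csharp
using System;
using System.Collections.Generic;

class DedupDemo
{
    // Collects distinct ids in encounter order - the same job the
    // Dictionary<int, bool> does in the method above.
    public static List<int> DistinctIds(IEnumerable<int> ids)
    {
        var seen = new HashSet<int>();
        var result = new List<int>();
        foreach (int id in ids)
        {
            // Add returns false when the id was already seen
            if (seen.Add(id))
                result.Add(id);
        }
        return result;
    }

    static void Main()
    {
        var distinct = DistinctIds(new[] { 3, 1, 3, 2, 1 });
        Console.WriteLine(string.Join(",", distinct)); // 3,1,2
    }
}
```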
Edit:
I just realized that there is a small "bug" in the above code: I originally wrote "VersionSpec.Latest" in the query, but "new ChangesetVersionSpec(changeset.ChangesetId)" is better, because then the changesets are still tracked once the source branch has been deleted.

I think this page by Ben Clark-Robinson answers the original question of how to use the TrackMerges() API. Here's a verified example:
using tfvcc = Microsoft.TeamFoundation.VersionControl.Client;

var sourcePath = "$/projectName/branchObjectName1";
var targetPath = "$/projectName/branchObjectName2";

versionCtl.TrackMerges(
    sourceChangesetIds: new[] { 1000 },
    sourceItem: new tfvcc.ItemIdentifier(sourcePath),
    targetItems: new[] { new tfvcc.ItemIdentifier(targetPath) },
    pathFilter: null);

Related

How do I add a lazy loaded column in EntitySpaces?

If you do not have experience with or aren't currently using EntitySpaces ("ES") ORM this question is not meant for you.
I have a 10-year-old application that, after 4 years, now needs my attention. My application uses a now defunct ORM called EntitySpaces, and I'm hoping that if you're reading this you have experience with it or maybe still use it too! Switching to another ORM is not an option at this time, so I need to find a way to make this work.
Between the time I last actively worked on my application and now (ES Version 2012-09-30), EntitySpaces ("ES") has gone through a significant change in the underlying ADO.net back-end. The scenario that I'm seeking help on is when an entity collection is loaded with only a subset of the columns:
_products = new ProductCollection();
_products.Query.SelectAllExcept(_products.Query.ImageData);
_products.LoadAll();
I then override the properties that weren't loaded in the initial select so that I may lazyload them in the accessor. Here is an example of one such lazy-loaded property that used to work perfectly.
public override byte[] ImageData
{
    get
    {
        bool rowIsDirty = base.es.RowState != DataRowState.Unchanged;

        // Check if we have loaded the blob data
        if (base.Row.Table != null && base.Row.Table.Columns.Contains(ProductMetadata.ColumnNames.ImageData) == false)
        {
            // add the column before we can save data to the entity
            this.Row.Table.Columns.Add(ProductMetadata.ColumnNames.ImageData, typeof(byte[]));
        }

        if (base.Row[ProductMetadata.ColumnNames.ImageData] is System.DBNull)
        {
            // Need to load the data
            Product product = new Product();
            product.Query.Select(product.Query.ImageData).Where(product.Query.ProductID == base.ProductID);
            if (product.Query.Load())
            {
                if (product.Row[ProductMetadata.ColumnNames.ImageData] is System.DBNull == false)
                {
                    base.ImageData = product.ImageData;
                    if (rowIsDirty == false)
                    {
                        base.AcceptChanges();
                    }
                }
            }
        }
        return base.ImageData;
    }
    set
    {
        base.ImageData = value;
    }
}
The interesting part is where I add the column to the underlying DataTable DataColumn collection:
this.Row.Table.Columns.Add(ProductMetadata.ColumnNames.ImageData, typeof(byte[]));
I had to comment out all the ADO.net related stuff from that accessor when I updated to the current (and open source) edition of ES (version 2012-09-30). That means the "ImageData" column isn't properly configured, and when I change its data and attempt to save the entity I receive the following error:
Column 'ImageData' does not belong to table .
I've spent a few days looking through the ES source and experimenting, and it appears that they no longer use a DataTable to back the entities; instead they use an 'esSmartDictionary'.
My question is: Is there a known, supported way to accomplish the same lazy loaded behavior that used to work in the new version of ES? Where I can update a property (i.e. column) that wasn't included in the initial select by telling the ORM to add it to the entity backing store?
After analyzing how ES constructs the DataTable that it uses for updates, it became clear that columns not included in the initial select (i.e. load) operation needed to be added to the esEntityCollectionBase.SelectedColumns dictionary. I added the following method to handle this.
/// <summary>
/// Appends the specified column to the SelectedColumns dictionary. The selected columns collection is
/// important because it serves as the basis for DataTable creation when updating an entity collection.
/// If you've lazy loaded a column (i.e. it wasn't included in the initial select), it will not be
/// automatically included in the selected columns collection. If you want to update the collection
/// including the lazy loaded column, you need to use this method to add the column to the selected
/// columns list.
/// </summary>
/// <param name="columnName">The lazy loaded column name. Note: Use the {yourentityname}Metadata.ColumnNames
/// class to access the column names.</param>
public void AddLazyLoadedColumn(string columnName)
{
    if (this.selectedColumns == null)
    {
        throw new Exception(
            "You can only append a lazy-loaded column to a partially selected entity collection");
    }

    if (this.selectedColumns.ContainsKey(columnName))
    {
        return;
    }

    // Using the count because I can't determine what the value is supposed to be or how it's used.
    // From what I can tell it's just the ordinal of the column as it was selected: if 8 columns were
    // selected, the values would be 1 through 8 - ??
    int columnValue = selectedColumns.Count;
    this.selectedColumns.Add(columnName, columnValue);
}
You would use this method like this:
public override System.Byte[] ImageData
{
    get
    {
        var collection = this.GetCollection();
        if (collection != null)
        {
            collection.AddLazyLoadedColumn(ProductMetadata.ColumnNames.ImageData);
        }
        ...
It's a shame that nobody is interested in the open source EntitySpaces. I'd be happy to work on it if I thought it had a future, but it doesn't appear so. :(
I'm still interested in any other approaches or insight from other users.

Catching and logging MVC4 message about "Minification failed. Returning unminified contents"

There are a number of questions on StackOverflow that talk about getting the "Minification failed. Returning unminified contents" error from the MVC4 minification.
I'd like to know if there is a way to be notified about this error when it happens and to be able to log it.
It is nice that when there is an error the bundler returns the original contents so my site doesn't break, but I would like to know about these errors automatically rather than having to visit each css/js bundle url to see if there is an error.
So that logic is actually in the implementation of the default transforms that Script/StyleBundle use. If you want to catch those errors yourself, you can change the transforms on your bundles to something that surfaces them.
To actually detect the errors, you would have to manually enumerate all of your bundles (to trigger them to get generated) and also be able to listen for errors that happen (so the GenerateErrorResponse equivalent below would need to report any errors to someplace you would see them).
Here's what JsMinify does in its process for reference:
/// <summary>
/// Transforms the bundle contents by applying javascript minification
/// </summary>
/// <param name="context">The <see cref="BundleContext"/> object that contains state for both the framework configuration and the HTTP request.</param>
/// <param name="response">A <see cref="BundleResponse"/> object containing the bundle contents.</param>
public virtual void Process(BundleContext context, BundleResponse response) {
    if (!context.EnableInstrumentation) {
        Minifier min = new Minifier();
        // NOTE: Eval immediate treatment is needed for WebUIValidation.js to work properly after minification
        // NOTE: CssMinify does not support important comments, so we are going to strip them in JS minification as well
        string minifiedJs = min.MinifyJavaScript(response.Content, new CodeSettings() { EvalTreatment = EvalTreatment.MakeImmediateSafe, PreserveImportantComments = false });
        if (min.ErrorList.Count > 0) {
            GenerateErrorResponse(response, min.ErrorList);
        }
        else {
            response.Content = minifiedJs;
        }
    }
    response.ContentType = JsContentType;
}
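One way to surface these errors is a custom IBundleTransform that mirrors the logic above but logs the errors instead of only embedding them in the response. This is a sketch, not the framework's built-in behavior: LoggingJsTransform is an illustrative name, and it assumes the same AjaxMin Minifier that JsMinify uses.

```csharp
using System.Diagnostics;
using System.Web.Optimization;
using Microsoft.Ajax.Utilities;

// Sketch: minify like JsMinify, but log every minification error so
// broken bundles show up in your logs instead of only when you visit
// each bundle URL by hand.
public class LoggingJsTransform : IBundleTransform
{
    public void Process(BundleContext context, BundleResponse response)
    {
        var min = new Minifier();
        string minifiedJs = min.MinifyJavaScript(response.Content);
        if (min.ErrorList.Count > 0)
        {
            // Keep the unminified content (as the default transform does),
            // but record each error somewhere you actually watch.
            foreach (var error in min.ErrorList)
                Trace.TraceError("Minification failed: {0}", error);
        }
        else
        {
            response.Content = minifiedJs;
        }
        response.ContentType = "text/javascript";
    }
}
```

You would then swap it in per bundle, e.g. bundle.Transforms.Clear(); followed by bundle.Transforms.Add(new LoggingJsTransform());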

Caching Facet Queries

Just tracked down a bug relating to what appears to be a facet query being cached in RavenDB.
I found the line of code that sets aggressive caching, but only for the given session. In this session, it appears the code is only getting a 'FacetSetup' doc and not executing a query on facets. However, I have confirmed that when the given line of code is commented out, I receive the current facet count, and when I reinstate the comment and make an update, I receive the stale facet count.
My question is: What does this line actually tell Raven to cache?
/// <summary>
/// Gets all the facet names from the main facet document.
/// That way, we know all the possible facets to query over.
/// We will execute a facet query for each facet in the FacetSetup.
/// </summary>
private IEnumerable<string> GetFacetNamesFromFacetSetup(IDocumentStore store, string facetDocumentName)
{
    IList<string> facetFromDocument = new List<string>();
    using (IDocumentSession docSession = store.OpenSession())
    {
        // The line below here!!
        docSession.Advanced.DocumentStore.AggressivelyCacheFor(TimeSpan.FromMinutes(30));
        FacetSetup facetSetup = docSession.Load<FacetSetup>(facetDocumentName);
        foreach (Facet facet in facetSetup.Facets)
        {
            facetFromDocument.Add(facet.Name);
        }
    }
    return facetFromDocument;
}
AggressivelyCacheFor is global for the document store, and it returns an IDisposable. Until you dispose it, all operations will be aggressively cached.
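Since it returns an IDisposable, the aggressive caching can be scoped explicitly instead of staying on for the rest of the store's lifetime. A minimal sketch, assuming the same client API as the method above (store and facetDocumentName as defined there):

```csharp
// Scope the aggressive caching: only operations inside the using block
// are served from the aggressive cache; disposing restores normal behavior.
using (IDocumentSession docSession = store.OpenSession())
using (store.AggressivelyCacheFor(TimeSpan.FromMinutes(30)))
{
    FacetSetup facetSetup = docSession.Load<FacetSetup>(facetDocumentName);
}
```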

Load all Appointment Properties, including extended properties, in Exchange 2010

I'm trying to list all the properties associated with a given calendar appointment, but I can't figure out if there's a way to do it without loading each property individually. Based on some of the code I've seen online, I know I can do something like the following:
/// <summary>
/// List all the properties of an appointment
/// </summary>
public void listProps()
{
    Folder myCalendar = Folder.Bind(service, WellKnownFolderName.Calendar);
    ItemView view = new ItemView(10);
    FindItemsResults<Item> findResults;
    do
    {
        findResults = myCalendar.FindItems(new SearchFilter.IsEqualTo(ItemSchema.ExtendedProperties, MyPropertySetId), view);
        service.LoadPropertiesForItems(findResults, new PropertySet(ItemSchema.Subject, ItemSchema.Body));
        foreach (Item item in findResults)
        {
            Console.WriteLine(item.Subject);
            Console.WriteLine(item.Body);
        }
        if (findResults.NextPageOffset.HasValue)
        {
            view.Offset = findResults.NextPageOffset.Value;
        }
    } while (findResults.MoreAvailable);
}
However, what I'm really looking for there in the middle is something like this pseudo code:
service.LoadPropertiesForItems(findResults, new PropertySet(*.*));
foreach (Item item in findResults)
{
    Console.WriteLine(property)
}
Is this possible?
Thanks!
No, it is not possible.
You can use predefined property sets such as PropertySet.FirstClassProperties to get the known set of properties.
Getting extended properties this way is not supported at all; you have to specify explicitly which extended properties you want to access. For example:
ItemView view = new ItemView(10);
ExtendedPropertyDefinition extendedPropertyDefinition =
    new ExtendedPropertyDefinition("Expiration Date", MapiPropertyType.String);
view.PropertySet =
    new PropertySet(BasePropertySet.FirstClassProperties, ItemSchema.Subject, extendedPropertyDefinition);
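Once items have been found with a PropertySet like the one above, each requested extended property can be read back per item. A sketch (it assumes the EWS Managed API, the extendedPropertyDefinition from the snippet above, and a findResults collection like the one in the question; the property may be absent on any given item, hence TryGetProperty):

```csharp
// Read the extended property back from each loaded item; TryGetProperty
// returns false when the property was not present on that item.
foreach (Item item in findResults)
{
    object expirationDate;
    if (item.TryGetProperty(extendedPropertyDefinition, out expirationDate))
    {
        Console.WriteLine("{0}: {1}", item.Subject, expirationDate);
    }
}
```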

LINQ SQL Attach, Update Check set to Never, but still Concurrency conflicts

In the dbml designer I've set Update Check to Never on all properties, but I still get an exception when doing Attach: "An attempt has been made to Attach or Add an entity that is not new, perhaps having been loaded from another DataContext. This is not supported." This approach seems to have worked for others on here, but there must be something I've missed.
using (TheDataContext dc = new TheDataContext())
{
    test = dc.Members.FirstOrDefault(m => m.fltId == 1);
}

test.Name = "test2";

using (TheDataContext dc = new TheDataContext())
{
    dc.Members.Attach(test, true);
    dc.SubmitChanges();
}
The error message says exactly what is going wrong: you are trying to attach an object that has been loaded from another DataContext, in your case from another instance of the DataContext. Don't dispose your DataContext (at the end of the using statement it gets disposed) before you change values and submit the changes. I just saw that you want to attach the object again to the Members collection, but it is already in there, so there is no need to do that. This should work just as well (all in one using statement):
using (TheDataContext dc = new TheDataContext())
{
    var test = dc.Members.FirstOrDefault(m => m.fltId == 1);
    test.Name = "test2";
    dc.SubmitChanges();
}
Just change the value and submit the changes.
Latest Update:
(Removed all previous 3 updates)
My previous solution (removed it again from this post), found here, is dangerous. I just read this in an MSDN article:
"Only call the Attach methods on new or deserialized entities. The only way for an entity to be detached from its original data context is for it to be serialized. If you try to attach an undetached entity to a new data context, and that entity still has deferred loaders from its previous data context, LINQ to SQL will throw an exception. An entity with deferred loaders from two different data contexts could cause unwanted results when you perform insert, update, and delete operations on that entity. For more information about deferred loaders, see Deferred versus Immediate Loading (LINQ to SQL)."
Use this instead:
// Get the object the first time by some id
using (TheDataContext dc = new TheDataContext())
{
    test = dc.Members.FirstOrDefault(m => m.fltId == 1);
}

// Somewhere else in the program
test.Name = "test2";

// Again somewhere else
using (TheDataContext dc = new TheDataContext())
{
    // Create a detached copy carrying the id and values of the 'test' object
    Member modifiedMember = new Member()
    {
        Id = test.Id,
        Name = test.Name,
        Field2 = test.Field2,
        Field3 = test.Field3,
        Field4 = test.Field4
    };
    dc.Members.Attach(modifiedMember, true);
    dc.SubmitChanges();
}
After copying the object, all references are detached and all event handlers (deferred loading from the db) are no longer connected to the new object. Only the value fields are copied to the new object, which can now be safely attached to the Members table. Additionally, you do not have to query the db a second time with this solution.
It is possible to attach entities from another datacontext.
The only thing that needs to be added to the code in the first post is this:
dc.DeferredLoadingEnabled = false;
But this is a drawback since deferred loading is very useful. I read somewhere on this page that another solution would be to set the Update Check on all properties to Never. This text says the same: http://complexitykills.blogspot.com/2008/03/disconnected-linq-to-sql-tips-part-1.html
But I can't get it to work even after setting the Update Check to Never.
This is a function in my Repository class which I use to update entities
protected void Attach(TEntity entity)
{
    try
    {
        _dataContext.GetTable<TEntity>().Attach(entity);
        _dataContext.Refresh(RefreshMode.KeepCurrentValues, entity);
    }
    catch (DuplicateKeyException) // Data context already knows about this entity, so just update the values
    {
        _dataContext.Refresh(RefreshMode.KeepCurrentValues, entity);
    }
}
Where TEntity is your DB class and, depending on your setup, you might just want to do:
_dataContext.Attach(entity);