Exchange: Custom Extended Properties as Part of Migration from 2003 to 2010

We have an in-house app that uses the http://schemas.microsoft.com/exchange/permanenturl property to identify calendar appointments in Exchange 2003. Now we're getting ready to migrate to 2010, and our understanding is that the permanent URLs are formed differently for calendar items, and that permanenturls from 2003 can't be upgraded directly to 2010. Someone has even suggested that the new API doesn't expose the permanenturl at all. So, I have a few questions.
First, are we right? Is it true that the permanenturl property can't be upgraded from 2003 to 2010? If it can be, is there an algorithm that allows us to predict or produce the new URL? We're going to need both on hand through the transition.
Second, I know that from 2007 on we can create custom extended properties and make them persistent. I'm having trouble figuring out whether we can create custom extended properties in 2003 a) at all, and b) in a way that will persist through the migration.
I'm a total noob at Exchange programming, so I'm not even sure where to start on the code. Any pointers in the right direction would be greatly appreciated!
Thanks!

After some research and experimentation, we determined that the permanenturl doesn't persist, but that the GUID for Calendar items should be fairly reliable. The GUIDs for Items in Exchange appear to include a hash of the entry point, so moving messages between folders can change their GUID. Since calendar items stay in the same folder, the GUID shouldn't change. So we're going to use that as our new index.
There was still the matter of updating the old index. We decided to take the following course:
1. Copy the permanentURL value to the body of the message (actually, we're hashing a few things together).
2. Migrate the mailbox.
3. Find the permanentURLs in the message bodies, get the new GUIDs, and update our table (see the sketch below).
4. Use the 2010 GUIDs going forward.
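Here's roughly what step 3 looks like with the EWS Managed API. This is a sketch, not our production code: the "LEGACY-URL:" marker, the ';' terminator, and the console output stand in for our real hash format and table update.

using System;
using Microsoft.Exchange.WebServices.Data;

class IndexRebuilder
{
    static void Main()
    {
        ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2010);
        service.Credentials = new WebCredentials("user@example.com", "password");
        service.AutodiscoverUrl("user@example.com");

        // Page through calendar items whose body contains our marker.
        ItemView view = new ItemView(512);
        SearchFilter filter = new SearchFilter.ContainsSubstring(ItemSchema.Body, "LEGACY-URL:");

        FindItemsResults<Item> page;
        do
        {
            page = service.FindItems(WellKnownFolderName.Calendar, filter, view);
            foreach (Item item in page)
            {
                // FindItems doesn't return bodies, so load each item's body explicitly.
                item.Load(new PropertySet(ItemSchema.Body));
                string body = item.Body.Text;

                // Pull out the old permanentURL hash (assumes "LEGACY-URL:<value>;").
                int start = body.IndexOf("LEGACY-URL:", StringComparison.Ordinal) + "LEGACY-URL:".Length;
                int end = body.IndexOf(';', start);
                string legacyKey = body.Substring(start, end - start);

                // Replace this with the real table update: map legacyKey -> new id.
                Console.WriteLine("{0} -> {1}", legacyKey, item.Id.UniqueId);
            }
            view.Offset += page.Items.Count;
        } while (page.MoreAvailable);
    }
}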
This was my first time working with Exchange, and I have to say that I'm a fan of EWS. It makes things so much simpler.
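And for anyone else reading: once you're on 2007/2010, stamping your own persistent custom extended property through EWS is simple too. A sketch; generate your own property-set GUID, and the property name is just an example:

using System;
using Microsoft.Exchange.WebServices.Data;

class ExtendedPropertyExample
{
    // Define the property once; the GUID below is a placeholder for your own.
    static readonly ExtendedPropertyDefinition LegacyUrlProperty =
        new ExtendedPropertyDefinition(
            new Guid("c1a2b3d4-0000-4000-8000-0123456789ab"),
            "LegacyPermanentUrl",
            MapiPropertyType.String);

    static void Stamp(ExchangeService service, ItemId id, string legacyUrl)
    {
        Appointment appt = Appointment.Bind(service, id);
        appt.SetExtendedProperty(LegacyUrlProperty, legacyUrl);
        appt.Update(ConflictResolutionMode.AutoResolve);
    }
}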

Related

How to force a cache refresh in MS Access

I am working on migrating a MS Access Database over to a newer SQL platform.
But, with all of the users who are currently using it, we're migrating slowly/carefully.
The first step is that we are re-writing the VBA code into C#, which is then deployed in a .dll along with the database.
Now, the VBA code calls into the C# to do the business logic, then the VBA continues to do the displays/UI, while Access still hosts the database.
The problem comes in that, in one place, I have a report being run right after the C# business logic, and apparently MS Access has a cache, which clears every 5 seconds. So the transaction that occurs in the C# code writes to the database, but the VBA code is still using the cache. This is causing errors, as the records added to the database (which the VBA report is trying to report on) don't exist in the cache yet...
I'm guessing that the C# .dll must be getting treated as a "second connection" to the MS Access database, which is what seems to typically cause this error in my searches (it thinks that one process is writing and the other is reading).
Since the cache is cleared out every 5 seconds, we could just put the process to sleep, wake it up after 5 seconds, and then run the report, but that's pretty terrible for an end user.
And, making things more difficult, the cache only seems to be used in the deployed version (so, when running from source / in debug mode, the error never happens).
Doing some searches, there seem to be plenty of people who have said "just refresh the cache." But the question is: within VBA, how do you refresh the cache?
Any advice would be welcome.
Thanks
I've been fighting the same issue for years as I write a lot of tools around an old Powerbuilder application that has an Access MDB back end.
The cache does exist and it is VERY real. When data is inserted on a different connection than it is queried on, the cache can be directly observed and measured. It was also documented by Microsoft before they blackholed a bunch of their old articles...
Microsoft Jet has a read-cache that is updated every PageTimeout milliseconds (default is 5000ms = 5 seconds). It also has a lazy-write mechanism that operates on a separate thread to main processing and thus writes changes to disk asynchronously. These two mechanisms help boost performance, but in certain situations that require high concurrency, they may create problems.
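(For what it's worth, that PageTimeout default is configurable. A sketch of reading the current value from the registry; the key path is an assumption for classic Jet 4.0 on 64-bit Windows, so adjust for your setup, and lowering it trades performance for freshness:)

using System;
using Microsoft.Win32;

class JetPageTimeout
{
    static void Main()
    {
        // Assumption: 32-bit Jet 4.0 lives under Wow6432Node on a 64-bit OS.
        const string jetKey = @"SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Jet 4.0";

        using (RegistryKey key = Registry.LocalMachine.OpenSubKey(jetKey))
        {
            object timeout = (key == null) ? null : key.GetValue("PageTimeout");
            Console.WriteLine("PageTimeout = {0} ms", timeout ?? "(default 5000)");
        }
    }
}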
I've found a couple of workarounds that are not the best, but they somewhat make do until I find something better or can re-write the app with a better back-end database.
The seemingly best answer I've found (and it may actually work for you, since you say you need VBA) is to use JRO.RefreshCache. I've been trying to figure out how to implement this using C# or VB.net without any luck. Below is a link to a code example where you execute the RefreshCache method on the 2nd connection, the one that needs to pull the data. I have not tested this myself.
https://documentation.help/MSJRO/jrmthrefreshcachex.htm
A workaround I've found that will deliver the query results within 500ms to 1000ms of insert time (instead of anywhere between 500 and 5000 ms - or more):
Use System.Data.ODBC instead of OleDB, with connection string: Driver={Microsoft Access Driver (*.mdb, *.accdb)};Dbq=;
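(In code, that workaround looks like this; the Dbq path and table name are illustrative:)

using System;
using System.Data.Odbc;

class OdbcWorkaround
{
    static void Main()
    {
        string connStr =
            @"Driver={Microsoft Access Driver (*.mdb, *.accdb)};Dbq=C:\data\backend.mdb;";

        using (OdbcConnection conn = new OdbcConnection(connStr))
        using (OdbcCommand cmd = new OdbcCommand("SELECT COUNT(*) FROM Orders", conn))
        {
            conn.Open();
            // Through the ODBC driver, freshly inserted rows show up within
            // roughly a second of insert time, per the observation above.
            Console.WriteLine(cmd.ExecuteScalar());
        }
    }
}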
If someone knows how to use the JRO.RefreshCache method with OLEDB and C# or VB.net, I'd be forever grateful. I believe the issue is it's looking for an ADO connection to be passed in, not an OLEDB connection.
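(One untested possibility, following that hunch: open a parallel ADODB connection via COM interop and hand that to RefreshCache. Add COM references to "Microsoft ActiveX Data Objects 2.8 Library" and "Microsoft Jet and Replication Objects 2.6 Library". Again, I have not verified this end to end:)

using ADODB;
using JRO;

class JetCacheRefresher
{
    public static void RefreshReadCache(string mdbPath)
    {
        // RefreshCache wants an ADODB.Connection, not an OleDbConnection,
        // so open one directly against the same MDB.
        ADODB.Connection conn = new ADODB.Connection();
        conn.Open("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + mdbPath, "", "", 0);
        try
        {
            JRO.JetEngine jet = new JRO.JetEngine();
            jet.RefreshCache(conn);   // flush Jet's read cache on this connection
        }
        finally
        {
            conn.Close();
        }
    }
}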
I am not aware of ANY documentation suggesting that some 5 second cache exists. Where did this idea come from?
Furthermore, if you have 5 users, then you are not going to be able to update their caches, are you?
In other words, a cache fix for one user is still not going to solve anything or work with multiple users anyway, is it?
The simple matter is: if you load up a form with 100 records, and other users are ALSO working on those 100 rows, then users will not see each other's changes until you tell Access to re-load the form.
You can do this with a Me.Refresh in the form, and then it will show changes made by other users (or even by your C# code!).
However, that's not really the solution here.
How does nearly EVERY system deal with this issue?
Answer:
You don't, you "design" the software to take the user work flow into account.
So, instead of loading up a form with 100 rows of data (which you should not do, unless a SUPER DUPER reason exists for doing so)?
You provide a UI in which the user FIRST searches for whatever it is they want to work on.
In other words, say you just booked a user on a tour. Now they call the office back and want to change some details of that tour. But a different member of the tour staff might pick up the phone. So now a 2nd user opens the tour?
You solve that issue by NOT loading all the tours into that form in the first place.
You provide a search screen, so they can search for the user, find the user, maybe type in an invoice number or whatever.
You display the results in a pick list, and then launch the form on the ONE record (and perhaps detail records from child tables).
So there's no more a concept of a cache in Access than there is in C#.
However, if you load up a DataTable in C# and then display that data?
Well, what about the other users on that system? They will not see changes to that data ANY MORE than in the current Access form.
So, if you want to update some data in C#? Then fine, but you need/want to do two things:
First, before you call any C# code that may update the current form record? You need to FORCE a save of that current record BEFORE you call any code, be it VBA code or C# code, that is going to update the current record the user is working on.
In Access you can save the current record in MANY different ways, but the typical approach is:
' Single record save - current record
If Me.Dirty Then Me.Dirty = False

' VBA or C# code goes here.

' Optionally refresh the current form to reflect changes
Me.Refresh
So, in most cases, it is the "design" of your software that will solve this issue.
For example, in the tour example, or in fact ANY system, the user can't work, can't update, and can't do their job UNLESS they first find/search and have a means to bring up that form + record data in the first place.
So, ANY typical good design will:
Ask the user for that name, invoice number or whatever.
Display the results of the search, and THEN allow the user to pick the record/data to work on. When they are done, they close that form and are RIGHT BACK at the search form, ready to do battle with the next customer or task or phone call or whatever.
So, a search form might look like this: the user types in, say, "smi", and gets a pick list of matches.
They can further type in, say, part of the first name to narrow that list down.
Or maybe they type in an invoice number, customer number, booking number or whatever.
So, you display the results, and then they can select the row or "thing" to work on.
Thus, we click on the row, and jump to the ONE record.
So, the user does whatever they have to do with the customer. Now, when done, they close the ONE thing, the ONE main record.
This saves the data (so others in the office can now use that booking data), and they are NOW right back at the search screen, ready to do battle with the next customer.
So, not only does this mean we have a VERY bandwidth-friendly design (we only pull the one main record into that form), but it is also better for work flow.
The Access form's cache thus becomes a non-issue, since we are only dealing with the one record.
And as I pointed out, if the system is multi-user, then you are NOT going to be able to update and deal with multiple users' cached data anyway, are you?
Think of ANY system you EVER used from a software point of view.
When you use Google, does it download the WHOLE internet, and then you use ctrl-f to search megs and megs of data in the browser?
Nope!
You search first, get a list of results, and THEN pick one!
And while that list is displayed, maybe others on the internet are updating and adding new data - but if all of that were cached in your browser, it would not work!
And the same goes for a desktop accounting system. You don't load up all accounts and THEN have the user hit ctrl-f to search all the data. You search for the customer or invoice number and PICK ONE to work on.
It does not make sense to load up a form with 1000 customers and then hit ctrl-f to find one customer. The same goes for a bank machine. It does not download ALL customers and THEN let you search. It asks you FIRST for what you need. So, be it browser based, desktop based, or JUST ABOUT ANY software you use?
You pretty much eliminate the cache issue by not pre-loading boatloads of data, but instead asking and letting the user search for the data they need.
So, in regards to the Access form data and cache?
If you are on a form and call VBA code, or C# code, or whatever?
If that code updates the current form, you have NO MORE and NO LESS of an issue when calling VBA code than when calling C# code! If that code updates the current form while the record is dirty (has pending edits), then you get that message about the current form's record having been updated by another user!
So, your cache issue does NOT IN ANY WAY exist MORE or LESS as an issue in typical Access software.
As a general rule, if you are on a form with pending edits, and say you want to pop up some form to edit related data?
You have to ensure that pending edits are SAVED before you launch a form that can edit the same data, or run code that can/may edit that data.
As a result, ZERO cache issues should exist, and they exist no more and no less when calling SQL or VBA update code from a form than when calling some C# code from that form.
So, save the pending update for that form.
Then run your VBA, SQL, or C# code.
And then do a Me.Refresh to display any changes made by those external routines.
There is no documentation, nor ANY article I can find, that suggests some kind of 5 second cache or update - it is an urban myth. And your software challenge here, in regards to using C# or VBA, or even SQL Server stored procedures?
They all present the same issue, and I dare say that Access is often used as a front end to SQL Server, and ALL OF the SAME issues exist when using SQL Server with ms-access.

CRM 2013/15 Online and queries on huge data volumes

I'm working with a couple of million records. As soon as I try to run an Advanced Find with a linked entity as a criterion, the Advanced Find times out.
Would creating custom views allow me to filter properly? Does anyone know the proper way of using Advanced Find this way? Are there limitations in out-of-the-box CRM that I should be aware of?
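(For reference, the failing search is roughly this shape when expressed with the SDK's QueryExpression; the entity and field names are illustrative:)

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

class LinkedEntityQuery
{
    // Accounts that have a contact with the given last name.
    static EntityCollection FindAccountsWithContact(IOrganizationService service, string lastName)
    {
        QueryExpression query = new QueryExpression("account")
        {
            ColumnSet = new ColumnSet("name")
        };

        LinkEntity link = query.AddLink("contact", "accountid", "parentcustomerid");
        link.LinkCriteria.AddCondition("lastname", ConditionOperator.Equal, lastName);

        // This is the call that times out once the tables hold millions of rows.
        return service.RetrieveMultiple(query);
    }
}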
In CRM 2013 - it is possible to add indexes for specific fields by adding the columns to the quick find view for the entity.
You will need to wait for the Indexing Management Job to run (which is run every 24 hours by default) - see http://blogs.msdn.com/b/darrenliu/archive/2014/04/02/crm-2013-maintenance-jobs.aspx.
In previous versions of CRM, it was necessary to add the indexes directly to the database - this may still be necessary for more complex queries.
It was too early to post an answer before. The problem I encountered was related to the OOB Advanced Find. Looking, for example, for an account with some related contacts (a really plain search with a linked entity), I got a SQL timeout. Everything was OOB, so I was a little clueless, and I opened a case with Microsoft. They found a bug: if I changed the sorting, the Advanced Find started to work again. They are still investigating. So it wasn't a settings problem but a CRM bug.

Noob Guidance on a Parallel Task Workflow (without Visual Studio)

This is going to be my first workflow, and I could use a little guidance.
I have a list I'm using for requests when a user needs their profile changed (eg: change of office location). The change has to be done in AD, PeopleSoft, and another database. Right now, I have it set up so requesters submit an item to a list, and Alerts go out to the different people responsible for making the updates in AD, PeopleSoft, etc. However, there has been enough frustration with missed emails and the like that I've been asked to track via workflow.
So essentially, I need to track a request that goes out to multiple users, who will then need to confirm that the task has been completed. I found a diagram (http://officeimg.vo.msecnd.net/en-us/files/989/238/ZA102615287.jpg) that is a very good representation of what I want to do, but the accompanying article does a very confusing job of explaining how to do it: http://office.microsoft.com/en-us/sharepoint-help/all-about-approval-workflows-HA102771433.aspx
Can someone point me to the workflow type that I need and the steps to implement? OOB/SPDesigner please, I don't have VS on my machine.
Thanks,
Scott
I will start by saying that implementing parallel tasks in a single workflow is hard.
What you can do is customise the OOB approval workflow (the one mentioned in the article) to suit your needs. This will give you insight into how SharePoint workflows work and are designed.
It will look confusing at the start (very confusing) since, like I said, it is a complex workflow to set up, until you start to understand how it works.
Make sure you make a copy of the approval workflow before modifying it, so you can still use the original if needed.

How to store settings per customer (not per application and not per user) in a ClickOnce Excel add-in

I am working on a VSTO Excel add-in (VB.NET) that we are about to sell to different companies.
The add-in will (hopefully) be used by a couple of users at each customer. We are using ClickOnce for web-based deployment.
The add-in provides some default settings - that is no problem with app.config. User-specific settings are also working fine with ClickOnce.
However, what causes quite a headache is how to store settings per customer. Database connection strings, for example, cannot be stored at the application level, but it would also be painful for all users to change the connection string manually. What I would imagine is another layer (per company) where this sort of setting can be stored.
One alternative would be to create one VSTO project per customer, but this would bring a lot of disadvantages, since we update the add-in quite regularly.
I would be happy about any ideas!
Regards,
Julian
I would show Excel's built in connection properties dialog and let each client configure this themselves.
You really shouldn't be distributing connection strings for all your clients as part of a shared package. It'll be a maintenance nightmare.
http://office.microsoft.com/en-nz/excel-help/connection-properties-HA010175443.aspx
What about storing the settings as custom properties in Book.xltx and copying it to
C:\Users\username\AppData\Roaming\Microsoft\Excel\XLSTART
This is how you can programmatically read and create custom properties in Excel:
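(A minimal C# sketch using the Office PIAs; a VB.NET version is a direct translation, and the property name is just an example:)

using Microsoft.Office.Interop.Excel;
using Office = Microsoft.Office.Core;

static class WorkbookSettings
{
    // Read a string-valued custom document property, creating it with a default if missing.
    public static string GetOrCreate(Workbook wb, string name, string defaultValue)
    {
        Office.DocumentProperties props =
            (Office.DocumentProperties)wb.CustomDocumentProperties;

        foreach (Office.DocumentProperty p in props)
            if (p.Name == name)
                return (string)p.Value;

        props.Add(name, false, Office.MsoDocProperties.msoPropertyTypeString, defaultValue);
        return defaultValue;
    }
}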
When a new workbook is created, all the properties from Book.xltx will be available in the new workbook as well. But this solution may not work when existing workbooks are opened, so you might have to explicitly add these properties to any existing workbook when it is opened, using the same approach.
Ok, I think I understand the problem. :)
How about storing all the settings in an offsite database that you maintain, using a web service to allow the settings to be read and set, and giving each client a unique license number that allows them to access this web service?
That way, the individual users would only need a license number to set it up.
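(A sketch of the client side; the endpoint URL and query parameter are invented for illustration:)

using System;
using System.Net;

class CustomerSettingsClient
{
    // Fetch this customer's connection string using their license number.
    public static string FetchConnectionString(string licenseNumber)
    {
        using (WebClient client = new WebClient())
        {
            client.QueryString.Add("license", licenseNumber);
            return client.DownloadString("https://settings.example.com/api/connectionstring");
        }
    }
}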

SharePoint Content Type Event Receivers Impossible to Remove

I have a very odd situation in my SharePoint staging environment. We recently stood up a new SharePoint 2010 server (single WFE + a DB server), and attached a backed-up content database from our existing environment. We created a new web application, and pointed it at the attached content database. All of our site collections, sites, lists, etc. appeared, and things appeared good.
We had deployed some custom content types to our existing environment prior to moving the database, and we wanted to upgrade those content types. Specifically, we attach event receivers to the content types (using code, not XML) and we needed to update the assembly version that those event receivers point to. So we ran our usual code (part of a feature receiver) to remove the event receivers, but to our surprise, the receivers remained.
In an attempt to remedy the situation, we wrote a console application that iterates over all content types (SPWeb.ContentTypes) in the root site of each site collection and deletes them, and then calls SPContentType.Update(true) on each content type. There are no errors returned from the call to Update, but again to our even greater surprise, SharePoint still reports the event receivers are attached.
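(For reference, the console app's removal pass is essentially the following, simplified, with the site collection URL passed as an argument:)

using System;
using Microsoft.SharePoint;

class ReceiverScrubber
{
    static void Main(string[] args)
    {
        using (SPSite site = new SPSite(args[0]))   // site collection URL
        using (SPWeb web = site.RootWeb)
        {
            foreach (SPContentType ct in web.ContentTypes)
            {
                // Delete back-to-front so the collection indexes stay valid.
                for (int i = ct.EventReceivers.Count - 1; i >= 0; i--)
                    ct.EventReceivers[i].Delete();

                ct.Update(true);   // true = push down to inheriting content types
            }
        }
    }
}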
In a desperate last-ditch effort, we even went into the content database (after taking a snapshot -- and remember, this is staging, not production!) and manually DELETED the offending receivers from the EventReceivers table. We figured that should have at least some kind of effect. Alas, SharePoint still reports the receivers as being present.
We perform these types of upgrades on content type event receivers all the time, but have never run into this issue on any other SharePoint farm. Does it sound like an environmental problem? Is it something that could have been caused by moving the content database? Any help would be appreciated, because we are completely stumped at this point.
First of all, I would never recommend changing anything directly in the database. It will surely give you trouble in the long run.
You mentioned that you tried to remove the event receivers at the web level, but I'm not sure if you have tried removing them at the list/library level.
Use the SPContentTypeUsage class and try deleting them at the list/library level (see the sketch after the link below):
http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.spcontenttypeusage.aspx
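A rough sketch of that list-level cleanup using the server object model (the content type lookup and web scoping are simplified; usages living in subwebs would need their own SPWeb):

using Microsoft.SharePoint;

static class ListLevelReceiverCleanup
{
    public static void Remove(SPWeb web, SPContentTypeId siteCtId)
    {
        SPContentType siteCt = web.ContentTypes[siteCtId];

        foreach (SPContentTypeUsage usage in SPContentTypeUsage.GetUsages(siteCt))
        {
            if (!usage.IsUrlToList)
                continue;   // skip web-level usages

            SPList list = web.GetList(usage.Url);
            SPContentType listCt = list.ContentTypes[list.ContentTypes.BestMatch(siteCt.Id)];

            for (int i = listCt.EventReceivers.Count - 1; i >= 0; i--)
                listCt.EventReceivers[i].Delete();

            listCt.Update();
        }
    }
}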