I'm trying to implement a customization in SharePoint 2010 so that when a document is uploaded to a library, the file name is changed to include the Document ID in the name. (I know that people shouldn't worry about file names as much any more, but we have a lot of legacy files already named and users who like to have local copies).
I was able to implement a custom Event Receiver on the ItemAdded event that renames the file by adding the Document ID before the file name. This works correctly from the web Upload.
The problem is with the Explorer View. When I try to add the file using WebDAV in the Explorer View, I get two copies of the file. It seems that when a file is uploaded via the Web the events that fire are:
ItemAdding
ItemAdded
But when I copy/paste a file into Explorer View I see the following events:
ItemAdding
ItemAdded
ItemAdding
ItemAdded
ItemUpdating
ItemUpdated
The result is that I have two files with different names (since the Document IDs are different).
I've found a lot of people talking about this issue online (this is the best article I found). Anyone have any other ideas? Would it make more sense to do this in a workflow instead of an event receiver? I could use a scheduled job instead, but that might be confusing to the user if the document name changed a few minutes later.
This is my code that works great when using the Web upload but not when using Explorer View:
public override void ItemAdded(SPItemEventProperties properties)
{
    try
    {
        SPListItem currentItem = properties.ListItem;
        if (currentItem["_dlc_DocId"] != null)
        {
            string docId = currentItem["_dlc_DocId"].ToString();
            if (!currentItem["BaseName"].ToString().StartsWith(docId))
            {
                EventFiringEnabled = false;
                currentItem["BaseName"] = docId + currentItem["BaseName"];
                currentItem.SystemUpdate();
                EventFiringEnabled = true;
            }
        }
    }
    catch (Exception ex)
    {
        // Probably should log an error here
    }

    base.ItemAdded(properties);
}
I have found that using a Visual Studio workflow allows me the most flexibility to do this. A SharePoint Designer Workflow would be simpler, but would be harder to deploy to different sites and libraries.
After reading some good articles, including this and this, I came up with the following code, which seems to work. It starts a workflow, waits until the document is no longer in a lock state, and then processes the file name.
The workflow looks like this:
And here is the code behind:
namespace ControlledDocuments.RenameWorkflow
{
    public sealed partial class RenameWorkflow : SequentialWorkflowActivity
    {
        public RenameWorkflow()
        {
            InitializeComponent();
        }

        public Guid workflowId = default(System.Guid);
        public SPWorkflowActivationProperties workflowProperties = new SPWorkflowActivationProperties();

        // Controls the while activity: keep looping until the file is no longer locked.
        Boolean continueWaiting = true;

        private void onWorkflowActivated1_Invoked(object sender, ExternalDataEventArgs e)
        {
            CheckFileStatus();
        }

        private void whileActivity(object sender, ConditionalEventArgs e)
        {
            e.Result = continueWaiting;
        }

        private void onWorkflowItemChanged(object sender, ExternalDataEventArgs e)
        {
            CheckFileStatus();
        }

        private void CheckFileStatus()
        {
            // Stop waiting once the file is no longer locked (e.g. by the WebDAV upload).
            if (workflowProperties.Item.File.LockType == SPFile.SPLockType.None)
            {
                continueWaiting = false;
            }
        }

        private void renameFile(object sender, EventArgs e)
        {
            try
            {
                SPListItem currentItem = workflowProperties.Item;
                if (currentItem["_dlc_DocId"] != null)
                {
                    string docId = currentItem["_dlc_DocId"].ToString();
                    // Only rename if the Document ID is not already part of the name.
                    if (!currentItem["BaseName"].ToString().StartsWith(docId))
                    {
                        currentItem["BaseName"] = docId + currentItem["BaseName"];
                        currentItem.SystemUpdate();
                    }
                }
            }
            catch (Exception ex)
            {
                // Should do something useful here
            }
        }
    }
}
Hope this helps someone else if they have the same problem.
Well, I'd go for the workflow workaround. There are two options, in my opinion:
1) Create a boolean field in your document library, then create an SPD workflow that fires when the item is added and sets that field to "Changed" or something. In the event receiver you then check whether that field has been set (a rough sketch follows this list).
2) Do everything with the SPD workflow; changing the title like in this example should be no problem.
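For option 1, here is a rough sketch of what the receiver side could look like, reusing the rename logic from the question. The Yes/No flag column name ReadyForRename is made up for illustration; the SharePoint Designer workflow would set it once the item has been added.

// Sketch only: assumes a Yes/No column (here called "ReadyForRename") that the
// SharePoint Designer workflow sets after the item has been added.
public override void ItemUpdated(SPItemEventProperties properties)
{
    SPListItem item = properties.ListItem;

    // Do nothing until the Designer workflow has flipped the flag.
    if (item["ReadyForRename"] == null || !Convert.ToBoolean(item["ReadyForRename"]))
        return;

    if (item["_dlc_DocId"] != null)
    {
        string docId = item["_dlc_DocId"].ToString();
        if (!item["BaseName"].ToString().StartsWith(docId))
        {
            EventFiringEnabled = false;
            item["BaseName"] = docId + item["BaseName"];
            item.SystemUpdate();
            EventFiringEnabled = true;
        }
    }
}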
I want to print a node to a PDF file using the "Microsoft Print to PDF" printer. Assuming that the Printer object has already been obtained, I have the following function, which works perfectly.
public static void printToPDF(Printer printer, Node node) {
    PrinterJob job = PrinterJob.createPrinterJob(printer);
    if (job != null) {
        job.getJobSettings().setPrintQuality(PrintQuality.HIGH);
        PageLayout pageLayout = job.getPrinter().createPageLayout(Paper.A4, PageOrientation.PORTRAIT,
                Printer.MarginType.HARDWARE_MINIMUM);
        boolean printed = job.printPage(pageLayout, node);
        if (printed) {
            job.endJob();
        } else {
            System.out.println("Printing failed.");
        }
    } else {
        System.out.println("Could not create a printer job.");
    }
}
The only issue I have here is that a dialog box pops up asking for a destination path to save the PDF. I have been struggling to find a way to set the path programmatically, but with no success. Any suggestions? Thank you in advance.
After some more research I came up with an ugly hack. I accessed the private jobImpl field of PrinterJob and took the attributes out of it. Then I inserted the destination attribute, and apparently it works as requested. I know it is not nice, but... it is kind of workable. If you have any nicer suggestions, please do not hesitate to post them.
try {
    // Grab the private JDK implementation object behind the JavaFX PrinterJob.
    java.lang.reflect.Field field = job.getClass().getDeclaredField("jobImpl");
    field.setAccessible(true);
    PrinterJobImpl jobImpl = (PrinterJobImpl) field.get(job);
    field.setAccessible(false);

    // Pull out its print request attribute set and add a Destination,
    // which makes "Microsoft Print to PDF" write straight to the given file.
    field = jobImpl.getClass().getDeclaredField("printReqAttrSet");
    field.setAccessible(true);
    PrintRequestAttributeSet printReqAttrSet = (PrintRequestAttributeSet) field.get(jobImpl);
    field.setAccessible(false);
    printReqAttrSet.add(new Destination(new java.net.URI("file:/C:/deleteMe/wtv.pdf")));
} catch (Exception e) {
    System.err.println(e);
}
I'm trying to keep a database in sync with the Windows 8 music library and I have yet to find an efficient solution for doing so. I know that .NET has a FileSystemWatcher, but it's not available to Windows 8 apps. Currently, my idea is to compare the list of files returned by GetFilesAsync against my database and check if something was modified, deleted or added. I know this is not ideal, but I can't find any other useful thing in Windows.Storage. My problem is that I want to make these updates automatically once there is a modification to the music library. Checking the ModifiedDate of the folders is useless when the changes happen in subfolders. Does anybody know if there is a way to tell when a StorageFolder has been modified?
If you are able to get ContentsChanged to fire reliably, then the code below may help you determine what changed.
Note, it isn't fast. GetBasicPropertiesAsync appears to take ~5ms/file... so ~10 seconds to diff a set of 1000 files.
(I cannot get ContentsChanged to fire reliably, and, after hours of googling, it appears many others have the same problem)
private class DiffSet
{
    public IReadOnlyList<StorageFile> Added { get; set; }
    public IReadOnlyList<StorageFile> Deleted { get; set; }
    public IReadOnlyList<StorageFile> Changed { get; set; }
}

private static async Task<DiffSet> Diff(IEnumerable<StorageFile> oldSet, IEnumerable<StorageFile> newSet)
{
    var newAsDict = newSet.ToDictionary(sf => sf.Path);
    var added = new List<StorageFile>();
    var deleted = new List<StorageFile>();
    var changed = new List<StorageFile>();
    var fromOldSet = new HashSet<string>();

    foreach (var oldFile in oldSet)
    {
        if (!newAsDict.ContainsKey(oldFile.Path))
        {
            deleted.Add(oldFile);
            continue;
        }

        var oldBasicProperties = await oldFile.GetBasicPropertiesAsync();
        var newBasicProperties = await newAsDict[oldFile.Path].GetBasicPropertiesAsync();
        var oldDateModified = oldBasicProperties.DateModified;
        var newDateModified = newBasicProperties.DateModified;
        if (oldDateModified != newDateModified)
        {
            changed.Add(oldFile);
        }

        fromOldSet.Add(oldFile.Path);
    }

    foreach (var newFile in newSet)
    {
        if (!fromOldSet.Contains(newFile.Path))
            added.Add(newFile);
    }

    return new DiffSet
    {
        Added = added,
        Deleted = deleted,
        Changed = changed
    };
}
I don't think you can get that info from your Windows 8 app. Your best bet is to query the folders and files asynchronously and compare the info to that stored in the database. See one example of enumerating folders and files here. I know this is not very efficient for what you are trying to do. If you find any other better ways, please share.
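As a rough illustration of that enumerate-and-compare idea (my own sketch; it assumes the Music Library capability is declared in the manifest, and the method name is made up), a deep query over a library folder also returns files from its subfolders, and the result can then be diffed against the database, for example with the Diff method shown above:

using System.Collections.Generic;
using System.Threading.Tasks;
using Windows.Storage;
using Windows.Storage.Search;

// Sketch only: enumerate every file in the music library, including subfolders.
// CommonFileQuery.OrderByName performs a deep query in library locations.
public static async Task<IReadOnlyList<StorageFile>> GetAllMusicFilesAsync()
{
    return await KnownFolders.MusicLibrary.GetFilesAsync(CommonFileQuery.OrderByName);
}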
You can use the StorageFileQueryResult and its ContentsChanged event to be notified on changes on a folder and its subfolders. However the event does not contain any information about what actually changed, so you need to re-parse the folder and check if anything you're interested in has been modified.
This works for me:
StorageFileQueryResult query;  // field so both methods (and the event subscription) use the same query

public async void ListenAsync() {
    query = storageFolder.CreateFileQuery();
    query.ContentsChanged += query_ContentsChanged;
    var files = await query.GetFilesAsync();
}

void query_ContentsChanged(IStorageQueryResultBase sender, object args) {
    // args has no info about what changed. check manually
}
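The handler body is left empty above because the event carries no change details. One possible way to fill it in (my assumption, not part of the original answer) is to re-run the query when the event fires and compare the result against a previously cached list or the database:

IReadOnlyList<StorageFile> lastKnownFiles;

async void query_ContentsChanged(IStorageQueryResultBase sender, object args) {
    // The event carries no details, so re-enumerate and compare.
    var currentFiles = await query.GetFilesAsync();

    // Compare currentFiles against lastKnownFiles (or the database contents),
    // e.g. with the Diff method from the other answer, then cache the new list.
    lastKnownFiles = currentFiles;
}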
I've just got some problems with loading XAML files at runtime.
For your information, here is my code snippet to load the file as the content of a UserControl:
public UserControl LoadXaml(FileInfo paramFile)
{
    FileInfo _XamlFile = paramFile;
    UIElement rootElement;

    FileStream s = new FileStream(_XamlFile.FullName, FileMode.Open);
    rootElement = (UIElement)XamlReader.Load(s);
    s.Close();

    UserControl uc = new UserControl();
    if (rootElement.GetType() == typeof(Window))
    {
        uc.Content = (rootElement as Window).Content;
    }
    else
    {
        uc = rootElement as UserControl;
    }
    return uc;
}
private void lstPDFDokumente_SelectionChanged(object sender, SelectionChangedEventArgs e)
{
    var _XamlFile = ((System.Windows.Controls.ListBox)sender).SelectedItem as FileInfo;
    if (_XamlFile != null)
    {
        layoutGrid.Children.Clear();
        System.Windows.Controls.UserControl rootElement;
        rootElement = XamlController.LoadXaml(_XamlFile);
        layoutGrid.Children.Add(rootElement);
    }
}
This works fine as long as the events and the x:Class="..." attribute have been deleted by hand.
The problems I'm trying to solve are:
If there is an x:Class="..." attribute on the root element, the XamlReader throws the first exception.
When the XamlReader reaches a control that contains an event, for example Click or TextChanged, it throws another exception.
What I'm trying to figure out is how to load a XAML file, show it inside a control in the main window, and display some of its attributes such as Name, Height, Width and so on.
I've read dozens of websites but never found a topic about making this kind of preview.
One of the solutions I tried is to read the XAML file as XML and delete that code.
The problem was getting a list of all possible events in C#.
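A rough sketch of that pre-processing idea (my assumption of how it could be done, not the poster's code): load the file as XML, drop the x:Class attribute, and remove attributes whose names match CLR events on the corresponding WPF type, resolved via reflection. The type-name lookup here is simplified and will not catch every case (attached events, custom controls):

using System;
using System.Linq;
using System.Reflection;
using System.Windows;
using System.Xml.Linq;

public static class XamlSanitizer
{
    static readonly XNamespace XamlNs = "http://schemas.microsoft.com/winfx/2006/xaml";
    static readonly Assembly PresentationFramework = typeof(Window).Assembly;

    public static string Sanitize(string xamlText)
    {
        XDocument doc = XDocument.Parse(xamlText);

        // Remove x:Class so the reader does not look for a code-behind class.
        XAttribute classAttribute = doc.Root.Attribute(XamlNs + "Class");
        if (classAttribute != null)
            classAttribute.Remove();

        foreach (XElement element in doc.Descendants())
        {
            // Naive mapping from the element name to a WPF type, e.g. "Button" -> System.Windows.Controls.Button.
            Type type = PresentationFramework.GetType("System.Windows.Controls." + element.Name.LocalName)
                        ?? PresentationFramework.GetType("System.Windows." + element.Name.LocalName);
            if (type == null)
                continue;

            // Drop attributes whose names match events on that type (Click, TextChanged, ...).
            var eventAttributes = element.Attributes()
                                         .Where(a => type.GetEvent(a.Name.LocalName) != null)
                                         .ToList();
            foreach (XAttribute eventAttribute in eventAttributes)
                eventAttribute.Remove();
        }

        return doc.ToString();
    }
}

The cleaned-up string could then be handed to XamlReader.Parse instead of XamlReader.Load.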
If there are any questions about the code, feel free to ask :)
Greetings
Daniel
I'm simply trying to attach a file named Document.pdf in the DocumentsLibrary to an email using the Share Charm. My code below works perfectly on the Local Machine:
private async void OnDataRequestedFiles(DataTransferManager sender, DataRequestedEventArgs e)
{
    List<IStorageItem> shares = new List<IStorageItem>();
    StorageFile filetoShare = await Windows.Storage.KnownFolders.DocumentsLibrary.GetFileAsync("Document.pdf");

    if (filetoShare != null)
    {
        shares.Add(filetoShare);
        filetoShare = null;
    }

    if (shares != null)
    {
        DataPackage requestData = e.Request.Data;
        requestData.Properties.Title = "Title";
        requestData.Properties.Description = "Description"; // The description is optional.
        requestData.SetStorageItems(shares);
        shares = null;
    }
    else
    {
        e.Request.FailWithDisplayText("File not Found.");
    }
}
But when I run the exact same code on a Windows Surface Tablet, I get the dreaded "There's nothing to share right now." on the right in the Charms flyout area.
Here's a little more background to help:
I'm not looking to use a File Picker...I know the exact file I'm looking for
I've enabled the Documents Library Capability in the manifest
I've added a File Type Association for pdf in the manifest
and yes, the file does exist and is in the Documents Library
an email account is properly setup in the Mail App on the surface
I can successfully send text emails from the Tablet...just not emails with attachments
Like I said, this works on my Win 8 Development Machine as expected...just not on the Surface. I'm wondering if the Surface has different file or folder permissions?
Thanks for the help...this is driving me CRAZY
I finally figured it out - the problem was that my Event Handler was async (so that I could use await to set the StorageFile variable).
I solved it by setting the StorageFile variable earlier in my code so that it was already available when the Event Handler was called.
I still have no idea why it worked on my development machine but not on the WinRT Surface...
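For reference, a minimal sketch of that approach, with assumed member names (fileToShare, Page_Loaded) since the original code isn't shown: the file is fetched once up front, so the DataRequested handler can stay synchronous.

private StorageFile fileToShare;

// Fetch the file ahead of time, e.g. when the page loads, and register the handler.
private async void Page_Loaded(object sender, RoutedEventArgs e)
{
    fileToShare = await Windows.Storage.KnownFolders.DocumentsLibrary.GetFileAsync("Document.pdf");
    DataTransferManager.GetForCurrentView().DataRequested += OnDataRequestedFiles;
}

// Synchronous handler: the file is already available, so no await is needed here.
private void OnDataRequestedFiles(DataTransferManager sender, DataRequestedEventArgs e)
{
    if (fileToShare == null)
    {
        e.Request.FailWithDisplayText("File not found.");
        return;
    }

    DataPackage requestData = e.Request.Data;
    requestData.Properties.Title = "Title";
    requestData.Properties.Description = "Description";
    requestData.SetStorageItems(new List<IStorageItem> { fileToShare });
}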
The handler can be an async method. In this case, it is critical to use DataTransferManager. Please refer to the MSDN page specifically for this scenario. For your convenience, the code from the page is copied to here:
private void RegisterForShare()
{
    DataTransferManager dataTransferManager = DataTransferManager.GetForCurrentView();
    dataTransferManager.DataRequested += new TypedEventHandler<DataTransferManager,
        DataRequestedEventArgs>(this.ShareStorageItemsHandler);
}

private async void ShareStorageItemsHandler(DataTransferManager sender,
    DataRequestedEventArgs e)
{
    DataRequest request = e.Request;
    request.Data.Properties.Title = "Share StorageItems Example";
    request.Data.Properties.Description = "Demonstrates how to share files.";

    // Because we are making async calls in the DataRequested event handler,
    // we need to get the deferral first.
    DataRequestDeferral deferral = request.GetDeferral();

    // Make sure we always call Complete on the deferral.
    try
    {
        StorageFile logoFile =
            await Package.Current.InstalledLocation.GetFileAsync("Assets\\Logo.png");
        List<IStorageItem> storageItems = new List<IStorageItem>();
        storageItems.Add(logoFile);
        request.Data.SetStorageItems(storageItems);
    }
    finally
    {
        deferral.Complete();
    }
}
It is critical to place the following statement before any async method is called:
DataTransferManager dataTransferManager = DataTransferManager.GetForCurrentView();
You only have half a second to get the whole job done (getting the file, attaching, etc.). If you miss the half-second deadline you'll get this "driving crazy" message. Consider implementing some resumable logic and replacing the message with "the attachment is being prepared, please try again in a few seconds" (or similar).
Your WinRT device might just be slower than your development machine; the latter simply finishes the job before the deadline...
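One way to implement that resumable idea (my own sketch, not the answerer's code; the file name and member names are assumptions): start fetching the file early and, inside the handler, only attach it if the operation has already finished, otherwise fail with the "try again" text.

private Task<StorageFile> pendingFile;

// Kick off the retrieval early (AsTask converts the WinRT operation to a Task).
private void StartPreparingAttachment()
{
    pendingFile = Windows.Storage.KnownFolders.DocumentsLibrary
        .GetFileAsync("Document.pdf").AsTask();
}

private void OnDataRequested(DataTransferManager sender, DataRequestedEventArgs e)
{
    if (pendingFile == null || !pendingFile.IsCompleted)
    {
        // Stay under the half-second limit: don't wait, just ask the user to retry.
        e.Request.FailWithDisplayText("The attachment is being prepared, please try again in a few seconds.");
        return;
    }

    DataPackage data = e.Request.Data;
    data.Properties.Title = "Title";
    data.Properties.Description = "Description";
    data.SetStorageItems(new List<IStorageItem> { pendingFile.Result });
}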
Hi: I've built a simple sequential workflow that is triggered by the retention policy used by a content type. The workflow takes the metadata of the current item and copies it to a purge log list located on a different site collection.
I'm having a hard time debugging this because the retention policy relies on two timer jobs: information management policy and expiration policy. I attach the debugger to the w3wp and owstimer processes, but it does not reliably catch the activated workflow.
The workflow history list for each file shows the workflow completed successfully. The meat of the workflow is a CodeActivity that collects the data from the current item and updates a central list. That list shows zero items after the process completes. Is the problem in the code inside the CodeActivity?
I removed all of the exception handling to simplify what's being shown.
Code
public sealed partial class Workflow1 : SequentialWorkflowActivity
{
    public Workflow1()
    {
        InitializeComponent();
    }

    public Guid workflowId = default(System.Guid);
    public SPWorkflowActivationProperties workflowProperties = new SPWorkflowActivationProperties();

    string purgeLogListPath = @"http://shptserver/sites/sp";
    string listPath = @"http://shptserver/sites/sp/Lists/PurgeLog/AllItems.aspx";

    private void onWorkflowActivated1_Invoked(object sender, ExternalDataEventArgs e)
    {
    }

    private void codeActivity1_ExecuteCode(object sender, EventArgs e)
    {
        SPSecurity.RunWithElevatedPrivileges(delegate()
        {
            using (SPSite site = new SPSite(purgeLogListPath))
            {
                try
                {
                    SPWeb web = site.OpenWeb("/");
                    SPList list = web.GetListFromUrl(listPath);
                    DateTime today = DateTime.Now;

                    SPListItem item = list.Items.Add();
                    item["Title"] = onWorkflowActivated1.WorkflowProperties.Item.DisplayName;
                    item["Encoded Absolute URL"] = onWorkflowActivated1.WorkflowProperties.ItemUrl;
                    item["Content Type"] = onWorkflowActivated1.WorkflowProperties.Item.ContentType.Name;
                    item["Date Purged"] = today.ToString("MMMM dd yyyy");
                    item["DateTime Purged"] = today.ToString("MMMM dd yyyy hh:mm:ss");
                    item.Update();
                }
                catch (Exception ex)
                {
                    string errorEntry = ex.Message;
                }
            }
        });
    }
}
I don't really see what your question is, but for the debugging troubles try doing it like this:
1) After you deploy, restart the Timer Service.
2) Via Central Admin, go to Timer Jobs and choose 'Run now' to trigger the job and start debugging.