Identifying file changes in StorageFolder - windows-8

I'm trying to keep a database in sync with the Windows 8 music library, and I have yet to find an efficient way to do it. I know .NET has a FileSystemWatcher, but it isn't available to Windows 8 apps. Currently my idea is to compare the list of files returned by GetFilesAsync against my database and check whether anything was modified, deleted or added. I know this is not ideal, but I can't find anything else useful in Windows.Storage. My problem is that I want to run these updates automatically whenever the music library is modified. Checking the DateModified of the folders is useless when the changes happen in subfolders. Does anybody know if there is a way to tell when a StorageFolder has been modified?

If you are able to get ContentsChanged to fire reliably, then the code below may help you determine what changed.
Note that it isn't fast: GetBasicPropertiesAsync appears to take ~5 ms per file, and the diff calls it twice per surviving file, so diffing a set of 1000 files takes ~10 seconds.
(I cannot get ContentsChanged to fire reliably, and, after hours of googling, it appears many others have the same problem.)
private class DiffSet
{
    public IReadOnlyList<StorageFile> Added { get; set; }
    public IReadOnlyList<StorageFile> Deleted { get; set; }
    public IReadOnlyList<StorageFile> Changed { get; set; }
}

private static async Task<DiffSet> Diff(IEnumerable<StorageFile> oldSet, IEnumerable<StorageFile> newSet)
{
    var newAsDict = newSet.ToDictionary(sf => sf.Path);
    var added = new List<StorageFile>();
    var deleted = new List<StorageFile>();
    var changed = new List<StorageFile>();
    var fromOldSet = new HashSet<string>();

    foreach (var oldFile in oldSet)
    {
        // A file that exists in the old set but not the new one was deleted.
        StorageFile newFile;
        if (!newAsDict.TryGetValue(oldFile.Path, out newFile))
        {
            deleted.Add(oldFile);
            continue;
        }

        // Same path in both sets: compare modification timestamps.
        var oldBasicProperties = await oldFile.GetBasicPropertiesAsync();
        var newBasicProperties = await newFile.GetBasicPropertiesAsync();
        if (oldBasicProperties.DateModified != newBasicProperties.DateModified)
        {
            changed.Add(oldFile);
        }
        fromOldSet.Add(oldFile.Path);
    }

    // Anything in the new set never seen in the old set was added.
    foreach (var newFile in newSet)
    {
        if (!fromOldSet.Contains(newFile.Path))
            added.Add(newFile);
    }

    return new DiffSet
    {
        Added = added,
        Deleted = deleted,
        Changed = changed
    };
}

I don't think you can get that information in a Windows 8 app. Your best bet is to query the folders and files asynchronously and compare the results to what is stored in the database. See one example of enumerating folders and files here. I know this is not very efficient for what you are trying to do; if you find a better way, please share.
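As a rough starting point, here is a minimal sketch of that enumeration approach (my own sketch, not from the linked example), assuming the Music Library capability is declared in the manifest. QueryOptions with FolderDepth.Deep returns the files of all subfolders in a single query, which gives you one flat list to diff against the database:
using System.Collections.Generic;
using System.Threading.Tasks;
using Windows.Storage;
using Windows.Storage.Search;

private static async Task<IReadOnlyList<StorageFile>> GetAllMusicFilesAsync()
{
    // The extension list is just an example filter; adjust to taste.
    var options = new QueryOptions(CommonFileQuery.OrderByName,
        new List<string> { ".mp3", ".m4a", ".wma" })
    {
        FolderDepth = FolderDepth.Deep // include files in all subfolders
    };
    var query = KnownFolders.MusicLibrary.CreateFileQueryWithOptions(options);
    return await query.GetFilesAsync();
}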

You can use a StorageFileQueryResult and its ContentsChanged event to be notified of changes in a folder and its subfolders. However, the event does not contain any information about what actually changed, so you need to re-enumerate the folder and check whether anything you're interested in has been modified.

This works for me:
StorageFileQueryResult query;

public async void ListenAsync()
{
    // Keep the query in a field; it must stay alive for the event to keep firing.
    query = storageFolder.CreateFileQuery();
    query.ContentsChanged += query_ContentsChanged;
    // The query must be executed at least once before ContentsChanged is raised.
    var files = await query.GetFilesAsync();
}

void query_ContentsChanged(IStorageQueryResultBase sender, object args)
{
    // args has no info about what changed. Check manually.
}
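One way to combine this with the Diff method from the first answer, replacing the empty handler above (a sketch; it assumes both snippets live in the same class and that lastKnownFiles was seeded with the initial GetFilesAsync result):
private IReadOnlyList<StorageFile> lastKnownFiles;

private async void query_ContentsChanged(IStorageQueryResultBase sender, object args)
{
    // Re-run the query and diff the new snapshot against the previous one.
    var currentFiles = await query.GetFilesAsync();
    var diff = await Diff(lastKnownFiles, currentFiles);
    lastKnownFiles = currentFiles;
    // ...apply diff.Added / diff.Deleted / diff.Changed to the database...
}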

Easy way to retrieve image source in abp

I'm pretty new to the ABP Framework and this question probably has a really simple answer, but I haven't managed to find it. Images are an important part of any app, and handling them well (size, caching) is mandatory.
Scenario
set up a File System Blob Storing provider; this means the uploaded file will be stored in the file system as an image file
make a service that uses a blob container to save and retrieve the image; after saving, I use the unique file name as the blob name, and this name is used to retrieve it later
the user is logged in, so authorization is required
I can easily obtain the byte[] of the image by calling blobContainer.GetAllBytesOrNullAsync(blobName)
I want to easily display the image in an <img> tag or directly in a datatable row
So here is my question: is there an easy way to use a blob-stored image as the src of an <img> directly in a Razor page? What I've managed to achieve is setting, in the model, a source string made from the image type plus the bytes converted to a base64 string (as here). However, in this case I need to do it in the model, and I don't see how browser caching would work with that approach.
I am aware this may be a question more related to ASP.NET Core, but I was thinking that maybe in ABP there is some way to access the image via a link.
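For reference, the base64 approach described above amounts to something like this (a sketch; ImageSrc is a hypothetical model property and the image/png MIME type is assumed):
// In the page model: build a data URI from the blob's bytes.
// Data URIs are embedded in the page itself, so the browser cannot
// cache the image separately - the caching concern raised above.
var bytes = await blobContainer.GetAllBytesOrNullAsync(blobName);
ImageSrc = $"data:image/png;base64,{Convert.ToBase64String(bytes)}";

// In the Razor page:
// <img src="@Model.ImageSrc" />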
If you have the ID of the blob then it is easy to do. Just create an endpoint that returns the image based on the blob ID.
Here is a sample AppService:
public class DocumentAppService : FileUploadAppService
{
    private readonly IBlobContainer<DocumentContainer> _blobContainer;
    private readonly IRepository<Document, Guid> _repository;

    public DocumentAppService(IRepository<Document, Guid> repository, IBlobContainer<DocumentContainer> blobContainer)
    {
        _repository = repository;
        _blobContainer = blobContainer;
    }

    public async Task<List<DocumentDto>> Upload([FromForm] List<IFormFile> files)
    {
        var output = new List<DocumentDto>();
        foreach (var file in files)
        {
            using var memoryStream = new MemoryStream();
            await file.CopyToAsync(memoryStream).ConfigureAwait(false);
            var id = Guid.NewGuid();
            var newFile = new Document(id, file.Length, file.ContentType, CurrentTenant.Id);
            var created = await _repository.InsertAsync(newFile);
            await _blobContainer.SaveAsync(id.ToString(), memoryStream.ToArray()).ConfigureAwait(false);
            output.Add(ObjectMapper.Map<Document, DocumentDto>(newFile));
        }
        return output;
    }

    public async Task<FileResult> Get(Guid id)
    {
        var currentFile = _repository.FirstOrDefault(x => x.Id == id);
        if (currentFile != null)
        {
            var myfile = await _blobContainer.GetAllBytesOrNullAsync(id.ToString());
            return new FileContentResult(myfile, currentFile.MimeType);
        }
        throw new FileNotFoundException();
    }
}
The Upload function uploads the files and the Get function returns a file.
Now set the Get route as the src of the image.
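With ABP's conventional auto API controllers, the Get(Guid id) method above is typically exposed at a route like /api/app/document/{id} (an assumption; the exact route depends on your module and naming conventions), so the image tag would look roughly like:
<!-- Sketch: the route below is assumed from ABP's default conventions -->
<img src="/api/app/document/@Model.DocumentId" />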
Here is the blog post: https://blog.antosubash.com/posts/dotnet-file-upload-with-abp
Repo: https://github.com/antosubash/FileUpload

How to span a ConcurrentDictionary across load-balancer servers when using SignalR hub with Redis

I have an ASP.NET Core web application set up with SignalR scaled out with Redis.
Using the built-in groups works fine:
Clients.Group("Group_Name");
and survives multiple load-balancers. I'm assuming that SignalR persists those groups in Redis automatically, so all servers know what groups we have and who is subscribed to them.
However, in my situation, I can't just rely on Groups (or Users), because there is no way to map a connectionId back to its group (say, when overriding OnDisconnectedAsync, where only the connection ID is known); you always need the Group_Name to identify the group. I need that mapping to identify which part of the group is online, so that when OnDisconnectedAsync is called I know which group this user belongs to and on which side of the conversation he is.
I've done some research, and the suggestions (including the Microsoft docs) all amount to using something like:
static readonly ConcurrentDictionary<string, ConversationInformation> connectionMaps;
in the hub itself.
Now, this is a great solution (and thread-safe), except that it exists only in the memory of one of the load-balanced servers; the other servers each have a different instance of this dictionary.
The question is, do I have to persist connectionMaps manually? Using Redis for example?
Something like:
public class ChatHub : Hub
{
    static readonly ConcurrentDictionary<string, ConversationInformation> connectionMaps;

    ChatHub(IDistributedCache distributedCache)
    {
        connectionMaps = distributedCache.Get("ConnectionMaps");
        // I think connectionMaps should not be static any more.
    }
}
and if yes, is it thread-safe? If not, can you suggest a better solution that works with load balancing?
I have been battling with the same issue on this end. What I've come up with is to persist the collections in the Redis cache, using a StackExchange.Redis.IDatabaseAsync together with locks to handle concurrency.
This unfortunately makes the whole process effectively synchronous, but I couldn't figure a way around that.
Here's the core of what I'm doing. This acquires a lock and returns the deserialised collection from the cache:
private async Task<ConcurrentDictionary<int, HubMedia>> GetMediaAttributes(bool requireLock)
{
    if (requireLock)
    {
        var retryTime = 0;
        try
        {
            while (!await _redisDatabase.LockTakeAsync(_mediaAttributesLock, _lockValue, _defaultLockDuration))
            {
                // Wait until we can get a lock on the data, retrying every 100 ms.
                await Task.Delay(100);
                retryTime += 100;
                if (retryTime > _defaultLockDuration.TotalMilliseconds)
                {
                    _logger.LogError("Failed to get Media Attributes");
                    return null;
                }
            }
        }
        catch (TaskCanceledException e)
        {
            _logger.LogError("Failed to take lock within the default 5 second wait time " + e);
            return null;
        }
    }

    var mediaAttributes = await _redisDatabase.StringGetAsync(MEDIA_ATTRIBUTES_LIST);
    if (!mediaAttributes.HasValue)
    {
        return new ConcurrentDictionary<int, HubMedia>();
    }
    return JsonConvert.DeserializeObject<ConcurrentDictionary<int, HubMedia>>(mediaAttributes);
}
I update the collection like this after I've finished manipulating it (note that the lock is released in the finally block):
private async Task<bool> UpdateCollection(string redisCollectionKey, object collection, string lockKey)
{
    var success = false;
    try
    {
        success = await _redisDatabase.StringSetAsync(redisCollectionKey, JsonConvert.SerializeObject(collection, new JsonSerializerSettings
        {
            ReferenceLoopHandling = ReferenceLoopHandling.Ignore
        }));
    }
    finally
    {
        await _redisDatabase.LockReleaseAsync(lockKey, _lockValue);
    }
    return success;
}
and when I'm done I just ensure the lock is released so other instances can grab and use it:
private async Task ReleaseLock(string lockKey)
{
    await _redisDatabase.LockReleaseAsync(lockKey, _lockValue);
}
I'd be happy to hear if you find a better way of doing this. I struggled to find any documentation on scale-out with data retention and sharing.
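To show how these pieces fit together, a typical call sequence might look like the following (a sketch only; mediaId is a hypothetical key, and the fields are the ones assumed in the snippets above):
// Hypothetical hub override showing the lock / read / mutate / write cycle.
public override async Task OnDisconnectedAsync(Exception exception)
{
    // Take the lock and load the shared collection from Redis.
    var mediaAttributes = await GetMediaAttributes(true);
    if (mediaAttributes != null)
    {
        // Mutate the local copy, then write it back. UpdateCollection
        // releases the lock in its finally block.
        HubMedia removed;
        mediaAttributes.TryRemove(mediaId, out removed); // mediaId: hypothetical
        await UpdateCollection(MEDIA_ATTRIBUTES_LIST, mediaAttributes, _mediaAttributesLock);
    }
    await base.OnDisconnectedAsync(exception);
}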

Trouble Attaching File Programmatically to Email in Windows Metro App C#/XAML using Share Charm

I'm simply trying to attach a file named Document.pdf from the Documents library to an email using the Share Charm. My code below works perfectly on the local machine:
private async void OnDataRequestedFiles(DataTransferManager sender, DataRequestedEventArgs e)
{
    List<IStorageItem> shares = new List<IStorageItem>();
    StorageFile filetoShare = await Windows.Storage.KnownFolders.DocumentsLibrary.GetFileAsync("Document.pdf");
    if (filetoShare != null)
    {
        shares.Add(filetoShare);
        filetoShare = null;
    }
    if (shares != null)
    {
        DataPackage requestData = e.Request.Data;
        requestData.Properties.Title = "Title";
        requestData.Properties.Description = "Description"; // The description is optional.
        requestData.SetStorageItems(shares);
        shares = null;
    }
    else
    {
        e.Request.FailWithDisplayText("File not Found.");
    }
}
But when I run the exact same code on a Windows Surface Tablet, I get the dreaded "There's nothing to share right now." on the right in the Charms flyout area.
Here's a little more background to help:
I'm not looking to use a File Picker...I know the exact file I'm looking for
I've enabled the Documents Library Capability in the manifest
I've added a File Type Association for pdf in the manifest
and yes, the file does exist and is in the Documents Library
an email account is properly setup in the Mail App on the surface
I can successfully send text emails from the Tablet...just not emails with attachments
Like I said, this works on my Win 8 Development Machine as expected...just not on the Surface. I'm wondering if the Surface has different file or folder permissions?
Thanks for the help...this is driving me CRAZY
I finally figured it out: the problem was that my event handler was async (so that I could use await to set the StorageFile variable).
I solved it by setting the StorageFile variable earlier in my code so that it was already available when the event handler was called.
I still have no idea why it worked on my development machine but not on the WinRT Surface...
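A sketch of that workaround (my reconstruction; _fileToShare is a hypothetical field name): load the file ahead of time so the DataRequested handler can stay synchronous.
private StorageFile _fileToShare;

protected override async void OnNavigatedTo(NavigationEventArgs e)
{
    base.OnNavigatedTo(e);
    // Load the file before any share request arrives.
    _fileToShare = await KnownFolders.DocumentsLibrary.GetFileAsync("Document.pdf");
}

private void OnDataRequestedFiles(DataTransferManager sender, DataRequestedEventArgs e)
{
    if (_fileToShare != null)
    {
        DataPackage requestData = e.Request.Data;
        requestData.Properties.Title = "Title";
        requestData.Properties.Description = "Description";
        requestData.SetStorageItems(new List<IStorageItem> { _fileToShare });
    }
    else
    {
        e.Request.FailWithDisplayText("File not Found.");
    }
}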
The handler can be an async method, but in that case it is critical to take a DataRequestDeferral and to obtain the DataTransferManager before any await. Please refer to the MSDN page specifically for this scenario. For your convenience, the code from that page is copied here:
private void RegisterForShare()
{
    DataTransferManager dataTransferManager = DataTransferManager.GetForCurrentView();
    dataTransferManager.DataRequested += new TypedEventHandler<DataTransferManager,
        DataRequestedEventArgs>(this.ShareStorageItemsHandler);
}

private async void ShareStorageItemsHandler(DataTransferManager sender,
    DataRequestedEventArgs e)
{
    DataRequest request = e.Request;
    request.Data.Properties.Title = "Share StorageItems Example";
    request.Data.Properties.Description = "Demonstrates how to share files.";

    // Because we are making async calls in the DataRequested event handler,
    // we need to get the deferral first.
    DataRequestDeferral deferral = request.GetDeferral();

    // Make sure we always call Complete on the deferral.
    try
    {
        StorageFile logoFile =
            await Package.Current.InstalledLocation.GetFileAsync("Assets\\Logo.png");
        List<IStorageItem> storageItems = new List<IStorageItem>();
        storageItems.Add(logoFile);
        request.Data.SetStorageItems(storageItems);
    }
    finally
    {
        deferral.Complete();
    }
}
It is critical to place the following statement before any async method is called:
DataTransferManager dataTransferManager = DataTransferManager.GetForCurrentView();
You only have half a second to get the whole job done (getting the file, attaching, etc.). If you miss that deadline, you get this "driving crazy" message. Consider implementing some resumable logic and replacing the message with "the attachment is being prepared, please try again in a few seconds" (or similar).
Your WinRT device might just be slower than your development machine; the latter simply finishes the job before the deadline.
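A minimal sketch of that resumable idea (my own; _cachedFile and the message text are assumptions): fail fast with a friendly message while the file loads in the background, so a second share attempt succeeds.
private StorageFile _cachedFile;

private void OnDataRequested(DataTransferManager sender, DataRequestedEventArgs e)
{
    if (_cachedFile == null)
    {
        // Kick off the load for the next attempt and bail out
        // before the share deadline expires.
        var ignored = CacheFileAsync();
        e.Request.FailWithDisplayText(
            "The attachment is being prepared, please try again in a few seconds.");
        return;
    }
    e.Request.Data.Properties.Title = "Title";
    e.Request.Data.SetStorageItems(new List<IStorageItem> { _cachedFile });
}

private async Task CacheFileAsync()
{
    _cachedFile = await KnownFolders.DocumentsLibrary.GetFileAsync("Document.pdf");
}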

How do I use Sitecore.Data.Serialization.Manager.LoadItem(path, LoadOptions) to restore an item to Sitecore?

I am trying to use the Sitecore API to serialize and restore Sitecore items. I have created a WCF app that retrieves an item name given an ID or Sitecore path (/sitecore/content/home), and retrieves a list of the names of an item's children given an ID or path. I can also serialize the content tree:
public void BackupItemTree(string id)
{
    Database db = Sitecore.Configuration.Factory.GetDatabase("master");
    Item itm = db.GetItem(id);
    Sitecore.Data.Serialization.Manager.DumpTree(itm);
}
The above code works great; after running it I can see that the content tree has been serialized.
However, when I try to restore the serialized items using the following:
public void RestoreItemTree(string path)
{
    try
    {
        using (new Sitecore.SecurityModel.SecurityDisabler())
        {
            Database db = Sitecore.Configuration.Factory.GetDatabase("master");
            Data.Serialization.LoadOptions opt = new Data.Serialization.LoadOptions(db);
            opt.ForceUpdate = true;
            Sitecore.Data.Serialization.Manager.LoadItem(path, opt);
            //Sitecore.Data.Serialization.Manager.LoadTree(path, opt);
        }
    }
    catch (Exception ex)
    {
        throw ex;
    }
}
With this code I get no errors. It runs, but when I check Sitecore nothing has happened. I have been testing with the Office Core example. The path I pass in (which might be the issue) is:
C:\inetpub\wwwroot\sitecoretest\Data\serialization\master\sitecore\content\Home\Standard-Items\Teasers\Our-Clients.item
and
C:\inetpub\wwwroot\sitecorebfahnestockinet\Data\serialization\master\sitecore\content\Home\Standard-Items\Teasers\Our-Clients
Neither seems to do anything. I changed the teaser title of the item and am trying to restore the state from before the change, but every time the change is still present.
Any help would be appreciated, as the Sitecore documentation is very limited.
You can always check how the Sitecore code works using Reflector. The following method is called when you click "Revert Item" in the back-end:
protected virtual Item LoadItem(Item item, LoadOptions options)
{
    Assert.ArgumentNotNull(item, "item");
    return Manager.LoadItem(PathUtils.GetFilePath(new ItemReference(item).ToString()), options);
}
In LoadOptions you can specify whether you want to overwrite ("Revert Item") or just update ("Update Item") it.
See Sitecore.Shell.Framework.Commands.Serialization.LoadItemCommand for more info.
You have the correct LoadOptions for forcing an overwrite (aka revert).
I suspect that the path you are using for the .item file is wrong. I would suggest modifying your method to take the path of a Sitecore item; using that path, you can leverage the other serialization APIs to determine where the file should be:
public void RestoreItemTree(string itemPath)
{
    Sitecore.Data.Database db = Sitecore.Configuration.Factory.GetDatabase("master");
    Sitecore.Data.Serialization.ItemReference itemReference = new Sitecore.Data.Serialization.ItemReference(db.Name, itemPath);
    string path = Sitecore.Data.Serialization.PathUtils.GetFilePath(itemReference.ToString());
    Sitecore.Data.Serialization.LoadOptions opt = new Sitecore.Data.Serialization.LoadOptions(db);
    opt.ForceUpdate = true;
    using (new Sitecore.SecurityModel.SecurityDisabler())
    {
        Sitecore.Data.Serialization.Manager.LoadItem(path, opt);
    }
}
It took me a while to work out, but you have to remove .item when restoring the tree.
Try this:
public void RestoreItemTree(string itemPath)
{
    var db = Factory.GetDatabase("master");
    var itemReference = new ItemReference(db.Name, itemPath);
    var path = PathUtils.GetFilePath(itemReference.ToString());
    if (!System.IO.File.Exists(path))
    {
        throw new Exception("File not found " + path);
    }
    var opt = new LoadOptions(db);
    opt.ForceUpdate = true;
    using (new SecurityDisabler())
    {
        // Load the item itself, then its descendants (the tree path has no .item extension).
        Manager.LoadItem(path, opt);
        Manager.LoadTree(path.Replace(".item", ""), opt);
    }
}

SharePoint 2010 Rename Document on Upload Fails in Explorer View

I'm trying to implement a customization in SharePoint 2010 so that when a document is uploaded to a library, the file name is changed to include the Document ID in the name. (I know that people shouldn't worry about file names as much any more, but we have a lot of legacy files already named and users who like to have local copies).
I was able to implement a custom Event Receiver on the ItemAdded event that renames the file by adding the Document ID before the file name. This works correctly from the web Upload.
The problem is with the Explorer View. When I try to add the file using WebDAV in the Explorer View, I get two copies of the file. It seems that when a file is uploaded via the web, the events that fire are:
ItemAdding
ItemAdded
But when I copy/paste a file into Explorer View I see the following events:
ItemAdding
ItemAdded
ItemAdding
ItemAdded
ItemUpdating
ItemUpdated
The result is I have two files with different names (since the Document IDs are different).
I've found a lot of people talking about this issue online (this is the best article I found). Anyone have any other ideas? Would it make more sense to do this in a workflow instead of an event receiver? I could use a scheduled job instead, but that might be confusing to the user if the document name changed a few minutes later.
This is my code, which works great with the web upload but not with Explorer View:
public override void ItemAdded(SPItemEventProperties properties)
{
    try
    {
        SPListItem currentItem = properties.ListItem;
        if (currentItem["_dlc_DocId"] != null)
        {
            string docId = currentItem["_dlc_DocId"].ToString();
            if (!currentItem["BaseName"].ToString().StartsWith(docId))
            {
                // Disable event firing so SystemUpdate doesn't retrigger this receiver.
                EventFiringEnabled = false;
                currentItem["BaseName"] = docId + currentItem["BaseName"];
                currentItem.SystemUpdate();
                EventFiringEnabled = true;
            }
        }
    }
    catch (Exception ex)
    {
        //Probably should log an error here
    }
    base.ItemAdded(properties);
}
I have found that using a Visual Studio workflow gives me the most flexibility to do this. A SharePoint Designer workflow would be simpler, but would be harder to deploy to different sites and libraries.
After reading some good articles, including this and this, I have come up with code that seems to work. It starts a workflow that waits until the document is no longer in a lock state and then processes the file name.
The workflow is a simple sequence: on activation it checks whether the file is locked, keeps waiting on item-changed events while it is, and then renames the file. Here is the code behind:
namespace ControlledDocuments.RenameWorkflow
{
    public sealed partial class RenameWorkflow : SequentialWorkflowActivity
    {
        public RenameWorkflow()
        {
            InitializeComponent();
        }

        public Guid workflowId = default(System.Guid);
        public SPWorkflowActivationProperties workflowProperties = new SPWorkflowActivationProperties();
        Boolean continueWaiting = true;

        private void onWorkflowActivated1_Invoked(object sender, ExternalDataEventArgs e)
        {
            CheckFileStatus();
        }

        private void whileActivity(object sender, ConditionalEventArgs e)
        {
            e.Result = continueWaiting;
        }

        private void onWorkflowItemChanged(object sender, ExternalDataEventArgs e)
        {
            CheckFileStatus();
        }

        private void CheckFileStatus()
        {
            if (workflowProperties.Item.File.LockType == SPFile.SPLockType.None)
            {
                continueWaiting = false;
            }
        }

        private void renameFile(object sender, EventArgs e)
        {
            try
            {
                SPListItem currentItem = workflowProperties.Item;
                if (currentItem["_dlc_DocId"] != null)
                {
                    string docId = currentItem["_dlc_DocId"].ToString();
                    if (!currentItem["BaseName"].ToString().StartsWith(docId))
                    {
                        currentItem["BaseName"] = docId + currentItem["BaseName"];
                        currentItem.SystemUpdate();
                    }
                }
            }
            catch (Exception ex)
            {
                //Should do something useful here
            }
        }
    }
}
Hope this helps someone else if they have the same problem.
Well, I'd go for the workflow workaround; there are two options IMO:
1) Create a boolean field in your document library, then create an SPD workflow that fires when the item is added and sets that field to "Changed" or something. In the event receiver you then check whether that field has been set.
2) Do everything with the SPD workflow; changing the title as in this example should be no problem.
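A rough sketch of the receiver-side check for option 1 (the "DocIdWorkflowDone" field name is hypothetical; the rename logic would be the same as in the ItemAdded receiver above):
public override void ItemUpdated(SPItemEventProperties properties)
{
    SPListItem item = properties.ListItem;

    // "DocIdWorkflowDone" is a hypothetical boolean field set by the
    // SPD workflow; skip items the workflow hasn't processed yet.
    object flag = item["DocIdWorkflowDone"];
    if (flag == null || !(bool)flag)
        return;

    // ...rename logic from the ItemAdded receiver above...
}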