I am migrating an ASP.NET 4.5 website to ASP.NET 5. One action method we had returned images from the hard disk at an absolute location; the files aren't stored within the web directory. Previously this worked fine, with the following code:
public ActionResult GetVideoImage(string serialNumber, int videoEntryId)
{
    try
    {
        var serial = Device.FriendlySerialNumberToNumericalSerialNumber(serialNumber);
        var entry = this.service.GetVideoEntry(serial, videoEntryId);
        if (entry != null && System.IO.File.Exists(entry.FirstVideoFrameLocation.LocalPath))
        {
            return this.File(entry.FirstVideoFrameLocation.LocalPath, "image/jpeg"); // adjust content type appropriately
        }
    }
    catch
    {
        // fall through to the placeholder image below
    }

    return this.Redirect("/content/noimage.png");
}
Unfortunately this doesn't work any more and throws an exception. From what I can tell, it's because this.File now takes a virtual path rather than an absolute one, so it balks at the idea of serving a file from outside its web directory.
How can I get around this?
Also, is ActionResult still the best return type for this?
I found the answer on an MS thread here that links to an ASP.NET GitHub commit.
Long story short, there are new classes available in the Microsoft.AspNet.Mvc namespace that do what I'm looking for. I specifically chose PhysicalFileResult, which works as expected.
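For reference, here is a rough sketch of how the action ends up looking with PhysicalFileResult; this is an adaptation rather than the exact code, and it assumes the constructor takes the absolute path plus a content type. ActionResult is still a reasonable return type, since PhysicalFileResult ultimately derives from it (IActionResult works too).

public ActionResult GetVideoImage(string serialNumber, int videoEntryId)
{
    var serial = Device.FriendlySerialNumberToNumericalSerialNumber(serialNumber);
    var entry = this.service.GetVideoEntry(serial, videoEntryId);
    if (entry != null && System.IO.File.Exists(entry.FirstVideoFrameLocation.LocalPath))
    {
        // PhysicalFileResult serves a file from an absolute path outside the web root
        return new PhysicalFileResult(entry.FirstVideoFrameLocation.LocalPath, "image/jpeg");
    }

    return this.Redirect("/content/noimage.png");
}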
Related
These are great guides for migrating between the different versions of the NuGet package:
https://github.com/Azure/azure-sdk-for-net/blob/Azure.Storage.Blobs_12.6.0/sdk/storage/Azure.Storage.Blobs/README.md
https://elcamino.cloud/articles/2020-03-30-azure-storage-blobs-net-sdk-v12-upgrade-guide-and-tips.html
However, I am struggling to migrate the following concepts in my code:
// Return if a directory exists:
container.GetDirectoryReference(path).ListBlobs().Any();
where GetDirectoryReference is no longer recognised and there appears to be no direct translation.
Also, the concept of a CloudBlobDirectory does not appear to have made it into Azure.Storage.Blobs e.g.
private static long GetDirectorySize(CloudBlobDirectory directoryBlob)
{
    long size = 0;
    foreach (var blobItem in directoryBlob.ListBlobs())
    {
        if (blobItem is CloudBlob)
            size += ((CloudBlob)blobItem).Properties.Length;
        if (blobItem is CloudBlobDirectory)
            size += GetDirectorySize((CloudBlobDirectory)blobItem);
    }
    return size;
}
where CloudBlobDirectory does not appear anywhere in the API.
There's no such thing as physical directories or folders in Azure Blob Storage. The directories you sometimes see are part of the blob name (e.g. folder1/folder2/file1.txt). The List Blobs request allows you to add a prefix and a delimiter to the call, which are used by the Azure Portal and Azure Storage Explorer to create a visualization of folders. For example, a prefix of folder1/ and a delimiter of / would show the content as if folder1 had been opened.
That's exactly what happens in your code: GetDirectoryReference() adds a prefix, ListBlobs() fires the request, and Any() checks whether any items are returned.
For v12 the method that lets you do the same is GetBlobsByHierarchy (or its async version). In your particular case, where you only want to know whether any blobs exist under the 'directory', GetBlobs with a prefix would also suffice.
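As a concrete sketch of both operations against the v12 SDK, assuming container is a BlobContainerClient and path is the old directory prefix (e.g. "folder1/"); treat it as a starting point rather than a drop-in replacement:

using System.Linq;
using Azure.Storage.Blobs;

// "Does the directory exist?" becomes "is there at least one blob with this prefix?"
bool directoryExists = container.GetBlobs(prefix: path).Any();

// Closest equivalent of the old one-level directory listing:
var children = container.GetBlobsByHierarchy(prefix: path, delimiter: "/");

// Directory size: flat-list everything under the prefix and sum the content lengths.
// No recursion is needed, because GetBlobs(prefix: ...) already returns nested blobs.
static long GetDirectorySize(BlobContainerClient container, string prefix)
{
    return container.GetBlobs(prefix: prefix)
                    .Sum(blob => blob.Properties.ContentLength ?? 0);
}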
I noticed this in the debug environment, where I have to do many re-installs in order to test persistent data storage, initial settings, etc. It may not be relevant in production, but I mention it anyway just to inform other developers.
Any files created by an app in its App Folder are not 'visible' to queries after a manual un-install / re-install (from the IDE, for instance). The same applies to the 'Encoded DriveID' - it is no longer valid.
It is probably 'by design', but it effectively creates 'orphans' in the app folder until they are manually cleaned up via 'drive.google.com > Manage Apps > [yourapp] > Options > Delete hidden app data'. It also creates problems if an app relies on finding files by metadata, title, ... since these seem to be gone. As I said, it is not a production problem, but it can create some frustration during development.
Can any of the friendly Googlers confirm this? Is there any other way to get to these files after a re-install?
Try this approach:
Use requestSync() in onConnected() as:
@Override
public void onConnected(Bundle connectionHint) {
    super.onConnected(connectionHint);
    Drive.DriveApi.requestSync(getGoogleApiClient()).setResultCallback(syncCallback);
}
Then, in its callback, query the contents of the drive using:
final private ResultCallback<Status> syncCallback = new ResultCallback<Status>() {
    @Override
    public void onResult(@NonNull Status status) {
        if (!status.isSuccess()) {
            showMessage("Problem while retrieving results");
            return;
        }
        Query query = new Query.Builder()
                .addFilter(Filters.and(Filters.eq(SearchableField.TITLE, "title"),
                        Filters.eq(SearchableField.TRASHED, false)))
                .build();
        Drive.DriveApi.query(getGoogleApiClient(), query)
                .setResultCallback(metadataCallback);
    }
};
Then, in its callback, if found, retrieve the file using:
final private ResultCallback<DriveApi.MetadataBufferResult> metadataCallback =
        new ResultCallback<DriveApi.MetadataBufferResult>() {
    @SuppressLint("SetTextI18n")
    @Override
    public void onResult(@NonNull DriveApi.MetadataBufferResult result) {
        if (!result.getStatus().isSuccess()) {
            showMessage("Problem while retrieving results");
            return;
        }
        MetadataBuffer mdb = result.getMetadataBuffer();
        for (Metadata md : mdb) {
            Date createdDate = md.getCreatedDate();
            DriveId driveId = md.getDriveId();
            readFromDrive(driveId); // handle each matching file inside the loop
        }
    }
};
Job done!
Hope that helps!
It looks like Google Play services has a problem. (https://stackoverflow.com/a/26541831/2228408)
For testing, you can do it by clearing Google Play services data (Settings > Apps > Google Play services > Manage Space > Clear all data).
Or, at this time, you need to implement it by using Drive SDK v2.
I think you are correct that it is by design.
By inspection I have concluded that until an app places data in the AppFolder, Drive does not sync down to the device, however much you try to hassle it. Therefore it is impossible to check for the existence of AppFolder content placed there by another device or by a prior installation. I assume this is to try to create a consistent clean install.
I can see that there are a couple of strategies to work around this:
1) Place dummy data in the AppFolder, then sync and re-check.
2) Accept that in the first instance there is the possibility of duplicates: since you cannot access the existing file, by definition you will create a new copy, and use custom metadata to come up with a scheme to differentiate like-named files and choose which one you want to keep (essentially implementing your conflict-merge strategy across the two different files).
I've done the second: I keep an update number to compare data from different devices and decide which version I want, and therefore whether to upload, download or leave alone. As my data is an SQLite DB, I also have some code that only syncs once updates have settled down, and I deliberately consider updating two devices at once foolish; the results are consistent but undefined as to which version will win.
I currently have a problem figuring out how to use the ImageResizer plugin to properly work with SQL and the DiskCache plugin.
My strategy for naming is as follows:
/myimagetitle-4319560-100x100.jpg is rewritten to /4319560.jpg?id=4319560&title=myimagetitle&height=100&width=100 by the IIS URL Rewrite module.
This works as expected.
Now, in order to find the file name for the image, I need to translate the id using SQL. I have created an IVirtualImageProvider plugin, which implements the FileExists and GetFile methods.
public IVirtualFile GetFile(string virtualPath, NameValueCollection queryString)
{
    var path = this.GetOriginalFilePath(queryString);
    return new VirtualFileWrapper(new ProductPhotoVirtualFile(path));
}

public bool FileExists(string virtualPath, NameValueCollection queryString)
{
    if (File.Exists(this.GetCachedFilePath(queryString)))
    {
        return true;
    }

    if (File.Exists(this.GetOriginalFilePath(queryString)))
    {
        return true;
    }

    return false;
}

private string GetCachedFilePath(NameValueCollection queryString)
{
    // Get customized cache file path based on the query string
    // "cache\4319\560\4319560_w_100_h_100.jpg"
}

private string GetOriginalFilePath(NameValueCollection queryString)
{
    // Perform SQL lookup to translate the id from the query to a file name
}
I am using the DiskCache plugin to ensure my images are cached using IIS etc.
Unfortunately, the FileExists method is always run, executing the SQL on each request.
What I would like to achieve is the following:
have the DiskCache plugin run before the FileExists method, and thereby skip the actual SQL lookup if the file is already cached
handle the cache file naming strategy, so that other tools can generate the images in the correct folders.
Is any of the above possible and/or am I doing something wrong?
Thanks
FileExists will always be called, as there's no other way to determine which IVirtualImageProvider should be responsible for the request, and therefore which one is responsible for providing caching details such as the modified date and keys. A better name for it would be IsHandled.
In practice, it's OK to lie within the FileExists method, as throwing a FileNotFoundException during .Open() later will also be handled as a 404. It's not OK to lie within FileExists if it will prevent another IVirtualImageProvider from working.
In your case, you should return true from FileExists if the image URL is in the format you're expecting (i.e. it has an ID, or is within the path structure dedicated to SQL image blobs). Typically it's best to use a path prefix, though, as it's hard to predict more complex patterns reliably.
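To make that concrete, here is a minimal sketch of such a FileExists, assuming the SQL-backed images all live under a dedicated prefix (the "/productphotos/" prefix is purely hypothetical):

public bool FileExists(string virtualPath, NameValueCollection queryString)
{
    // Claim the request based on the URL shape alone; no SQL and no disk access here.
    // If the id later turns out not to exist, ProductPhotoVirtualFile.Open() can throw
    // a FileNotFoundException and ImageResizer will turn it into a 404.
    return virtualPath.StartsWith("/productphotos/", StringComparison.OrdinalIgnoreCase)
        && !string.IsNullOrEmpty(queryString["id"]);
}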
I am using the code below to try and bundle a pre-minified version of the jQuery simpleModal plugin in an ASP.NET MVC 4 project:
public static void RegisterBundles(BundleCollection bundles)
{
    AddDefaultIgnorePatterns(bundles.IgnoreList);
    bundles.Add(new ScriptBundle("~/ModalBundle")
        .Include("~/Scripts/jquery.simplemodal.{version}.min.js"));
}
where AddDefaultIgnorePatterns() is defined as per "ASP.NET MVC 4 ScriptBundle returns empty":
public static void AddDefaultIgnorePatterns(IgnoreList ignoreList)
{
    if (ignoreList == null)
        throw new ArgumentNullException("ignoreList");

    ignoreList.Clear();
    ignoreList.Ignore("*.intellisense.js");
    ignoreList.Ignore("*-vsdoc.js");
    ignoreList.Ignore("*.debug.js", OptimizationMode.WhenEnabled);
    //ignoreList.Ignore("*.min.js", OptimizationMode.WhenDisabled);
    ignoreList.Ignore("*.min.css", OptimizationMode.WhenDisabled);
}
Unfortunately, the requested resource is never included when rendered using @Scripts.Render() after being registered in Application_Start().
The following combinations do seem to work, but where possible I did not want to have to change the file name across a large number of projects:
jquery.simplemodal.1.2.3.min.js (exact version match)
jquery.simplemodal.{version}.js (removal of '.min', with corresponding file rename)
The following do not work:
jquery.simplemodal-{version}.min.js (hyphen before version, with corresponding file rename)
jquery.simplemodal* (loose wildcard, which is very suspicious)
I have verified that the file definitely exists in the expected location, so I must be missing something else fundamental. None of my other script or style bundles suffer from this problem. Any ideas?
If you want to use the bundling system, you must not reference pre-minified files.
When you are in debug mode, the system will load the normal (unminified) files.
When you are in release mode, the system will minify them automatically.
You can look at a working configuration here:
https://myprettycms.codeplex.com/SourceControl/latest#MyPrettyCMSCommunityManager/Portals/MVC4Portal/App_Start/BundleConfig.cs
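For completeness, here is a sketch of what that looks like applied to the bundle above, assuming the unminified jquery.simplemodal file sits alongside the .min.js one (the version number is just an example):

public static void RegisterBundles(BundleCollection bundles)
{
    AddDefaultIgnorePatterns(bundles.IgnoreList);

    // Reference the unminified file; the optimizer minifies it when optimizations are enabled.
    bundles.Add(new ScriptBundle("~/ModalBundle")
        .Include("~/Scripts/jquery.simplemodal.{version}.js"));
}

With this in place, @Scripts.Render("~/ModalBundle") emits the raw script in debug and a single minified bundle in release.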
In my project properties I go to Publish > Options > File Associations and enter ".cms", "Contact Manager File", "pqcms" and "1icon.ico", but when I publish and install, it does not appear to associate the files. I want to be able to double-click on a file and have it open the program, but that does not happen.
I believe there are ways to edit the registry if you run your program as an administrator, but I really need ClickOnce to be happy with me because I am maximizing its features. Isn't ClickOnce supposed to set up the file association for me? Why isn't it?
And a final question: what can I do without elevating privileges to administrator?
Have you added the code required to handle the user double-clicking on the file?
// Get the ActivationArguments from the SetupInformation property of the domain.
string[] activationData =
    AppDomain.CurrentDomain.SetupInformation.ActivationArguments.ActivationData;
if (activationData != null)
{
    Uri uri = new Uri(activationData[0]);
    string fileNamePassedIn = uri.LocalPath.ToString();
    // now you have the file name and you can handle it
}
One other thing to beware of: I originally converted this code (provided by RobinDotNet) to VB.NET. Now I've converted the project to C# and ran into something interesting. When debugging (and, I'd imagine, if you choose to run the exe directly rather than the ClickOnce reference app), AppDomain.CurrentDomain.SetupInformation.ActivationArguments is null (no activation arguments were assigned), so I modified the code slightly to trap this error.
// Get the ActivationArguments from the SetupInformation property of the domain, if any are set.
if (AppDomain.CurrentDomain.SetupInformation.ActivationArguments != null)
{
    string[] activationData =
        AppDomain.CurrentDomain.SetupInformation.ActivationArguments.ActivationData;
    if (activationData != null)
    {
        Uri uri = new Uri(activationData[0]);
        string fileNamePassedIn = uri.LocalPath.ToString();
        // now you have the file name and you can handle it
    }
}