I have followed ImageResizer's troubleshooting steps for when images do not appear, as listed here:
Potential causes:
You are using the .jpg.ashx syntax, and you did not register the HttpModule properly in both places of your Web.config file.
You are using the .jpg.ashx syntax, but you're not using a query string. You should drop the '.ashx' unless you actually want to process the file.
You are using ASP.NET MVC, but do not have the MvcRoutingShim plugin installed.
You have Precompilation enabled, and are using an image provider. This is caused by a long-standing bug in the .NET framework.
However, everything works perfectly in my local development environment. I have the MvcRoutingShim plugin installed and all the correct web.config references. On the live site, all images referenced either by just a querystring or via the RemoteReaderPlugin.Current.CreateSignedUrl procedure are not found.
All images are resulting in a 404 error.
Any ideas?
The answer: DiskCache(ConfigurationError): Not working: Your NTFS Security permissions are preventing the application from writing to the disk cache
Please give the user read and write access to the directory "C:\inetpub\wwwroot\{site}\imagecache" to correct the problem. You can access NTFS security settings by right-clicking the aforementioned folder and choosing Properties, then Security.
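If it helps, the same permissions can be granted from an elevated command prompt. This is only a sketch, assuming the site runs under ApplicationPoolIdentity and that {site} and {AppPoolName} are placeholders for your own folder and application pool names:

icacls "C:\inetpub\wwwroot\{site}\imagecache" /grant "IIS AppPool\{AppPoolName}":(OI)(CI)M

The (OI)(CI)M flags grant Modify permissions that are inherited by files and subfolders created inside the cache directory.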
Running the pdf2htmlEX.exe Windows binary from the command prompt works as expected. However, when running the pdf2htmlEX Windows binary from a wrapper (.NET in my case), I received an error like the one below.
__tmp_font1.ttf is not in a known format (or uses features of that format fontforge does not support, or is so badly corrupted as to be unreadable)
Cannot load font C:\Users\admin\AppData\Local\Temp\pdf2htmlEX-5RLDCX/__tmp_font1.ttf
This is a pretty ambiguous error, and it appears to be common among users of the Windows binary version.
Apparently Lu Wang wasn't able to offer a solution for Windows users, as all related posts are marked 'insufficient info'. Unfortunately, the pdf2htmlEX project is also archived, and no new comments can be added, so I'm adding this information here in the hope that it may help someone else in the future.
In my scenario, the library is called via an ASP.NET wrapper method using System.Diagnostics.Process to convert uploaded files into HTML versions. The pdf2htmlEX library would work without issue from the Command Prompt and, for some reason, would also work perfectly in my development environment, but not in the production environment (both of which are Windows Server 2012 R2).
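For context, the wrapper method looked roughly like the sketch below (the executable path, argument format, and error handling are illustrative placeholders, not my exact production code):

// Simplified sketch of the ASP.NET wrapper that shells out to pdf2htmlEX.
// Only System.Diagnostics from the framework is needed.
public static void ConvertPdfToHtml(string inputPdfPath, string outputFolder, string outputHtmlName)
{
    var psi = new System.Diagnostics.ProcessStartInfo
    {
        FileName = @"C:\tools\pdf2htmlEX\pdf2htmlEX.exe",   // placeholder install path
        Arguments = string.Format("\"{0}\" \"{1}\"", inputPdfPath, outputHtmlName),
        WorkingDirectory = outputFolder,                    // pdf2htmlEX writes its output here
        UseShellExecute = false,
        RedirectStandardError = true,                       // capture the FontForge/temp-font errors
        CreateNoWindow = true
    };

    using (var process = System.Diagnostics.Process.Start(psi))
    {
        string stdErr = process.StandardError.ReadToEnd();
        process.WaitForExit();
        if (process.ExitCode != 0)
        {
            // Under a restricted identity this is where the "Cannot load font ... __tmp_font1.ttf" message surfaced for me.
            throw new System.InvalidOperationException("pdf2htmlEX failed: " + stdErr);
        }
    }
}

In my case, the stderr captured here contained exactly the FontForge message quoted above.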
My first assumption, which turned out to be correct, was that there was a permissions issue. Pdf2htmlEX uses FontForge internally to handle fonts, and one or both use the Windows Temp directory by default to store resource files used in the creation of the HTML and/or other files. I also believe, although I haven't confirmed it, that it may use the active user's %USERPROFILE%\AppData\Local\Temp folder...
When running test commands from the Command Prompt, you are operating under your own user context, and everything your user can do, pdf2htmlEX can do. So everything works as expected.
In a server environment, the process is operating under the ApplicationPoolIdentity, a special IIS user type with limited permissions. This is where it failed for me: while I would see folders and files created in the Windows Temp folder, pdf2htmlEX could not open them to create the end files elsewhere.
Solution: (there may be other solutions for your individual case)
In my case, adding a new system user, adding that user to the Users group, and then setting the IIS worker process to run under that account resolved the issue. The reason, I believe, is that the Users group has read/write access to the Windows Temp directory, and potentially to other areas of the system that pdf2htmlEX needs in order to complete.
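For completeness, switching the worker process identity can be done in IIS Manager (Application Pools > Advanced Settings > Identity) or from an elevated command prompt. This is just a sketch, assuming the local account is named PdfWorker, and has already been created and added to the Users group:

%windir%\system32\inetsrv\appcmd.exe set apppool "YourAppPool" /processModel.identityType:SpecificUser /processModel.userName:"PdfWorker" /processModel.password:"your-password"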
What is the best practice approach to local user generated content when using Microsoft WebDeploy and Team City to deploy fixes to a site?
Using the deployment process described by Troy Hunt:
http://www.troyhunt.com/2010/11/you-deploying-it-wrong-teamcity_26.html
When changes are made to a site, the WebDeploy agent updates the site, including removing old files that are no longer needed, which is great. However, in the case where a site contains user generated data (say users can upload an image which is stored as a file on disk, or a simple CMS where page content files can be updated by the user), what is the best practice to prevent these files being deleted by the deployment agent?
Is there an ignore flag for certain folders?
Should the user files be stored outside the root of the deployed website (Is this a security risk)?
You basically need to use MSDeploy's skip rules. These tell MSDeploy to ignore certain files, folders, or subfolders, etc.
Where you implement these determines what the syntax will look like, but you have the following options:
If you're publishing through VS.NET using a publishing profile, you can include skip rules there (I've taken this approach and seen it work fine). This SO question should point you in the right direction: MSDeploy skip rules when using MSBuild PublishProfile with Visual Studio 2012
If you're using a VS.NET web solution (website / web application), I later found out you can also implement skip rules in the web.config. Although the following article is a bit old, the approach may still be viable: How to write skip and replace rules for MSDeploy (I haven't used or tested this approach)
Last, but not least, you could use an MSDeploy skip rule on the command line itself. Assuming you execute msdeploy directly (as opposed to via MSBuild), you would need to append a skip parameter with the relevant attributes you require (see the sketch below). Further information can be found at: Demystifying MSDeploy skip rules or Web Deploy Operation Settings (look for the skip command reference, about 2/3 down the page). Using publishing profiles with MSBuild ultimately makes this call for you; I've seen it working via the first approach above.
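For instance, assuming the user-generated content lives in an uploads folder under the site root (the folder name, paths, and server name here are placeholders), a command-line skip rule might look roughly like this:

msdeploy.exe -verb:sync -source:contentPath="C:\deploy\MySite" -dest:contentPath="Default Web Site/MySite",computerName=WEB01 -skip:objectName=dirPath,absolutePath="\\uploads$"

The absolutePath attribute is treated as a regular expression against the full path, so you may need to adjust it for nested folders.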
Hope that helps!
I have a very simple application built in MVC4. This application allows users to upload a file, and the application generates an output.
This app works great locally, but when I publish to Azure (by right-click -> Publish), I get a not very descriptive error. I've figured out that the error occurs because the code accesses a server-relative path, which is not possible in Azure. I found a way to solve that in this link, which says that I should use LocalResource rather than Server.MapPath. That makes sense to me, but so far I'm struggling with the suggested line.
LocalResource localResource = RoleEnvironment.GetLocalResource("DownloadedTemplates");
I'm not able to get it working, and I also can't get a proper error. BTW, I'm not sure how to enable the error log in Azure :(
So, after digging deeper into MSDN, I've seen that I should configure Local Storage Resources, but since I created a local MVC4 project, I can't find where to configure this.
I need to be able to store a temporary file in the application (hosted in Azure).
Has anyone faced this problem?
Does anybody know how to enable the Local Storage Resource in a project like that?
TIA!
Milton Rodríguez
Well, after struggling a while, I've ended up using Windows Azure Tools.
The steps:
Add a new project
Under the Cloud category, select Windows Azure Cloud Service. Note that if you don't have this option, an option to install the needed SDK will be shown; install it first.
Name it properly :)
A New Windows Azure Cloud Service window will appear; select the role that fits your needs. In my case, I chose ASP.NET MVC4 and then removed it. Note that you can edit the name of the created role on the right.
In the Roles folder of your new project, select Add, and then Web Role Project in solution. Your project will be an option to add.
You can remove the other role in the folder, the web project created in step 4, and also the folder ending in Content (i.e. WebRole1Content). Basically, you can remove the created assets except the Azure Cloud Service itself, and link the service to your existing project.
You're almost done. Follow this link to configure your local storage :) (a sketch of what this ends up looking like is below)
Now you're done!
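For anyone following along, the end result looked roughly like this in my case (the resource name DownloadedTemplates comes from the question; the size and role name are just examples). The local storage resource is declared in ServiceDefinition.csdef inside the web role:

<WebRole name="MyWebRole">
  <LocalResources>
    <LocalStorage name="DownloadedTemplates" sizeInMB="1024" cleanOnRoleRecycle="false" />
  </LocalResources>
</WebRole>

and the code can then resolve that resource to a physical folder it is allowed to write temporary files to (requires a reference to Microsoft.WindowsAzure.ServiceRuntime; the method and file name below are just an illustration):

// Minimal sketch: resolve the local storage resource declared above and
// write uploaded content into it as a temporary file.
public string SaveTemporaryFile(byte[] uploadedBytes)
{
    LocalResource localResource = RoleEnvironment.GetLocalResource("DownloadedTemplates");
    string tempFilePath = System.IO.Path.Combine(localResource.RootPath, "template.tmp");
    System.IO.File.WriteAllBytes(tempFilePath, uploadedBytes);
    return tempFilePath;
}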
In Visual Studio 2012, using publish profiles along with Web Deploy simplifies deployments quite a bit. However, it is still missing a few things, or maybe I don't know how to use it yet.
I prefer to use NTLM authentication without storing the username and (especially) the password in the publish profiles. How can this be done? If I leave the username and password empty, I am prompted for them. Is there a way, like manually modifying the .pubxml files?
Why is the username/password stored in PublishProfileName.pubxml that I have checked in the source control and not in PublishProfileName.pubxml.user that is local to each user? I could at least save the username but obviously don't want that to be checked in.
The Configuration itself is not part of PublishProfileName.pubxml but is stored in PublishProfileName.pubxml.user as LastUsedBuildConfiguration.
Same for the Platform as last point.
I am also missing support for multi-server deployments. I am currently forced to use batch files in addition to Publish Profiles.
EDIT
The command line that works fine for publishing is
MSBuild.Exe MyProject.sln /p:Configuration=QA /p:DeployOnBuild=true;PublishProfile=PublishToQA;AllowUntrustedCertificate=true /p:authType=NTLM /p:UserName=
In this I would like to omit the /p:Configuration=QA if the configuration becomes part of the publish profile itself.
Some answers to your questions.
I prefer to use the NTLM authentication without storing the username and password (especially) in the publish profiles. How can this be done? If I leave the username and password empty, I am prompted for it. Is there a way like manually modifying the .pubxml files?
Your authentication is typically driven by how Web Deploy is hosted. By default, if you are using the Web Management Service, then you are using IIS users for auth. With IIS users you can control which users have permissions to specific sites/apps. You can configure WMSVC to use Windows auth as well, though. If you have issues using VS for those scenarios, let me know.
If you are using the Remote Agent service to host Web Deploy then in this case you'll be using windows auth.
Why is the username/password stored in PublishProfileName.pubxml that I have checked in the source control and not in PublishProfileName.pubxml.user that is local to each user? I could at least save the username but obviously don't want that to be checked in.
We have another mechanism for you to determine what information is private/shared. With the exception of the password, all publish info is shared (and checked in by default). In order to simplify the design, you can either have a publish profile which is shared, or one which is not shared at all; there is no in-between in which some fields of a profile are shared and others are not. The password is special-cased here and encrypted on a per-user/per-machine basis in the .pubxml.user file.
If you'd like to have a private publish profile then you can simply not check in the .pubxml file which corresponds to the publish profile. These are stored in the Properties\PublishProfiles (or My Project\PublishProfiles for VB) and just exclude them from the project and don't check the files in. The publish dialog looks for the profiles on disk, not just the ones which are in the project. Everything should continue to work.
We don't support the concept of selectively storing values in the .pubxml.user file. The publish dialog will only store a set number of values in that file.
The Configuration itself is not part of PublishProfileName.pubxml but is stored in PublishProfileName.pubxml.user as LastUsedBuildConfiguration.
Same for the Platform as last point.
This was a mistake; it should have been stored in the .pubxml file, not the .pubxml.user file. We have since fixed this, but haven't had a chance to release the update yet.
The Configuration property cannot be set in the publish profile. The Configuration property is a core part of the build process. To be more specific, the reason we didn't call this property Configuration is that the .pubxml file is imported into the definition of the .csproj/.vbproj during a build & publish, and since other properties are defined based on Configuration, you cannot change its value once it's been set. I just blogged with way too much detail on this subject at http://sedodream.com/2012/10/27/MSBuildHowToSetTheConfigurationProperty.aspx. This limitation is an MSBuild thing, not a publish limitation. For the command line, you should specify Configuration in the following way:
msbuild.exe myproj.csproj /p:...(other properties)... /p:Configuration=
I am also missing support for multi-server deployments. I am currently forced to use batch files in addition to Publish Profiles.
We don't have direct support for this, but if you expand on your needs I may be able to help. FYI, I have an extension which you may be interested in; I have posted a 5-minute video at http://sedodream.com/2012/03/14/PackageWebUpdatedAndVideoBelow.aspx.
You are free (and encouraged) to manually edit your pubxml files, so feel free to remove the password.
To switch to NTLM, change AuthType to NTLM in the first PropertyGroup.
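For illustration, the first PropertyGroup of the .pubxml ends up looking something like this (the service URL and site path are placeholders for your own server):

<PropertyGroup>
  <WebPublishMethod>MSDeploy</WebPublishMethod>
  <MSDeployServiceURL>https://yourserver:8172/msdeploy.axd</MSDeployServiceURL>
  <DeployIisAppPath>Default Web Site/MyApp</DeployIisAppPath>
  <AuthType>NTLM</AuthType>
  <UserName></UserName>
  <AllowUntrustedCertificate>True</AllowUntrustedCertificate>
</PropertyGroup>

With AuthType set to NTLM and UserName left empty, Web Deploy should use the current Windows credentials rather than prompting.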
Platform and Configuration remain build configuration; the .user file just stores them so Visual Studio knows what configuration you last deployed.
By multi-server, do you mean a web farm? If so, you might try looking at the Web Farm Framework which basically performs MSDeploy syncs from the primary server to the others.
Alternatively, you could switch to the command line and use postSync to upload and execute a batch file on the remote server that triggers the other deployments from there.
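A rough sketch of that second approach (server names, paths, and the deploy-others.cmd script are all hypothetical here):

msdeploy.exe -verb:sync -source:contentPath="C:\deploy\MySite" -dest:contentPath="Default Web Site/MySite",computerName=WEB01 -postSync:runCommand="C:\deploy\deploy-others.cmd",waitInterval=30000

The batch file on WEB01 would then run its own msdeploy syncs to push the content out to the remaining servers.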
I work on quite a few DotNetNuke sites, and occasionally (I haven't figured out the common factor yet) when I use the Database Publishing Wizard from Microsoft to create scripts for the site I've built on my dev server, after running the scripts at the host (usually GoDaddy.com) and uploading the site files, I get an error. I'm 99.9% sure that it's not file related, so I'm not sure where to begin in the DB. Unfortunately, with DotNetNuke you don't get the YSOD, but a generic error, with no real way to find the actual exception that has occurred.
I'm just curious whether anyone has had similar deployment issues using the Database Publishing Wizard, and if so, how they overcame them. I own the RedGate toolset, but some hosts like GoDaddy don't allow you to connect directly to their servers...
The Database Publishing Wizard's generated scripts usually need to be tweaked, since it sometimes gets the order of table/procedure creation wrong when dealing with constraints. What I do is first back up the database, then run the script; if I get an error, I move the offending query to the end of the script. Continue restoring the database and running the script until it works.
There are two areas that I would look at:
Are you running in the dbo schema, and was your scripted database using dbo?
Are you using an objectqualifier in either your dev or your production environment? (Look at your SqlDataProvider configuration settings.)
You should be able to expose the underlying error message by setting the following in the web.config:
<customErrors mode="Off" />
Could you elaborate on "and uploading the site files"? A new instance of DNN? Updating an existing site? Upgrading the DNN version? If upgrading or updating, what files are you adding/overwriting?
Also, when using GoDaddy, can you check to verify that the web site's identity (network service or asp.net machine account depending on your IIS version) has sufficient permissions to the website's file system? It should have modify permissions and these may need to be reapplied if you are overwriting files.
IIS6 (XP, Server 2000, 2003) = ASP.Net Machine Account
IIS7 (Vista, Server 2008) = Network Service
Test your generated scripts on a new local database (using the free SQL Express product or the full meal deal). If it runs fine locally, then you can be confident that it will run elsewhere, all things being equal.
If it bombs when you run it locally, use the process of elimination and work your way through the script execution to find the offending code.
My hunch is that the order of scripts could be off. I think I've had that happen before with the database publishing wizard.
Just read your follow up. In every case that I've had your problem, it was always something to do with the connection string in web.config. Even after hours of staring at it, it was always a connection string issue in web.config. Get up, take a walk and then come back.
If you are getting one of DNN's error pages, there is a chance it may have logged the error to the eventlog table.
Depending on exactly what is happening and what DNN is showing you, you might be able to manually look inside the EventLog table, pull out the XML data stored there, and parse it to find the stack trace and detailed information regarding the specific error at hand.
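If you go that route, a quick query like the one below is usually enough to pull the most recent entries (this assumes a default install with no objectqualifier or table prefix, and the standard EventLog column names; adjust to your schema):

SELECT TOP 20 LogCreateDate, LogTypeKey, LogProperties
FROM EventLog
ORDER BY LogCreateDate DESC

The LogProperties column holds the XML blob containing the stack trace and error details.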
I have found, however, that I get a MUCH better overall experience with deployments by using backups and restores of my database; that way I am 100% sure that all objects moved correctly, and honestly it works better in my experience.
With GoDaddy, I know another MAJOR common issue is incorrect file permissions, preventing DNN from modifying the web.config and other files that it needs to modify.