When I upload images in my local development environment and then copy the contents of the Uploads and Content folders up to the live environment, the images do not show, even after recycling the application pool. Do you know what step I am missing? I am using Piranha in passive mode.
Well, without any further information it's hard to come up with something that will qualify as an answer :) However, if you have copied both the database and the binary files located in ~/App_Data/Content/, my guess would be that Windows is playing a prank on you and that the IIS app pool of the live environment either:
Doesn't have permission to access the App_Data/Content folder, or
Can't access the copied files, because the permissions from your dev server were carried over to the live server along with them.
Regards
Håkan
Related
We have a requirement to let users upload files of up to 100 GB. The current flow moves the file from the client's local system to the application server, which then pushes it to a service account on Google Drive. I would like to know if there is a way to push the file from the local system directly to the service account in Google Drive; this would spare us from storing such big files on the application server. I would also like to know whether we can have Drive installed on our local system and pointed at a service account, so that these big files can be dropped into the Drive folder and synced to the server in the background.
I would like to know if there is a way to push the file from the local system directly to the service account in Google Drive
The only way I know of is to upload them yourself. The Upload Files page in the Drive API documentation details this feature. In your case, you'll have to use uploadType=resumable because of the file sizes you'll upload.
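To make that concrete, here is a minimal sketch of a resumable upload using the google-api-python-client library against Drive API v3; the service-account key path, file name, and chunk size are placeholders for your own values:

    # Minimal sketch: resumable upload to a service account's Drive.
    # "service-account.json" and "big-file.bin" are placeholder names.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaFileUpload

    creds = service_account.Credentials.from_service_account_file(
        "service-account.json",
        scopes=["https://www.googleapis.com/auth/drive.file"],
    )
    drive = build("drive", "v3", credentials=creds)

    # resumable=True makes the client use uploadType=resumable under the
    # hood, sending the file in chunks that survive dropped connections.
    media = MediaFileUpload(
        "big-file.bin",
        mimetype="application/octet-stream",
        resumable=True,
        chunksize=256 * 1024 * 1024,  # 256 MB per request
    )
    request = drive.files().create(body={"name": "big-file.bin"}, media_body=media)

    response = None
    while response is None:
        status, response = request.next_chunk()
        if status:
            print(f"Uploaded {int(status.progress() * 100)}%")
    print("Done, file id:", response.get("id"))

Because each chunk is sent and retried independently, a 100 GB transfer can recover from network hiccups instead of restarting from zero.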
I would also like to know whether we can have Drive installed on our local system, pointed at a service account
Syncing à la Dropbox might be a bit tricky; I haven't read anything in the Drive documentation about such a feature. Syncing to the desktop is usually just a .glink shortcut that will open up a browser.
At present, I start up Red5 from the Linux command line with ./red5.sh, which runs the startup script. Then I go to the demos page at http://localhost:5080 to set up my camera and audio input, and everything works fine when I test the stream both on the demo page and in the SWF on my web page.
My question is: do I need to include some Java and/or ActionScript for the SWF player to bypass the Red5 demo page, so I can connect my input and stream directly in the code of the page? And so that only logged-in viewers of the web page can connect?
Overall, I am wondering whether there is a way to hide the server stream from anyone who is not logged in to view it on my site. I understand there is a hosts/IP list somewhere in the webapps folder, but it would be impossible to know which IPs belong to legitimate viewers as opposed to unwanted viewers or bandwidth stealers.
I am trying to set up a site for poetry readings, where readers can record live to my server and logged-in viewers can then watch from my website. I am trying to figure out whether I must leave that Red5 page open, and whether that poses some kind of risk.
I found my own way of doing this, just by removing and renaming files and folders.
If you go to /usr/local/red5/webapps, you will find all the directories that are served when you browse to the default port 5080. I simply installed the applications I needed and then took everything out of there except the applications I wanted to run. Everything I removed went into a folder in the /var directory, which I named red5_movedstuff in case I want access to those applications later on. Then I renamed the applications I am using in the webapps folder, keeping the admin folder so I can still access them; importantly, for each application whose name I changed I also had to update the name in its WEB-INF.
Now if someone goes to myip:5080 they get a blank page, and by changing the names of the applications I've hidden my directories beyond that, including the list of streams. A scripted version of this cleanup is sketched below.
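For anyone who wants to script the same cleanup rather than moving folders by hand, here is a rough sketch; "myapp" is a placeholder for whatever applications you actually keep, and the paths assume a default /usr/local/red5 install:

    # Rough sketch: move every Red5 webapp except the ones in KEEP out of
    # webapps/ so the pages on port 5080 no longer expose them.
    # "myapp" is a placeholder for your own application's folder name.
    import os
    import shutil

    WEBAPPS = "/usr/local/red5/webapps"
    BACKUP = "/var/red5_movedstuff"
    KEEP = {"admin", "myapp"}

    os.makedirs(BACKUP, exist_ok=True)
    for entry in os.listdir(WEBAPPS):
        if entry not in KEEP:
            shutil.move(os.path.join(WEBAPPS, entry), os.path.join(BACKUP, entry))

Remember that if you also rename an application folder, its WEB-INF configuration has to be updated to match, as described above.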
I developed a Windows Forms VB.NET database app that uses an Access .accdb backend. I am struggling to find the best deployment strategy. In the past, I have distributed the .exe and the .accdb straight from the /bin/debug folder. This works, but I'm not sure it's the best method.
The app will be used by 5-10 people, never simultaneously.
My current plan is to put the .exe and the .accdb on a network share, and users will launch the app from that share.
Users do not have admin privileges, and the computers have strict security settings.
I have noticed that when launching the .exe from the network share, you get the unknown-publisher warning; this message does not appear when launching from a local drive. Due to the users' security restrictions, I know that simply hitting 'continue/run' on the publisher warning is NOT an option: there is no 'continue/run' button.
So, I assume I have to buy a code-signing certificate and strong-name sign the assembly?
I also read here http://msdn.microsoft.com/en-us/library/142dbbz4%28v=vs.80%29.aspx that ClickOnce deployment does not require admin privileges, can be launched from a network share, and runs from the local cache.
In that case, do I buy an Authenticode certificate and sign the ClickOnce manifest?
Any advice?
edit
I left out a key function of the app that will affect deployment.
Users can select files and upload them; the basePath/filename is stored in the db. Uploading and retrieving files via OpenFileDialog and DataGridView.CellContentClick is all relative to where the .exe is launched from (Application.StartupPath). I didn't want to hard-code full paths into the db, because I'm sure both the app and the files will be moved to a new location over time.
ClickOnce deployment is the way to go here.
You do not have to buy anything; you can self-sign your assembly with a strong name.
That should be fine for an internal application.
I have spent numerous hours searching for a solution to this and am not sure where else to turn. Our main app is deployed on Amazon EC2 under GlassFish 3.1. One of our pages handles file uploads. It works fine on my local development box and on several other local deployments and developer boxes. I've tried a number of different logging options, and on the local boxes the changed logging options always take effect correctly.
The problem is that on Amazon EC2, no matter what I've tried, the uploaded file never shows up in the various temporary paths I've configured (even after changing all permissions on the path and its parent paths to 777... bad, I know, but I'm trying anything to get this to work); the file always shows up empty, and the logging indicates no errors. I've modified the allowed file types and the maximum supported file size, to no avail.
At this point, I am thinking it has something to do with EC2, since the upload works in several other deployments on different networks (the team in India, local boxes, etc.). I can't find anything in the EC2 console that sheds any light on changes needed to permit a file upload. Heck, I can scp the .war file up to the instance and deploy it just fine, so I can't imagine anything would block file uploads to the web app itself from a browser.
Any help at all is much appreciated, as this is unfortunately the only thing holding up our initial alpha release.
Thank you.
I am a newbie to VB.NET. I have tested a few applications downloaded from the net. They are surely database applications, but when I checked the folder where each application is installed, there was no database file (no .mdb, .sdf, etc.).
When I install my own application, my databases are always in the folder.
What is the trick to make them invisible while they still work?
Thanks
If the applications are using database files such as Access or SQLite, then they are probably storing them in a user-specific directory, such as C:\Users\MyUser\My Documents\MyApplication\MyDatabase.mdb, instead of in the application directory.
This also allows the applications to keep a separate database for each user who logs onto the machine.
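As a rough, language-neutral illustration of the idea (the folder and file names below are made up), such an application resolves a per-user path at startup instead of using its install directory:

    # Sketch: build a per-user database path instead of using the app folder.
    # "MyApplication" and "MyDatabase.mdb" are placeholder names.
    import os

    documents = os.path.join(os.path.expanduser("~"), "Documents")
    app_dir = os.path.join(documents, "MyApplication")
    os.makedirs(app_dir, exist_ok=True)  # create the folder on first run
    db_path = os.path.join(app_dir, "MyDatabase.mdb")
    print(db_path)  # e.g. C:\Users\MyUser\Documents\MyApplication\MyDatabase.mdb

The database never sits next to the .exe, so the install folder looks "empty" even though the app is still file-based.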