I am having a strange issue with the FileZilla client going "Not Responding" (picture attached) exclusively when accessing the App_Data folder of an ASP.NET website, which contains an .mdf file (a SQL Server database file). The folder I'm trying to access is on my local machine. I do not have any issues viewing any other file types.
I am running the newest version of the FileZilla client (3.34.0) and have tried reinstalling. I've also tried deleting the files in AppData\Roaming\FileZilla to see if that was the issue. I've checked my crash dumps in AppData\Local\CrashDumps, and there are none. I've also made sure no SQL Server services are running, in case the .mdf file was somehow in use.
I've searched around the web and haven't found anyone with the same issue; as a temporary workaround I have to use Windows File Explorer to manage FTP for this site.
Has anyone had a similar issue, or does anyone have ideas on what I can try next? Any help is appreciated.
Related
I've got a custom HTTP module in IIS 8 that uses a set of config files to control how it works. These are all named *.config and are mainly in a filesystem folder outside the scope of the site, with one in the bin folder.
In the module I use System.IO.File.WriteAllText to update the files. When I do this, the file I'm writing to is copied to a .bak file before the new content is written. This only seems to happen when the file extension is .config (I've tested with other extensions and it doesn't happen).
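For reference, the write path is essentially the following. A minimal sketch with a hypothetical file location; it also checks whether a .bak sibling appears after the call (the exact .bak naming is an assumption, so both likely forms are checked):

```csharp
using System;
using System.IO;

class ConfigWriteCheck
{
    static void Main()
    {
        // Hypothetical path for illustration; the real files sit outside the site root.
        string path = @"C:\ModuleConfig\rules.config";
        Directory.CreateDirectory(Path.GetDirectoryName(path));

        // The call the module uses to update a config file.
        File.WriteAllText(path, "<rules />");

        // See whether the old file was snapshotted next to the new one.
        foreach (string bak in new[] { path + ".bak", Path.ChangeExtension(path, ".bak") })
        {
            Console.WriteLine("{0}: {1}", bak, File.Exists(bak) ? "exists" : "absent");
        }
    }
}
```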
It also doesn't happen everywhere: it happens with all the deployments I have on an Azure Windows Server 2016 VM, but rarely happens on other deployments (we have clients on a variety of Windows Server and IIS versions).
I suspect it's an IIS setting that automatically creates a .bak when a .config file is changed, but I can't find any info on this.
Does anyone have any idea why this is happening and how I can turn it off?
I have cPanel access on two different servers and would like to transfer an account from one server to the other. The original size of the account on the first server is close to 15 GB.
Currently, the only two ways I can think of are:
1. Back up using cPanel, then restore on the second server. But this process times out; I get a "Failed - Network error" message.
2. Use an FTP app like FileZilla to log in and transfer the files that way. I haven't tested this, but I think it first downloads the files to my local machine (a temp folder) and then uploads them to the second server.
My problem with option 2 is that it means I would end up using 30 GB of data transfer (15 GB down, then 15 GB back up) if it actually works that way.
What is the best way to transfer from server to server using cPanel?
I have limited knowledge of this, but I can suggest a few things.
1. Back up using cPanel, then restore on the second server.
The backup creation is failing because of the large size of the account, so this approach is not possible.
2. Use an FTP app like FileZilla to log in and transfer the files.
You can download the whole account's content to your PC, or take it folder by folder (home, mail, and so on), and upload it to the new cPanel account over FTP.
Alternatively, please open a support ticket with your hosting provider; they can create the backup from the server backend using /scripts/pkgacct.
Thank you.
I have since connected to my SSH site using PuTTY and FileZilla. PuTTY and FileZilla give me direct access to the appropriate directories, and FileZilla shows the full path from the top-level directory. I have then tried the same path from the starting point in Aptana, but get blocked one level above my target directory. If I collapse all the directories in PuTTY and FileZilla and try to expand them again, PuTTY works fine, but FileZilla then blocks at the same point as Aptana. So there is some subtle difference in approach between FileZilla and Aptana on the one hand and PuTTY on the other.
I was looking for a way to get an encrypted link when editing files on my hosted web server from Aptana Studio. I can get an ordinary remote FTP link set up and use that to edit files. The SSH facility looked as though it could provide a secure link, so I got the SSH details from my ISP and set them up in Studio. In the setup I was asked to select a directory, but the ones available were all system directories rather than my htdocs tree. However, by leaving out the default /, I could get to the htdocs tree and see all my files under Remote. But when I try to load a file into Studio, I get a "file does not exist" message.
I'm new to SSH (and Studio) and don't know whether what I am trying to do is impossible or whether there is something else I need to set up. So far I have just been using Studio as a remote editor for PHP and HTML. I suspect there is a much better and more professional way to set up what I am doing in Studio (even ignoring the secure transfer), since I am currently doing a separate backup of the files in FileZilla, but I just haven't figured out how to set everything up yet.
Many thanks for any help.
Actually I'd like to post this as a comment on your question, but I'm pretty new here, so the system doesn't allow me to.
Anyway, you can add connections to remote servers over SSH like this:
1. If you don't have the Remote tab, go to Window -> Show View -> Remote.
2. Right-click in the Remote tab and click "Add New FTP Site...".
3. Change the protocol to SFTP and put in your credentials (you can use username/password or username/private key authentication).
When you close Aptana with files still open, it usually shows you those files again when you reopen Aptana later. It seems not to do this when you work with files on an SFTP remote host, and shows you an error instead. I guess this is because it tries to validate those files (to check they still exist) but doesn't authenticate with the remote host first. So nothing to worry about there.
Hope this helps.
I have a Windows Forms application that requires users to log in to access the information. I have created a local SQL Server Compact database file for the credentials to be stored in. I added the database file to the application folder, but when I open my application and try to log in, it tells me that it cannot find the database file.
Should the file be stored in a different folder, or do I need to install an instance of SQL Server on the user's computer?
This is my first deployment, so I am not sure how to go about it. I have done some research on the subject, but none of it seems related to my issue. The help section of InstallShield was not clear either.
I am looking for some resources on how to accomplish this.
I figured out the issue: for everything to work, all files, including the database files, need to be placed under the user profile folder.
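In case it helps anyone else: rather than hard-coding that path, one option is to point |DataDirectory| at a per-user folder at startup, so the connection string resolves wherever the app is installed. A minimal sketch, assuming a SQL Server Compact (.sdf) file and hypothetical folder/file names:

```csharp
using System;
using System.IO;

static class DatabaseLocation
{
    // Call once at startup (e.g. at the top of Main), before any connection opens.
    public static void Configure()
    {
        // Per-user application data folder, e.g. C:\Users\<name>\AppData\Local\MyApp.
        // "MyApp" and "credentials.sdf" below are hypothetical names.
        string dataDir = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
            "MyApp");
        Directory.CreateDirectory(dataDir);

        // The connection string can then use the substitution token:
        //   Data Source=|DataDirectory|\credentials.sdf
        AppDomain.CurrentDomain.SetData("DataDirectory", dataDir);
    }
}
```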
I've successfully created a site using Umbraco, and now it's time to upload it to a hosting server.
I've searched and found one paid product for this, but I don't want to use it.
Has anybody tried developing an Umbraco site locally and then uploading it to a server?
If yes, then please help me do that.
First I run the Umbraco install from a local IIS website. Then I set up my Visual Studio solution for that website (and my source control). Then I work until I reach a beta version, at which point I go through this process for deploying:
1. FTP over to the remote website and copy the whole website across (I actually use Beyond Compare).
2. Connect to my local database with Management Studio and create a .bak file (a sketch of this step follows the list).
3. Upload the .bak file to the database server.
4. Restore that database.
5. Review the connection strings in web.config.
Then I'm pretty much done.
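As a sketch of step 2 (with step 4 shown as a comment), here is the equivalent done from code rather than through Management Studio; the instance, database name, and paths are all hypothetical:

```csharp
using System.Data.SqlClient;

class DeployDatabase
{
    static void Main()
    {
        // Hypothetical local instance; connect to master to run the backup.
        const string local =
            @"Data Source=.\SQLEXPRESS;Initial Catalog=master;Integrated Security=True";

        using (var conn = new SqlConnection(local))
        using (var cmd = conn.CreateCommand())
        {
            conn.Open();

            // Step 2: create the .bak file (same effect as Tasks > Back Up...
            // in Management Studio). "UmbracoSite" is a placeholder name.
            cmd.CommandText =
                @"BACKUP DATABASE [UmbracoSite] TO DISK = N'C:\Backups\UmbracoSite.bak'";
            cmd.ExecuteNonQuery();

            // Step 4, run against the remote server after uploading the file:
            //   RESTORE DATABASE [UmbracoSite]
            //   FROM DISK = N'C:\Backups\UmbracoSite.bak' WITH REPLACE;
        }
    }
}
```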
Once I'm "live" and have content I don't want to lose, when I want to work on the website, I bring back the live database through a .bak file, then I make my changes. They often include DB changes since the schema is basically in the database. I note all the operations I do. Once my changes are ready I manually replicate the changes on the live site as I update the files.
This is very painful, but I have tried solutions like Courier and other things like that, and I find they are not reliable enough for production. Manual is the only risk-free way I see for the moment.
Hope this helps.
Yes, that happens all the time. Use FTP to copy your local installation to your web server, modify the web.config to point to the correct database (a sketch of the relevant section follows), and your website should be up and running.
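For reference, the part of web.config to adjust is the Umbraco connection string. A minimal sketch with placeholder server, database, and credential values; recent Umbraco versions keep this in <connectionStrings>, while older ones use an appSettings key with the same umbracoDbDSN name:

```xml
<connectionStrings>
  <!-- "umbracoDbDSN" is Umbraco's conventional connection-string name;
       all values below are placeholders. -->
  <add name="umbracoDbDSN"
       connectionString="Server=your-db-server;Database=YourUmbracoDb;User Id=your-user;Password=your-password"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```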
I'm sure there are more elegant solutions with fewer clicks, but here's how I do it on Azure Websites with SQL (I'm not sure what hosting/DB you're using):
1) Create an empty db on azure with the same login and user as my local db.
2) Create an empty site on azure connected to my db.
3) Download the publishing profile.
4) Upload the db the first time with the SQL Azure Migration Wizard.
5) Import the publishing profile into WebMatrix and upload the site with it.
6) Thereafter I deploy the site and db with WebMatrix.
WebMatrix uses WebDeploy or FTP; you can also use WebDeploy directly through IIS if you like, or plain FTP.