I have the files of a website that a friend created with AdonisJs, and I want to upload it to my web host.
Basically, when I connect with FileZilla on my laptop, it shows the root folder inside the server. So my question is:
In which direction should I upload the files?
If the files are currently on server S, the target server for your upload is T, and your local workstation is W, then you can do it in two steps (sketched below):
Download it from S to W.
Upload it from W to T.
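If you prefer the command line over FileZilla for the two hops, a rough sketch with placeholder hostnames, users and paths:

# Step 1: download the site from the source server S to your workstation W.
scp -r user@source-server.example.com:/var/www/mysite ./mysite
# Step 2: upload it from W to the target server T.
scp -r ./mysite user@target-server.example.com:/var/www/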
I want to use DDEV as a local development environment. The setup was successful and the website (a WordPress site) is running.
Currently our team is using XAMPP, and to avoid downloading large files onto every local machine we create symbolic links (e.g. for the "uploads" folder in WordPress) that point to a network drive, so everyone on the team has access to the same files.
Now I want to do the same with DDEV. In WSL I mounted the network drive and created a symbolic link. Inside the console I have full access to the mounted directory: I can create, edit and remove files.
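For context, I created the mount and the symlink roughly like this (the drive letter and paths are placeholders for my actual setup):

# Mount the Windows network drive (mapped as Z: here) into WSL.
sudo mkdir -p /mnt/z
sudo mount -t drvfs Z: /mnt/z
# Symlink the shared uploads folder into the WordPress project.
ln -s /mnt/z/uploads ~/projects/mysite/wp-content/uploads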
But when I access a file with the browser I get the following error message:
403 Forbidden. You don't have permission to access this resource.
The same error occurs when I try to upload a new file within WordPress.
Is there any way to give the webserver the permission to view and modify the files on a network drive?
The web server is Apache 2.4.38.
As @rfay mentioned, I had to add the network drive as a volume so that it is accessible to the web container. To do that I created a new docker-compose file within the .ddev directory (see also the docs: https://ddev.readthedocs.io/en/stable/users/extend/custom-compose-files/#docker-compose42yaml-examples).
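In case it helps anyone, a rough sketch of such a compose file, created from the WSL shell; the file name follows DDEV's docker-compose.<name>.yaml convention, and the host path and in-container path are placeholders for my setup:

# Create .ddev/docker-compose.networkdrive.yaml with a bind mount into the web container.
cat > .ddev/docker-compose.networkdrive.yaml <<'EOF'
version: '3.6'
services:
  web:
    volumes:
      - /mnt/z/uploads:/var/www/html/wp-content/uploads
EOF
# Restart the project so the extra compose file is picked up.
ddev restart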
Additionally the permissions on the network drive were incorrect.
I am creating an SAPUI5 web app with a file upload function. I am trying this example from SAPUI5 Explored: sap.m.sample.UploadCollection
With my trial account in SAP Web IDE I am trying to set up the upload function (UploadCollection).
The issue is that it does not allow me to upload a file to the project folder or to a local desktop folder.
If I upload a file it appears, but I can't open it and I get an HTTP 405 error.
Any ideas what the problem is?
As you can already see in the comments on your post, you need a backend for this task. The UploadCollection control is only usable with a backend in the background that receives the file transmitted by the control.
On the page https://sapui5.hana.ondemand.com/#/api/sap.m.UploadCollection you see:
This control allows you to upload single or multiple files from your devices (desktop, tablet or phone) and attach them to the application
where you can read "application" as "receiving backend".
Independent of this, may I ask where you think the file should be uploaded if not to a backend system? When you choose a file from your local storage, it doesn't make sense to upload it back to your local storage.
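Since you mention an HTTP 405: a quick way to check whether the uploadUrl you configured accepts uploads at all is to POST a test file to it directly, for example (the endpoint URL here is only a placeholder):

# Send a multipart/form-data POST to the assumed upload endpoint.
# A 405 response means the resource exists but does not allow the POST method.
curl -i -X POST -F "file=@test.txt" https://your-backend.example.com/upload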
I have cPanel access on two different servers and would like to transfer from one server to another. The original size of the account on the first server is close to 15GB.
Currently, the only two ways I can think of are:
1. Backup using cPanel, then restore on the second server. But this process times out; I get a "Failed - Network error" message.
2. Use an FTP app like FileZilla to log in and transfer the files that way. I haven't tested this, but I think it first downloads the files to my local machine (temp folder) and then uploads them to the second server.
My problem with option 2 is that this means I will end up using 30GB of data transfer if it actually does that.
What is the best way to transfer from server to server using cPanel?
I have limited knowledge of this, but I will suggest some tips.
1. Backup using cPanel then restore on the second server. But this process times out. I get "Failed - Network error" error.
The backup creation fails due to the large size of the account, so that approach is not possible.
2. Use an FTP app like FileZilla to log in and transfer files from that. I haven't tested this but I think it first downloads the files on my local machine (temp folder) then uploads them to the second server.
You can download the whole content to your PC, or download it folder by folder (first home, then mail, etc.) and upload it to the new cPanel account using FTP.
Alternatively, could you please open a support ticket with your hosting provider? They will help you create the backup from the server backend using /scripts/pkgacct.
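For reference, when the provider (or anyone with root SSH access) does it from the server backend, it usually goes along these lines; the account name and hostname are placeholders:

# On the source server: package the whole account (files, mail, databases).
/scripts/pkgacct exampleuser
# Copy the resulting archive to the destination server.
scp /home/cpmove-exampleuser.tar.gz root@new-server.example.com:/home/
# On the destination server: restore the account from the archive.
/scripts/restorepkg exampleuser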
Thank you.
Is it possible to upload a file to S3 from a remote server?
The remote server is basically a URL-based file server. For example, given http://example.com/1.jpg, it serves the image. It doesn't do anything else, and I can't run code on this server.
Is it possible to have another server tell S3 to upload a file from http://example.com/1.jpg?
upload from http://example.com/1.jpg
server -------------------------------------------> S3 <-----> example.com
If you can't run code on the server or execute requests then, no, you can't do this. You will have to download the file to a server or computer that you own and upload from there.
You can see the operations you can perform on Amazon S3 at http://docs.amazonwebservices.com/AmazonS3/latest/API/APIRest.html
Checking the operations for both the REST and SOAP APIs you'll see there's no way to give Amazon S3 a remote URL and have it grab the object for you. All of the PUT requests require the object's data to be provided as part of the request. Meaning the server or computer that is initiating the web request needs to have the data.
I had a similar problem in the past where I wanted to download my users' Facebook thumbnails and upload them to S3 for use on my site. The way I did it was to download the image from Facebook into memory on my server, then upload it to Amazon S3; the whole thing took under 2 seconds. After the upload to S3 completed, I wrote the bucket/key to a database.
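If you do have a machine you control in the middle, relaying a single file without writing it to disk can be as simple as streaming it; this assumes the AWS CLI is installed and configured, and the bucket name and URL are placeholders:

# Stream the remote image straight from its URL into the S3 bucket.
curl -s https://example.com/1.jpg | aws s3 cp - s3://my-bucket/images/1.jpg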
Unfortunately there's no other way to do it.
I think the suggestion provided is quite good: you can scp the file to an AWS instance and push it into the S3 bucket from there. Using the .pem key file gives you passwordless authentication, and a PHP script can validate the file extensions and pass the file as an argument to the scp command.
The only problem with this solution is that you must have an instance in AWS. You can't use this solution if your website is hosted with another hosting provider and you are trying to upload files straight to the S3 bucket.
Technically it's possible using AWS Signature Version 4. Treating your remote server as the client, you could prepare a pre-signed POST form on the main server and send the form fields to the remote server for it to submit with curl.
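As a purely hypothetical sketch, once the main server has generated the policy and signature, the call the remote server would have to make could look like this (every field value is a placeholder produced by the signing step, and the file field must come last):

# Submit the pre-signed POST form to S3.
curl -i -X POST https://my-bucket.s3.amazonaws.com/ \
  -F "key=uploads/1.jpg" \
  -F "policy=${POLICY_BASE64}" \
  -F "x-amz-algorithm=AWS4-HMAC-SHA256" \
  -F "x-amz-credential=${CREDENTIAL}" \
  -F "x-amz-date=${AMZ_DATE}" \
  -F "x-amz-signature=${SIGNATURE}" \
  -F "file=@1.jpg"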
You can use the scp command from the terminal.
1) Using the terminal, go to the directory that contains the file you want to transfer to the server.
2) Type this:
scp -i yourAmazonKeypairPath.pem fileNameThatYouWantToTransfer.php ec2-user@ec2-00-000-000-15.us-west-2.compute.amazonaws.com:
N.B. Add "ec2-user@" before the ec2-... hostname you got from the EC2 console! Forgetting it is an easy mistake to make!
3) Your file will be uploaded and the progress will be shown. When it reaches 100%, you are done!
I have a WordPress website which was installed with Microsoft WebMatrix on my computer A.
I have installed Microsoft WebMatrix on my computer B so that I can continue my work on computer B.
I want to copy my full WordPress website (website content + database) from computer A to computer B.
I cannot find a tutorial that explains how to copy the database from computer A to B.
I am totally new to web development, so please kindly help.
Thanks
Your best long-term solution is to learn how to move WordPress sites from one local development environment to another, as well as from local to a live host.
1. Within your present setup, copy all files from the site's wp-content folder so you can transfer them to your new setup.
2. Add a plugin such as WP Database Backup to your original site. This will allow you to dump the database to an .sql file and keep a copy as a backup.
3. Over on the new setup (computer B), create a new WordPress site.
4. Copy and paste the wp-content folder into the new site.
5. Now this is where you need to check WebMatrix on your new setup: make sure the wp-config.php file has the correct settings, such as the database name, user and password.
6. Finally, import the backup .sql file via the database plugin mentioned in step 2.
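If you would rather handle the database step by hand instead of through a plugin, a minimal command-line sketch would be the following; the database name, user and file names are placeholders and need to match your WebMatrix/MySQL setup.

# On computer A: dump the WordPress database (hypothetical name "wordpress_db") to a file.
mysqldump -u root -p wordpress_db > wordpress_db.sql
# Copy wordpress_db.sql to computer B (USB stick, network share, etc.),
# then import it into the database the new site uses.
mysql -u root -p wordpress_db < wordpress_db.sql

If the local URL of the site changes between the two machines, also update the siteurl and home values in the wp_options table afterwards.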