How to tell the gsutil file upload command that the "dst_url" is actually a file, not a directory? - gsutil

In a GCS bucket, I have uploaded from the web console a file named fileName, and created a directory with that same name. Hence, the bucket contains at its root both a file named fileName and a directory named fileName.
Now, when I attempt to update the fileName file with the gsutil command
gsutil cp localFile gs://bucketname/fileName
where localFile is a file on my local machine and gs://bucketname/fileName is thus the dest_url parameter, instead of overwriting the GCS fileName file, a new file named fileName is created inside the GCS directory named fileName, which yields the GCS file gs://bucketname/fileName/fileName.
I could not find an option of the gsutil command indicating that the provided dest_url should be interpreted as a file and not as a directory.
Did I miss something? Any workaround, please?

A workaround to accomplish this is to use:
gsutil rsync -d . gs://bucketname
This command keeps the contents of the fileName directory but changes the contents of the fileName file. In fact, this command synchronizes the contents of your local path (in this case the "current" path, specified by the dot: .) with the contents of your bucket. Therefore, if you just change the contents of the fileName file on your local system, the synchronization results in an "overwriting" of your fileName file in your GCS bucket without affecting your fileName directory.
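For reference, a minimal sketch of that workflow, assuming a scratch directory such as /tmp/stage (a hypothetical path) that contains only the updated file. Without -d nothing else in the bucket is deleted, and without -r the fileName directory and its contents are left untouched:
# stage only the updated file under the name it has in the bucket
mkdir -p /tmp/stage && cp localFile /tmp/stage/fileName
# sync the staging directory against the bucket root; only the top-level
# fileName object differs, so only it is rewritten
cd /tmp/stage && gsutil rsync . gs://bucketname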

Related

VBA - How to convert the local path of a file in a folder managed by Drive File Stream into a URL

My goal is to create an xls sheet listing the files (file name + link to the file) contained in a folder.
This folder is a GDrive-synchronized folder (using Drive File Stream).
I have a VBA macro which produces this list of files, but the link is a local link (G:...) and not a URL link (https://drive.google.com/...) that can be used outside my computer.
Here is an example.
Let's say I have a file in my "Drive File Stream" folder like this one:
G:\Team Drives\Test\my_file.txt
How can I convert this path (using Excel VBA) into a GDrive URL:
https://drive.google.com/file/d/FILE_ID/edit?usp=sharing
Another way to ask: how do I get the file_id from a local path?

How to copy artifacts folder to ftp folder in TFS?

I'm trying to publish artifacts and the files in their subfolders as well. I've read all the docs provided by Microsoft from here and used them, but none of them worked for me.
I've tried File patterns such as
** => which copied all root files to FTP
**\* => which copied all subfolder files to the FTP root directory.
What I want is a folder-to-folder copy to FTP as well.
-artifacts              ftp
--a.dll                 --a.dll
--subfolder             --subfolder
---subfolder_1.dll      ---subfolder_1.dll
What's happening instead is:
ftp
--a.dll
--subfolder_1.dll
It's copying all the subdirectories' files to the root directory of the FTP server.
I've used both curl and FTP upload, and both give me the same result.
How can I achieve a folder-to-folder copy in TFS 2017?
It's not related to File patterns; to upload the entire folder content recursively, simply specify **.
All you have to do is check Preserve file paths in the Advanced options:
If selected, the relative local directory structure is recreated under the remote directory where files are uploaded. Otherwise, files are uploaded directly to the remote directory without creating additional subdirectories.
For example, suppose your source folder is /home/user/source/ and contains the file foo/bar/foobar.txt, and your remote directory is /uploads/. If selected, the file is uploaded to /uploads/foo/bar/foobar.txt. Otherwise, to /uploads/foobar.txt.
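As a side note, since curl was also tried: for a manual upload outside the TFS task, curl's --ftp-create-dirs flag recreates missing remote directories, which preserves the local layout. A rough sketch with a hypothetical server and credentials:
# upload one artifact into a matching subfolder on the FTP server;
# --ftp-create-dirs creates /uploads/subfolder if it does not exist yet
curl --ftp-create-dirs -T artifacts/subfolder/subfolder_1.dll "ftp://user:password@ftp.example.com/uploads/subfolder/subfolder_1.dll"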

How to copy files in a local folder to SFTP using the Pentaho copy files step

I have to copy the files in a local folder to an SFTP location using Pentaho.
I have tried the "Copy Files" step in Spoon jobs by providing the local path as the file/folder source and the SFTP details as the file/folder destination.
While executing, I am getting an error like "File system exception : could not find file in the files /path".
Please let me know how to resolve it.
I did copy files from SFTP to a local folder using the same "Copy Files" step, by providing the SFTP details as the file/folder source and the local path as the file/folder destination, and it works.
I have also defined VFS parameters in the job.
Thanks
You should try the "Get file with SFTP" and "Put file with SFTP" entries instead.
Both have support for authentication via username/password or private key.
"copy files" will not be the best option here.
Try using "Get a file with SFTP"

Can't download file with scp - no error but the file doesn't download

I'm trying to download a file using scp
I run
scp user@ip:/home/user/file.gzip $HOME/Desktop
I get
file.gzip 100% 156MB 155.8MB/s 00:01
but the file is not found
I tried on Ubuntu and on Windows and the result is the same. I also tried with another destination folder, but the file is not found.
I would appreciate any help.
Using that syntax, scp will attempt to save the file with the name 'Desktop'.
Instead, save it inside your home folder by specifying a filename explicitly:
scp user@ip:/home/user/file.gzip $HOME/file.gzip
Or, to preserve the original filename and save it inside your 'Desktop' folder, append a forward slash and a period (/.):
scp user@ip:/home/user/file.gzip $HOME/Desktop/.
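If the transfer reports 100% but the file still seems missing, it may simply have landed somewhere other than expected (on Windows shells, $HOME is often not the profile's Desktop folder). A quick way to check, plus the equivalent trailing-slash form (assuming $HOME/Desktop exists as a directory):
# see where $HOME actually points and whether the file arrived there
echo "$HOME" && ls -lh "$HOME" "$HOME/Desktop"
# a trailing slash also makes the directory intent explicit
scp user@ip:/home/user/file.gzip "$HOME/Desktop/"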

Copying all files from a directory using a pig command

Hey, I need to copy all files from a local directory to HDFS using Pig.
In the Pig script I am using the copyFromLocal command with a wildcard in the source path,
i.e. copyFromLocal /home/hive/Sample/* /user
It says the source path doesn't exist.
When I use copyFromLocal /home/hive/Sample/ /user, it makes another directory in HDFS named 'Sample', which I don't need.
But when I include the file name, i.e. /home/hive/Sample/sample_1.txt, it works.
I don't need a single file. I need to copy all the files in the directory without creating a directory in HDFS.
PS: I've also tried *.txt, ?, ?.txt
No wildcards work.
Pig's copyFromLocal/copyToLocal commands work only on a single file or a directory. They will never take a series of files or a wildcard. Moreover, Pig concentrates on processing data from/to HDFS. To my knowledge, you can't even loop over the files in a directory with ls, because it lists files in HDFS. So, for this scenario, I would suggest you write a shell script/action (i.e. an fs command) to copy the files from the local filesystem to HDFS.
Check the link below for more info:
http://pig.apache.org/docs/r0.7.0/piglatin_ref2.html#copyFromLocal
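For the shell-script route suggested above, a minimal sketch using the paths from the question: the local shell expands the wildcard, so each file is passed to the HDFS command individually and lands directly under /user, with no extra Sample directory:
# run from a regular shell (not grunt); the * is expanded by the shell
hadoop fs -put /home/hive/Sample/* /user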