How to delete a text file in google colab standard path? - google-colaboratory

I am using Google Colab and have saved some PNGs, and now I want to delete them. Is there any way to do it? My current path is /content.

Use the %cd magic to switch to whatever directory holds the files and then use shell commands to remove them.
For example, if you have a file at /content/directory/a.png, run:
%cd /content/directory
!rm a.png
If you want to remove all .png files, adjust your rm command like so: !rm *.png.
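If you would rather do it from Python instead of shell commands, here is a minimal sketch (assuming the files live under /content/directory; adjust the path to your own folder):
import glob, os

# delete every .png under /content/directory
for path in glob.glob('/content/directory/*.png'):
    os.remove(path)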

Related

How do I change path directory in Jupyter lab?

How do I change the initial path/directory in JupyterLab when I want to get a file via "~/"?
I have tried generating a config and then changing some parameters, but only got confused.
You can move a file to a different directory like this:
import shutil

# move the file from its current location to the destination path
File = r'C:\Users\ivan\Desktop\Somewhereidonotknow\example.csv'
Whereyou_want = r'C:\Users\ivan\Desktop\example.csv'
shutil.move(File, Whereyou_want)
You should use the %cd magic command to change the working directory. Then, to use tab completion, type ./ and hit Tab at the point where you want to choose your CSV file.
In the demonstration set-up for the screenshot below I made a test directory in the root (home) location and made two CSV files in there.
Using %cd test first I am then able to use tab completion to get the option to select one of the two CSV files:
I probably should have included running pwd to 'print the working directory' after I ran the %cd test command to demonstrate things more fully.
Before I executed the command %cd test, tab completion was showing the root (home) directory.
The tilde symbol (~) always means the HOME directory on the system. It won't change. So you were always specifying to start in HOME in your example in your post, no matter what the current working directory is in the notebook's active namespace. You want to use relative paths for when the working directory has been adjusted.
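A quick way to see the difference in a notebook cell (the test directory and the data.csv name here are just illustrative):
%cd test
import os
# resolved against the new working directory (.../test)
print(os.path.abspath('./data.csv'))
# always resolved against HOME, no matter what the working directory is
print(os.path.expanduser('~/data.csv'))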
There are more complex settings you can take advantage of using inside the notebook in conjunction with the %cd magic.
For example, this post and answer show how you can use the %bookmark magic to assign a directory to a bookmark, after which you can more easily switch between directories using %cd.
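A minimal sketch of that pattern (the bookmark name and directory are just examples):
# remember a location under the name "data"
%bookmark data ~/test
# later, jump back to it from any working directory
%cd -b data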

How do I configure black to use different formatting rules for different file extensions?

I use black to format normal .py files as well as Jupyter Notebook files (.ipynb). For notebooks, I want a shorter line length.
Is it possible to specify different formatting rules for different file extensions with black?
You could create two separate configuration files, one for .py files and one for .ipynb files, and run black separately with each.
Some useful flags from the docs:
--config FILE Read configuration from FILE path.
--include TEXT A regular expression that matches files and directories that should be included on recursive searches.
So, to format the two types of files, run something like this (note that --include takes a regular expression, not a glob pattern):
python -m black --config pyproject.py.toml --include '\.py$' src
python -m black --config pyproject.ipynb.toml --include '\.ipynb$' src
You could also specify the include field inside the toml files. That is in the docs too:
[tool.black]
line-length = 88
target-version = ['py37']
include = '\.pyi?$'
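For instance, the two config files referenced in the commands above could look something like this (the line lengths are only illustrative, and formatting .ipynb files also requires black to be installed with its jupyter extra):
# pyproject.py.toml
[tool.black]
line-length = 88
include = '\.py$'

# pyproject.ipynb.toml
[tool.black]
line-length = 79
include = '\.ipynb$'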

Github in Parent Directory of Google Colab

I'm a noob to Google Colab and Python. I'm attempting to import a custom set of scripts from a Github directory. I'm using the following:
!git clone https://github.com/theAIGuysCode/tensorflow-yolov4-tflite.git
By default, this will clone into a folder named after the repository. However, the functions in the needed scripts reference the parent directory rather than the repository folder name. Example:
Google Colab Screenshot
Is there a method for importing the git in the parent directory so the scripts can run without modifying the file hierarchy in each script?
The error is that you are in a different directory. Most likely the current directory is /content/ if those two cells in the picture are at the top of the notebook.
You need to change the directory before you can call save_model.py; then it will work as expected. Use !pwd to check the current directory.
Before the last cell, change the directory to the one where the desired code is. In this case that would be:
%cd "/content/tensorflow-yolov4-tflite"
If you are unsure about the path, right-click the folder in the file browser and select Copy path to use with the %cd command.
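Put together, the cells might look like this (the exact arguments for save_model.py depend on your workflow and are omitted here):
!git clone https://github.com/theAIGuysCode/tensorflow-yolov4-tflite.git
%cd /content/tensorflow-yolov4-tflite
!python save_model.py   # add whatever flags your setup needs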

How to download different folders from different directories of an S3 bucket, maybe with bash?

I have a .txt file listing folders in an S3 bucket that I want to download from different directories. I am able to use the cp command to download one folder, but I am not sure how to run a CLI command that takes the .txt file and downloads the different folders from their different directories. Any guidance would be highly appreciated.
Update: My S3 directory looks like this, and I want to download folders A1, A2, B5, B6, C9, and C11. I have a .txt file with the list of folders.
s3://storage-folder/Folder A/A1
s3://storage-folder/Folder A/A2
s3://storage-folder/Folder B/B5
s3://storage-folder/Folder B/B6
s3://storage-folder/Folder C/C9
s3://storage-folder/Folder C/C11
I want to get them locally on my machine as:
Folder A/A1
Folder A/A2
Folder B/B5
Folder B/B6
Folder C/C9
Folder C/C11
For only one folder, I am using the following cp command:
aws s3 cp "s3://storage-folder/Folder A/A1" "./Folder A/A1/" --recursive
Here is what your bash script would look like:
while IFS="" read -r line || [ -n "$line" ]
do
    # strip "s3://bucket-name/" from the URI to build the matching local path
    local_path="./$(echo "$line" | cut -d '/' -f4-)"
    aws s3 cp "$line" "$local_path" --recursive
done < file.txt
While loop explained here: Looping through the content of a file in Bash
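For example, assuming you save the loop above as download_folders.sh (a name chosen here just for illustration) next to your list file:
# file.txt contains one S3 URI per line, e.g. s3://storage-folder/Folder A/A1
bash download_folders.sh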
Also, I strongly recommend that you not use spaces in your paths; that will help you avoid some nasty quote escaping.

Custom `rsync` command to sync my Documents and Dropbox?

This is what I want to achieve:
Dropbox Directory Structure:
Dropbox/
1passwordstuff
Music
documentfolder1
documentfolder2
Documents Structure:
Documents/
documentfolder1
documentfolder2
Then, I want to do all of my work within the Documents folder. So let's say I make some changes to a file in documentfolder1; then I want to call a command like rsync ... and have all of my changes pushed into Dropbox. I've managed to achieve this with rsync -r --ignore-existing Documents Dropbox, but there's a problem. Let's say I delete a file like Documents/documentfolder1/somefile; then I want that file to also be deleted from my Dropbox folder. I don't know how to do this.
Any help?
Voted to close, since this question isn't programming-related, but I think you want rsync --delete.
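A sketch of what that could look like, syncing each document folder separately so unrelated Dropbox folders (1passwordstuff, Music) are left untouched:
rsync -a --delete Documents/documentfolder1/ Dropbox/documentfolder1/
rsync -a --delete Documents/documentfolder2/ Dropbox/documentfolder2/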
Why not simply use symbolic links?
Create a symbolic link in the Dropbox folder to the Documents folder; everything will get synced, and you can still work in your Documents location.
Just go to your Dropbox folder and run:
ln -s PATH_TO_DOCUMENTS Documents
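For example, if your documents live at ~/Documents (adjust to your own path):
cd ~/Dropbox
ln -s ~/Documents Documents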