Why does Google Colab say: chmod: cannot access 'RDP.sh': No such file or directory

When I run the following code in a Google Colab cell:
! wget https://raw.githubusercontent.com/alok676875/RDP/main/RDP.sh &> /dev/null
! chmod +x RDP.sh
! ./RDP.sh
The result is as follows:
chmod: cannot access 'RDP.sh': No such file or directory
/bin/bash: ./RDP.sh: No such file or directory
Please tell me where the error is and what the solution is. Thank you

This file
https://raw.githubusercontent.com/alok676875/RDP/main/RDP.sh
was deleted, so you can't download it. The &> /dev/null redirection hides wget's 404 error, which is why the first message you see is chmod complaining that RDP.sh does not exist.
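A quick way to confirm this is to run wget without discarding its output, chaining the follow-up commands so they only run if the download succeeds (same URL and file name as above):
! wget https://raw.githubusercontent.com/alok676875/RDP/main/RDP.sh && chmod +x RDP.sh && ./RDP.sh
wget prints "404 Not Found" for a deleted file, and the && chaining stops chmod and the script from running after a failed download.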

Related

Cannot read .7z file in Google Colab

I am using Google Colab to create a deep learning model, and I face an issue when I run this code for the first time.
!p7zip -d filename.7z
I get the following message:
/usr/bin/p7zip: cannot read filename.7z
But when I re-run the same cell again, the code works.
Do you know the reason for this issue?
First, you have to specify the path before the file name. In my case:
# In Colab each ! command runs in its own shell, so !cd does not persist;
# create the nested directories in a single step instead
!mkdir -p ~/data/planet
# -c: competition name
# -f: which file you want to download
# -p: path to where the file should be saved
!kaggle competitions download -c planet-understanding-the-amazon-from-space -f train-jpg.tar.7z -p ~/data/planet/
# Unzip the 7zip file
# -d: decompress the named archive
!p7zip -d ~/data/planet/train-jpg.tar.7z
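If the first run still fails, check that the download actually landed where p7zip expects it before decompressing (path matches the download above):
!ls -lh ~/data/planet/
If the archive is not listed there, re-run the download step before calling p7zip.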

Impossible write into the AJXP_DATA_PATH folder ajaxplorer

I uploaded AjaXplorer "pydio-core-5.0.4.zip" to my server, and after extracting the files into a folder on the server I requested that folder to start the install, but I get this message:
"Impossible write into the AJXP_DATA_PATH folder: Make sure to grant write access to this folder for your webserver!"
I set the /data folder permissions to 777 and it did not change anything.
Any solution?
I had the same problem a few hours ago.
The problem:
You put full permissions (777) on the data folder, but the subfolders don't inherit them.
The solution:
sudo chmod -R 777 data
or
sudo mkdir -m 777 your_pydio_path/data/tmp/sessions
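A less sweeping alternative, assuming your web server runs as www-data (the account name varies by distribution), is to make that user the owner and keep the permission bits tighter:
sudo chown -R www-data:www-data data
sudo find data -type d -exec chmod 775 {} \;
sudo find data -type f -exec chmod 664 {} \;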
I know this is old, but I was having the same issue with pydio-core-6.0.8. Also, I'm going to preface this by saying that I am a php noob. But I was able to resolve my issue without a chmod 777 command. Instead, I made the nginx user the owner of the data directory.
chown -R nginx /path/to/pydio-core-6.0.8/data
And then made sure that php-fpm was running as the nginx user with the two php-fpm.conf settings
listen.owner = nginx
user = nginx
After restarting php-fpm, I was able to load the pydio page which went into the startup wizard.
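For reference, those two settings live in the php-fpm pool configuration; on many distributions the default pool file is /etc/php-fpm.d/www.conf (that path is an assumption, check your install):
; pool configuration, e.g. /etc/php-fpm.d/www.conf
user = nginx
listen.owner = nginx
Then restart php-fpm (for example, sudo systemctl restart php-fpm) so the change takes effect.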
The chmod -R 777 approach is easy, but it's dangerous!
Go to /var/www/pydio for apache2 or /usr/share/nginx/html/pydio for nginx and try:
chmod ugo+x data
It's more restrictive, and therefore better protected!

Drush make directory is not writable

I am running drush make and getting the following error:
Directory /Applications/MAMP/htdocs/geoslate/sites/default exists, but is not writable. Please check directory permissions. [error]
Unable to copy /tmp/make_tmp_1365076314_515d695acefc3/__build__/sites/default to ./sites/default. [error]
Cannot move build into place
I am not an expert on permissions in the terminal; can you give me a hand granting the directory write permissions? I have tried chmod -w /Applications/MAMP/htdocs/geoslate/sites/default and chmod u+x /Applications/MAMP/htdocs/geoslate/sites/default
chmod +w <directory> solved the problem. Note that chmod -w removes write permission rather than granting it, so the first command you tried made things worse.
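For the concrete path from the question, grant the write bit and then verify it took effect:
chmod u+w /Applications/MAMP/htdocs/geoslate/sites/default
ls -ld /Applications/MAMP/htdocs/geoslate/sites/default
The owner triplet in the ls output should now include w.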

mysqldump. Location of the file?

I made a dump of my db, but I can't find the file so I can download it. Where is the file located? Thanks in advance
root@xxxx:~# mysqldump -u root -p cherio > myBackup.sql
Enter password:
root@xxxx:~# find myBackup.sql
myBackup.sql
EDIT:
I tried this:
root@xxxx:~# find / -name "myBackup.sql"
/root/myBackup.sql
Ok, I had to refresh my FTP for the file to show up.
In the same directory where you made the dump. It's ./myBackup.sql, not just myBackup.sql.
Do an ls -al.
It is in the same directory where you ran this command. To check this, run "pwd"; it will print your current path. Then run the command, and it will store the database file in that same directory.
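To avoid hunting for the file at all, you can redirect the dump to an absolute path (the backup directory here is only an example):
mkdir -p /root/backups
mysqldump -u root -p cherio > /root/backups/myBackup.sql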
To get the generated file predictably, run the following command (note the option is --databases, not --database):
mysqldump --databases your_db_name --user=root --password > export_into_db.sql
This will give you the "export_into_db.sql" database file.
Play with it. Enjoy...
It is either in /root or in the current directory where you ran the dump command.
Try this command in the terminal:
locate myBackup.sql
(If it finds nothing, run updatedb first so the locate database is up to date.)
I had the same problem.
It turned out that, for some reason, I could not create any file in the root directory.
Try going into a deeper folder, for example:
cd /var/www
and then execute:
mysqldump -uUSERNAME -pPASSWORD DATABASE > backup.sql

Putty ssh commands zip all the files within this folder then download

So I cd into my folder:
ls
cgi-bin wp-comments-post.php wp-mail.php
googlec3erferfer228fc075b.html wp-commentsrss2.php wp-pass.php
index.php wp-config-sample.php wp-rdf.php
license.txt wp-config.php wp-register.php
php.ini wp-content wp-rss.php
readme.html wp-cron.php wp-rss2.php
wp-activate.php wp-feed.php wp-settings.php
wp-admin wp-includes wp-signup.php
wp-app.php wp-links-opml.php wp-trackback.php
wp-atom.php wp-load.php xmlrpc.php
wp-blog-header.php wp-login.php
(uiserver):u45567318:~/wsb454434801 >
What I want to do is zip all the files within this folder and then download the archive to my computer. I am really new to SSH and this is a client's website, but I really want to start using the command line for speed. I have been looking at this reference http://ss64.com/bash/ to find the right commands, but would really like some help from somebody, please.
Thanks
cd path/to/parent-folder
zip -r foldername.zip foldername
(the -r flag recurses, so any subdirectories are included)
Please try this; it will solve your problem.
If you are inside the directory itself, then:
zip -r zipfilename.zip *
Go to the folder's parent directory using the cd command, then:
zip -r foldername.zip foldername
Example: zip -r test-bkupname.zip test
Here test is the folder name.
tar zcvf ../my_directory.tar.gz .
will create the my_directory.tar.gz file one level up.
scp ../my_directory.tar.gz username@your-ip:/path/to/place/file
will transfer the file to your computer.
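Note that this scp invocation pushes from the server, which only works if your own machine accepts SSH connections. More commonly you would pull the archive by running scp on your own computer (user, host, and path are placeholders based on the prompt above):
scp u45567318@your-server.example.com:my_directory.tar.gz .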
Looks like this is the webroot directory.
Why not zip the directory above (httpdocs / html / whatever) and then move this into the website space, and download from there?
i.e. go into the directory above the web root. For example, if your web root is /var/www/html/ go into /var/www/ and run the following commands:
zip -r allwebfiles.zip html
mv allwebfiles.zip html/allwebfiles.zip
Then in your web browser go to http://mydomain.com/allwebfiles.zip and just download that file.
When extracting, you'd just need to either extract into /var/www/ OR extract into webroot and move all files up one level.
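If you go this route, remember to delete the archive afterwards so it isn't left publicly downloadable:
rm /var/www/html/allwebfiles.zip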
Use the following command to download multiple files at one time. Note that mget is an FTP-client command (ftp, lftp, and similar), not a plain ssh shell command:
mget ./*
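If your server only offers SFTP over SSH, OpenSSH's sftp client can do the same pull, since its get command accepts wildcards (user, host, and folder are placeholders from the prompt above):
sftp u45567318@your-server.example.com
sftp> cd wsb454434801
sftp> get ./*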