Can cd commands be used multiple times in a script? - mkdir

I am writing a script. Please confirm whether I can use multiple cd commands, as I have to create directories and cd into them several times for the job to run. So can I use cd again and again?
I have also written a small function to mkdir and cd in one command, but it isn't working:
1.
function mkdircd () { mkdir -p testjdk && eval cd "$_" ; }
mkdircd /tmp/testjdk
pwd
2.
mkdir test && cd "$_"
However, the 2nd one works if I run it directly on the command line, but inside the script it doesn't work.

I am assuming you want a bash script that makes a directory and then cds into it? Something like the script below will work.
You need to pass an argument to the script, which is then forwarded to the function. So $1 is the argument you give on the command line when you run the script, and inside the script that same argument is passed on to the function call.
Say this script is named test.sh; you would then run it with something like source test.sh ./my_dir, where ./my_dir is the relative path of the directory you want to create and enter. (Using source runs the script in your current shell, which is why the cd still has effect after the script finishes.) If you want to create and enter a directory under the root, run the script with sudo and specify the full path.
#!/bin/bash
# Function: create the directory (and any parents) and change into it
myFunction() {
  mkdir -p "$1"
  cd "$1" || return
}
# Function call: forward the script's first argument
myFunction "$1"
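For the asker's original one-liner, the same fix applies: quote the argument and reuse it, instead of the hard-coded name and "$_" (a minimal sketch):

```shell
# Corrected one-liner: create the directory named by the first
# argument and cd into it only if mkdir succeeded.
mkdircd() { mkdir -p -- "$1" && cd -- "$1"; }

mkdircd /tmp/testjdk
pwd    # prints: /tmp/testjdk
```

As with the script above, define this function in your shell (e.g. in ~/.bashrc) rather than running it as a separate script, so the cd affects your current shell.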

Related

Run multiple commands using wsl

I have some scripts in my home folder that I would like to run from the command line, and in particular from the Task Scheduler, but I can't find a way to do so.
I have tried
wsl -u Ubuntu -u jlanza "cd /home/jlanza/bin && ./myscript && ./myotherscript"
but it doesn't work.
How can I concatenate the execution of several commands under the same session?
You need to escape && so PowerShell does not treat it as separating its own commands. To pass && through to wsl, do the following:
wsl -d Ubuntu -u jlanza -- cd /home/jlanza/bin `&`& ./myscript `&`& ./myotherscript
You need to pass those commands to WSL without the quotes so that bash interprets them correctly as a series of chained commands, rather than a single long command. Unfortunately, when you do that, the && operator gets interpreted by the Windows command-line interpreter, and the commands following it do not get passed to WSL.
What I've found to work is replacing && (run the next command only if the preceding command exits with success) with a simple ; (run the next command regardless of how the preceding command exited). In your case, something like this should work:
cmd /c "wsl -d Ubuntu -u jlanza cd /home/jlanza/bin; ./myscript; ./myotherscript"
However, if your use case necessitates the && operator, I'd try saving the command you're running as a script or a shell alias inside WSL. Calling that would then save you from having to pass && through the Windows command-line interpreter.
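A minimal sketch of that wrapper idea, with echo commands standing in for the asker's ./myscript and ./myotherscript:

```shell
# The whole && chain lives inside one shell invocation, so the
# Windows side never has to pass the operators through.
sh -c 'cd /tmp && echo myscript-ran && echo myotherscript-ran'
```

Saved as a script inside WSL (say ~/bin/run-jobs.sh, a hypothetical name), it could then be called with a plain wsl -d Ubuntu -u jlanza ~/bin/run-jobs.sh.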
C:\Windows\System32\wsl.exe -d Ubuntu -u jlanza sh -c ". ~/.profile && script-jlanza.sh"
This way I load my own profile and then I'm able to run the commands.
The idea is just to run sh (or any shell) and then execute the commands you'd like within that shell. The other answer is also valid, but I like mine most, as I'm able to use my user profile (path, aliases, etc.).

Apache pig script delete a folder if exists

I want to delete the output folder of the previous execution from within an Apache Pig script. This command works fine:
sh [ -e /home/LocalPig/test ] && rm -rf /home/LocalPig/test
but if I write
sh OutputFile=/home/LocalPig/test
sh [ -e $OutputFile ] && rm -rf $OutputFile
I got the error about OutputFile!
ERROR 2997: Encountered IOException. org.apache.pig.tools.parameters.ParameterSubstitutionException: Undefined parameter : OutputFile
Does anybody have any idea?
Thanks
Hope this solves the problem. It's simply the command below in your .pig script file; you don't have to write any shell command at all. It can be accomplished from within the Pig environment using the built-in fs command.
For example, put a statement like the one below in your Pig script. It will not error out when the folder does not exist: it deletes the folder if it exists, and otherwise exits the statement gracefully.
fs -rm -r -f /user/horton/denver_total;
Of course you can also do a lot of this work outside Pig, but it is very useful to perform any delete within the script that controls creation of the data. It makes it simpler to trace the lineage of creating and destroying those files.
Reference: Parameter Substitution
%declare OutputFile '/home/LocalPig/test'
sh [ -e '$OutputFile' ] && rm -rf '$OutputFile'

How to use Command Grep inside a script

I'm using this inside a script:
VAR=$(grep -c mac myfile.tmp)
echo $VAR
The result is 0 when I run the script, but if I run the command directly on the command line it returns the real value, which is 1.
Does anyone know what the problem is?

How to make shell script run by double-click?

I have a script which is executable from the command line, but I want to make it more user-friendly by letting it run on a double-click. How is that possible?
#! /bin/bash
cd
cd Desktop/D_usman/
java -jar imageSynch.jar
You may need to add ".command" to the end of your shell script's name on Mac OS X.
You need to add execute permissions for the user, group, or rest of the world, depending on who should be allowed to execute it. Look into chmod for more information.
Example: chmod u+x myscript.sh
Once you do this, you can also start the shell script with ./myscript.sh instead of sh myscript.sh.
Note: You can also make your JAR start by adding execute permission, given Java is setup correctly on your machine.
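The permission step can be sketched end to end (the /tmp/myscript.sh path is just for illustration):

```shell
# Create a tiny script, grant its owner execute permission,
# then run it directly by path instead of via "sh".
printf '#!/bin/sh\necho hello\n' > /tmp/myscript.sh
chmod u+x /tmp/myscript.sh
/tmp/myscript.sh    # prints: hello
```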

Unable to run a postgresql script from bash

I am learning shell scripting. I have created a shell script whose function is to log into the DB and run a .sql file. These are the contents of the script:
#!/bin/bash
set -x
echo "Login to postgres user for autoqa_rpt_production"
$DB_PATH -U $POSTGRESS_USER $Auto_rpt_production$TARGET_DB -p $TARGET_PORT
echo "Running SQL Dump - auto_qa_db_sync"
\\i auto_qa_db_sync.sql
After running the above script, I get the following error
./autoqa_script.sh: 39: ./autoqa_script.sh: /i: not found
Following one article, I tried reversing the slash, but it didn't work.
I don't understand why this is happening, because when I run the SQL file manually it works properly. Can anyone help?
#!/bin/bash
set -x
echo "Login to postgres user for autoqa_rpt_production and run script"
$DB_PATH -U $POSTGRESS_USER $Auto_rpt_production$TARGET_DB -p $TARGET_PORT -f auto_qa_db_sync.sql
The lines you put in a shell script are (more or less; let's say so for now) equivalent to what you would type at the Bash prompt (the one ending with $, or # if you're root). When you execute a script (a list of commands), each command runs after the previous one terminates.
What you wanted to do is run the client and issue the \i auto_qa_db_sync.sql command inside it.
What you actually did was run the client and then, after the client terminated, issue that command to Bash.
You should read about Bash pipelines; they are the way to run programs and feed text into them. Following your original idea for solving the problem, you'd write something like:
echo '\i auto_qa_db_sync.sql' | $DB_PATH -U $POSTGRESS_USER $Auto_rpt_production$TARGET_DB -p $TARGET_PORT
Hope that helps to understand.
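The same pipeline pattern can be seen with the shell itself standing in for the psql client: whatever echo writes becomes the child program's standard input.

```shell
# The pipe feeds the echoed text to sh's stdin, just as the \i
# command above is fed to psql.
echo 'echo ran-inside-child' | sh    # prints: ran-inside-child
```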