how to set export command for apache user - apache

I'm developing a site where I need to use a program, but I have to export one variable first. From the command line as my own user it's easy:
export MY_VAR=/PATH
The problem is that when I try to execute the same program through the website I get an error telling me this:
Set environment variable $MY_VAR to the location
I've tried executing a script, but I think the variable only gets set for my own user. I mean, I've tried to do this in PHP:
echo exec('sh /PATH/exportscript.sh');
Also this:
$cmd = "export";
$path="MY_VAR=/PATH/data_tables/";
$export= $cmd . " " . $path;
shell_exec($export);
No result with this command either... any other option? How can I set MY_VAR for Apache?
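For what it's worth, the shell_exec('export ...') approach can never work: the variable is set only inside the short-lived shell that PHP spawns, and it disappears when that shell exits. A minimal sketch of one common alternative, reusing your /PATH/data_tables/ placeholder (the program path below is hypothetical):

<?php
// Set the variable in the PHP process itself; commands started
// with exec()/shell_exec() inherit the process environment.
putenv('MY_VAR=/PATH/data_tables/');
echo exec('/PATH/my_program');   // hypothetical path to the program you run

The other common option is to export MY_VAR in the environment Apache itself starts with - /etc/apache2/envvars on Debian/Ubuntu or /etc/sysconfig/httpd on RHEL/CentOS, depending on your distribution - and then restart Apache.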

Related

How can I pass parameters to a script block in a new process

I want to start a new powershell session from pwsh core to run some code that is designed to run in powershell 5.1 (it checks the version table).
I can get the script block to execute fine, but I want to pass variable values from my pwsh session to the new session.
An example that isn't working is:
7.0.3: >_ $block = {param($name)Write-Host "Hello, $name. How are you?"}
7.0.3: >_ start powershell -argumentlist "-noexit $block 'friend'"
new window opens:
5.1: >_ Hello, . How are you? friend
5.1: >_
However, when I wrap this in a full blown .ps1 script it seems to work fine.
The problem is that when $block is expanded inside the double-quoted string its braces are dropped, so the new session never receives a script block and 'friend' is never bound to $name. Invoking the body explicitly as a script block works:
start powershell -ArgumentList '-noexit & {param($name)Write-Host "Hello, $name. How are you?"} friend'
I used the & call operator to invoke it; you could also use the dot-sourcing operator (.):
start powershell -ArgumentList '-noexit . {param($name)Write-Host "Hello, $name. How are you?"} friend'
output in the new console
Hello friend . How are you?
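If you would rather keep the code in the $block variable, one option (a sketch only - quoting through Start-Process can be finicky) is to let the string expansion do its thing and put the braces and the & back yourself:

$block = {param($name)Write-Host "Hello, $name. How are you?"}
# "$block" expands to the block's body without the braces,
# so wrap it in { } again and invoke it with &
start powershell -ArgumentList "-noexit & {$block} 'friend'"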

read -p command (input) inside the while loop

I am trying to read multiple inputs from the user using a while loop.
I tried the following:
while read a
do
    read -p "Enter Server Names : " multi_act
done < file1
The 'read -p' command is not working inside the while loop. How can I solve this?
I got it fixed as below
exec 5< file1
while read <&5 a; do
echo before
read var
echo after
done
exec 5<&-
It's working like a charm: the outer loop now reads from file descriptor 5, so standard input stays attached to the terminal and the inner read can prompt the user.
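Another option worth knowing (a sketch, assuming the script runs with a terminal attached): leave the loop reading from the file on stdin and point the inner read at /dev/tty instead, so the two never compete for the same input:

while read -r a; do
    echo "before: $a"
    # read the answer from the terminal, not from file1
    read -p "Enter Server Names : " multi_act < /dev/tty
    echo "after: $multi_act"
done < file1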

Cronjob does not execute command line in perl script

I am unfamiliar with Linux and the Linux environment, so do pardon me if I make any mistakes; do comment if anything needs clarifying.
I have created a simple Perl script. The script creates an SQL file and, as shown, executes the statements in that file to insert rows into the database.
#!/usr/bin/perl
use strict;
use warnings;
use POSIX 'strftime';
my $SQL_COMMAND;
my $HOST = "i";
my $USERNAME = "need";
my $PASSWORD = "help";
my $NOW_TIMESTAMP = strftime '%Y-%m-%d_%H-%M-%S', localtime;
open my $out_fh, '>>', "$NOW_TIMESTAMP.sql" or die 'Unable to create sql file';
printf {$out_fh} "INSERT INTO BOL_LOCK.test(name) VALUES ('wow');";
sub insert()
{
    my $SQL_COMMAND = "mysql -u $USERNAME -p'$PASSWORD' ";
    while( my $sql_file = glob '*.sql' )
    {
        my $status = system ( "$SQL_COMMAND < $sql_file" );
        if ( $status == 0 )
        {
            print "pass";
        }
        else
        {
            print "fail";
        }
    }
}
insert();
This works if I execute it while I am logged in as a user (I do not have access to admin). However, when I set a cron job to run this file at, say, 10.08 am, using the following line (in crontab -e):
08 10 * * * perl /opt/lampp/htdocs/otpms/Data_Tsunami/scripts/test.pl > /dev/null 2>&1
I know the script is being executed, as the SQL file is created. However, no new rows are inserted into the database after 10.08 am. I've searched for solutions and some suggested using the DBI module, but it's not available on the server.
EDIT: Didn't manage to solve it in the end. A root/admin account was used to execute the script, so that "solved" the problem.
First things first, get rid of the > /dev/null 2>&1 at the end of your crontab entry (at least temporarily) so you can actually see any errors that may be occurring.
In other words, change it temporarily to something like:
08 10 * * * perl /opt/lampp/htdocs/otpms/Data_Tsunami/scripts/test.pl >/tmp/myfile 2>&1
Then you can examine the /tmp/myfile file to see what's being output.
The most likely case is that mysql is not actually on the path in your cron job, because cron itself gives a rather minimal environment.
To fix that problem (assuming that's what it is), see this answer, which gives some guidelines on how best to expand the cron environment to give you what you need. That will probably just involve adding the MySQL executable directory to your PATH variable.
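A minimal sketch of what that usually looks like at the top of the crontab - the /opt/lampp/bin directory is a guess based on the XAMPP-style path in your entry, so substitute whatever `which mysql` reports when you run it interactively:

PATH=/usr/local/bin:/usr/bin:/bin:/opt/lampp/bin
08 10 * * * perl /opt/lampp/htdocs/otpms/Data_Tsunami/scripts/test.pl >/tmp/myfile 2>&1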
The other thing you may want to consider is closing the out_fh file before trying to pass it to mysql - if the buffers haven't been flushed, it may still be an empty file as far as other processes are concerned.
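In the posted script that would mean something like this right after the printf, before insert() is called (using the OP's variable names):

close $out_fh or die "Unable to close sql file: $!";
insert();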
"The expression glob(".* *") matches all files in the current working directory."
- http://perldoc.perl.org/functions/glob.html
You should not rely on the working directory in a cron job. If you want to use a glob (or any file operation) with a relative path, set the working directory with chdir first.
source: http://www.perlmonks.org/bare/?node_id=395387
So if your working directory is, for example /home/user, you should insert
chdir('/home/user/');
before the while loop, i.e.:
sub insert()
{
    my $SQL_COMMAND = "mysql -u $USERNAME -p'$PASSWORD' ";
    chdir('/home/user/');
    while( my $sql_file = glob '*.sql' )
    {
        ...
Replace /home/user with wherever your SQL files are being created.
It's better to do as much processing within Perl as possible. It avoids the overhead of spawning a separate shell process and leaves everything under the control of the program, so that you can handle any errors much more simply.
Database access from Perl is done using the DBI module. This program demonstrates how to achieve what you wrote with the mysql utility. As you can see, it's also much more concise:
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
my $host = "i";
my $username = "need";
my $password = "help";
my $dbh = DBI->connect("DBI:mysql:database=test;host=$host", $username, $password);
my $insert = $dbh->prepare('INSERT INTO BOL_LOCK.test(name) VALUES (?)');
my $rv = $insert->execute('wow');
print $rv ? "pass\n" : "fail\n";
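One small addition worth considering on top of that answer: pass RaiseError in the connect attributes so a failed connection or statement dies with the DBI error message instead of failing silently:

my $dbh = DBI->connect(
    "DBI:mysql:database=test;host=$host",
    $username, $password,
    { RaiseError => 1 },   # die with the DBI error on any failure
);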

Powershell piping to variable and write-host at the same time

Hello all, and thanks for your time in advance.
I'm running into a slight issue.
I'm running a command and piping it into a variable so I can manipulate the output.
$variable = some command
This normally works fine but doesn't output what's happening to the screen, which is fine most of the time. However, occasionally this command requires some user input (yes, no or skip, for example), and since nothing is written to the console, it just sits there hanging instead of prompting the user. If you know what to expect, you can hit y, n or s and it will proceed normally.
Is there any way to run the command so that the output is piped to a variable AND appears on screen? I've already tried:
($variable = some command)
I've also tried:
write-host ($variable = some command)
Neither works. Note that the command being run isn't a native Windows or shell command, and I cannot just run it twice in a row.
To clarify (because I probably wasn't clear), I've also tried:
$variable = some command | Out-Host
and
$variable = some command | out-default
with all their parameters, but the "prompt" from the command (to enter y, n or s) doesn't show up.
Being able to pass S automatically would also be acceptable.
Sounds like you need Tee-Object. Example:
some command | Tee-Object -Variable MyVariable
This should pass everything from the command down the pipe, as well as store all output from the command in the $MyVariable variable for you.
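For example, against a stand-in for your external tool (some-tool.exe is just a placeholder here):

# Output goes to the console as usual AND is captured for later use
.\some-tool.exe | Tee-Object -Variable toolOutput
$toolOutput   # inspect or manipulate the captured output afterwards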
You need to give some specific example that doesn't work. I tried this and it works:
function test { $c = read-host "Say something"; $c }
$x = test
I still see "Say something". read-host does not output to standard output so your problem is surprising. Even this works:
read-host "Say something" *> out
=== EDIT ===
Since this is interaction with cmd.exe you have two options AFAIK. First, test command:
test.cmd
#echo off
set /p something="Say something: "
echo %something%
This doesn't work as you said: $x= ./test.cmd
To make it work:
a) Replace the above command with: "Say something:"; $x= ./test.cmd. This is obviously not ideal in the general scenario, as you might not know in advance what the *.cmd will ask. But when you do know, it's very easy.
b) Try this:
Start-transcript test_out;
./test.cmd;
Stop-transcript;
gc .\test_out | sls 'test.cmd' -Context 0,1 | select -Last 1 | set y
rm test_out
$z = ($y -split "`n").Trim()
After this, the $z variable contains: Say something: something. This could be a good general solution that you could convert to a function:
$z = Get-CmdOutput test.cmd
Details of the text parsing might be slightly different in the general case - I assumed here that only one question is asked and the answer is on the same line - but with some work you will be able to capture everything the cmd.exe script output.
=== EDIT 2 ===
This might be a better general extraction:
$a = gc test_out; rm test_out
$z = $a | select -Index 14,($a.count-5)
$z
$variable = ""
some command | % {$variable += $_;"$_"}
This executes the command, and each line of output is both added to $variable and printed to the console.

dynamically fetching dynamic variable's value from properties file

The Unix commands below work:
export myTempVar=myTempVar1
export myTempVar1=myTempVar2
eval echo '$'$myTempVar
This correctly prints myTempVar2.
However, what if myTempVar1=myTempVar2 is present in a properties file instead of directly in the script?
So my script will have
. $MYDIR/myProperties.properties
myTempVar=myTempVar1
myTempVar3=eval echo '$'$myTempVar
The lines above do not work; the value of myTempVar3 does not come out as myTempVar2.
myProperties.properties has the line below:
myTempVar1=myTempVar2
Using indirection is far safer than eval:
#!/bin/bash
. $MYDIR/myProperties.properties # myTempVar1=myTempVar2
myTempVar=myTempVar1
myTempVar3=${!myTempVar}
echo $myTempVar3
Gives:
myTempVar2
and you don't need the echo in eval:
eval myTempVar3='$'$myTempVar
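For completeness, the reason the original attempt fails is that myTempVar3=eval echo '$'$myTempVar only sets myTempVar3=eval in the environment of the echo command, so nothing is captured in the current shell. Capturing the result needs command substitution. A sketch with the same names (indirection shown alongside, as above):

. $MYDIR/myProperties.properties         # provides myTempVar1=myTempVar2
myTempVar=myTempVar1
myTempVar3=$(eval echo '$'"$myTempVar")  # or simply: myTempVar3=${!myTempVar}
echo "$myTempVar3"                       # prints myTempVar2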