Apache unable to load perl module

I am experimenting with setting up an Apache server and running some perl scripts on it, but I'm running into some issues with my Apache and Perl config.
Initially, I was getting a 500 server error when trying to run a cgi script, and the error log showed that Apache was looking in the wrong @INC for a module I was using.
So, in httpd.conf, I added this line:
SetEnv PERL5LIB /Users/rasha/.plenv/versions/5.34.0/lib/perl5/site_perl/5.34.0/darwin-2level:/Users/rasha/.plenv/versions/5.34.0/lib/perl5/site_perl/5.34.0:/Users/rasha/.plenv/versions/5.34.0/lib/perl5/5.34.0/darwin-2level:/Users/rasha/.plenv/versions/5.34.0/lib/perl5/5.34.0
which I got by running: perl -e 'print join "\n", @INC;'
Now I am still getting a 500 server error, but with a different error message:
[Tue Feb 22 14:24:32.919661 2022] [cgi:error] [pid 35434] [client 127.0.0.1:55187] AH01215: Can't load '/Users/rasha/.plenv/versions/5.34.0/lib/perl5/site_perl/5.34.0/darwin-2level/auto/List/Util/Util.bundle' for module List::Util: dlopen(/Users/rasha/.plenv/versions/5.34.0/lib/perl5/site_perl/5.34.0/darwin-2level/auto/List/Util/Util.bundle, 0x0001): symbol not found in flat namespace '_PL_DBsub' at /Users/rasha/.plenv/versions/5.34.0/lib/perl5/5.34.0/XSLoader.pm line 96.: /usr/local/var/www/cgi-bin/test
The test script I am trying to run:
#!/usr/bin/env perl
use strict;
use warnings;
use DBI;
use CGI qw(:standard);
my ($dbh, $sth, $count);

$dbh = DBI->connect("DBI:mysql:host=localhost;database=xxxx",
                    "xxxx", "xxxx",
                    {PrintError => 0, RaiseError => 1});

$sth = $dbh->prepare("Select name, wins, losses from teams");
$sth->execute;

print header, start_html("team data");

$count = 0;
while (my @val = $sth->fetchrow_array) {
    print p(sprintf("name = %s, wins = %d, losses = %d\n", $val[0], $val[1], $val[2]));
    $count++;
}

print p("$count rows total"), end_html;

$sth->finish;
$dbh->disconnect;
Does anyone have any ideas what the issue might be?
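One thing worth checking: the "symbol not found in flat namespace '_PL_DBsub'" error typically means an XS module built for one perl binary (here the plenv 5.34.0 build) is being loaded by a different perl interpreter, which is exactly what pointing PERL5LIB at plenv's module tree would cause if Apache is still running the system perl. A minimal diagnostic CGI like the sketch below (the filename is just an example) can confirm which interpreter and @INC Apache actually uses:

#!/usr/bin/env perl
# which-perl.cgi -- hypothetical diagnostic script; drop it into the same
# cgi-bin directory as the failing test script.
use strict;
use warnings;

print "Content-Type: text/plain\r\n\r\n";
print "Interpreter: $^X\n";                 # the perl binary Apache actually ran
print "Version:     $]\n";
print "\@INC:\n";
print "  $_\n" for @INC;
print "PERL5LIB:    ", ($ENV{PERL5LIB} // '(not set)'), "\n";

If that reports the system perl rather than the plenv build, one candidate fix is to drop the SetEnv PERL5LIB line and point the script's shebang directly at the plenv perl (e.g. ~/.plenv/versions/5.34.0/bin/perl), so the interpreter and the XS modules it loads come from the same build.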

Related

inets httpd cgi script: How do you retrieve json data?

The cgi scripts that I have tried are unable to retrieve json data from my inets httpd server.
In order to retrieve json data in a cgi script, you need to be able to read the body of the request, which will contain something like:
{"a": 1, "b": 2}
With a perl cgi script, I can read the body of a request like this:
my $cgi = CGI->new;
my $req_body = $cgi->param('POSTDATA');
I assume that is an indirect way of reading what the server pipes to the script's stdin because in a python cgi script I have to write:
req_body = sys.stdin.read()
When I request a cgi script from an apache server, my perl and python cgi scripts can successfully get the json data from apache. But when I request the same cgi scripts from my inets httpd server, my perl cgi script reads nothing for the request body, and my python cgi script hangs until the server times out. My cgi scripts are able to retrieve data formatted as "a=1&b=2" from an inets httpd server; in that case the cgi facilities in both perl and python automatically parse the data for me, so instead of trying to read the body of the request, I just access the structures that cgi created.
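For reference, here is a minimal Perl sketch of how those two cases differ under Apache, where CGI parses form-encoded bodies into params but leaves other content types available as the raw POSTDATA param (the field names a and b are just the ones from the example body above):

#!/usr/bin/env perl
use strict;
use warnings;
use CGI;
use JSON;    # assumes the JSON module is installed

my $q    = CGI->new;
my $type = $ENV{CONTENT_TYPE} // '';

if ($type =~ m{application/json}) {
    # Raw body: CGI stores an unparsed POST body under the POSTDATA param.
    my $body = $q->param('POSTDATA') // '';
    my $data = decode_json($body);
    print $q->header('text/plain'), "a=$data->{a} b=$data->{b}\n";
}
else {
    # Form-encoded body (a=1&b=2): CGI has already parsed it into params.
    print $q->header('text/plain'),
          "a=", scalar $q->param('a'), " b=", scalar $q->param('b'), "\n";
}

As described below, this only behaves this way under Apache; under inets httpd the CGI->new call itself is what hangs.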
Here is my httpd server configuration (server.conf):
[
 {modules, [
    mod_alias,
    mod_actions,
    mod_esi,
    mod_cgi,
    mod_get,
    mod_log
 ]},
 {bind_address, "localhost"},
 {port, 0},
 {server_name, "httpd_test"},
 {server_root, "/Users/7stud/erlang_programs/inets_proj"},
 {document_root, "./htdocs"},
 {script_alias, {"/cgi-bin/", "/Users/7stud/erlang_programs/inets_proj/cgi-bin/"} },
 {erl_script_alias, {"/erl", [mymod]} },
 {erl_script_nocache, true},
 {error_log, "./errors.log"},
 {transfer_log, "./requests.log"}
].
I start my httpd server with this program (s.erl):
-module(s).
-compile(export_all).

%% Need to look up port with httpd:info(Server)

ensure_inets_start() ->
    case inets:start() of
        ok -> ok;
        {error, {already_started, inets}} -> ok
    end.

start() ->
    ok = ensure_inets_start(),
    {ok, Server} = inets:start(httpd,
                               [{proplist_file, "./server.conf"}]),
    Server.

stop(Server) ->
    ok = inets:stop(httpd, Server).
My cgi script (1.py):
#!/usr/bin/env python3
import json
import cgi
import cgitb
cgitb.enable()  # errors to browser
import sys

sys.stdout.write("Content-Type: text/html")
sys.stdout.write("\r\n\r\n")
#print("<div>hello</div>")

req_body = sys.stdin.read()
my_dict = json.loads(req_body)

if my_dict:
    a = my_dict.get("a", "Not found")
    b = my_dict.get("b", "Not found")
    total = a + b
    print("<div>Got json: {}</div>".format(my_dict))
    print("<div>a={}, b={}, total={}</div>".format(a, b, total))
else:
    print("<div>Couldn't read json data.</div>")
My cgi script (1.pl):
#!/usr/bin/env perl
use strict;
use warnings;
use 5.020;
use autodie;
use Data::Dumper;
use CGI;
use CGI::Carp qw(fatalsToBrowser);
use JSON;
my $q = CGI->new;
print $q->header,
      $q->start_html("Test Page"),
      $q->h1("Results:"),
      $q->div("json=$json"),
      $q->end_html;
Server startup in terminal window:
~/erlang_programs/inets_proj$ erl
Erlang/OTP 20 [erts-9.2] [source] [64-bit] [smp:4:4] [ds:4:4:10] [async-threads:10] [hipe] [kernel-poll:false]
Eshell V9.2 (abort with ^G)
1> c(s).
s.erl:2: Warning: export_all flag enabled - all functions will be exported
{ok,s}
2> Server = s:start().
<0.86.0>
3> httpd:info(Server).
[{mime_types,[{"htm","text/html"},{"html","text/html"}]},
{server_name,"httpd_test"},
{erl_script_nocache,true},
{script_alias,{"/cgi-bin/",
"/Users/7stud/erlang_programs/inets_proj/cgi-bin/"}},
{bind_address,{127,0,0,1}},
{modules,[mod_alias,mod_actions,mod_esi,mod_cgi,mod_get,
mod_log]},
{server_root,"/Users/7stud/erlang_programs/inets_proj"},
{erl_script_alias,{"/erl",[mymod]}},
{port,51301},
{transfer_log,<0.93.0>},
{error_log,<0.92.0>},
{document_root,"./htdocs"}]
4>
curl request:
$ curl -v \
> -H 'Content-Type: application/json' \
> --data '{"a": 1, "b": 2}' \
> http://localhost:51301/cgi-bin/1.py
* Trying ::1...
* TCP_NODELAY set
* Connection failed
* connect to ::1 port 51301 failed: Connection refused
* Trying 127.0.0.1...
* TCP_NODELAY set
* Connected to localhost (127.0.0.1) port 51301 (#0)
> POST /cgi-bin/1.py HTTP/1.1
> Host: localhost:51301
> User-Agent: curl/7.58.0
> Accept: */*
> Content-Type: application/json
> Content-Length: 16
>
* upload completely sent off: 16 out of 16 bytes
===== hangs for about 5 seconds ====
< HTTP/1.1 504 Gateway Time-out
< Date: Thu, 08 Mar 2018 11:02:27 GMT
< Content-Type: text/html
< Server: inets/6.4.5
* no chunk, no close, no size. Assume close to signal end
<
* Closing connection 0
$
My directory structure:
~/erlang_programs$ tree inets_proj/
inets_proj/
├── apache_cl.erl
├── cgi-bin
│   ├── 1.pl
│   └── 1.py
├── cl.beam
├── cl.erl
├── errors.log
├── htdocs
│   └── file1.txt
├── mylog.log
├── mymod.beam
├── mymod.erl
├── old_server.conf
├── old_server3.conf
├── old_server4.conf
├── requests.log
├── s.beam
├── s.erl
├── server.conf
└── urlencoded_post_cl.erl
I dug up the RFC for the cgi spec, which says:
RFC 3875 CGI Version 1.1 October 2004
4.2. Request Message-Body
Request data is accessed by the script in a system-defined method;
unless defined otherwise, this will be by reading the 'standard
input' file descriptor or file handle.
Request-Data = [ request-body ] [ extension-data ]
request-body = <CONTENT_LENGTH>OCTET
extension-data = *OCTET
A request-body is supplied with the request if the CONTENT_LENGTH is
not NULL. The server MUST make at least that many bytes available
for the script to read. The server MAY signal an end-of-file
condition after CONTENT_LENGTH bytes have been read or it MAY supply
extension data. Therefore, the script MUST NOT attempt to read more
than CONTENT_LENGTH bytes, even if more data is available. However,
it is not obliged to read any of the data.
I don't understand what extension data is, but the key line is:
the [cgi] script MUST NOT attempt to read more than CONTENT_LENGTH
bytes, even if more data is available.
If I alter my python script to read only CONTENT_LENGTH bytes rather than trying to read the whole stdin file (which doesn't return until it gets an EOF), then my python cgi script successfully retrieves the json data from my inets httpd server.
#!/usr/bin/env python3
import json
import sys
import os

content_len = int(os.environ["CONTENT_LENGTH"])
req_body = sys.stdin.read(content_len)
my_dict = json.loads(req_body)

sys.stdout.write("Content-Type: text/html")
sys.stdout.write("\r\n\r\n")

if my_dict:
    a = my_dict.get("a", "Not found")
    b = my_dict.get("b", "Not found")
    total = a + b
    print("<div>Content-Length={}</div".format(content_len))
    print("<div>Got json: {}</div>".format(my_dict))
    print("<div>a={}, b={}, total={}</div>".format(a, b, total))
else:
    print("<div>Couldn't read json data.</div>")

'''
form = cgi.FieldStorage()
if "a" not in form:
    print("<H1>Error:</H1>")
    print("<div>'a' not in form</div>")
else:
    print("<p>a:{}</p>".format(form["a"].value))

if "b" not in form:
    print("<H1>Error:</H1>")
    print("<div>'b' not in form</div>")
else:
    print("<p>b:{}</p>".format(form["b"].value))
'''
Server info:
4> httpd:info(Server).
[{mime_types,[{"htm","text/html"},{"html","text/html"}]},
{server_name,"httpd_test"},
{erl_script_nocache,true},
{script_alias,{"/cgi-bin/",
"/Users/7stud/erlang_programs/inets_proj/cgi-bin/"}},
{bind_address,{127,0,0,1}},
{modules,[mod_alias,mod_actions,mod_esi,mod_cgi,mod_get,
mod_log]},
{server_root,"/Users/7stud/erlang_programs/inets_proj"},
{erl_script_alias,{"/erl",[mymod]}},
{port,65451},
{transfer_log,<0.93.0>},
{error_log,<0.92.0>},
{document_root,"./htdocs"}]
5>
curl request (note that curl automatically calculates the content length and puts it in a Content-Length header):
~$ curl -v \
> -H 'Content-Type: application/json' \
> --data '{"a": 1, "b": 2}' \
> http://localhost:65451/cgi-bin/1.py
* Trying ::1...
* TCP_NODELAY set
* Connection failed
* connect to ::1 port 65451 failed: Connection refused
* Trying 127.0.0.1...
* TCP_NODELAY set
* Connected to localhost (127.0.0.1) port 65451 (#0)
> POST /cgi-bin/1.py HTTP/1.1
> Host: localhost:65451
> User-Agent: curl/7.58.0
> Accept: */*
> Content-Type: application/json
> Content-Length: 16
>
* upload completely sent off: 16 out of 16 bytes
< HTTP/1.1 200 OK
< Date: Fri, 09 Mar 2018 04:36:42 GMT
< Server: inets/6.4.5
< Transfer-Encoding: chunked
< Content-Type: text/html
<
<div>Content-Length=16</div
<div>Got json: {'a': 1, 'b': 2}</div>
<div>a=1, b=2, total=3</div>
* Connection #0 to host localhost left intact
~$
Here's the perl script that I got to work with inets httpd (1.pl):
#!/usr/bin/env perl
use strict;
use warnings;
use 5.020;
use autodie;
use Data::Dumper;
use JSON;
if (my $content_len = $ENV{CONTENT_LENGTH}) {
    read(STDIN, my $json, $content_len);
    my $href = decode_json($json);
    my $a = $href->{a};
    my $b = $href->{b};

    print 'Content-type: text/html';
    print "\r\n\r\n";
    print "<div>a=$a</div>";
    print "<div>b=$b</div>";

    #my $q = CGI->new;    # Doesn't work with inets httpd server
    #my $q = CGI->new(''); # Doesn't try to read from stdin, so it does work.
    # print $q->header,
    #       $q->start_html("Test Page"),
    #       $q->div("json=$json"),
    #       $q->div("a=$a"),
    #       $q->div("b=$b"),
    #       $q->div("total=$total"),
    #       $q->end_html;
}
else {
    my $error = "Could not read json: No Content-Length header in request.";

    print 'Content-type: text/html';
    print "\r\n\r\n";
    print "<div>$error</div>";

    # my $q = CGI->new;
    # print $q->header,
    #       $q->start_html("Test Page"),
    #       $q->h1($error),
    #       $q->end_html;
}
I couldn't get perl's CGI module to work in conjunction with reading from STDIN. Edit: A kind soul at perlmonks helped me solve that one:
my $q = CGI->new('');
The blank string tells CGI->new not to read from stdin and parse the data.
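Putting the two pieces together, a minimal sketch of what the perlmonks suggestion looks like in practice (assuming the same {"a": 1, "b": 2} request body as above):

#!/usr/bin/env perl
use strict;
use warnings;
use CGI;
use JSON;

# Read the body ourselves -- exactly CONTENT_LENGTH bytes, per RFC 3875.
my $len  = $ENV{CONTENT_LENGTH} || 0;
my $json = '';
read(STDIN, $json, $len) if $len;

# The empty string stops CGI->new from trying to read STDIN itself.
my $q    = CGI->new('');
my $href = $len ? decode_json($json) : {};

print $q->header,
      $q->start_html("Test Page"),
      $q->div("json=$json"),
      $q->div("a=$href->{a}, b=$href->{b}"),
      $q->end_html;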

Error while executing the perl script and not able to set the Environment Variables

I executed the perl script below:
#!/usr/bin/perl
use strict;
use DBD::Oracle;
use DBI;

my $driver   = "Oracle";
my $database = "host=xxxxxx;port=6210;sid=xxxx";
my $dsn      = "DBI:$driver:$database";
my $userid   = "xxxxx";
my $password = "xxxxx";

# Database connection
my $dbh = DBI->connect($dsn, $userid, $password, {RaiseError => 1})
    or die "$DBI::errstr";

my $sth = $dbh->prepare("update collabuser set user_email='aravikum.wipro.com' where user_login='aravikum'")
    or die "$DBI::errstr";

$sth->execute() or die "couldn't execute statement\n$!";
$sth->rows;

# End of program
$sth->finish();
$dbh->disconnect();
I got the error below:
Can't load '/usr/local/lib64/perl5/auto/DBD/Oracle/Oracle.so' for module DBD::Oracle: libocci.so.11.1: cannot open shared object file: No such file or directory at /usr/lib64/perl5/DynaLoader.pm line 190.
at perlupdt.pl line 11.
Compilation failed in require at perlupdt.pl line 11.
BEGIN failed--compilation aborted at perlupdt.pl line 11.
On googling, I found an answer saying that running the export commands below fixes the issue. I executed them and it worked fine:
export ORACLE_HOME=/usr/lib/oracle/11.2/client64
export LD_LIBRARY_PATH=/usr/lib/oracle/11.2/client64/lib:$LD_LIBRARY_PATH
export PATH=/usr/lib/oracle/11.2/client64/bin:$PATH
But I can't run those export commands every time I log in via PuTTY, so I decided to put the exports inside the script by adding these lines at the start of the script:
$ENV{"ORACLE_HOME"} = '/usr/lib/oracle/11.2/client64';
$ENV{"LD_LIBRARY_PATH"} = '/usr/lib/oracle/11.2/client64/lib:$LD_LIBRARY_PATH';
$ENV{"PATH"} = '/usr/lib/oracle/11.2/client64/bin:$PATH';
But I am still getting the error stated above. Please suggest a way to set these variables from inside my perl script.
Setting the above export commands in /etc/profile and /etc/bashrc, and adding them as a .sh file under /etc/profile.d, fixed the issue.
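For completeness, the in-script assignments above have two separate problems: the single quotes keep Perl from interpolating $LD_LIBRARY_PATH and $PATH, and, more importantly, on Linux the dynamic linker normally reads LD_LIBRARY_PATH only when the process starts, so setting it from inside an already-running perl is too late for DynaLoader to find libocci. If editing /etc/profile is not an option, one possible workaround is to set the variables and re-exec the script once; a minimal sketch (the ORACLE_ENV_SET guard variable is just an example name):

#!/usr/bin/perl
use strict;
use warnings;

BEGIN {
    # Re-exec once so the dynamic linker sees LD_LIBRARY_PATH from process start.
    unless ($ENV{ORACLE_ENV_SET}) {
        $ENV{ORACLE_ENV_SET}  = 1;
        $ENV{ORACLE_HOME}     = '/usr/lib/oracle/11.2/client64';
        $ENV{LD_LIBRARY_PATH} = '/usr/lib/oracle/11.2/client64/lib'
                              . ($ENV{LD_LIBRARY_PATH} ? ":$ENV{LD_LIBRARY_PATH}" : '');
        $ENV{PATH}            = "/usr/lib/oracle/11.2/client64/bin:$ENV{PATH}";
        exec($^X, $0, @ARGV) or die "re-exec failed: $!";
    }
}

use DBI;
use DBD::Oracle;
# ... rest of the original script unchanged ...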

Gulp task to SSH and then mysqldump

So I've got this scenario where I have a separate web server and MySQL server, and I can only connect to the MySQL server from the web server.
So basically every time I have to go like:
step 1: 'ssh -i ~/somecert.pem ubuntu@1.2.3.4'
step 2: 'mysqldump -u root -p'password' -h 6.7.8.9 database_name > output.sql'
I'm new to gulp and my aim was to create a task that could automate all this, so running one gulp task would automatically deliver me the SQL file.
This would make the developer life a lot easier since it would just take a command to download the latest db dump.
This is where I got so far (gulpfile.js):
////////////////////////////////////////////////////////////////////
// Run: 'gulp download-db' to get latest SQL dump from production //
// File will be put under the 'dumps' folder //
////////////////////////////////////////////////////////////////////
// Load stuff
'use strict'
var gulp = require('gulp')
var GulpSSH = require('gulp-ssh')
var fs = require('fs');
// Function to get home path
function getUserHome() {
  return process.env.HOME || process.env.USERPROFILE;
}
var homepath = getUserHome();

///////////////////////////////////////
// SETTINGS (change if needed)       //
///////////////////////////////////////
var config = {
  // SSH connection
  host: '1.2.3.4',
  port: 22,
  username: 'ubuntu',
  //password: '1337p4ssw0rd', // Uncomment if needed
  privateKey: fs.readFileSync(homepath + '/certs/somecert.pem'), // Uncomment if needed
  // MySQL connection
  db_host: 'localhost',
  db_name: 'clients_db',
  db_username: 'root',
  db_password: 'dbp4ssw0rd',
}

////////////////////////////////////////////////
// Core script, don't need to touch from here //
////////////////////////////////////////////////

// Set up SSH connector
var gulpSSH = new GulpSSH({
  ignoreErrors: true,
  sshConfig: config
})

// Run the mysqldump
gulp.task('download-db', function(){
  return gulpSSH
    // runs the mysql dump
    .exec(['mysqldump -u '+config.db_username+' -p\''+config.db_password+'\' -h '+config.db_host+' '+config.db_name+''], {filePath: 'dump.sql'})
    // pipes output into local folder
    .pipe(gulp.dest('dumps'))
})

// Run search/replace "optional"
SSHing into the web server works fine, but when I try to get the mysqldump I'm getting this message:
events.js:85
throw er; // Unhandled 'error' event
^
Error: Warning:
If I try the same mysqldump command manually from the server SSH, I get:
Warning: mysqldump: unknown variable 'loose-local-infile=1'
Followed by the correct mysql dump output.
So I think this warning message is messing up my script. I would like to ignore warnings in cases like this, but I don't know how to do it, or if it's even possible.
Also I read that using the password directly in the command line is not really good practice.
Ideally, I would like to have all the config vars loaded from another file, but this is my first gulp task and not really familiar with how I would do that.
Can someone with experience in Gulp orient me towards a good way of getting this thing done? Or do you think I shouldn't be using Gulp for this at all?
Thanks!
As I suspected, that warning message was preventing the gulp task from finishing. I got rid of it by commenting out the loose-local-infile=1 line in /etc/mysql/my.cnf.

Using custom functions in Volt force apache process to die

I worked with Phalcon and Volt under WAMP. Recently we moved to another dev environment (CentOS), where I have PHP 5.5.17 with the latest Phalcon build (I also compiled and tested two versions lower). Now, when Volt tries to compile a template that uses a custom function, the PHP process crashes. The same goes for custom filters.
Error log of Apache
[Tue Sep 30 06:06:24.809476 2014] [proxy_fcgi:error] [pid 31199:tid 140596014397184] (104)Connection reset by peer: [client 10.0.2.2:53931] AH01075: Error dispatching request to :3080:
[Tue Sep 30 06:06:27.216226 2014] [proxy_fcgi:error] [pid 31200:tid 140596161255168] [client 10.0.2.2:53941] AH01067: Failed to read FastCGI header
[Tue Sep 30 06:06:27.216249 2014] [proxy_fcgi:error] [pid 31200:tid 140596161255168] (104)Connection reset by peer: [client 10.0.2.2:53941] AH01075: Error dispatching request to :3080:
PHP error log
[30-Sep-2014 06:06:27] WARNING: [pool www] child 32519 exited on signal 11 (SIGSEGV - core dumped) after 204.725812 seconds from start
[30-Sep-2014 06:06:27] NOTICE: [pool www] child 32529 started
PHP code looks like
$di->set('view', function () use ($config) {
    $view = new View();
    $view->setViewsDir($config->application->viewsDir);
    $view->registerEngines(array(
        '.volt' => function ($view, $di) use ($config) {
            $volt = new VoltEngine($view, $di);
            $volt->setOptions(array(
                'compiledPath' => $config->application->cacheDir,
                'compiledSeparator' => '_',
                'compileAlways' => $config->application->debug
            ));
            $compiler = $volt->getCompiler();
            $compiler->addFunction(
                'last',
                function ($resolvedArgs) use ($compiler) {
                    return 'array_pop(' . $resolvedArgs . ')';
                }
            );
            return $volt;
        }
    ));
    return $view;
}, true);
And in Volt for example
{{ last(['1', '2', '3']) }}
And I'm really stuck on this problem, because I have quite a lot of custom functions and I really need them. I tried to debug it, but as soon as Volt tries to parse a line with a custom function, the process dies.
A Phalcon bug was submitted. The solution: completely disable xdebug for the current build. More here: https://github.com/xdebug/xdebug/pull/120

Redmine.pm does not work with Authen::Simple::LDAP

I'm using the Bitnami Redmine Stack on a Windows server. I have configured Redmine to use LDAP authentication and it works. Now I'd like to have SVN authentication via Redmine and LDAP. I can log in with a Redmine account but not with LDAP. In the Apache error.log there was an error like this:
[Authen::Simple::LDAP] Failed to bind with dn 'REDMINE'. Reason: 'Bad file descriptor'
'REDMINE' is the user that is used for the LDAP bind.
It seems to be a problem in Redmine.pm, which is written in Perl, but I don't know much about Perl.
I found the part of the Perl code which, I think, causes the error:
my $sthldap = $dbh->prepare(
    "SELECT host,port,tls,account,account_password,base_dn,attr_login from auth_sources WHERE id = ?;"
);
$sthldap->execute($auth_source_id);
while (my @rowldap = $sthldap->fetchrow_array) {
    my $bind_as = $rowldap[3] ? $rowldap[3] : "";
    my $bind_pw = $rowldap[4] ? $rowldap[4] : "";
    if ($bind_as =~ m/\$login/) {
        # replace $login with $redmine_user and use $redmine_pass
        $bind_as =~ s/\$login/$redmine_user/g;
        $bind_pw = $redmine_pass
    }
    my $ldap = Authen::Simple::LDAP->new(
        host   => ($rowldap[2] eq "1" || $rowldap[2] eq "t") ? "ldaps://$rowldap[0]:$rowldap[1]" : "ldap://$rowldap[0]",
        port   => $rowldap[1],
        basedn => $rowldap[5],
        binddn => $bind_as,
        bindpw => $bind_pw,
        filter => "(".$rowldap[6]."=%s)"
    );
    my $method = $r->method;
    $ret = 1 if ($ldap->authenticate($redmine_user, $redmine_pass)
                 && (($access_mode eq "R" && $permissions =~ /:browse_repository/)
                     || $permissions =~ /:commit_access/));
}
$sthldap->finish();
undef $sthldap;
Does anyone have a fix for this problem? Or maybe a working alternative to the standard Redmine.pm?
The problem was that I hadn't installed IO::Socket::IP and the latest perl-ldap via the cpan console, and Strawberry Perl doesn't include them in the install. After I installed them with the following CMD commands, it worked fine!
> cpan
cpan> install IO::Socket::IP
cpan> install latest perl-ldap
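A quick sanity check afterwards is to confirm that the modules resolve for the perl on your PATH (perl-ldap's main module is Net::LDAP; the printed versions will vary per install):

> perl -MIO::Socket::IP -le "print $IO::Socket::IP::VERSION"
> perl -MNet::LDAP -le "print $Net::LDAP::VERSION"

Note that this checks whichever perl is first in PATH, which may differ from the perl the Bitnami Apache uses to run Redmine.pm.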