File upload (Laravel 9) and store to an https local dev server

I have two local development servers, https://public.local and https://cdn.local, both running on XAMPP. I am trying to save an uploaded file from public to cdn, but without success. This is the disk driver I tried:
'cdn' => [
    'driver' => 'https',
    'root' => '/content/images/profileimages',
    'url' => env('CDN_URL'),
    'visibility' => 'public',
    'throw' => false,
],
where:
CDN_URL=https://cdn.public
Are there any prerequisites that need to be met, such as server configuration or additional packages or libraries, for this to work?
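For reference, Laravel 9 ships no 'https' filesystem driver; the built-in options are local, ftp, sftp (via league/flysystem-sftp-v3), and s3, so a remote disk on the cdn host would normally be defined with one of those. A minimal sketch, assuming the cdn server exposes SFTP and using placeholder credentials:
// config/filesystems.php: hedged sketch, not a working 'https' driver.
// Requires: composer require league/flysystem-sftp-v3 "^3.0"
// Host, username, and password below are placeholders.
'cdn' => [
    'driver' => 'sftp',
    'host' => env('CDN_SFTP_HOST', 'cdn.local'),
    'username' => env('CDN_SFTP_USERNAME'),
    'password' => env('CDN_SFTP_PASSWORD'),
    'root' => '/content/images/profileimages',
    'url' => env('CDN_URL'),
    'visibility' => 'public',
    'throw' => false,
],
A controller could then save the uploaded file with something like $request->file('image')->store('', 'cdn');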

Related

How to use Wasabi (AmazonS3) in sabre/dav?

I am building a WebDAV server using sabre/dav, and I want the WebDAV server's file storage location to be in Wasabi, which is compatible with Amazon S3. I did some research and found something that looks like AWS.php, but I don't know how to use it. If anyone knows how to do this specifically, please respond.
What we tried:
- Downloaded s3dav (https://github.com/audionamix/s3dav) and installed it.
- Wrote Server.php as follows:
<?php

use Sabre\DAV;
use Aws\S3\S3Client;

// The autoloader
require 'vendor/autoload.php';

$raw_credentials = array(
    'credentials' => array(
        'key' => '<insert-access-key>',
        'secret' => '<insert-secret-key>',
    ),
    //'profile' => 'wasabi',
    'endpoint' => 'https://s3.wasabisys.com',
    'region' => 'us-east-1',
    'version' => 'latest',
    'use_path_style_endpoint' => true,
    'use_path_style' => true,
    'use_ssl' => true,
    'port' => 443,
    'hostname' => 's3.wasabisys.com',
    'bucket' => '<bucket-name>',
);

// Establish an S3 client.
$s3 = S3Client::factory($raw_credentials);

// Now we're creating a whole bunch of objects
//$rootDirectory = new DAV\FS\Directory('public');
$rootDirectory = new DAV\FS\S3Directory("/", '<bucket-name>', $s3);

// The server object is responsible for making sense out of the WebDAV protocol
$server = new DAV\Server($rootDirectory);

// If your server is not on your webroot, make sure the following line has the
// correct information
$server->setBaseUri('/server.php/');

// The lock manager is responsible for making sure users don't overwrite
// each other's changes.
$lockBackend = new DAV\Locks\Backend\File('data/locks');
$lockPlugin = new DAV\Locks\Plugin($lockBackend);
$server->addPlugin($lockPlugin);

// This ensures that we get a pretty index in the browser, but it is
// optional.
$server->addPlugin(new DAV\Browser\Plugin());

// All we need to do now is to fire up the server
$server->exec();
Result:
The file name list is displayed, but every file shows as 0 bytes.
Uploading works (the file size is correct on Wasabi), but other operations do not.
The error "4.4.0 Exception Cannot traverse an already closed generator" is displayed.
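For reference, the client options above mix several generations of S3 client configuration. With the AWS SDK for PHP v3, a client pointed at Wasabi normally needs only the endpoint, region, path-style flag, and credentials; a minimal sketch (keys and bucket are placeholders, and whether this resolves the generator exception depends on how s3dav iterates listings):
use Aws\S3\S3Client;

// Hedged sketch: plain AWS SDK for PHP v3 client for a Wasabi endpoint.
$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'us-east-1',
    'endpoint' => 'https://s3.wasabisys.com',
    'use_path_style_endpoint' => true,
    'credentials' => [
        'key' => '<insert-access-key>',
        'secret' => '<insert-secret-key>',
    ],
]);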

Unable to run migrations on GCP with CakePHP 3.8

I am trying to set up my CakePHP 3.8 project on a GCP "Compute Engine" VM.
I have set up my app.php to use the following DB configuration:
'className' => 'Cake\Database\Connection',
'driver' => 'Cake\Database\Driver\Mysql',
'persistent' => false,
'datasource' => 'Database/Mysql',
'host' => 'localhost',
'username' => 'user',
'password' => 'password',
'database' => 'dbname',
'prefix' => '',
'encoding' => 'utf8',
'timezone' => 'UTC',
'cacheMetadata' => true,
'log' => false,
'flags' => [
    PDO::MYSQL_ATTR_INIT_COMMAND => "SET @@SESSION.sql_mode='';",
    // uncomment below for use with Google Cloud SQL
    PDO::MYSQL_ATTR_SSL_KEY => CONFIG . 'ssl/client-key.pem',
    PDO::MYSQL_ATTR_SSL_CERT => CONFIG . 'ssl/client-cert.pem',
    PDO::MYSQL_ATTR_SSL_CA => CONFIG . 'ssl/server-ca.pem',
    PDO::MYSQL_ATTR_SSL_VERIFY_SERVER_CERT => false
],
My problem happens when I try to run migrations. The site works just fine with the above configuration; however, if I run
$> php bin/cake.php migrations migrate
I get a slew of errors saying that it cannot connect: access denied for user@host.
If I add
'ssl_key' => CONFIG . 'ssl/client-key.pem',
'ssl_cert' => CONFIG . 'ssl/client-cert.pem',
'ssl_ca' => CONFIG . 'ssl/server-ca.pem',
I get an error:
Caused by: [PDOException] PDO::__construct(): Peer certificate CN=`gcpname:gcpserver' did not match expected CN=`111.111.111.111' in /var/www/mydomain.com/vendor/robmorgan/phinx/src/Phinx/Db/Adapter/PdoAdapter.php on line 79
I guess this is because the Migrations plugin still doesn't pass the flags or custom mysql_attr_* options over to the Phinx connection configuration; see this issue:
https://github.com/cakephp/migrations/issues/374
I don't think there's much that can be done here, other than adding support for flags / attribute options, or using Phinx directly (i.e. without the Migrations plugin).
I've pushed a PR that would add support for driver-specific flags; you might want to give it a try and comment on the issue or the PR as to whether it works for you. It's for CakePHP 4.x (Migrations 3.x); I'll backport it to CakePHP 3.x (Migrations 2.x) if it is accepted:
https://github.com/cakephp/migrations/pull/478
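If you want to try the "use Phinx directly" route in the meantime, Phinx environments accept an existing PDO instance via the connection option, so the SSL attributes can be set on the PDO yourself. A hedged sketch of a standalone phinx.php (paths, credentials, and the exact config keys vary by Phinx version):
<?php
// phinx.php: hedged sketch that bypasses the Migrations plugin and hands Phinx
// a PDO connection that already carries the SSL attributes. Credentials,
// database name, and certificate paths are placeholders.
$pdo = new PDO(
    'mysql:host=localhost;dbname=dbname;charset=utf8',
    'user',
    'password',
    [
        PDO::MYSQL_ATTR_SSL_KEY => __DIR__ . '/config/ssl/client-key.pem',
        PDO::MYSQL_ATTR_SSL_CERT => __DIR__ . '/config/ssl/client-cert.pem',
        PDO::MYSQL_ATTR_SSL_CA => __DIR__ . '/config/ssl/server-ca.pem',
        PDO::MYSQL_ATTR_SSL_VERIFY_SERVER_CERT => false, // works around the CN mismatch
    ]
);

return [
    'paths' => ['migrations' => 'config/Migrations'],
    'environments' => [
        'default_migration_table' => 'phinxlog',
        'default_environment' => 'default', // 'default_database' on older Phinx releases
        'default' => [
            'name' => 'dbname',
            'connection' => $pdo, // Phinx uses this PDO instance as-is
        ],
    ],
];
You would then run vendor/bin/phinx migrate instead of the Migrations plugin command.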

Laravel SQL Server connection with ENCRYPT=yes trustServerCertificate=true

I have an Ubuntu Docker container running PHP 5.5.9 and Laravel 5.2, which can connect successfully to SQL Server and get results back.
The Docker image I am using is https://hub.docker.com/r/h2labs/laravel-mssql/
The problem is that the server requires encryption, and I can't find how to pass the following parameters to the Laravel connection string for MSSQL:
ENCRYPT=yes;trustServerCertificate=true
My SQL Server connection settings at present look like this:
DB_CONNECTION=sqlsrv
DB_HOST=sql.mydomain.com
DB_PORT=1433
DB_DATABASE=mydbname
DB_USERNAME=mysusername
DB_PASSWORD=mypass
My Laravel database config looks like this:
'sqlsrv' => [
    'driver' => 'sqlsrv',
    'host' => env('DB_HOST', 'localhost'),
    'database' => env('DB_DATABASE', 'forge'),
    'username' => env('DB_USERNAME', 'forge'),
    'password' => env('DB_PASSWORD', ''),
    'charset' => 'utf8',
    'prefix' => '',
],
The SQL Server error log entry is
Encryption is required to connect to this server but the client library does not support encryption; the connection has been closed. Please upgrade your client library. [CLIENT: 103.31.114.56]
Support for either option was not introduced until Laravel 5.4, specifically v5.4.11.
So you would first need to upgrade to laravel/framework:>=5.4.11,<5.5.
Then, to configure your application, you will need to modify your config/database.php file as follows:
// ...
'sqlsrv' => [
    'driver' => 'sqlsrv',
    'host' => env('DB_HOST', 'localhost'),
    'database' => env('DB_DATABASE', 'forge'),
    'username' => env('DB_USERNAME', 'forge'),
    'password' => env('DB_PASSWORD', ''),
    'charset' => 'utf8',
    'prefix' => '',
    'encrypt' => 'yes', // alternatively, defer to an env variable
    'trust_server_certificate' => 'true', // alternatively, defer to an env variable
],
// ...
DatabaseServiceProvider, via ConnectionFactory and SqlServerConnector, will use this to build the underlying PDO connection with those options set in the DSN.
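If you prefer to defer to env variables, as the comments in the snippet above suggest, a minimal sketch (the DB_ENCRYPT and DB_TRUST_SERVER_CERTIFICATE names are placeholders, not Laravel conventions):
// config/database.php (hedged sketch; variable names are placeholders)
'encrypt' => env('DB_ENCRYPT', 'yes'),
'trust_server_certificate' => env('DB_TRUST_SERVER_CERTIFICATE', 'true'),

# .env
DB_ENCRYPT=yes
DB_TRUST_SERVER_CERTIFICATE=true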

Amazon S3 Upload error SSL certificate issues

I'm trying to test Laravel Amazon S3 on my localhost but keep getting the same error:
S3Exception in WrappedHttpHandler.php line 192: Error executing
"ListObjects" on
"https://s3-us-west-2.amazonaws.com/app?prefix=appimages%2FIMG-1469840859-j.jpg%2F&max-keys=1&encoding-type=url";
AWS HTTP error: cURL error 60: SSL certificate problem: unable to get
local issuer certificate (see
http://curl.haxx.se/libcurl/c/libcurl-errors.html)
My code:
$s3 = \Storage::disk('s3');
$filePath = '/images/' . $filename;
$s3->put($filePath, file_get_contents($image), 'public');
You have to tweak the php.ini file. Download this file http://curl.haxx.se/ca/cacert.pem, set the path in php.ini like this, and then restart the server:
;;;;;;;;;;;;;;;;;;;;
; php.ini Options ;
;;;;;;;;;;;;;;;;;;;;
curl.cainfo = "C:\xampp\php\extras\ssl\cacert.pem"
The above path is common for XAMPP.
And that will fix your issue.
$s3 = new S3Client([
    'version' => 'latest',
    'scheme' => 'http',
    'region' => $this->config->item('s3_region'),
    'credentials' => [
        'key' => $this->config->item('s3_access_key'),
        'secret' => $this->config->item('s3_secret_key'),
    ],
]);
Add 'scheme' => 'http' for development.
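If you are constructing the client yourself like this, the AWS SDK for PHP v3 also lets you keep HTTPS and point this one client at the CA bundle from the php.ini answer above instead of dropping to plain HTTP; a hedged sketch (region and bundle path are assumptions):
use Aws\S3\S3Client;

// Hedged alternative: keep SSL verification but tell the SDK's HTTP handler
// where the CA bundle is.
$s3 = new S3Client([
    'version' => 'latest',
    'region' => 'us-west-2',
    'credentials' => [
        'key' => '<insert-access-key>',
        'secret' => '<insert-secret-key>',
    ],
    'http' => [
        'verify' => 'C:\xampp\php\extras\ssl\cacert.pem', // path from the php.ini example
    ],
]);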
I had the same problem.
The reason for the error is that you are working locally or on a server without a verified certificate.
You just need to add the following line to config/filesystems.php:
'scheme' => 'http' // to disable SSL verification on local development
Your filesystems.php should look like this:
's3' => [
    'driver' => 's3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
    'url' => env('AWS_URL'),
    'scheme' => 'http', // to disable SSL verification on local development
],
When you run it on a server that has a valid SSL certificate, you need to comment out the 'scheme' line.
Try it and you will see it works.
Enjoy your coding!

Yii is handling $_GET for clean URLs only

I am trying to allow access to $_GET for both clean URLs and normal GET requests in Yii (1.1.14). For example, if I have a controller Cities with an action getCities that only does var_dump($_GET):
1- http://example.com/cities/getCities/country_id/100
==> the output is array(1) { ["country_id"]=> string(3) "100" }
2- http://example.com/cities/getCities?country_id=100
==> the output is array(0) { }
My urlManager config is:
'urlManager' => array(
    'class' => 'ext.localeurls.LocaleUrlManager',
    'languageParam' => 'lang', // Name of the parameter that contains the desired language when constructing a URL
    'urlFormat' => 'path',
    'showScriptName' => false,
    'caseSensitive' => false,
    'rules' => array(
        'login' => '/home/login',
    ),
),
How can I allow Yii to recognize $_GET in both cases above?
EDIT
I am using nginx 1.6. GET params (?x=y) work fine on other Yii projects; only this project doesn't. I changed the web server to Apache and got it working on this project, even though this project has the same nginx configuration as the others.
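For what it's worth, the usual nginx-specific cause of an empty $_GET in this situation is a try_files or rewrite rule that drops the query string. A hedged sketch of a location block that preserves it (document root, PHP-FPM socket, and paths are assumptions about your setup):
location / {
    # forward the original query string to the front controller
    try_files $uri $uri/ /index.php?$args;
}

location ~ \.php$ {
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass unix:/var/run/php-fpm.sock; # assumed PHP-FPM socket
}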