How do I get a data dump from Magento Cloud? I have tried magento-cloud db:dump. It gave me a .sql file, but I don't know where to run that file.
I used the Magento Cloud CLI: I got the .sql file by running magento-cloud db:dump in my terminal.
P.S. I am using a Mac.
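For anyone else landing here: the dump is plain SQL, so the usual next step is to import it into a local MySQL/MariaDB server. A minimal sketch, assuming a local server and a database name magento_local that you create yourself (the dump filename is whatever magento-cloud db:dump wrote out):

    mysql -u root -p -e "CREATE DATABASE magento_local"
    mysql -u root -p magento_local < your-dump-file.sql

On a Mac, the mysql client comes with a local MySQL/MariaDB install (e.g. via Homebrew).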
I am trying to set an environment variable (on Mac) so I can make requests to Google BigQuery, following this guideline:
Source: https://cloud.google.com/bigquery/docs/reference/libraries
I'm doing it to avoid having to type export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/service-account-file.json" into the VS Code terminal every time I want to run something against BigQuery.
What I do (which doesn't work):
(I create both a .bashrc and a .profile file, because I am not sure which one I need.)
Step 1: Create both files and put them in:
Step 2: Insert into both files the following:
export GOOGLE_APPLICATION_CREDENTIALS="/Users/GunardiLin/Credentials/service-account-file.json"
Step 3: Put the Google credentials at: /Users/GunardiLin/Credentials/service-account-file.json
(The credential file is 100% correct: I tested it by manually running export GOOGLE_APPLICATION_CREDENTIALS=/Users/GunardiLin/Credentials/service-account-file.json, and afterwards my program that queries BigQuery works fine. Doing this manually every time I start VS Code is not ideal for me.)
After Step 3, I get the following error:
Can somebody help me with this problem?
I am using macOS, VS Code, and Conda. Thank you in advance.
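For reference, the export line itself is fine; the open question is which startup file the terminal actually reads. One assumption worth checking: on recent macOS the default shell is zsh, so the VS Code terminal may read ~/.zshrc or ~/.zprofile rather than .bashrc/.profile:

    # put this in the startup file your shell actually reads
    # (~/.zshrc or ~/.zprofile for zsh, ~/.bashrc or ~/.profile for bash)
    export GOOGLE_APPLICATION_CREDENTIALS="/Users/GunardiLin/Credentials/service-account-file.json"

After editing, open a new terminal (or run source ~/.zshrc) and check the value with echo $GOOGLE_APPLICATION_CREDENTIALS.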
I have installed Confluence and Postgres on a Synology NAS using Docker. Both run successfully. Now I have to copy the data from a .sql file into the database that I created in Postgres.
How can I do this? I have tried looking up different things, but nothing has helped.
Regards
You could use psql and let it read the commands from the file and execute them against your database. The --file option will do the trick.
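A minimal sketch, assuming the database is named confluence and is reachable on localhost with the default postgres user (all three are assumptions; substitute your own values):

    # execute the statements in dump.sql against the confluence database
    psql --host=localhost --username=postgres --dbname=confluence --file=dump.sql

Since Postgres runs in a Docker container on the NAS, you can also pipe the file in from the host without copying it into the container: docker exec -i <container-name> psql -U postgres -d confluence < dump.sql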
I have been tasked with automating the process of downloading files from Amazon S3. I am quite new to this; after searching the web I found this command-line utility for Windows, s3.exe.
I run the get command with the syntax below:
s3 get xxx-xxx/yyyy_yyy/ D:\ /key:****** /secret:*****
xxx-xxx (bucket name)
yyyy_yyy (key prefix)
D:\ (local path)
When I execute this, the only message I see is s3.exe version 2.0 - check for updates at http://s3.codeplex.com, and then it drops me back to the folder where I ran the command. Pardon me if this is too basic a question for this forum, but if anyone has used this utility, some insight would help me get this working.
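As an aside rather than a fix for s3.exe itself (CodePlex shut down years ago, so that tool is unmaintained): the same download can be done with the official AWS CLI. A minimal sketch, assuming the CLI is installed and the same key/secret have been set up via aws configure; the D:\downloads target folder is an assumption:

    aws s3 cp s3://xxx-xxx/yyyy_yyy/ D:\downloads --recursive

The --recursive flag copies everything under the key prefix, which is usually what a scheduled download job wants.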
I want to encode my PHP files so I can give the code to testers to test. For that, I have encoded my PHP files using Zend Guard. While generating the license I get the warning below:
When I try to run the encoded file, I get this error.
How do I solve this error and run my encoded PHP file? I am using XAMPP; how do I configure XAMPP with Zend Guard? I am new to Zend Guard; I have been trying to solve this since yesterday, but I have not found a solution. Can anyone assist me?
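For what it's worth, files encoded with Zend Guard only run on a PHP that has the Zend Guard Loader extension enabled, and the loader only exists for certain PHP 5.x versions (there is no PHP 7 build), so XAMPP's PHP version has to match one the loader supports. A sketch of the php.ini entries the loader typically needs; every path below is an assumption for a Windows XAMPP install, so adjust to wherever you unpacked the loader and saved your license file:

    ; load Zend Guard Loader (DLL/.so name and path vary by platform and PHP version)
    zend_extension = "C:\xampp\php\ext\ZendLoader.dll"
    zend_loader.enable = 1
    ; license file generated in Zend Guard
    zend_loader.license_path = "C:\xampp\php\zend\application.zl"

Restart Apache from the XAMPP control panel afterwards and confirm the loader shows up in phpinfo().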
I'm trying to run a PySpark job I created that downloads and uploads data from S3 using the boto3 library. The job runs fine in PyCharm, but when I try to run it in NiFi using this template: https://github.com/Teradata/kylo/blob/master/samples/templates/nifi-1.0/template-starter-pyspark.xml
the ExecutePySpark processor errors with "No module named boto3".
I made sure boto3 is installed in my conda environment, which is active.
Any ideas? I'm sure I'm missing something obvious.
Here is a picture of the NiFi Spark processor.
Thanks,
tim
The Python environment that PySpark runs in is configured via the PYSPARK_PYTHON environment variable.
Go to the Spark installation directory.
Go to the conf directory.
Edit spark-env.sh.
Add this line: export PYSPARK_PYTHON=PATH_TO_YOUR_CONDA_ENV (point it at the Python executable inside that environment, as sketched below).
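A minimal sketch of the resulting line in conf/spark-env.sh; the environment path below is an assumption, so substitute the interpreter path of your own conda environment (e.g. the output of which python while the environment is active):

    # make PySpark use the conda environment's interpreter,
    # so modules installed there (like boto3) are importable
    export PYSPARK_PYTHON=/opt/conda/envs/myenv/bin/python

If NiFi's ExecutePySpark processor launches jobs through this Spark installation, the job will then run with the interpreter that actually has boto3 installed.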