Bluemix APIConnect Publishing a loopback project from command line - authentication

I am following the APIC tutorial documented here:
Publishing a project from the command line
I have gone through the steps in the tutorial to get into the APIConnect dashboard in Bluemix and into the Sandbox catalog.
I get the baseURL under api management:
e.g. https://api.us.apiconnect.ibmcloud.com/FREDusibmcom-dev/sb
Then I use that to set the configuration:
apic config:set
catalog=apic-catalog://api.us.apiconnect.ibmcloud.com/orgs/FREDusibmcom-dev/catalogs/sb
app=apic-app://api.us.apiconnect.ibmcloud.com/orgs/FREDusibmcom-dev/apps/acme-bank-Fred
After this, as per the instructions, I try to log in using my Bluemix credentials:
apic login --server api.us.apiconnect.ibmcloud.com -u fred -p mypassword
This fails with:
ERROR Login to api.us.apiconnect.ibmcloud.com failed, please verify the servername and credential
Am I doing something wrong in regards to the servername or credentials that I am using? Thanks!

For your server argument in the login command, use us.apiconnect.ibmcloud.com instead. I think the api portion is throwing things off.
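For example, the login from the question would then look like this (with the same placeholder credentials):
apic login --server us.apiconnect.ibmcloud.com -u fred -p mypassword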
Once successful there, I also recommend that you run apic edit and proceed to Log in with Bluemix there, as that will ensure that you're able to publish applications to Bluemix from the CLI or API Designer.

I assume you used your actual username/password, and not "fred/mypassword".
If so, then the problem may be with the Bluemix URL. There's now a simpler way to get the app identifier and catalog identifier (and to make sure you have the right Bluemix base URL). The catalog and app tiles now have a link icon that you can click to easily copy the catalog/app identifiers:
Getting the catalog identifier
The Bluemix base URL will be the part immediately following apic-catalog:// in the catalog identifier.
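For example, if the copied catalog identifier were apic-catalog://us.apiconnect.ibmcloud.com/orgs/FREDusibmcom-dev/catalogs/sb (a hypothetical value based on the question), the Bluemix base URL would be us.apiconnect.ibmcloud.com.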
We're in the process of updating the docs with this.

Related

Pulumi automation backend

I am a newbie with Pulumi and I am having an issue. When I run pulumi login against a GCP backend, I get this error:
stderr: error: getting secrets manager: passphrase must be set with PULUMI_CONFIG_PASSPHRASE or PULUMI_CONFIG_PASSPHRASE_FILE environment variables
When I do pulumi logout, the deployment (via the Pulumi Automation API) works. Does anyone have an idea how to fix this?
I have tried setting PULUMI_CONFIG_PASSPHRASE.
When using the self-managed backends for Pulumi, you need to provide a passphrase to encrypt secret values.
This can be done by setting a global environment variable, which will depend on the operating system you're using. In Unix-like environments (e.g. macOS or Linux) you can do:
export PULUMI_CONFIG_PASSPHRASE=<a password you can remember>
On Windows in PowerShell this can be done using:
$env:PULUMI_CONFIG_PASSPHRASE="<a password you can remember>"
If you don't wish to use a passphrase, you can leverage the Pulumi service as your state store, or configure a cloud secrets provider.
This is done when initializing your stack; more information on that can be found here.
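As a sketch, assuming you want to use a Google Cloud KMS key as the secrets provider (every resource name below is a placeholder), the stack could be initialized like this:
pulumi stack init dev --secrets-provider="gcpkms://projects/<project>/locations/<location>/keyRings/<keyring>/cryptoKeys/<key>"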

How to connect snowsql from PC (Windows 10)

I have signed up for the 30-day trial version of Snowflake. As part of learning, I am trying to run SnowSQL (the client) from my Windows desktop. I installed the client from the Snowflake client repository (the account name, username, and password are all correct).
Got the following error:
C:\Users\ugain>snowsql -a vg49826 -u ugainedi
Password:
250001 (08001): Failed to connect to DB. Verify the account name is correct: vg49826.snowflakecomputing.com:443. HTTP 403: Forbidden
If the error message is unclear, enable logging using -o log_level=DEBUG and see the log to find out the cause. Contact support for further help.
Goodbye!
Appreciate the help. Thank you!
Go to your account using the Snowflake web UI and look at the URL. Since there is no account vg49826.snowflakecomputing.com, that means you're running Snowflake somewhere other than on AWS_US_WEST_2. That's the only region that does not include the region in the account name for connection purposes.
Copy the portion of the URL after https:// and up to, but not including, snowflakecomputing.com. It will be something like vg49826.us-east-1, vg49826.east-us-2.azure, or similar. That portion is your Snowflake account name for the purposes of connecting with SnowSQL.
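For example, if the web UI URL were https://vg49826.us-east-1.snowflakecomputing.com (the region here is hypothetical), the connection command would become:
snowsql -a vg49826.us-east-1 -u ugainedi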

Keycloak federation with LDAP fails to make the connection : Error! Error when trying to connect to LDAP. See server.log for details

I am trying to create a federated authentication using the Keycloak and following the steps mentioned here: Setup User Federation with Keycloak
I am using port 10389 instead of the port 389 mentioned in the document. Everything seems to be working fine until the step where I make the connection from Keycloak to LDAP.
When I provide the Connection URL as ldap://localhost:10389 and click on Test Connection, I get the error:
Error! Error when trying to connect to LDAP. See server.log for details
I am not sure what's wrong, because when I check in Apache Directory Studio everything works fine and I am able to list all the users. I am not sure why I am unable to make the connection from Keycloak to LDAP.
I tried the following things but nothing worked for me:
ldap://localhost:10389
localhost:10389
ldap://127.0.0.1:10389
ldap://localhost:389
Stopped the docker in the dashboard and started again.
After trying a lot I found the solution. Posting the answer as it can be useful to someone else in the future.
I was using localhost and 127.0.0.1, which did not work. Finally, I checked the IP address of my system using the terminal (ifconfig on macOS, not ipconfig) and tried that, and it worked:
ldap://192.168.1.12:10389
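This is most likely because Keycloak was running inside a Docker container, where localhost refers to the container itself rather than the host machine the LDAP server listens on. Alternatively, on Docker Desktop for Mac or Windows the special hostname host.docker.internal usually resolves to the host, so something like this may also work:
ldap://host.docker.internal:10389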

ERROR: (gcloud.compute.ssh) Could not fetch resource: - Insufficient Permission

I am having trouble working through the Compute Engine Quickstart: Build a to-do app with a MongoDB tutorial. (edit: I am running the tutorial from within the compute engine console; i.e. https://console.cloud.google.com/compute/instances?project=&tutorial=compute_quickstart)
I SSH into the backend instance. I enter the "gcloud compute" command as copied from the tutorial. I am prompted to enter a passphrase. The following is returned:
WARNING: The public SSH key file for gcloud does not exist.
WARNING: The private SSH key file for gcloud does not exist.
WARNING: You do not have an SSH key for gcloud.
WARNING: SSH keygen will be executed to generate a key.
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in
...
<< Identifying detail omitted >>
...
ERROR: (gcloud.compute.ssh) Could not fetch resource:
- Insufficient Permission
I had run through this stage of the tutorial on a previous occasion with no problems.
I am working from a Windows 10 PC with the google-cloud-sdk installed. I am using google chrome. I have tried in both regular and incognito modes.
Any help or advice gratefully received!
DaveDub
It looks like the SSH attempt is recognising the instance in your project, but the user doesn't have the required permissions to access the machine.
Have you tried running:
gcloud auth login
and completing the web-based authorization to ensure you are attempting to access the machine as the correct (authenticated) user? This process ensures the Cloud SDK you are running inherits the permissions of the user specified in the web-based authorisation. See here for more information on this.
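To check which accounts the Cloud SDK knows about, and which one is currently active, you can run:
gcloud auth list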
It's also worth adding the link to the tutorial you are following to your question.
Besides the accepted answer, be sure you are in the correct gcloud project:
gcloud projects list
Then:
gcloud config set project <your-project>
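You can also confirm the active account and project together with:
gcloud config list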
I just ran into this for yet another reason. Google has always handled multi-user auth conflicts poorly in its business products. Whatever account you sign into a clean Chrome session with first gets a special, invisible role. I've noticed with G Suite that I get forced into that first user when I try to access the admin panel, and the only way to escape is to make sure that whatever Google user I use for the G Suite admin is first, or to open an incognito window. I've seen this bug for years and can't believe it still exists.
Anyway, I ran into a similar issue. Somehow I was signed in as the wrong Google user, so the link I got when copy/pasting out of 'connect with gcloud command' implied the wrong Google user. I only noticed later, when I gave up and used the terminal, that I was not my normal user. So, you might look into that.

Use GAE remote api with local (dev) installation

Has anyone found a way to use the GAE remote API, but connecting to localhost instead of App Engine?
For dev purposes, of course.
I was able to get this working by adding the following to the app.yaml file:
builtins:
- remote_api: on
and then from the command line you can access the db, users, urlfetch, or memcache modules:
remote_api_shell.py -s localhost:8080
This will prompt you for an email and password, but these are not important right now. remote_api_shell.py is on my path, from the Google App Engine directory.
Have you tried the development console? To access it, go to this URL: http://localhost:8080/_ah/admin.
If you really want to use the remote API, have a look at this article. I believe you can use the dev server by passing the localhost URL to the interactive console script.
For Java, see this document, which explains both local and remote access:
https://developers.google.com/appengine/docs/java/tools/remoteapi#Configuring_Remote_API_on_the_Client
If there are some like me who prefer to execute from a Python script rather than a shell:
import os

from google.appengine.ext.remote_api import remote_api_stub

# Point the remote API stub at the local dev server's API port
remote_api_stub.ConfigureRemoteApiForOAuth('localhost:8081', '/_ah/remote_api', secure=False)
os.environ['SERVER_SOFTWARE'] = 'Development'
os.environ['HTTP_HOST'] = 'localhost:8080'
# ... do stuff ...
I run the dev server with the option --api_port 8081; otherwise, just look at the port used in the dev server logs ("Starting API server at ..."). The environ tweaks are there to be able to use the cloudstorage API against the dev server too.
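For reference, a sketch of starting the dev server that way (the file name and port numbers are assumptions consistent with the snippet above):
dev_appserver.py --port 8080 --api_port 8081 app.yaml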