WLST: identify Multi Datasources - passwords

I have a script which needs to be updated. Currently it prompts for a password for every datasource, but now we have some multi datasources along with the generic datasources. My requirement is that whenever a datasource is a multi datasource, the script should skip it and not prompt for a password. The current script is below.
def updateJDBCPasswords():
    PARAMS_TEMPLATE = '/JDBCSystemResources/%s/JDBCResource/%s/JDBCDriverParams/%s'
    domainConfig()
    # Get the JDBC DataSources
    cd("JDBCSystemResources")
    dataSources = cmo.getJDBCSystemResources()
    edit()
    # For each DataSource, prompt for and set the password
    for dataSource in dataSources:
        dsName = dataSource.getName()
        password = raw_input("Enter database password for " + dsName + " : ")
        cd(PARAMS_TEMPLATE % (dsName, dsName, dsName))
        cmo.setPassword(password)
So far I have looked through several threads/blogs, but none of them is relevant to my problem. There is help for creating multi datasources, but I couldn't find any hint on how to identify one in order to satisfy my scenario.

Not exactly intuitive, but the root of the serverConfig tree has a list of deployments, which includes things like JDBC data sources, and this is the only place I could find an MBean that allows you to identify multi data sources. So the additional code at the start builds a list of names to ignore in your loop.
serverConfig()
deployments = cmo.getDeployments()
multiDataSources = []
for deployment in deployments:
    if deployment.getType() == "JDBCMultiPool":
        multiDataSources.append(deployment.getName())

domainConfig()
# Get the JDBC DataSources
cd("JDBCSystemResources")
dataSources = cmo.getJDBCSystemResources()
# For each DataSource, update the password unless it is a multi datasource
for dataSource in dataSources:
    dsName = dataSource.getName()
    if dsName in multiDataSources:
        print("Skipping multidatasource %s" % dsName)
        continue
    print("Processing data source %s" % dsName)

Something like the below can help:
# _dsDir, _userval and _dsPassword are placeholders defined elsewhere in the
# answerer's script (the datasource MBean path, the user name and the password)
cd(_dsDir + '/JDBCDriverParams/NO_NAME_0/Properties/NO_NAME_0/Property/user')
_userprop = cmo.getValue()
# A multi datasource has no user property, hence skip updating it
if _userprop is None:
    continue
else:
    # continue updating the user and password
    cmo.setValue(_userval)
    # Next, set the password
    cd(_dsDir + '/JDBCDriverParams/NO_NAME_0')
    set('PasswordEncrypted', _dsPassword)

Related

ANSYS Mechanical Workbench Scripting - Accessing Parameters

Ansys gurus,
My project is a static structural analysis using ANSYS Workbench Mechanical. I have created the parametrized geometry (via DesignModeler) and the material properties in Workbench, and used ACT scripting to configure the model. However, I can't find much information on how to access the parameters via ACT scripting.
I have confirmed that the geometric parameters are successfully created in the workbench, e.g.

ID    Parameter Name    Value    Unit
P1    diameter          50       um
The documentation LINK suggests that I can obtain a parameter using Analysis.GetParameter(); however, the following code didn't work for me and resulted in the error below.
Code:
STATIC_STRUCTURAL = ExtAPI.DataModel.AnalysisByName("Static Structural")
HEIGHT = STATIC_STRUCTURAL.GetParameter('height')
Error:
Property not found.
Do you have any suggestions on the cause of this error? Is it because the parameters were not imported from the Workbench "project schematic" into "Model", or was the code I used to retrieve the parameters incorrect? In either case, could you advise the correct method to access the parameters? Thank you!
hawkoli1987
If you want to access a parameter from the "project schematic" page, you can create a list. If you then want to do something with it inside Mechanical, you have to send the commands to your model:
# Access the geometric parameters
allParameters = Parameters.GetAllParameters()
for parameter in allParameters:
    print parameter.DisplayText
    if parameter.DisplayText == 'height':
        heigthParameter = parameter

# Loop over all systems in the project
for system in GetAllSystems():
    # Get the Model container
    model = system.GetContainer(ComponentName="Model")
    # edit the model component (Interactive=True opens Mechanical)
    system.Refresh()
    model.Edit(Interactive=True)
    # code to be sent to ANSYS Mechanical
    cmd = '''
here goes your ACT script as string. You have to make sure that there are no leading spaces or tabs.
'''
    # send the code and exit Mechanical
    model.SendCommand(Language='Python', Command=cmd)
    model.Exit()

print "Finished script execution."

How to validate an API response against multiple database instances from a feature file in Karate API automation?

I have developed a script which executes against one DB instance, e.g. db1. The code to connect to the DB is written in the Background section. Now what I want to do is execute the same test script against a different DB instance, e.g. db2.
Feature: Execution against multiple DB instances
##############################################
Background:
* def db_properties = {db_username,db_password,db_connection_string,driver}
* def createConnection = path to read .java file
* def readFromDB = new createConnection(db_properties)
##############################################
In the "* def db_properties" line, I have hard-coded the actual values of the username, password, connection string and driver. What exactly I want to do is validate my API response against another DB instance, e.g. when the build is deployed in another environment, the DB properties I have mentioned are different. How can I do it?
This has nothing to do with Karate. Maybe the solution is to have two sets of DB connection values in your karate-config.js. Please figure out a solution that is appropriate for your situation.
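For example, a minimal karate-config.js sketch along those lines (the property names simply mirror the question's db_properties, and the environment names and values are placeholders):

function fn() {
  var env = karate.env || 'db1'; // select with -Dkarate.env=db2
  var config = {};
  if (env === 'db1') {
    config.db_properties = { db_username: 'user1', db_password: 'pass1', db_connection_string: 'jdbc:...//db1', driver: 'org.postgresql.Driver' };
  } else {
    config.db_properties = { db_username: 'user2', db_password: 'pass2', db_connection_string: 'jdbc:...//db2', driver: 'org.postgresql.Driver' };
  }
  return config;
}

The Background can then refer to db_properties directly, and the same feature runs against db1 or db2 depending on karate.env.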

WLST commands for WebLogic security realm role creation and adding conditions

I have started my WebLogic Admin console. I went to Security Realms -> myrealm -> Roles and Policies -> Global Roles -> Roles. There I clicked the "New" button, created a new role, then modified it, giving it an LDAP user as the role condition.
I was wondering if we can automate this job with a WLST script. Could you please help us identify the WLST commands to create a role and add conditions?
I have done some study of cmo.getSecurityConfiguration().getDefaultRealm().lookupRoleMapper("XACMLRoleMapper") from the Oracle pages, but I am not sure about the implementation.
Here is a sample script used to create a global role and a policy on a JMS resource:
connect('...', '...', 't3://localhost:7001')
realm = cmo.getSecurityConfiguration().getDefaultRealm()
rm = realm.lookupRoleMapper("XACMLRoleMapper")
rm.createRole(None, "role1", None, "")
rm.createRole(None, "role2", None, "")
authorizer = realm.lookupAuthorizer("XACMLAuthorizer")
authorizer.createPolicy('type=<jms>, application=SystemModule-0, destinationType=queue, resource=Queue-0', '{Rol(role1)}')
authorizer.removePolicy('type=<jms>, application=SystemModule-0, destinationType=queue, resource=Queue-0', '{Rol(role1)}')
authorizer.getPolicyExpression('type=<jms>, application=SystemModule-0, destinationType=queue, resource=Queue-0')
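The question also asks about attaching a condition (an LDAP user) to the role. In the sample above, the last argument to createRole is an empty role expression; a hedged variant (the user name is a placeholder) passes a WebLogic role expression instead, using predicates such as Usr(...) and Grp(...):

# Sketch: create a role whose membership condition is a single user;
# 'myLdapUser' is a placeholder for your LDAP user name.
rm.createRole(None, "role3", None, "Usr(myLdapUser)")

The following snippet then lists the existing roles and their expressions, which is useful for verifying the result: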
rm = cmo.getSecurityConfiguration().getDefaultRealm().lookupRoleMapper("XACMLRoleMapper")
print rm.getProviderClassName()
print rm.getName()
cursor = rm.listAllRoles(1000)
print cursor
userReader = rm
while userReader.haveCurrent(cursor):
    usrrd = userReader.getCurrentProperties(cursor)
    print usrrd.get('RoleName')
    #print usrrd
    print "\t", usrrd.get('Expression')
    userReader.advance(cursor)
userReader.close(cursor)

Renaming an Amazon CloudWatch Alarm

I'm trying to organize a large number of CloudWatch alarms for maintainability, and the web console grays out the name field when editing. Is there another method (preferably something scriptable) for renaming CloudWatch alarms? I would prefer a solution that does not require any programming beyond simple executable scripts.
Here's a script we use to do this for the time being:
import sys
import boto

def rename_alarm(alarm_name, new_alarm_name):
    conn = boto.connect_cloudwatch()

    def get_alarm():
        alarms = conn.describe_alarms(alarm_names=[alarm_name])
        if not alarms:
            raise Exception("Alarm '%s' not found" % alarm_name)
        return alarms[0]

    alarm = get_alarm()
    # work around a boto comparison serialization issue
    # https://github.com/boto/boto/issues/1311
    alarm.comparison = alarm._cmp_map.get(alarm.comparison)
    alarm.name = new_alarm_name
    conn.update_alarm(alarm)
    # update actually creates a new alarm because the name has changed, so
    # we have to delete the old one manually
    get_alarm().delete()

if __name__ == '__main__':
    alarm_name, new_alarm_name = sys.argv[1:3]
    rename_alarm(alarm_name, new_alarm_name)
It assumes you're either on an EC2 instance with a role that allows this, or you've got a ~/.boto file with your credentials. It's easy enough to add yours manually.
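Saved as, say, rename_alarm.py (the file name is assumed), it can be run as:

python rename_alarm.py "old-alarm-name" "new-alarm-name"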
Unfortunately, it looks like this is not currently possible.
I looked around for the same solution, but it seems neither the console nor the CloudWatch API provides that feature.
Note: we can, however, copy the existing alarm with the same parameters and save it under a new name.
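A minimal sketch of that copy-and-delete approach with boto3 (assuming configured credentials; the key list is a subset of what put_metric_alarm accepts and may need extending for your alarms):

import boto3

def copy_and_rename(old_name, new_name):
    cw = boto3.client('cloudwatch')
    alarm = cw.describe_alarms(AlarmNames=[old_name])['MetricAlarms'][0]
    # copy over the fields that put_metric_alarm accepts
    keys = ['AlarmDescription', 'ActionsEnabled', 'OKActions', 'AlarmActions',
            'InsufficientDataActions', 'MetricName', 'Namespace', 'Statistic',
            'Dimensions', 'Period', 'EvaluationPeriods', 'Threshold',
            'ComparisonOperator']
    params = dict((k, alarm[k]) for k in keys if k in alarm)
    cw.put_metric_alarm(AlarmName=new_name, **params)
    # put_metric_alarm created a new alarm, so delete the old one
    cw.delete_alarms(AlarmNames=[old_name])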

How to write a Python script that uses the OpenERP ORM to directly upload to Postgres Database

I need to write a "standalone" script in Python to upload sales taxes to the account_tax table in the database using ONLY the ORM module of OpenERP. What I would like to do is something like the pseudo code below.
Can someone provide me more details on the following:
1) What sys.paths do I need to set?
2) What modules do I need to import before importing the "account" module? Currently, when I import the "account" module, I get the following error:
AssertionError: The report "report.custom" already exists!
3) What is the proper way to get my database cursor? In the code below I am simply calling psycopg2 directly to get a cursor.
If this approach cannot work, can anyone suggest an alternative approach other than writing XML files to load the data from the OpenERP application itself? This process needs to run outside of the standard OpenERP application.
PSEUDO CODE:
import sys
import psycopg2
# set Python paths to access the openerp modules
sys.path.append("./openerp")
sys.path.append("./openerp/addons")
# import OpenERP
import openerp
# import the account addon module that contains the tables
# to be populated
import account
# define the connection string
conn_string2 = "dbname='test2' user='xyz' password='password'"
# get a db connection
conn = psycopg2.connect(conn_string2)
# conn.cursor() will return a cursor object
cursor = conn.cursor()
# and finally use the ORM to insert data into the table
If you want to do it via web services, then have a look at the OpenERP XML-RPC web services.
Example code to work with the OpenERP web services:
import xmlrpclib

username = 'admin'  # the user
pwd = 'admin'       # the password of the user
dbname = 'test'     # the database
# replace localhost with the address of the server

# OpenERP common login service proxy object
sock_common = xmlrpclib.ServerProxy('http://localhost:8069/xmlrpc/common')
uid = sock_common.login(dbname, username, pwd)

# OpenERP object manipulation service
sock = xmlrpclib.ServerProxy('http://localhost:8069/xmlrpc/object')
partner = {
    'name': 'Fabien Pinckaers',
    'lang': 'fr_FR',
}
# call the remote ORM create method to create a record
partner_id = sock.execute(dbname, uid, pwd, 'res.partner', 'create', partner)
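The same pattern applies to the question's use case of uploading sales taxes; a hedged sketch (the field names are indicative, check the account.tax model definition in your installation):

# Sketch: create a sales tax record over the same xmlrpc connection
tax = {
    'name': 'Sales Tax 8%',
    'amount': 0.08,
    'type_tax_use': 'sale',
}
tax_id = sock.execute(dbname, uid, pwd, 'account.tax', 'create', tax)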
More simply, you can also use the OpenERP client lib.
Example code with the client lib:
import openerplib

connection = openerplib.get_connection(hostname="localhost", database="test",
                                       login="admin", password="admin")
user_model = connection.get_model("res.users")
ids = user_model.search([("login", "=", "admin")])
user_info = user_model.read(ids[0], ["name"])
print user_info["name"]
You can see both ways are good, but when you use the client lib the code is shorter and easier to understand, while the xmlrpc proxy involves lower-level calls that you have to handle yourself.
Hope this helps.
In my view, one should go for the XMLRPC or NETSVC services provided by OpenERP for such needs.
You don't need to import the account module of OpenERP; there is a possibility that other modules have inherited the account.tax object and altered its behaviour to suit your business needs.
If you feed data by calling those methods manually without using the OpenERP web service, you may well get undesired results, unexpected failures, or an inconsistent database state.
You can use ERPpeek to browse data, but I am not sure if you can really upload data to the DB; personally I use/prefer XMLRPC.
Why don't you use the xmlrpc interface of OpenERP?
It does not require importing account or openerp, and you still get all the ORM functionality.
You can use a Python library to access the OpenERP server via the xmlrpc service.
Please check https://github.com/OpenERP/openerp-client-lib
It is officially supported by OpenERP SA.
If you want to interact directly with the DB, you could just import psycopg2 and:
import psycopg2

conn = psycopg2.connect(dbname='dbname', user='dbuser', password='dbpassword', host='dbhost')
cur = conn.cursor()
# use parameterized queries rather than string formatting
cur.execute('select * from table where id = %s', (table_id,))
cur.execute('insert into table(column1, column2) values (%s, %s)', (value1, value2))
conn.commit()
cur.close()
conn.close()
Why do you want to fix it like that? You should create a localization module and define the data in XML files. This is the standard way to solve such a problem in OpenERP.
Which country do you want to insert sales taxes for? Please explain more.
from openerp.modules.registry import RegistryManager

registry = RegistryManager.get("databasename")
with registry.cursor() as cr:
    # userid and listids are placeholders for the calling user's id
    # and the ids of the records to browse
    user = registry.get('res.users').browse(cr, userid, listids)
    print user