From the Oracle documentation:
Domain Runtime MBean Server: This MBean server also acts as a single
point of access for MBeans that reside on Managed Servers.
What I want to do is use this fact to access all my custom MBeans scattered across several managed servers.
For example, assume that I have two nodes, server-1 and server-2.
How can I access all of the custom MBeans on both server-1 and server-2 by connecting to the administration node?
I don't want to remotely access each node to collect the results; I want a single entry point.
I managed to get the names of the servers, their states, and other information by doing this:
JMXConnector connector;
ObjectName service;
MBeanServerConnection connection;

String protocol = "t3";
int port = <admin server port>; // e.g. 7001 for a default admin server
String jndiroot = "/jndi/";
// JNDI name of the Domain Runtime MBean Server; the DomainRuntimeServiceMBean queried below is registered there
String mserver = "weblogic.management.mbeanservers.domainruntime";
JMXServiceURL serviceURL = new JMXServiceURL(protocol, "<serverName>", port,
        jndiroot + mserver);

Hashtable<String, Object> h = new Hashtable<>();
h.put(Context.SECURITY_PRINCIPAL, "weblogic");
h.put(Context.SECURITY_CREDENTIALS, "weblogicpass");
h.put(JMXConnectorFactory.PROTOCOL_PROVIDER_PACKAGES, "weblogic.management.remote");
h.put("jmx.remote.x.request.waiting.timeout", Long.valueOf(10000L));

connector = JMXConnectorFactory.connect(serviceURL, h);
connection = connector.getMBeanServerConnection();

// Single entry point for navigating to all runtime MBeans in the domain
service = new ObjectName("com.bea:Name=DomainRuntimeService,Type=weblogic.management.mbeanservers.domainruntime.DomainRuntimeServiceMBean");

ObjectName[] ons = (ObjectName[]) connection.getAttribute(service, "ServerRuntimes");
for (int i = 0; i < ons.length; i++) {
    String name = (String) connection.getAttribute(ons[i], "Name");
    String state = (String) connection.getAttribute(ons[i], "State");
    Integer listenPort = (Integer) connection.getAttribute(ons[i], "ListenPort");
    System.out.println("Server name: " + name + ". Server state: " + state);
}
But I need to access the custom MBeans created on each server, not just this general server information.
Maybe my question wasn't clear, but I found an answer and I will share it here:
Question summary: I need to access custom MBeans that live on a managed server by connecting to the administration server from a client application.
Answer:
To do that you need to deploy your application to the administration server (I tried it remotely but it didn't work).
You need to connect through the DomainRuntimeServiceMBean because it provides a common access point for navigating to all runtime and configuration MBeans in the domain.
When building the ObjectName of your custom MBean, add Location=<managed_server_name> so the call is routed to the managed server that hosts it.
Here is the code:
Hashtable<String, String> props = new Hashtable<>();
props.put(Context.INITIAL_CONTEXT_FACTORY,
        "weblogic.jndi.WLInitialContextFactory");
props.put(Context.SECURITY_PRINCIPAL, "<userName>");
props.put(Context.SECURITY_CREDENTIALS, "<password>");
Context ctx = new InitialContext(props);

// Local JNDI lookup of the Domain Runtime MBean Server
// (only available to applications deployed on the administration server)
MBeanServer server = (MBeanServer) ctx.lookup("java:comp/env/jmx/domainRuntime");

// Location=<managed_server_name> routes the request to the managed server hosting the custom MBean
ObjectName on = new ObjectName("com.<companyName>:Name=<Name>,Type=<Type>,Location=<managed_server_name>");
boolean boolResult = (Boolean) server.invoke(on, "<method_Name>",
        new Object[]{"<ARG1>", "<ARG2>", "<ARG3>"},
        new String[]{"java.lang.String", "java.lang.String", "java.lang.String"});
out.print(boolResult);
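To go one step further and discover every instance of a custom MBean across all managed servers from that same connection, a plain JMX query over the Domain Runtime MBean Server can be used. This is only a minimal sketch; the "com.<companyName>" domain and the Type value are the same placeholders as above, and queryNames with a wildcard pattern is standard JMX rather than anything WebLogic-specific:
// Sketch: enumerate all instances of the custom MBean type, whichever managed server they live on
ObjectName pattern = new ObjectName("com.<companyName>:Type=<Type>,*");
Set<ObjectName> names = server.queryNames(pattern, null);
for (ObjectName found : names) {
    // the Location key identifies the managed server that registered this MBean
    System.out.println(found.getKeyProperty("Name") + " on " + found.getKeyProperty("Location"));
}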
Related
Please refer to an existing question on the same topic:
Send files from one remote server using JSch to another server using JSch too
public boolean uploadFile() throws JSchException, SftpException {
    ChannelSftp channelSftpA = createChannelSftp();
    ChannelSftp channelSftpB = createChannelSftp();
    channelSftpA.connect();
    channelSftpB.connect();
    String localFilePath = "/data/upload/readme.txt";
    String remoteFilePath = "/bingo/pdf/";
    channelSftpA.cd(localFilePath);
    channelSftpA.put(localFilePath + "readme.txt", remoteFilePath + "readme.txt");
    return true;
}
But it doesn't work. Should I nest the channelSftpB.put call inside my first channelSftpA.put call?
The solution given for the above question is: "For transferring a file you should get it from server A and then put it on server B. By the way, the users under which you are going to download and upload files should have access to the specified folders!"
private boolean transferFile() throws JSchException, SftpException {
ChannelSftp channelSftpA = createChannelSftp();
ChannelSftp channelSftpB = createChannelSftp();
channelSftpA.connect();
channelSftpB.connect();
String fileName = "readme.txt";
String remoteFilePathFrom = "/folderFrom/";
String remoteFilePathTo = "/folderTo/";
InputStream srcInputStream = channelSftpA.get(remoteFilePathFrom + fileName);
channelSftpB.put(srcInputStream, remoteFilePathTo + fileName);
System.out.println("Transfer has been completed");
channelSftpA.exit();
channelSftpB.exit();
return true;
}
But my query is:
What if I want to transfer files directly from server1 to server2 without pulling them down to my local machine first? I can do this with the command "scp -rv /sourcefolder/text1 username@server2Hostname:/destinationfolder/", but from a program that only works when a public key is set up between server1 and server2, so no password has to be passed. I need a solution for when no public key is set up and I have to use a password; I can't figure out how to pass the password for server2 in that command. Alternatively, could we create parallel sessions/channels to server1 and server2? I'm not sure how to do that. Any idea?
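Regarding the parallel-session idea: here is a minimal sketch of what that could look like with password authentication on both sides (host names, users, and passwords below are placeholders, not values from the original setup). It opens one SFTP session per server and streams the file from server1 into server2 without writing it to the local disk, which is essentially the get/put approach from the answer above:
// Sketch only: two independent password-authenticated JSch sessions, one per server
JSch jsch = new JSch();

Session sessionA = jsch.getSession("user1", "server1Hostname", 22);
sessionA.setPassword("password1");
sessionA.setConfig("StrictHostKeyChecking", "no"); // or configure known_hosts properly
sessionA.connect();

Session sessionB = jsch.getSession("user2", "server2Hostname", 22);
sessionB.setPassword("password2");
sessionB.setConfig("StrictHostKeyChecking", "no");
sessionB.connect();

ChannelSftp sftpA = (ChannelSftp) sessionA.openChannel("sftp");
ChannelSftp sftpB = (ChannelSftp) sessionB.openChannel("sftp");
sftpA.connect();
sftpB.connect();

// Stream from A to B; no local temp file is created
InputStream in = sftpA.get("/sourcefolder/text1");
sftpB.put(in, "/destinationfolder/text1");

sftpA.exit();
sftpB.exit();
sessionA.disconnect();
sessionB.disconnect();
Note that the bytes still flow through the machine running this code; a true direct server1-to-server2 copy without keys would require running scp on server1 itself (for example through an exec channel) and answering its password prompt, which is considerably more fragile.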
A follow-up to this:
one SCDF source, 2 processors but only 1 processes each item
The two processors (del-1 and del-2) in the picture are receiving the same data within milliseconds of each other. I'm trying to rig this so that del-2 never receives the same item as del-1, and vice versa. Obviously I've got something configured incorrectly, but I'm not sure where.
My processor has the following application.properties
spring.application.name=${vcap.application.name:sample-processor}
info.app.name=@project.artifactId@
info.app.description=@project.description@
info.app.version=@project.version@
management.endpoints.web.exposure.include=health,info,bindings
spring.autoconfigure.exclude=org.springframework.boot.autoconfigure.security.servlet.SecurityAutoConfiguration
spring.cloud.stream.bindings.input.group=input
Is "spring.cloud.stream.bindings.input.group" specified correctly?
Here's the processor code:
@Transformer(inputChannel = Processor.INPUT, outputChannel = Processor.OUTPUT)
public Object transform(String inputStr) throws InterruptedException{
ApplicationLog log = new ApplicationLog(this, "timerMessageSource");
String message = " I AM [" + inputStr + "] AND I HAVE BEEN PROCESSED!!!!!!!";
log.info("SampleProcessor.transform() incoming inputStr="+inputStr);
return message;
}
Is the @Transformer annotation the proper way to link this bit of code with "spring.cloud.stream.bindings.input.group" from application.properties? Are there any other annotations necessary?
Here's my source:
private String format = "EEEEE dd MMMMM yyyy HH:mm:ss.SSSZ";
@Bean
@InboundChannelAdapter(value = Source.OUTPUT, poller = @Poller(fixedDelay = "1000", maxMessagesPerPoll = "1"))
public MessageSource<String> timerMessageSource() {
ApplicationLog log = new ApplicationLog(this, "timerMessageSource");
String message = new SimpleDateFormat(format).format(new Date());
log.info("SampleSource.timeMessageSource() message=["+message+"]");
return () -> new GenericMessage<>(new SimpleDateFormat(format).format(new Date()));
}
I'm confused about the "value = Source.OUTPUT". Does this mean my processor needs to be named differently?
Is the inclusion of @Poller causing me a problem somehow?
This is how I define the 2 processor streams (del-1 and del-2) in SCDF shell:
stream create del-1 --definition ":split > processor-that-does-everything-sleeps5 --spring.cloud.stream.bindings.applicationMetrics.destination=metrics > :merge"
stream create del-2 --definition ":split > processor-that-does-everything-sleeps5 --spring.cloud.stream.bindings.applicationMetrics.destination=metrics > :merge"
Do I need to do anything differently there?
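One background note that may help frame these questions: in Spring Cloud Stream, two consumers compete for messages (each message delivered to exactly one of them) only when they bind to the same destination with the same consumer group; consumers in different groups each receive their own copy. Purely as an illustration, and assuming an inline binding property in the stream definition is honored at deployment time, a shared group on the :split destination might be expressed like this (the group name shared-del-group is made up):
stream create del-1 --definition ":split > processor-that-does-everything-sleeps5 --spring.cloud.stream.bindings.input.group=shared-del-group --spring.cloud.stream.bindings.applicationMetrics.destination=metrics > :merge"
stream create del-2 --definition ":split > processor-that-does-everything-sleeps5 --spring.cloud.stream.bindings.input.group=shared-del-group --spring.cloud.stream.bindings.applicationMetrics.destination=metrics > :merge"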
All of this is running in Docker/K8s.
RabbitMQ is given by bitnami/rabbitmq:3.7.2-r1 and is configured with the following props:
RABBITMQ_USERNAME: user
RABBITMQ_PASSWORD: <redacted>
RABBITMQ_ERL_COOKIE: <redacted>
RABBITMQ_NODE_PORT_NUMBER: 5672
RABBITMQ_NODE_TYPE: stats
RABBITMQ_NODE_NAME: rabbit#localhost
RABBITMQ_CLUSTER_NODE_NAME:
RABBITMQ_DEFAULT_VHOST: /
RABBITMQ_MANAGER_PORT_NUMBER: 15672
RABBITMQ_DISK_FREE_LIMIT: "6GiB"
Are any other environment variables necessary?
I am attempting to automate a workflow in our Azure environment.
We have several web applications with connection strings to several databases. Each new customer receives a new database.
I've hit a snag in the script with our connection strings. I want the script to update all web applications and add a new connection string for the newly created customer database.
The problem is that "Set-AzureRmWebApp -Name -ResourceGroup -ConnectionStrings" takes a hashtable which replaces any previously configured data.
I would like to either append just the new connection string, or fetch the previously configured connection strings, add the new one to them, and then replace all the data.
Example code:
$test = @{"Type" = "Custom"; "Value" = "TestValue"}
$Connectionstring = @{"test" = $test}
Set-AzureRmWebApp `
    -Name "testapp" `
    -ResourceGroupName "testgrp" `
    -ConnectionStrings $Connectionstring
Any ideas here?
$connStrings = @{
    AzureWebJobsDashboard = @{
        Type = "Custom";
        Value = $AzureWebJobsDashboard
    };
    AzureWebJobsStorage = @{
        Type = "MySql";
        Value = $connstring
    }
};
Set-AzureRMWebApp -Name $webServiceName -ResourceGroupName $rgName -ConnectionStrings $connStrings
See also: Cannot Delete All Azure Website Connection Strings
#Add new connection string
$newConnString = New-Object Microsoft.WindowsAzure.Commands.Utilities.Websites.Services.WebEntities.ConnStringInfo
$newConnString.Name = $ConnStringName
$newConnString.ConnectionString = $ConnStringValue
$newConnString.Type = $ConnStringType
$connStrings.Add($newConnString)
Set-AzureWebsite $WebAppName -ConnectionStrings $connStrings
You can download the detailed script from "How to automatically create new connection strings for web applications Azure".
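Putting those pieces together, here is a rough, untested sketch of the append pattern with the AzureRM cmdlets used above. It assumes Get-AzureRmWebApp exposes the current strings under SiteConfig.ConnectionStrings, and the new name/value are placeholders: read the existing connection strings, rebuild the hashtable, add the new entry, and write everything back in a single Set-AzureRmWebApp call.
# Sketch: append one connection string without losing the existing ones
$webApp = Get-AzureRmWebApp -ResourceGroupName "testgrp" -Name "testapp"

# Rebuild the hashtable expected by -ConnectionStrings from what is already configured
$connStrings = @{}
foreach ($existing in $webApp.SiteConfig.ConnectionStrings) {
    $connStrings[$existing.Name] = @{ Type = $existing.Type.ToString(); Value = $existing.ConnectionString }
}

# Add the new customer's connection string (placeholder values)
$connStrings["NewCustomerDb"] = @{ Type = "SQLAzure"; Value = "<new customer connection string>" }

Set-AzureRmWebApp -ResourceGroupName "testgrp" -Name "testapp" -ConnectionStrings $connStrings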
I already know how to run and attach to a transformation running on a remote Carte server from Java, given the transformation's Carte object ID:
KettleEnvironment.init();
TransMeta transMeta = new TransMeta("file.ktr");
Trans trans = new Trans(transMeta);
SlaveServer ss = new SlaveServer("test", IP, PORT, "cluster", "cluster");
TransExecutionConfiguration jec = new TransExecutionConfiguration();
jec.setRemoteServer(ss);
String carteObjectId = trans.sendToSlaveServer(transMeta, jec, null, null);
and
KettleEnvironment.init();
SlaveServer ss = new SlaveServer("test", IP, PORT, "cluster", "cluster");
SlaveServerTransStatus state = ss.getTransStatus(transMetaName, carteObjectId, 0);
List<StepStatus> list = state.getStepStatusList();
However, for more general (and usable) remote monitoring I need the whole list of object IDs of the transformations that are running or have run on the remote Carte server. Which methods can I use to get such a list?
// slave1 is a SlaveServer created as in the snippets above;
// getStatus() returns the overall Carte status, including one entry per transformation
List<SlaveServerTransStatus> transStatuses = slave1.getStatus().getTransStatusList();
for (SlaveServerTransStatus transStatus : transStatuses) {
    System.out.println(transStatus.getTransName() + "--" + transStatus.getStatusDescription() + "---" + transStatus.getId());
}
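Combining that with the earlier getTransStatus call, a small sketch of what a polling loop could look like (the SlaveServer constructor arguments are the same placeholder credentials used above):
// Sketch: enumerate every transformation known to the Carte server, then drill into each one
KettleEnvironment.init();
SlaveServer ss = new SlaveServer("test", IP, PORT, "cluster", "cluster");

for (SlaveServerTransStatus summary : ss.getStatus().getTransStatusList()) {
    // getTransStatus() needs the transformation name and the Carte object ID just listed
    SlaveServerTransStatus detail = ss.getTransStatus(summary.getTransName(), summary.getId(), 0);
    System.out.println(summary.getTransName() + " [" + summary.getId() + "] " + detail.getStatusDescription());
    for (StepStatus step : detail.getStepStatusList()) {
        System.out.println("  step " + step.getStepname() + ": " + step.getStatusDescription());
    }
}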
I am attempting to replicate a SQL CE 3.5 SP1 database, but upon synchronization I am thrown the following error:
"Failure to connect to SQL Server with provided connection information. SQL Server does not exist, access is denied because the IIS user is not a valid user on the computer running SQL Server, or the password is incorrect."
I am using the Windows Mobile 6 Professional emulator, and the machine I am attempting to connect to is a Windows Virtual Machine running Windows XP Professional SP3. I have configured the network adapter settings for the emulator (I can access web pages), verified user permissions, double-checked the IIS settings, and triple-checked my connection string:
SqlCeReplication rpl = null;
try
{
// Creates the replication object.
rpl = new SqlCeReplication();
// Establishes the connection string.
rpl.SubscriberConnectionString = @"Data Source = \Program Files\ParkSurvey\ParkSurvey.sdf; Password = *; Temp File Max Size = 512;
    Max Database Size = 512; Max Buffer Size = 512; Flush Interval = 20; Autoshrink Threshold = 10; Default Lock Escalation = 100";
// Sets the Publisher properties.
rpl.PublisherSecurityMode = SecurityType.NTAuthentication;
rpl.Publisher = "PUBLISHER";
rpl.PublisherLogin = "INDICOPUBLIC\\subuser";
rpl.PublisherPassword = "*";
rpl.PublisherDatabase = "PUBLISHER";
rpl.Publication = "ParkSurveyPublication";
// Sets the internet replication properties.
rpl.InternetUrl = "http://replication/sqlce/sqlcesa35.dll";
rpl.InternetLogin = "INDICOPUBLIC\\subuser";
rpl.InternetPassword = "*";
rpl.ConnectionManager = true;
// Sets the Distributor properties.
rpl.Distributor = "PUBLISHER";
rpl.DistributorLogin = "INDICOPUBLIC\\subuser";
rpl.DistributorPassword = "psrAdmin";
rpl.DistributorSecurityMode = SecurityType.NTAuthentication;
// Sets the timeout properties.
rpl.ConnectionRetryTimeout = 120;
rpl.ConnectTimeout = 6000;
rpl.ReceiveTimeout = 6000;
rpl.SendTimeout = 6000;
// Sets the Subscriber properties.
rpl.Subscriber = "ParkSurveySubscriber";
rpl.HostName = "Mobile1";
rpl.CompressionLevel = 6;
rpl.ExchangeType = ExchangeType.BiDirectional;
// Call the replication methods.
rpl.Synchronize();
}
catch (SqlCeException sqlEx)
{
MessageBox.Show(sqlEx.Message);
}
finally
{
// Disposing the replication object
if (rpl != null)
{
rpl.Dispose();
}
}
I have also attempted to open the host machine itself in File Explorer on the mobile emulator and am prompted with "The network path was not found." This leads me to believe it is an ActiveSync issue within the emulator itself. Does anyone have any advice?
Try with the IP address instead of the hostname, and test the agent URL from IE on the device. Make sure to use the latest build of 3.5 SP2 on all components if your DB server is SQL Server 2012.
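As a concrete illustration of the first suggestion (the address below is made up, not taken from the original setup), the agent URL would then reference the server's IP rather than its name:
// Hypothetical example only: point the replication agent URL at the server's IP address
rpl.InternetUrl = "http://192.168.1.10/sqlce/sqlcesa35.dll";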