how to generate *.nps cpu snapshot from commandline - visualvm

How can I generate a *.nps CPU snapshot file from the command line?
I can't find a clue in the -help output or on visualvm.github.io.
I know how to use it from the UI, but I need it triggered from monitoring tools.

Related

Running wrapper file continuously for using JFR to monitor ActiveMQ performance

I have an issue with continuously running Java Flight Recorder to monitor memory usage and other performance statistics of ActiveMQ.
The wrapper configuration file (wrapper.conf) is in the directory below, alongside the wrapper, activemq, and libwrapper.so files:
../apache-activemq-5.12.1/bin/linux-x86-64/wrapper.conf
I added the lines below to run JFR:
wrapper.java.additional.13=-XX:+UnlockCommercialFeatures
wrapper.java.additional.14=-XX:+FlightRecorder
wrapper.java.additional.15=-XX:FlightRecorderOptions=defaultrecording=true,disk=true,repository=../jfr/jfrs_%WRAPPER_PID%,settings=profile
wrapper.java.additional.16=-XX:StartFlightRecording=filename=../jfr/jfrs_%WRAPPER_PID%/myrecording.jfr,dumponexit=true,compress=true
When I run the wrapper file, the expected output myrecording.jfr is generated under the path specified in wrapper.conf. The problem is that I also want this to happen automatically, without running the wrapper file by hand.
What might be the possible solution for that?
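One possible approach (not from the original thread; the installation path below is an assumption based on the question) is to let the OS start the wrapper at boot, for example with a system crontab @reboot entry that invokes the activemq wrapper script listed in the question:

```
# /etc/crontab entry (includes the user field): start the wrapped broker,
# and therefore the default JFR recording, automatically at boot.
# Path and user are assumptions; adjust for your installation.
@reboot  activemq  /opt/apache-activemq-5.12.1/bin/linux-x86-64/activemq start
```

A proper init script or systemd unit would achieve the same thing with better supervision; the cron entry is just the smallest sketch of the idea.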

The NUNIT automation scripts are NOT running from Task Scheduler

I am able to execute the NUnit scripts using a batch file. I am trying to run this batch file from Task Scheduler so that it runs unattended and on a regular schedule.
The batch file runs, but its output reports that the window the automation is trying to drive is not open.
These are the parameters I'm using to launch the NUnit console:
cd C:\Program Files\NUnit.org\nunit-console
NUNIT3-CONSOLE D:\nunit\UnitTestProject1.dll --result="D:\nunit\TestResult.XML"
My code looks like this:
[Test]
public void TestMethod1()
{
    // GoToUrl requires an absolute URL (with scheme); the bare host fails.
    IWebDriver driver = new ChromeDriver();
    driver.Navigate().GoToUrl("http://www.oford.com");
    driver.Quit();
}
Is there any way to run the batch files in unattended mode?
I've started my first setup with the same approach, but user privileges and Windows settings can interfere at any time.
Is there any way to run the batch files in unattended mode?
Yes, you can use a CI server. It is easy to set up, free, and powerful once you get the hang of it. I would recommend Jenkins because of its great community and vast resources (tutorials, plugins, etc.). Configuring a basic job takes no more than five minutes, and the reporting is also great.
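As a sketch of what that Jenkins setup might look like (the paths are taken from the question; the nightly schedule and the use of the NUnit plugin for publishing results are assumptions), a declarative pipeline could be:

```groovy
pipeline {
    agent any
    triggers {
        // Run nightly; adjust the cron spec to your needs (assumption).
        cron('H 2 * * *')
    }
    stages {
        stage('Run NUnit tests') {
            steps {
                // Same command as the batch file from the question.
                bat '"C:\\Program Files\\NUnit.org\\nunit-console\\nunit3-console.exe" D:\\nunit\\UnitTestProject1.dll --result="D:\\nunit\\TestResult.xml"'
            }
        }
    }
    post {
        always {
            // Publish the results; requires the Jenkins NUnit plugin (assumption).
            nunit testResultsPattern: 'D:\\nunit\\TestResult.xml'
        }
    }
}
```

Note that for browser-driving tests the Jenkins agent still needs an interactive desktop session (or a headless browser), for the same window-availability reason the Task Scheduler run failed.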

JMeter Test Results Monitoring/ Analysis

I want to start load testing by running JMeter from the command line for more accurate results, but how can I monitor the run and then analyze the results after the test finishes?
You can generate a JTL (JMeter results) file while executing the JMX (JMeter script) file from the command line. A sample command for generating a JTL file looks like this:
jmeter -n -t path-to-jmeterScript.jmx -l path-to-jtlFile.jtl
After the script finishes, you can open the JMeter GUI and load the JTL file into any listener, as required.
Most of the listeners in JMeter have an option to save results to a file. This file usually contains not the report itself, but the samples generated by the tests. If you define this filename, you can generate reports from these saved files. For example, see http://jmeter.apache.org/usermanual/component_reference.html#Summary_Report.
If you run JMeter in command-line non-GUI mode, passing the results file name via the -l parameter, it will write results there. After the test finishes you will be able to open the file with the Listener of your choice and perform the analysis.
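The analysis doesn't have to happen in a GUI listener at all. As a minimal sketch, assuming the default CSV results layout (timeStamp, elapsed, label, ... columns), a per-label summary can be pulled straight from the JTL with awk; the sample data below stands in for a real results file:

```shell
# Stand-in for a real JTL file, using the default CSV column layout (assumption).
cat > sample.jtl <<'EOF'
timeStamp,elapsed,label,responseCode,success
1000,120,Home Page,200,true
1001,250,Home Page,200,true
1002,90,Login,500,false
EOF

# Sample count and average elapsed time (ms) per label.
awk -F, 'NR > 1 { total[$3] += $2; n[$3]++ }
         END { for (l in n) printf "%s: %d samples, avg %.1f ms\n", l, n[l], total[l]/n[l] }' sample.jtl
```

Running this prints one summary line per label, e.g. 2 samples averaging 185.0 ms for "Home Page".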
By default JMeter writes results in chunks. If you need to monitor them in real time, add the following line to the user.properties file (it lives under the /bin folder of your JMeter installation):
jmeter.save.saveservice.autoflush=true
You can use other properties whose names start with jmeter.save.saveservice.* to control which metrics are stored. The list with default values can be seen in the jmeter.properties file. See the Apache JMeter Properties Customization Guide for more information on the various JMeter property types and ways of working with them.
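For example, a user.properties fragment that enables flushing and trims what gets stored might look like this (a sketch; these property names appear in jmeter.properties, but check the defaults for your JMeter version):

```
# Flush results to the JTL file as they arrive (needed for live monitoring).
jmeter.save.saveservice.autoflush=true
# Store results as CSV rather than XML.
jmeter.save.saveservice.output_format=csv
# Keep the metrics you care about; drop bulky response bodies.
jmeter.save.saveservice.response_data=false
jmeter.save.saveservice.latency=true
jmeter.save.saveservice.label=true
```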
You can also consider running your JMeter test via the Taurus tool - it provides statistics as the test runs, either in console mode or via a web interface.
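As an illustration (a sketch; the script path is a placeholder carried over from the command above), a minimal Taurus config wrapping an existing JMX file could look like this, run with the bzt command:

```yaml
execution:
- executor: jmeter
  scenario: existing_script

scenarios:
  existing_script:
    script: path-to-jmeterScript.jmx   # reuse the JMX from the command above

reporting:
- module: console      # live stats in the terminal while the test runs
- module: final-stats  # summary table when the run ends
```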

External script (R) not working

When I try to run an external script (R script) from the Kognitio console, I get the error message below:
Error:external script vfork child: No such file or directory
Can someone please help me understand what this means?
This will be because you have not replicated the script environment to all the DB nodes which are eligible to run the script.
Chapter 10 of the Kognitio Guide (downloadable from http://www.kognitio.com/forums/viewtopic.php?t=3) explains in section 10.2 how the script environment must be installed identically, in the same location, on all nodes that will be used in processing; section 10.6 explains how you can constrain this to a subset of nodes if for some reason you do not want the script environment on all nodes (e.g. if it has an expensive per-node licence).
You can use the wxsync tool to synchronise files across all nodes, or a remote deployment tool, such as HP's RDP, to ensure that the script environment is installed identically on all nodes.
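If neither wxsync nor a deployment tool is at hand, the idea can be sketched with plain rsync over the node list; the node names and path below are hypothetical, and the loop only prints each command (a dry run) so you can inspect it before removing the echo:

```shell
# Hypothetical node names and path: the script environment must sit at the
# same absolute location on every node eligible to run the script.
NODES="node01 node02 node03"
ENV_DIR="/opt/R"

for node in $NODES; do
    # Dry-run sketch: prints each command; remove 'echo' to actually copy.
    echo rsync -a "$ENV_DIR/" "$node:$ENV_DIR/"
done
```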

Can not find schedule perspective in pentaho kettle spoon

I'm using Kettle Spoon 5.0.1 and I can't find the Schedule perspective, or any perspective other than the Data Integration perspective. Can you help me, please?
If you need scheduling in CE (the Community Edition), you have to do it yourself. Fortunately this isn't terribly hard: just use an external scheduler to launch transforms and jobs via Pan and Kitchen.
For example, I use pgagent since I have PostgreSQL. So I set up my schedules and launch my job as follows:
set KETTLE_PASSWORD=Encrypted (Some pwd)
"C:\Program Files\pentaho\Daily_Jobs.bat" > "C:\Program Files\pentaho\Daily_Jobs.log"
And then Daily_Jobs.bat is this:
cd /d "C:\Program Files\pentaho\"
Kitchen.bat /rep:"ETL_PROD" /job:"Daily Jobs" /dir:"/Finished" /user:Brian /level:Basic
This runs the job "Daily Jobs" in the "/Finished" directory of my PDI repository, as user "Brian", with basic logging, using the password stored in KETTLE_PASSWORD. These are Windows batch files, but .sh scripts on Linux work just as well. No, they don't have to be in two separate batch files (I forget why I did that originally).
While I use pgagent, any scheduler that can launch batch files/shell scripts should work.
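For instance, on Linux the same job could be scheduled with a crontab entry (a sketch: the installation path, log path, and schedule are assumptions; kitchen.sh accepts the same options as Kitchen.bat using -option=value syntax):

```
# Run the "Daily Jobs" PDI job every night at 01:30 (assumed schedule/paths).
30 1 * * * /opt/pentaho/data-integration/kitchen.sh -rep=ETL_PROD -job="Daily Jobs" -dir=/Finished -user=Brian -level=Basic >> /var/log/pentaho/daily_jobs.log 2>&1
```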