How to submit a job to Flink on YARN via the web UI - hadoop-yarn

I set up a Flink cluster on YARN and can submit jobs successfully by typing the related commands on the hosts,
but that is not as convenient as the web UI (I have tested submitting jobs through the web UI on a Flink standalone cluster).
When I click the "Submit new Job" button, the page is as follows:
When I click the "here" hyperlink, it jumps to a page with a random host IP from the cluster and a "random" port. Since we do not open all ports to the public network, the connection to this page is refused.
I tried to debug the JS code to find out whether some config triggers this problem, and found two code fragments:
It seems this page cannot function well with Flink on YARN.
So, can I submit a job to Flink on YARN through the web UI? And how?

As the message states, the YARN proxy you are seeing does not allow file uploads. If you really want to upload jobs via the web UI on YARN, you can find out the real IP of the JobManager and go to that IP directly (bypassing the YARN proxy).
There are some issues with this approach, though. You have to have network access to that node, which is usually not the case on YARN (and is most probably what you are hitting).
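If you do have access, a quick way to find the real address is the YARN CLI; a minimal sketch (the application ID is a placeholder, and the exact output fields vary by Hadoop version):
yarn application -list
yarn application -status application_1500000000000_0001
The status output includes the AM host and the tracking URL that the proxy forwards to; the Flink web UI is served directly from that host and port.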

Flink on YARN has two modes: session and per-job. If you need to submit via the web UI, you must first create a YARN session and open that session's web UI to submit; a per-job cluster cannot accept submissions through the web UI.
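For reference, a minimal sketch of starting a session cluster from the Flink distribution (the flags and memory sizes are illustrative and differ across Flink versions):
./bin/yarn-session.sh -jm 1024m -tm 2048m
On startup the client logs a line like "JobManager Web Interface: http://host:port"; that session UI is the one that accepts jar uploads.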

Related

How can I restart the Jitterbit UI Console?

I have Jitterbit Cloud Data Loader on a server and it was working fine for years. After the last update I can no longer open the UI console, and I cannot uninstall/reinstall the service "Jitterbit Cloud Data Loader Apache Server". When I try to uninstall, it shows: The system cannot find the file specified. : AH00436: No installed service named "Jitterbit Cloud Data Loader Apache Server". And when I try to install, it says: The name is already in use as either a service name or a service display name. AH00370: Failed to create the "Jitterbit Cloud Data Loader Apache Server" service.
Has anyone seen this problem before?
I have tried installing and reinstalling, but nothing has worked so far. I don't want to lose the configuration of my processes...

How does Ambari detect a service's state

I'm adding a new custom service to Ambari.
I have successfully created the service and installed it through the Ambari web UI. After starting the master component of my new service, Ambari claims that the master is in the stopped state; however, the master is running successfully on the intended node and I can use its API.
I wonder how Ambari checks a component's status.
Does it use the status function I provided in the component definition? I don't see any logs of my status function being called in the Ambari logs.
Or does it use a PID file? My component does not have a PID file.
#TailofGodzilla (cool name btw), when I make custom services, I start with existing open-source examples and then finally create management packs. You can easily reverse engineer these, including the service status function.
I checked 3 of these services (Hue, Elk, NiFi) and all of them use a PID file, with entries for the status function and a status_params file.
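For illustration, a minimal sketch of what such a status function typically looks like in a custom service's Python command script (the class name, PID path, and file layout are assumptions, not taken from your service):
from resource_management import *

class Master(Script):
    def status(self, env):
        # check_process_status raises ComponentIsNotRunning when the PID
        # in the file is missing or dead; that exception is what makes
        # Ambari display the component as stopped.
        check_process_status("/var/run/myservice/myservice.pid")

if __name__ == "__main__":
    Master().execute()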

Trying to run Apache Apex's Yahoo Finance example on YARN

I've downloaded Apache Apex 3.5.0 along with Malhar 3.5.0.
I've successfully started the apex client and submitted the Yahoo Finance demo example to our YARN cluster (running CDH 5.10). The cluster is running and configured properly (many Spark and MR jobs are running on it).
I see the application I submitted as RUNNING in YARN as well as in the Apex CLI. However, when I try to connect to the Application Master, I get a 404.
org.apache.hadoop.yarn.webapp.WebAppException: /: controller for default not found
I also tried connecting directly to the appMasterTrackingUrl reported by the get-app-info command, and I get the same error.
I tried a couple of Apex examples, and I always get the same error.
Any idea why?
It is somewhat expected. Add "/ws/v2/stram/info" to the URL path.
When you connect to the App Master you need to provide the complete URL of a REST API to invoke. There is nothing to show or return for "/", so what you are seeing is expected. What are you trying to do by connecting to the App Master?
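For example, a quick check with curl (host and port are placeholders; take them from the tracking URL YARN reports):
curl http://<app-master-host>:<port>/ws/v2/stram/info
This should return a JSON description of the running application instead of the 404 you get for "/".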

OpenDaylight: cannot log in and the web UI always displays "Unable to login"

I have installed OpenDaylight Helium according to the installation document on the wiki. But when I enable the web UI, it always displays "Unable to login".
How can I solve this problem?
This would be easier to answer if you described what you mean by the web UI. Nonetheless, verify that you are testing the correct ports; if you are, then you can disable auth by editing org.opendaylight.aaa.authn.cfg:
authEnabled=false
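(A note on where that setting lives: in a default Karaf distribution the file is usually under etc/, e.g. distribution-karaf-0.2.0-Helium/etc/org.opendaylight.aaa.authn.cfg. The path is an assumption based on the standard layout and may differ in your install.)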
Obviously, we need to install the odl-dlux-core feature. Then we can log in to ODL's web GUI successfully.
I had the same problem and I got it working by following the answer on this link:
https://ask.opendaylight.org/question/843/unable-to-login-dlux-web-interface-helium-release/
There it says:
Download and extract the pre-built .zip: distribution-karaf-0.2.0-Helium.zip
Run ./distribution-karaf-0.2.0-Helium/bin/karaf (on Linux) to start the Karaf container
feature:install odl-restconf odl-l2switch-switch odl-mdsal-apidocs odl-dlux-core
Access http://localhost:8181/dlux/index.html, where localhost is your local IP
Log in with user: admin, pw: admin
Perhaps there is a problem with the order in which you have installed your features. I got the order from the dlux wiki page, where they said that this is the recommended way of installing features before starting the dlux feature.
To clean your local Karaf container, you can start the container using the clean flag, like "./distribution-karaf-0.2.0-Helium/bin/karaf clean", or delete the "distribution-karaf-0.2.0-Helium/data/" folder.

WebDriver (Selenium 2) - How to make Selenium operate on elements without waiting for connections to external ad links?

Environment:
- Selenium 2.39 Standalone Server
- PHP 5.4.11
- PHPUnit 3.7.28
- Chrome V31 & ChromeDriver v2.7
I'm testing a website which invokes a lot of advertisement systems, such as Google ads.
The browser takes a lot of time connecting to the external ad links, even after all the elements of the page have already been loaded.
If my network connection is slow when I run my tests on a webpage,
Selenium waits for a very long time, since the ad links respond slowly.
Under these conditions, Selenium usually waits for over 60 seconds and then throws a timeout exception.
I'm not sure how Selenium works, but it seems that Selenium has to wait for a sign that the webpage has fully loaded before it polls the DOM to find elements.
I want to make Selenium operate on elements without waiting for the connections to the external ad links.
Is there a way to do that? Thank you very much.
I would suggest making use of a proxy. BrowserMob integrates well with Selenium and is very easy to use:
import net.lightbody.bmp.proxy.ProxyServer;
import org.openqa.selenium.Proxy;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.remote.CapabilityType;
import org.openqa.selenium.remote.DesiredCapabilities;

// start the proxy on port 4444
ProxyServer server = new ProxyServer(4444);
server.start();
// get the Selenium proxy object
Proxy proxy = server.seleniumProxy();
// automatically answer HTTP 200 for any request going to google-analytics,
// so the browser never waits on the real server
server.blacklistRequests("https?://.*\\.google-analytics\\.com/.*", 200);
// configure the proxy as a desired capability
DesiredCapabilities capabilities = new DesiredCapabilities();
capabilities.setCapability(CapabilityType.PROXY, proxy);
// start the browser with the proxy configured
WebDriver driver = new FirefoxDriver(capabilities);
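You can call blacklistRequests once per ad network the page pulls in; the proxy then answers each of those requests instantly, and the page load no longer waits on any external ad server.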
I'm not sure how Selenium works, but it seems that Selenium has to wait for a sign that the webpage has fully loaded before it polls the DOM to find elements.
It is pretty much like this. The default page loading strategy is "NORMAL", which the WebDriver specification defines as:
NORMAL of type DOMString
The remote end MUST wait until the "document.readyState" of the frame currently handling commands equals "complete", or there are no more outstanding network requests other than XMLHttpRequests.
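As an aside, newer Selenium releases (not the 2.39 setup in the question) expose this knob through the standard "pageLoadStrategy" capability, so you can tell the driver not to wait for the full page load. A minimal Java sketch, assuming a driver version that honors the W3C capability:
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.remote.DesiredCapabilities;

// "eager" waits only until the DOM is ready, skipping slow ad requests;
// "none" returns control as soon as navigation starts
DesiredCapabilities capabilities = new DesiredCapabilities();
capabilities.setCapability("pageLoadStrategy", "eager");
WebDriver driver = new FirefoxDriver(capabilities);
With either setting you then need explicit waits for the specific elements you interact with.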
I finally found a simple solution for my situation.
I decided to block these ad requests and tried some firewall and proxy programs, for example
Comodo and Privatefirewall.
Comodo is too heavy and complex, Privatefirewall doesn't support wildcards, and a firewall would interrupt the tests. In the end I chose a proxy program, CCProxy; the trial version is enough.
I created a rule for localhost so that it can request my test website's domain only, and all other requests are rejected.
Running a test took about 1-2 minutes before and only 30 seconds now; it's noticeably more stable and faster without connecting to the useless ad links.
Here are the configuration steps:
1. Launch CCProxy with administrator privilege (you should set it to use Administrator in the file properties).
2. Click Options, select AutoStartup, select AutoDetected for Local IP Address, and click OK.
3. Create a txt file and enter your domains, like "*.rong360.com*;*.rong360.*;".
4. Click Account, select PermitOnly for Permit Category;
click New, input 127.0.0.1 for IP Address/Range;
select WebFilter, click the E button at the right side to create a filter;
click the ... button, select the text file you created in step 3,
select PermittedSites and click OK;
then click OK again.
5. Click OK to return to the main UI of CCProxy.
6. Launch IE and configure the local proxy as 127.0.0.1:808;
other browsers will use this configuration automatically too.
Now you can run the tests again; you'll feel better if you have the same situation :)