For my automation project, I am trying to integrate Cassandra with Spring Boot JPA using the DataStax driver, and I am getting the following exception:
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: api-beta.caas.dbattery.akamai.com/x.x.x.x:9042 (com.datastax.driver.core.exceptions.TransportException: [api-beta.caas.dbattery.akamai.com/x.x.x.x:9042] Error writing), api-beta.caas.dbattery.akamai.com/x.x.x.x:9042 (com.datastax.driver.core.exceptions.TransportException: [api-beta.caas.dbattery.akamai.com/x.x.x.x:9042] Error writing), api-beta.caas.dbattery.akamai.com/x.x.x.x:9042 (com.datastax.driver.core.exceptions.TransportException: [api-beta.caas.dbattery.akamai.com/x.x.x.x:9042] Error writing))
In pom.xml:
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.1.1.RELEASE</version>
    <relativePath/>
</parent>

<properties>
    <spring.boot.version>2.1.1.RELEASE</spring.boot.version>
    <datastax.driver.version>3.10.2</datastax.driver.version>
</properties>

<dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-core</artifactId>
    <version>${datastax.driver.version}</version>
</dependency>
<dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-mapping</artifactId>
    <version>${datastax.driver.version}</version>
</dependency>
<dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-extras</artifactId>
    <version>${datastax.driver.version}</version>
</dependency>
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-tcnative</artifactId>
    <version>2.0.34.Final</version>
</dependency>
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-tcnative-boringssl-static</artifactId>
    <version>2.0.34.Final</version>
</dependency>
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-transport-native-epoll</artifactId>
    <version>4.1.54.Final</version>
</dependency>
I suspect you were also getting other exceptions, such as OperationTimedOutException, when the driver attempted to connect to the nodes in the list but continually failed.
The driver generates a query plan, which is the list of nodes to contact for an application query. The driver cycles through this list one node at a time until it is able to connect to one of them. Only after it has tried every node in the list, and all of them have failed, does the driver return NoHostAvailableException: All host(s) tried for query failed.
The most common causes for this are (1) network connectivity issues, or (2) nodes being down or unresponsive. There is a good chance that the cluster nodes are listening on a different IP address than the one you are connecting to.
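As a quick sanity check, the minimal sketch below (plain 3.x driver API; the contact point and port are placeholders, and any SSL or authentication settings your cluster requires would still need to be added) can confirm whether the driver can reach the nodes at all:

import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Host;
import com.datastax.driver.core.Session;

public class ConnectivityCheck {
    public static void main(String[] args) {
        // Replace the contact point and port with the values your cluster actually listens on.
        try (Cluster cluster = Cluster.builder()
                .addContactPoint("api-beta.caas.dbattery.akamai.com")
                .withPort(9042)
                .build();
             Session session = cluster.connect()) {
            // Print what the driver discovered; DOWN hosts point to connectivity or address issues.
            for (Host host : cluster.getMetadata().getAllHosts()) {
                System.out.printf("Host %s is %s%n", host.getAddress(), host.isUp() ? "UP" : "DOWN");
            }
        }
    }
}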
It sounds like you're trying this out for the first time. You might find it easier to just focus on coding your app by connecting to Astra, so you don't have to worry about the DB side of things. There's a Spring sample app you could look at that will get you up and running in a few minutes. Cheers!
I'm building a simple reactive web application (following Josh Long's tech talk). Simply put, I have reactive web, R2DBC, and H2 as dependencies.
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-webflux</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-r2dbc</artifactId>
    </dependency>
    <dependency>
        <groupId>com.h2database</groupId>
        <artifactId>h2</artifactId>
        <scope>runtime</scope>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>io.projectreactor</groupId>
        <artifactId>reactor-test</artifactId>
        <scope>test</scope>
    </dependency>
</dependencies>
So I expected Spring to configure everything for me (it does for Josh). But I get an error saying it is not able to connect to a database, and there is a suggestion asking to include H2 (which I already have). What am I doing wrong here?
Description:
Failed to configure a ConnectionFactory: 'url' attribute is not specified and no embedded database could be configured.
Reason: Failed to determine a suitable R2DBC Connection URL
Action:
Consider the following:
If you want an embedded database (H2), please put it on the classpath.
If you have database settings to be loaded from a particular profile you may need to activate it (no profiles are currently active).
OK, it was missing the r2dbc-h2 dependency. This happened because I didn't add R2DBC when I created the project with start.spring.io; I added it later, inspected the generated pom, but only copied spring-boot-starter-data-r2dbc.
<dependency>
    <groupId>io.r2dbc</groupId>
    <artifactId>r2dbc-h2</artifactId>
    <scope>runtime</scope>
</dependency>
Still a bit confusing, though. Spring Boot says it looks at the classpath and auto-configures accordingly, but it seems that sometimes it needs a specific combination of dependencies.
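For what it's worth, one quick way to see whether an R2DBC driver is actually on the classpath is to resolve a ConnectionFactory by URL. A minimal sketch (the in-memory URL is just an example) fails with "Unable to create a ConnectionFactory" when r2dbc-h2 is missing:

import io.r2dbc.spi.ConnectionFactories;
import io.r2dbc.spi.ConnectionFactory;

public class H2DriverCheck {
    public static void main(String[] args) {
        // Throws IllegalStateException when no R2DBC driver for the "h2" scheme is on the classpath.
        ConnectionFactory factory = ConnectionFactories.get("r2dbc:h2:mem:///testdb");
        System.out.println("R2DBC driver found: " + factory.getMetadata().getName());
    }
}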
I am trying to load a simple dataframe as a Hudi dataset into S3 and I am having trouble doing that. I am new to Apache Hudi, and I am trying to load the data by running the code locally on my Windows machine. All the Maven dependencies I am using, along with the code and the exceptions, are mentioned below.
inputDF.write.format("com.uber.hoodie")
  .option(HoodieWriteConfig.TABLE_NAME, tablename)
  .option(DataSourceWriteOptions.RECORDKEY_FIELD_OPT_KEY, "GameId")
  .option(DataSourceWriteOptions.PARTITIONPATH_FIELD_OPT_KEY, "operatorShortName")
  .option(DataSourceWriteOptions.PRECOMBINE_FIELD_OPT_KEY, "HandledTimestamp")
  .option(DataSourceWriteOptions.OPERATION_OPT_KEY, DataSourceWriteOptions.UPSERT_OPERATION_OPT_VAL)
  .mode(SaveMode.Append)
  .save("s3a://s3_buket/Games2")
<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk -->
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>1.11.623</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-aws -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-aws</artifactId>
    <version>3.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.2.0</version>
</dependency>
<dependency>
    <groupId>com.uber.hoodie</groupId>
    <artifactId>hoodie</artifactId>
    <version>0.4.7</version>
    <type>pom</type>
</dependency>
<!-- https://mvnrepository.com/artifact/com.uber.hoodie/hoodie-spark -->
<dependency>
    <groupId>com.uber.hoodie</groupId>
    <artifactId>hoodie-spark</artifactId>
    <version>0.4.7</version>
</dependency>
Exception in thread "main" com.uber.hoodie.exception.DatasetNotFoundException: Hoodie dataset not found in path s3a://gat-datalake-raw-dev/Games2\.hoodie
at com.uber.hoodie.exception.DatasetNotFoundException.checkValidDataset(DatasetNotFoundException.java:45)
at com.uber.hoodie.common.table.HoodieTableMetaClient.<init>(HoodieTableMetaClient.java:91)
at com.uber.hoodie.HoodieWriteClient.rollbackInflightCommits(HoodieWriteClient.java:1172)
at com.uber.hoodie.HoodieWriteClient.startCommitWithTime(HoodieWriteClient.java:1044)
at com.uber.hoodie.HoodieWriteClient.startCommit(HoodieWriteClient.java:1037)
at com.uber.hoodie.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:144)
at com.uber.hoodie.DefaultSource.createRelation(DefaultSource.scala:91)
at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:668)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:276)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:270)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:228)
at com.playngoplatform.scala.dao.DataAccessS3.writeDataToRefinedS3(DataAccessS3.scala:26)
at com.playngoplatform.scala.controller.GameAndProviderDataTransform.processData(GameAndProviderDataTransform.scala:29)
at com.playngoplatform.scala.action.GameAndProviderData$.main(GameAndProviderData.scala:10)
at com.playngoplatform.scala.action.GameAndProviderData.main(GameAndProviderData.scala)
I am not doing anything else apart from this; I am just creating a Hudi dataset directly from my Spark data source code. I can see the folder getting created at the S3 path, but nothing beyond that.
The .hoodie.properties file is shown below:
hoodie.compaction.payload.class=com.uber.hoodie.common.model.HoodieAvroPayload
hoodie.table.name=hoodie.games
hoodie.archivelog.folder=archived
hoodie.table.type=MERGE_ON_READ
Hudi is not yet mature enough to fully support Windows.
The issue is fixed by changing the file separator character used when running this on a Windows machine.
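To illustrate the separator problem (a minimal sketch; the bucket path is the one from the question), building the metadata path with File.separator on Windows produces the backslash seen in the exception, while S3/Hadoop paths must always use forward slashes:

import java.io.File;

public class PathSeparatorDemo {
    public static void main(String[] args) {
        String basePath = "s3a://s3_buket/Games2";

        // On Windows File.separator is "\", which yields the malformed key from the exception.
        String wrong = basePath + File.separator + ".hoodie";  // s3a://s3_buket/Games2\.hoodie

        // S3/Hadoop paths are "/"-separated regardless of the local OS.
        String right = basePath + "/" + ".hoodie";             // s3a://s3_buket/Games2/.hoodie

        System.out.println(wrong);
        System.out.println(right);
    }
}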
I want to use JDBC (MySQL) with Spring Cloud Config Server, but it always fails. This is what I am doing:
Spring Cloud version: Finchley.SR2
In pom.xml:
<dependencies>
    <dependency>
        <groupId>org.springframework.cloud</groupId>
        <artifactId>spring-cloud-config-server</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-jdbc</artifactId>
        <exclusions>
            <exclusion>
                <groupId>org.apache.tomcat</groupId>
                <artifactId>tomcat-jdbc</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
</dependencies>
Inside the application.config:
spring.profiles.active=jdbc
spring.datasource.url=jdbc:mysql://localhost:3306/config_db
spring.datasource.username=root
spring.datasource.password=12345
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
spring.datasource.platform=mysql
spring.cloud.config.server.jdbc.sql=SELECT `key`, `value` FROM `properties` WHERE `application`=? AND `profile`=? AND `label`=?;
spring.cloud.config.server.jdbc.order=0
spring.cloud.config.server.default-profile=production
spring.cloud.config.server.default-label=latest
Finally, when I start the server, I get the errors below:
APPLICATION FAILED TO START
Description:
Invalid config server configuration.
Action:
If you are using the git profile, you need to set a Git URI in your configuration. If you are using a native profile and have spring.cloud.config.server.bootstrap=true, you need to use a composite configuration.
I am not using git here, so why is the error about a Git URI?
I had the same issue when I used MySQL.
It seems to be an issue with the MySQL JdbcTemplate (look here).
I switched to H2 to store the configuration and it works.
I wonder if there is any workaround to use MySQL?
I got the same error when I attempted to remove the DataSourceAutoConfiguration.class on startup, i.e.
@SpringBootApplication(exclude = {DataSourceAutoConfiguration.class})
When I just used
@SpringBootApplication
everything worked as expected.
My reason for excluding the class was to stop the auto-generation of a password on startup.
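For reference, a minimal main class for a JDBC-backed config server could look like the sketch below (the class name is illustrative). It keeps DataSourceAutoConfiguration enabled so the spring.datasource.* properties are picked up, and it enables the config server explicitly:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.config.server.EnableConfigServer;

// No DataSourceAutoConfiguration exclusion, so the JDBC backend can see the DataSource.
@SpringBootApplication
@EnableConfigServer
public class ConfigServerApplication {
    public static void main(String[] args) {
        SpringApplication.run(ConfigServerApplication.class, args);
    }
}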
I'm seeing an issue with the Selenium remote driver when I execute the script with the HtmlUnit driver.
Note 1: The same script works without any issue when I run it with the Firefox driver.
Note 2: My browser has a security authentication process for whatever site I open; I'm not sure if that plays any role in this.
I have observed that the selenium-remote-driver entry under Maven shows a slightly different icon in the left pane.
I feel it is a JAR loading issue.
I tried to put the selenium-remote-driver JAR manually into the .m2 repository.
Error message:
Exception in thread "main" java.lang.NoClassDefFoundError: org/openqa/selenium/remote/SessionNotFoundException
at TestPackage.titleNUrlCheckingTest.main(titleNUrlCheckingTest.java:16)
Caused by: java.lang.ClassNotFoundException: org.openqa.selenium.remote.SessionNotFoundException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 1 more
You need to use the latest version; note the change of artifactId from the old versions.
<dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>htmlunit-driver</artifactId>
    <version>2.26</version>
</dependency>
which depends on
selenium-api 3.3.1
Update:
Your pom.xml works with a simple test case of HtmlUnitDriver, but there is a potential conflict of versions; you should exclude HtmlUnitDriver 2.24 from selenium-java 3.3.1:
<dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>selenium-java</artifactId>
    <version>3.3.1</version>
    <exclusions>
        <exclusion>
            <groupId>org.seleniumhq.selenium</groupId>
            <artifactId>htmlunit-driver</artifactId>
        </exclusion>
    </exclusions>
</dependency>
Also, try removing all other Selenium dependencies and keeping only htmlunit-driver; all needed dependencies are handled automatically by Maven.
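Once the versions are aligned, a minimal smoke test like the following sketch (the URL is a placeholder) should run without the NoClassDefFoundError:

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.htmlunit.HtmlUnitDriver;

public class TitleNUrlCheck {
    public static void main(String[] args) {
        // true enables JavaScript support; adjust as needed.
        WebDriver driver = new HtmlUnitDriver(true);
        try {
            driver.get("https://example.com"); // placeholder URL
            System.out.println("Title: " + driver.getTitle());
            System.out.println("URL:   " + driver.getCurrentUrl());
        } finally {
            driver.quit();
        }
    }
}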
Please update your pom.xml with the latest version of the htmlunit-driver dependency:
<dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>htmlunit-driver</artifactId>
    <version>2.32.1</version>
</dependency>
and remove it if you have something like:
<dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>selenium-htmlunit-driver</artifactId>
    <version>2.52.0</version>
</dependency>
and update the project. This should resolve your exception.
Reference: https://github.com/SeleniumHQ/selenium/issues/4930
I am trying to use GlassFish as an embedded server in my EJB 3.1 project.
Below are my Maven dependencies.
When I run my tests, it fails to deploy the EJB modules.
Do I need to set javaee.home or some other variable?
<dependency>
    <groupId>org.glassfish.extras</groupId>
    <artifactId>glassfish-embedded-all</artifactId>
    <version>3.1-SNAPSHOT</version>
    <scope>test</scope>
    <type>jar</type>
</dependency>
<dependency>
    <groupId>org.glassfish.extras</groupId>
    <artifactId>glassfish-embedded-static-shell</artifactId>
    <version>3.1-SNAPSHOT</version>
    <scope>test</scope>
    <type>jar</type>
</dependency>
<dependency>
    <groupId>javax</groupId>
    <artifactId>javaee-api</artifactId>
    <version>6.0</version>
    <scope>provided</scope>
</dependency>
The exception is:
Caused by: org.omg.CORBA.DATA_CONVERSION: vmcid: SUN minor code: 214 completed: No
...
Caused by: java.lang.IllegalStateException: java.lang.RuntimeException: java.util.MissingResourceException: Can't find resource for bundle java.util.PropertyResourceBundle, key iiop.cannot_find_keyalias
No, you don't even need glassfish-embedded-static-shell.jar.
If you want to use EJB 3.1 only, the glassfish-embedded-all JAR is enough.
If you want to access JPA data sources from EJB 3, then you need a domain.xml file on the classpath.
You will need to pass the property "org.glassfish.ejb.embedded.glassfish.installation.root" while creating the EJB container in client code (like EJBContainer.createEJBContainer(prop)). The value of this property should be a folder name (e.g. glassfish).
The folder should contain the domains\domain1\config\domain.xml file.
You can download and install GlassFish v3 and copy this file from the installation.
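Putting that together, creating the embedded container could look roughly like this sketch (the folder name ./glassfish is an example; it must contain domains/domain1/config/domain.xml):

import java.util.HashMap;
import java.util.Map;
import javax.ejb.embeddable.EJBContainer;

public class EmbeddedGlassFishBootstrap {
    public static void main(String[] args) {
        Map<String, Object> props = new HashMap<String, Object>();
        // Point the embedded runtime at a folder containing domains/domain1/config/domain.xml.
        props.put("org.glassfish.ejb.embedded.glassfish.installation.root", "./glassfish");

        EJBContainer container = EJBContainer.createEJBContainer(props);
        try {
            // Look up and exercise your EJBs via container.getContext() here.
            System.out.println("Embedded container started");
        } finally {
            container.close();
        }
    }
}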