PDFBox error - java.lang.IllegalAccessError

We are using PDFBox 1.8.9 to create PDF documents, and we are getting the exception below on our testing server:
java.lang.IllegalAccessError: tried to access method org.apache.pdfbox.pdmodel.graphics.color.PDDeviceGray.<init>()V from class org.apache.pdfbox.pdmodel.edit.PDPageContentStream
at org.apache.pdfbox.pdmodel.edit.PDPageContentStream.<init>(PDPageContentStream.java:74)
at org.apache.pdfbox.pdmodel.edit.PDPageContentStream.<init>(PDPageContentStream.java:173)
at org.apache.pdfbox.pdmodel.edit.PDPageContentStream.<init>(PDPageContentStream.java:158)
at com.fedex.cal.clmc.transaction.clms.claims.LetterToPDFTransaction.convertToPDF(LetterToPDFTransaction.java:96)
The exception is triggered at line #96 of our source code, which contains the following:
PDPageContentStream contentStream = new PDPageContentStream(document, page);
In our local environment we are not facing this issue; everything works fine there. We are using Maven, and these are the POM dependencies:
<dependency>
    <groupId>fedex.cxs.commonlib</groupId>
    <artifactId>fontbox</artifactId>
    <version>1.8.9</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>fedex.cxs.commonlib</groupId>
    <artifactId>pdfbox</artifactId>
    <version>1.8.9</version>
    <scope>compile</scope>
</dependency>
Any suggestions are highly appreciated.
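A check that may help narrow this down (a sketch using only plain JDK APIs; an IllegalAccessError at class initialization often points to two different PDFBox builds on the classpath) is to print which JAR each class from the stack trace was loaded from on the server:

import org.apache.pdfbox.pdmodel.edit.PDPageContentStream;
import org.apache.pdfbox.pdmodel.graphics.color.PDDeviceGray;

public class PdfBoxJarCheck {
    public static void main(String[] args) {
        // Two different locations here would explain the IllegalAccessError.
        System.out.println(PDPageContentStream.class
                .getProtectionDomain().getCodeSource().getLocation());
        System.out.println(PDDeviceGray.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}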

Related

Adding the Azure Key Vault library to an Atlassian Confluence plugin pom.xml

I am trying to combine these two tutorials - the Confluence Hello World Macro and the Azure Key Vault quick start:
https://developer.atlassian.com/server/framework/atlassian-sdk/create-a-confluence-hello-world-macro/
https://learn.microsoft.com/en-us/azure/key-vault/secrets/quick-create-java?tabs=azure-cli
After adding the two Azure dependencies to the pom.xml of the Maven project and running atlas-mvn clean package, I received an error message about three banned dependencies.
I looked up the newest Azure packages on the Maven portal; after updating, the banned dependencies were reduced to one:
Found Banned Dependency: org.slf4j:slf4j-api:jar:1.7.25
Then I added exclusions to the dependency section (see the pom.xml excerpt below).
With that, the build ran successfully; however, the Confluence plugin now produces a runtime error:
java.lang.NoClassDefFoundError
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
at com.azure.security.keyvault.secrets.SecretClientBuilder.<init>(SecretClientBuilder.java:110)
Can you please help? How can I achieve this?
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-security-keyvault-secrets</artifactId>
    <version>4.3.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-identity</artifactId>
    <version>1.4.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>
Error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
at com.azure.security.keyvault.secrets.SecretClientBuilder.<init>(SecretClientBuilder.java:110)
The above error indicates that the JVM is not able to find the org/slf4j/Logger class on your application's classpath. The simplest cause of this error is a missing slf4j JAR file.
If the problem is caused by the missing slf4j JAR, you can fix it by adding a relevant version of it to your classpath.
Which version of the JAR you should add depends on the application; use the latest one that works with it.
In Maven, you can add the following dependency to your pom.xml file to download slf4j-api:
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.36</version>
</dependency>
Reference:
java.lang.NoClassDefFoundError: org.slf4j.LoggerFactory - Stack Overflow

Apache Hudi throwing Dataset not found exception when storing to S3

I am trying to load a simple dataframe into S3 as a Hudi dataset and I am having trouble doing that. I am new to Apache Hudi, and I am running the code locally on my Windows machine. The code, all the Maven dependencies I am using, and the resulting exception are below.
inputDF.write.format("com.uber.hoodie")
  .option(HoodieWriteConfig.TABLE_NAME, tablename)
  .option(DataSourceWriteOptions.RECORDKEY_FIELD_OPT_KEY, "GameId")
  .option(DataSourceWriteOptions.PARTITIONPATH_FIELD_OPT_KEY, "operatorShortName")
  .option(DataSourceWriteOptions.PRECOMBINE_FIELD_OPT_KEY, "HandledTimestamp")
  .option(DataSourceWriteOptions.OPERATION_OPT_KEY, DataSourceWriteOptions.UPSERT_OPERATION_OPT_VAL)
  .mode(SaveMode.Append)
  .save("s3a://s3_buket/Games2")
<!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk -->
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>1.11.623</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-aws -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-aws</artifactId>
    <version>3.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.2.0</version>
</dependency>
<dependency>
    <groupId>com.uber.hoodie</groupId>
    <artifactId>hoodie</artifactId>
    <version>0.4.7</version>
    <type>pom</type>
</dependency>
<!-- https://mvnrepository.com/artifact/com.uber.hoodie/hoodie-spark -->
<dependency>
    <groupId>com.uber.hoodie</groupId>
    <artifactId>hoodie-spark</artifactId>
    <version>0.4.7</version>
</dependency>
Exception in thread "main" com.uber.hoodie.exception.DatasetNotFoundException: Hoodie dataset not found in path s3a://gat-datalake-raw-dev/Games2\.hoodie
at com.uber.hoodie.exception.DatasetNotFoundException.checkValidDataset(DatasetNotFoundException.java:45)
at com.uber.hoodie.common.table.HoodieTableMetaClient.<init>(HoodieTableMetaClient.java:91)
at com.uber.hoodie.HoodieWriteClient.rollbackInflightCommits(HoodieWriteClient.java:1172)
at com.uber.hoodie.HoodieWriteClient.startCommitWithTime(HoodieWriteClient.java:1044)
at com.uber.hoodie.HoodieWriteClient.startCommit(HoodieWriteClient.java:1037)
at com.uber.hoodie.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:144)
at com.uber.hoodie.DefaultSource.createRelation(DefaultSource.scala:91)
at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:668)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:276)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:270)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:228)
at com.playngoplatform.scala.dao.DataAccessS3.writeDataToRefinedS3(DataAccessS3.scala:26)
at com.playngoplatform.scala.controller.GameAndProviderDataTransform.processData(GameAndProviderDataTransform.scala:29)
at com.playngoplatform.scala.action.GameAndProviderData$.main(GameAndProviderData.scala:10)
at com.playngoplatform.scala.action.GameAndProviderData.main(GameAndProviderData.scala)
I am not doing anything else apart from this; I am just creating a Hudi dataset directly from my Spark data source code. I can see the folder being created at the S3 path, but nothing beyond that.
The .hoodie.properties file is shown below:
hoodie.compaction.payload.class=com.uber.hoodie.common.model.HoodieAvroPayload
hoodie.table.name=hoodie.games
hoodie.archivelog.folder=archived
hoodie.table.type=MERGE_ON_READ
Hudi's support for Windows is not fully mature.
The issue is fixed by changing the file-separator character used to build the table path when running on a Windows machine: the "Games2\.hoodie" in the exception shows a backslash where the S3 path needs a forward slash.
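A minimal sketch of the idea (the helper below is hypothetical, not Hudi API): join S3 paths with an explicit '/' rather than File.separator, which is '\' on Windows and produces the broken "...Games2\.hoodie" path from the exception.

import java.io.File;

public class HudiPathCheck {
    // Hypothetical helper: always join S3 object paths with '/'.
    static String hudiBasePath(String bucket, String table) {
        return "s3a://" + bucket + "/" + table;
    }

    public static void main(String[] args) {
        System.out.println("File.separator = " + File.separator); // '\' on Windows
        System.out.println(hudiBasePath("s3_buket", "Games2"));   // s3a://s3_buket/Games2
    }
}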

Selenium remote driver issue with HtmlUnit driver

I'm seeing an issue with Selenium remote driver when I execute the script with the HtmlUnit driver.
Note 1: the same script works without any issue when I run it with the Firefox driver.
Note 2: my browser has a security-authentication step for whatever site I open; I am not sure whether that plays any role in this.
I have observed that the selenium-remote-driver entry under Maven shows a slightly different icon in the left pane, so I suspect a JAR-loading issue.
I tried putting the selenium-remote-driver JAR into the .m2 repository manually.
Error message:
Exception in thread "main" java.lang.NoClassDefFoundError: org/openqa/selenium/remote/SessionNotFoundException
at TestPackage.titleNUrlCheckingTest.main(titleNUrlCheckingTest.java:16)
Caused by: java.lang.ClassNotFoundException: org.openqa.selenium.remote.SessionNotFoundException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 1 more
You need to use the latest version; note the change of artifactId from the old versions.
<dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>htmlunit-driver</artifactId>
    <version>2.26</version>
</dependency>
which depends on selenium-api 3.3.1.
Update: your pom.xml works with a simple HtmlUnitDriver test case, but there is a potential conflict of versions; you should exclude the htmlunit-driver 2.24 that selenium-java 3.3.1 pulls in:
<dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>selenium-java</artifactId>
    <version>3.3.1</version>
    <exclusions>
        <exclusion>
            <groupId>org.seleniumhq.selenium</groupId>
            <artifactId>htmlunit-driver</artifactId>
        </exclusion>
    </exclusions>
</dependency>
Alternatively, try removing all Selenium dependencies and keeping only htmlunit-driver; all needed dependencies are then handled automatically by Maven.
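With only htmlunit-driver on the classpath, a minimal smoke test (a sketch mirroring the test class from the question; the URL is just an example) looks like this:

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.htmlunit.HtmlUnitDriver;

public class TitleNUrlCheckingTest {
    public static void main(String[] args) {
        WebDriver driver = new HtmlUnitDriver(true); // true enables JavaScript
        driver.get("https://example.com");
        System.out.println(driver.getTitle() + " @ " + driver.getCurrentUrl());
        driver.quit();
    }
}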
Please update your POM file with the latest version of the htmlunit-driver dependency:
<dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>htmlunit-driver</artifactId>
    <version>2.32.1</version>
</dependency>
and remove the old artifact if you have something like:
<dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>selenium-htmlunit-driver</artifactId>
    <version>2.52.0</version>
</dependency>
and then update the project. This should resolve the exception.
Reference: https://github.com/SeleniumHQ/selenium/issues/4930

FOP giving NoSuchMethodError when font auto-detect is enabled

I'm getting the below error when generating a PDF with Apache FOP:
FopFactory fopFactory = FopFactory.newInstance(new File("/Users/vinurip/cloud/Stripe/Full/pdfboxtut/src/main/resources/fop.xconf"));
OutputStream out = new BufferedOutputStream(new FileOutputStream(new File(outFile)));
Fop fop = fopFactory.newFop(MimeConstants.MIME_PDF, out);
outFile is an empty PDF file location; fop.xconf is the same one that ships with the FOP source.
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.fontbox.cff.CFFFont.getProperty(Ljava/lang/String;)Ljava/lang/Object;
at org.apache.fop.fonts.truetype.OTFFile.readName(OTFFile.java:134)
at org.apache.fop.fonts.truetype.OpenFont.readFont(OpenFont.java:740)
at org.apache.fop.fonts.truetype.OFFontLoader.read(OFFontLoader.java:109)
at org.apache.fop.fonts.truetype.OFFontLoader.read(OFFontLoader.java:93)
at org.apache.fop.fonts.FontLoader.getFont(FontLoader.java:124)
at org.apache.fop.fonts.FontLoader.loadFont(FontLoader.java:108)
at org.apache.fop.fonts.autodetect.FontInfoFinder.find(FontInfoFinder.java:254)
at org.apache.fop.fonts.FontAdder.add(FontAdder.java:63)
at org.apache.fop.fonts.FontDetectorFactory$DefaultFontDetector.detect(FontDetectorFactory.java:105)
at org.apache.fop.fonts.FontManager.autoDetectFonts(FontManager.java:229)
at org.apache.fop.fonts.DefaultFontConfigurator.configure(DefaultFontConfigurator.java:82)
at org.apache.fop.render.PrintRendererConfigurator.getCustomFontCollection(PrintRendererConfigurator.java:147)
at org.apache.fop.render.PrintRendererConfigurator.setupFontInfo(PrintRendererConfigurator.java:127)
at org.apache.fop.render.intermediate.IFUtil.setupFonts(IFUtil.java:170)
at org.apache.fop.render.intermediate.IFRenderer.setupFontInfo(IFRenderer.java:187)
at org.apache.fop.area.RenderPagesModel.<init>(RenderPagesModel.java:75)
at org.apache.fop.area.AreaTreeHandler.setupModel(AreaTreeHandler.java:135)
at org.apache.fop.area.AreaTreeHandler.<init>(AreaTreeHandler.java:105)
at org.apache.fop.render.RendererFactory.createFOEventHandler(RendererFactory.java:350)
at org.apache.fop.fo.FOTreeBuilder.<init>(FOTreeBuilder.java:107)
at org.apache.fop.apps.Fop.createDefaultHandler(Fop.java:104)
at org.apache.fop.apps.Fop.<init>(Fop.java:78)
at org.apache.fop.apps.FOUserAgent.newFop(FOUserAgent.java:182)
at org.apache.fop.apps.FopFactory.newFop(FopFactory.java:220)
at foptest.fo2PDF(foptest.java:73)
at foptest.main(foptest.java:57)
This issue seems to occur only when font auto-detect is enabled.
You can find some info using search engines, e.g. https://qnalist.com/questions/6434450/pdfbox-2-0-and-batik, although these are not very useful...
My workaround was using <directory recursive="true">/usr/share/fonts</directory> instead of <auto-detect />.
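In context, the relevant part of fop.xconf would look roughly like this (a sketch; /usr/share/fonts is just an example directory):

<fop version="1.0">
    <renderers>
        <renderer mime="application/pdf">
            <fonts>
                <!-- explicit directory scan instead of <auto-detect/> -->
                <directory recursive="true">/usr/share/fonts</directory>
            </fonts>
        </renderer>
    </renderers>
</fop>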
I ran into this issue because of a Maven dependency mismatch between Apache FOP and Apache PDFBox/FontBox.
I resolved it by migrating my project's pom.xml to the latest dependency versions as of this writing:
<dependency>
    <groupId>org.apache.xmlgraphics</groupId>
    <artifactId>fop</artifactId>
    <version>2.2</version>
</dependency>
<dependency>
    <groupId>org.apache.pdfbox</groupId>
    <artifactId>pdfbox</artifactId>
    <version>2.0.9</version>
</dependency>
<dependency>
    <groupId>org.apache.pdfbox</groupId>
    <artifactId>fontbox</artifactId>
    <version>2.0.9</version>
</dependency>
To migrate to Apache FOP 2.2, follow the embedding guidelines and migration guidelines.

NoClassDefFoundError when using the Boxable plugin

I am using the Boxable plugin with PDFBox and I am trying to create a table. I am getting this error:
2015-09-09T10:36:52.453+0200|Severe: java.lang.NoClassDefFoundError: org/apache/pdfbox/pdmodel/edit/PDPageContentStream
at the line of code:
BaseTable table = new BaseTable(yStart,yStartNewPage, bottomMargin, tableWidth, margin, doc, page, true, drawContent);
Here is the part of pom.xml describing the dependencies I am using:
<dependency>
    <groupId>org.apache.pdfbox</groupId>
    <artifactId>pdfbox</artifactId>
    <version>2.0.0-SNAPSHOT</version>
</dependency>
<dependency>
    <groupId>com.github.dhorions</groupId>
    <artifactId>boxable</artifactId>
    <version>1.2</version>
</dependency>
Is there a bug in the current version of these dependencies, or am I missing something?
Thank you very much for any help.
Remove this:
<dependency>
    <groupId>org.apache.pdfbox</groupId>
    <artifactId>pdfbox</artifactId>
    <version>2.0.0-SNAPSHOT</version>
</dependency>
2.0 is an unreleased version still in development, and it has a different API. Boxable declares its dependencies in its own pom.xml; it currently requests PDFBox 1.8.8. (That is not the latest version, but I don't think it matters for simple PDF creation.)
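For context, the class from the NoClassDefFoundError moved between major versions, which is why mixing Boxable 1.2 with a 2.0 snapshot fails: PDFBox 2.0 relocated PDPageContentStream from org.apache.pdfbox.pdmodel.edit to org.apache.pdfbox.pdmodel. A small probe (a sketch using plain JDK reflection) shows which layout is on your classpath:

public class PdfBoxLayoutCheck {
    public static void main(String[] args) {
        probe("org.apache.pdfbox.pdmodel.edit.PDPageContentStream"); // PDFBox 1.8.x location
        probe("org.apache.pdfbox.pdmodel.PDPageContentStream");      // PDFBox 2.0.x location
    }

    static void probe(String name) {
        try {
            Class<?> c = Class.forName(name);
            System.out.println(name + " -> "
                    + c.getProtectionDomain().getCodeSource().getLocation());
        } catch (ClassNotFoundException e) {
            System.out.println(name + " -> not on classpath");
        }
    }
}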