I have a requirement to read data from a database and analyse the data using Pig.
I have written a UDF in Java, referring to the following link.
register /tmp/UDFJars/CassandraUDF_1-0.0.1-SNAPSHOT-jar-with-dependencies.jar;
A = Load '/user/sampleFile.txt' using udf.DBLoader('10.xx.xxx.4','username','password','select * from customer limit 10') as (f1 : chararray);
DUMP A;
package udf;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.mapreduce.InputFormat;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.pig.LoadFunc;
import org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigSplit;
import org.apache.pig.data.Tuple;
import org.apache.pig.data.TupleFactory;
import com.data.ConnectionCassandra;
import com.datastax.driver.core.ResultSet;
import com.datastax.driver.core.Row;
import com.datastax.driver.core.Session;
public class DBLoader extends LoadFunc {
private final Log log = LogFactory.getLog(getClass());
Session session;
private ArrayList mProtoTuple = null;
private String jdbcURL;
private String user;
private String pass;
private int count = 0;
private String query;
ResultSet result;
List<Row> rows;
int colSize;
protected TupleFactory mTupleFactory = TupleFactory.getInstance();
public DBLoader() {
}
public DBLoader(String jdbcURL, String user, String pass, String query) {
this.jdbcURL = jdbcURL;
this.user = user;
this.pass = pass;
this.query = query;
}
@Override
public InputFormat getInputFormat() throws IOException {
log.info("Inside InputFormat");
// TODO Auto-generated method stub
try {
return new TextInputFormat();
} catch (Exception exception) {
log.error(exception.getMessage());
log.error(exception.fillInStackTrace());
throw new IOException();
}
}
@Override
public Tuple getNext() throws IOException {
log.info("Inside get Next");
Row row = rows.get(count);
if (row != null) {
mProtoTuple = new ArrayList<Object>();
for (int colNum = 0; colNum < colSize; colNum++) {
mProtoTuple.add(row.getObject(colNum));
}
} else {
return null;
}
Tuple t = mTupleFactory.newTuple(mProtoTuple);
mProtoTuple.clear();
return t;
}
@Override
public void prepareToRead(RecordReader arg0, PigSplit arg1) throws IOException {
log.info("Inside Prepare to Read");
session = null;
if (query == null) {
throw new IOException("SQL Insert command not specified");
}
if (user == null || pass == null) {
log.info("Creating Session with user name and password as: " + user + " : " + pass);
session = ConnectionCassandra.connectToCassandra1(jdbcURL, user, pass);
log.info("Session Created");
} else {
session = ConnectionCassandra.connectToCassandra1(jdbcURL, user, pass);
}
log.info("Executing Query " + query);
result = session.execute(query);
log.info("Query Executed :" + query);
rows = result.all();
count = 0;
colSize = result.getColumnDefinitions().asList().size();
}
@Override
public void setLocation(String location, Job job) throws IOException {
log.info("Inside Set Location");
try {
FileInputFormat.setInputPaths(job, location);
} catch (Exception exception) {
log.info("Some thing went wrong : " + exception.getMessage());
log.debug(exception);
}
}
}
Above are my Pig script and Java code.
Here /user/sampleFile.txt is a dummy file with no data.
I am getting the following exception:
Pig Stack Trace
ERROR 1066: Unable to open iterator for alias A
org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias A
at org.apache.pig.PigServer.openIterator(PigServer.java:892)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:774)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:372)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:198)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:173)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
at org.apache.pig.Main.run(Main.java:484)
at org.apache.pig.Main.main(Main.java:158)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.io.IOException: Job terminated with anomalous status FAILED
at org.apache.pig.PigServer.openIterator(PigServer.java:884)
... 13 more
Vivek! Do you even get into prepareToRead? (I see you did some logging, so it would be nice to know what you actually have in the log.) It would also be really helpful to provide the full stack trace, as I can see the underlying exception is missing.
Just some thoughts: I have never tried writing a LoadFunc without implementing my own InputFormat and RecordReader. TextInputFormat checks for file existence and size (and creates a number of InputSplits based on file size(s)), so if your dummy file is empty there is a good chance that no InputSplits are produced, or that a zero-length InputSplit is produced. A zero-length split may be what causes Pig to throw that exception. So the suggestion is to implement your own InputFormat (it's actually pretty easy). Also, as a quick experiment, try
set pig.splitCombination false
It probably won't help, but it's easy to try.
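As a rough illustration of the "own InputFormat" suggestion, here is a minimal sketch (untested; the DummyInputFormat and DummySplit names are made up for illustration) of an InputFormat that always yields a single non-empty dummy split, so one mapper runs and the LoadFunc can pull its rows from Cassandra regardless of what the placeholder input file contains:
import java.io.DataInput;
import java.io.DataOutput;
import java.util.Collections;
import java.util.List;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.InputFormat;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
public class DummyInputFormat extends InputFormat<NullWritable, NullWritable> {
    // A split that carries no real data; length is non-zero so it is not skipped.
    public static class DummySplit extends InputSplit implements Writable {
        @Override public long getLength() { return 1; }
        @Override public String[] getLocations() { return new String[0]; }
        @Override public void write(DataOutput out) { }
        @Override public void readFields(DataInput in) { }
    }
    @Override
    public List<InputSplit> getSplits(JobContext context) {
        // Always exactly one split, regardless of the placeholder input file.
        return Collections.<InputSplit>singletonList(new DummySplit());
    }
    @Override
    public RecordReader<NullWritable, NullWritable> createRecordReader(InputSplit split, TaskAttemptContext context) {
        // The LoadFunc pulls its rows from Cassandra itself, so the reader only has
        // to report a single dummy record to keep the mapper alive.
        return new RecordReader<NullWritable, NullWritable>() {
            private boolean done = false;
            @Override public void initialize(InputSplit s, TaskAttemptContext c) { }
            @Override public boolean nextKeyValue() { if (done) { return false; } done = true; return true; }
            @Override public NullWritable getCurrentKey() { return NullWritable.get(); }
            @Override public NullWritable getCurrentValue() { return NullWritable.get(); }
            @Override public float getProgress() { return done ? 1.0f : 0.0f; }
            @Override public void close() { }
        };
    }
}
With something like this, getInputFormat() in the LoadFunc would return new DummyInputFormat() instead of TextInputFormat().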
I realized that the image files of some people are oriented landscape, whereas all of them should be portrait, in our database. So I need to determine which files have a width greater than their height.
I wonder if there is any method to get the height and width of BLOB-type columns, something like the dbms_lob.getlength function, which returns the number of characters (bytes) in a CLOB/BLOB column.
A BLOB is binary data - it does not intrinsically have a format (such as JPEG/PNG/BMP) and as such is not implicitly an image, so asking what its width/height is does not make sense.
What you need to do is take the binary data (a BLOB) in its (unknown) binary format (i.e. JPG/PNG/BMP/etc.) and use an image reader to read the dimensions from the file's metadata (so you don't have to load the entire file).
You could write a Java class with a function that takes a BLOB/binary stream and the image format, then uses ImageIO or ImageReader & ImageInputStream (for example, as the first hits I found on reading images from binary data; there will be other solutions/libraries) to extract the dimensions from the header [1, 2] and return them.
Then, to load that class into the Oracle database, use the loadjava utility or CREATE OR REPLACE AND COMPILE JAVA SOURCE (example for uncompressing zipped strings stored in an Oracle BLOB).
Then write an SQL function to wrap the Java implementation so that it passes the BLOB to the Java function and returns the width or the height (or a struct containing both values).
Java Code:
import java.io.IOException;
import java.sql.Blob;
import java.sql.SQLException;
import java.util.Iterator;
import javax.imageio.ImageIO;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;
import javax.imageio.stream.MemoryCacheImageInputStream;
public class ImageMetaDataReader {
public static Integer getHeight(
final Blob blob,
final String fileType
) throws SQLException
{
Iterator<ImageReader> iter = ImageIO.getImageReadersBySuffix( fileType );
while(iter.hasNext())
{
ImageReader reader = iter.next();
try
{
ImageInputStream stream = new MemoryCacheImageInputStream( blob.getBinaryStream() );
reader.setInput(stream);
return reader.getHeight(reader.getMinIndex());
} catch ( IOException e ) {
} finally {
reader.dispose();
}
}
return null;
}
public static Integer getWidth(
final Blob blob,
final String fileType
) throws SQLException
{
Iterator<ImageReader> iter = ImageIO.getImageReadersBySuffix( fileType );
while(iter.hasNext())
{
ImageReader reader = iter.next();
try
{
ImageInputStream stream = new MemoryCacheImageInputStream( blob.getBinaryStream() );
reader.setInput(stream);
return reader.getWidth(reader.getMinIndex());
} catch ( IOException e ) {
} finally {
reader.dispose();
}
}
return null;
}
}
Testing:
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.Blob;
import java.sql.SQLException;
public class MockBlob implements Blob {
private final File file;
public MockBlob(
final File file
)
{
this.file = file;
}
@Override
public long length() throws SQLException {
return file.length();
}
@Override
public InputStream getBinaryStream() throws SQLException {
try
{
return new FileInputStream( this.file );
}
catch( FileNotFoundException e )
{
throw new SQLException( e.getMessage() );
}
}
@Override public byte[] getBytes(long pos, int length) throws SQLException { throw new UnsupportedOperationException("Not supported yet."); }
@Override public long position(byte[] pattern, long start) throws SQLException { throw new UnsupportedOperationException("Not supported yet."); }
@Override public long position(Blob pattern, long start) throws SQLException { throw new UnsupportedOperationException("Not supported yet."); }
@Override public int setBytes(long pos, byte[] bytes) throws SQLException { throw new UnsupportedOperationException("Not supported yet."); }
@Override public int setBytes(long pos, byte[] bytes, int offset, int len) throws SQLException { throw new UnsupportedOperationException("Not supported yet."); }
@Override public OutputStream setBinaryStream(long pos) throws SQLException { throw new UnsupportedOperationException("Not supported yet."); }
@Override public void truncate(long len) throws SQLException { throw new UnsupportedOperationException("Not supported yet."); }
@Override public void free() throws SQLException { throw new UnsupportedOperationException("Not supported yet."); }
@Override public InputStream getBinaryStream(long pos, long length) throws SQLException { throw new UnsupportedOperationException("Not supported yet."); }
}
import java.io.File;
import java.sql.Blob;
import java.sql.SQLException;
public class ImageTest {
public static void main(
final String[] args
) throws SQLException
{
File file = new File( "/path/to/test.png" );
Blob blob = new MockBlob( file );
System.out.println(
"height: "
+ ImageMetaDataReader.getHeight( blob, "png" )
);
System.out.println(
"width: "
+ ImageMetaDataReader.getWidth( blob, "png" )
);
}
}
SQL:
CREATE AND COMPILE JAVA SOURCE NAMED "ImageMetaDataReader" AS
<the java code from above>
/
CREATE FUNCTION getImageHeight(
file IN BLOB,
fileType IN VARCHAR2
) RETURN NUMBER
AS LANGUAGE JAVA
name 'ImageMetaDataReader.getHeight( java.sql.Blob, String) return Integer';
/
CREATE FUNCTION getImageWidth(
file IN BLOB,
fileType IN VARCHAR2
) RETURN NUMBER
AS LANGUAGE JAVA
name 'ImageMetaDataReader.getWidth( java.sql.Blob, String) return Integer';
/
(The code is untested in an Oracle database as I don't have an instance to hand at the moment.)
@MT0's answer is the way to go assuming this is a process that needs to work going forward.
Assuming you're not yet on 19.1 and if this is just an ad hoc/ short term requirement, you can create an ORDImage from the BLOB. Assuming that the image is in one of the file types that ORDImage understands (which is, realistically, going to include basically anything that a normal user would be uploading), the constructor can parse the image and extract properties like the height and width that you can then query. It also provides a variety of methods to manipulate the image (scaling/ rotating/ etc.)
Unfortunately, ORDImage has been deprecated in Oracle 18 and I believe it has been removed in Oracle 19 so it's not something that you'd want to use any longer for writing code that you're going to be relying on permanently. If you're just trying to get an ad hoc report or make a short term data fix, though, it's probably easier than finding, loading, and using a Java image processing library.
I am trying to query a table that has a LONG RAW column containing text. I need to export this data to a flat file. For a given id, I see there are 14 rows in the table. I am fetching the data over a JDBC connection, and when reading it from the ResultSet I get an ArrayIndexOutOfBoundsException at the 13th row. I am not sure why this happens.
The data in the LONG RAW column could be large, so I suspect it is not able to fetch all the data that is present, but I might be wrong. I can't find much information on when an ArrayIndexOutOfBoundsException could occur in this scenario. The complete code is below:
import java.io.*;
import java.sql.*;
import java.util.ArrayList;
public class TestMain {
private static String URL = "jdbc:oracle:thin:@localhost:8080:xe";
private static String USER_NAME = "scott";
private static String PASSWORD = "tiger";
public static void main(String[] args) throws SQLException, IOException {
Connection newConnection = getNewConnection();
BufferedWriter bw = new BufferedWriter(new FileWriter("./extractedFile/RawDataFile.txt"));
PreparedStatement extractableRowCount = getExtractableRowCount(newConnection, (long)34212);
ResultSet foundCountRs = extractableRowCount.executeQuery();
foundCountRs.next();
int foundCount = foundCountRs.getInt(1);
// here I get 14 as the count
System.out.println("Available rows for id:: 34212 are "+foundCount);
foundCountRs.close();
extractableRowCount.close();
PreparedStatement fetchBinaryQueryStatement = getExtractableRow(newConnection, (long)34212);
ResultSet fetchedRowsRs = fetchBinaryQueryStatement.executeQuery();
int i=0;
while (fetchedRowsRs.next()) {
i++;
// I see outputs up to i = 13, and then I get an ArrayIndexOutOfBoundsException
System.out.println("i = " + i);
String userName = fetchedRowsRs.getString("user_name");
InputStream savedTextData = fetchedRowsRs.getBinaryStream("saved_text");
bw.write(userName + ":: ");
int len = 0;
if (savedTextData != null) {
while ((len = savedTextData.read()) != -1) {
bw.write((char) len);
bw.flush();
}
}
fetchedRowsRs.close();
fetchBinaryQueryStatement.close();
}
bw.close();
}
public static Connection getNewConnection() throws SQLException {
DriverManager.registerDriver(new oracle.jdbc.OracleDriver());
return DriverManager.getConnection(URL, USER_NAME, PASSWORD);
}
public static PreparedStatement getExtractableRow(Connection connection, Long id) throws SQLException {
PreparedStatement statement = connection.prepareStatement("SELECT user_name, saved_text FROM user_email_text_data where id = ?");
statement.setLong(1, id);
return statement;
}
public static PreparedStatement getExtractableRowCount(Connection connection, Long id) throws SQLException {
PreparedStatement statement = connection.prepareStatement("SELECT count(1) FROM user_email_text_data where id = ?");
statement.setLong(1, id);
return statement;
}
}
Full stack trace of error:
Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: 8
at oracle.jdbc.driver.T4CMAREngineNIO.buffer2Value(T4CMAREngineNIO.java:814)
at oracle.jdbc.driver.T4CMAREngineNIO.unmarshalUB2(T4CMAREngineNIO.java:577)
at oracle.jdbc.driver.T4CMAREngineNIO.unmarshalSB2(T4CMAREngineNIO.java:557)
at oracle.jdbc.driver.T4CMAREngine.processIndicator(T4CMAREngine.java:1573)
at oracle.jdbc.driver.T4CMarshaller$StreamMarshaller.unmarshalOneRow(T4CMarshaller.java:179)
at oracle.jdbc.driver.T4CLongRawAccessor.unmarshalOneRow(T4CLongRawAccessor.java:159)
at oracle.jdbc.driver.T4CTTIrxd.unmarshal(T4CTTIrxd.java:1526)
at oracle.jdbc.driver.T4CTTIrxd.unmarshal(T4CTTIrxd.java:1289)
at oracle.jdbc.driver.T4C8Oall.readRXD(T4C8Oall.java:850)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:543)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:252)
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:612)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:226)
at oracle.jdbc.driver.T4CPreparedStatement.fetch(T4CPreparedStatement.java:1023)
at oracle.jdbc.driver.OracleStatement.fetchMoreRows(OracleStatement.java:3353)
at oracle.jdbc.driver.InsensitiveScrollableResultSet.fetchMoreRows(InsensitiveScrollableResultSet.java:736)
at oracle.jdbc.driver.InsensitiveScrollableResultSet.absoluteInternal(InsensitiveScrollableResultSet.java:692)
at oracle.jdbc.driver.InsensitiveScrollableResultSet.next(InsensitiveScrollableResultSet.java:406)
I was getting the exact same exception when calling ResultSet.next() using ojdbc8-12.2.0.1:
java.lang.ArrayIndexOutOfBoundsException: 8
at oracle.jdbc.driver.T4CMAREngineNIO.buffer2Value(T4CMAREngineNIO.java:814)
at oracle.jdbc.driver.T4CMAREngineNIO.unmarshalUB2(T4CMAREngineNIO.java:577)
at oracle.jdbc.driver.T4CMAREngineNIO.unmarshalSB2(T4CMAREngineNIO.java:557)
at oracle.jdbc.driver.T4CMAREngine.processIndicator(T4CMAREngine.java:1573)
at oracle.jdbc.driver.T4CMarshaller$StreamMarshaller.unmarshalOneRow(T4CMarshaller.java:179)
at oracle.jdbc.driver.T4CLongRawAccessor.unmarshalOneRow(T4CLongRawAccessor.java:159)
at oracle.jdbc.driver.T4CTTIrxd.unmarshal(T4CTTIrxd.java:1526)
at oracle.jdbc.driver.T4CTTIrxd.unmarshal(T4CTTIrxd.java:1289)
at oracle.jdbc.driver.T4C8Oall.readRXD(T4C8Oall.java:850)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:543)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:252)
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:612)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:226)
at oracle.jdbc.driver.T4CPreparedStatement.fetch(T4CPreparedStatement.java:1023)
at oracle.jdbc.driver.OracleStatement.fetchMoreRows(OracleStatement.java:3353)
The exception disappeared when I upgraded driver version to ojdbc8-18.3.0.0.
Updating JDBC driver might be worth a try, if anyone should find themselves in the same situation.
The lines below are at the end of the outer while loop, but they need to be executed after the loop, since you can't close the ResultSet object and then call next() on it:
fetchedRowsRs.close();
fetchBinaryQueryStatement.close();
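For clarity, a sketch of the corrected loop structure, reusing the names from the question's code, with the close() calls moved out of the loop (the per-byte flush() is also dropped, since BufferedWriter flushes on close()):
while (fetchedRowsRs.next()) {
    i++;
    System.out.println("i = " + i);
    String userName = fetchedRowsRs.getString("user_name");
    InputStream savedTextData = fetchedRowsRs.getBinaryStream("saved_text");
    bw.write(userName + ":: ");
    int len;
    if (savedTextData != null) {
        while ((len = savedTextData.read()) != -1) {
            bw.write((char) len);
        }
    }
}
// close only after all rows have been consumed
fetchedRowsRs.close();
fetchBinaryQueryStatement.close();
bw.close();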
Since the size of your array is 14, the last index is 13, because indexing starts from 0. So when you access an index greater than or equal to the size of the array, you get an ArrayIndexOutOfBoundsException.
I'm trying to create a generic Java agent to instrument any Java application's methods.
I've followed this tutorial https://javapapers.com/core-java/java-instrumentation/ and created a Java agent.
The Java agent is supposed to look for a particular class (I'm restricting it to one class for now, since it's not working for me).
Once the class is found, I'm using the Javassist API to add a local variable at the beginning of each method and capture the current time. At the end of the method I'd like to simply print the time it took for the method to execute (pretty much following all the typical examples about Java agents).
I run my test application (a web server using Vert.x) with the -javaagent flag pointing to the Java agent jar file I created (the code is down below).
This works just fine for methods that either have no return value and no parameters, or that return/take a primitive type.
However, when a method returns or takes a parameter that is an object from another class (one that has not been loaded yet, I think), I get a CannotCompileException with the message that the class in the parameter list or return statement is not found.
For example the instrumentation for this method works:
@Override
public void start() throws Exception {
logger.debug("started thread {}", Thread.currentThread().getName());
for (int port : ports) {
HttpServer httpServer = getVertx().createHttpServer(httpServerOptions);
Router router = setupRoutes();
httpServer.requestHandler(router::accept);
logger.info("Listening on port {}", port);
httpServer.listen(port);
}
}
However for this method that returns io.vertx.ext.web.Router:
private Router setupRoutes() {
Router router = Router.router(getVertx());
router.get(STATUS_PATH).handler(this::statusHandler);
router.route().handler(BodyHandler.create());
router.post().handler(this::handleBidRequest);
router.put().handler(this::handleBidRequest);
router.get(SLEEP_CONTROLLER_PATH).handler(this::sleepControllerHandler);
return router;
}
I get an exception, and the output of my Java agent is:
Instrumenting method rubiconproject.com.WebServerVerticle.setupRoutes()
Could not instrument method setupRoutes error: cannot find io.vertx.ext.web.Router
This is the code for my Java agent:
import java.lang.instrument.Instrumentation;
import transformers.TimeMeasuringTransformer;
public class TimeCapturerAgent {
public static void premain(String agentArgs, Instrumentation inst) {
System.out.println(TimeCapturerAgent.class.getCanonicalName() + " is loaded...... ");
inst.addTransformer(new TimeMeasuringTransformer());
}}
package transformers;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.lang.instrument.ClassFileTransformer;
import java.lang.instrument.IllegalClassFormatException;
import java.security.ProtectionDomain;
import javassist.CannotCompileException;
import javassist.ClassPool;
import javassist.CtClass;
import javassist.CtMethod;
public class TimeMeasuringTransformer implements ClassFileTransformer {
public TimeMeasuringTransformer() {
System.out.println("TimeMeasuringTransformer added ");
}
@Override
public byte[] transform(ClassLoader loader,
String className,
Class<?> classBeingRedefined,
ProtectionDomain protectionDomain,
byte[] classfileBuffer) throws IllegalClassFormatException {
if(className != null && className.contains("WebServerVerticle")) {
System.out.println("Instrumenting class " + className);
return modifyClass(classfileBuffer);
}
return null;
}
private byte[] modifyClass(byte[] originalClassfileBuffer) {
ClassPool classPool = ClassPool.getDefault();
CtClass compiledClass;
try {
compiledClass = classPool.makeClass(new ByteArrayInputStream(originalClassfileBuffer));
System.out.println("Created new compiled Class " + compiledClass.getName());
} catch (IOException e) {
e.printStackTrace();
return null;
}
instrumentMethods(compiledClass);
byte [] newClassByteCode = createNewClassByteArray(compiledClass);
compiledClass.detach();
return newClassByteCode;
}
private byte[] createNewClassByteArray(CtClass compiledClass) {
byte[] newClassByteArray = null;
try {
newClassByteArray = compiledClass.toBytecode();
} catch (IOException e) {
e.printStackTrace();
} catch (CannotCompileException e) {
e.printStackTrace();
} finally {
return newClassByteArray;
}
}
private void instrumentMethods(CtClass compiledClass) {
CtMethod[] methods = compiledClass.getDeclaredMethods();
System.out.println("Class has " + methods.length + " methods");
for (CtMethod method : methods) {
try {
System.out.println("Instrumenting method " + method.getLongName());
method.addLocalVariable("startTime", CtClass.longType);
method.insertBefore("startTime = System.nanoTime();");
method.insertAfter("System.out.println(\"Execution Duration "
+ "(nano sec): \"+ (System.nanoTime() - startTime) );");
} catch (CannotCompileException e) {
System.out.println("Could not instrument method " + method.getName()+" error: " + e.getMessage());
continue;
}
}
}}
I am trying to fetch data from an Excel sheet using a data provider in Selenium. When the data is returned/passed on to the calling function, I get null values for the first set, even though the loop begins from the second row, which holds the actual data. I am not sure why this is happening.
package pageobjectmodel;
import java.io.IOException;
import java.lang.reflect.InvocationTargetException;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;
import Utils.ReadingExcelfile;
import Utils.TestBase;
import pagebasedexecution.LinkedInloginPage;
public class LinkedInLoginTest extends TestBase
{
//public static ReadExcel excelfile;
public static ReadingExcelfile excelfile;
LinkedInloginPage loginPage;
public LinkedInLoginTest() throws IOException
{
super();
}
@BeforeMethod
public void Setup() throws IOException, NoSuchMethodException, SecurityException, IllegalAccessException, IllegalArgumentException, InvocationTargetException
{
BrowserSetup();
loginPage = new LinkedInloginPage();
}
@Test(dataProvider="TestData")
public void LoginTest(String uname, String password) throws InterruptedException
{
// System.out.println("received data is --- " +uname + " , " + password);
loginPage.LoginIntoAccount(uname, password);
String title = loginPage.VerifyTitle();
Assert.assertEquals(title, "LinkedIn", "Unable to login: invalid credentials");
Thread.sleep(5000);
}
/*@Test(priority=2)
public void VerifyloginPageTitleTest() throws InterruptedException
{
String title = loginPage.VerifyTitle();
Assert.assertEquals(title, "LinkedIn", "Unable to login: invalid credentials");
}*/
@DataProvider
public Object[][] TestData() throws IOException
{
excelfile = new ReadingExcelfile(System.getProperty("user.dir")+"\\src\\main\\java\\testData\\LinkedIn.xlsx");
int rows = excelfile.RowCount(1);
int colm = excelfile.TotalColm("LoginPage", 0);
Object[][] credentials = new Object[rows][colm];
for(int i = 1; i < rows; i++)
{
for(int j = 0; j<colm; j++)
{
credentials[i][j] = excelfile.getdata("LoginPage", i, j);
System.out.println("Fetched data from excel sheet is -- "+credentials[i][j]);
}
}
return credentials;
}
@AfterMethod
public void closebrowser()
{
System.out.println("quitting browser");
driver.quit();
}
}
Even though I am fetching the data from the second row, somehow it picks up the first row (the column names), and the first set of data is returned as null, null. I have provided a screenshot of the error console and highlighted the error portion. Any help is appreciated.
Thanks
This is because here
Object[][] credentials = new Object[rows][colm];
you are creating an object array with the number of rows and columns present in your sheet, but you are inserting values into this array starting from the second row, so the first row keeps its default null values.
Replace this code:
for(int j = 0; j<colm; j++)
{
credentials[i][j] = excelfile.getdata("LoginPage", i, j);
System.out.println("Fetched data from excel sheet is -- "+credentials[i][j]);
}
With
for(int j = 0; j<colm; j++)
{
credentials[i-1][j] = excelfile.getdata("LoginPage", i, j);
System.out.println("Fetched data from excel sheet is -- "+credentials[i-1][j]);
}
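Depending on what RowCount returns, the array may also need to be one row smaller, so that its last slot is not left null either. A sketch of the complete data provider body, under the assumption that RowCount counts the header row as well:
Object[][] credentials = new Object[rows - 1][colm];
for(int i = 1; i < rows; i++)
{
    for(int j = 0; j < colm; j++)
    {
        // shift everything up by one so data rows fill indices 0 .. rows-2
        credentials[i - 1][j] = excelfile.getdata("LoginPage", i, j);
    }
}
return credentials;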
I wrote a Java program to transfer a txt file to a SQLite db, but it takes a really long time. There are about 83000 records (200 records take about 1 minute).
How can I increase the transfer speed? I tried adding the lines to an ArrayList first and then reading from it, but that didn't change anything.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;
public class NewMain {
public static Connection connect() {
Connection conn = null;
try {
String url1 = "jdbc:sqlite:c:/Users/sozdemir/Desktop/sozluk.db";
conn = DriverManager.getConnection(url1);
String sql1 = "CREATE TABLE IF NOT EXISTS KELIMELER (Kelime PRIMARYKEY NOT NULL);";
Statement s = conn.createStatement();
s.execute(sql1);
} catch (Exception e) {
System.out.println(e.getMessage());
}
return conn;
}
public void Verigir(String kelime){
String sql = "INSERT INTO KELIMELER (kelime) VALUES(?)";
try (Connection conn = this.connect();
PreparedStatement statement = conn.prepareStatement(sql)
){
statement.setString(1, kelime);
statement.executeUpdate();
} catch (Exception e) {
}
}
public static void main(String[] args) throws IOException {
/* connect();*/
NewMain app = new NewMain();
String kelime = null;
BufferedReader in = null;
int adet;
adet= 0;
in = new BufferedReader(new FileReader("C://Users//sozdemir//Desktop//ozluk.txt"));
while ((kelime=in.readLine()) !=null) {
app.Verigir(kelime);
adet +=1;
System.out.println(81742 - adet);
}
}
}
Some hints:
Use a single PreparedStatement
Put everything inside a transaction
Use PRAGMA synchronous = OFF and PRAGMA journal_mode = OFF
This will give you very fast inserts.
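A minimal sketch (untested; the class name is made up, table and file names are reused from the question) of what those hints look like together: one PreparedStatement, batched inserts, and a single commit at the end.
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;
public class FastInsert {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:sqlite:c:/Users/sozdemir/Desktop/sozluk.db");
             Statement s = conn.createStatement()) {
            // speed-oriented pragmas, executed outside of any transaction
            s.execute("PRAGMA synchronous = OFF");
            s.execute("PRAGMA journal_mode = OFF");
            s.execute("CREATE TABLE IF NOT EXISTS KELIMELER (Kelime PRIMARY KEY NOT NULL)");
            conn.setAutoCommit(false); // one transaction for all rows
            try (PreparedStatement ps = conn.prepareStatement("INSERT INTO KELIMELER (Kelime) VALUES (?)");
                 BufferedReader in = new BufferedReader(new FileReader("C://Users//sozdemir//Desktop//ozluk.txt"))) {
                String kelime;
                while ((kelime = in.readLine()) != null) {
                    ps.setString(1, kelime); // reuse the single PreparedStatement
                    ps.addBatch();
                }
                ps.executeBatch();
            }
            conn.commit(); // flush everything in one commit
        }
    }
}
The key difference from the original code is that it no longer opens a new connection and prepares a new statement for every single row, which is where most of the time was being spent.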