Query Ignite cache created in Java with sqlline - ignite

I am using Ignite's CacheQueryExample:
public class CacheQueryExample {
    /** Organizations cache name. */
    private static final String ORG_CACHE = CacheQueryExample.class.getSimpleName() + "Organizations";

    /** Persons collocated with Organizations cache name. */
    private static final String PERSON_CACHE = CacheQueryExample.class.getSimpleName() + "Persons";

    /**
     * Executes example.
     *
     * @param args Command line arguments, none required.
     * @throws Exception If example execution failed.
     */
    public static void main(String[] args) throws Exception {
        try (Ignite ignite = Ignition.start("examples/config/example-ignite.xml")) {
            System.out.println();
            System.out.println(">>> Cache query example started.");

            CacheConfiguration<Long, Organization> orgCacheCfg = new CacheConfiguration<>(ORG_CACHE);

            orgCacheCfg.setCacheMode(CacheMode.PARTITIONED); // Default.
            orgCacheCfg.setIndexedTypes(Long.class, Organization.class);
            ...
Using sqlline, the following tables are created:
+-----------+--------------------------------+-----------------------------+------------+---------+----------+------------+-----------+---------------------------+---------------+
| TABLE_CAT | TABLE_SCHEM                    | TABLE_NAME                  | TABLE_TYPE | REMARKS | TYPE_CAT | TYPE_SCHEM | TYPE_NAME | SELF_REFERENCING_COL_NAME | REF_GENERATIO |
+-----------+--------------------------------+-----------------------------+------------+---------+----------+------------+-----------+---------------------------+---------------+
| IGNITE    | CacheQueryExampleOrganizations | ORGANIZATION                | TABLE      |         |          |            |           |                           |               |
| IGNITE    | CacheQueryExamplePersons       | PERSON                      | TABLE      |         |          |            |           |                           |               |
+-----------+--------------------------------+-----------------------------+------------+---------+----------+------------+-----------+---------------------------+---------------+
How do I query these tables in sqlline? I have tried the following and none works:
0: jdbc:ignite:thin://127.0.0.1:10800> select * from person;
Error: Failed to parse query. Table "PERSON" not found; SQL statement:
select * from person [42102-197] (state=42000,code=1001)
0: jdbc:ignite:thin://127.0.0.1:10800> select * from CacheQueryExamplePersons.person;
Error: Failed to parse query. Schema "CACHEQUERYEXAMPLEPERSONS" not found; SQL statement:
select * from CacheQueryExamplePersons.person [90079-197] (state=42000,code=1001)
And connecting with sqlline to the specific schema:
0: jdbc:ignite:thin://127.0.0.1:10800/CacheQu> select * from person;
Error: Failed to set schema for DB connection for thread [schema=CACHEQUERYEXAMPLEPERSONS] (state=50000,code=1)

Try enclosing the schema name in double quotes.
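For example, with the caches from the example above, something like this should work, since the cache-derived schema name is case-sensitive and needs the quotes:
0: jdbc:ignite:thin://127.0.0.1:10800> select * from "CacheQueryExamplePersons".Person;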

Related

How do I get the SpecFlow scenario outline example data into a table

Is there any way to get the Scenario Outline example values (I mean all the values) into a table?
Scenario Outline: Create a Matter
Given I enter "<parameter1>"
Then I enter "<parameter2>"
Then I enter "<parameter3>"
Then I enter "<parameter4>"
Then review all the parameters entered above in this final step
Examples:
| parameter1 | parameter2 | parameter3 | parameter4 | .... | parameter14 |
| value      | value2     | value3     | value4     | .... | value14     |
In the above scenario, is there any way to get all the example values into a table in the final step?
I know I can set ScenarioContext.Current["parameter1"] = value in each step.
In my case I have 14 parameters which are used in each step, but in the final step I need to use all 14 of them.
Is there any way I can get the example values into a table?
I don't want to break it into smaller scenarios
like the one below:
Scenario: breaking into smaller chunks
Given I enter the following
| parameter1 | parameter2 |
| value      | value2     |
Here is something I use that may help. Andreas is the expert on this stuff, though, and he probably has a better idea. Since your format was less than ideal, I used a basic scenario.
Change it to a "Scenario" and drop the "Scenario Outline".
The feature looks like this:
Scenario: Validate Shipping Fees
When the user enters the State then we can verify the city and shipping fee
| City | State | Shipping |
| Boulder | Colorado | 6.00 |
| Houston | Texas | 8.00 |
Add the Table.
public class ShippingTable
{
    public string City { get; set; }
    public string State { get; set; }
    public string Shipping { get; set; }
}
Then in your step:
[When(#"the user enters the State then we can verify the city and shipping fee")]
public void WhenTheUserEnterTheStateThenWeCanVerifyTheCityAndShippingFee(Table table)
{
var CityState = table.CreateSet<ShippingTable>();
foreach (var row in CityState)
{
try
{
Pages.CheckoutPage.SelectState(row.State);
Pages.CheckoutPage.SelectCity(row.City);
var recdPrice = Pages.CheckoutPage.GetShippingPrice;
Assert.AreEqual(row.shipping, recdPrice);
}
catch (Exception)
{
throw new Exception("This is jacked up");
}
}
}
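If you would rather keep the Scenario Outline, a rough sketch building on the ScenarioContext.Current approach mentioned in the question (the binding text below is just the final step's wording, and the method name is hypothetical) is to read the stored values back in the last step:
[Then(@"review all the parameters entered above in this final step")]
public void ThenReviewAllTheParametersEnteredAbove()
{
    // Assumes each earlier step stored its value, e.g. ScenarioContext.Current["parameter1"] = value;
    var parameter1 = (string)ScenarioContext.Current["parameter1"];
    var parameter2 = (string)ScenarioContext.Current["parameter2"];
    // ... and so on up to parameter14
}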

How to get Google Dataflow to write to a BigQuery table name from the input data?

I'm new to Dataflow/Beam. I'm trying to write some data to BigQuery, and I want the destination table name to come from the previous stage as a map entry keyed "table". But I couldn't find out how to pass this table name through the pipeline to BigQuery. Here's where I'm stuck... any ideas what to do next?
pipeline
    // ...
    //////// I guess I shouldn't output TableRow here?
    .apply("ToBQRow", ParDo.of(new DoFn<Map<String, String>, TableRow>() {
        @ProcessElement
        public void processElement(ProcessContext c) throws Exception {
            ////////// WHAT DO I DO WITH "table"?
            String table = c.element().get("table");
            TableRow row = new TableRow();
            // ... set some records
            c.output(row);
        }
    }))
    .apply(BigQueryIO.writeTableRows().to(/* ///// WHAT DO I WRITE HERE?? */)
        .withSchema(schema)
        .withWriteDisposition(
            BigQueryIO.Write.WriteDisposition.WRITE_APPEND));
You can use DynamicDestinations for that.
As an example I create some dummy data and I'll use the last word as the key:
p.apply("Create Data", Create.of("this should go to table one",
"I would like to go to table one",
"please, table one",
"I prefer table two",
"Back to one",
"My fave is one",
"Rooting for two"))
.apply("Create Keys", ParDo.of(new DoFn<String, KV<String,String>>() {
#ProcessElement
public void processElement(ProcessContext c) {
String[] splitBySpaces = c.element().split(" ");
c.output(KV.of(splitBySpaces[splitBySpaces.length - 1],c.element()));
}
}))
Then, with getDestination, we control how to route each element to a different table according to the key, and with getTable we build the fully qualified table name (prepending the prefix). We could use getSchema if the different tables had different schemas. Finally, we control what is written to each table using withFormatFunction:
 .apply(BigQueryIO.<KV<String, String>>write()
     .to(new DynamicDestinations<KV<String, String>, String>() {
         public String getDestination(ValueInSingleWindow<KV<String, String>> element) {
             return element.getValue().getKey();
         }
         public TableDestination getTable(String name) {
             String tableSpec = output + name;
             return new TableDestination(tableSpec, "Table for type " + name);
         }
         public TableSchema getSchema(String schema) {
             List<TableFieldSchema> fields = new ArrayList<>();
             fields.add(new TableFieldSchema().setName("Text").setType("STRING"));
             TableSchema ts = new TableSchema();
             ts.setFields(fields);
             return ts;
         }
     })
     .withFormatFunction(new SerializableFunction<KV<String, String>, TableRow>() {
         public TableRow apply(KV<String, String> row) {
             TableRow tr = new TableRow();
             tr.set("Text", row.getValue());
             return tr;
         }
     })
     .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));
To fully test this I created the following tables:
bq mk dynamic_key
bq mk -f dynamic_key.dynamic_one Text:STRING
bq mk -f dynamic_key.dynamic_two Text:STRING
And, after setting the $PROJECT, $BUCKET and $TABLE_PREFIX (in my case PROJECT_ID:dynamic_key.dynamic_) variables, I run the job with:
mvn -Pdataflow-runner compile -e exec:java \
-Dexec.mainClass=com.dataflow.samples.DynamicTableFromKey \
-Dexec.args="--project=$PROJECT \
--stagingLocation=gs://$BUCKET/staging/ \
--tempLocation=gs://$BUCKET/temp/ \
--output=$TABLE_PREFIX \
--runner=DataflowRunner"
We can verify that each element went to the correct table:
$ bq query "SELECT * FROM dynamic_key.dynamic_one"
+---------------------------------+
| Text |
+---------------------------------+
| please, table one |
| Back to one |
| My fave is one |
| this should go to table one |
| I would like to go to table one |
+---------------------------------+
$ bq query "SELECT * FROM dynamic_key.dynamic_two"
+--------------------+
| Text |
+--------------------+
| I prefer table two |
| Rooting for two |
+--------------------+
Full code here.
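For the exact shape in the question, where each element is a Map<String, String> and the "table" entry carries the destination name, an untested sketch of the same pattern (reusing the output prefix and the schema variable from the question) would be:
.apply(BigQueryIO.<Map<String, String>>write()
    .to(new DynamicDestinations<Map<String, String>, String>() {
        public String getDestination(ValueInSingleWindow<Map<String, String>> element) {
            return element.getValue().get("table"); // route on the "table" entry
        }
        public TableDestination getTable(String name) {
            return new TableDestination(output + name, "Table " + name);
        }
        public TableSchema getSchema(String name) {
            return schema; // the schema from the question
        }
    })
    .withFormatFunction(new SerializableFunction<Map<String, String>, TableRow>() {
        public TableRow apply(Map<String, String> element) {
            TableRow row = new TableRow();
            // ... set fields from the map entries (everything except "table")
            return row;
        }
    })
    .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));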

Finding an index using a switch case statement in JavaScript

I'm using the Pentaho (ETL) tool with a JavaScript component, which accepts JavaScript code, to achieve the desired transformation. The following table is imported into Pentaho from a .csv file (the source file).
For example this is my table structure
+--------+--------+--------+
| RLD | MD | INC |
+--------+--------+--------+
| 0 | 3868 | 302024 |
| 53454 | 7699 | 203719 |
| 154508 | 932 | 47694 |
| 107547 | 36168 | 83592 |
I want to use a script which would give me the max_value and its index number, such that my output would look like
Output Table
+--------+--------+--------+-----------+-----------+
| RQD | MT | IZC | max_value | max_index |
+--------+--------+--------+-----------+-----------+
| 0 | 3868 | 302024 | 302024 | 3 |
| 53454 | 7699 | 203719 | 203719 | 3 |
| 154508 | 932 | 47694 | 154508 | 1 |
| 456 | 107547| 83592 | 107547 | 2 |
To get the max value from rows I have used
var max_value = Math.max(RQD,MT,IZC);
println(max_value);
I tried to get their index using the following script
var max_index = switch (Math.max(RQD,MT,IZC))
{
case "RQD":document.write("1")
case "MT":document.write("2")
case "MT":document.write("3")
default:document.write("0")
}
How can I get the desired result in the form of a JavaScript data structure? Any help would be much appreciated. Thanks.
var list = [
    { RLD: 0,      MD: 3868,  INC: 302024 },
    { RLD: 53454,  MD: 7699,  INC: 203719 },
    { RLD: 154508, MD: 932,   INC: 47694  },
    { RLD: 107547, MD: 36168, INC: 83592  }
];

list = list.map(function(item){
    var keys = Object.keys(item);
    item.max_value = item[keys[0]];
    item.max_index = keys[0]; // start with the first column, not an empty string
    for (var i = 1, l = keys.length; i < l; i++) {
        var key = keys[i];
        var keyValue = item[key];
        if (item.max_value < keyValue) {
            item.max_value = keyValue;
            item.max_index = key;
        }
    }
    return item;
});
There are several issues with your code; let's solve them!
Use breaks: you must use breaks in order to avoid the switch falling through to the cases below its match.
Switch cases do not return a value like functions do; you cannot use a switch to define a variable directly, you need to assign the variable inside the switch case.
Math.max does not return the name of its maximum variable; instead it returns the maximum number from its given parameters.
To solve this, I honestly would not use a switch case with Math.max; however, to answer your question:
var tableArray = [RQD, MT, IZC];
var maxIndex = tableArray.indexOf(Math.max.apply(null, tableArray));
if (maxIndex > -1) document.write(maxIndex + 1);
I used +1 because your index in the example table starts from 1 instead of 0.
The order of the array should match the order of the columns in the table, row by row.
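For example, with the values from the first row of the question's table:
var RQD = 0, MT = 3868, IZC = 302024;
var tableArray = [RQD, MT, IZC];
var maxIndex = tableArray.indexOf(Math.max.apply(null, tableArray)); // 2
if (maxIndex > -1) document.write(maxIndex + 1);                     // writes 3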
First of all, you cannot solve this problem with a switch statement.
In a JavaScript switch you provide a value that is matched against one of the following case(s); otherwise the switch goes to the default, if defined.
Your problem seems to be finding the highest value of the 3 columns and printing the columns row by row, adding a column with the max value and the index of the column where you found it.
So, for example, on the row:
1, RLD : 0
2, MD : 3868
3, INC : 302024
In this case the highest value is INC, with column index 3.
If you just have the variables with the number values, you cannot do much more than something like this:
function getMaxValueRow (RLD, MD, INC) {
    var max_value = RLD;
    var max_index = 1;
    if (MD > max_value) {
        max_value = MD;
        max_index = 2;
    }
    if (INC > max_value) {
        max_value = INC;
        max_index = 3;
    }
    return [RLD, MD, INC, max_value, max_index];
}
You could return an object too, like this:
return {
    'RQD': RLD,
    'MT': MD,
    'IZC': INC,
    'max_value': max_value,
    'max_index': max_index
};
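For example, applied to the first row of the question's table, the array-returning version gives:
var result = getMaxValueRow(0, 3868, 302024);
// result is [0, 3868, 302024, 302024, 3]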

org.springframework.jdbc.BadSqlGrammarException: bad SQL grammar

I am getting the following error in my code:
org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad
SQL grammar [insert into bulletins (date, name, subject, note, approved) values
(?, ?, ?, ?, ?)]; nested exception is com.mysql.jdbc.exceptions.MySQLSyntaxError
Exception: Unknown column 'date' in 'field list'
This line is in my Spring controller.
bulletinDAO.writeBulletin(bulletin);
Here is the actual place in my DAO class where I'm trying to write using Hibernate:
public void writeBulletin(Bulletin bulletin) {
    try {
        Session session = sessionFactory.openSession();
        Transaction tx = session.beginTransaction();
        session.save(bulletin);
        tx.commit();
    } catch (Exception e) {
        System.out.println(e.toString());
    }
}
Here is my model class.
@Entity
@Table(name="login")
public class Bulletin {
    @Id
    @Column(name="id")
    @GeneratedValue
    private int id;
    @Column(name="bulletin_date")
    private String date;
    @Column(name="name")
    private String name;
    @Column(name="subject")
    private String subject;
    @Column(name="note")
    private String note;
    @Column(name="approved")
    private boolean approved;
    // Getters and setters follow
}
Finally, here is the layout of the table.
+---------------+---------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+---------------+---------------+------+-----+---------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| bulletin_date | varchar(10) | YES | | NULL | |
| name | varchar(30) | YES | | NULL | |
| subject | varchar(50) | YES | | NULL | |
| note | varchar(2500) | YES | | NULL | |
| approved | tinyint(1) | YES | | NULL | |
+---------------+---------------+------+-----+---------+----------------+
There must be something wrong with your getters and setters.
I would recommend changing the property name from date to bulletinDate. And then set & get it correctly...
#Column(name="bulletin_date")
private String bulletinDate;
public String getBulletinDate() {
return bulletinDate;
}
public void setBulletinDate(String bulletin_date) {
this.bulletinDate = bulletin_date;
}
Your issue is here:
[insert into bulletins (date, name, subject, note, approved)]
whereas you need bulletin_date.
From experience, I needed to rebuild the project completely to ensure that the right column name is referenced.
Clear your project cache and rebuild it.
Let me know how you go and I'll help further if that doesn't fix it.
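If it is a Maven project (an assumption on my part; use the equivalent for your build tool), a full clean rebuild looks like:
mvn clean install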

Hibernate: Prevent update on many-to-many insert

Problem
I use hibernate to store data in an MySQL database. I now want to store a Company and one of its Branches.
The company:
@Entity
@Table(name="company")
public class Company {
    @Id
    @GeneratedValue
    @Column(name="id")
    private int id;
    @Column(name="name")
    private String name;
    @ManyToMany(cascade = CascadeType.ALL)
    @JoinTable(name="company_branch_join",
        joinColumns={@JoinColumn(name="company_id")},
        inverseJoinColumns={@JoinColumn(name="branch_id")})
    private Set<CompanyBranch> branches;
    // Getters and setters...
}
And the branch:
@Entity
@Table(name="company_branch")
public class CompanyBranch {
    @Id
    @GeneratedValue
    @Column(name="id")
    private int id;
    @Column(name="branch")
    private String branch;
    @ManyToMany(mappedBy="branches", cascade = CascadeType.ALL)
    private Set<Company> companies;
    // Getters and setters...
}
Question
The code works and I can insert the data into the join table. The problem is the overwrite behavior regarding the branches. My branch table in the database is already filled with branches and their IDs, so I don't want to modify that data. However, on a company insert, the branches associated with the company get stored again and overwrite the rows with the same ID in the database. How can I prevent this behavior?
CompanyBranch cb1 = new CompanyBranch();
cb1.setId(1);
cb1.setBranch("Manufacturing");
CompanyBranch cb2 = new CompanyBranch();
cb2.setId(2);
cb2.setBranch("DONT-INSERT");
Company c = new Company();
c.setName("[Random-Company-Name]");
c.addBranch(cb1);
c.addBranch(cb2);
CompanyManager cm = new CompanyManagerImpl();
cm.saveCompany(c);
The branch table before execution looks like this:
| id | branch |
+----+----------------+
| 1 | Manufacturing |
| 2 | IT |
|... | ... |
The table should not change. But after execution it looks like this:
| id | branch |
+----+----------------+
| 1 | Manufacturing |
| 2 | DONT-INSERT |
|... | ... |
Instead of creating new branch instances with the new operator, retrieve a reference to them using EntityManager.getReference(), e.g.:
CompanyBranch cb1 = entityManager.getReference(CompanyBranch.class, 1);
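Applied to the example above, a sketch (assuming a JPA EntityManager is available; with Hibernate's native Session API, session.load() plays the same role):
// getReference returns lazy proxies for the existing rows, so their state
// is never modified and nothing gets overwritten on the company insert.
CompanyBranch cb1 = entityManager.getReference(CompanyBranch.class, 1);
CompanyBranch cb2 = entityManager.getReference(CompanyBranch.class, 2);

Company c = new Company();
c.setName("[Random-Company-Name]");
c.addBranch(cb1);
c.addBranch(cb2);

CompanyManager cm = new CompanyManagerImpl();
cm.saveCompany(c);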