Apache Ignite 'Failed to prepare update plan' when executing SQL query on cache

I am playing around with different approaches on how to configure caches and tables in Ignite and then insert an entry via the SQL API using the .NET SDK.
I create two caches, each with a table. The first is created via CacheClientConfiguration and QueryEntities, and the second using the 'CREATE TABLE...' DDL command. I then try to insert the same object (same values) into both tables using 'INSERT INTO...'. For the table created with the 'CREATE TABLE...' command it works, but for the table created via QueryEntities I get an IgniteClientException stating: 'Failed to prepare update plan'. Both insert commands look exactly the same (apart from the table name).
What is the exception trying to tell me, and why does the insert work for the second approach but not for the first?
See example code below.
Creating caches and tables:
public class ValueClass
{
[QuerySqlField(IsIndexed = true)]
public long Id { get; set; }
[QuerySqlField]
public string Content { get; set; }
}
var cache01 = igniteClient.CreateCache<long, ValueClass>(new CacheClientConfiguration
{
Name = "Cache01",
QueryEntities = new[]
{
new QueryEntity(
typeof(long),
typeof(ValueClass))
{
TableName = "table01"
}
},
SqlSchema = "PUBLIC",
});
var cache02 = igniteClient.CreateCache<long, ValueClass>("Cache02");
cache02.Query(new SqlFieldsQuery(
"CREATE TABLE IF NOT EXISTS table02 (" +
"Id BIGINT PRIMARY KEY, " +
"Content VARCHAR, " +
")" +
"WITH " +
"\"cache_name=Cache02, " +
$"VALUE_TYPE={typeof(ValueClass)}" +
$"\""
)
{ Schema = "PUBLIC" });
Executing insert queries:
// EXCEPTION: Apache.Ignite.Core.Client.IgniteClientException: 'Failed to prepare update plan.'
cache01.Query(
new SqlFieldsQuery(
"INSERT INTO " +
"table01 ( " +
"Id, " +
"Content ) " +
"VALUES ( " +
"?, " +
"? )"
)
{
Arguments = new object[]
{
3,
"entry3"
},
Schema = "PUBLIC"
});
// This one works
cache02.Query(
new SqlFieldsQuery(
"INSERT INTO " +
"table02 ( " +
"Id, " +
"Content ) " +
"VALUES ( " +
"?, " +
"? )"
)
{
Arguments = new object[]
{
3,
"entry3"
},
Schema = "PUBLIC"
});
Exception:
Unhandled exception. Apache.Ignite.Core.Client.IgniteClientException: Failed to prepare update plan.
at Apache.Ignite.Core.Impl.Client.Cache.CacheClient`2.HandleError[T](ClientStatusCode status, String msg)
at Apache.Ignite.Core.Impl.Client.ClientSocket.DecodeResponse[T](BinaryHeapStream stream, Func`2 readFunc, Func`3 errorFunc)
at Apache.Ignite.Core.Impl.Client.ClientSocket.DoOutInOp[T](ClientOp opId, Action`1 writeAction, Func`2 readFunc, Func`3 errorFunc)
at Apache.Ignite.Core.Impl.Client.ClientFailoverSocket.DoOutInOp[T](ClientOp opId, Action`1 writeAction, Func`2 readFunc, Func`3 errorFunc)
at Apache.Ignite.Core.Impl.Client.Cache.CacheClient`2.DoOutInOp[T](ClientOp opId, Action`1 writeAction, Func`2 readFunc)
at Apache.Ignite.Core.Impl.Client.Cache.CacheClient`2.Query(SqlFieldsQuery sqlFieldsQuery)
at ApacheIgniteConfigurationDemo.Worker.ExecuteAsync(CancellationToken stoppingToken) in C:\ApacheIgniteConfigurationDemo\Worker.cs:line 60
at Microsoft.Extensions.Hosting.Internal.Host.StartAsync(CancellationToken cancellationToken)
at Microsoft.Extensions.Hosting.HostingAbstractionsHostExtensions.RunAsync(IHost host, CancellationToken token)
at Microsoft.Extensions.Hosting.HostingAbstractionsHostExtensions.RunAsync(IHost host, CancellationToken token)
at Program.<Main>$(String[] args) in C:\ApacheIgniteConfigurationDemo\Program.cs:line 26
at Program.<Main>(String[] args) [StatusCode=Fail]
Ignite is running in a Docker container using the default configuration.
Solution:
As pointed out by @Alexandr Shapkin (see accepted answer), I had to specify "KeyFieldName" because I wanted to use the "Id" field of the POJO class as the key.
Configuring the table like this worked for my case:
var cache01 = igniteClient.CreateCache<long, ValueClass>(new CacheClientConfiguration
{
Name = "Cache01",
QueryEntities = new[]
{
new QueryEntity(
typeof(long),
typeof(ValueClass))
{
TableName = "table01",
KeyFieldName = nameof(ValueClass.Id)
}
},
SqlSchema = "PUBLIC",
});
Another solution would have been to add the "_key" column to the INSERT command:
cache01.Query(
new SqlFieldsQuery(
"INSERT INTO " +
"table01 ( " +
"_key, " +
"Id, " +
"Content ) " +
"VALUES ( " +
"?, " +
"?, " +
"? )"
)
{
Arguments = new object[]
{
3,
3,
"entry3"
},
Schema = "PUBLIC"
});

I believe this is because you didn't provide the _key to the first query explicitly and would like to keep it as part of your POJO model.
Specify the key configuration explicitly using the following configuration and give it a try.
var cache01 = igniteClient.CreateCache<long, ValueClass>(new CacheClientConfiguration
{
Name = "Cache01",
QueryEntities = new[]
{
new QueryEntity(
typeof(long),
typeof(ValueClass))
{
TableName = "table01",
KeyFieldName = "Id",
Fields = new List<QueryField>()
{
new QueryField("Id", typeof(long)),
new QueryField("Content", typeof(string)),
}
}
},
SqlSchema = "PUBLIC",
});

Related

Passing Variables into Dapper as query parameters... "A request to send or receive data was disallowed because the socket is not connected"

Ok , I"m doing what looks like a simple Dapper query
but if I pass in my studId parameter, it blows up with this low level networking exception:
{"A request to send or receive data was disallowed because the socket is not connected and (when sending on a datagram socket using a sendto call) no address was supplied."}
if I comment out the line of sql that uses it, fix the where clause, ,and comment out the line where it's added the the parameters object. It retrieves rows as expected.
I've spent the last 2.5 days trying everything I could think of, changing the names to match common naming patterns, changing type to a string (just gave a error converting string to number), yadda yadda yadda..
I'm at a complete loss as to why it doesn't like that parameter, I look at other code that works and they pass Id's just fine...
At this point I figure it has to be an ID-10-t that's staring me in the face and I'm just assuming my way right past it with out seeing it.
Any help is appreciated
public List<StudDistLearnSchedRawResponse> GetStudDistanceLearningScheduleRaw( StudDistLearnSchedQueryParam inputs )
{
var aseSqlConnectionString = Configuration.GetConnectionString( "SybaseDBDapper" );
string mainSql = " SELECT " +
" enrollment.stud_id, " +
" sched_start_dt, " +
" sched_end_dt, " +
" code.code_desc, " +
" student_schedule_dl.enrtype_id, " +
" student_schedule_dl.stud_sched_dl_id, " +
" dl_correspond_cd, " +
" course.course_name, " +
" stud_course_sched_dl.sched_hours, " +
" actual_hours, " +
" course_comments as staff_remarks, " + // note this column rename - EWB
" stud_course_sched_dl.sched_item_id , " +
" stud_course_sched_dl.stud_course_sched_dl_id " +
" from stud_course_sched_dl " +
" join student_schedule_dl on student_schedule_dl.stud_sched_dl_id = stud_course_sched_dl.stud_sched_dl_id " +
" join course on stud_course_sched_dl.sched_item_id = course.sched_item_id " +
" left join code on student_schedule_dl.dl_correspond_cd = code.code_id " +
" join enrollment_type on student_schedule_dl.enrtype_id = enrollment_type.enrtype_id " +
" join enrollment on enrollment_type.enr_id = enrollment.enr_id " +
" where enrollment.stud_id = #studId " +
" and sched_start_dt >= #startOfWeek" +
" and sched_end_dt <= #startOfNextWeek";
DapperTools.DapperCustomMapping<StudDistLearnSchedRawResponse>();
//string sql = query.ToString();
DateTime? startOfWeek = StartOfWeek( inputs.weekStartDateTime, DayOfWeek.Monday );
DateTime? startOfNextWeek = StartOfWeek( inputs.weekStartDateTime.Value.AddDays( 7 ) , DayOfWeek.Monday );
try
{
using ( IDbConnection db = new AseConnection( aseSqlConnectionString ) )
{
db.Open();
var arguments = new
{
studId = inputs.StudId, // it chokes and gives a low level networking error - EWB
startOfWeek = startOfWeek.Value.ToShortDateString(),
startOfNextWeek = startOfNextWeek.Value.ToShortDateString(),
};
List<StudDistLearnSchedRawResponse> list = new List<StudDistLearnSchedRawResponse>();
list = db.Query<StudDistLearnSchedRawResponse>( mainSql, arguments ).ToList();
return list;
}
}
catch (Exception ex)
{
Trace.WriteLine(ex.ToString());
return null;
}
}
Here is the input object
public class StudDistLearnSchedQueryParam
{
public Int64 StudId;
public DateTime? weekStartDateTime;
}
Here is the DapperTools object, which just abstracts some ugly code to look nicer.
namespace EricSandboxVue.Utilities
{
public interface IDapperTools
{
string ASEConnectionString { get; }
AseConnection _aseconnection { get; }
void ReportSqlError( ILogger DalLog, string sql, Exception errorFound );
void DapperCustomMapping< T >( );
}
public class DapperTools : IDapperTools
{
public readonly string _aseconnectionString;
public string ASEConnectionString => _aseconnectionString;
public AseConnection _aseconnection
{
get
{
return new AseConnection( _aseconnectionString );
}
}
public DapperTools( )
{
_aseconnectionString = Environment.GetEnvironmentVariable( "EIS_ASESQL_CONNECTIONSTRING" );
}
public void ReportSqlError( ILogger DalLog, string sql, Exception errorFound )
{
DalLog.LogError( "Error in Sql" );
DalLog.LogError( errorFound.Message );
//if (env.IsDevelopment())
//{
DalLog.LogError( sql );
//}
throw errorFound;
}
public void DapperCustomMapping< T >( )
{
// custom mapping
var map = new CustomPropertyTypeMap(
typeof( T ),
( type, columnName ) => type.GetProperties( ).FirstOrDefault( prop => GetDescriptionFromAttribute( prop ) == columnName )
);
SqlMapper.SetTypeMap( typeof( T ), map );
}
private string GetDescriptionFromAttribute( System.Reflection.MemberInfo member )
{
if ( member == null ) return null;
var attrib = (Dapper.ColumnAttribute) Attribute.GetCustomAttribute( member, typeof(Dapper.ColumnAttribute), false );
return attrib == null ? null : attrib.Name;
}
}
}
If I change the SQL string building to this (below), but leave everything else the same (including studId in the args struct), it doesn't crash and retrieves rows, so it's clearly about the substitution of @studId...
// " where enrollment.stud_id = @studId " +
" where sched_start_dt >= @startOfWeek" +
" and sched_end_dt <= @startOfNextWeek";
You named your data members wrong. I had no idea starting a variable name with @ was possible.
The problem is here:
var arguments = new
{
@studId = inputs.StudId, // it chokes and gives a low level networking error - EWB
@startOfWeek = startOfWeek.Value.ToShortDateString(),
@startOfNextWeek = startOfNextWeek.Value.ToShortDateString(),
};
It should have been:
var arguments = new
{
studId = inputs.StudId, // it chokes and gives a low level networking error - EWB
startOfWeek = startOfWeek.Value.ToShortDateString(),
startOfNextWeek = startOfNextWeek.Value.ToShortDateString(),
};
The @ is just a hint to Dapper that it should be replaced with the corresponding member name.
@@ has special meaning in some SQL dialects; that's probably what causes the trouble.
So here's what I found out.
The Sybase implementation has a hard time with arguments.
It especially has a hard time with arguments of type Int64 (this existed way before .NET Core).
So if you change the type of the passed-in argument from Int64 to Int32, everything works fine.
You can cast it, or just change the type of the method parameter.
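For example, a minimal sketch of the cast workaround on the question's arguments object (assuming the IDs always fit in 32 bits):
var arguments = new
{
    studId = (int)inputs.StudId, // Int64 -> Int32 to sidestep the ASE driver issue
    startOfWeek = startOfWeek.Value.ToShortDateString(),
    startOfNextWeek = startOfNextWeek.Value.ToShortDateString(),
};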

jdbcTemplate and raw JSON column

Suppose I query a table that contains raw JSON in a column, or create JSON myself with a subquery like this:
select
p.name,
(select json_build_array(json_build_object('num', a.num, 'city', a.city)) from address a where a.id = p.addr) as addr
from person p;
How can I instruct Spring's NamedParameterJdbcTemplate not to escape
the addr column but leave it alone?
So far it insists on returning me something like this:
[{
name: "John",
addr: {
type: "json",
value: "{"num" : "123", "city" : "Luxembourg"}"
}
}]
This is the working solution.
Service:
@Transactional(readOnly = true)
public String getRawJson() {
String sql = "select json_agg(row_to_json(json)) from ( "
+ "select "
+ "p.id, "
+ "p.name, "
+ "(select array_to_json(array_agg(row_to_json(c))) from ( "
+ " ... some subselect ... "
+ ") c ) as subq "
+ "from person p "
+ "where type = :type "
+ ") json";
MapSqlParameterSource params = new MapSqlParameterSource("type", 1);
return jdbcTemplate.queryForObject(sql, params, String.class);
}
Controller:
@ResponseBody
@GetMapping(value = "/json", produces = MediaType.APPLICATION_JSON_VALUE)
public String getRawJson() {
return miscService.getRawJson();
}
The key is the combination of json_agg and row_to_json / array_to_json plus returning a String to avoid any type conversions.

BigQueryIO returns TypedRead<TableRow> instead of PCollection<TableRow>. How to get the real data?

I have a problem retrieving data from a BigQuery table inside a DoFn. I can't find an example of how to extract values from a TypedRead.
This is a simplified pipeline. I would like to check whether a record with the target SSN exists in a BigQuery table. In the real pipeline the target SSN will be received via Pub/Sub; I have replaced it with an array of strings.
final BigQueryIoTestOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().as(BigQueryIoTestOptions.class);
final List<String> SSNs = Arrays.asList("775-89-3939");
Pipeline p = Pipeline.create(options);
PCollection<String> ssnCollection = p.apply("GetSSNParams", Create.of(SSNs)).setCoder(StringUtf8Coder.of());
ssnCollection.apply("SelectFromBQ", ParDo.of(new DoFn<String, TypedRead<TableRow>>() {
@ProcessElement
public void processElement(ProcessContext c) throws Exception {
TypedRead<TableRow> tr =
BigQueryIO.readTableRows()
.fromQuery("SELECT pid19PatientSSN FROM dataset.table where pid19PatientSSN = '" + c.element() + "' LIMIT 1");
c.output(tr);
}
}))
.apply("ParseResponseFromBigQuery", ParDo.of(new DoFn<TypedRead<TableRow>, Void>() {
@ProcessElement
public void processElement(ProcessContext c) throws Exception {
System.out.println(c.element().toString());
}
}));
p.run();
BigQueryIO returns a PCollection only; we can get the result as an entry set, as in the example below, or serialize it to objects as well, as mentioned here.
If you want to query BigQuery in the middle of your pipeline, use the BigQuery client instead of BigQueryIO, as mentioned here.
BigQueryIO Example:
PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
Pipeline pipeline = Pipeline.create(options);
PCollection<TableRow> result = pipeline.apply(BigQueryIO.readTableRows()
.fromQuery("SELECT id, name FROM [project-test:test_data.test] LIMIT 1"));
result.apply(MapElements.via(new SimpleFunction<TableRow, Void>() {
@Override
public Void apply(TableRow obj) {
System.out.println("***" + obj);
obj.entrySet().forEach(
(k)-> {
System.out.println(k.getKey() + " :" + k.getValue());
}
);
return null;
}
}));
pipeline.run().waitUntilFinish();
BigQuery Example:
// [START bigquery_simple_app_client]
BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
// [END bigquery_simple_app_client]
// [START bigquery_simple_app_query]
QueryJobConfiguration queryConfig =
QueryJobConfiguration.newBuilder(
"SELECT "
+ "CONCAT('https://stackoverflow.com/questions/', CAST(id as STRING)) as url, "
+ "view_count "
+ "FROM `bigquery-public-data.stackoverflow.posts_questions` "
+ "WHERE tags like '%google-bigquery%' "
+ "ORDER BY favorite_count DESC LIMIT 10")
// Use standard SQL syntax for queries.
// See: https://cloud.google.com/bigquery/sql-reference/
.setUseLegacySql(false)
.build();
// Create a job ID so that we can safely retry.
JobId jobId = JobId.of(UUID.randomUUID().toString());
Job queryJob = bigquery.create(JobInfo.newBuilder(queryConfig).setJobId(jobId).build());
// Wait for the query to complete.
queryJob = queryJob.waitFor();
// Check for errors
if (queryJob == null) {
throw new RuntimeException("Job no longer exists");
} else if (queryJob.getStatus().getError() != null) {
// You can also look at queryJob.getStatus().getExecutionErrors() for all
// errors, not just the latest one.
throw new RuntimeException(queryJob.getStatus().getError().toString());
}
// [END bigquery_simple_app_query]
// [START bigquery_simple_app_print]
// Get the results.
QueryResponse response = bigquery.getQueryResults(jobId);
TableResult result = queryJob.getQueryResults();
// Print all pages of the results.
for (FieldValueList row : result.iterateAll()) {
String url = row.get("url").getStringValue();
long viewCount = row.get("view_count").getLongValue();
System.out.printf("url: %s views: %d%n", url, viewCount);
}
// [END bigquery_simple_app_print]
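To tie the two together: a rough, untested sketch of running the lookup inside the original DoFn with the BigQuery client (the table and column names come from the question; a production version would create the client once in a @Setup method rather than per element):
// Uses com.google.cloud.bigquery.* (BigQuery, BigQueryOptions, QueryJobConfiguration,
// QueryParameterValue, FieldValueList) alongside the Beam ParDo/DoFn imports above.
ssnCollection.apply("SelectFromBQ", ParDo.of(new DoFn<String, String>() {
    @ProcessElement
    public void processElement(ProcessContext c) throws Exception {
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
        QueryJobConfiguration queryConfig = QueryJobConfiguration.newBuilder(
                "SELECT pid19PatientSSN FROM dataset.table "
                    + "WHERE pid19PatientSSN = @ssn LIMIT 1")
            // A named parameter replaces the string concatenation of the original query.
            .addNamedParameter("ssn", QueryParameterValue.string(c.element()))
            .setUseLegacySql(false) // named parameters require standard SQL
            .build();
        // query() runs the job and waits for it; emit any matching SSN downstream.
        for (FieldValueList row : bigquery.query(queryConfig).iterateAll()) {
            c.output(row.get("pid19PatientSSN").getStringValue());
        }
    }
}));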

Execute IN() SQL query with JDBCTemplate

I get an error running the following query. I know that I need as many '?' placeholders as there are elements in the list, but the size of the list varies and I don't know how to handle the '?'.
public List<Object> findAllGrades(String pnr, List<String>codes) {
return jdbcTemplate.query(
"select CourseParticipantship.grade from [thd].[dbo].[CourseParticipantship] "
+ "inner join thd.dbo.CourseMaterial on courseMaterial_id = CourseMaterial.id "
+ "inner join thd.dbo.Course on course_id = Course.id and Course.code in (?)"
+ "inner join thd.dbo.Student on student_id = Student.id "
+ "where Student.personalNumber in (?)",new Object[] { codes, pnr }, new RowMapper() {
public Object mapRow(ResultSet rs, int arg1) throws SQLException {
return rs.getObject("grade");
}
});
}
I get this error: Unable to convert between java.util.ArrayList and JAVA_OBJECT
You should use "in" here: + "where S.pnr in (?)"
You have to change
+ "where S.pnr= ?",new Object[] { codes, pnr }, new RowMapper() {
to
+ "where S.pnr in (?)",new Object[] { codes, pnr }, new RowMapper() {

Oracle vs Oracle ODBC

The following code works fine from within Oracle's SQL*Plus (using Oracle 11.2.0.2.0); however, when I connect with an ODBC connection via C# code, I get told I have an invalid character.
Since the single quote didn't work in SQL*Plus, I'm assuming the characters that are considered invalid by ODBC are the double quotes. I've tried braces '{' and brackets '[' but still get the same error -> ERROR [HY000][Oracle][ODBC][Ora]ORA-00911: invalid character <-
Any help would be much appreciated. I still don't understand why SQL statements would be interpreted differently because of the connection type.
CREATE USER "AD1\EGRYXU" IDENTIFIED EXTERNALLY;
If run alone, it gives an error stating that the username conflicts with another user or role name. It does, however, create the user in the database.
C# Code is below.
private void button1_Click(object sender, EventArgs e)
{
string happy = "";
string sql1 = "";
string sql2 = "";
string sql3 = "";
string sql4 = "";
string column;
int rownum = -1;
bool frst = false;
string dirIni = "\\\\ramxtxss021-f01\\hou_common_013\\globaluser\\";
string fileIni = "add_users.sql";
string transIniFullFileName = Path.Combine(dirIni, fileIni);
System.Data.Odbc.OdbcConnection conn = new System.Data.Odbc.OdbcConnection();
num_users = (usrdetails.Count > 0);
if (regions && num_users)
{
using (StreamWriter sw = new StreamWriter(transIniFullFileName))
{
for (int y = 0; y < usrdetails.Count; y++)
{
switch(usrdetails[y].add_del.ToUpper())
{
case "A":
sql1 = "CREATE USER \"" + usrdetails[y].userID.ToUpper() + "\" IDENTIFIED EXTERNALLY;";
sql2 = "GRANT EDMROLE TO \"" + usrdetails[y].userID.ToUpper() + "\";";
sql3 = "INSERT INTO MD_SITE_USER VALUES(generate_key(5), (select user_id from MD_SITE_USER where user_name = '" +
usrdetails[y].group + "') , {" + usrdetails[y].userID.ToUpper() + "}, " + usrdetails[y].seclev +
", '" + usrdetails[y].username.ToUpper() + "', 'U', '" + usrdetails[y].isext.ToUpper() + "', 'N');";
sw.WriteLine(sql1);
sw.WriteLine(sql2);
sw.WriteLine(sql3);
break;
case "D":
sql2 = "DELETE MD_SITE_APP_ACTION_OWNER WHERE user_id in (SELECT user_id FROM MD_SITE_USER where user_name = ‘"+ usrdetails[y].userID + "’+ and user_or_group = ‘U’);";
sql3 = "DELETE FROM MD_SITE_USER where user_name = ‘"+ usrdetails[y].userID + "’ and user_or_group = ‘U’;";
sql4 = "DROP USER "+ usrdetails[y].userID + " FROM USERS;";
sw.WriteLine(sql2);
sw.WriteLine(sql3);
sw.WriteLine(sql4);
break;
default:
MessageBox.Show("Add/Delete command argument not recognized for user\r\n" + usrdetails[y].userID + " \r\n Argument -> " + usrdetails[y].add_del);
break;
}
}
sw.Close();
}
for (int x = 0; x < region.Count; x++)
{
OdbcCommand command = new OdbcCommand();
conn.ConnectionString = "Driver={Oracle in OraClient11g_home1};" +
"Dbq=" + region[x].dbname +
";Uid=" + region[x].username + ";Pwd=" + region[x].password + ";";
try
{
string cmdTexts = File.ReadAllText(transIniFullFileName);
conn.Open();
using (conn)
{
command.Connection = conn;
command.CommandText = cmdTexts;
command.ExecuteNonQuery();
OdbcDataReader dr = command.ExecuteReader();
Form6.dataGridView2.AutoGenerateColumns = false;
if (!frst)
{
for (int i = 0; i < dr.FieldCount; i++)
{
column = dr.GetName(i);
Form6.dataGridView2.Columns.Add("col" + i, column);
Form6.dataGridView2.Columns[i].FillWeight = 1;
}
frst = true;
}
rownum++;
dataGridView1.Rows.Add();
dataGridView1.Rows[rownum].Cells[0].Value = "Results for Region -> " + Form5.region[x].dbname;
dataGridView1.Refresh();
while (dr.Read())
{
rownum++;
Form6.dataGridView2.Rows.Add();
for (int i = 0; i < dr.FieldCount; i++)
{
column = dr.GetValue(i).ToString();
Form6.dataGridView2.Rows[rownum].Cells[i].Value = column;
}
}
Form6.dataGridView2.Refresh();
Form6.dataGridView2.Show();
Form6.Show();
}
conn.Close();
Form6.dataGridView2.Refresh();
}
catch (Exception ex)
{
MessageBox.Show("Error Message: " + ex.Message);
}
}
}
else
{
if (!regions)
happy = "Error - You have not selected any regions.\r\n";
else
happy = "Regions are now selected.\r\n";
if (!num_users)
happy = happy + "Error - You have not entered any users.\r\n";
MessageBox.Show(happy);
}
File.Delete(transIniFullFileName);
}
Don't use ";" (semi-colon) in the command text..
The command text within ODBC or ODP should be a command, e.g. not a set of commands, therefore - ";" is not relevant, and is an invalid character.
it appears you are trying to run a script..
if that is your intent, it should be padded with a "begin" and "end" for the code to be able to run:
BEGIN
INSERT...;
DELETE ...;
END;
(refer to http://www.intertech.com/Blog/executing-sql-scripts-with-oracle-odp/ for more info)
Last thing: if you want to run a "create user" (or any other DDL) from within an anonymous block or a procedure, you need to run it with the "execute immediate" syntax:
BEGIN
execute immediate 'CREATE USER test IDENTIFIED EXTERNALLY';
END;
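Applied to the C# code above, that means not sending the whole add_users.sql file as one command text. A naive sketch (reusing the question's conn and transIniFullFileName; note this simple split would break if any statement contained a literal ';'):
string script = File.ReadAllText(transIniFullFileName);
// Run each statement on its own; ODBC rejects the ';' separator.
foreach (string statement in script.Split(new[] { ';' }, StringSplitOptions.RemoveEmptyEntries))
{
    string sql = statement.Trim();
    if (sql.Length == 0) continue;
    using (var command = new OdbcCommand(sql, conn))
    {
        command.ExecuteNonQuery();
    }
}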