Error - mismatched input 'SWAP' expecting 'REROUTE' in CrateDB ALTER CLUSTER query

I am upgrading CrateDB from version 2.x to 3.1.6. While upgrading tables created in version 2.x, I followed this document:
https://crate.io/docs/crate/reference/en/latest/admin/system-information.html#tables-need-to-be-upgraded
In step 5, I ran this query:
ALTER CLUSTER SWAP TABLE transactions2 TO transactions;
and I got the error SQLActionException[SQLParseException: line 1:15: mismatched input 'SWAP' expecting 'REROUTE'].
I am not sure what the correct query would be to resolve this.

You are following the latest documentation instead of the version that matches your installation, e.g. the 2.3 documentation (https://crate.io/docs/crate/reference/en/2.3/admin/system-information.html#tables-need-to-be-recreated).
Support for the SWAP SQL command was added in version 3.2; see https://crate.io/docs/crate/reference/en/latest/appendices/release-notes/3.2.0.html#database-administration.
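If upgrading to 3.2+ is not immediately possible, one rough alternative is a pair of renames. This is a sketch, not the documented procedure: it assumes your version already supports ALTER TABLE ... RENAME TO, and that writes are paused, since the two statements are not atomic the way SWAP is.

```sql
-- Sketch only: unlike SWAP, these two renames are not atomic.
ALTER TABLE transactions RENAME TO transactions_old;
ALTER TABLE transactions2 RENAME TO transactions;
-- After verifying the data: DROP TABLE transactions_old;
```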

Related

SQL parse error when using "WITH CHANGE DATA FEED" on Databricks

This works:
ALTER SHARE my_share ADD TABLE main.default.my_table;
This doesn't:
ALTER SHARE my_share ADD TABLE main.default.my_table WITH CHANGE DATA FEED;
But according to the docs it should work. I just get a "ParseException" with no further details.
What's the right syntax?
The cluster is running Databricks Runtime 10.4 LTS.
This is fixed in Databricks Runtime 11.3 LTS.

How to make a Django Window expression work with SQLite?

I am testing the Django ORM's Window SQL-wrapper capabilities.
I have the following query in my code:
from django.db.models import F, Window
from django.db.models.functions import RowNumber

queryset = TripInterval.objects.annotate(
    num=Window(RowNumber(), order_by=F('id').asc())
).values('id', 'num')
which results in the following SQL query string (from debugger):
SELECT "request_tripinterval"."id",
ROW_NUMBER() OVER (ORDER BY "request_tripinterval"."id" ASC) AS "num"
FROM "request_tripinterval"
and is pretty straightforward. It WORKS when I copy/paste it into a third-party DB client. But the Django ORM gives me an error:
OperationalError
near "(": syntax error
What is wrong here?
System: Windows 10
RDBMS: SQLite
Django: 2.2.4
Python: 3.6.0
Sounds like your Python is using an outdated version of SQLite.
SQLite added support for window functions in version 3.25, released in August 2018. Prior to that version, the exact same syntax error you're seeing would be thrown when trying to use window functions.
You can check the SQLite version used by Python by running this in the interpreter:
import sqlite3
sqlite3.sqlite_version
If the version that is output is older than 3.25, you'll need to upgrade your SQLite library version.
On a Windows system, the easiest way to do that is to install the sqlite package from Anaconda. Otherwise, the general approach is to upgrade your installed system SQLite libraries and then recompile/reinstall Python. Alternatively, you could try installing the pysqlite package from PyPI.
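A quick way to confirm that this is the cause is to run a minimal window query against an in-memory database. This is a standard-library-only sketch (no Django needed), so it isolates the SQLite library from everything else:

```python
import sqlite3

# The DB-API module version is not what matters; the bundled SQLite
# library version is. It needs to be >= 3.25.0 for window functions.
print(sqlite3.sqlite_version)

def supports_window_functions():
    """Run a trivial window query; pre-3.25 SQLite raises the same
    'near "(": syntax error' seen in the Django traceback."""
    try:
        with sqlite3.connect(":memory:") as conn:
            conn.execute(
                "SELECT ROW_NUMBER() OVER (ORDER BY x) FROM (SELECT 1 AS x)"
            )
        return True
    except sqlite3.OperationalError:
        return False

print(supports_window_functions())
```

If this prints False, the ORM query cannot work no matter how it is written, and the library itself needs upgrading.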

Hive on Spark execution engine failed

I am trying the Hive on Spark execution engine. I am using Hadoop 2.6.0, Hive 1.2.1, and Spark 1.6.0. Hive runs successfully on the MapReduce engine. Now I am trying the Spark engine. Individually, all of the components work properly. In Hive I set the following properties:
set hive.execution.engine=spark;
set spark.master=spark://INBBRDSSVM294:7077;
set spark.executor.memory=2g;
set spark.serializer=org.apache.spark.serializer.KryoSerializer;
I added the spark-assembly jar to Hive's lib directory, and I am trying this command:
select count(*) from sample;
I get output like this:
Starting Spark Job = b1410161-a414-41a9-a45a-cb7109028fff
Status: SENT
Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
Am I missing any other required settings? Please guide me.
I think the problem may be that you are using incompatible versions. If you look at the version compatibility table in Hive on Spark: Getting Started, you'll see that these two specific versions are not guaranteed to work together.
I advise you to switch to the compatible versions they recommend. I had the same problem and solved it by changing to compatible versions.
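The gist of that compatibility table can be sketched as a simple lookup. The version pairs below are as I recall them from the "Hive on Spark: Getting Started" wiki, so verify them against the page before relying on them:

```python
# Hedged sketch: Hive release -> Spark version it was built/tested against,
# per the "Hive on Spark: Getting Started" wiki (verify for your versions).
HIVE_TO_SPARK = {
    "1.1": "1.2.0",
    "1.2": "1.3.1",  # the question pairs Hive 1.2.1 with Spark 1.6.0 -> mismatch
    "2.0": "1.5.0",
    "2.1": "1.6.0",
    "2.2": "1.6.0",
    "2.3": "2.0.0",
}

def recommended_spark(hive_version):
    """Return the Spark version a given Hive release was built against."""
    major_minor = ".".join(hive_version.split(".")[:2])
    return HIVE_TO_SPARK.get(major_minor)

print(recommended_spark("1.2.1"))
```

For Hive 1.2.1 the lookup gives Spark 1.3.1, not 1.6.0, which is consistent with the RPC channel failing during job submission.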

missing column 'CONTEXTS' after Liquibase migration to version 3.4.0 from 2.0.4

I am trying to migrate Liquibase from the old version 2.0.5 to 3.4.0. The first thing I tried was the status command, which resulted in a problem reading the table 'databasechangelog':
Error executing SQL SELECT
FILENAME,AUTHOR,ID,MD5SUM,DATEEXECUTED,ORDEREXECUTED,EXECTYPE,DESCRIPTION,COMMENTS,TAG,LIQUIBASE,LABELS,CONTEXTS
FROM IDENTSERVICE.DATABASECHANGELOG ORDER BY DATEEXECUTED ASC,
ORDEREXECUTED ASC: ORA-00904: "CONTEXTS"
It seems there are two new columns in the table: LABELS and CONTEXTS.
The Liquibase website says it is just a drop-in replacement.
So my question: do I first need to go to version 3.0.0, or how do I get the new columns? Manual manipulation is not an option.
Check out the `StandardChangeLogHistoryService` class.
Search for hasContexts, for example. There is code that checks whether those columns are there.
So if the test for the column CONTEXTS fails, there should be a log line that says:
"Adding missing databasechangelog.contexts column".
You can check your log for this.
So yes, it is supposed to be a drop-in replacement.
The solution is not to run the status command but the update command, which worked well.
Be careful: you cannot go back to the old version of Liquibase. The old version throws checksum errors.
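The check itself is easy to reproduce. Here is a rough sketch of the kind of "column missing, so add it" logic described above, using SQLite purely for illustration (the real code in `StandardChangeLogHistoryService` works against your actual database's metadata):

```python
import sqlite3

def has_column(conn, table, column):
    """Return True if `table` already has `column` (SQLite metadata query)."""
    cols = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    return column.upper() in (c.upper() for c in cols)

conn = sqlite3.connect(":memory:")
# A cut-down DATABASECHANGELOG as a Liquibase 2.x run would have left it:
conn.execute("CREATE TABLE DATABASECHANGELOG (FILENAME TEXT, AUTHOR TEXT, ID TEXT)")

if not has_column(conn, "DATABASECHANGELOG", "CONTEXTS"):
    print("Adding missing databasechangelog.contexts column")
    conn.execute("ALTER TABLE DATABASECHANGELOG ADD COLUMN CONTEXTS TEXT")

print(has_column(conn, "DATABASECHANGELOG", "CONTEXTS"))
```

This is why the update command fixes the table while status merely trips over it: the upgrade path runs the ALTER, the read path does not.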

Timestamp out of range - PostgreSQL with OLEDB .NET

We have a .NET application which stores data in PostgreSQL. Our code works perfectly with PostgreSQL 8.3, and now that we are trying to upgrade our PostgreSQL server from version 8.3 to 9.3, our code seems to break.
For connecting to PostgreSQL we are using OLEDB.
The issue we are getting is "Timestamp out of range". When we looked through the logs, we saw the weird timestamp "152085-04-28 06:14:51.818821".
From our application we pass a value from .NET code to a PostgreSQL function parameter of type timestamp. As we are using OLEDB connections, we use the parameter type OleDbType.DBTimeStamp and send a DateTime value from .NET. This code works with PostgreSQL 8.3 but breaks with 9.3. The PostgreSQL 9.3 logs show the parameter value received as "152085-04-28 06:14:51.818821".
We tried executing the same function from sample .NET code using the Npgsql provider, passing a DateTime value with the parameter type NpgsqlDbType.TimestampTZ, and we got correct results. The PostgreSQL logs show the parameter value received by the function as "E'2014-01-30 12:17:50.804220'::timestamptz".
We tried other PostgreSQL versions, i.e. 9.1, 9.2, and 9.3, and it breaks in all of them.
Any idea why this breaks in the newer versions of PostgreSQL when it works perfectly in 8.3?
Thanks