Is there support for BLOBs in YugabyteDB like in Oracle or Postgres?

Since YugabyteDB is a Postgres-based implementation, will Postgres large objects (https://www.postgresql.org/docs/9.2/largeobjects.html) work in YSQL?

YugabyteDB currently supports BYTEA, the same as PostgreSQL, but it does not support PostgreSQL-style large objects (which split the blob internally into chunks). There is a feature request issue on GitHub for large-object support.
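Since YSQL keeps PostgreSQL's bytea semantics, binary data can be round-tripped using the standard hex format. A minimal client-side sketch (no database connection; the hypothetical column would be declared BYTEA):

```python
# PostgreSQL/YSQL represent bytea values in "hex" format: a literal
# prefixed with \x followed by two hex digits per byte.
payload = b"\x00\x01binary blob\xff"

# Encode for use in a literal against a hypothetical table t(data BYTEA)
hex_literal = "\\x" + payload.hex()

# Decode a hex-format value read back as text (strip the \x prefix)
decoded = bytes.fromhex(hex_literal[2:])
assert decoded == payload
```

In practice a driver such as psycopg2 handles this encoding for you; the sketch only shows the wire format the server expects.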

Related

DHIS2 and MySQL?

The DHIS2 documentation mentions that it supports MySQL (https://docs.dhis2.org/2.28/en/implementer/html/installation.html); however, that is the last place MySQL is ever mentioned.
Does the current version really support MySQL? If it does, will GIS still work?
From a direct DHIS2 support email:
Up until and including version 2.28, MySQL should work.
However, from version 2.29 we require PostgreSQL as the database platform, together with the PostGIS spatial extension. This means that MySQL is no longer supported.
The minimum version required is PostgreSQL 9.1. However, we recommend upgrading to a later version, as we plan to take advantage of some useful features of PostgreSQL 10, such as logical replication and native partitioning, in future versions of DHIS 2.
First of all, it is recommended to use Postgres.
Secondly, most of the testing and QA is done on instances running Postgres.
Thirdly, the PostGIS extension is available only for Postgres, which could become a hurdle for you at a later stage.
Fourthly, the GIS data points and boundaries are stored in a format that is better handled by Postgres.
Therefore, please go with Postgres and chill.

Does Google BigQuery support jsonb as data type?

We are thinking of moving from PostgreSQL to Google BigQuery. PostgreSQL supports jsonb as a data type. Can someone tell me whether BigQuery also supports that type?
You can read about supported data types in the documentation. There is no jsonb type, but you may be interested in the related feature request for a JSON type, which has some workarounds.
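Until a native JSON type lands, the usual workaround is to store the document in a STRING column and pull fields out with BigQuery's JSON functions (e.g. JSON_EXTRACT_SCALAR with a '$.user.name'-style path). A rough client-side sketch of the same idea, assuming the stored payload is valid JSON text:

```python
import json

# The jsonb column's content, stored as a plain string (as BigQuery would)
row = {"payload": '{"user": {"id": 42, "name": "ada"}}'}

def extract_scalar(json_text, *path):
    """Roughly mimics JSON_EXTRACT_SCALAR: walk the parsed document
    along the given keys and return the value, or None if missing."""
    node = json.loads(json_text)
    for key in path:
        if not isinstance(node, dict) or key not in node:
            return None
        node = node[key]
    return node

assert extract_scalar(row["payload"], "user", "name") == "ada"
```

The trade-off versus jsonb is that the string is reparsed on every query and there is no binary indexing of keys.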

Limitation of SQL in BigQuery vs Cloud Spanner

While comparing Cloud Spanner and BigQuery, I am trying to figure out what limitations BigQuery's SQL has compared to ANSI SQL (SELECT only).
Does BigQuery support all the complex joins of ANSI SQL?
Additionally, is there anything that Cloud Spanner can do and BigQuery cannot?
BigQuery Standard SQL is compliant with the SQL 2011 standard and has extensions that support querying nested and repeated data.
You can read about SELECT, JOINs, and other details of BigQuery Standard SQL in the Query Syntax documentation.
Additionally, is there anything that Spanner can do and BigQuery cannot?
The main differences between BigQuery and Spanner:
BigQuery - Large scale data warehouse service with append-only tables
Spanner - A horizontally scalable, globally consistent, relational database service
Foreign keys, transaction support, and indexes are good examples of what is supported in Spanner but not in BigQuery.
Note: the unsupported features above are by design and reflect the respective purposes of the two products. What is a must-have feature in one may not even conceptually exist in the other. Comparing BigQuery and Spanner is close to comparing Hadoop and MySQL, if that makes it easier to picture.
I think it would be great if you read the respective documentation and then ask specific questions:
cloud.google.com/bigquery/docs
cloud.google.com/spanner/docs

Can I use Madlib with Amazon Redshift?

The MADlib website suggests it is compatible with PostgreSQL, and Amazon Redshift is based on PostgreSQL. Can I install MADlib on Redshift?
The MADlib website says it is compatible with Postgres, but you get the full advantage of MADlib when you use it with an MPP (massively parallel processing) database. MADlib also relies on some internal Python libraries that may not be available in Amazon Redshift. It would be better to use it with Greenplum, which is now open source and is based entirely on Postgres; otherwise you will not get the most out of it.

How to retrieve text stored in BLOB column in ORACLE 11g using SQL?

I have compressed JSON text stored in a BLOB column in Oracle 11g.
Is it possible to retrieve it using SQL only?
EDIT:
AFAIK the data was compressed on Linux using zlib and loaded using dbms_lob.loadfromfile.
Oracle doesn't provide any built-in functions that would uncompress a ZLIB-compressed stream (though utl_compress uses very, very similar algorithms).
You would realistically need to load one of the various Java libraries that uncompresses a ZLIB-compressed stream into the database, write a bit of code to wrap that library, and then call that library from SQL. This wouldn't be a pure SQL implementation.
If you're really ambitious, it should be possible to implement the DEFLATE algorithm in pure SQL though that would likely be exceedingly painful SQL to write (or debug or maintain).
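To see what such a wrapper would have to do, here is the decompression step in plain Python, run client-side outside the database; it assumes the BLOB holds a raw zlib stream, as the question describes:

```python
import zlib

# Simulate the stored BLOB: JSON text compressed with zlib on Linux
original = b'{"status": "ok", "items": [1, 2, 3]}'
blob = zlib.compress(original)

# This is the operation a Java (or client-side) wrapper would expose
# to SQL: inflate the zlib stream back into the original JSON text.
restored = zlib.decompress(blob)
assert restored == original
```

If decompression fails with a header error, the stream may be gzip rather than raw zlib; the two formats wrap the same DEFLATE data differently.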