Unable to archive documents from Enterprise Scan to OpenText after adding Text: Table Key Lookup in category - enterprise

Good day,
I am struggling with an Enterprise Scan issue after adding table key lookups to a category in OpenText.
Error: Critical error during archiving. Invalid JSON primitive: . The archiving procedure aborted.
Would someone be able to guide me on how to fix this issue?
Thank you.

Related

SSAS corrupt string store data file for one of the table columns error

I'm getting an error on SSAS when redeploying the project. The error is:
The JSON DDL request failed with the following error: Error happened while loading table data. Possible cause is: corrupt string store data file for one of the table columns.
Error happened while loading table data. A duplicate value has been detected in the Unique Value store associated with the dictionary.
Database consistency checks (DBCC) failed while checking the data segments.
Error happened while loading table '', file '1245.H$Countries (437294994)$Country (437295007).POS_TO_ID.0.idf'.
Database consistency checks (DBCC) failed while checking the data segments.
Error happened while loading table '', file '1245.H$Countries (437294994)$City ....
I checked the table Countries but there is no duplicated data.
Is there anybody who can help please?
As the error implies, the model has some corrupted data (not to be confused with duplicated data).
Microsoft has some resolutions for these kinds of errors here: https://learn.microsoft.com/en-us/analysis-services/instances/database-consistency-checker-dbcc-for-analysis-services?view=asallproducts-allversions#common-resolutions-for-error-conditions
TL;DR:
Depending on the error, the recommended resolution is to either
reprocess an object, delete and redeploy a solution, or restore the
database.

Why is WiX Torch giving me error code 0279?

I have a problem when building a .wix MST difference file. I get the following error:
"The table definition of target database does not match the table definition updated database. A transform requires that the target database schema match the update database schema".
I spent almost two hours searching for a solution online, but no luck. I know it is probably caused by a difference in the MSI table schemas, but I have no idea how to fix it.

Is there size limit on appending ORC data files to Vora tables

I created a Vora table in Vora 1.3 and tried to append data to that table from ORC files that I got from the SAP BW archiving process (NLS on Hadoop). I had 20 files, in total containing approximately 50 million records.
When I tried to use the "files" setting in the APPEND statement as "/path/*", after approximately one hour Vora returned this error message:
com.sap.spark.vora.client.VoraClientException: Could not load table F002_5F: [Vora [eba156.extendtec.com.au:42681.1640438]] java.lang.RuntimeException: Wrong magic number in response, expected: 0x56320170, actual: 0x00000000. An unsuccessful attempt to load a table might lead to an inconsistent table state. Please drop the table and re-create it if necessary. with error code 0, status ERROR_STATUS
The next thing I tried was appending data from each file using separate APPEND statements. On the 15th append (of 20) I got the same error message.
The error indicates that the Vora engine on node eba156.extendtec.com.au is not available. I suspect it either crashed or ran into an out-of-memory situation.
You can check the log directory for a crash dump. If you find one, please open a customer message for further investigation.
If you do not find a crash dump, it is likely an out-of-memory situation. You should find confirmation in either the engine log file or in /var/log/messages (if the OOM killer ended the process). In that case, the available memory is not sufficient to load the data.

Unable to allocate new pages in table space "XXX" IBM DB2 SQL Replication

Product: IBM DB2
OS: Windows 2008 R2
I am trying to perform SQL replication on my database. I have created the capture tables, but while trying to register the tables I got the following error message:
[IBM][CLI Driver][DB2/NT64] SQL0289N Unable to allocate new pages in table space "xxxxxxx". SQLSTATE=57011
Thanks In advance.
I suggest checking the extent size of your table space. Run the following command: SELECT TBSPACE, OWNER, EXTENTSIZE FROM SYSCAT.TABLESPACES. This will show the EXTENTSIZE of each table space. You may need to change the extent size depending on your requirements.
An extent is a block of storage within a table space container. It represents the number of pages of data that will be written to a container before writing to the next container. When you create a table space, you can choose the extent size based on your requirements for performance and storage management. See the DB2 documentation for more details.
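As a sketch of the steps above: the first statement is the catalog query from the answer, and since SQL0289N typically means the table space has run out of pages, the ALTER statements show two common ways to give it room. The table space name TS_DATA and the page count are illustrative, not from the question.

```sql
-- Check the extent size (in pages) of each table space
SELECT TBSPACE, OWNER, EXTENTSIZE
FROM SYSCAT.TABLESPACES;

-- For a DMS table space that has filled up, extend every container
-- (the name and size here are placeholders):
ALTER TABLESPACE TS_DATA EXTEND (ALL 10000);

-- Or let DB2 grow the table space automatically:
ALTER TABLESPACE TS_DATA AUTORESIZE YES MAXSIZE NONE;
```

Which option applies depends on how the table space was created (SMS, DMS, or automatic storage), so check with your DBA before altering it.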

Rails 3 - heroku PGError: ERROR: type modifier is not allowed for type "text"

I am a newbie on Rails and doing OK so far. I want to find out which schema file is uploaded when you run heroku rake db:setup, because even though I have deleted a table it keeps trying to create it on Heroku and gives an error.
I even tried recreating the table but it keeps remembering the old setting and errors out.
PGError: ERROR: type modifier is not allowed for type "text"
LINE 1: "trainings" ("id" serial primary key, "content" text(255),...
It's trying to create the trainings table with a content column of type text(255), but I no longer have that setting and I think it is saved somewhere.
I even tried deleting my app and restarting it but no luck.
Any clues?
Thanks.
The default database on Heroku is PostgreSQL, and the text type in PostgreSQL does not accept a size modifier: it is unlimited.
See http://www.postgresql.org/docs/9.1/static/datatype-character.html
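To illustrate the point, here is a minimal sketch using the trainings table from the error message (the column list is simplified):

```sql
-- Fails in PostgreSQL: text takes no length modifier
-- CREATE TABLE trainings (id serial PRIMARY KEY, content text(255));

-- Works: text is unlimited
CREATE TABLE trainings (id serial PRIMARY KEY, content text);

-- If a 255-character limit is actually wanted, use varchar instead:
-- CREATE TABLE trainings (id serial PRIMARY KEY, content varchar(255));
```

So the fix is to regenerate the schema without the size on the text column, or switch the column to a string/varchar type if the limit matters.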