"The maximum total size of all input parameters is 5242880 bytes." Error message when using BigQuery remote function - google-bigquery

When calling a remote function in BigQuery, I get the following error: The maximum total size of all input parameters is 5242880 bytes. This error is undocumented (or my Googling ability is lacking). It is even more unexpected because the query that triggers it is as simple as it gets:
SELECT
`remote_function`( payload ) AS results
FROM
`source_table`
The error is definitely on BigQuery's side, as no error is observed on the Cloud Function being called.
How should I proceed with such an issue? I'm guessing I'm hitting some kind of limit related to remote functions, but which one?
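For what it's worth, 5,242,880 bytes is exactly 5 MB, so one plausible reading is a 5 MB cap on the input parameters that BigQuery packs into a single HTTP request to the remote endpoint. If that is the case, lowering max_batching_rows (a documented remote function option) so that fewer rows are sent per request may keep each batch under the cap; it won't help if a single payload value already exceeds 5 MB on its own. A minimal sketch, with the project, dataset, connection, and endpoint names as placeholders:
CREATE OR REPLACE FUNCTION `my_project.my_dataset.remote_function`(payload STRING)
RETURNS STRING
REMOTE WITH CONNECTION `my_project.us.my_connection`
OPTIONS (
  endpoint = 'https://us-central1-my_project.cloudfunctions.net/my_function',
  -- Cap how many rows BigQuery packs into one HTTP request so the
  -- combined parameter size stays below the limit in the error message.
  max_batching_rows = 10
);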

Related

How do I solve the "-307" error in the ZKTeco SDK?

Hello everyone, and thanks for reading this problem.
I have a solution in C# that uses the zkemkeeper DLL to get the records from some access control devices. When I "ping" them there isn't any problem, but when I try to connect to them (using my solution or the standalone demo to get attendance) I get the "-307" error with the "Unable to connect" message. That's not very clear, and I would really appreciate it if someone could explain what this error is (please!). I would really like to understand these errors myself, so where can I find the definitions of all these errors?
In short:
1.- What does the "-307" error mean?
2.- Is there any place where all these errors are documented?
Thanks in advance!!
Maybe you should check your device first: confirm that you can still ping it, then read the error code the SDK reports after the failed connect (see the code sketch after the error list below).
Attention
The dwErrorCode parameter specifies the error code. The values are described as follows:
During connection, the following error codes may be returned:
0 Connected successfully
-1 Failed to invoke the interface
-2 Failed to initialize
-3 Failed to initialize parameters
-5 Data mode read error
-6 Wrong password
-7 Reply error
-8 Receive timeout
-307 Connection timeout
In invoking other interfaces, the following error codes may be returned:
-201 Device is busy
-199 New Mode
-103 Device returned a face version error
-102 Face template version error (for example, an 8.0 face template sent to a 7.0 device)
-101 Memory allocation (malloc) failed
-100 Not supported or the data does not exist
-10 The length of transmitted data is incorrect
-5 Data already exists
-4 Insufficient space
-3 Wrong size
-2 File read/write error
-1 The SDK is not initialized and needs to be reconnected
0 Data not found or duplicate data
1 Correct operation
4 Parameter error
101 Buffer allocation error
102 Repeated invocation
Underlying error codes:
-12001 Socket creation timeout (connection timeout)
-12002 Insufficient memory
-12003 Wrong Socket version
-12004 Not TCP protocol
-12005 Waiting timeout
-12006 Data transmission timeout
-12007 Data reading timeout
-12008 Failed to read Socket
-13009 Waiting event error
-13010 Exceeded retry attempts
-13011 Wrong reply ID
-13012 Checksum error
-13013 Waiting event timeout
-13014 DIRTY_DATA
-13015 Buffer size too small
-13016 Wrong data length
-13017 Invalid data read (1)
-13018 Invalid data read (2)
-13019 Invalid data read (3)
-13020 Data loss
-13021 Memory initialization error
-15001 Return value when the status key issued by the SetShortkey interface is invoked repeatedly
-15002 Return value when the description issued by the SetShortkey interface is invoked repeatedly
-15003 The two-level menu is not opened on the device, so the data need not be issued
getdevicedata and setdevicedata invocation error codes:
-15100 Error occurs in obtaining table structure
-15101 The condition field does not exist in the table structure
-15102 Inconsistency in the total number of fields
-15103 Inconsistency in sorting fields
-15104 Memory allocation error
-15105 Data parsing error
-15106 Data overflow as the transmitted data exceeds 4 MB
-15108 Invalid options
-15113 Data parsing error: table ID not found
-15114 A data exception is returned as the number of fields is smaller than or equal to 0
-15115 A data exception is returned as the total number of table fields is inconsistent with the total number of fields of the data
Firmware error codes:
2000 Return OK to execute
-2001 Return Fail to execute command
-2002 Return Data
-2003 Registered event occurred
-2004 Return REPEAT Command
-2005 Return UNAUTH Command
0xffff Return Unknown Command
-4999 Device parameter read error
-4998 Device parameter write error
-4997 The length of the data sent by the software to the device is incorrect
-4996 A parameter error exists in the data sent by the software to the device
-4995 Failed to add data to the database
-4994 Failed to update the database
-4993 Failed to read data from the database
-4992 Failed to delete data in the database
-4991 Data not found in the database
-4990 The data amount in the database reaches the limit
-4989 Failed to allocate memory to a session
-4988 Insufficient space in the memory allocated to a session
-4987 The memory allocated to a session overflows
-4986 File does not exist
-4985 File read failure
-4984 File write failure
-4983 Failed to calculate the hash value
-4982 Failed to allocate memory
Note
This interface is applicable to the new architecture firmware.
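To get at the code programmatically rather than relying on the dialog text, you can ask the SDK itself after a failed connect. A minimal sketch in C#, assuming the standard zkemkeeper COM wrapper is referenced (the IP address is a placeholder; 4370 is the usual default port):
using System;

class Program
{
    static void Main()
    {
        var device = new zkemkeeper.CZKEM();

        // Connect_Net returns false on failure; GetLastError then yields the
        // dwErrorCode from the table above. -307 is a connection timeout: the
        // device is unreachable on TCP port 4370 (wrong port, firewall, or a
        // device that answers ICMP ping but refuses the SDK connection).
        if (!device.Connect_Net("192.168.1.201", 4370))
        {
            int errorCode = 0;
            device.GetLastError(ref errorCode);
            Console.WriteLine("Connect failed, error code: " + errorCode);
            return;
        }

        Console.WriteLine("Connected.");
        device.Disconnect();
    }
}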

ERROR: Limit set by ERRORS= option reached. Further errors for this INPUT function will not be printed

While using the code below I'm getting the error message shown, although I do still get the output table.
input(put(serv_to_DT_KEY,8.),yymmdd8.)
between datepart(D.throughdate)
and datepart(intnx('day',d.throughdate,31))
Error: INPUT function reported 'ERROR: Invalid date value' while processing WHERE clause.
ERROR: Limit set by ERRORS= option reached. Further errors for this INPUT function will not be printed.
Could you please help?
It's not a true error, exactly; it's a data error, which SAS will let pass. What it's saying is that the value in serv_to_DT_KEY is not a valid yymmdd8. date for some records. (The rest of the message, about "limit set by ...", is just SAS saying it's only going to tell you about 20 or so individual data errors instead of showing you every single one, so your log isn't hopeless.)
To fix this you have several options:
If there is a data issue (i.e., all rows should have a valid date), then fix that.
If it's okay that some rows don't have a valid date (for example, they might be missing), you can use the ?? modifier on the informat in input to tell it to ignore these.
Like this:
input(put(serv_to_DT_KEY,8.),??yymmdd8.)
between datepart(D.throughdate)
and datepart(intnx('day',d.throughdate,31))
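If you want to see the offending records before deciding which case you're in, here is a minimal sketch (assuming a source dataset named have; both have and bad_dates are placeholder names) that isolates the rows whose key is not a valid date:
data bad_dates;
    set have;
    /* ?? suppresses the log message and leaves _ERROR_ at 0; a missing
       result means serv_to_DT_KEY is not a valid yymmdd8. value. */
    if input(put(serv_to_DT_KEY, 8.), ?? yymmdd8.) = . then output;
run;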

Why do I get a "Conversion from string to number failed with value ''" error when connecting to Google BigQuery via the Simba ODBC driver?

I'm trying to use the Simba ODBC Driver for Google BigQuery. The connection succeeded initially, but after I enabled the High-Throughput API in "Advanced Options", I get the following error:
[Simba][Support] (50090) Conversion from string to number failed with value ''
I still get this error after unchecking the High-Throughput API option.
These are the options I initially set:
This error happens because the Minimum Query Results Size for HTAPI and Ratio of Results to Rows Per Block options are both set to an empty string.
Set both values to 0 to always download query results with the BigQuery Storage API.
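If you want to confirm the fix from code rather than from the DSN dialog, here is a minimal C# sketch using the built-in ODBC classes (BigQueryDSN is a placeholder DSN configured with the Simba driver; the two options above still need to be set to 0 in that DSN's Advanced Options):
using System;
using System.Data.Odbc;

class Program
{
    static void Main()
    {
        // The DSN carries all driver settings, including the two HTAPI
        // options; once they are 0 instead of '', the string-to-number
        // conversion error should no longer appear.
        using (var conn = new OdbcConnection("DSN=BigQueryDSN"))
        {
            conn.Open();
            using (var cmd = new OdbcCommand("SELECT 1", conn))
            {
                Console.WriteLine(cmd.ExecuteScalar());
            }
        }
    }
}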

Issues importing data from a multi-value D3 database into SQL

I'm trying to use the mv.NET tools by BlueFinity. I made some integration packages with them for importing data from a D3 multi-value database into MS SQL 2012, but I seem to be having some trouble with the mapping.
The VOYAGES table has some commentX fields in the D3 application that are acting quite unwieldy, and the INSERT fails after a certain number of rows with the following message:
>Error: 0xC0047062 at INSERT, mvNET Source[354]: System.Exception: Error #8: dataReader[0] = LTPAC002 ci.BufferColumnIndex = 52, ci.ColumnName = COMMGROUP(Error #8: dataReader[0] = LTPAC002 ci.BufferColumnIndex = 52, ci.ColumnName = COMMGROUP(The value is too large to fit in the column data area of the buffer.))
at mvNETDataSource.mvNETSource.PrimeOutput(Int32 outputs, Int32[] outputIDs, PipelineBuffer[] buffers)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPrimeOutput(IDTSManagedComponentWrapper100 wrapper, Int32 outputs, Int32[] outputIDs, IDTSBuffer100[] buffers, IntPtr ppBufferWirePacket)
Error: 0xC0047038 at INSERT, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on mvNET Source returned error code 0x80131500. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
"The value is too large to fit in the column data area of the buffer." I tried changing the input/output types but can't seem to get it right.
In the SQL table the columns are of type ntext.
In the .dtsx job the data type for the columns is Unicode String [DT_WSTR] with length 4000; I guess these were auto-detected.
The import worked for other D3 files like this, so I'm not sure why it fails for these comment fields.
Running the query in the mv.NET Data Manager (on the D3 server) times out after 240 seconds, so maybe this is the underlying issue?
Any ideas how to proceed? Thank you.
The most likely reason is that the COMMGROUP column does not have the correct data type, or some records in the source do not fit in the output type.
To find the offending record(s), configure the failing component's error output to redirect rows, write the redirected rows to a txt/csv/tsv file, and then check the data.
The exception is being thrown from mv.NET, so I suggest you call (or ask your reseller to call) BlueFinity support and ask them about this. You're paying for support, so you might as well use it. Those programs shouldn't be allowed to throw exceptions like that.
D3 doesn't export Unicode, so that might be one issue. But if the Data Manager times out, I suspect something is wrong with the connectivity into D3. Open a Connection Monitor from the Session Monitor and watch the connection when you make the request. I'm guessing it's either hanging or, more probably, falling into BASIC Debug.
Make sure all D3-side programs related to this are either all Flash-compiled, or all Not Flashed. Your app code will fall into Debug if it's not Flashed but MVNET.BP is.
If it's your program that's in Debug, fix it. If you're not sure which program it is, LIST-RUNTIME-ERRORS in DM.
If it's an MVNET.BP program, again, work with BlueFinity. If you are using MVSP for connectivity then the Connection Monitor may be useless; you'll need to change that to an IP (Telnet) connection to see the raw data exchange.

Getting an exception while reading data from a blob in Azure

While I am trying to read the list of blob data on Azure, I get the following error:
Function evaluation disabled because a previous function evaluation timed out. You must continue execution to reenable function evaluation.
How to resolve this?
Please see the following link. Your code likely has an endless loop. https://msdn.microsoft.com/en-us/library/ms234762.aspx
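For what it's worth, that message comes from the Visual Studio debugger rather than from Azure Storage itself: the debugger gave up evaluating an expression (often a lazy enumeration) in a watch window or tooltip. A minimal sketch, assuming the classic WindowsAzure.Storage SDK of that era (the container name and connection string are placeholders), that materializes the listing once so there is nothing lazy left for the debugger to evaluate:
using System;
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class Program
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=...");
        var client = account.CreateCloudBlobClient();
        var container = client.GetContainerReference("mycontainer");

        // ListBlobs is lazy; forcing it with ToList() enumerates it exactly
        // once, so the debugger inspects a plain list instead of re-running
        // the (possibly slow or looping) query during evaluation.
        var blobs = container.ListBlobs(useFlatBlobListing: true).ToList();
        foreach (var item in blobs)
            Console.WriteLine(item.Uri);
    }
}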