I am trying to create an IoT rule as part of a CloudFormation (CFT) template.
The rule has to ignore messages whose sNumber field starts with F0F1.
The rule looks like this:
SELECT * FROM 'topic/+/+/+' WHERE 'sNumber' NOT LIKE 'F0F1%'
But I am facing this error:
Resource handler returned message: "Expected a comparison operation:
StringNode(sNumber) 'sNumber' NOT LIKE 'F0F1%'
--------------------------------------------------------------------------------------------------------------------------------^ at 1:34 (Service: Iot, Status Code: 400, Request ID:
75e91f11-05c8-4e22-8cd7-0a3567261695, Extended Request ID: null)"
(RequestToken: 6cd8d39d-1b2d-4076-6253-60212009a63a, HandlerErrorCode:
InvalidRequest)
Can you help me understand what needs to be done to achieve this?
Try using the startswith() function:
SELECT * FROM 'topic/+/+/+' WHERE NOT startswith(sNumber, 'F0F1')
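As a quick sanity check outside of CloudFormation, here is a minimal sketch (my own, not from your template) that pushes the corrected SQL through the CreateTopicRule API with boto3; the rule name, republish topic, and role ARN are placeholders, and the same SQL string can then go into the Sql property of your AWS::IoT::TopicRule resource:

import boto3

iot = boto3.client("iot")  # assumes credentials and region are configured

# Rule name, topic, and role ARN are placeholders -- replace with your own.
iot.create_topic_rule(
    ruleName="ignore_f0f1_serials",
    topicRulePayload={
        "sql": "SELECT * FROM 'topic/+/+/+' WHERE NOT startswith(sNumber, 'F0F1')",
        "awsIotSqlVersion": "2016-03-23",
        "ruleDisabled": False,
        "actions": [
            {
                "republish": {
                    "roleArn": "arn:aws:iam::123456789012:role/iot-republish-role",
                    "topic": "filtered/messages",
                }
            }
        ],
    },
)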
When I try to create a new table in an Access DB via DBeaver using the following code:
SELECT *
INTO new_tab
FROM tab
WHERE col_name = smth
I get this error:
"SQL Error [42581]: UCAExc:::5.0.1 unexpected token : INTO required: FROM : line: "
What could be the reason for this problem?
I've updated hsqldb.jar to version hsqldb-2.7.1.
Please point me to where I can dig up the necessary information to resolve this issue.
I am trying to create an AWS Cognito user pool client using the AWS CDK, in Python.
Below is my code:
oAuthScopes = ["access-db-data"]
supportedIdentityProviders = ["COGNITO"]
allowedOAuthFlows = ["Token"]
cognito_userpool_clients = _cognito.CfnUserPoolClient(
    stack,
    id="user-pool-client-id",
    user_pool_id="****",
    client_name="client-name",
    generate_secret=True,
    allowed_o_auth_scopes=oAuthScopes,
    supported_identity_providers=supportedIdentityProviders,
    allowed_o_auth_flows=allowedOAuthFlows,
    allowed_o_auth_flows_user_pool_client=True)
I have tried different values for allowedOAuthFlows, such as "TOKEN", "token", "CODE", "Code", and "code". It is still not working. For the above code I get the error below:
validation error detected: Value '[Token]' at 'allowedOAuthFlows' failed to satisfy constraint: Member must satisfy constraint: [Member must satisfy enum value set: [implicit, client_credentials, code]] (Service: AWSCognitoIdentityProviderService; Status Code: 400; Error Code: InvalidParameterException;
I don't know what's going wrong here. I referred to this link: https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cognito-userpoolclient.html#cfn-cognito-userpoolclient-allowedoauthflows
Solution:
I updated the value to "client_credentials" and it worked; as the error states, the only accepted values are implicit, client_credentials, and code.
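For reference, a sketch of the same call with the accepted enum value plugged in (the pool ID, names, and scope are unchanged placeholders from the question, and stack is your existing Stack instance):

from aws_cdk import aws_cognito as _cognito

# "client_credentials" is one of the accepted enum values
# (implicit, client_credentials, code); "Token" is not.
cognito_userpool_client = _cognito.CfnUserPoolClient(
    stack,  # your existing Stack
    id="user-pool-client-id",
    user_pool_id="****",
    client_name="client-name",
    generate_secret=True,
    allowed_o_auth_scopes=["access-db-data"],
    supported_identity_providers=["COGNITO"],
    allowed_o_auth_flows=["client_credentials"],
    allowed_o_auth_flows_user_pool_client=True)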
I am trying to UNLOAD a Redshift table to an S3 bucket, but I am getting errors that I can't resolve.
When using 's3://mybucket/' as the destination (which is the documented way to specify it), I get an error saying S3ServiceException:The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint..
After some research I tried changing the destination to the full bucket URL, without success.
All these destinations:
's3://mybucket.s3.amazonaws.com/',
's3://mybucket.s3.amazonaws.com/myprefix',
's3://mybucket.s3.eu-west-2.amazonaws.com/',
's3://mybucket.s3.eu-west-2.amazonaws.com/myprefix'
return this error S3ServiceException:The authorization header is malformed; the region 'eu-west-2' is wrong; expecting 'us-east-1', which is also the error returned when I use a bucket name that doesn't exist.
My Redshift cluster and my s3 buckets all exist in the same region, eu-west-2.
What am I doing wrong?
[appendix]
Full command:
UNLOAD ('select * from mytable')
to 's3://mybucket.s3.amazonaws.com/'
iam_role 'arn:aws:iam::0123456789:role/aws-service-role/redshift.amazonaws.com/AWSServiceRoleForRedshift'
Full errors:
ERROR: S3ServiceException:The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.,Status 301,Error PermanentRedirect,Rid 6ADF2C929FD2BE08,ExtRid vjcTnD02Na/rRtLvWsk5r6p0H0xncMJf6KBK
DETAIL:
-----------------------------------------------
error: S3ServiceException:The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.,Status 301,Error PermanentRedirect,Rid 6ADF2C929FD2BE08,ExtRid vjcTnD02Na/rRtLvWsk5r6p0H0xncMJf6KBK
code: 8001
context: Listing bucket=mybucket prefix=
query: 0
location: s3_unloader.cpp:226
process: padbmaster [pid=30717]
-----------------------------------------------
ERROR: S3ServiceException:The authorization header is malformed; the region 'eu-west-2' is wrong; expecting 'us-east-1',Status 400,Error AuthorizationHeaderMalformed,Rid 559E4184FA02B03F,ExtRid H9oRcFwzStw43ynA+rinTOmynhWfQJlRz0QIcXcm5K7fOmJSRcOcHuVlUlhGebJK5iH2L
DETAIL:
-----------------------------------------------
error: S3ServiceException:The authorization header is malformed; the region 'eu-west-2' is wrong; expecting 'us-east-1',Status 400,Error AuthorizationHeaderMalformed,Rid 559E4184FA02B03F,ExtRid H9oRcFwzStw43ynA+rinTOmynhWfQJlRz0QIcXcm5K7fOmJSRcOcHuVlUlhGebJK5iH2L
code: 8001
context: Listing bucket=mybucket.s3.amazonaws.com prefix=
query: 0
location: s3_unloader.cpp:226
process: padbmaster [pid=30717]
-----------------------------------------------
[Screenshots: bucket zone and cluster zone]
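Not a full answer, but since the AuthorizationHeaderMalformed error insists on us-east-1, one thing worth double-checking from the same account is which region S3 itself reports for the bucket. A small sketch, assuming boto3 is available and "mybucket" is replaced with the real bucket name:

import boto3

s3 = boto3.client("s3")

# GetBucketLocation returns an empty LocationConstraint (None) for us-east-1
# and the region name (e.g. "eu-west-2") for every other region.
location = s3.get_bucket_location(Bucket="mybucket")["LocationConstraint"]
print(location or "us-east-1")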
I am using the kibi-community-demo-full-4.6.4-linux-x64 version.
In the datasource I have:
"connection_string": "jdbc:hive://localhost:10000/root",
"libpath": "/home/pare/Downloads/jar/",
"drivername": "org.apache.hadoop.hive.jdbc.HiveDriver",
"libs": "hive-jdbc-0.11.0.jar,hive-metastore-0.11.0.jar,libthrift-0.9.1.jar,hive-service-0.13.1.jar,hive-jdbc-1.2.1.2.3.2.0-2950-standalone.jar,hadoop-common-2.7.1.2.3.2.0-2950.jar",
After that, when I write a query in the Queries editor, it shows an error like:
Queries Editor: Error 400 Bad Request: Error running static method java.lang.IllegalArgumentException: Bad URL format at org.apache.hive.jdbc.Utils.parseURL(Utils.java:185) at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:84)
What does this error mean, and can anyone explain how to solve it?
I was able to connect after changing the jar versions,
and I also changed the driver name to "org.apache.hive.jdbc.HiveDriver".
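Worth noting: org.apache.hive.jdbc.HiveDriver is the HiveServer2 driver and expects a jdbc:hive2://host:port/db URL; jdbc:hive:// belongs to the older org.apache.hadoop.hive.jdbc.HiveDriver, which is what the parseURL error points at. A rough sketch for testing the driver/URL pairing outside Kibi, assuming the jaydebeapi package; the jar paths are the ones listed in the datasource above, and the database name is my assumption:

import jaydebeapi

# Jars taken from the datasource's libpath/libs above -- adjust to your versions.
jars = [
    "/home/pare/Downloads/jar/hive-jdbc-1.2.1.2.3.2.0-2950-standalone.jar",
    "/home/pare/Downloads/jar/hadoop-common-2.7.1.2.3.2.0-2950.jar",
]

conn = jaydebeapi.connect(
    "org.apache.hive.jdbc.HiveDriver",        # HiveServer2 driver
    "jdbc:hive2://localhost:10000/default",   # note the hive2 scheme
    ["hive", ""],                             # user / password, if your server needs them
    jars,
)
cursor = conn.cursor()
cursor.execute("SHOW TABLES")
print(cursor.fetchall())
conn.close()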
I am trying to get information from the Yodlee API.
I have a test user for whom I've implemented adding an account, and the site returned what looked like an OK refresh:
{
  siteRefreshStatus: {
    siteRefreshStatusId: 8
    siteRefreshStatus: "REFRESH_COMPLETED_WITH_UNCERTAIN_ACCOUNT"
  }
  siteRefreshMode: {
    refreshModeId: 2
    refreshMode: "NORMAL"
  }
  updateInitTime: 0
  nextUpdate: 1391603301
  code: 403
  noOfRetry: 0
}
Now, when I try to perform a search and get the actual transactions, I get this error:
{
errorOccured: "true"
exceptionType: "com.yodlee.core.IllegalArgumentValueException"
refrenceCode: "_57c250a9-71e8-4d4b-830d-0f51a4811516"
message: "Invalid argument value: Container type cannot be null"
}
The problem is that I do pass a container type!
Check out the parameters I send:
cobSessionToken=08062013_2%3Ad02590d4474591e507129bf6baaa58e81cd9eaacb5753e9441cd0b1ca3b8bd00a3e6b6a943956e947458307c1bb94b505e2eb4398f890040a3db8c98606c0392&userSessionToken=08062013_0%3A8e8ef9dd4f294e0f16dedf98c1794b96bf33f2e1f2686eda2f35dfe4901dd3a871eed6d08ce52c99a74deb004c025ebf4bf94c7b17baf8ba18aacb331588f5f5&transactionSearchRequest.containerType=bank&transactionSearchRequest.higherFetchLimit=1000&transactionSearchRequest.lowerFetchLimit=1&transactionSearchRequest.resultRange.endNumber=500&transactionSearchRequest.resultRange.startNumber=1&transactionSearchRequest.searchClients.clientId=1&transactionSearchRequest.searchClients.clientName=DataSearchService&transactionSearchRequest.ignoreUserInput=true&transactionSearchRequest.searchFilter.currencyCode=USD&transactionSearchRequest.searchFilter.postDateRange.fromDate=01-01-2014&transactionSearchRequest.searchFilter.postDateRange.toDate=01-31-2014&transactionSearchRequest.searchFilter+.transactionSplitType=ALL_TRANSACTION&transactionSearchRequest.searchFilter.itemAccountId+.identifier=10008425&transactionSearchRequest.searchClients=DEFAULT_SERVICE_CLIENT
An error occurred while adding the account, which is indicated by the parameter code: 403, and hence you will not see that account when you call the getItemSummary API. An account is successfully linked only if the code has a value of zero, e.g. code: 0. 403 is an error that is shown when Yodlee's data agent has encountered an unhandled use case, so for any such error you should file a service request using the Yodlee customer care tool.
To learn more about error codes, please visit:
https://developer.yodlee.com/FAQs/Error_Codes
The status is shown as completed (siteRefreshStatus: "REFRESH_COMPLETED_WITH_UNCERTAIN_ACCOUNT") because the addition of any account is followed by a refresh, in which Yodlee's data agent logs into the FI websites and tries to scrape data. The completion of this activity is therefore denoted as REFRESH_COMPLETED even when an error has occurred.
TransactionSearch issue:
I can see two of your parameters contain a stray "+" sign. Since transactionSplitType and containerType depend on each other, the error is thrown.
&transactionSearchRequest.searchFilter+.transactionSplitType=ALL_TRANSACTION
&transactionSearchRequest.searchFilter.itemAccountId+.identifier=10008425
The right parameters are -
&transactionSearchRequest.searchFilter.transactionSplitType=ALL_TRANSACTION
&transactionSearchRequest.searchFilter.itemAccountId.identifier=10008425
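One way to keep stray characters like that "+" from creeping in is to build the form body from a dict instead of concatenating the query string by hand. A small sketch assuming Python's requests library; the endpoint URL and the shortened tokens are placeholders, so substitute the transaction search URL and session tokens from your own Yodlee environment:

import requests

# Placeholder endpoint and tokens -- substitute your own values.
URL = "https://rest.developer.yodlee.com/.../TransactionSearchService/executeUserSearchRequest"

params = {
    "cobSessionToken": "COB_SESSION_TOKEN",
    "userSessionToken": "USER_SESSION_TOKEN",
    "transactionSearchRequest.containerType": "bank",
    "transactionSearchRequest.higherFetchLimit": 1000,
    "transactionSearchRequest.lowerFetchLimit": 1,
    "transactionSearchRequest.resultRange.startNumber": 1,
    "transactionSearchRequest.resultRange.endNumber": 500,
    "transactionSearchRequest.searchClients.clientId": 1,
    "transactionSearchRequest.searchClients.clientName": "DataSearchService",
    "transactionSearchRequest.ignoreUserInput": "true",
    "transactionSearchRequest.searchFilter.currencyCode": "USD",
    "transactionSearchRequest.searchFilter.postDateRange.fromDate": "01-01-2014",
    "transactionSearchRequest.searchFilter.postDateRange.toDate": "01-31-2014",
    "transactionSearchRequest.searchFilter.transactionSplitType": "ALL_TRANSACTION",
    "transactionSearchRequest.searchFilter.itemAccountId.identifier": "10008425",
}

# requests URL-encodes every key and value, so no "+" or whitespace slips in by accident.
response = requests.post(URL, data=params)
print(response.json())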