What is the size limit for the 'name' field when creating an event with the Layout Automation Social Tables API endpoint?
https://developer-portal.socialtables.com/api-console#!/Layout_Automation/post_4_0_layout_automation
The name field is automatically truncated to 255 characters; exceeding this limit will not produce an error at request time.
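Since the truncation is silent, you may want to enforce the limit client-side before calling the endpoint. A minimal sketch (the 255-character limit comes from the answer above; the helper and its warning are just an illustration, not part of the API):

# Truncate an event name to the API's 255-character limit before sending,
# so the payload matches what the server will actually store.
MAX_NAME_LENGTH = 255  # limit described above for this endpoint

def safe_event_name(name: str) -> str:
    if len(name) > MAX_NAME_LENGTH:
        # The API truncates silently, so warn locally instead.
        print(f"Warning: name truncated from {len(name)} to {MAX_NAME_LENGTH} chars")
    return name[:MAX_NAME_LENGTH]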
My goal is to get the length of text, and perform other functions, on text that is nested in a Google BigQuery field. The data in question comes from the BigQuery public patents dataset. Right now I'm using the BigQuery console to fetch the data, but in the end I will use an API.
I just want to get the length of the text instead of having to fetch the whole field locally to analyze it, or even to truncate the field at a certain length to make the download feasible.
This query runs, but returns NULL for every field except application_number. If I add a WHERE clause specifying that the fields are NOT NULL, I get the same response: all NULL fields.
SELECT
-- Get the application number
application_number,
-- Get the length of the claims text
LENGTH(claims_localized[SAFE_OFFSET(0)].text) as claims_length,
-- Get the length of the description text
LENGTH(description_localized[SAFE_OFFSET(0)].text) as description_length,
-- Get claims truncated at the first double line break
SPLIT(claims_localized_html[SAFE_OFFSET(0)].text, "\n\n")[SAFE_OFFSET(0)] as first_claim_text,
-- Get the number of claims tags in claims html
ARRAY_LENGTH(SPLIT(claims_localized_html[SAFE_OFFSET(0)].text, "<claim>")) - 1 as claims_num,
-- Get the number of image tags in claims html
ARRAY_LENGTH(SPLIT(claims_localized_html[SAFE_OFFSET(0)].text, "<figref>")) - 1 as drawings_num
-- Specify the source table
FROM `patents-public-data.patents.publications_201909`
-- Keep only rows with non-NULL claims text
WHERE claims_localized[SAFE_OFFSET(0)].text IS NOT NULL
LIMIT 1000
What am I doing wrong here to collect the data from these fields?
Here is what I get for results in the BigQuery console: the fields are always NULL, even when I specify that the results must not be NULL.
I don't have a way to add an image to a comment, so I will add it here. Take a look: nothing in your query has been changed, and I still see the results perfectly fine. Maybe try unchecking 'Use cached results' in your query settings and running again; it might be a stale result from something you ran before that is stuck in the cache.
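Separately, since the question mentions eventually moving from the console to an API: a minimal sketch using the google-cloud-bigquery Python client to run a lengths-only version of the query (assuming Application Default Credentials are configured). Only the small computed values are downloaded, not the full text fields:

from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT
  application_number,
  LENGTH(claims_localized[SAFE_OFFSET(0)].text) AS claims_length
FROM `patents-public-data.patents.publications_201909`
WHERE claims_localized[SAFE_OFFSET(0)].text IS NOT NULL
LIMIT 10
"""

# use_query_cache=False mirrors the 'Use cached results' suggestion above
job_config = bigquery.QueryJobConfig(use_query_cache=False)
for row in client.query(sql, job_config=job_config):
    print(row.application_number, row.claims_length)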
I have Podio data with many columns, but we need to fetch only 5-6 columns of data through the API. I attached a screenshot of the column names. If we need only, for example, order id, city, and country, how do we write the API query?
/item/app/{app_id}/filter/
If this is the right endpoint, how do we write a query that returns only the selected columns, with GET or POST?
The filter endpoint uses a POST body to filter which records are returned, not which fields/columns are returned. According to this SO thread from a former Podio support person, it is not possible to specify which fields/columns to return with an API call.
If you are looking to remove fields from the query to reduce your data source size within Klipfolio, I would recommend returning the API call in CSV format instead of JSON. Klipfolio support documents how to do this HERE, by performing a GET operation and adding /csv to the end of the URL.
https://api.podio.com/item/app/Your-APP-ID/csv/
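A minimal sketch of that CSV approach, selecting only the needed columns client-side (the token handling and the exact column headers are assumptions based on the question; Podio expects an OAuth2 access token):

import csv
import io
import requests

APP_ID = "Your-APP-ID"          # placeholder, as in the URL above
ACCESS_TOKEN = "your-token"     # assumed: an OAuth2 access token for Podio

# Fetch all items for the app in CSV form.
resp = requests.get(
    f"https://api.podio.com/item/app/{APP_ID}/csv/",
    headers={"Authorization": f"OAuth2 {ACCESS_TOKEN}"},
)
resp.raise_for_status()

# Keep only the columns we care about; header names are assumptions
# taken from the question ("order id", "city", "country").
wanted = ["order id", "city", "country"]
for row in csv.DictReader(io.StringIO(resp.text)):
    print({k: row.get(k) for k in wanted})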
Is there a way to set a limit on a user so that they cannot query more than a particular amount of data in a table?
E.g., if a user runs a SELECT statement, they should be limited to querying a certain amount of data, irrespective of the query they write.
I have been trying to follow this link - https://cloud.google.com/bigquery/quotas.
You can set quotas at the project level or at the per-user level. Be careful, though: the per-user quota applies to every user equally, since it is not possible to assign a custom quota to one specific user or service account.
You can set quotas by following these steps in the Cloud Console:
Go to the Quotas page
Select only the BigQuery service
Look for 'Query usage per day per user' or 'Query usage per day' and select it
Click on Edit Quotas
Set the limit on the right side, then validate and submit
More information:
https://cloud.google.com/bigquery/docs/custom-quotas
https://cloud.google.com/docs/quota#managing_your_quota_console
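If you also want a hard cap per individual query rather than per day, a minimal sketch with the BigQuery Python client (the byte limit and the table name are just examples):

from google.cloud import bigquery

client = bigquery.Client()

# Fail any query that would bill more than ~100 MB instead of running it.
job_config = bigquery.QueryJobConfig(maximum_bytes_billed=100 * 1024 * 1024)

sql = "SELECT * FROM `my-project.my_dataset.my_table`"  # hypothetical table
job = client.query(sql, job_config=job_config)
print(list(job))  # raises an error if the cap would be exceeded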
I have an entity that I have populated with multiple thousands of values via the Wit API (https://wit.ai/docs/http/20160526#post--entities-:entity-id-values-link).
The script to add them seems to have completed successfully, but when I try to retrieve the entity to verify all of its values, only 1000 are returned.
This appears to be a limit on the GET Entity API call (https://wit.ai/docs/http/20160526#get--entities-:entity-id-link). Is there a way to retrieve all entity values if there are more than 1000?
Can you try the following pagination params in your URL:
limit
offset
For example, limit=1000&offset=1000 should retrieve the values from 1001 to 2000 (assuming offset counts the number of values to skip).
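A minimal pagination sketch against the GET entity endpoint (the Bearer token, the v=20160526 version param, and the shape of the response's values list are assumptions; offset is assumed to count skipped values, per the example above):

import requests

ENTITY_ID = "my-entity"          # hypothetical entity name
HEADERS = {"Authorization": "Bearer YOUR_WIT_TOKEN"}  # assumed server token
LIMIT = 1000

all_values = []
offset = 0
while True:
    resp = requests.get(
        f"https://api.wit.ai/entities/{ENTITY_ID}",
        headers=HEADERS,
        params={"v": "20160526", "limit": LIMIT, "offset": offset},
    )
    resp.raise_for_status()
    values = resp.json().get("values", [])
    all_values.extend(values)
    if len(values) < LIMIT:
        break  # last page reached
    offset += LIMIT

print(f"Retrieved {len(all_values)} values")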
I am encountering an issue fetching the keywords report with the Google AdWords API.
Everything else comes through properly, but for some keywords it gives "Content" as the keyword text.
Could anyone suggest what I should do to get the actual keyword?
I am using the gem 'google-adwords-api'.
Thanks in advance.
It is a special fake keyword returned by AdWords for the "contextual display" stats of each ad group. (The actual per-keyword stats on the display network are given in the separate DISPLAY_KEYWORDS_PERFORMANCE_REPORT.)
From the official developer guides: https://developers.google.com/adwords/api/docs/guides/reporting-concepts#keywordid3000000
In single attribution reports, all keywords that triggered impressions
on the display network will be represented by a special keyword (text:
Content) with ID 3000000.
Example
Keyword ID,Impressions
23458623485,2
23655322314,2
23953456345,2
3000000,4
If you target keywords and placements for Display Only and
run an Ad Performance report, you'll get a row for each ad and
triggering criteria combination for placements, and a single row with
ad and ID 3000000 which accounts for all the display keywords that
triggered that ad (where single attribution chose the keyword rather
than a placement).
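So rather than trying to recover a real keyword from those rows, treat ID 3000000 as the aggregate display-network bucket. A minimal sketch over a downloaded report CSV (the file name is hypothetical and the column headers follow the example above; the question uses the Ruby gem, but the logic is the same in any language):

import csv

# Split report rows into real keyword stats and the aggregate
# display-network bucket (special keyword ID 3000000, text "Content").
CONTENT_KEYWORD_ID = "3000000"

real_keywords = []
display_bucket = []
with open("keywords_report.csv", newline="") as f:  # hypothetical file
    for row in csv.DictReader(f):
        if row["Keyword ID"] == CONTENT_KEYWORD_ID:
            display_bucket.append(row)   # aggregated display stats
        else:
            real_keywords.append(row)    # genuine search keywords

print(len(real_keywords), "keyword rows,", len(display_bucket), "display rows")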