Dialogflow entity with more than 30,000 entries

I want to capture the medicine name from user utterances. I have more than 300k medicines, but Dialogflow has a limit of 30,000 entries per entity. Can I do something about it? Can I attach an external spreadsheet to it?

Related

Is there an API call to list all available product IDs (IDs alone) in Shopify?

I use the Postman GUI for retrieving a list of items from the Shopify API.
I would like to know if there is a way to get the available product IDs alone, preferably as a list of values from a single API GET call. The one I know of is
/admin/api/2022-04/products.json
It returns all of the product information, and looping over/traversing the JSON is not very efficient. I hope there is an easy way to fetch all the IDs alone in one go. Is there?
You can add /admin/api/2022-04/products.json?fields=id to the end of your request to limit the output to a single field, or several comma-separated fields.
Please note that if you have more than 250 products you will need to make more than one request, since each request is limited to 250 products.
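A rough sketch of that paging loop in Python (the store domain and access token are placeholders; Shopify's REST API returns the next page's cursor in the Link header, which requests exposes as resp.links):

import requests

BASE = "https://your-store.myshopify.com/admin/api/2022-04/products.json"
HEADERS = {"X-Shopify-Access-Token": "shpat_..."}  # placeholder token

def all_product_ids():
    ids = []
    url = BASE + "?fields=id&limit=250"  # 250 is the per-request maximum
    while url:
        resp = requests.get(url, headers=HEADERS)
        resp.raise_for_status()
        ids.extend(p["id"] for p in resp.json()["products"])
        # cursor-based paging: follow the rel="next" link until it runs out
        url = resp.links.get("next", {}).get("url")
    return ids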
You can make one call to the Admin API for all your product IDs using the Bulk Query. That will result in you receiving a URL where you download a file in JSONL format with every single product ID in your store, without the paging or limits of other approaches.
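The bulk query lives in the GraphQL Admin API. A minimal sketch of kicking one off from Python (domain and token are placeholders; you then poll currentBulkOperation until it reports COMPLETED and download the JSONL file at its url):

import requests

GRAPHQL_URL = "https://your-store.myshopify.com/admin/api/2022-04/graphql.json"
HEADERS = {"X-Shopify-Access-Token": "shpat_..."}  # placeholder token

# Run the product-ID export as a background bulk operation.
MUTATION = '''
mutation {
  bulkOperationRunQuery(
    query: """
    { products { edges { node { id } } } }
    """
  ) {
    bulkOperation { id status }
    userErrors { field message }
  }
}
'''

STATUS_QUERY = "{ currentBulkOperation { status url } }"

def run(query):
    resp = requests.post(GRAPHQL_URL, json={"query": query}, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["data"]

run(MUTATION)
# ...poll run(STATUS_QUERY) until status == "COMPLETED", then download url.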

Bulk data filters in Tableau

Our organization is in e-commerce, and users are looking to change a filter every day with a different list of items. None of the users will have their own license, just read-only access. The data is connected through Google BigQuery. Is there a way to have this bulk filter upload capability without the license owners having to touch the filter each time?
Example
Product ID is the filter
Monday: they have a list of 10,000 IDs they want to check sales for.
Tuesday: they have a new list of 4,000 different IDs they want to check sales for.
Without clicking each ID each time, is there a way to just upload a list (CSV, Google Sheet, etc.)?
We thought users could upload a list of product IDs to a Google Sheet that maps to a BigQuery table. We could use it to join with the sales table and get the relevant data, roughly the query sketched below. However, this becomes unmanageable when we have more than one user, as users might step on each other's data.
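The join we had in mind looks roughly like this (project, dataset, and table names are made up; uploads.product_id_list is the sheet-backed table):

from google.cloud import bigquery

SQL = """
SELECT o.*
FROM `my-project.sales.orders` AS o
JOIN `my-project.uploads.product_id_list` AS f
  ON o.product_id = f.product_id
"""

client = bigquery.Client()
rows = client.query(SQL).result()  # this result would feed the Tableau view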
Any suggestions/recommendations are welcome. Our team is pretty new to Tableau, so let me know if any additional details are needed.
Have you tried changing the filter type to "Multi Values (custom list)" and then having the report user paste their list into the filter?

How to access weather data from all the stations in one country using openweathermap?

I am using Alteryx to extract weather data for a handful of cities, and it works great. I'd like to expand this to be able to download data for all weather stations in the UK. At the moment I am specifying which cities I want, e.g. London / Manchester.
Is there a way of specifying in the API call to download all stations in 'GB' or 'UK'?
Ideally I'd like to do this in one call rather than listing all locations, which would be very laborious.
Get a list of stations or cities that you want to retrieve weather data from. I found some good sources from OpenWeather here: http://bulk.openweathermap.org/sample/
Then build a URL request using the list of IDs above that retrieves the specific weather information. Using the ID for the weather station in Cairns, id=2172797, the URL ends up looking like:
http://api.openweathermap.org/data/2.5/weather?id=2172797&appid=843798874aac0ef138e6f77c72f3af80
Note that this URL will return an error because this isn't a real appid. If you replace the appid with your own, this URL will give you data for that station.
Putting this process into Alteryx lets you put the list of station IDs together with the URL and the appid to make many calls to OpenWeather and then process all of the data together. I could not find information from the API on rate limits, so be conscious of how many requests you are posting to the service.
There is an example of this process here: https://www.dropbox.com/s/yoppbx3bw0p4rug/Get%20individual%20stations.yxzp?dl=0
Keep in mind that you have to update the appid in the Text Input tool within this sample as well.
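Outside Alteryx, the same per-station loop looks roughly like this in Python (the appid is the placeholder from above, so swap in your own key; 2172797 is the Cairns ID from the example):

import requests

API = "http://api.openweathermap.org/data/2.5/weather"
APPID = "843798874aac0ef138e6f77c72f3af80"  # placeholder; replace with your own key

station_ids = [2172797]  # IDs taken from the bulk lists linked above

for sid in station_ids:
    resp = requests.get(API, params={"id": sid, "appid": APPID})
    resp.raise_for_status()  # the placeholder key will fail with 401 here
    data = resp.json()
    print(sid, data.get("name"), data.get("main", {}).get("temp"))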

Dynamic number of columns exceeds max column limitation SQL Server

I have what I consider a real need to create a query with several hundred columns.
We are working on a mailing for our client. In this mailing, they list several locations where their customers can go to get information. As our designers create the template for this mailing, they set up "Slots" for each address. The number of slots on the mailing varies from one mailing to another, from 6 to possibly 50.
My need for the query is to set up the merge of data into the mailing. I need to provide a query where each mailing is one record containing all the information needed for that mailing. I am dynamically creating the SQL statement with the max number of slots on that mailing; a rough sketch of that generation follows the column list. With up to 50 slots, my query needs to look like this:
MailingID,
LogoLocation,
APNCode,
TFN,
CopyVersion,
Slot1_Name,
Slot1_Address,
Slot1_City,
Slot1_State,
Slot1_DateTime,
...
Slot50_Name,
Slot50_Address,
Slot50_City,
Slot50_State,
Slot50_DateTime
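The generation itself is mechanical; a minimal sketch in Python of how I build that list (field names match the list above; the FROM clause and joins are omitted):

SLOT_FIELDS = ["Name", "Address", "City", "State", "DateTime"]
FIXED = ["MailingID", "LogoLocation", "APNCode", "TFN", "CopyVersion"]

def build_select(num_slots):
    # one column per slot/field combination, e.g. Slot1_Name ... Slot50_DateTime
    slot_cols = [
        "Slot%d_%s" % (n, field)
        for n in range(1, num_slots + 1)
        for field in SLOT_FIELDS
    ]
    return "SELECT " + ",\n       ".join(FIXED + slot_cols)

print(build_select(50))  # 5 fixed columns + 250 slot columns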
My first attempt was to create a table with all these fields, but I got this error:
The table has been created, but its maximum row size exceeds the allowed maximum of 8060 bytes. INSERT or UPDATE to this table will fail if the resulting row exceeds the size limit.
They only want the data in a CSV file, so I don't need to create a temp table for it.
My problem is that I'm trying to create a standard process, and with the number of fields varying like that, I want to set this up in a way that we won't blow up the system every time we try to run it.
I've looked at a few pages and found details on the size limitations of SQL Server and several comments saying a table like this shows a bad database design.
http://msdn.microsoft.com/en-us/library/ms143432(v=sql.105).aspx
http://social.msdn.microsoft.com/Forums/en-US/fec1efbb-94ff-4fe9-8d69-12e95c48587d/its-maximum-row-size-exceeds-the-allowed-maximum-of-8060-bytes-insert-or-update-to-this-table-will?forum=transactsql
Work around SQL Server maximum columns limit 1024 and 8kb record size
I'm hoping that someone out there has some experience doing this and can share some insights on how to make this efficient. Is there another way to accomplish this that I don't know about?
UPDATE:
Thanks for all the quick replies.
More detail on my scenario: you get a flyer in the mail, and when you turn the flyer over, it lists 50 locations in your county where you could go take a class or attend a meeting. All the details for that flyer need to be in one record so they can map the fields on the one page. If that county has 50 address/date/time combinations, they need them included in the one record so they can properly slot the flyer. Think of a giant mail merge where there might only be 100 counties (100 flyers) but each flyer has tons of information.
When the data is actually stored in the database, I'm storing an id for the specific flyer (MailingID) and each address/date/time combo is its own record. It's just the file they need to merge the details onto the creative piece that has to be denormalized like this.
I haven't been able to find any details on limitations on views. Does a View have the same limitations as a table? Would it work to create a view for them that they can download when they need the data?
"All the details for that flyer needs to be in 1 record so they can map the fields on the one page" - that is a questionable assumption. Why can't the data be stored in 50 rows in a second table?
Anyway, if you insist on storing everything in one row, you should probably use XML or JSON. That makes all these problems go away. SQL Server has great support for XML; you can even generate XML on the fly. So you could properly store the 50 items in a second table and only combine them into one XML value for query purposes, as sketched below.
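A minimal sketch of that XML approach, with made-up table and column names (Mailings holds one row per flyer, MailingSlots one row per address/date/time combo), run here through pyodbc:

import pyodbc  # assumes a SQL Server ODBC driver is installed

QUERY = """
SELECT m.MailingID,
       m.LogoLocation,
       (SELECT s.Name, s.Address, s.City, s.State, s.EventDateTime
        FROM MailingSlots AS s
        WHERE s.MailingID = m.MailingID
        FOR XML PATH('Slot'), ROOT('Slots'), TYPE) AS SlotsXml
FROM Mailings AS m;
"""

conn = pyodbc.connect("DSN=MyMailingDb")  # placeholder connection string
for mailing_id, logo, slots_xml in conn.execute(QUERY):
    # one row per mailing; every slot arrives packed into a single XML value
    print(mailing_id, logo, slots_xml[:80])

One row comes back per mailing no matter how many slots it has, so the 1024-column and 8060-byte limits never come into play.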

WCF Service call to multiple rows. How to architect it?

Let's say I have a scenario where I have a service that is responsible for retrieving the price of a single product given its ID, and another service that gives me its current stock position.
It's okay when I'm looking at only one product, but what about a list page where I have 60 products? What should I do in this case? Call the service product by product to retrieve its current price and stock position?
I think this would be extremely slow and cost a lot of resources. But what could I do to keep the design clean and still get good performance?
Currently we have this information in the database in columns next to the product. These price and stock columns are updated by a SQL service that updates them when necessary, so when I need to grab the values for a lot of products at the same time, I just need to select two more columns. It's really fast, but right now more systems need this information, and I would like to turn all of this into a service.
Thanks a lot!
Well, you could always have multiple service methods:
the one you already have, to retrieve a single product and its price
create a new service method which takes a list of product IDs and returns a list of product objects - or a new DTO (data-transfer object) that holds product and price
This way, you could keep your current users happy, and also do something to make batch requests work more efficiently.
Can your service method take an array of IDs and return an array of values? That way, if you want one record, your array only has one item; if you want more, you just populate the array with multiple values.
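Both suggestions boil down to the same batch shape. The real contract here would be C# (WCF), but as a language-neutral sketch in Python with made-up names:

from dataclasses import dataclass

@dataclass
class ProductQuoteDto:
    # DTO combining price and stock so one call answers both questions
    product_id: int
    price: float
    stock: int

# stand-in for the real data source (one SELECT ... WHERE id IN (...) in practice)
_DB = {1: (9.99, 12), 2: (4.50, 0), 3: (19.95, 7)}

def get_product_quotes(product_ids):
    # one round trip for a whole 60-product list page instead of 60 calls
    return [ProductQuoteDto(pid, *_DB[pid]) for pid in product_ids if pid in _DB]

print(get_product_quotes([1, 3]))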