I was wondering what the limits were on the number of metafields that an entity in Shopify could have. For instance, under a given namespace for a product object, could you have 1000 unique key value pairs? Is there a hard limit?
Please note I have consulted the documentation on Shopify's Metafield API page (http://api.shopify.com/metafield.html) but it only states the following limits:
The namespace has a maximum of 20 characters, and the key has a maximum of 30 characters.
Thanks for the help!
There's no hard limit, but if you're storing that much info you might want to consider keeping it locally, as retrieving it all will become a pain.
The most metafields that we've applied to any given element to date is 5434. We have a collection that currently contains that many metafields, and it seems to be working fine!
I wouldn't advise doing this, as it's a nightmare to find and remove any via Postman if manual intervention is required. But it's certainly possible!
If you had the following:
MyNS.Key1 = 1
MyNS.Key2 = 2
...
MyNS.Key1000 = 1000
you should be able to access them like this:
product.metafields.MyNS[someKey]
So it's not too difficult to retrieve, or am I missing something else?
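If you do end up with hundreds of keys and want them back out through the API rather than Liquid, here's a minimal Python sketch, assuming a legacy private-app key/password and the old page-based REST pagination (newer API versions use cursor-based paging instead); SHOP, API_KEY, PASSWORD and PRODUCT_ID are all placeholders:

import requests

SHOP = "your-shop"          # placeholder shop subdomain
API_KEY = "key"             # assumption: legacy private-app basic auth
PASSWORD = "password"
PRODUCT_ID = 123456789

def fetch_all_metafields(product_id):
    # Collect metafields page by page until an empty page comes back.
    url = "https://%s.myshopify.com/admin/products/%d/metafields.json" % (SHOP, product_id)
    metafields, page = [], 1
    while True:
        resp = requests.get(url, params={"limit": 250, "page": page},
                            auth=(API_KEY, PASSWORD))
        resp.raise_for_status()
        batch = resp.json().get("metafields", [])
        if not batch:
            break
        metafields.extend(batch)
        page += 1
    return metafields

# Index by namespace.key for quick lookup, mirroring MyNS.Key1 etc.
by_key = {m["namespace"] + "." + m["key"]: m["value"]
          for m in fetch_all_metafields(PRODUCT_ID)}
print(by_key.get("MyNS.Key1"))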
First time posting on here because Google is yielding no results!
So, I have a website that is based around travelling and locations. Every time someone enters content into the site, they select a location, and that content then has lat and long, country, etc.
The issue I face is that I have a DB of all the "cities and areas" of the world and there are a good 3.5 million records in the database I believe.
My question to you is how would you guys recommend doing a one-field autocomplete form for all the cities? I don't need advice on the autocomplete form itself, I need advice on HOW and WHERE I should be storing the data... text files? SQL? Up until now I have been using SQL, but I don't know whether that's how it should be done. Would an AJAX autoloader be able to handle it if I only returned 100 records or so? Should all the results be preloaded?
Thanks for your help guys!
EDIT: I have actually found another way to do it. I found this awesome little plugin that integrates Google Maps with it:
http://xilinus.com/jquery-addresspicker/demos/index.html
Fantastic.
Benny
I have a few thoughts here:
Since you don't know whether a user will enter the English or local (native) name, each city record in your database should have both. Make sure to index these fields.
Do not do auto-complete until you have a minimum number of characters. Otherwise, you will match way too many rows in your table. For example, assuming an even distribution of English characters (26), at 3.5 million records you would statistically get the following matches per character:
1 char = 135k
2 char = 5.2k
3 char = 200
4 char = 8
If you are using MySQL, you will want to use the LIKE operator with a trailing wildcard only (e.g. LIKE 'par%'), so the index on the name column can still be used.
There are much more advanced methods for predictive matching, but this should be a good start; a quick sketch of the basic query is below.
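Here is that sketch in Python, using SQLite's built-in driver so it runs stand-alone; the query shape is the same against a MySQL table, and the table and column names here are assumptions:

import sqlite3

MIN_CHARS = 3   # don't query until the user has typed enough
MAX_ROWS = 20   # cap what the autocomplete returns

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cities (name_en TEXT, name_local TEXT)")
conn.execute("CREATE INDEX idx_cities_en ON cities (name_en)")
conn.executemany("INSERT INTO cities VALUES (?, ?)",
                 [("Paris", "Paris"), ("Parma", "Parma"), ("Pärnu", "Pärnu")])

def suggest(prefix):
    if len(prefix) < MIN_CHARS:
        return []   # too short: would match far too many rows
    # Trailing wildcard only, so the index on the name column can be used.
    pattern = prefix + "%"
    rows = conn.execute(
        "SELECT name_en FROM cities "
        "WHERE name_en LIKE ? OR name_local LIKE ? "
        "ORDER BY name_en LIMIT ?",
        (pattern, pattern, MAX_ROWS))
    return [r[0] for r in rows]

print(suggest("Par"))   # ['Paris', 'Parma']
print(suggest("Pa"))    # [] -- below the minimum character threshold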
I've gone through a number of pages and PDFs within the PayPal guides and x.com, but I can't find any reference to the maximum field lengths for the API login/connection. I see the Transaction ID maxes out at 19 characters, but they seem to avoid stating the maximums for the access fields.
I'm setting up a database table to hold multiple PayPal API logins, as well as an edit form that I'll validate by length. I want to use some real values instead of guessing 255 characters.
Surely someone must have needed to be this specific before; I'm hoping someone has found the answer.
I've always gone with 75 and that seems to be safe.
On the NVP API Method and Field Reference page at http://www.paypalobjects.com/en_US/ebook/PP_NVPAPI_DeveloperGuide/Appx_fieldreference.html they indicate EMAIL as "127 single-byte characters".
As for the other items, I did not see any documented lengths, but checking 5 client accounts, it appears that
API PWD is 16 characters
API signature is 56 characters in each account (these must be fixed length).
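For what it's worth, here's a hypothetical Python sketch of length validation based on the values above; since PayPal doesn't document hard maximums for these fields, the field names and the headroom added to the observed lengths are assumptions:

# Assumed column sizes: documented where noted, otherwise observed + padding.
PAYPAL_FIELD_LIMITS = {
    "api_username": 127,   # documented: EMAIL is 127 single-byte characters
    "api_password": 32,    # observed 16 chars; padded for safety
    "api_signature": 64,   # observed 56 chars in every account checked
}

def oversized_fields(creds):
    # Return the fields that exceed the assumed column sizes.
    return [field for field, limit in PAYPAL_FIELD_LIMITS.items()
            if len(creds.get(field, "")) > limit]

print(oversized_fields({
    "api_username": "seller_api1.example.com",
    "api_password": "X" * 16,
    "api_signature": "A" * 56,
}))   # [] -> everything fits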
I was reading an article here and it looks like he is grabbing the IDs 100 at a time. I thought it was possible to grab 5000 each time?
The reason I'm asking is that some profiles have much larger follower counts, and you wouldn't have enough API calls to fetch them all within one hour if you grabbed them 100 at a time.
So is it possible to grab 5000 ids each time? If so, how would I do this?
GET statuses/followers, as shown in that article, has been deprecated, but it used to return batches of 100.
If you're trying to get follower ids, you would use GET followers/ids. This does return batches of up to 5000, and should just require you to change the URL slightly (see example URL at the bottom of the documentation page)
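Something like this Python sketch would walk the cursored pages of GET followers/ids; OAuth signing is omitted here and assumed to be handled by the session object you pass in (e.g. an OAuth1Session from requests_oauthlib):

import requests

URL = "https://api.twitter.com/1.1/followers/ids.json"

def fetch_follower_ids(screen_name, session):
    # Each call returns up to 5000 ids plus a next_cursor for the next page.
    ids, cursor = [], -1            # -1 asks for the first page
    while cursor != 0:              # cursor 0 marks the final page
        resp = session.get(URL, params={
            "screen_name": screen_name,
            "count": 5000,
            "cursor": cursor,
        })
        resp.raise_for_status()
        data = resp.json()
        ids.extend(data["ids"])
        cursor = data["next_cursor"]
    return ids

That way a 50,000-follower profile costs 10 calls instead of 500.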
I have been trying to get the full list of playlists matching a certain keyword. I have discovered, however, that using a start-index past 100 brings back the same set of results as start-index=1. It does not matter what the max-results parameter is: still the same results. The reported total number of results, however, is way above 100, so it cannot be that the query matched only 100 results.
What might the problem be? Is it a quota of some sort or any other authentication restriction?
As an example, these queries bring back the same result set whether you use start-index=1, start-index=101, or start-index=201, etc.:
http://gdata.youtube.com/feeds/api/playlists/snippets?q=%22Jan+Smit+Laura%22&max-results=50&start-index=1&v=2
Any idea will be much appreciated!
Regards
Christo
I made an interface for my site, and the way I avoided this problem is to do one query for a large number of results, then store them. Let your web page then break up the results and present them however is needed.
For example, if someone wants to do a search of over 100 videos, do the search and collect the results, but only present them with the first group, say 10. Then when the person wants to see the next ten, you get them from the list you stored, rather than doing a new query.
Not only does this make paging faster, but it cuts down on the constant queries to the YouTube database.
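As a rough Python sketch of that idea, using the gdata v2 feed from the question (alt=json just asks the feed for JSON instead of Atom):

import json
import urllib.request

FEED = ("http://gdata.youtube.com/feeds/api/playlists/snippets"
        "?q=%22Jan+Smit+Laura%22&max-results=50&v=2&alt=json")

def fetch_once():
    # One upstream query; everything after this is served from memory.
    with urllib.request.urlopen(FEED) as resp:
        data = json.load(resp)
    return data["feed"].get("entry", [])

class CachedResults:
    # Serve fixed-size pages out of the stored list, no new API calls.
    def __init__(self, entries, page_size=10):
        self.entries = entries
        self.page_size = page_size

    def page(self, n):   # n is 1-based
        start = (n - 1) * self.page_size
        return self.entries[start:start + self.page_size]

results = CachedResults(fetch_once())
first_ten = results.page(1)   # pages 2, 3, ... come from the cache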
Hope this makes sense and helps.
I cannot get the Flickr API to return any data for lat/lon queries.
http://api.flickr.com/services/rest/?method=flickr.photos.search&media=photo&api_key=KEY_HERE&has_geo=1&extras=geo&bbox=0,0,180,90
This should return something, anything. It doesn't work if I use lat/lng either. I can get some photos returned if I look up a place_id first and then use that in the query, except then all the photos returned are from anywhere and not from that place id.
E.g.,
http://api.flickr.com/services/rest/?method=flickr.photos.search&media=photo&api_key=KEY_HERE&placeId=8iTLPoGcB5yNDA19yw
I deleted my key, obviously; replace it with yours to test.
Any help appreciated, I am going mad over this.
I believe that the Flickr API won't return any results if you don't put additional search terms in your query. If I recall from the documentation, this is treated as an unbounded search. Here is a quote from the documentation:
Geo queries require some sort of limiting agent in order to prevent the database from crying. This is basically like the check against "parameterless searches" for queries without a geo component.
A tag, for instance, is considered a limiting agent as are user defined min_date_taken and min_date_upload parameters — If no limiting factor is passed we return only photos added in the last 12 hours (though we may extend the limit in the future).
My app uses the same kind of geo searching so what I do is put in an additional search term of the minimum date taken, like so:
http://api.flickr.com/services/rest/?method=flickr.photos.search&media=photo&api_key=KEY_HERE&has_geo=1&extras=geo&bbox=0,0,180,90&min_taken_date=2005-01-01+00:00:00
Oh, and don't forget to sign your request and fill in the api_sig field. My experience is that geo-based searches don't behave consistently unless you attach your api_key and sign your search. For example, I would sometimes get search results and then later, with the same search, get no images when I didn't sign my query.
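In case it helps, here's a rough Python sketch of the old-style signing: api_sig is the MD5 hex digest of your shared secret followed by every parameter name and value concatenated in alphabetical order. API_KEY and API_SECRET are placeholders:

import hashlib
import requests

API_KEY = "KEY_HERE"
API_SECRET = "SECRET_HERE"

def signed_params(params):
    # Add api_key, then sign: md5(secret + k1 + v1 + k2 + v2 + ...), keys sorted.
    params = dict(params, api_key=API_KEY)
    raw = API_SECRET + "".join(k + str(params[k]) for k in sorted(params))
    params["api_sig"] = hashlib.md5(raw.encode("utf-8")).hexdigest()
    return params

resp = requests.get("http://api.flickr.com/services/rest/", params=signed_params({
    "method": "flickr.photos.search",
    "media": "photo",
    "has_geo": 1,
    "extras": "geo",
    "bbox": "0,0,180,90",
    "min_taken_date": "2005-01-01 00:00:00",   # the limiting agent
    "format": "json",
    "nojsoncallback": 1,
}))
print(resp.json())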