In the Trading API, it was possible to leave feedback for a buyer with either the CompleteSale or LeaveFeedback calls.
I'm building a new application from scratch and using the newer REST APIs, but was unable to find any reference to feedback in any of them. I would have assumed this would be part of the Fulfillment API, which handles orders and their fulfillment, but apparently the new version of CompleteSale (createShippingFulfillment) no longer supports leaving feedback and there's no call dedicated to leaving feedback as far as I can tell.
Is feedback no longer a thing in the new REST APIs?
The LeaveFeedback call still works and is fairly simple. I tried to give a code example, but the forum won't let me; it thinks I'm trying to inject code :)
Using XML with PHP works fine.
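Since the forum stripped the code, here is a rough sketch of what that call can look like in PHP with cURL. The endpoint, headers, and XML structure follow the Trading API conventions, but the credential values, item ID, and buyer ID are placeholders you must supply yourself:

<?php
// Sketch: LeaveFeedback via the eBay Trading API (XML over HTTPS).
// YOUR_AUTH_TOKEN, the DEV/APP/CERT names, ITEM_ID and BUYER_ID are placeholders.
$xml = <<<XML
<?xml version="1.0" encoding="utf-8"?>
<LeaveFeedbackRequest xmlns="urn:ebay:apis:eBLBaseComponents">
  <RequesterCredentials>
    <eBayAuthToken>YOUR_AUTH_TOKEN</eBayAuthToken>
  </RequesterCredentials>
  <ItemID>ITEM_ID</ItemID>
  <TargetUser>BUYER_ID</TargetUser>
  <CommentType>Positive</CommentType>
  <CommentText>Great buyer, fast payment!</CommentText>
</LeaveFeedbackRequest>
XML;

$ch = curl_init('https://api.ebay.com/ws/api.dll');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $xml,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => [
        'X-EBAY-API-CALL-NAME: LeaveFeedback',
        'X-EBAY-API-COMPATIBILITY-LEVEL: 967', // any recent compatibility level
        'X-EBAY-API-SITEID: 0', // 0 = US
        'X-EBAY-API-DEV-NAME: YOUR_DEV_ID',
        'X-EBAY-API-APP-NAME: YOUR_APP_ID',
        'X-EBAY-API-CERT-NAME: YOUR_CERT_ID',
        'Content-Type: text/xml',
    ],
]);
$response = curl_exec($ch);
curl_close($ch);
echo $response; // check the <Ack> element for Success or Failure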
I've been building APIs for about a year now, and I was taught to always use http://IPAddress:Port/api/v1 when building an API with express.js. Is there a specific reason I would want to do that? Is it just denoting that the API is in development? I've recently changed my API to not run on port 3000, so that I can just use http://IPAddress.com/ instead of http://IPAddress.com:3000/api/v1, and it works just fine the new way.
One main reason for versioning an API is that improving it may lead to breaking changes: an application consuming the API might stop working because an endpoint it relies on has been modified.
So, the solution to this is to allow consumers of the current API (v1) to keep using it until they want to switch, and release an updated version (v2) for new consumers.
Here's some more info on it: https://restfulapi.net/versioning/
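The question is about Express, but the pattern is the same in any framework. As an illustration in PHP (Laravel-style routing, to match the other threads here; the controllers are hypothetical):

// Both versions stay mounted side by side. v1 keeps its old response
// shape so existing consumers don't break; v2 is free to change it.
Route::prefix('api/v1')->group(function () {
    Route::get('users/{id}', 'Api\V1\UserController@show');
});

Route::prefix('api/v2')->group(function () {
    Route::get('users/{id}', 'Api\V2\UserController@show');
});

When v2 ships, v1 consumers keep working until they choose to migrate, and you retire v1 only after its announced end-of-life date.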
I've been assigned a project: we already have an up-and-running website, and one of our clients wants to be able to track statistics from it.
We want to make this available to all our clients as soon as we finish development. Note that each client has its own 'subdomain', so to speak, e.g. www.website.com/client1, www.website.com/client2, etc., and we want to track usage separately for each of these clients.
We will need to create statistics based on the usage of our own platform, pull in data recorded by Google Analytics, and also pull in data from a third party via an API of their own (they have a third-party solution that consumes the data accessible via our API).
All this data needs to be shown on a webpage with graphs and tables.
I wanted to make sure we choose the right architecture from the start, in order to avoid scalability issues later on.
I've started reading about private and public APIs lately.
For now, we don't have another (internal) application that would use our statistics; it would just be the website using them. But to be able to scale up later if needed, when another application wants to use the statistics, I think a private API would benefit us greatly.
To allow third parties to use the statistical data we choose to expose, I was thinking of creating a public API.
Is a private & public API the correct way to go about this?
One of the questions I'm stuck with is what the architecture for these APIs should look like. Right now we already have a public API for vacancy data. This 'API' is basically just a PHP class (controller) inside our CodeIgniter solution. It gets called via its URL and returns a JSON object with the results (e.g. www.website.com/api/vacancy/xxx).
To create a proper private & public API architecture, should the API be decoupled from the website (CodeIgniter)? What are the common go-to solutions for this?
Or is it fine to keep it in our current platform the way it is now (and have people call the stats API via www.website.com/api/stats/xxx, for example)?
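For reference, the setup described above usually amounts to something like this stripped-down CodeIgniter 3-style controller; the table and route names are assumptions, not the actual code:

<?php
// application/controllers/api/Vacancy.php
// Routed via: $route['api/vacancy/(:num)'] = 'api/vacancy/index/$1';
// Assumes the database library is autoloaded.
class Vacancy extends CI_Controller
{
    public function index($id)
    {
        $row = $this->db
                    ->get_where('vacancies', ['id' => (int) $id])
                    ->row_array();

        $this->output
             ->set_content_type('application/json')
             ->set_output(json_encode($row));
    }
}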
It's almost always right to go with a microservices-like architecture, so your initial thoughts sound reasonable. Doing this will give you the ability to scale and deploy your API independently, and will also help you avoid performance side effects on your site (and vice versa). Pay attention to how you access your main site's data from within the new API if you don't want to end up with a monolithic application.
Regarding the API, I would suggest implementing a protocol like OAuth2 in order to achieve the flexibility you (might) need (a rough sketch follows below). You can also use Swagger to document and test your API.
All of this might help you a lot, but first you have to ask yourself whether you really need to go that deep, or whether you just need a simple solution.
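To make the OAuth2 suggestion a bit more concrete, here is a very rough sketch of a bearer-token check in plain PHP. A real setup would use a proper OAuth2 server library; the token lookup below is a stand-in:

<?php
// Stand-in token check; replace with real OAuth2 token introspection.
function isValidToken(string $token): bool
{
    return $token === 'demo-token'; // e.g. look it up in your token store
}

$header = $_SERVER['HTTP_AUTHORIZATION'] ?? '';
if (!preg_match('/^Bearer\s+(\S+)$/', $header, $m) || !isValidToken($m[1])) {
    http_response_code(401);
    header('Content-Type: application/json');
    echo json_encode(['error' => 'invalid_token']);
    exit;
}
// Token OK: serve the protected statistics below.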
I think multitenancy is the best choice. Generally speaking, multitenancy means every customer has its own database: the data is separate, while the codebase is the same and already exists. As I understand it, the project is already in progress, so you wouldn't have to redesign or rewrite anything.
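A minimal sketch of what 'one database per client' can look like, assuming the client name is the first URL segment as in www.website.com/client1 (the connection map is an assumption; in practice it would live in config):

<?php
// Resolve the tenant from the first URL segment, then open its database.
$path     = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$segments = explode('/', trim($path, '/'));
$tenant   = $segments[0] ?: 'default';

$databases = [
    'client1' => 'stats_client1',
    'client2' => 'stats_client2',
];

$dbName = $databases[$tenant] ?? 'stats_default';
$pdo    = new PDO("mysql:host=localhost;dbname=$dbName", 'user', 'secret');
// Every query below now runs against this client's own data only.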
So, I ran into this problem once. I had delivered a website that used the SoundCloud API. Everything worked properly: content was extracted from the JSON and placed in the layout of the website. However, one day I received an email from the owner of the website saying that the site no longer worked properly. I investigated and concluded that the "problem" was not on my side, but on SoundCloud's. I studied SoundCloud's API page and found that the API had received a major update, so the integration between SoundCloud and the site no longer worked.
Lately I've been trying many new APIs too, including those from Instagram and Dribbble. I was therefore wondering whether it is at all possible to reduce such problems in the future, or whether it would be appropriate to monitor the API pages of these third-party APIs?
There's no "right" answer. After many years of using and maintaining many APIs here are some of the conclusions I've come to:
The best providers let you work with a specific version of their API whose interface and expected behavior never changes. They might release bug fixes and new endpoints, but you can be confident that as long as the API is supported it will not break your system.
A good provider will provide an end-of-life date for each version of their API. It's up to you to keep track of when you need to update.
Paid services will often be supported longer than free services. Plus the contract / SLA will guarantee it remains available for a specific amount of time.
The most popular APIs often have mailing lists and/or blogs. For those that offer it, sign up to be notified of updates. For those that don't you'll have to monitor their blogs or news posts. And I suggest not using any service that would drop support for an API version without warning.
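To illustrate the "work with a specific version" point: providers usually encode the version in the URL path or in a request header. A hypothetical PHP example of both (the endpoint and media type are made up):

<?php
// Version pinned in the URL path...
$ch = curl_init('https://api.example.com/v2/tracks/123');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    // ...and/or pinned via a vendor media type in the Accept header.
    CURLOPT_HTTPHEADER => ['Accept: application/vnd.example.v2+json'],
]);
$track = json_decode(curl_exec($ch), true);
curl_close($ch);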
Is anyone having trouble building APIs with the new Kimono Desktop app? When I click Create API, the browser just stays on the Create API screen, loading, but nothing happens.
I even tried building an API identical to one I built when Kimono was still in business, and had no luck.
My existing APIs still run and pull data fine.
Working just fine here.
Just a shot in the dark... did you install the new extension made specifically for the desktop app? The old one can't work anymore since Kimono disabled the online service.
I think you can't build new APIs - only run existing ones. :( They said something like it was not the acquiring company's policy to provide this publicly available service, but since they did not know how many businesses depended on their APIs, they decided to allow existing APIs to keep running.
You can also use their software to replace the online API, but it's abandonware.
It's working great, but on some websites I can't create a new API because of an error.
I just found a possible (free) product that can replace Kimono: Datascraping.co.
They will include a RESTful API within a few days/weeks, as they said here.
It's good software, but compared to Kimono Desktop it's a little more complicated.
Kimono got bought out and it's not working anymore.
Import.io is good, but it's kind of expensive for me. I tried the free trial before, but the price starts from $249, so... https://www.import.io/standard-plans/
I don't have any programming skills, so I know very little about web scraping frameworks like Scrapy.
You may want to look at this new web scraping software called Octoparse.
In case you want to know, here is the link: http://www.octoparse.com/
It takes a little time to get started, but they have rich tutorials on the website. Plus, it doesn't require programming skills.
They provide a cloud-based scraping service and API access.
You can use it to create APIs, but you have to pay for the service; the free version doesn't include the API or the cloud service.
I've been using it for two months on the $89 Standard plan, which is a lot cheaper than import.io. So far so good.
Hope this is helpful for you.
So. I have embarked on the journey of learning Laravel in the last couple of weeks, and am thoroughly enjoying it.
It has come time for a site redesign and I thought it was about time to tighten up some of our functionality, so I am making the switch from CodeIgniter to Laravel.
I was wondering whether it is worth starting off with a RESTful API layer in Laravel (easy enough to create) and using it as a base even for the web application. In the future we are likely to build a mobile app that will need to use the API. So:
1. Is it worth having the web application connect to the API?
2. What is the easiest way to make calls to the API without having to write a bazillion lines of cURL every time I want to make a request?
It is definitely worth it.
I am currently redesigning messy PHP code for an established hosting company, turning it into beautiful Laravel code. I already have a mobile app working with it - Laravel makes it easy to return JSON data with one line -
return Response::json('OK', 200);
or
return Response::eloquent(Auth::user());
or
$tasks = Task::all();
return Response::eloquent($tasks);
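For context, those one-liners would sit inside a route or controller action, e.g. (the route name is illustrative):

Route::get('status', function () {
    return Response::json('OK', 200); // served as application/json
});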
You don't need to use cURL as far as I know. You can make all requests with simple AJAX; jQuery would simplify that.
Also, using an MVC JavaScript framework would help you make the client-side application code structure more elegant, and the advantage is that you can easily package it with PhoneGap when you are ready to have your API take some real testing.
Today I posted on my blog a simple example that you can try, to see if this approach is worth your time: http://maxoffsky.com/code-blog/login-to-laravel-web-application-from-phonegap-or-backbone/
Check it out and accept the answer if you think it's on the right track.
As always, your results may vary, but this is exactly what I'm going through at the moment. I'm migrating a large .Net site with this API architecture and I've decided to keep it for Laravel.
I personally decided on this approach because:
- More scalable. I can set up api.domain.com and then add additional boxes/VMs/whatever as our traffic grows. In fact, you could load balance just the API via round robin or multiple DNS entries for that domain.
- Future-proofing for new sites and apps. Sounds like you're in the same situation. I can see an app or two being added in the next year or so.
- Low cost. You'll already be laying out your controllers, so really it can be just a matter of making them RESTful and making small tweaks to accommodate.
To be fair, some counterpoints:
- Possibly additional load time from processing through the API, though this should be minimal.
- Additional security items to consider if you'd like to lock things down to just your app.
Either way, welcome to Laravel!
"What is the easiest way to make calls to the API without having to write a bazillion lines of cURL every time I want to make a request?"
@Sneaksta: try the Postman Chrome extension for calling REST services. You can create forms in the extension and pass data from those forms to your REST services.
https://chrome.google.com/webstore/detail/postman-rest-client/fdmmgilgnpjigdojojpjoooidkmcomcm?utm_source=chrome-ntp-icon
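And if you end up calling the API from PHP on the server side after all, a tiny helper keeps the cURL boilerplate in one place. A sketch, with the base URL as a placeholder:

<?php
// One-line JSON GETs against your own API instead of repeated cURL setup.
function apiGet(string $path): array
{
    $ch = curl_init('https://api.domain.com/' . ltrim($path, '/'));
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => ['Accept: application/json'],
    ]);
    $body = curl_exec($ch);
    curl_close($ch);
    return json_decode($body ?: '[]', true) ?? [];
}

$tasks = apiGet('tasks'); // one line per request from here on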