While Zapier is very powerful at gluing various APIs together, I stumbled upon a use case for creating Zaps programmatically, which would offer even more flexibility.
I did some investigating, and it appears that Zapier does not, to date, provide a public API for doing so. Is there any workaround for that?
Not currently. Creating a Zap involves a lot of UI steps (6 or 7) - many of which depend on choices made in the previous step - so trying to do all that through a remote script might be difficult and messy. Maybe share some ideas on your specific use case: why do you need to create a Zap programmatically?
Keep in mind you can also create private apps on Zapier, so if there is specific functionality you need to use with Zapier, you can do that without exposing it to the world.
In 2017, Zapier introduced their Command Line Interface (CLI), which makes it possible to build your app programmatically in JavaScript.
You can take a look at the Example Apps as well as the CLI docs.
This is my first post/question. If there's an existing thread that answers my question, I missed it in my search and would definitely appreciate you linking me to it! Please let me know if I should be posting/asking this elsewhere.
My question relates to Salesforce.
I have a use case where a client has a monthly batch of files that need to be made available on various cloud-based storage/distribution platforms like Box and Dropbox, as well as other less ubiquitous tools specific to the sector. Currently, the client logs into each distribution platform, one at a time, and uploads the files; then, if at any point any files need to be updated or removed/restricted, the client logs into each platform one by one and takes the necessary action. Obviously, this process is tedious and laborious and leaves multiple openings for error.

The client and I are discussing a solution that would allow create/read/update/delete actions on all of the distribution platforms without having to leave their Salesforce org. I am aware of the existing AppExchange integrations for Box, Dropbox, etc., but to my knowledge they don't quite do everything we need. They tie in nicely, and there are use cases where they are powerful tools, but my understanding is that they would still each require a dedicated tab within the object and repeated drags and drops of the same files to each tab. Again, the end goal here is that, for example, the client drags and drops one time and the file is pushed to the various platforms; or, as another example, they choose "delete" one time from within Salesforce and the file is removed/restricted on all distribution platforms.
I am a certified SF Admin 1, so perhaps this should be in my wheelhouse, but I feel unsure how to approach it. My feeling is that this calls for a combination of API integrations and Process/Flow work, but I am hoping for some ideas/input/guidance. Any insight or help any of you have to offer would be so greatly appreciated!
Thanks so much!
Over the years I have written many apps for myself and for customers using VB.NET. Most of these were standalone apps that ran on workstations and servers and didn't involve much, if anything, to do with networking, client/server, or a backend.

I want to evolve and learn more, and am specifically interested in creating a "backend" for some of my apps, so that data from the client app (whether error information or custom data for a client) can be sent from the "client" to whatever the best option for the "backend" is. I assume I would then store the received data in a database and have the ability to view it and parse it to show specific results, etc.

This is a new area for me but one I want to learn more about, so I am hoping that others who have this knowledge/experience can give me some pointers as to the best route to take, especially covering the points below and anything else you think I need to know but have missed:
What to use for the "backend"? I prefer to have it in the cloud rather than running my own server, so do I use something like Azure? If so, what? Or do I sign up for hosting with a company which provides ASP and .NET Core hosting and create something there? If so, what would I use?
What are the best methods for sending data from A to B? I notice most services I currently use in my apps (e.g. for sending mail or error reports) often use JSON. Is that the best option?
How do I send the data? Are there any built-in or suggested frameworks or APIs to use to "post / send" information from my client to my backend / remote endpoint?
Hope this all makes sense; please ask if not. All comments appreciated. As I say, I want to use this both as a way to modernize some of my existing apps and as a learning exercise in areas I don't know, and hopefully also to make use of newer / evolving technologies and solutions.
Thanks
I was assigned a project where we already have an up-and-running website, and one of our clients wants to be able to track statistics from it.
We want to make this available to all our clients as soon as we finish development. Note that each 'client' has their own 'subdomain', so to speak, e.g. www.website.com/client1, www.website.com/client2, etc., and we want to track usage separately for each of these clients.
We will need to create statistics based on the usage of our own platform, pull in data registered by Google Analytics, and also pull in data from a 3rd party, which they will offer via an API of their own (they have a 3rd-party solution that uses the data accessible via our API).
All this data needs to be shown on a webpage with graphs and tables.
I wanted to make sure we choose the right architecture from the start, in order to avoid scalability issues later on.
I've started reading about private and public APIs lately.
For now, we do not yet have another (internal) application that would use our own statistics; it would just be the website using them. But in order to be able to scale up later if needed, when another application wants to use the statistics, I think a private API would benefit us greatly.
In order to allow 3rd parties to use the statistical data we choose to expose, I was thinking of creating a public API.
Is a private & public API the correct way to go about this?
One of the questions I am stuck on is what the architecture for these APIs should look like. Right now we already have a public API for vacancy data. This 'API' is basically just a PHP class (a controller) inside our CodeIgniter solution; it gets called via its URL and returns a JSON object with the results (e.g. www.website.com/api/vacancy/xxx).
In order to create a (proper) private & public API solution/architecture, should the API be separated from the website (CodeIgniter)? What are the common go-to solutions for this? Or is it fine to keep it in our current platform the way it is now (with the stats API called via www.website.com/api/stats/xxx, for example)?
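For reference, here is roughly what that kind of controller looks like, much simplified (the class, model, and method names here are hypothetical, and routing api/vacancy/xxx to the method is assumed to be configured):

<?php
// application/controllers/api/Vacancy.php -- hypothetical path (CodeIgniter 3 style)
class Vacancy extends CI_Controller {

    // Handles GET www.website.com/api/vacancy/123 (given a matching route)
    public function show($id)
    {
        // Hypothetical model and lookup method
        $this->load->model('vacancy_model');
        $vacancy = $this->vacancy_model->get($id);

        // Return the result as a JSON object
        $this->output
             ->set_content_type('application/json')
             ->set_output(json_encode($vacancy));
    }
}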
It's almost always right to go with a microservices-like architecture, so your initial thoughts sound reasonable. Acting on this will give you the ability to scale and deploy your API independently, and it will also help you avoid performance side effects on your site (and vice versa). Pay attention to how you access your main site's data from within the new API if you don't want to end up with a monolithic application.
Regarding the API, I would suggest you implement a protocol like OAuth2 in order to achieve the flexibility you (might) need. You can also use Swagger to document and test your API.
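To make that concrete, here is a much-simplified sketch of a bearer-token check in plain PHP. It is not a full OAuth2 server (for that you'd reach for a maintained library such as league/oauth2-server), and tokenIsValid() is a hypothetical lookup against your own token store:

<?php
// Simplified stand-in for OAuth2-style protection, not a full implementation.
function authenticateRequest(array $headers): bool
{
    // Expect an "Authorization: Bearer <token>" header
    if (!isset($headers['Authorization'])) {
        return false;
    }
    if (!preg_match('/^Bearer\s+(\S+)$/', $headers['Authorization'], $matches)) {
        return false;
    }

    // tokenIsValid() is hypothetical: check the token against your DB or cache
    return tokenIsValid($matches[1]);
}

if (!authenticateRequest(getallheaders())) {
    http_response_code(401);
    exit;
}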
All I said might help you a lot, but first you have to ask yourself whether you really need to go that deep or whether you just need a simple solution.
I think multitenancy is the best choice. Generally speaking, multitenancy is when every customer has their own database: the data is separate, while the codebase is the same and already exists. As I understand it, the project is already in progress, so you would not need to redesign or rewrite anything.
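As a rough sketch of what that separation could look like in the existing CodeIgniter codebase (the group names, whitelist, and table are hypothetical), each client gets its own database group and the app picks one per request:

<?php
// Hypothetical: pick a per-client database group based on the
// /client1, /client2 path segment described in the question.
$client  = $this->uri->segment(1);            // e.g. "client1"
$allowed = ['client1', 'client2'];            // assumed whitelist from config

if (in_array($client, $allowed, true)) {
    // Each client has its own group in application/config/database.php
    $db = $this->load->database($client, true);
    $stats = $db->get('page_views')->result(); // hypothetical stats table
}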
Like most people, we're pretty impressed with BigQuery. We're willing to put up with it being based on proprietary "Dremel" in exchange for not having to configure a ton of servers in our LAN, on EC2, or anywhere else.
The REST API is excellent, and we're incorporating that into our apps, but we still find ourselves using the BQ Browser interface as well. We'd like to incorporate something like a 'generic SQL window' into our app, without divulging that the backend is BQ or that data is stored in Google at all, for that matter. Does Google provide a way to use their BQ browser tool in a white-label manner?
Note also, that even extending access to the existing browser tool is problematic. It relies on user-accounts existing in one's own domain - something that can't be done, in our case, with a customer's email address. The REST interface solves this with service-level accounts, but that doesn't get you to the SQL window/browser tool.
If the folks at Google are listening (and I know that you are), consider the benefits of white-labeling the browser tool: I think you'd find a lot of software companies integrating it into their suites of products and, then, running circles around any Hadoop/CDH/EMR/Impala/Hive combination.
So, to summarize: how does a software developer import or emulate the BQ browser tool (with all its autocompletes, query histories, etc.) in their own web-based app?
The initial version of the BigQuery web interface was considered just an 'example' UI that anyone could create themselves. It uses only the public BigQuery API to talk to BigQuery.
There are a couple of Google-internal things we've added since then, such as the current design of 'saved queries', and an auth shortcut so that users don't have to explicitly grant permission to the UI to access BigQuery data. But it is still mostly plain-ol-javascript talking to BigQuery via the REST API the same way anybody else does.
The JavaScript is obfuscated, but my understanding is that this is just for compression purposes so that it downloads more quickly.
The SQL highlighting is done by CodeMirror with special configuration for the BigQuery SQL variant.
I'll talk to the other members of the BigQuery team about open-sourcing the JavaScript code in the web UI. It may be difficult to do at this point, but it doesn't hurt to have a conversation about it. I'll bring this up with the team and update this thread. The most likely answer will be "We'll think about it", but hopefully we can start working on it too :-)
Let me know if that sounds like it would meet your needs. It might not solve the auth problems you mention, since your users likely won't have BigQuery accounts, but you may be able to solve that by proxying oauth2 access tokens.
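A rough sketch of that proxying idea in PHP: your server holds the credentials and forwards queries to the BigQuery REST API. getServiceAccountAccessToken() is a hypothetical helper; in practice you'd obtain the token through the service-account OAuth2 flow (e.g. with Google's client library):

<?php
// The browser talks only to your app; your app attaches its own
// service-account token and forwards the query to BigQuery's jobs.query endpoint.
function runQueryForCustomer(string $projectId, string $sql): array
{
    $token = getServiceAccountAccessToken(); // hypothetical helper

    $ch = curl_init("https://www.googleapis.com/bigquery/v2/projects/$projectId/queries");
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST           => true,
        CURLOPT_HTTPHEADER     => [
            "Authorization: Bearer $token",
            'Content-Type: application/json',
        ],
        CURLOPT_POSTFIELDS     => json_encode(['query' => $sql]),
    ]);
    $response = curl_exec($ch);
    curl_close($ch);

    // Rows and schema come back in the jobs.query response
    return json_decode($response, true);
}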
So. I have embarked on the journey of learning Laravel in the last couple of weeks, and am thoroughly enjoying it.
It has come time for a site redesign and I thought it was about time to tighten up some of our functionality, so I am making the switch from CodeIgniter to Laravel.
I was wondering whether it is worth starting off with a RESTful API layer in Laravel (easy enough to create) and using it as a base even for the web application. In the future we are likely to build a mobile app that will need to use the API. So:
Is it worth having the web application connect to the API, and what is the easiest way(s) to make calls to the API without having to write a bazillion lines of cURL every time I want to make a request?
It is definitely worth it.
I am currently redesigning messy PHP code for an established hosting company, turning it into beautiful Laravel code. I already have a mobile app working with it; Laravel makes it easy to return JSON data with one line:
// Return a plain string as JSON with an HTTP status code
return Response::json('OK', 200);

// Or serialize an Eloquent model directly
return Response::eloquent(Auth::user());

// Or return a whole collection
$tasks = Task::all();
return Response::eloquent($tasks);
You don't need to use cURL as far as I know. You can make all requests with simple AJAX; jQuery would simplify that. Also, using a client-side MVC JS framework would help make the application code structure more elegant, and the advantage is that you can easily package it with PhoneGap when you are ready to have your API take some real testing.
Today I posted on my blog a simple example that you can try, to see if this approach is worth your time: http://maxoffsky.com/code-blog/login-to-laravel-web-application-from-phonegap-or-backbone/
Check it out and accept the answer if you think it's on the right track.
As always, your results may vary, but this is exactly what I'm going through at the moment. I'm migrating a large .NET site with this API architecture, and I've decided to keep it for Laravel.
I personally decided for this because:
More scalable. I can set up api.domain.com and then add additional boxes/VMs/whatever as our traffic grows. In fact, you could load balance just the API by "round robin" or multiple DNS entries for that domain.

Future proofing for new sites and apps. Sounds like you're in the same situation. I can see an app or two being added in the next year or so.

Low cost. You'll already be laying out your controllers, so really it can be just a matter of making them RESTful and making small tweaks to accommodate.
To be fair, some counterpoints:
Possibly additional load time, from processing through the API, though this should be minimal.
Additional security items to consider if you'd like to lock things down to just your app.
Either way, welcome to Laravel!
and what is the easiest way/s to make calls to the API without having to write a bazillion lines for cURL everytime I want to make a request?
@Sneaksta, try the Postman Chrome extension for calling REST services. You can create forms in this extension and pass data from these forms to your REST services.
https://chrome.google.com/webstore/detail/postman-rest-client/fdmmgilgnpjigdojojpjoooidkmcomcm?utm_source=chrome-ntp-icon
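And for calls from PHP itself, one way to avoid the bazillion lines of cURL is to write them once in a small wrapper. A rough sketch (the endpoint URL is a placeholder):

<?php
// Write the cURL boilerplate once and reuse it everywhere.
function apiRequest(string $method, string $url, array $data = null)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CUSTOMREQUEST, $method);
    if ($data !== null) {
        // Send the payload as JSON
        curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($data));
        curl_setopt($ch, CURLOPT_HTTPHEADER, ['Content-Type: application/json']);
    }
    $body = curl_exec($ch);
    curl_close($ch);
    return json_decode($body, true);
}

// Usage (placeholder URL):
$tasks = apiRequest('GET', 'https://example.com/api/tasks');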