I am trying to use UrlFetchApp to submit CSV data to Zoho Reports. I am using the following code:
function doImport(tabla, file) {
    var url = "https://reportsapi.zoho.com/api/xxxxxxxx/yyyyyyyyyyy/" + tabla;
    var ticket = "zzzzzzzzzzzzzzzz"; // getTicket();
    url = url + "?ZOHO_ACTION=IMPORT&ZOHO_OUTPUT_FORMAT=XML&ZOHO_ERROR_FORMAT=json&ZOHO_API_VERSION=1.0";
    var params = {
        "ZOHO_API_KEY": "vvvvvvvvvvvvvvvvvvvvvv",
        "ticket": ticket,
        "ZOHO_FILE": file,
        "ZOHO_IMPORT_TYPE": "APPEND",
        "ZOHO_ON_IMPORT_ERROR": "ABORT",
        "ZOHO_AUTO_IDENTIFY": "true",
        "ZOHO_CREATE_TABLE": "false",
        "ZOHO_DATE_FORMAT": "dd-MM-YYYY",
        "ZOHO_DELIMITER": "0"
    };
    var options = {
        "method": "post",
        "payload": params,
        "contentType": "multipart/form-data"
    };
    var response = UrlFetchApp.fetch(url, options);
    var tableDataString = response.getContentText();
    expireTicket(ticket);
    Logger.log(tableDataString);
    return tableDataString;
}
However, the data is not submitted in the correct multipart format (I get an error 500 status). This issue goes back to early 2011. One or two examples of how to submit Blob files in multipart/form-data format would be welcome.
Thanks
For payload, you are passing it as an Object, which looks correct. This will be interpreted as an HTTP form (which you want).
To fix your script, try the following:
Make sure the value you're using for ZOHO_FILE is a Blob. This ensures the HTTP form will automatically be sent with: Content-Type: multipart/form-data; boundary=[automatically determined]
Do not specify contentType for the HTTP POST. This allows UrlFetchApp to automatically use its own contentType value, which includes the boundary field. (Minor detail: it's OK to still specify contentType on the Blob itself, just not on the overall POST request. This lets you set the contentType of each Blob within the post, if that interests you.)
UrlFetchApp will use multipart/form-data encoding automatically if you pass a Blob as a payload value. You may need to use:
"ZOHO_FILE": file.getBlob()
I'm trying to send a POST request to the server; this POST requires the parameters "email" and "password",
but I don't know how to specify the parameters. I read the documentation but I didn't understand it.
This is my code:
val request = client.post<String> {
    url(BASE_URL + "login.php")
    body = "email=$email,password=$password"
}
FWIW, I use something like the following here... though I would have thought specifying the url like you do should also work. What issue do you see? The body might also be some JSON, for example, or maybe a data class etc. if you have serialization set up.
response = client.post(url) {
    body = "some params/data etc"
}
It should work if you use serialization, but I solved my problem by following the 'Uploading multipart/form-data' approach:
val request = client.post(url) {
    body = MultiPartFormDataContent(formData {
        append("email", "data")
        append("password", "data")
    })
}
see Documentation
I am completely new to coding. I am trying to build a dashboard in Klipfolio, using the CATSone API to pull data from CATSone into Klipfolio. However, I can only get 100 rows at a time, which means I would have to pull data 2,600 times.
I am now trying to build a script to get the data from the API through the Google Apps Script editor. Since I have no experience with this, I am just trying things out. I watched some videos, including some from Ben Collins. The basics are simple, and I understand what he is doing.
However, I have a problem with supplying the API key.
var API_KEY = 'key';

function callCATSone() {
  //Call the CATSone API for all candidate list
  var response = UrlFetchApp.fetch("https://api.catsone.nl/v3/candidates");
  Logger.log(response.getContentText());

  // URL and params for the API
  var url = 'https://api.catsone.nl/v3/candidates';
  var params = {
    'method': 'GET',
    'muteHttpExceptions': true,
    'headers': {
      'Authorization': 'key ' + apikey
    }
  };

  // call the API
  var response = UrlFetchApp.fetch(url, params);
  var data = response.getContentText();
  var json = JSON.parse(data);
}
In the end, I would like to transfer all the candidate list data to my sheet. Therefore, I call the API with an Authorization key. After that, I will manipulate the data, but that's for later. The first problem I encounter is this error:
'Request for https://api.catsone.nl/v3/candidates failed. Error code: 401. Truncated server response: {"message":"Invalid credentials."} (Use the muteHttpExceptions option to examine the full response.) (line 6, file 'Code')'.
I expect to get a list of all data from CATSone into my sheets.
Does anyone know how I can accomplish this?
Two changes should fix the credentials error:
The Authorization header should be Authorization: 'Token ' + yourApiKey instead of 'key '; see the v3 API documentation: https://docs.catsone.com/api/v3/#authentication.
The API key in your case is stored in the global variable API_KEY, so you should reference it exactly like that, not as apikey (unless there is a typo in your sample or some missing code): Authorization: 'Token ' + API_KEY.
Btw, you should probably also set either a Content-Type header or the contentType parameter of the UrlFetchApp.fetch() call to application/json, as the UrlFetchApp.fetch() request content type defaults to application/x-www-form-urlencoded.
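A minimal sketch with those fixes applied (the 'Token ' scheme comes from the linked v3 docs; API_KEY stays the question's placeholder):
var API_KEY = 'key';

function callCATSone() {
  var url = 'https://api.catsone.nl/v3/candidates';
  var params = {
    'method': 'GET',
    'muteHttpExceptions': true,
    'contentType': 'application/json',
    'headers': {
      'Authorization': 'Token ' + API_KEY   // reference the global API_KEY
    }
  };
  var response = UrlFetchApp.fetch(url, params);
  var json = JSON.parse(response.getContentText());
  Logger.log(json);
  return json;
}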
If you plan to continue working with APIs, it would be beneficial to read this MDN article.
I am trying to do a domain availability search using an API from free domain API.
After I create an account, it shows:
**Make a REST request using this URL:**
http://freedomainapi.com/?key=11223344&domain=freedomainapi.com
And looking in the documentation page, it has only:
Request http://freedomainapi.com?key=YOUR_API_KEY&domain=DOMAIN_NAME
Result:
{
    "status": "success",
    "domain": "freedomainapi.com",
    "available": false
}
I am very new to APIs...
What I need is to show a domain search box, and when the user submits it, it should return the result.
It claims to show domain suggestions as well. I hope that will also work.
Using jQuery and a JSONP proxy:
http://jsfiddle.net/mp8pukbm/1/
$.ajax({
    type: 'GET',
    url: "https://jsonp.nodejitsu.com/?callback=?",
    data: {url: 'http://freedomainapi.com?key=14ejhzc5h9&domain=freedomainapi.com'},
    dataType: "jsonp",
    success: myfn
});

function myfn(data) {
    console.log(data);
}
You have to use the proxy because cross-domain JSON requests are not permitted.
EDIT: I made an update to show the result in a div (stringified): http://jsfiddle.net/mp8pukbm/2/
EDIT #2: I created a test key on that site; you have to use your own.
EDIT #3: And there's your combo: http://jsfiddle.net/mp8pukbm/4/
Assuming that you will use JavaScript for showing the search box, you can use the AJAX feature of JavaScript (or jQuery, or Dojo). All you need to do is make a "GET" request like the one you pasted, and you will get the result back in the response object. To try out the API you can use the "Postman" application in Chrome: https://chrome.google.com/webstore/detail/postman-rest-client/fdmmgilgnpjigdojojpjoooidkmcomcm?hl=en
In the response of the AJAX call you will get a JSON object, which you can parse and display.
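For example, a minimal jQuery sketch (the #result div and the handler are made up for illustration; the same cross-domain restriction mentioned in the JSONP answer above applies when running this directly in a browser):
$.getJSON('http://freedomainapi.com/', { key: 'YOUR_API_KEY', domain: 'example.com' })
    .done(function (data) {
        // data.status, data.domain, data.available
        $('#result').text(data.domain + (data.available ? ' is available' : ' is taken'));
    });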
Normally when we use REST we need to differentiate one REST call from another.
Assuming this url
http://freedomainapi.com/checkAvailability?key=YOUR_API_KEY&domain=DOMAIN_NAME
In the application layer we need to write an interface:
@GET
@Path("/checkAvailability")
@Produces({MediaType.APPLICATION_JSON})
public ReturnObject getDomainAvailability(@QueryParam("key") String key,
                                          @QueryParam("domain") String domainName);
Once the interface is done you need to write your implementation class.
This class will interact with the business layer, perform the search task and, based on
the result collected, create the ReturnObject.
ReturnObject => will contain status, domain and availability
On screen
$.ajax({
    type: "GET",
    url: 'root/checkAvailability',
    success: function(jsonData) {
        // read json and perform operation
    },
    error: function(error) {
        // handle error
    }
});
If you are using Java as the backend then you can use Gson to parse the result, which is JSON. After parsing you can read the values from the result and display them accordingly :)
Any API is a way to extend a given piece of software (which might be a website or an application).
In both cases there is a defined way to communicate with the software. In your example, freedomainapi.com allows you to check whether a given domain is available. There is no such thing as a suggestion, though; at least I cannot find any suggestions at all.
The output is in a message format known as JSON. It can be easily interpreted by many major languages such as Java, JavaScript and PHP.
The given string can be interpreted as a map consisting of a status (string), a domain (string) and available (boolean).
A domain availability search could not be easier, assuming K is your key and D is your search input (domain):
Download http://freedomainapi.com/checkAvailability?key=K&domain=D as input
Parse JSON from input as json
return json["status"] == "success" and json["available"]
Depending on your language you might need to use methods to access properties of json, but that does not influence the basic usage of this api.
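Translated into JavaScript, those steps might look roughly like this (a sketch only; isAvailable is a made-up helper name, and fetch assumes a recent Node.js or a page where the cross-domain request is allowed):
async function isAvailable(key, domain) {
    // 1. Download the REST URL
    const response = await fetch('http://freedomainapi.com/?key=' + key + '&domain=' + domain);
    // 2. Parse the JSON from the response
    const json = await response.json();
    // 3. Available only if the call succeeded and the flag is set
    return json.status === 'success' && json.available;
}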
When the user submits, it calls the click_button function. I am assuming the id of the div that displays your result is "main_container". You can give domain suggestions by passing related DOMAIN_NAMEs as arguments to the click_button function:
function click_button(DOMAIN_NAME) {
    $.ajax({
        url: 'http://freedomainapi.com?key=YOUR_API_KEY&domain=' + DOMAIN_NAME,
        type: 'GET',
        crossDomain: true,
        contentType: "application/json; charset=utf-8",
        success: function(data) {
            data = JSON.parse(data);
            if (data['available']) {
                $('#main_container').html($('#main_container').html() + '<br>' + DOMAIN_NAME + ': Available');
            } else {
                $('#main_container').html($('#main_container').html() + '<br>' + DOMAIN_NAME + ': Not Available');
            }
        } //success
    }); //ajax
}
Hope it helps!
There seem to be some differences between saving a model using this.model.save() and using a jQuery AJAX PUT.
I have the following method in my API controller:
public void Put(string id, [FromBody]IContent value) {
// save
}
I have also enabled TypeNameHandling on the JSON formatter's serializer settings like this:
config.Formatters.JsonFormatter.SerializerSettings.TypeNameHandling = TypeNameHandling.Auto;
If I PUT some data using jQuery AJAX like this:
$.ajax({
    url: "/api/page/articles/1",
    type: "PUT",
    dataType: "json",
    contentType: "application/json",
    data: JSON.stringify({ "$type": "BrickPile.Samples.Models.Article,BrickPile.Samples", "id": "articles/1", "heading": "Some heading..." })
});
my object binds correctly in the Put method, but when I try to save my object using model.save() in Backbone, the input value is null and it cannot bind the object.
This is how I do it:
this.model.set({ heading: 'foo' });
this.model.save();
The request headers seem to look OK and the payload seems to be JSON, at least if I look in Firebug. It's also possible to PUT some data to my API using Fiddler with the same payload, but not if I copy the payload source from Firebug; see: http://cl.ly/Nked
Can anyone explain what I'm doing wrong here?
Without knowing more about your model implementation it is hard to say for sure. One thing I can see from your Firebug screenshot is that the id attribute is being passed as "articles/1", which is unusual for standard Backbone. If you were saving a model object then the id would normally be "1", so a model.save() would generate an HTTP PUT to articles/1 and pass JSON including {"id":"1", ... }. The Backbone.sync documentation has more details on the default behaviour.
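If it turns out the binder also needs the $type hint that your working jQuery call sends, one option would be to add it when Backbone serializes the model, e.g. by overriding toJSON. A rough sketch (assuming Underscore is available and reusing the type string from your question):
var Article = Backbone.Model.extend({
    toJSON: function () {
        // Backbone's default toJSON is _.clone(this.attributes);
        // prepend the $type hint so TypeNameHandling can resolve IContent
        return _.extend(
            { "$type": "BrickPile.Samples.Models.Article,BrickPile.Samples" },
            _.clone(this.attributes)
        );
    }
});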
I found many posts when I searched for this problem, but they all refer to how to upload a file from your browser to a Node.js server. I want to upload a file from Node.js code to another server. I tried to write it based on my limited knowledge of Node.js, but it doesn't work.
function (data) {
    var reqdata = 'file=' + data;
    var request = http.request({
        host: HOST_NAME,
        port: HOST_PORT,
        path: PATH,
        method: 'POST',
        headers: {
            'Content-Type': 'multipart/form-data',
            'Content-Length': reqdata.length
        }
    }, function (response) {
        var data = '';
        response.on('data', function(chunk) {
            data += chunk.toString();
        });
        response.on('end', function() {
            console.log(data);
        });
    });
    request.write(reqdata + '\r\n\r\n');
    request.end();
})
The above function is called by other code that generates data.
I tried to upload the same data file using curl -F "file=@<filepath>" and the upload is successful, but my code fails. The server returns an application-specific error which hints that the uploaded file was invalid/corrupt.
I collected tcpdump data and analysed it in Wireshark. The packet sent from my Node.js code lacks the boundary required for the multipart data. I see this message in the Wireshark packet:
The multipart dissector could not find the required boundary parameter.
Any idea how to accomplish this in node.js code?
jhcc's answer is almost there.
Having to come up with support for this in our tests, I tweaked it slightly.
Here's the modified version that works for us:
var boundaryKey = Math.random().toString(16); // random string
request.setHeader('Content-Type', 'multipart/form-data; boundary="'+boundaryKey+'"');
// the header for the one and only part (need to use CRLF here)
request.write(
'--' + boundaryKey + '\r\n'
// use your file's mime type here, if known
+ 'Content-Type: application/octet-stream\r\n'
// "name" is the name of the form field
// "filename" is the name of the original file
+ 'Content-Disposition: form-data; name="my_file"; filename="my_file.bin"\r\n'
+ 'Content-Transfer-Encoding: binary\r\n\r\n'
);
fs.createReadStream('./my_file.bin', { bufferSize: 4 * 1024 })
.on('end', function() {
// mark the end of the one and only part
request.end('\r\n--' + boundaryKey + '--');
})
// set "end" to false in the options so .end() isn't called on the request
.pipe(request, { end: false }) // maybe write directly to the socket here?
Changes are:
ReadableStream.pipe returns the piped-to stream, so end never gets called on that. Instead, wait for end on the file read stream.
request.end puts the boundary on a new line.
Multipart is pretty complex. If you want it to look like how a client usually handles multipart/form-data, you have to do a few things. You first have to select a boundary key; this is usually a random string used to mark the beginning and end of the parts (in this case there would be only one part, since you want to send a single file). Each part (or the one part) will need a header, introduced by the boundary key, setting the content type, the name of the form field and the transfer encoding. Once the part(s) are completed, you need to mark the end of each part with the boundary key.
I've never worked with multipart, but I think this is how it could be done. Someone please correct me if I'm wrong:
var boundaryKey = Math.random().toString(16); // random string
request.setHeader('Content-Type', 'multipart/form-data; boundary="'+boundaryKey+'"');
// the header for the one and only part (need to use CRLF here)
request.write(
'--' + boundaryKey + '\r\n'
// use your file's mime type here, if known
+ 'Content-Type: application/octet-stream\r\n'
// "name" is the name of the form field
// "filename" is the name of the original file
+ 'Content-Disposition: form-data; name="my_file"; filename="my_file.bin"\r\n'
+ 'Content-Transfer-Encoding: binary\r\n\r\n'
);
fs.createReadStream('./my_file.bin', { bufferSize: 4 * 1024 })
// set "end" to false in the options so .end() isnt called on the request
.pipe(request, { end: false }) // maybe write directly to the socket here?
.on('end', function() {
// mark the end of the one and only part
request.end('--' + boundaryKey + '--');
});
Again, I've never done this before, but I think that is how it could be accomplished. Maybe someone more knowledgeable could provide some more insight.
If you wanted to send it as base64 or an encoding other than raw binary, you would have to do all the piping yourself. It will end up being more complicated, because you're going to have to be pausing the read stream and waiting for drain events on the request to make sure you don't use up all your memory (if it's not a big file you generally wouldn't have to worry about this though). EDIT: Actually, nevermind that, you could just set the encoding in the read stream options.
I'll be surprised if there isn't a Node module that does this already. Maybe someone more informed on the subject can help with the low-level details, but I think there should be a module around somewhere that does this.
As the error message states, you are missing the boundary parameter. You need to add a random string to separate each file from the rest of the files/form data.
Here is how a request could look:
The content type:
Content-Type: multipart/form-data; boundary=----randomstring1337
The body:
------randomstring1337
Content-Disposition: form-data; name="file"; filename="thefile.txt"
Content-Type: application/octet-stream
[data goes here]
------randomstring1337--
Note that the -- at the beginning and end of the random string in the body is significant. It is part of the protocol.
More info here http://www.w3.org/Protocols/rfc1341/7_2_Multipart.html
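For a small file, one way to produce exactly that shape is to build the whole body in memory so Content-Length can be set precisely. This is only a sketch with made-up host, path and file names; for large files the streaming approach above is preferable:
var http = require('http');
var fs = require('fs');

var boundary = '----randomstring1337';
var fileData = fs.readFileSync('./thefile.txt'); // small files only: whole file in memory

var head = Buffer.from(
    '--' + boundary + '\r\n' +
    'Content-Disposition: form-data; name="file"; filename="thefile.txt"\r\n' +
    'Content-Type: application/octet-stream\r\n\r\n');
var tail = Buffer.from('\r\n--' + boundary + '--\r\n');
var body = Buffer.concat([head, fileData, tail]);

var req = http.request({
    host: 'example.com',   // made-up host and path
    path: '/upload',
    method: 'POST',
    headers: {
        'Content-Type': 'multipart/form-data; boundary=' + boundary,
        'Content-Length': body.length
    }
}, function (res) {
    res.pipe(process.stdout); // echo the server's reply
});
req.end(body);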
The fastest way I was able to do this, that worked, was using the request package. The code was well documented and it just worked.
(For my testing I wanted a JSON result and non-strict SSL - there are many other options...)
var url = "http://"; //you get the idea
var filePath = "/Users/me/Documents/file.csv"; //absolute path created elsewhere
var r = request.post( {
url: url,
json: true,
strictSSL: false
}, function( err, res, data ) {
//console.log( "Finished uploading a file" );
expect( err ).to.not.be.ok();
expect( data ).to.be.ok();
//callback(); //mine was an async test
} );
var form = r.form();
form.append( 'csv', fs.createReadStream( filePath ) );