Google Sheets API addProtectedRange Error: No grid with id: 0

I am not sure if I am making a mistake or if possibly this is related to the same issue reported here:
Google Sheets API V4 - Autofill Error - No grid with id: 0
I am getting:
HttpError 400
"Invalid requests[0].addProtectedRange: No grid with id: 1"
Code is something like this (additional addProtectedRange objects removed):
def add_protected_ranges(spreadsheet_id):
    service = get_sheets_service()
    requests = [
        {
            "addProtectedRange": {
                "protectedRange": {
                    "range": {
                        "sheetId": 1,
                        "startRowIndex": 0,
                        "endRowIndex": 0,
                        "startColumnIndex": 0
                    },
                    "description": "Headers must not be changed",
                    "warningOnly": True
                }
            }
        }
    ]
    body = {
        'requests': requests
    }
    response = service.spreadsheets().batchUpdate(spreadsheetId=spreadsheet_id,
                                                  body=body).execute()

I had kind of the same issue: I was confusing the sheet ID with the sheet index.
The easiest way to find the sheet ID is in the browser URL when you open the spreadsheet / sheet: https://docs.google.com/spreadsheets/d/{spreadsheetId}/edit#gid={sheetId}
If you're looking for a more programmatic way, you can find that property on the SheetProperties object.

I know that's an old question but I was looking for it too. You're looking for:
res.data.sheets[].properties.sheetId
To get the sheetId (not the spreadsheetId) use:
sheets.spreadsheets.get({
    spreadsheetId
}).then(res => {
    console.log("All the sheets:");
    for (const i in res.data.sheets) {
        let title = res.data.sheets[i].properties.title;
        let id = res.data.sheets[i].properties.sheetId;
        console.log(title + ': ' + id);
    }
});

Make sure that there is a worksheet with ID 1 in your spreadsheet. In Google Sheets, a spreadsheet can contain multiple worksheets (grids), and each worksheet (grid) has its own unique ID.
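Tying the answers together, here is a minimal sketch (assuming the Node googleapis client and an already-authorized auth object; protectHeaders is a hypothetical helper) that resolves the real sheetId from the tab title before issuing the addProtectedRange request:

const { google } = require('googleapis');

async function protectHeaders(auth, spreadsheetId, sheetTitle) {
    const sheets = google.sheets({ version: 'v4', auth });
    // The grid id lives on each sheet's properties; it is NOT the tab's index.
    const meta = await sheets.spreadsheets.get({ spreadsheetId });
    const sheet = meta.data.sheets.find(s => s.properties.title === sheetTitle);
    if (!sheet) throw new Error('No sheet named ' + sheetTitle);
    const sheetId = sheet.properties.sheetId;
    await sheets.spreadsheets.batchUpdate({
        spreadsheetId,
        requestBody: {
            requests: [{
                addProtectedRange: {
                    protectedRange: {
                        // endRowIndex is exclusive: rows [0, 1) = the header row
                        range: { sheetId: sheetId, startRowIndex: 0, endRowIndex: 1 },
                        description: 'Headers must not be changed',
                        warningOnly: true
                    }
                }
            }]
        }
    });
}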


rightnow-crm and unavailable hours

We have recently implemented RightNow proactive chat on our website; the chat widget was set up by our central IT department to work off hours of availability for chat. I've got the widget working to an OK standard using this example:
http://cxdeveloper.com/article/proactive-chat-intercepting-chat-events
However, coming from Zendesk, I was hoping to replicate the 'out of hours' or 'offline' modes of the Zendesk widget, where it changes its behaviour when chat is offline.
After looking around I noticed the widget can take a value for 'label_unavailable_hours:', but I cannot work out whether this is available to both the ConditionalChatLink and ProactiveChat modules. Does anyone have any experience with creating such functionality? I've also had a look at trying to pull data using chatAvailability, but I am not doing that right either.
If anyone has any insight on how to get some kind of out-of-hours smarts working, or if I am wasting my time trying, I'd love to hear it. My code is below:
$(document).ready(function() {
    RightNow.Client.Controller.addComponent({
            instance_id: "spac_0",
            avatar_image: "",
            label_question: "A team member is available to help, would you like to start a chat?",
            label_avatar_image: "",
            label_dialog_header: "",
            logo_image: "",
            seconds: 2,
            enable_polling: "yes",
            div_id: "proactiveChatDiv",
            module: "ProactiveChat",
            //module: "ConditionalChatLink",
            min_agents_avail: 0, //remove this when live (when set to 0, no agents have to be free)
            min_sessions_avail: 1,
            label_unavailable_busy_template: "'All team members are busy. please email is us' + <a href='urlForEmail'>' + 'this place' + '</a>'", //out of hours
            label_unavailable_hours: "'Outside of hours' + <a href='urlForEmail'>' + 'this place' + '</a>'",
            type: 2
        },
        "https://ourWidgetCode"
    );
    //Widget loaded callback - this doesn't seem to work correctly hence the code below
    RightNow.Client.Event.evt_widgetLoaded.subscribe(function(event_name, data) {
        if (data[0].id == "spac_0") {
            //Initialization
            console.log('widget loaded');
        }
        /* this won't work
        spac_0.prototype.chatAvailability = spac_0.chatAvailability;
        spac_0.chatAvailability = function()
        {
            console.log(spac_0.chatAvailability);
            {}
        };*/
        //Reset prototype
        spac_0.prototype = {};
        //Handle Chat Offered
        spac_0.prototype.chatOffered = spac_0.chatOffered;
        spac_0.chatOffered = function() {
            console.log("Chat Offered Handled");
            spac_0.prototype.chatOffered.call(this);
            //animate the widget to popup from bottom
            setTimeout(function() {
                $('div.yui-panel-container').addClass('animate');
            }, 2000);
            //delete the annoying session cookie that only allows the chat to appear once per session by default
            RightNow.Client.Util.setCookie("noChat", '', -10, '/', '', '');
        };
        //if the 'Do not ask again' is selected
        spac_0.prototype.chatRefused = spac_0.chatRefused;
        spac_0.chatRefused = function() {
            console.log("Do not ask again Selected");
            spac_0.prototype.chatRefused.call(this);
            //Reset the Cookie to be valid only for the session
            RightNow.Client.Util.setCookie("noChat", 'RNTLIVE', 0, '/', true, '');
        };
    });
});

API call from google sheets is too long

I am trying to write a script in Google Sheets to update my 3commas bots. The API requires that a number of mandatory fields be passed, even when there's only one item that needs to be updated.
The code I have is below; it uses the values already read from the platform, updating only the base_order_volume value. This works perfectly except for when the pairs value is long (more than 2k characters), at which point I get an error from the UrlFetchApp call because the URL is too long.
var sheet = SpreadsheetApp.getActiveSheet();
var key = sheet.getRange('F4').getValue();
var secret = sheet.getRange('F5').getValue();
var baseUrl = "https://3commas.io";
var editBots = "/ver1/bots/" + bots[botCounter].id + "/update";
var patchEndPoint = "/public/api" + editBots + "?";
.
.
[loop around values in sheet]
.
.
var BaseOrder = Number(sheet.getRange(rowCounter, 12).getValue().toFixed(2));
var botParams = {
    "name": bots[botCounter].name,
    "pairs": bots[botCounter].pairs,
    "max_active_deals": bots[botCounter].max_active_deals,
    "base_order_volume": BaseOrder,
    "take_profit": Number(bots[botCounter].take_profit),
    "safety_order_volume": bots[botCounter].safety_order_volume,
    "martingale_volume_coefficient": bots[botCounter].martingale_volume_coefficient,
    "martingale_step_coefficient": Number(bots[botCounter].martingale_step_coefficient),
    "max_safety_orders": bots[botCounter].max_safety_orders,
    "active_safety_orders_count": Number(bots[botCounter].active_safety_orders_count),
    "safety_order_step_percentage": Number(bots[botCounter].safety_order_step_percentage),
    "take_profit_type": bots[botCounter].take_profit_type,
    "strategy_list": bots[botCounter].strategy_list,
    "bot_id": bots[botCounter].id
};
var keys = Object.keys(botParams);
// Build "key=value&key=value..." on top of the endpoint path
var totalParams = keys.reduce(function(q, e, i) {
    q += e + "=" + encodeURIComponent(JSON.stringify(botParams[e])) + (i != keys.length - 1 ? "&" : "");
    return q;
}, patchEndPoint);
// Sign the path + query string, then hex-encode the HMAC bytes
var signature = Utilities.computeHmacSha256Signature(totalParams, secret);
signature = signature.map(function(e) { return ("0" + (e < 0 ? e + 256 : e).toString(16)).slice(-2); }).join("");
var headers = {
    'APIKEY': key,
    'Signature': signature,
};
try {
    var params = {
        'method': 'PATCH',
        'headers': headers,
        'muteHttpExceptions': true
    };
    var response = JSON.parse(UrlFetchApp.fetch(baseUrl + totalParams, params).getContentText());
I have tried to set botParams as a payload in the params, but when I do, the signature is incorrect.
If anyone knows how to use Sheets to make a call with parameters of this length, I'd appreciate any help at all.
Some sample data for the bots array would be:
{
    "name": "TestBot",
    "base_order_volume": 0.001,
    "take_profit": 1.5,
    "safety_order_volume": 0.001,
    "martingale_volume_coefficient": 2,
    "martingale_step_coefficient": 1,
    "max_safety_orders": 1,
    "active_safety_orders_count": 1,
    "safety_order_step_percentage": 2.5,
    "take_profit_type": "total",
    "stop_loss_percentage": 0,
    "cooldown": 0,
    "pairs": ["BTC_ADA", "BTC_TRX"],
    "trailing_enabled": "true",
    "trailing_deviation": 0.5,
    "strategy_list": [{"strategy": "cqs_telegram"}]
}
Thanks in advance
I'd consider using a Cloud Function to either do the heavy lifting, or, if you're worried about costs, use it as a proxy. You can then call the cloud function from Google Sheets. Cloud Functions can be written in whatever language you're most comfortable with, including Node.
Check the GCP pricing calculator to see what the cost would be. For many cases it would be completely free.
This should give you a sense of how to use cloud functions for CSV creation:
https://codelabs.developers.google.com/codelabs/cloud-function2sheet#0
Here is a SO question with an answer that explains how to query cloud functions with authentication.
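To make the proxy idea concrete, here is a hedged sketch of an HTTP Cloud Function (Node 18+, which has a global fetch). The function name and request shape are my own assumptions, not anything 3commas prescribes: the bot parameters travel to the function in a POST body, so Apps Script never builds an over-long URL, and the function signs and forwards the PATCH server-side, where the URL-length limit doesn't apply.

const crypto = require('crypto');

// Hypothetical proxy: expects { apiKey, secret, botId, botParams } as JSON.
exports.update3commasBot = async (req, res) => {
    const { apiKey, secret, botId, botParams } = req.body;
    const path = '/public/api/ver1/bots/' + botId + '/update?';
    // Same query-string construction and HMAC-SHA256 hex signature as the
    // Apps Script version above.
    const query = Object.keys(botParams)
        .map(k => k + '=' + encodeURIComponent(JSON.stringify(botParams[k])))
        .join('&');
    const totalParams = path + query;
    const signature = crypto.createHmac('sha256', secret)
        .update(totalParams)
        .digest('hex');
    const response = await fetch('https://3commas.io' + totalParams, {
        method: 'PATCH',
        headers: { 'APIKEY': apiKey, 'Signature': signature }
    });
    res.status(response.status).send(await response.text());
};

The sheet side then becomes a single UrlFetchApp.fetch(functionUrl, { method: 'post', contentType: 'application/json', payload: JSON.stringify({...}) }) call, with everything in the request body; the linked answer above covers locking the function down with authentication.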

Is there any way to keep the session open after alexa timer api goes off?

I would like to know if there are any possible ways or tricks to leave the session open for the user's input after the timer goes off, because the timer API doc doesn't cover it.
timer_request_1 = {
    "duration": "PT15S",
    "timerLabel": "Change name",
    "creationBehavior": {
        "displayExperience": {
            "visibility": "VISIBLE"
        }
    },
    "triggeringBehavior": {
        "operation": {
            "type": "ANNOUNCE",
            "textToAnnounce": [
                {
                    "locale": "en-US",
                    "text": "Would you like to proceed with the x task?"
                }
            ]
        },
        "notificationConfig": {
            "playAudible": False
        }
    }
}
REQUIRED_PERMISSIONS = ["alexa::alerts:timers:skill:readwrite"]

class TimerIntentHandler(AbstractRequestHandler):
    def can_handle(self, handler_input):
        return ask_utils.is_intent_name("TimerIntent")(handler_input)

    def handle(self, handler_input):
        permissions = handler_input.request_envelope.context.system.user.permissions
        if not (permissions and permissions.consent_token):
            return (
                handler_input.response_builder
                .speak("Please give permissions to set timers using the alexa app.")
                .set_card(
                    AskForPermissionsConsentCard(permissions=REQUIRED_PERMISSIONS)
                )
                .response
            )
        timer_service = handler_input.service_client_factory.get_timer_management_service()
        timer_response = timer_service.create_timer(timer_request_1)
        if str(timer_response.status) == "Status.ON":
            session_attr = handler_input.attributes_manager.session_attributes
            if not session_attr:
                session_attr['lastTimerId'] = timer_response.id
            speech_text = 'Your 40 minutes timer has started!'
            return (
                handler_input.response_builder
                .speak(speech_text)
                .response
                .ask("Would you like to proceed x task?")
            )
        else:
            speech_text = 'Timer did not start'
            return (
                handler_input.response_builder
                .speak(speech_text)
                .response
            )
I tried adding a return .ask(), but I got a 'Response' object has no attribute 'ask' error.
Looking forward to hearing your thoughts :)
You use .response to get the built response from the response_builder, so you should place all the speak, ask, etc. builder methods before .response.
One way to keep the session alive is to not send shouldEndSession, which will keep the session from closing. But this will not be approved if you go for certification.
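For illustration, here are the same two points in the Node version of the SDK (ask-sdk-core): builder methods are chained before getResponse(), and withShouldEndSession(false) is the Node spelling of not sending shouldEndSession, with the same certification caveat. A minimal sketch, with the timer-creation call omitted:

const Alexa = require('ask-sdk-core');

const TimerIntentHandler = {
    canHandle(handlerInput) {
        return Alexa.getRequestType(handlerInput.requestEnvelope) === 'IntentRequest'
            && Alexa.getIntentName(handlerInput.requestEnvelope) === 'TimerIntent';
    },
    handle(handlerInput) {
        // speak()/reprompt() go on the builder; getResponse() comes last
        return handlerInput.responseBuilder
            .speak('Your timer has started!')
            .reprompt('Would you like to proceed with the x task?')
            .withShouldEndSession(false) // keeps the session open; may block certification
            .getResponse();
    }
};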

Unable to find the duplicate elements from API response

We were testing an API and recently got an issue: some customers cannot log in to the website.
We found the issue, and it is caused by duplicate keys in the API response; the API returns a response even if it contains a duplicate key.
So our tests are not catching the duplicate-key conditions. Can anyone please help me, or guide me on how I can find whether there is a duplicate element in the API response?
Tool: Postman
Below is a sample of the API output.
In the JSON output below, you can see there is a duplicate "operatingSystem" key; duplicate keys like this occur on various elements.
Since there is no way to debug the API for a while, for various reasons, we need to find these duplicate cases from the response itself.
Any ideas or suggestions will be much appreciated. Thanks in advance.
Example JSON:
{
    "code": 2,
    "deviceId": "ID",
    "deviceName": "Test",
    "platform": "x64",
    "operatingSystem": "test",
    "operatingSystem": "test",
    "gde": 000,
    "productVersion": "0.0",
    "build": "00000",
    "receipt": null
}
How could we handle such a situation? Do we have any method to automate/test this case?
Here's something you can try, although it's a bit convoluted. pm.response.json() will normalize the response and remove any duplicates, i.e. you won't be able to detect any. So what you can do is take the response as text, then manipulate it into a list and look for duplicates there. I used a map object so that if the map already contains a given key, a flag is set and the test fails.
This is not thoroughly tested, but it should give you an idea, or at least a starting point, to tackle the problem:
var jsonBody = pm.response.text();
var str = jsonBody.substring(1, jsonBody.length - 1);
var keyArr = str.split(",");
var keyMap = {};
var foundDups = false;
for (var i = 0; i < keyArr.length; i++) {
    var key = keyArr[i].split(":")[0];
    if (!(key in keyMap)) {
        keyMap[key] = key;
        console.log("added key " + key);
    }
    else {
        console.log("found duplicate: " + key);
        foundDups = true;
        break;
    }
}
pm.test("Look for dups", function() {
    pm.expect(foundDups).to.eql(false);
});
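A hedged alternative sketch: rather than splitting on commas (which breaks on nested arrays/objects and on values that contain commas or colons), scan the raw response text for quoted keys with a regex and count repeats. Note this treats same-named keys at different nesting levels as duplicates, which is acceptable for a flat response like the sample above:

var body = pm.response.text();
var seen = {};
var dups = [];
var keyPattern = /"([^"]+)"\s*:/g;
var match;
// Collect every quoted key in the raw text and record repeats
while ((match = keyPattern.exec(body)) !== null) {
    var key = match[1];
    if (seen[key]) {
        dups.push(key);
    }
    seen[key] = true;
}
pm.test("No duplicate keys in response", function() {
    pm.expect(dups, "duplicate keys: " + dups.join(", ")).to.be.empty;
});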

Community Connector getData() Request only uses the first two schema fields, not all four

I am building a Community Connector between Google Data Studio and SpyFu.com, in order to funnel SEO information for a specific URL into the GDS dashboard.
However, my getData() request only contains the first two fields from my schema. As you can see, I have four listed in the code. The result is that only the first two fields in the schema are printed to GDS.
I've been through tutorials and the official documentation, watched YouTube videos, looked this issue up on Google, and checked out the community resources on GitHub.
//Step Two: Define getConfig()
function getConfig(request) {
    var cc = DataStudioApp.createCommunityConnector();
    var config = cc.getConfig();
    config.newInfo()
        .setId('instructions')
        .setText('Give me SpyFu information on the following domain:');
    config.newTextInput()
        .setId('domain')
        .setName('Enter the domain to search')
        .setHelpText('e.g. ebay.com')
        .setPlaceholder('ebay.com');
    config.newTextInput()
        .setId('SECRET_KEY')
        .setName('Enter your API Secret Key')
        .setHelpText('e.g. A1B2C3D4')
        .setPlaceholder('A1B2C3D4');
    config.setDateRangeRequired(false);
    return config.build();
}
//Step Three: Define getSchema()
function getFields(request) {
    var cc = DataStudioApp.createCommunityConnector();
    var fields = cc.getFields();
    var types = cc.FieldType;
    var aggregations = cc.AggregationType;
    fields.newDimension()
        .setId('Keyword')
        .setName('Keywords')
        .setDescription('The keywords most often attributed to this domain.')
        .setType(types.TEXT);
    fields.newMetric()
        .setId('Rank')
        .setName('Rankings')
        .setDescription('The ranking of the target site keyword on the Google Search Page.')
        .setType(types.NUMBER);
    fields.newMetric()
        .setId('Local_Monthly_Searches')
        .setName('Local Searches per Month')
        .setDescription('Number of times, locally, that people have searched for this term within the last month.')
        .setType(types.NUMBER);
    fields.newMetric()
        .setId('Global_Monthly_Searches')
        .setName('Global Searches per Month')
        .setDescription('Number of times, globally, that people have searched for this term within the last month.')
        .setType(types.NUMBER);
    return fields;
}

function getSchema(request) {
    var fields = getFields(request).build();
    return { schema: fields };
}
//Step Four: Define getData()
function responseToRows(requestedFields, response, domain) {
    // Transform parsed data and filter for requested fields
    return response.map(function(Array) {
        var row = [];
        requestedFields.asArray().forEach(function(field) {
            switch (field.getId()) {
                case 'Keyword':
                    return row.push(Array.term);
                case 'Rank':
                    return row.push(Array.position);
                case 'Local_Monthly_Searches':
                    return row.push(Array.exact_local_monthly_search_volume);
                case 'Global_Monthly_Searches':
                    return row.push(Array.exact_global_monthly_search_volume);
                case 'domain':
                    return row.push(domain);
                default:
                    return row.push('');
            }
        });
        return { values: row };
    });
}
function getData(request) {
    console.log("Request from Data Studio");
    console.log(request);
    var requestedFieldIds = request.fields.map(function(field) {
        return field.name;
    });
    var requestedFields = getFields().forIds(requestedFieldIds);
    // Fetch data from API
    var url = [
        'https://www.spyfu.com/apis/url_api/organic_kws?q='
        + request.configParams.domain
        + '&r=20'
        + '&p=[1 TO 10]'
        + '&api_key='
        + request.configParams.SECRET_KEY,
    ];
    try {
        var response = UrlFetchApp.fetch(url.join(''));
    } catch (e) {
        DataStudioApp.createCommunityConnector()
            .newUserError()
            .setDebugText('Failed URL Fetch Attempt. Exception details: ' + e)
            .setText('There was an error accessing this domain. Try again later, or file an issue if this error persists.')
            .throwException();
    }
    console.log("Response from API");
    console.log(response);
    //Parse data from the API
    try {
        var parsedResponse = JSON.parse(response);
    } catch (e) {
        DataStudioApp.createCommunityConnector()
            .newUserError()
            .setDebugText('Error parsing the JSON data. Exception details: ' + e)
            .setText('There was an error parsing the JSON data. Try again later, or file an issue if this error persists.')
            .throwException();
    }
    var rows = responseToRows(requestedFields, parsedResponse);
    return {
        schema: requestedFields.build(),
        rows: rows
    };
}
I need GDS to post four columns of data: "Keyword", "Rank", "Local Monthly Searches" and "Global Monthly Searches".
I cannot figure out how to create a "fixed schema" so that the system always prints these four columns of data on every request. The tutorials and various documentation say it's possible, but not how to do it. Please help!
The number of metrics initially called up by the Google Community Connector is handled from the front end, via Google Data Studio.
The back-end system (the connector) only initially posts the default dimension and default metric. Getting the rest of the schema fields to post is handled while you are building a report in Google Data Studio: simply click on the data set, select "Data" in the right-hand menu, scroll down to either Metrics or Dimensions, and pick the ones you wish to add to the current set.
Note that these are the fields you established earlier in the coding process, when you were setting up your schemas. If you want the connector itself to nominate the starting fields, see the sketch below.
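A hedged sketch of that: the Community Connector Fields object also exposes setDefaultDimension() and setDefaultMetric(), which flag the fields a newly added chart starts with (they do not force all four columns into every request). Abridged to two fields from the asker's getFields():

function getFields(request) {
    var cc = DataStudioApp.createCommunityConnector();
    var fields = cc.getFields();
    var types = cc.FieldType;
    fields.newDimension()
        .setId('Keyword')
        .setType(types.TEXT);
    fields.newMetric()
        .setId('Rank')
        .setType(types.NUMBER);
    // A new chart on this data source starts with these two fields; the other
    // schema fields remain available in the report editor's Data panel.
    fields.setDefaultDimension('Keyword');
    fields.setDefaultMetric('Rank');
    return fields;
}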
Here, you're filtering your defined schema for fields that are present on the request object received by getData().
var requestedFieldIds = request.fields.map(function(field) {
    return field.name;
});
var requestedFields = getFields().forIds(requestedFieldIds);
The visualization in Google Data Studio that is the catalyst for the request will determine which fields are requested.
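For reference, a hypothetical, abridged shape of the request object getData() receives once a report table is configured with all four fields; each configured field shows up in the fields array, and forIds() then returns the full set:

{
    "fields": [
        { "name": "Keyword" },
        { "name": "Rank" },
        { "name": "Local_Monthly_Searches" },
        { "name": "Global_Monthly_Searches" }
    ],
    "configParams": { "domain": "ebay.com", "SECRET_KEY": "A1B2C3D4" }
}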