I'm working in Silverlight 5 and am trying to write an autocomplete textbox (I'm using the Telerik RadWatermarkTextBox control with a RadComboBox to show the items) whose list of items is a list of airports returned from an async call to a WCF service.
The issue I'm running into is that if I type quickly in the textbox, multiple async calls to get the filtered list of items are kicked off (one for each keypress), and they don't necessarily finish in the same order they were started - particularly when the list coming back is large.
So if I were to type HPN really quickly, the following calls get kicked off:
Async call with H as parameter (#1 - will return 231 rows)
Async call with HP as parameter (#2 - will return 4 rows)
Async call with HPN as parameter (#3 - will return 1 row)
Sometimes the results for call #1 arrive after the others.
I can't change the WCF service I'm calling or add a synchronous method to it.
FoxPro has a function called CHRSAW which can tell you if there are keys waiting in the input buffer (http://msdn.microsoft.com/en-us/library/5skwdb75(v=vs.80).aspx), which could be used to prevent calls #1 and #2 from being made.
Is there equivalent functionality in .NET that would allow me to do this?
Here's the code I'm using:
private void ICAO_TextChanged(object sender, TextChangedEventArgs e)
{
    TextBox txt = (TextBox)sender;
    if (txt.Text != String.Empty)
    {
        radBusyIndicator1.IsBusy = true;
        _ServiceClient.FindAirportByPartialICAOAsync(txt.Text.Trim().ToUpper());
    }
}
An even easier solution than the one you posted in the comments is to pass the text you are sending as the parameter as the user state as well. So when searching for "H" you would pass "H" as the user state.
When the calls come back, only use the response whose user state equals the text currently in the autocomplete.
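A minimal sketch of that idea, assuming the generated Silverlight proxy exposes the usual (parameter, userState) overload and a matching Completed event; the FindAirportByPartialICAOCompletedEventArgs type and the ICAO textbox name are inferred from your snippet, so adjust them to your actual proxy:

private void ICAO_TextChanged(object sender, TextChangedEventArgs e)
{
    TextBox txt = (TextBox)sender;
    if (txt.Text != String.Empty)
    {
        string query = txt.Text.Trim().ToUpper();
        radBusyIndicator1.IsBusy = true;
        // Pass the query text as the userState so the response can be matched to it later.
        _ServiceClient.FindAirportByPartialICAOAsync(query, query);
    }
}

// Subscribe once, e.g. in the constructor:
// _ServiceClient.FindAirportByPartialICAOCompleted += ServiceClient_FindAirportByPartialICAOCompleted;
private void ServiceClient_FindAirportByPartialICAOCompleted(object sender, FindAirportByPartialICAOCompletedEventArgs e)
{
    // Ignore any response that doesn't belong to the text currently in the box.
    string current = ICAO.Text.Trim().ToUpper();
    if ((string)e.UserState != current)
        return;

    radBusyIndicator1.IsBusy = false;
    // Bind e.Result to the RadComboBox here.
}

Stale responses (like the 231-row result for "H") are simply dropped, so it no longer matters in what order the calls complete.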
I have experience in Salesforce administration, but not in Salesforce development.
My task is to push an Order in Salesforce to an external REST API if the order is in the custom status "Processing" and the Order Start Date (EffectiveDate) is in 10 days.
The order will then be processed in the downstream system.
If the order was successfully pushed to the REST API, the status should be changed to "Activated".
Can anybody give me some example code to get started?
There's a very cool guide for picking the right mechanism; I've been studying from this PDF for one of the SF certifications: https://developer.salesforce.com/docs/atlas.en-us.integration_patterns_and_practices.meta/integration_patterns_and_practices/integ_pat_intro_overview.htm
A lot depends on whether the endpoint is accessible from Salesforce (if it isn't, you might have to pull data instead of pushing) and what authentication it needs.
For a push out of Salesforce you could use:
Outbound Message - it'd be an XML document sent when a (time-based, in your case?) workflow fires. It's not REST, but it's just clicks, no code. The downside is that it's just one object per message, so you can send the Order header but no line items.
External Service would be code-free and you could build a flow with it.
You could always push data with Apex code (something like this). We'd split the solution into 2 bits.
The part that gets the actual work done: at a high level you'd write a function that takes a list of Order ids as a parameter, queries them, and calls req.setBody(JSON.serialize([SELECT Id, OrderNumber FROM Order WHERE Id IN :ids]));... If the API needs some special authentication, you'd look into "Named Credentials". Hard to say what you'll need without knowing more about your target.
And the part that would call this Apex when the time comes. Could be more code (a nightly scheduled job that makes these callouts 1 minute after midnight?) https://salesforce.stackexchange.com/questions/226403/how-to-schedule-an-apex-batch-with-callout
Could be a flow / Process Builder (again, you probably want time-based flows) that calls this piece of Apex. The "worker" code would have to "implement an interface" (a fancy way of saying that the code promises there will be a function "suchAndSuchName" that takes "suchAndSuch" parameters). Check out Process.Plugin.
For pulling data... well, the target application could log in to SF (SOAP, REST) and query the table of orders once a day. Lots of integration tools have Salesforce plugins - do you already use Azure Data Factory? Informatica? BizTalk? MuleSoft?
There's also something called "long polling", where the client app subscribes to notifications and SF pushes info to it. You might have heard of CometD? In SF-speak, read up on Platform Events, the Streaming API, and Change Data Capture (although that last one fires on change and sends only the changed fields, so it's not great for pushing a complete order plus line items). You can send platform events from flows too.
So... don't dive straight into coding the solution. Plan a bit; the maintenance will be easier. This is untested, written in Notepad - I don't have an org with orders handy... But in theory you should be able to schedule it to run at 1 AM, for example. Or from the dev console you can trigger it with Database.executeBatch(new OrderSyncBatch(), 1);
public class OrderSyncBatch implements Database.Batchable<sObject>, Database.AllowsCallouts, Schedulable {
    public Database.QueryLocator start(Database.BatchableContext bc) {
        Date cutoff = System.today().addDays(10);
        return Database.getQueryLocator([SELECT Id, Name, Account.Name, GrandTotalAmount, OrderNumber, OrderReferenceNumber,
                (SELECT Id, UnitPrice, Quantity, OrderId FROM OrderItems)
            FROM Order
            WHERE Status = 'Processing' AND EffectiveDate = :cutoff]);
    }

    public void execute(Database.BatchableContext bc, List<sObject> scope) {
        Http h = new Http();
        List<Order> toUpdate = new List<Order>();
        // Assuming you want 1 order at a time, not a list of orders?
        for (Order o : (List<Order>)scope) {
            HttpRequest req = new HttpRequest();
            HttpResponse res;
            req.setEndpoint('https://example.com'); // your API endpoint here, or maybe something that starts with "callout:" if you'd be using Named Credentials
            req.setMethod('POST');
            req.setHeader('Content-Type', 'application/json');
            req.setBody(JSON.serializePretty(o));
            res = h.send(req);
            if (res.getStatusCode() == 200) {
                o.Status = 'Activated';
                toUpdate.add(o);
            } else {
                // Error handling? Maybe just debug it, maybe make a Task for the user or look into
                // Database.RaisesPlatformEvents
                System.debug(res);
            }
        }
        update toUpdate;
    }

    public void finish(Database.BatchableContext bc) {}

    // Schedulable entry point so the batch can be kicked off by a scheduled job (e.g. nightly).
    public void execute(SchedulableContext sc) {
        Database.executeBatch(new OrderSyncBatch(), Limits.getLimitCallouts()); // there's a limit on callouts per single transaction
        // and by default batches process 200 records at a time, so we want smaller chunks
        // https://developer.salesforce.com/docs/atlas.en-us.apexref.meta/apexref/apex_methods_system_limits.htm
        // You might want to tweak the parameter even down to 1 order at a time if processing takes a while at the other end.
    }
}
Is there an API in Office.js to get all available values asynchronously, specifically from Office.context.mailbox.item in Outlook?
I do not see anything in the docs.
I need to capture 10 or so fields, and to date have only implemented with callbacks, e.g.
var ITEM = Office.context.mailbox.item;
// wrapper = fn to parse the result and call the next field's getAsync as the callback
var wrapper = function (asyncResult) { /* ... */ };
ITEM.end.getAsync(wrapper);
The documentation reference you provided states that Office.context.mailbox.item is a namespace. The namespace doesn't have a method that would enumerate all the other methods and return some consolidated result; instead, you call a specific method, get its result, and move on to the next method you are interested in. That is all the Office.js API offers for the item.
If you need to get several item properties at once, you may look at the EWS request support in the Office.js API by calling Office.context.mailbox.makeEwsRequestAsync. Inside your XML request you can specify the fields you are interested in and retrieve them with a single request/response. Refer to the "Call web services from an Outlook add-in" article for more information.
Yet another option for getting several item properties at once is to use the Outlook REST APIs from an Outlook add-in.
I solved this with jQuery.when
const dStart = $.Deferred()
const dEnd = $.Deferred()

Office.context.mailbox.item.start.getAsync((res) => {
    // check for errors and fetch the result
    dStart.resolve()
})

Office.context.mailbox.item.end.getAsync((res) => {
    // check for errors and fetch the result
    dEnd.resolve()
})

$.when(dStart, dEnd).done(function() {
    // will fire once dStart and dEnd are both resolved
})
If you don't want jQuery you can use promises and Promise.all
I have to implement an MVC .NET Web API (call it the "Main" API) which includes two parts:
1) A database call to fetch the records.
2) More than 1000 calls to another web API (response time 100 ms on average for each) which will use the records returned by the above DB call.
Also, the Main API will be called every 3 seconds, continuously. I tried implementing it using async/await but didn't make much progress, and when trying to test it with the Apache Benchmark tool, it throws a "timeout specified has expired" error.
Is there any way to achieve this? Please suggest.
Code snippet
[HttpGet]
public async Task<string> doTaskasync()
{
    TripDetails obj = new TripDetails();
    GPSCoordinates objGPS = new GPSCoordinates();
    try
    {
        /* uriArray contains more than 1000 API URIs which need to be executed. */
        string[] uriArray = await dolongrunningtaskasync();
        IEnumerable<Task<GPSCoordinates>> allTasks = uriArray.Select(u => GetLocationsAsync(u));
        IEnumerable<GPSCoordinates> allResults = await Task.WhenAll(allTasks);
    }
    catch (Exception ex)
    {
        return ex.Message;
    }
    return "success";
}
(Screenshot: ab.exe test results)
There is nothing wrong programmatically with your code - except that this is practically a DoS attack on the second (location) API. You should definitely add caching to avoid at least part of the 1000 API calls, especially since you call the main API so often (as you wrote).
What I would do is make the inner API calls a centralized operation instead of making them individually inside your Web API method. For example, you could use a central list for location API calls (tasks) that have been started (but not finished), and another list for results that have already finished. Both lists could be concurrent dictionaries keyed by the unique URLs you use for the location API calls.
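A rough sketch of that idea, reusing the GetLocationsAsync helper, dolongrunningtaskasync, and the GPSCoordinates type from your snippet (so those are assumptions about your code) plus a ConcurrentDictionary from System.Collections.Concurrent. Because the dictionary stores the Task itself, concurrent requests for the same URL share a single in-flight call and finished results stay cached:

// Central cache of location lookups, keyed by URL (needs using System.Collections.Concurrent;).
private static readonly ConcurrentDictionary<string, Task<GPSCoordinates>> _locationCache =
    new ConcurrentDictionary<string, Task<GPSCoordinates>>();

private Task<GPSCoordinates> GetLocationCachedAsync(string url)
{
    // Starts the call only if no task exists for this URL yet; otherwise reuses the existing one.
    return _locationCache.GetOrAdd(url, u => GetLocationsAsync(u));
}

[HttpGet]
public async Task<string> doTaskasync()
{
    string[] uriArray = await dolongrunningtaskasync();
    GPSCoordinates[] allResults = await Task.WhenAll(uriArray.Select(GetLocationCachedAsync));
    return "success";
}

In a real implementation you would also want to expire or refresh cached entries, and to remove a task from the dictionary if it faults so a failed lookup can be retried.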
I am calling a web service using the jQuery $.ajax() function. In case of an error in the web service call, I need to call the web service 2 more times.
What would be the best way to do this?
In a past project I just used setTimeout to check for empty results returned from the Ajax call... so for instance:
var failcounter = 0;
if (CHECK SOME CONDITION HERE FOR RESULTS && failcounter < 4) {
    // IF FOUND TO BE EMPTY THEN WE ADD ONE TO THE 'COUNTER' THEN USE SETTIMEOUT TO RE-TRY...
    setTimeout(function () {
        failcounter++;
        yourajaxFunctioncall();
    }, 5000);
}
This is really rough and just tossed together - not at all meant to work from cut/paste since I haven't seen your code - but hopefully you get the idea: use a counter to keep track of, and limit, the number of times it's re-tried... if no results are found, use setTimeout to re-try it again. I've also used the Ajax 'error:' callback (similar to 'success:') to hold specific functions to run upon failure - so that may be easier; it just depends on how you want to handle it, I guess. Good luck.
I am using a Silverlight/WCF combination.
Currently, I am facing a problem with the service response.
I am using the request-response service type.
I have 100 items and I am going to make 100 service calls to fetch their properties.
foreach (ItemDto item in items)
{
    ServiceCall();
    ServiceSendCount++;
}

private void OnServiceCallCompleted(.....)
{
    ServiceReceiveCount++;
}
If I send 5 service calls, I get back 5*5 = 25 responses.
Likewise, if I send 10 service calls, I get back 10*10 = 100 responses.
I am not able to figure out what the problem is...
Can anyone please shed some light on this?
Update:
Please find the service call method below.
I agree that each and every time I am passing OnServiceCallCompleted...
foreach (ItemDto item in items)
{
    itemPropertyService.GetItemProperties([parameters], OnServiceCallCompleted);
    ServiceSendCount++;
}

private void OnServiceCallCompleted(.....)
{
    ServiceReceiveCount++;
    /* here is my logic to process the response;
       if it comes multiple times, my logic breaks down */
}
Can you please let me know the solution for this? Changing the service is not possible right now.
But it creates a problem with response time.
Suppose I am going to send requests for 100 items at a time.
In my first approach, I will get the first 100 responses with correct data within 5-10 seconds.
ItemPropertyService itemPropertyService = new ItemPropertyService();
foreach (ItemDto item in items)
{
    itemPropertyService.GetItemProperties([parameters], OnServiceCallCompleted);
    ServiceSendCount++;
}
(The problem, as I discussed, is that it is going to send me 100 x 100 responses, which leads to the timeout.)
In the second approach (as you suggested):
foreach (ItemDto item in items)
{
    ItemPropertyService itemPropertyService = new ItemPropertyService();
    itemPropertyService.GetItemProperties([parameters], OnServiceCallCompleted);
    ServiceSendCount++;
}
I am getting the same number of responses as requests, but it takes around 2 minutes.
Actually, I am not able to figure out why it is so time-consuming.
As our requests are very large, it is really going to take time.
Do you know how to solve this problem?
I am very close to solving my problem.
I can't tell from the code, as you didn't show the ServiceCall() function, but the numbers suggest you are reusing the same proxy object and adding the OnServiceCallCompleted event handler every time you make a request. If you add the same event handler multiple times, it is going to fire multiple times; the completed event handler is per proxy object, not per request.
Try creating a new proxy object for each request and seeing if you still get multiple responses.
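An alternative that follows from the same diagnosis is to keep a single proxy but attach the completed handler only once, outside the loop. This is only a sketch assuming the standard generated Silverlight proxy pattern (an Async method plus a Completed event with UserState); your GetItemProperties wrapper and the GetItemPropertiesCompletedEventArgs name may differ, and the [parameters] placeholder is kept from your snippet:

// One proxy, handler subscribed exactly once - it will fire once per completed request.
ItemPropertyService itemPropertyService = new ItemPropertyService();
itemPropertyService.GetItemPropertiesCompleted += OnServiceCallCompleted;

foreach (ItemDto item in items)
{
    // Pass the item as userState so the handler knows which request each response belongs to.
    itemPropertyService.GetItemPropertiesAsync([parameters], item);
    ServiceSendCount++;
}

private void OnServiceCallCompleted(object sender, GetItemPropertiesCompletedEventArgs e)
{
    ServiceReceiveCount++;
    ItemDto item = (ItemDto)e.UserState;
    // process e.Result for this item here
}

This also avoids constructing a new client object for every item.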