Question about how to send a jQuery callback with an onSuccess: refresh from a textInput when the user presses [Enter]. We use the [Enter] press to trigger a search callback.
Our GemStone (GS) Seaside app sits behind HAProxy, which means the onSuccess: script can be handled by a different gem than the one that handles the callback. Because of this, users sometimes get the refresh before the callback has been processed, which to them looks like lost input (a browser F5 refresh shows the right state). When running a single gem, or in a VW / Pharo image, the problem does not come up.
I can work around the problem by using...
async: false;
...but that prevents me from showing any kind of waiting feedback (I normally use a busy gif).
So, the question is: in a multi-web server configuration, how can you code a callback to...
1 - show a busy gif
2 - trigger the search callback
3 - refresh the display when done
...all in that order.
Using a form submission callback is a problem because multiple text inputs can trigger the search, since each callback is 'set value + do search' by virtue of the default [Enter] press.
For the JS callback, I'm using...
self onKeyPress: (
    (JSStream on: '(window.event ? window.event.keyCode : event.which) == 13')
        then: (canvas jQuery ajax callback: aBlock value: canvas jQuery this value))
It all works fine, except for the missing busy gif, due to the 'async: false'.
Any suggestions?
You can define a beforeSend and a complete handler to show and hide the loading indicator while the request is being processed. Setting the global parameter to false tells jQuery to ignore your existing global handlers for request start and end (the spinner you mentioned) and to use only the handlers defined on this particular instance of the JQAjax object.
(html jQuery ajax)
    async: false;
    global: false; "https://api.jquery.com/category/ajax/global-ajax-event-handlers/"
    callback: aBlock value: html jQuery this value;
    onBeforeSend: (html jQuery id: 'indicator') show;
    onSuccess: ((html jQuery id: 'fieldId') load html: [:h | ]);
    onComplete: (html jQuery id: 'indicator') hide
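For reference, the jQuery call a brush like this ends up generating would look roughly like the sketch below; the URL, the data key and the element ids are placeholders, not what Seaside actually emits:
$.ajax({
  url: '/seaside/callback',                 // placeholder; Seaside generates the real callback URL
  data: { q: $('#searchField').val() },     // placeholder for the serialized field value
  async: false,
  global: false,                            // keep global ajaxStart/ajaxStop handlers out of this call
  beforeSend: function () { $('#indicator').show(); },
  success: function (html) { $('#fieldId').html(html); },
  complete: function () { $('#indicator').hide(); }
});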
That said, keep in mind that doing a synchronous AJAX call is discouraged since it will block the whole UI thread until the request is resolved.
Also, it's not completely clear how you manage state across the different worker images (gems, in this case) such that they return different things (probably because they hold different sessions), so it's also not clear to me why an async XHR request would be served differently from a synchronous one; I have never experienced that.
From the small sample you posted, I can't tell which part of your code does the "refresh". Providing more context may help us give you a more accurate answer.
The fix ended up being trivial: I needed to include 'event.preventDefault();' in the [Enter] key script. It seems obvious in hindsight.
if ((window.event ? window.event.keyCode : event.which) == 13) {
    event.preventDefault();
}
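For context, the complete key handler on the field then looks roughly like this; doSearch() is only a stand-in for the AJAX search callback that Seaside generates:
if ((window.event ? window.event.keyCode : event.which) == 13) {
    event.preventDefault();   // stop the browser's default submit / refresh
    doSearch();               // placeholder for the generated Seaside AJAX search callback
}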
This problem is confirmed to be a narrow configuration symptom: GemStone with multiple gems. In every other configuration Seaside / JavaScript behaves as expected. I will follow this up as a vendor problem. Thanks to everyone who looked at it (this was also posted on other forums).
Related
I am trying to upload a file using pug, multer and express.
The pug form looks like this:
form(method='POST' enctype="multipart/form-data")
  div.form-group
    input#uploaddata.form-control(type='file', name='uploaddata')
    br
  button.btn.btn-primary(type='submit' name='uploaddata') Upload
The server code looks like this (taken out of context)
.post('/uploaddata', function(req, res, next) {
  upload.single('uploaddata', function(err) {
    if (err) {
      throw err;
    } else {
      res.json({success: "File uploaded successfully.", status: 200});
    }
  });
})
My issue is that while the file uploads successfully, the success message is not shown on the same page; instead, a new page is loaded showing:
{success: "File uploaded successfully.", status: 200}
As an example, for other elements (link clicks) the message is displayed via JavaScript such as:
$("#importdata").on('click', function(){
$.get( "/import", function( data ) {
$("#message").show().html(data['success']);
});
});
I tried doing it with pure JavaScript to work around the default form behaviour, but no luck.
Your issue has to do with mixing form submissions and AJAX concepts. To be specific, you are submitting a form then returning a value appropriate to an AJAX API. You need to choose one or the other for this to work properly.
If you choose to submit this as a form, you can't use res.json; you need to switch to res.render or res.redirect instead to render the page again. You are seeing exactly what you are telling Node/Express to do with res.json: JSON output. Rendering or redirecting is what you want to do here.
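As a minimal sketch of the form approach (assuming an Express app object, a multer instance named upload, and a view called 'upload'; none of these names come from your code):
// render the page again instead of answering with raw JSON
app.post('/uploaddata', function (req, res, next) {
  upload.single('uploaddata')(req, res, function (err) {
    if (err) return next(err);
    res.render('upload', { message: 'File uploaded successfully.' });  // or res.redirect('/somewhere')
  });
});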
Here is the MDN primer on forms and also a tutorial specific to express.js.
Alternatively, if you choose to handle this with an AJAX API, you need to use jQuery, fetch, axios, or similar in the browser to send the request and handle the response. This won't cause the page to reload, but you do need to handle the response somehow and modify the page; otherwise the user will just sit there wondering what happened.
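A minimal sketch of the AJAX approach, keeping your res.json handler as-is; the form id 'uploadform' is an assumption, and #message is the element from your existing snippet:
document.querySelector('#uploadform').addEventListener('submit', function (e) {
  e.preventDefault();                                      // stop the normal form navigation
  fetch('/uploaddata', { method: 'POST', body: new FormData(this) })
    .then(function (response) { return response.json(); })
    .then(function (data) {
      document.querySelector('#message').textContent = data.success;  // show the message in place
    })
    .catch(function (err) { console.error(err); });
});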
MDN has a great primer on AJAX that will help you get started there. If you are going down this path also make sure you read up on restful API design.
Neither one is inherently the better strategy; both methods are used in large-scale production applications. However, you do need to choose one or the other and not mix them as you have above.
I have searched the Google Tag Manager (GTM) documentation and did not find anything addressing this issue. I'm working with an affiliate network that wants its tracking pixel loaded synchronously, to ensure that it fires before the content of the page loads. This only applies to our order confirmation page. We've implemented GTM so we can launch and fix tags in 5 minutes versus 2 weeks.
GTM is installed at the top of our order confirmation page and loads tags asynchronously, so I know the affiliate network's tags are loading very quickly, but the network is still concerned that some data may be lost. GTM doesn't have any option or documentation indicating that it is possible to load the script synchronously, but I notice the j.async=true name-value pair in the GTM snippet.
If I change that part of the code to j.async=false, will the code load synchronously or will I just break it?
Here's the full code for reference.
<!-- Google Tag Manager -->
<noscript><iframe src="//www.googletagmanager.com/ns.html?id=GTM-XXXX"
height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'//www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','myNewName','GTM-XXXX');</script>
<!-- End Google Tag Manager -->
The article on Cardinal Path is great for learning how to set up synchronous tags.
However, regarding the concern about lost data, I would like to highlight Mickael's comment at the bottom of that article: since the dataLayer.push might not be complete before the gtm.dom event occurs, you should include a custom event in the dataLayer.push so you are not relying on gtm.dom to fire your tags.
Here is an example tag with a custom event specific to page load.
Tag Name: pageLoad
Tag Type: Custom HTML Tag
HTML:
<script type="text/javascript">
var tid = setInterval( function () {
    if ( document.readyState !== 'complete' ) return;
    clearInterval( tid );
    dataLayer.push({ "event": "pageLoaded" });
}, 100 );
</script>
Add the following Firing Rule.
Rule Name: All pages
Condition: {{url}} matches RegEx .*
Then, on the affiliate tags you want fired, all you have to do is include this custom event (i.e. pageLoaded) as a condition on the firing rule. In this example, the condition would be: {{event}} equals pageLoaded
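If the order confirmation page can also push its own event rather than waiting on the timer above, the push might look like the sketch below; the event name and the transaction fields are examples only:
<script>
  // if your container snippet renames the data layer (as 'myNewName' above),
  // push to that array instead of window.dataLayer
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    'event': 'orderConfirmed',        // custom event for the affiliate tag's firing rule
    'transactionId': 'ORDER-12345',   // example value only
    'transactionTotal': 99.95         // example value only
  });
</script>
The firing rule condition would then be: {{event}} equals orderConfirmed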
We had a similar requirement with regards to defining a page type and then operating a rule/macro based on that input.
The suggestion I found was to have a rule in GTM to determine pageload/rule load etc.
I found it in this article, http://www.cardinalpath.com/controlling-tag-firing-order-with-google-tag-manager/
Hopefully that will help solve it; in theory you should be able to have something that triggers the GTM code before the page loads.
Before jumping in with an answer, please make sure you understand my scenario.
I have ajax calls that CREATE flashes.
I have other ajax calls that FETCH the flashes as JSON.
What is currently happening: I click a button which creates the flash, after which I run an AJAX call that executes:
public function actionGetAllFlashesAsJSON() {
    $flashMessages = Yii::app()->user->getFlashes(true);
    $returnResult = array();
    foreach ($flashMessages as $key => $value) {
        $newItem = array();
        $newItem['message'] = $value;
        $newItem['kind'] = $key;
        $returnResult[] = $newItem;
    }
    print json_encode($returnResult);
    die();
}
My problem is that when I execute this function twice in a row, it still keeps returning the flashes. However, if I refresh the site it shows the error, and then if I press refresh again it's gone. My theory is that the page refresh is causing some other kind of deletion of the messages... but what? And how can I force the deletion of these messages after I receive them in the code above?
More background info: I am using the flashes as ERROR messages, but I want them to appear at the top of my site AS THEY ARE CREATED. Flashes might get created via Ajax, so I have JavaScript running to check for new messages and display them, but my problem is that it shows the messages several times, because they are not getting deleted after calling getFlashes.
The flash messages are controlled by SESSION variables, which Yii destroys when the page is loaded (probably somewhere quite deep in the framework). You will have to manually destroy all the previous flash messages at the start of the AJAX request.
You can use getFlashes() to get all the existing flash messages.
For the other flash message methods, have a look at the CWebUser docs here.
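On the client side, consuming the JSON that actionGetAllFlashesAsJSON returns might look roughly like this; the route and the #flash-area container are placeholders, not part of the original code:
$.getJSON('/site/getAllFlashesAsJSON', function (flashes) {   // placeholder route
  $.each(flashes, function (i, flash) {
    // 'message' and 'kind' match the keys built in the PHP action above
    $('#flash-area').append(
      $('<div/>').addClass('flash-' + flash.kind).text(flash.message)
    );
  });
});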
I'm trying to write an extension that will block access to a (configurable) list of URLs if they are accessed more than N times per hour. From what I understand, I need to have a start script pass a "should I load this" message to a global HTML page (which can access the settings object to get the list of URLs), which will give a thumbs-up/thumbs-down message back to the start script to allow or deny loading.
That works out fine for me, but when I use the usual beforeLoad/canLoad handlers, I get messages for all the sub-items that need to be loaded (images, etc.), which throws off the accesses-per-hour limit I'm trying to enforce.
Is there a way to synchronously pass messages back and forth between the two sandboxes so I can tell the global HTML page, "this is the URL in the window bar and the timestamp for when this request came in", so I can limit duplicate requests?
Thanks!
You could use a different message for the function that checks whether to allow the page to load, rather than using the same message as for your beforeLoad handler. For example, in the injected script (which must be a "start" script), put:
safari.self.tab.dispatchMessage('pageIsLoading');
And in the global script:
function handleMessage(event) {
    if (event.name == 'pageIsLoading') {
        if (event.target.url.indexOf('forbidden.site.com') > -1) {
            console.log(event.timeStamp);
            event.target.url = 'about:blank';
        }
    }
}

// register the handler so the global page receives messages from injected scripts
safari.application.addEventListener('message', handleMessage, false);
I am remotely submitting a form potentially several times in close succession. For this particular case, debouncing is not an option. I am looking for the jQuery equivalent of .abort() for remote forms submitted using .submit() so I can cancel all previous submissions when a new one is made.
Thanks in advance,
Garrett
What I did is attach to the beforeSend event of the form, store the xhr object as part of the form's data, and abort it if a new request is enqueued:
$('form').bind("ajax:beforeSend", function(evt, xhr) {
console.log('Enqueued new xhr request');
var prevXhr = $(this).data('current-xhr');
if (prevXhr) {
prevXhr.abort();
console.log('Aborting previous xhr request');
}
$(this).data('current-xhr', xhr);
});
And you can still use the :remote => true option on the form as usual.
Maybe you could use jQuery .queue() to queue your submits and .clearQueue() to clear the queue when needed. Just an idea...
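A rough sketch of that idea; the form id and the queue name are placeholders:
var $form = $('#search-form');                  // placeholder form id

$form.queue('submits', function (next) {
  $.post($form.attr('action'), $form.serialize()).always(next);  // start the next queued submit when this one finishes
});
// ...further submissions can be queued the same way...

$form.dequeue('submits');        // start working through the queue

// when the queued (not yet started) submissions should be dropped:
$form.clearQueue('submits');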
I don't understand your problem. If the form is submitted, a request is sent to the server... so, how can you abort that?
If by abort you mean cancelling the processing of the response, just use a control variable that you increase/decrease to know whether there are pending requests.
From the API: http://api.jquery.com/category/ajax/ you can't do that.
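Going back to the control-variable suggestion above, a minimal sketch; the #results element is a placeholder:
// tag each submission and only honor the response from the most recent one
var latestRequestId = 0;

function submitSearch($form) {
  var requestId = ++latestRequestId;
  $.post($form.attr('action'), $form.serialize(), function (data) {
    if (requestId !== latestRequestId) return;   // a newer submission superseded this one
    $('#results').html(data);
  });
}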