Session lost while on PythonAnywhere Web2Py but NOT locally - session-variables

I use a session variable to pass some info with a redirect:
session.OrigText = XML(str(OrigText))
redirect(URL('SearchResultsOrigText'))
It arrives at the new URL/page/view - SearchResultsOrigText - and works OK.
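For context, the receiving controller just reads the value back out of the session - a minimal sketch, with the actual view logic omitted:

def SearchResultsOrigText():
    # reads back the value stored before the redirect
    return dict(OrigText=session.OrigText)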
But when I navigate away from that new page - SearchResultsOrigText - (it doesn't matter where I go) and then return with the browser's "back" button, session.OrigText is empty (showing as 'None').
This behaviour happens only on PythonAnywhere (PA), not locally.
I do not use session.forget anywhere in my code.
Trying to pass the 'html heavy' content in OrigText as a dictionary variable (not a session variable) gets me into another interesting issue: PythonAnywhere says "Something's wrong: 502 - backend".
(A silent failure?) This, too, happens on PythonAnywhere but NOT locally.
Sanitizing this var doesn't help...
But let's focus on the first question...
Why is the session variable lost after (1) the redirect and (2) leaving the new page/view, when hosted on PythonAnywhere but NOT locally?
Thanks

Flask sessions use cookies by default, so it's possible that somewhere in your configuration you specify which domain the session cookie is set on, and that it is not set correctly. It's also possible that you haven't set a secret key for sessions.
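If it is a cookie or secret-key problem, the fix is a couple of configuration values. A minimal Flask sketch, assuming a standard app (the domain below is a placeholder):

from flask import Flask, session

app = Flask(__name__)

# Without a secret key Flask cannot sign the session cookie,
# and session data will not persist between requests.
app.config["SECRET_KEY"] = "replace-with-a-long-random-value"

# If the cookie domain doesn't match the site the browser is on,
# the cookie is never sent back and the session comes up empty.
app.config["SESSION_COOKIE_DOMAIN"] = "yourusername.pythonanywhere.com"

@app.route("/store")
def store():
    session["OrigText"] = "some value"  # survives later requests once cookies work
    return "stored"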

I faced the same problem recently on PythonAnywhere. I solved it by deleting my domain's cookies in Firefox. I still don't know how they got corrupted, though.

Make sure there is no "session.forget(response)" in your code or framework code along the way.
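For reference, this is roughly what such a call looks like in a web2py controller (a sketch only; the action name is hypothetical):

def some_action():
    # session.forget(response) tells web2py not to save the session
    # at the end of this request, so assignments made here are discarded.
    session.forget(response)
    session.OrigText = "this value will NOT survive the next request"
    return dict()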

Related

Selenium Webdriver - driver.get() not working correctly when inserting a variable inside

This is solved: I had my baseURL wrong. It was going to a different site, which redirected to the base URL of the correct site. Thanks.
I have a problem using driver.get() with a variable inside. For example:
driver.get(baseURL+othervariable);
When I execute it, the browser goes to baseURL alone.
Before that line, I added a console print to make sure the concatenation is OK, like:
System.out.println(baseURL+othervariable);
driver.get(baseURL+othervariable);
And I can see in the console that the concatenation is ok.
The weird thing is that if I insert the url directly without base url, like:
driver.get("http://examplesite.com/subsection");
It works.
Why is this a problem for me? Because I'm using a for loop to open an array of URLs that I need to check.
So the structure of my program is something like:
for (int i = 0; i < URLs.length; i++) {
driver.get(baseURL + URLs[i]);
// then do some stuff
}
But the browser always opens baseURL alone.
The weird thing is that I don't have any problem when executing this in the lower environments of this website. The problem only occurs on the live site.
Could it be that some configuration on the site is preventing Selenium from going to the desired URL?
But then I don't understand why it works as expected when I insert the URL directly as a String into driver.get(), even on the live site.
So the problem only appears when I pass a variable in, and only on the live site.
I'm totally confused. I tried the Firefox driver, the Chrome driver, etc. All do the same.
I also tried:
String finalURL = baseURL+URLs[i];
driver.get(finalURL);
And it refuses to open the complete URL. I have never had this problem before: many times I have executed driver.get() with variables and concatenations inside and never faced this issue.
Could someone give me a hint? why is the problem only appearing when sending the URL as a variable but not when I send it as a String?
I'm using Selenium 3.0 btw.
Thanks for your help.
Make sure the URL syntax is correct (especially \ and //).
On some websites, incorrect URLs redirect to the baseURL page.
If you're working on an AngularJS project, you may need some waits (try ngWebDriver).
Good luck!
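To illustrate the slash point: joining a base URL and a path naively can silently produce a different URL than intended, which some sites then redirect to their base page. A quick sketch of the pitfall, shown in Python for brevity (the question's code is Java, and the URLs here are made up):

from urllib.parse import urljoin

base = "http://examplesite.com/app/"   # note the trailing slash

# urljoin resolves the path against the base the way a browser would:
print(urljoin(base, "subsection"))     # http://examplesite.com/app/subsection
print(urljoin(base, "/subsection"))    # a leading slash resets the path: http://examplesite.com/subsection

# Plain concatenation only works when the slashes happen to line up:
print(base + "subsection")             # http://examplesite.com/app/subsection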

Partial view never affects site after upload and IIS reset

I'm including a partial view in one of the pages of my MVC4 site which, when deployed, never seems to affect the site, even after a restart of the web site and a recycle of the app pool.
I'm wondering if caching is coming into play.
I ended up including the content of the partial view in the page itself, and that seems to have worked, but I'd rather keep the partial view, as I'd like to use it in other parts of the site.
The question is: does anyone have any suggestions as to why this might be happening? I've been pulling my hair out trying to get a view to post the correct data, only to realise that some of the hidden inputs are simply missing because the partial view has not been refreshed.
EDIT:
OK, now I need to use this partial view in more than one place. On my dev environment the partial renders correctly. Uploading it to the server doesn't seem to have any effect, and what's worse, deleting the partial view from the server makes no difference either.
My site still thinks the file is there, and now complains about the model type passed to the view (I changed the model type in the view - all working fine on my local dev) - why does it still think the file is there?
I deleted one of the parent views to see if the site carried on working, but as soon as I delete any other file, the site is affected. Why is this particular file giving me trouble? It's as if the server has cached it at the file system level and is supplying the wrong file content to ASP.NET.
I'm going to try renaming the file next.
OK, so renaming the file appears to have worked.
I didn't try Fals's suggestion, but I might try that next time. Strange behaviour; I'm not going to try to understand why at this point!

Multiple RequiredFieldValidator crashes page

I have a very strange problem. I recently added MVC4 to an old Web Forms project. I did this by creating a new project and adding the old files to it (rather than the opposite approach of copying the new MVC files in). When I did this, one of my Web Forms pages stopped working: when I try to access it, it redirects to HTTP Error 404.0 - Not Found.
The file is there, and I also have other Web Forms (.aspx) pages that load without any issues. To pin-point the issue, I created a blank Web Forms page with the same name to replace it, and the blank page loads. I started adding code to the new page one line at a time until I found the issue.
I finally found that what caused the issue is when I have more than one RequiredFieldValidator tag on the page. Any idea why this would happen or what I can do to work around it?
Assign validation groups to the validators: put them in separate groups, not the same one, and make sure two validators don't share the same ControlToValidate, because duplicate validators on the same control can clash and break the page. Putting them in different groups guards against that even more. If you still get the same issue, then try this as well:
Open IIS Manager
Right-click the server name
Select Properties
Click the MIME Types button
Click New
Extension: .pdf
MIME type: application/pdf

How to stop Firefox from downloading and applying CSS via a Firefox extension?

Thanks to everyone in advance -
So I have been banging on this issue for quite a while now and have burned through all my options. My current approach to cancelling CSS requests is nsIRequest.cancel inside nsIWebProgressListener.onStateChange. This works most of the time, except when things are a little laggy: a few requests slip through and jump out of the load group before I can get to them. This is obviously a dirty solution.
I have read through the following links to try to get a better idea of how to disable CSS before an nsIRequest is created... no dice.
https://developer.mozilla.org/en/Document_Loading_-_From_Load_Start_to_Finding_a_Handler
https://developer.mozilla.org/en/The_life_of_an_HTML_HTTP_request
https://developer.mozilla.org/en/Bird's_Eye_View_of_the_Mozilla_Framework
How do I disable CSS via the presentation objects/interfaces? Is this possible? Inside nsIDocShell there are a few attributes that seem to imply you can toggle such things via the browser's docshell - allowPlugins, allowJavascript, allowMetaRedirects, allowSubframes, allowImages.
Any suggestions?
Thanks,
Sam
The menu option that disables style sheets uses the function
setStyleDisabled(true)
so you can probably just call this function whenever a new browser tab is created. Style sheets are still requested from the server, but not applied. The function is not very sophisticated and doesn't mess with nsIRequest; its source is:
function setStyleDisabled(disabled) {
  getMarkupDocumentViewer().authorStyleDisabled = disabled;
}
Digging through the Web Developer Toolbar source code, I noticed that its "disable stylesheets" function loops through document.styleSheets and sets each sheet's disabled property to true, like:
/* run once the DOM content is loaded */
var sheets = document.styleSheets;
for (var i = 0; i < sheets.length; i++) {
  sheets[i].disabled = true;
}
So if the key is just to not apply CSS to pages, one of the above solutions should work. But if you really need to stop style sheets from being downloaded from the server, I'm afraid nsIRequest interception is your only option.
Set permissions.default.stylesheet to 2 and voilà!
You can actually use the permissions manager to block or allow stylesheets on a host-by-host basis.
Unfortunately there doesn't seem to be a simple flag like allowImages. The Bugzilla request for adding one is https://bugzilla.mozilla.org/show_bug.cgi?id=340746. You can now vote for it using the new Bugzilla voting functionality, and you can also add yourself to the CC list to be notified if anyone ever works on it.
A related request is to just give us basic HTML parsing support, which may be what you are trying to do. Unfortunately that isn't supported yet either, but you can vote for/track the Bugzilla for that at https://bugzilla.mozilla.org/show_bug.cgi?id=102699.
So the only workable solution seems to be some sort of interception, as @pawal suggests. Here is a link that covers the basics of interception, to at least get you/us started: https://developer.mozilla.org/en/XUL_School/Intercepting_Page_Loads. It lists several options, which I summarize below.
These first few seem to operate only at the page/document level, so I don't think they help:
Load Events (addEventListener load)
Web Progress Listeners (nsIWebProgressListener) - I tried this approach; it only seems to be called for the page itself, not for content within the page.
Document Loader Service - a global version of nsIWebProgressListener, so I think it has the same problem (page level only).
That leaves two others I have not tried yet. They work globally, so you would need to filter them down to just the browser/pages you care about:
HTTP Observers - seems like it might work; I need to verify it calls back for CSS.
Content Policy - seems like the best option to me, since it is explicitly called for CSS; someday I hope to try it :)

Secured and unsecured items message in IE

I'm getting the message "This page contains both secure and nonsecure items" in IE. When I commented out the following piece of code in the dojo.js.uncompressed.js file, the message went away.
if(dojo.isIE){
    if(!dojo.config.afterOnLoad){
        document.write('<scr'+'ipt defer src="//:" '
            + 'onreadystatechange="if(this.readyState==\'complete\'){' + dojo._scopeName + '._loadInit();}">'
            + '</scr'+'ipt>'
        );
    }
}
Is this an issue with Dojo? I would like to move the commented-out code into a separate custom file so that the Dojo framework itself is not modified. Can you suggest a better way of implementing it?
Thanks.
You would get that error if you're using frames or external files where some of the files have https URLs and some have http URLs. Assuming your main page loads over https, you could try changing:
src="//:"
to:
src="https//:"
The //: is most likely the problem; I ran into a similar issue with a chunk of JavaScript code. In Internet Explorer, the location //: is not secure, so when your page (presumably on an https:// URL) loads, IE notes that your main code comes from a secure location while another script is being loaded from an insecure one.
The workaround I came to was to create an empty file in my web root named "blank.html" (though "blank.js" would probably work better in your case) and replace the //: link with "/blank.html". This results in another hit to your web server, but browser caching will probably make the impact minimal.