SQLException thrown on await SignInManager.ExternalSignInAsync google+ - authentication

I am trying to implement Google+ authentication in my MVC5 app but am struggling to get it working in my live environment.
I have generated new keys for my live site from the Google APIs console and deployed the code.
I have enabled anonymous authentication for my site in IIS 8 on my server.
When I click the Google+ login button for the first time, I get to the Google sign-in page and can click Accept. The request then hangs for a long time until it finally fails. The second time around, I don't get the Google sign-in page, but again the request hangs until it fails.
The code is failing on this line with a SQLException:
var result = await SignInManager.ExternalSignInAsync(loginInfo, isPersistent: false);
"A network related or instance specific error occured while establishing a connection to the SQL server".
Now as far as I can see with that line of code, I am not trying to connect to my SQL environment. So is this connecting to google to validate? Do I need some firewall ports or IP's opened on the server?
In IIS Express on my local, everything works fine!!
Many thanks

It's impossible to give accurate advice based on your description alone, but I had roughly the same issue after creating a new web app from the template in Visual Studio (that's how I ended up here). You have probably forgotten to change the server name in the connectionStrings section of your web.config, or SQL Server isn't installed/reachable on the live machine.
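Despite appearances, that line does touch your database: ExternalSignInAsync looks up the external Google login in the ASP.NET Identity store (the AspNetUserLogins table) via Entity Framework, so it opens a SQL connection using whatever connection string your ApplicationDbContext is built on. Assuming the stock MVC5 template (adjust the names if yours differ), that wiring looks roughly like this:
// IdentityModels.cs in the default MVC5 template (sketch)
public class ApplicationDbContext : IdentityDbContext<ApplicationUser>
{
    public ApplicationDbContext()
        // "DefaultConnection" must exist under <connectionStrings> in web.config
        // and point at a SQL Server instance the live web server can actually reach.
        : base("DefaultConnection", throwIfV1Schema: false)
    {
    }

    public static ApplicationDbContext Create()
    {
        return new ApplicationDbContext();
    }
}
If that connection string still points at LocalDB or at your development machine's name, the live server will hang and then fail with exactly the network-related/instance-specific error you quoted, and only on the code paths that first hit the database. No Google firewall rules are involved at that point.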

Related

QuickBooks Desktop API connection issues when multiple users logged in

I am new to QuickBooks Desktop integration and am running into an issue that I hope you can help with.
I have implemented a C# console application (an interface program) that connects to QuickBooks to read customer data. Here is the connection code:
sessionManager = new RequestProcessor2();
sessionManager.OpenConnection("", "Test QuickBooks Interface");
// set Auth Preferences
var prefs = sessionManager.AuthPreferences;
prefs.PutUnattendedModePref(QBXMLRPUnattendedModePrefType.umpRequired);
Ticket = sessionManager.BeginSession(QuickBooksCompanyFile, QBFileMode.qbFileOpenMultiUser);
This test application is authorized in QuickBooks. When I run it on the same machine without QuickBooks running, all is fine: it logs in and pulls customer data.
When I log into the same server as another Windows user and start QuickBooks with another QuickBooks user in multi-user mode, running the code listed gives the error "Could not start QuickBooks." Once I close the other QuickBooks instance, the listed code works again.
I need to run this program periodically to extract data from QuickBooks, so it's important that other users can use QuickBooks at the same time. What am I doing wrong? I thought running the interface program in unattended mode and opening the file in multi-user mode would handle this, but it does not.
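For completeness, here is roughly how I understand the full session lifecycle should look. This is a sketch rather than my exact production code: EndSession/CloseConnection come from the same QBXMLRP2 interop, the ProcessRequest line is just a placeholder for the customer query, and QuickBooksCompanyFile is my own config value.
var sessionManager = new RequestProcessor2();
string ticket = null;
try
{
    sessionManager.OpenConnection("", "Test QuickBooks Interface");
    sessionManager.AuthPreferences.PutUnattendedModePref(
        QBXMLRPUnattendedModePrefType.umpRequired);

    // Multi-user mode so other QuickBooks users can stay logged into the company file.
    ticket = sessionManager.BeginSession(QuickBooksCompanyFile, QBFileMode.qbFileOpenMultiUser);

    // ... build the CustomerQuery qbXML and send it ...
    // string responseXml = sessionManager.ProcessRequest(ticket, requestXml);
}
finally
{
    // Always tear the session down so the company file is not left locked.
    if (ticket != null)
        sessionManager.EndSession(ticket);
    sessionManager.CloseConnection();
}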
Thanks for any help in advance,
Moz

Auth0 stuck in callback

I used Auth0 for my React project and it works fine locally.
When I deploy the project to the server, the login page shows up, but after I try to log in I get an error.
http://localhost:3000/callback?code=ZSaQ96OshsFfpBUN&state=q3KiPGbEPtIZ3UuSLd.KSbKjdXqk9-pD
Firefox says "Unable to connect - can't establish a connection to localhost:3000", and in Chrome I get "This site can't be reached - localhost refused to connect".
I tried different callback URLs on the Auth0 side but nothing changed. I don't know exactly what to do.
It looks like your deployed app isn't running at the same address it was locally, yet Auth0 is still redirecting back to http://localhost:3000/callback. When you move to a different deployment, make sure to update all the registered callbacks, both the Allowed Callback URLs in the Auth0 application settings and the redirect URI your app sends, to the deployed site's address.

LoadRunner VuGen Script Replay Failing for Citrix Module Login

I am using LoadRunner VuGen to record a test I would like to execute using a Citrix module. I am able to successfully record the actions but when I play the actions back, it stops at the following line of code:
ctrx_wait_for_event("LOGON", CTRX_LAST);
The Citrix ICA Client screen is displayed and waits for the Username, Password, and Domain to be entered, but I am unable to enter the information manually, and after a couple of seconds it fails. When I originally recorded my actions, I logged into Citrix via a web URL, but when I play the actions back it goes directly to the Citrix ICA Client screen without ever opening that URL.
This is a Citrix application where logging into the Citrix desktop is taking longer than the parameterized wait time, hence the timeout error.
If an IP address is also mentioned in the error, it is usually a Citrix/network connectivity issue.
We had a similar error without any IP address, and doing the following resolved it:
Check whether there are any active Citrix sessions; if so, clear/terminate all of them. Then run the script from VuGen rather than stepping through it manually.
Upgrade the ICA client if it is older than version 13; we are using 14.x.

Is use of a server-side MsgBox() wrong in a web application?

I have a web application based on VB.NET.
I used MsgBox() in my MYPage.aspx.vb, expecting the user on the client to click a button in the dialog so that I could continue processing based on their choice.
But I have heard that MsgBox() only works in my code because the web server is running locally on my developer machine (my dev box), and that it does not work on a remote web server or in any hosted environment outside of a local server.
Is this true?
Is this true?
Yes. Server-side code is executed on the server; the VB.NET code-behind runs on the server. You were seeing the dialog on your development station because the server and client were the same machine.
If someone else were to browse to your server from another client, the dialog box would appear on the server and the user would never see it.
If you need to ask the user for confirmation, you need to use some sort of client-side code, such as JavaScript's confirm().
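For example, you can wire a confirm() to the button from the code-behind. This is a minimal sketch in C# (the VB.NET equivalent is a single line as well), and btnDelete is just a hypothetical Button on the page:
// Page_Load in the code-behind: attach a client-side confirmation to the button.
// If the user clicks Cancel, confirm() returns false and the postback never happens;
// if they click OK, the postback fires and the server-side click handler runs.
protected void Page_Load(object sender, EventArgs e)
{
    btnDelete.OnClientClick = "return confirm('Are you sure you want to continue?');";
}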
There is no MessageBox in a web environment. It wouldn't make much sense to show a MessageBox on the server, but it compiles anyway.
So you either need to use a jQuery approach or a simple alert:
ScriptManager.RegisterStartupScript(Page, Page.GetType(), "MyScript", "alert('Hello World');", True)

clientaccesspolicy.xml suddenly stopped working (WCF/Silverlight)

Very frustrated with all of this, hoping someone can assist.
I had a Silverlight application and a WCF service working together without issue for a year. Getting them working involved some pain initially, but I finally worked through it with help; all of the pain came from configuration/security, 401s, cross-domain hell, etc.
The way I have everything set up is that the WCF service resides in its own application/directory and runs in its own application pool.
On the same web server (IIS7), I have another application/directory with the Silverlight application that points to the aforementioned service.
The server name (for this exercise) is WEBSERVER1. We've created a CNAME for it that is WEB1. In the past, if the user went to http://WEB1/MyApp/ or http://WEBSERVER1/MyApp/ it would work. Suddenly yesterday it started behaving badly. Normal users started getting the Windows challenge/response prompt (and even if they entered the info they would get a 401 error).
My WCF service runs in a site that enables anonymous access (and this has always worked).
My Silverlight application runs in a site that uses Windows Integrated authentication (and this has always worked), since I capture the Windows username when users connect.
For the record, I did create a NEW application pool yesterday with an ASP.NET application that runs in it. This seems to work fine, but there is a chance creating this new application pool and application/directory has caused something to change.
I have a clientaccesspolicy.xml in my wwwroot folder, as well as in the folder for each of the two applications above (just in case). I have tried to promote NTLM over Negotiate as a provider (as that worked for another issue I was having on another server).
After trying some changes, I can't even get the thing to behave the same way each time I call it. Sometimes it prompts me for credentials. Other times it works, but then says it failed to connect to the WCF service with a "not found" error. Other times it actually works fine, but only if I am using the actual server name and not the CNAME. When using the CNAME I always get the cross-domain error, even though I have the cross-domain XML files in every directory root.
This is a nightmare, and it makes advanced algorithm analysis seem fun and easy by comparison. Did Microsoft realize how difficult they made this combination (IIS7/WCF/Silverlight/providers/permissions/cryptic or missing error messages) to get working?
I found a solution that appears to be working.
In this case, I had to change the authentication mode for the default web site (which hosted the clientaccesspolicy.xml file) from anonymous access to Windows Integrated. I don't understand why this worked for a year or so and then stopped, but it seems to have resolved it.
The new application that I deployed yesterday was a standard ASP.NET web application, which I put in its own application directory and its own application pool to ensure that it would not cause this sort of issue. I'm still not sure whether it did.
The way I tracked it down was by navigating from my PC to the actual http://servername/clientaccesspolicy.xml file, which was giving me a 401 error. I switched that default website (which contains nothing except that XML file) from anonymous to Windows Integrated authentication, and that resolved the permission issue. I then had to grant the relevant AD groups read access to that folder (otherwise they got the username/password prompt and could not get through).