I've installed the ServiceControl Management Utility and I'm trying to add an instance.
I would like to run the instance under a service account because we use the SQL Server transport, but on the installation page I get the error "Invalid password".
The account is hosting another Windows service on the same machine.
I've tried other admin accounts and creating the instance through both the UI and PowerShell scripts.
I'm 200% sure the password is correct.
Is there any way I can increase the logging to determine what is failing?
Strangely, I can change the service account during the initial install and it works. I was eventually able to get the service running using a SQL login, but I would have preferred to use integrated security and not keep the username and password in the connection string.
A patch that addresses this bug has been released. See https://github.com/Particular/ServiceControl/releases/tag/1.7.3. Thanks, Kye, for making us aware of the issue.
This is the code that does the validation:
public bool CheckPassword(string password)
{
    // Built-in accounts (e.g. LocalSystem, NetworkService) have no password to check.
    if (Domain.Equals("NT AUTHORITY", StringComparison.OrdinalIgnoreCase))
    {
        return true;
    }
    // Validate against either the local machine store or the current domain.
    var localAccount = Domain.Equals(Environment.MachineName, StringComparison.OrdinalIgnoreCase);
    var context = localAccount ? new PrincipalContext(ContextType.Machine) : new PrincipalContext(ContextType.Domain);
    return context.ValidateCredentials(QualifiedName, password);
}
So in a multi-domain environment it might run into trouble: the domain PrincipalContext is created without specifying a domain, so credentials are validated against the machine's own domain rather than the account's.
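As a workaround sketch (not the released fix), you could pass the account's own domain to the PrincipalContext so validation happens against the right directory; Domain and QualifiedName here are the same properties used in the code above:

public bool CheckPasswordAgainstAccountDomain(string password)
{
    // Validate against the account's own domain instead of the machine's domain.
    using (var context = new PrincipalContext(ContextType.Domain, Domain))
    {
        return context.ValidateCredentials(QualifiedName, password);
    }
}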
Raise a bug here and we will be able to give you a better response.
As far as I know I have followed the instructions for setting up the Microsoft Graph sample https://github.com/microsoftgraph/aspnet-snippets-sample, including joining the Microsoft 365 Developer Program. However, when I run the sample and attempt to log in with my new Developer Program user, I get the following error message:
Authentication error "Value cannot be null. (Parameter 'value')"
The error is being caught in this routine:
options.Events.OnAuthenticationFailed = context => {
    var error = WebUtility.UrlEncode(context.Exception.Message);
    context.Response
        .Redirect($"/Home/ErrorWithMessage?message=Authentication+error&debug={error}");
    context.HandleResponse();
    return Task.FromResult(0);
};
If I log in with my personal Microsoft account then everything works fine, so I'm guessing this has something to do with my Developer Program account. The error message isn't very helpful and there's no stack trace to speak of. I've tried using Fiddler to see if there's any more information, but with no luck either. Any ideas about what I might be doing wrong?
Tested and reproduced the issue. You are getting this error because you created the application in one tenant and are accessing it with the Developer Program account (another tenant).
The solution would be to provision the application (service principal creation) in the developer tenant by consenting to the application with the Developer Program account (which needs to be an admin). If you want to use the application in the developer tenant, please make the call below:
https://login.microsoftonline.com/common/oauth2/authorize?response_type=code&client_id=id&redirect_uri=https%3A%2F%2Flocalhost%3A44307%2F&prompt=admin_consent
After looking at the error messages in the output from the Kestrel web server I discovered that the line causing the problem was this one:
identity.AddClaim(new Claim(GraphClaimTypes.TimeZone, user.MailboxSettings.TimeZone));
and it was because the value of user.MailboxSettings.TimeZone was null. Once I set up a time zone in my developer account, everything worked fine.
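If you'd rather not require every account to have a mailbox time zone, a simple null guard around that line (a sketch using the same identity, user, and GraphClaimTypes names as above) avoids the exception:

// Only add the claim when the mailbox actually has a time zone configured.
var timeZone = user.MailboxSettings?.TimeZone;
if (!string.IsNullOrEmpty(timeZone))
{
    identity.AddClaim(new Claim(GraphClaimTypes.TimeZone, timeZone));
}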
When I try to clone a TFS-hosted git repo http://tfstta.com:8080/tfs/DefaultCollection/_git/SampleTFSGit from my Linux machine, I face the "authentication not supported" error:
org.eclipse.jgit.api.errors.TransportException: http://:@tfstta.int.thomson.com:8080/tfs/DefaultCollection/_git/SampleTFSGit.git: authentication not supported
Enabling basic authentication/alternate credentials does not seem to be an option.
Could someone please tell me a work around for this? I would be very grateful!
Go to the Eclipse menu:
Window -> Preferences -> Team -> Git, then in the right-side panel update the timeout to 3000 (Connection timeout (seconds): 3000). Click the Apply and Close button. Clone again; it will solve your problem.
This issue happens because JGit doesn't fully support NTLM, and instead of falling back to Basic auth or something else, it will stop right there.
Usually TFS answers failed authentication with multiple WWW-Authenticate headers. What happens here is that there is a bug in JGit's org.eclipse.jgit.transport.http.apache.HttpClientConnection that takes into consideration only the last of the WWW-Authenticate headers, making it give up before even trying other connection types.
What I suggest is to use your own implementation of org.eclipse.jgit.transport.http.HttpConnection, overriding getHeaderFields like this:
@Override
public Map<String, List<String>> getHeaderFields() {
    // Collect every value of every header, instead of only the last one.
    Map<String, List<String>> ret = new HashMap<>();
    for (Header hdr : resp.getAllHeaders()) {
        List<String> list;
        if (ret.containsKey(hdr.getName())) {
            list = ret.get(hdr.getName());
        } else {
            list = new LinkedList<>();
            ret.put(hdr.getName(), list);
        }
        for (HeaderElement hdrElem : hdr.getElements()) {
            list.add(hdrElem.toString());
        }
    }
    return ret;
}
Or if you are lazy (like me), you can just switch to org.eclipse.jgit.transport.http.JDKHttpConnection and be happy, because it uses the native Java connection underneath, which works correctly.
If you are trying to use Spring Cloud Config Server with a TFS Git repository, my choice is just to implement your own ConfigurableHttpConnectionFactory:
/**
 * This will use native Java connections instead of the Eclipse implementation.
 * There will be no management of the implementation though, so I cannot assure
 * anything beyond the default behavior.
 */
public class SpringJDKConnectionFactory extends JDKHttpConnectionFactory implements ConfigurableHttpConnectionFactory {
    @Override
    public void addConfiguration(MultipleJGitEnvironmentProperties environmentProperties) {
        // No extra configuration is applied; the native JDK connections are used as-is.
    }
}
And have a configuration load it over Spring's default:
@Configuration
public class JGitConnectionFactoryConfiguration {
    @Bean
    @Primary
    public ConfigurableHttpConnectionFactory configurableHttpConnectionFactory() {
        return new SpringJDKConnectionFactory();
    }
}
But beware, TFS will probably not like Basic auth with direct passwords. So create a "Personal Access Token" in TFS, and use that as a password instead.
Simple sample code:
public static void main(String[] args) throws GitAPIException, IOException {
    String url = "http://tfs-url.com/Git-Repo";
    File file = new File("build/git_test");
    if (file.exists())
        FileUtils.delete(file, FileUtils.RECURSIVE);
    CloneCommand cmd = new CloneCommand();
    cmd.setDirectory(file);
    cmd.setURI(url);
    // Use a personal access token, as Basic auth only accepts these.
    cmd.setCredentialsProvider(new UsernamePasswordCredentialsProvider("UserAccount", "personalaccesstoken"));
    // Make JGit use the native JDK connections before executing the clone.
    ConfigurableHttpConnectionFactory cf = new SpringJDKConnectionFactory();
    HttpTransport.setConnectionFactory(cf);
    Git git = cmd.call();
}
You might want to try https://www.visualstudio.com/en-us/products/team-explorer-everywhere-vs.aspx, since it is Microsoft's cross-platform TFS command line. The code is posted on GitHub if you want to try and patch the authentication helpers back into JGit.
I would recommend upgrading your TFS server to the latest Update 3 and then using SSH authentication for the Git repository.
SSH Support for Git Repos
With TFS 2015 Update 3, you can now connect to any Team Foundation Server Git repo using an SSH key. This is very helpful if you develop on Linux or Mac. Just upload your personal SSH key and you're ready to go.
I faced this issue with a new PC (configured by someone else). I fixed the error by reinstalling the JDK and running Eclipse with it.
I used a bad approach, but for initial work it's fine for me.
In my case, I switched the project visibility on GitLab from private to public: go to GitLab -> <your project> -> Settings -> General -> Visibility, project features, permissions -> switch to Public.
In application.properties I added only spring.cloud.config.server.git.uri without authentication properties, and also added .git at the end of the GitLab URI:
http://gitlab.com/<your-repo-name>.git
I don't recommend this approach for people working on company projects.
When you use the command below, you'll be prompted to enter the username and password.
git clone http://:@tfstta.int.thomson.com:8080/tfs/DefaultCollection/_git/SampleTFSGit
In my test, when I send the command, a Windows Security prompt appears. There's no need to use basic authentication/alternate credentials; simply typing domain\username and the password will connect to TFS.
I have an Azure web service sitting behind Azure API Management. This means that the API Management layer uses SSL to talk to my service, along with a client cert for authentication. I am running into what seems to be a common issue with this kind of setup, where POST sizes greater than 49152 bytes result in error 413 RequestEntityTooLarge. There are a number of docs that reference the uploadReadAheadSize setting, but all of my attempts to set this value in Web.config result in internal server errors. Here is how I am setting the value:
<system.webServer>
  <serverRuntime uploadReadAheadSize="1048576" />
</system.webServer>
Ideally I want to use something larger, but I am just trying to get things to work first. The moment I deploy with this setting all subsequent requests fail with internal server error. I can't find anything in my diagnostic logs to indicate why that failure is occurring.
Looking for any pointers on where/how to set this value. Thanks!
Finally figured this out. Note that ideally, since I am using only cert auth, I should be able to set sslFlags to "required". I tried that but was unable to get it to work properly with Azure API Management; I kept getting 403.7 errors from IIS. For now I am leaving it set to "negotiate" and increasing the value of uploadReadAheadSize as outlined below:
public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        try
        {
            using (ServerManager server = new ServerManager())
            {
                string siteName = $"{RoleEnvironment.CurrentRoleInstance.Id}_Web";
                Configuration config = server.GetApplicationHostConfiguration();
                // Require SSL and negotiate (rather than require) the client certificate.
                ConfigurationSection accessSection = config.GetSection("system.webServer/security/access", siteName);
                accessSection["sslFlags"] = @"Ssl,SslNegotiateCert";
                // Raise the read-ahead buffer so large POST bodies survive cert renegotiation.
                ConfigurationSection runtimeSection = config.GetSection("system.webServer/serverRuntime", siteName);
                runtimeSection["uploadReadAheadSize"] = 5242880;
                server.CommitChanges();
            }
        }
        catch (Exception e)
        {
            Trace.TraceError(e.Message);
            throw;
        }
        return base.OnStart();
    }
}
I'm using FiddlerCore to capture HTTP requests. Everything is working including SSL Captures as long as the Fiddler certificate is manually installed. I've been using manual installation through Fiddler's Options menu and that works fine.
However, if I use the FiddlerCore provided CertMaker class static methods to add the Fiddler certificate I find that I can use the certificate added to the cert root only in the current session. As soon as I shut down the application and start back up, CertMaker.rootCertExists() returns false.
I use the following code to install the certificate for the current user (from an explicit menu option at this point):
public static bool InstallCertificate()
{
    if (!CertMaker.rootCertExists())
    {
        if (!CertMaker.createRootCert())
            return false;
        if (!CertMaker.trustRootCert())
            return false;
    }
    return true;
}
The cert gets installed and I see it in the root cert store for the current user. If I capture SSL requests in the currently running application it works fine.
However, if I shut down the running exe, restart, and call CertMaker.rootCertExists(), it returns false, and if I try to capture SSL requests the SSL connection fails in the browser. If I recreate the cert and then re-run the requests in the browser while the app stays running, it works again. I now end up with two certificates in the root store.
After exiting and relaunching, CertMaker.rootCertExists() again returns false. The only way to get it to work is to register the cert once per exe session.
What am I doing wrong to cause the installation to not stick between execution of the same application?
I was able to solve this problem and create persistent certificates that are usable across EXE sessions by removing the default CertMaker.dll and BcMakeCert.dll assemblies that FiddlerCore installs, and using and distributing the makecert.exe executable instead.
makecert.exe appears to create certificates in such a way that they are usable across multiple runs of an application, where the included assemblies are valid only for the current application's running session.
Update:
If you want to use the CertMaker.dll and BcMakeCert.dll that FiddlerCore installs by default, you have to effectively cache and set the certificate and private key using Fiddler's internal preferences object. There are a couple of keys that hold the certificate after it's been created; you need to capture these values and write them into some sort of configuration storage.
In the following example I have a static configuration object that holds the certificate and key (persisted to a config file when the app shuts down):
public static bool InstallCertificate()
{
    if (!CertMaker.rootCertExists())
    {
        if (!CertMaker.createRootCert())
            return false;
        if (!CertMaker.trustRootCert())
            return false;
        // persist Fiddler's certificate into app-specific config
        App.Configuration.UrlCapture.Cert =
            FiddlerApplication.Prefs.GetStringPref("fiddler.certmaker.bc.cert", null);
        App.Configuration.UrlCapture.Key =
            FiddlerApplication.Prefs.GetStringPref("fiddler.certmaker.bc.key", null);
    }
    return true;
}
public static bool UninstallCertificate()
{
    if (CertMaker.rootCertExists())
    {
        if (!CertMaker.removeFiddlerGeneratedCerts(true))
            return false;
    }
    // clear the cached certificate from the app-specific config
    App.Configuration.UrlCapture.Cert = null;
    App.Configuration.UrlCapture.Key = null;
    return true;
}
After installing a certificate, this code captures the certificate and private key into the configuration object, which persists those values later. For uninstallation, the values are cleared.
At the beginning of the application or the beginning of the capture process, prior to calling CertMaker.rootCertExists() the keys are set from the configuration values. I do this at the beginning of my capture form:
public FiddlerCapture()
{
    InitializeComponent();
    // read the previously saved Fiddler certificate from app-specific config
    if (!string.IsNullOrEmpty(App.Configuration.UrlCapture.Cert))
    {
        FiddlerApplication.Prefs.SetStringPref("fiddler.certmaker.bc.key",
            App.Configuration.UrlCapture.Key);
        FiddlerApplication.Prefs.SetStringPref("fiddler.certmaker.bc.cert",
            App.Configuration.UrlCapture.Cert);
    }
}
Using this mechanism for saving and then setting the capture settings makes the certificates persist across multiple EXE sessions when using CertMaker.dll.
More detailed info is available in this detailed blog post on FiddlerCore.
If anyone is still interested, I found an easier solution based on the demo that Fiddler provides. This demo simply calls CertMaker.trustRootCert(), and strangely enough, it sticks! The first time it will ask whether you want to install the certificate, but after that, the function just returns true and will not cause the pop-up to show.
Unlike your original program and mine, the certificate sticks without you having to go to the trouble of making it stick yourself, so I analysed the differences with the demo. One of the differences I noticed was that the demo didn't have a reference to CertMaker.dll and BCMakeCert.dll. After removing these references from my own solution, I got the same behaviour as the demo.
Unfortunately, I don't have an explanation to why this works, but I hope this still helps some people.
I'm in the process of developing my first Orchard CMS module, which will interface with Exchange Server for the purpose of adding Exchange Task functionality to Orchard (basically providing web management of personal Tasks). Unfortunately, I don't think Office 365 supports the type of authentication required. This Microsoft document outlines some instructions on setting up a service account with impersonation rights, in order to use Exchange Web Services.
Unfortunately, I need to be able to run the "New-ManagementRoleAssignment" cmdlet, in order to assign the impersonation rights. The error I'm receiving when attempting this cmdlet is:
The term 'New-ManagementRoleAssignment' is not recognized as the name of a cmdlet, function, script file, or operable program.
I'm definitely connected properly, as instructed in that previous URL. Everything I'm reading suggests that this command should be available. Am I missing something? I'm using the Enterprise version of Office 365, in case that matters. The account that I'm using to log in with PowerShell is my global admin account.
Any help and/or insight would be very much appreciated! I have a support in with Microsoft as well, so I'll post anything I get back from them.
Vito
[EDIT]
I've decided to add some code, for those who have an Exchange Server and are interested in trying this out. You'll have to download the Exchange Web Services DLL in order to make use of the Microsoft.Exchange.WebServices namespace.
using Microsoft.Exchange.WebServices.Data;
using Microsoft.Exchange.WebServices.Autodiscover;

private static ExchangeService _service;

private static void ConnectToExchangeService()
{
    _service = new ExchangeService(ExchangeVersion.Exchange2010_SP1);
    _service.TraceEnabled = true;
    _service.Credentials = new System.Net.NetworkCredential("me@domain.com", "password");

    AutodiscoverService ads = new AutodiscoverService();
    ads.EnableScpLookup = false;
    ads.RedirectionUrlValidationCallback = delegate { return true; };
    GetUserSettingsResponse grResp = ads.GetUserSettings("me@domain.com", UserSettingName.ExternalEwsUrl);
    Uri casURI = new Uri(grResp.Settings[UserSettingName.ExternalEwsUrl].ToString());
    _service.Url = casURI;

    ControllerContext ctx = new ControllerContext();
    ctx.HttpContext.Response.Write("Server Info: " + _service.ServerInfo.VersionString);
    ctx.HttpContext.Response.Flush();
}
AFAIK, the cmdlet New-ManagementRoleAssignment is not available for the Small Business Plan (P1) on Office 365. However, the administrator is assigned impersonation rights by default, so you have to connect with the administrator credentials.
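For what it's worth, once you are connecting with an account that has impersonation rights, this is a minimal sketch of using them from the EWS Managed API; the credentials and mailbox addresses are placeholders:

private static void ConnectAsImpersonatedUser()
{
    // Authenticate as the admin account (placeholder credentials).
    var service = new ExchangeService(ExchangeVersion.Exchange2010_SP1);
    service.Credentials = new System.Net.NetworkCredential("admin@domain.com", "password");
    service.Url = new Uri("https://outlook.office365.com/EWS/Exchange.asmx");

    // All subsequent EWS calls act on this mailbox instead of the admin's own.
    service.ImpersonatedUserId =
        new ImpersonatedUserId(ConnectingIdType.SmtpAddress, "user@domain.com");
}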