Exchange Web Services authentication problem against Office 365

I'm in the process of developing my first Orchard CMS module, which will interface with Exchange Server for the purpose of adding Exchange Task functionality to Orchard (basically providing web management of personal Tasks). Unfortunately, I don't think Office 365 supports the type of authentication required. This Microsoft document outlines some instructions on setting up a service account with impersonation rights, in order to use Exchange Web Services.
To assign those impersonation rights, I need to be able to run the New-ManagementRoleAssignment cmdlet. The error I'm receiving when attempting it is:
The term 'New-ManagementRoleAssignment' is not recognized as the name of a cmdlet, function, script file, or operable program.
I'm definitely connected properly, as instructed in that previous URL, and everything I'm reading suggests that this command should be available. Am I missing something? I'm using the Enterprise version of Office 365, in case that matters. The account I'm using to log in with PowerShell is my global admin account.
Any help and/or insight would be very much appreciated! I have a support ticket in with Microsoft as well, so I'll post anything I get back from them.
Vito
[EDIT]
I've decided to add some code, for those who have an Exchange Server and are interested in trying this out. You'll have to download the Exchange Web Services DLL in order to make use of the Microsoft.Exchange.WebServices namespaces.
using Microsoft.Exchange.WebServices.Data;
using Microsoft.Exchange.WebServices.Autodiscover;

private static ExchangeService _service;

private static void ConnectToExchangeService()
{
    _service = new ExchangeService(ExchangeVersion.Exchange2010_SP1);
    _service.TraceEnabled = true;
    _service.Credentials = new System.Net.NetworkCredential("me@domain.com", "password");

    // Autodiscover the external EWS endpoint for the mailbox.
    AutodiscoverService ads = new AutodiscoverService();
    ads.EnableScpLookup = false;
    ads.RedirectionUrlValidationCallback = delegate { return true; }; // accept redirects blindly (testing only)
    GetUserSettingsResponse grResp = ads.GetUserSettings("me@domain.com", UserSettingName.ExternalEwsUrl);

    Uri casURI = new Uri(grResp.Settings[UserSettingName.ExternalEwsUrl].ToString());
    _service.Url = casURI;

    // Dump the server version to the response to confirm the connection works.
    ControllerContext ctx = new ControllerContext();
    ctx.HttpContext.Response.Write("Server Info: " + _service.ServerInfo.VersionString);
    ctx.HttpContext.Response.Flush();
}
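Since the end goal is managing personal Tasks, here's a minimal sketch of creating a task through the same service once ConnectToExchangeService() has run (the subject and due date are placeholder values, not from my actual module):

private static void CreateSampleTask()
{
    // Microsoft.Exchange.WebServices.Data.Task, not System.Threading.Tasks.Task.
    Task task = new Task(_service);
    task.Subject = "Sample task";             // placeholder value
    task.DueDate = DateTime.Today.AddDays(7); // placeholder value
    task.Save();                              // saves to the mailbox's default Tasks folder
}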

AFAIK, the New-ManagementRoleAssignment cmdlet is not available on the Office 365 Small Business Plan (P1). However, the administrator is assigned impersonation rights by default, so you can simply connect with the administrator credentials.
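For reference, once impersonation rights are in place (whether granted via the cmdlet or held by the admin by default), requesting impersonation with the EWS Managed API looks roughly like this; the SMTP address is a placeholder:

// Sketch: act as another user on subsequent _service calls.
_service.ImpersonatedUserId =
    new ImpersonatedUserId(ConnectingIdType.SmtpAddress, "user@domain.com");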


How to debug NServiceBus ServiceControl instance

I've installed the ServiceControl Management Utility and I'm trying to add an instance.
I would like to run the instance under a service account because we use the SQL Server transport, but on the installation page I get the error "Invalid password".
The account is hosting another Windows service on the same machine.
I've tried other admin accounts, and creating the instance both through the UI and via PowerShell scripts.
I'm 200% sure the password is correct.
Is there any way I can increase the logging to determine what is failing?
Strangely, I can change the service account after the initial install and it works. I was eventually able to get the service running using a SQL login, but I would have preferred to use integrated security and not keep the username and password in the connection string.
A patch that addresses this bug has been released. See https://github.com/Particular/ServiceControl/releases/tag/1.7.3. Thanks Kye for making us aware of the issue.
This is the code that does the validation:
public bool CheckPassword(string password)
{
    // Built-in service accounts (e.g. NT AUTHORITY\LocalService) have no password to check.
    if (Domain.Equals("NT AUTHORITY", StringComparison.OrdinalIgnoreCase))
    {
        return true;
    }

    // Validate against the local machine for local accounts, otherwise against the default domain.
    var localAccount = Domain.Equals(Environment.MachineName, StringComparison.OrdinalIgnoreCase);
    var context = localAccount
        ? new PrincipalContext(ContextType.Machine)
        : new PrincipalContext(ContextType.Domain);
    return context.ValidateCredentials(QualifiedName, password);
}
So in a multi-domain environment it might run into trouble.
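If the multi-domain case is what's biting you, one possible workaround (a sketch only, not ServiceControl's actual code; Domain and QualifiedName are assumed to match the fields in the snippet above) is to build the PrincipalContext against the account's own domain instead of the machine's default:

// Requires System.DirectoryServices.AccountManagement.
using (var context = new PrincipalContext(ContextType.Domain, Domain))
{
    return context.ValidateCredentials(QualifiedName, password);
}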
Raise a bug here and we will be able to give you a better response.

Windows Azure Console for Worker Role Cloud Service

I have a worker role cloud service that I have recently developed on my local machine. The service exposes a WCF interface that receives a file as a byte array, recompiles the file, converts it to the appropriate format, then stores it in Azure Storage. I managed to get everything working using the Azure Compute Emulator on my machine and published the service to Azure and... nothing. Running it on my machine again, it works as expected. When I was working on it on my computer, the Azure Compute Emulator's console output was essential in getting the application running.
Is there similar functionality that can be tapped into on the cloud service via RDP, such as starting/restarting the role at the command prompt or in PowerShell? If not, what is the best way to debug/log what the worker role is doing (without using IntelliTrace)? I have diagnostics enabled in the project, but it doesn't seem to give me the same level of detail as the Compute Emulator console. I've rerun the role and the corresponding .NET application on localhost and was unable to find any possible errors in the console.
Edit: The Next Best Thing
Falling back to manual logging, I implemented a class that would feed text files into my Azure Storage account. Here's the code:
public class EventLogger
{
    // Requires the Microsoft.WindowsAzure, Microsoft.WindowsAzure.StorageClient
    // and Microsoft.WindowsAzure.ServiceRuntime references (SDK 1.x API).
    public static void Log(string message)
    {
        // Resolve the "errors" container from the configured storage account.
        CloudBlobContainer cbc = CloudStorageAccount
            .Parse(RoleEnvironment.GetConfigurationSettingValue("StorageClientAccount"))
            .CreateCloudBlobClient()
            .GetContainerReference("errors");
        cbc.CreateIfNotExist();

        // One blob per message, named by role instance and timestamp.
        cbc.GetBlobReference(string.Format("event-{0}-{1}.txt",
                RoleEnvironment.CurrentRoleInstance.Id, DateTime.UtcNow.Ticks))
            .UploadText(message);
    }
}
Calling EventLogger.Log() will create a new text file and record whatever message you put in there. I found an example in the answer below.
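As a usage sketch, a worker role could funnel unhandled exceptions through it (a hypothetical Run override, assuming the EventLogger class above):

public override void Run()
{
    try
    {
        base.Run(); // the role's actual work loop would go here
    }
    catch (Exception ex)
    {
        EventLogger.Log(ex.ToString()); // ends up in the "errors" blob container
        throw;
    }
}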
There is no console for worker roles that I'm aware of. If diagnostics isn't giving you any help, then you need to get a little hacky: try tracing out messages and errors to blob storage yourself. Steve Marx has a good example of this here: http://blog.smarx.com/posts/printf-here-in-the-cloud
As he notes in the article, this is not for production, just to help you find your problem.
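If you do want to get more out of the built-in diagnostics before falling back to blob logging, the 1.x SDK lets you schedule trace log transfers to table storage from OnStart. A sketch, assuming the standard Diagnostics plugin connection string name:

// Sketch: ship Trace output to the WADLogsTable every minute (Azure SDK 1.x API).
public override bool OnStart()
{
    var config = DiagnosticMonitor.GetDefaultInitialConfiguration();
    config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
    config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
    DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);
    System.Diagnostics.Trace.WriteLine("Diagnostics started", "Information");
    return base.OnStart();
}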

Running RavenDB as an EmbeddableDocumentStore and accessing RavenDB Management Studio

I'm playing with embedded RavenDB (the RavenDB-Embedded.1.0.499 package, installed via NuGet in Visual Studio 2010). It's being used in a current project that I started after reading this excellent MSDN article:
Embedding RavenDB into an ASP.NET MVC 3 Application
Now I'd like to access the RavenDB Management Studio (Web UI).
I followed the steps described in Is it possible to connect to an embedded DB with Raven Management Studio and in Running RavenDB in embedded mode with HTTP enabled, but I couldn't get it working.
This is the code I'm using to initialize the DocumentStore:
_documentStore = new EmbeddableDocumentStore
{
    ConnectionStringName = "RavenDB",
    UseEmbeddedHttpServer = true
};
and this is the ConnectionString present in Web.config:
<add name="RavenDB" connectionString="DataDir = ~\App_Data\Database" />
I also read the steps described in RavenDB: Embedded Mode. I tried to start the server manually:
// Start the HTTP server manually
var server = new RavenDbHttpServer(documentStore.Configuration,
                                   documentStore.DocumentDatabase);
server.Start();
but the above code seems outdated, since RavenDbHttpServer, documentStore.Configuration and documentStore.DocumentDatabase no longer exist. I managed to find Raven.Database.Server.HttpServer, but the other objects are missing from the _documentStore.
So, the question is:
How can I hit the Web UI to visualize my embedded database docs? What's the URL I should put in my browser address bar?
Any advice is appreciated.
EDIT: I've found a way of getting it to work. As I described in my blog post, it may not be the best approach, but it does work:
RavenDB Embedded with Management Studio UI
Note: one downside of the above approach is that I'm not able to access the database in my app, because once it has been opened by the server it gets locked, so I have to stop the server and then reload my app in the browser.
I hope RavenDB's gurus out there have a better/correct approach... just let us know.
I've never had to run the server manually in order to access the Management Studio. The only steps I usually take that haven't been mentioned in your question:
// Add the following line prior to calling documentStore.Initialize()
Raven.Database.Server.NonAdminHttp.EnsureCanListenToWhenInNonAdminContext(8080);
Copy Raven.Studio.xap into the root folder of my web project.
When my web application is running, the RavenDB Management Studio is then accessible at http://localhost:8080.
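Putting those pieces together, the initialization would look roughly like this (a sketch combining the snippets above; port 8080 is assumed):

// Allow the embedded HTTP server to listen on 8080 without admin rights.
Raven.Database.Server.NonAdminHttp.EnsureCanListenToWhenInNonAdminContext(8080);

_documentStore = new EmbeddableDocumentStore
{
    ConnectionStringName = "RavenDB",
    UseEmbeddedHttpServer = true // serves the Studio from the same process
};
_documentStore.Initialize();

// With Raven.Studio.xap in the web project root, browse to http://localhost:8080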

Web Deploy API (deploy .zip package) Clarification

I'm using the Web Deploy API to programmatically roll a web package (a .zip file created by MSDeploy.exe) out to a server (we need to do some other things before we release the package, which is why we're not doing it all in one go with MSDeploy.exe).
Here's the code I have. My question is really to clarify what happens when this is executed. In the package parameters XML file I have the application name specified ("Default Web Site"), but that's about it; no other parameters are specified in there. From testing, the package appears to get deployed successfully, but are any other settings on the server I'm deploying to getting changed without my knowledge? Are any default settings published? Things like security settings, directory browsing and so on that I might not be aware of? The code here seems to deploy the package, but I'm anxious about using this in a production environment when I'm so unsure of how this API works. The MS documentation is not helpful (more like non-existent, actually).
// Requires a reference to Microsoft.Web.Deployment.
DeploymentChangeSummary changes;
string packageToDeploy = "C:/MyPackageLocation.zip";
string packageParametersFile = "C:/MyPackageLocation.SetParameters.xml";

DeploymentBaseOptions destinationOptions = new DeploymentBaseOptions()
{
    UserName = "MyUsername",
    Password = "MyPassword",
    ComputerName = "localhost"
};

using (DeploymentObject deploymentObject = DeploymentManager.CreateObject(
    DeploymentWellKnownProvider.Package, packageToDeploy))
{
    // Apply the values from the SetParameters.xml file.
    deploymentObject.SyncParameters.Load(packageParametersFile);

    DeploymentSyncOptions syncOptions = new DeploymentSyncOptions();
    syncOptions.WhatIf = false;

    // Deploy the package to the server.
    changes = deploymentObject.SyncTo(destinationOptions, syncOptions);
}
If anyone could clarify that this snippet should deploy a package to a web site application on a server, without changing any existing server settings (unless specified in the SetParameters.xml file), that would be really helpful. Any good resources on using the API, or an explanation of how web deployment works behind the scenes, would also be much appreciated!
The SetParameters file just controls the values of the parameters defined in the package; a package might be doing much more than that. Web Deploy has a concept of providers, and any given package can contain one or more providers.
If you want to make sure that the package is not changing server-side settings, the best approach you can take is to use the API but have the packages deployed via the Web Management Service. This gives you two benefits:
You can control what providers you allow through.
You can add users and give restricted permissions to them to deploy to their site or their folder etc.
The alternative approach is to:
Manually inspect the archive.xml inside the package and look for the providers it uses. As long as you don't see any providers that can change server settings, such as appHostConfig, webServer or regKey (this is not a comprehensive list), you should be good. runCommand is a provider that allows you to execute batch scripts or commands; while it is useful for admins themselves, you need to consider whether you want to allow packages with such providers to run.
You can do the above-mentioned inspection in code by calling GetChildren() on the deployment object you create from the package and inspecting the providers and provider paths, as sketched below.
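As a rough illustration of that inspection (a sketch only; I'm assuming GetChildren() exposes each child's provider name via Name, which may differ between Web Deploy versions):

// Sketch: enumerate the providers inside a package before syncing it.
// Uses System.Linq for the Contains overload with a comparer.
string[] risky = { "appHostConfig", "webServer", "regKey", "runCommand" };
using (DeploymentObject package = DeploymentManager.CreateObject(
    DeploymentWellKnownProvider.Package, packageToDeploy))
{
    foreach (DeploymentObject child in package.GetChildren())
    {
        if (risky.Contains(child.Name, StringComparer.OrdinalIgnoreCase))
            Console.WriteLine("Package contains provider: " + child.Name);
    }
}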

Using a ReceiveActivity in a Sharepoint Workflow

I've made my first little workflow in SharePoint and I'm trying to access it from the outside using a ReceiveActivity. I have created a WCF .svc file and created a website in IIS with the same application pool as the SharePoint site.
Now I can start the workflow from my doclib, but when I try to reach the ReceiveActivity as shown below, I get the following error: "the workflow hosting environment does not have a persistence service as required by an operation on the workflow instance".
I think it has something to do with not using the SharePoint persistence service in my own WCF website, but I'm not sure. Any ideas on this one?
// Attach the workflow instance id to the WCF context so the
// ReceiveActivity can route the call to the right instance.
DoMyThingContractClient proxy = new DoMyThingContractClient();
IContextManager contextManager = proxy.InnerChannel.GetProperty<IContextManager>();
IDictionary<string, string> context = contextManager.GetContext();
context.Add("instanceId", myInstanceId);
contextManager.SetContext(context);

var result = proxy.GetMyMethod(tb1.Text, tb2.Text);
Have you created the SQL tables that host the workflows? The ones at C:\Windows\Microsoft.NET\Framework\v3.0\Windows Workflow Foundation\SQL\en? If you have, you then need to add the required tags to your config (the config of your WCF site, the folder with the .svc file, in this case) as explained on MSDN.
Edit after comments: try running a persistence service in your code:
// Register a SQL persistence service with the workflow runtime.
SqlWorkflowPersistenceService ps = new SqlWorkflowPersistenceService(
    "Initial Catalog=SqlPersistenceService;Data Source=localhost;Integrated Security=SSPI");
currentWorkflowRuntime.AddService(ps);