I am attempting to automate a workflow in our Azure environment.
We have several web applications with connection strings to several databases. Each new customer receives a new database.
I've hit a snag in the script with our connection strings. I want the script to update all web applications and add a new connection string for the newly created customer DB.
The problem is that "Set-AzureRmWebApp -Name -ResourceGroup -ConnectionStrings" takes a hashtable which replaces any previously configured data.
I would like to only append a new connection string, or get the previously configured connection strings, add them to a collection, and then replace all the data.
Example code:
$test = @{ "Type" = "Custom"; "Value" = "TestValue" }
$Connectionstring = @{ "test" = $test }
Set-AzureRmWebApp `
    -Name "testapp" `
    -ResourceGroupName "testgrp" `
    -ConnectionStrings $Connectionstring
Any ideas here?
$connStrings = @{
    AzureWebJobsDashboard = @{
        Type = "Custom";
        Value = $AzureWebJobsDashboard
    };
    AzureWebJobsStorage = @{
        Type = "MySql";
        Value = $connstring
    }
}
Set-AzureRMWebApp -Name $webServiceName -ResourceGroupName $rgName -ConnectionStrings $connStrings
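That call still replaces the whole collection, so to append you first need to read what is already configured. Here is a minimal sketch, assuming the AzureRm module; $newName and $newValue are placeholders for the new customer entry:
# Read the current connection strings from the web app
$app = Get-AzureRmWebApp -ResourceGroupName $rgName -Name $webServiceName

# Rebuild the hashtable shape that -ConnectionStrings expects
$connStrings = @{}
foreach ($cs in $app.SiteConfig.ConnectionStrings) {
    $connStrings[$cs.Name] = @{ Type = $cs.Type.ToString(); Value = $cs.ConnectionString }
}

# Append the entry for the new customer database, then write everything back
$connStrings[$newName] = @{ Type = "SQLAzure"; Value = $newValue }
Set-AzureRmWebApp -ResourceGroupName $rgName -Name $webServiceName -ConnectionStrings $connStrings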
A related approach from Cannot Delete All Azure Website Connection Strings uses the classic Azure module and appends to the existing collection:
# Get the currently configured connection strings
$connStrings = (Get-AzureWebsite $WebAppName).ConnectionStrings

# Add the new connection string
$newConnString = New-Object Microsoft.WindowsAzure.Commands.Utilities.Websites.Services.WebEntities.ConnStringInfo
$newConnString.Name = $ConnStringName
$newConnString.ConnectionString = $ConnStringValue
$newConnString.Type = $ConnStringType
$connStrings.Add($newConnString)

Set-AzureWebsite $WebAppName -ConnectionStrings $connStrings
You can download the detailed script from How to automatically create new connection strings for web applications in Azure.
Is there some way, or a script, to search your blob container for which files are hot or cool, and change them to the archive tier?
I have thousands of folders and files, and doing this manually is a nightmare.
If you want to change the blob tier (hot or cool) to the archive tier, there is a built-in feature named lifecycle management.
You can just set a rule for your storage account (the rule can be applied at the container level, account level, or subfolder level as per your need), and the blob service will automatically change the tier (hot and cool) to archive.
Here is an example for container level:
1. Navigate to the Azure portal -> your storage account -> Lifecycle management, then click "Add a rule".
2. In the Details panel, specify a rule name and select the "Rule scope" (here, select "Limit blobs with filter" for container level), "Blob type", and "Blob subtype".
3. In "Base blobs", specify the tiering settings (e.g., move to archive storage N days after last modification).
4. In "Filter set", just type your container name for "Prefix match".
5. Click the "Add" button to save the rule. Note that the rule will be executed after 24 hours.
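If you would rather script the rule than click through the portal, here is a minimal sketch with the Az.Storage management-policy cmdlets (the rule name, the 30-day threshold, and the placeholder names are assumptions; adjust them to your account):
# Sketch: archive blobs under a container prefix via a lifecycle rule
$rgName = "<resource group>"
$accountName = "<storage account>"

# Action: move base blobs to the archive tier N days after last modification
$action = Add-AzStorageAccountManagementPolicyAction -BaseBlobAction TierToArchive -DaysAfterModificationGreaterThan 30

# Filter: limit the rule to one container (prefix match gives container-level scope)
$filter = New-AzStorageAccountManagementPolicyFilter -PrefixMatch "<container name>/" -BlobType blockBlob

$rule = New-AzStorageAccountManagementPolicyRule -Name "archive-old-blobs" -Action $action -Filter $filter
Set-AzStorageAccountManagementPolicy -ResourceGroupName $rgName -StorageAccountName $accountName -Rule $rule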
Alternatively, you can change the access tier of existing blobs directly with PowerShell.
#Initialize the following with your resource group, storage account, container, and blob names
$rgName = ""
$accountName = ""
$containerName = ""
#Select the storage account and get the context
$storageAccount = Get-AzStorageAccount -ResourceGroupName $rgName -Name $accountName
$ctx = $storageAccount.Context
#list the blobs in a container
$blobs = Get-AzStorageBlob -Container $containerName -Context $ctx
foreach ($blob in $blobs)
{
    # Skip blobs whose tier is already "Archive"
    if ($blob.AccessTier -ne "Archive") {
        # Change the blob's access tier to archive
        $blob.ICloudBlob.SetStandardBlobTier("Archive")
    }
}
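With thousands of blobs, you may also want to page through the listing rather than pull it all at once; this sketch follows the documented -MaxCount/-ContinuationToken listing pattern (5000 is the per-page service limit):
$token = $null
do {
    $page = Get-AzStorageBlob -Container $containerName -Context $ctx -MaxCount 5000 -ContinuationToken $token
    foreach ($blob in $page) {
        if ($blob.AccessTier -ne "Archive") {
            $blob.ICloudBlob.SetStandardBlobTier("Archive")
        }
    }
    if ($page.Count -le 0) { break }
    # The continuation token of the last blob in the page drives the next request
    $token = $page[$page.Count - 1].ContinuationToken
} while ($token -ne $null)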
Another method uses the BlobBatchClient.SetBlobsAccessTier method from the .NET SDK (Azure.Storage.Blobs.Batch package).
// Requires the Azure.Storage.Blobs and Azure.Storage.Blobs.Batch packages
using System.IO;
using System.Text;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Blobs.Specialized;

// Get a connection string to our Azure Storage account.
string connectionString = "<connection_string>";
string containerName = "sample-container";
// Get a reference to a container named "sample-container" and then create it
BlobServiceClient service = new BlobServiceClient(connectionString);
BlobContainerClient container = service.GetBlobContainerClient(containerName);
container.Create();
// Create three blobs named "foo", "bar", and "baz"
BlobClient foo = container.GetBlobClient("foo");
BlobClient bar = container.GetBlobClient("bar");
BlobClient baz = container.GetBlobClient("baz");
foo.Upload(new MemoryStream(Encoding.UTF8.GetBytes("Foo!")));
bar.Upload(new MemoryStream(Encoding.UTF8.GetBytes("Bar!")));
baz.Upload(new MemoryStream(Encoding.UTF8.GetBytes("Baz!")));
// Set the access tier for all three blobs at once
BlobBatchClient batch = service.GetBlobBatchClient();
batch.SetBlobsAccessTier(new Uri[] { foo.Uri, bar.Uri, baz.Uri }, AccessTier.Archive);
After hours of searching through Microsoft's messy API documentation for its products, I am still nowhere on how to authenticate a REST API request in a Windows Azure Pack distribution.
Primarily I want to create an API which automates the process of deploying virtual machines, but I can't find any documentation on how to acquire the authentication token to access the resources.
Some documentation mentions the use of ADFS, but doesn't provide any reference for the ADFS REST API for authentication.
And I don't want to use ADFS in the first place. I want to authenticate using the Azure Pack tenant and admin interfaces.
In conclusion, if anyone can provide any help on the REST API authentication, it will make my day.
Thanks in advance.
You can use the following PowerShell to acquire an access token.
Add-Type -Path 'C:\Program Files\Microsoft Azure Active Directory Connect\Microsoft.IdentityModel.Clients.ActiveDirectory.dll'
$tenantID = "<the tenant id of your subscription>"
$authString = "https://login.windows.net/$tenantID"
# It must be an MFA-disabled admin.
$username = "<the username>"
$password = "<the password>"
# The resource can be https://graph.windows.net/ if you are using graph api.
# Or, https://management.azure.com/ if you are using ARM.
$resource = "https://management.core.windows.net/"
# This is the common client id.
$client_id = "1950a258-227b-4e31-a9cf-717495945fc2"
$creds = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.UserCredential" `
-ArgumentList $username,$password
$authContext = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext" `
-ArgumentList $authString
$authenticationResult = $authContext.AcquireToken($resource,$client_id,$creds)
# An Authorization header can be formed like this.
$authHeader = $authenticationResult.AccessTokenType + " " + $authenticationResult.AccessToken
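To sanity-check the token, you could call an ARM endpoint with that header; this is a hypothetical usage (it assumes $resource above was set to https://management.azure.com/ as noted in the comments, and <subscription id> is a placeholder):
# Hypothetical check: list resource groups with the acquired token
$headers = @{ Authorization = $authHeader }
$uri = "https://management.azure.com/subscriptions/<subscription id>/resourcegroups?api-version=2017-05-10"
Invoke-RestMethod -Uri $uri -Headers $headers -Method Get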
I am doing a similar job to what you describe.
static string GetAspAuthToken(string authSiteEndPoint, string userName, string password)
{
var identityProviderEndpoint = new EndpointAddress(new Uri(authSiteEndPoint + "/wstrust/issue/usernamemixed"));
var identityProviderBinding = new WS2007HttpBinding(SecurityMode.TransportWithMessageCredential);
identityProviderBinding.Security.Message.EstablishSecurityContext = false;
identityProviderBinding.Security.Message.ClientCredentialType = MessageCredentialType.UserName;
identityProviderBinding.Security.Transport.ClientCredentialType = HttpClientCredentialType.None;
var trustChannelFactory = new WSTrustChannelFactory(identityProviderBinding, identityProviderEndpoint)
{
TrustVersion = TrustVersion.WSTrust13,
};
//This line is only if we're using self-signed certs in the installation
trustChannelFactory.Credentials.ServiceCertificate.SslCertificateAuthentication = new X509ServiceCertificateAuthentication() { CertificateValidationMode = X509CertificateValidationMode.None };
trustChannelFactory.Credentials.SupportInteractive = false;
trustChannelFactory.Credentials.UserName.UserName = userName;
trustChannelFactory.Credentials.UserName.Password = password;
var channel = trustChannelFactory.CreateChannel();
var rst = new RequestSecurityToken(RequestTypes.Issue)
{
AppliesTo = new EndpointReference("http://azureservices/TenantSite"),
TokenType = "urn:ietf:params:oauth:token-type:jwt",
KeyType = KeyTypes.Bearer,
};
RequestSecurityTokenResponse rstr = null;
SecurityToken token = null;
token = channel.Issue(rst, out rstr);
var tokenString = (token as GenericXmlSecurityToken).TokenXml.InnerText;
var jwtString = Encoding.UTF8.GetString(Convert.FromBase64String(tokenString));
return jwtString;
}
The "authSiteEndPoint" parameter is your tenant authentication site URL; the default port is 30071.
You can find some resources here:
https://msdn.microsoft.com/en-us/library/dn479258.aspx
The sample program "SampleAuthApplication" there covers this scenario.
I've been struggling for a couple of days to upload files to SharePoint 2010 with PowerShell.
I'm on a Win7 machine with PowerShell v2 trying to upload to a SP 2010 site.
I'm having two major issues:
1. The $Context.Web value is always empty, even after ExecuteQuery(), and no error is shown. My $Context variable gets the server version (14.x.x.x.x) but nothing more.
2. $Context.Load($variable) always returns the error: Cannot find an overload for "Load" and the argument count: "1".
I copied the SharePoint DLLs to my Win7 machine and I import the references in my script.
The script below is a mix of many parts I took from the net.
I've already tried, unsuccessfully, to add an overload on the ClientContext defining a Load method without the Type parameter, as suggested in the following post:
http://soerennielsen.wordpress.com/2013/08/25/use-csom-from-powershell/
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
$site = "https://Root-of-my-site"
$listname = "My-folder"
$context = New-Object Microsoft.SharePoint.Client.ClientContext($site)
[Microsoft.SharePoint.Client.Web]$web = $context.Web
[Microsoft.SharePoint.Client.List]$list = $web.Lists.GetByTitle($listName)
$Folder = "C:\temp\Certificates"
$List = $Context.Web.Lists.GetByTitle($listname)
Foreach ($File in (dir $Folder))
{
$FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
$FileCreationInfo.Overwrite = $true
$FileCreationInfo.Content = get-content -encoding byte -path $File.Fullname
$FileCreationInfo.URL = $File
$Upload = $List.RootFolder.Files.Add($FileCreationInfo)
$Context.Load($Upload)
$Context.ExecuteQuery()
}
The error is
Cannot find an overload for "Load" and the argument count: "1".
At C:\temp\uploadCertToSharepoint.ps1:48 char:14
+ $Context.Load <<<< ($Upload)
+ CategoryInfo : NotSpecified: (:) [], MethodException
+ FullyQualifiedErrorId : MethodCountCouldNotFindBest
Can someone please help me sort out this issue?
I'll need to upload around 400 files with ad-hoc fields to a SharePoint site in a couple of weeks, and at the moment I'm completely stuck. Running the script server-side is unfortunately not possible.
Thanks,
Marco
This error occurs because ClientRuntimeContext.Load is a generic method:
public void Load<T>(
T clientObject,
params Expression<Func<T, Object>>[] retrievals
)
where T : ClientObject
and generic methods are not supported natively in PowerShell (v1, v2), AFAIK.
The workaround is to invoke the generic method via MethodInfo.MakeGenericMethod, as described in the article Invoking Generic Methods on Non-Generic Classes in PowerShell.
In the case of the ClientRuntimeContext.Load method, the following PS function could be used:
Function Invoke-LoadMethod() {
    param(
        $clientObjectInstance = $(throw "Please provide a ClientObject instance on which to invoke the generic method")
    )
    $ctx = $clientObjectInstance.Context
    $load = [Microsoft.SharePoint.Client.ClientContext].GetMethod("Load")
    $type = $clientObjectInstance.GetType()
    $clientObjectLoad = $load.MakeGenericMethod($type)
    $clientObjectLoad.Invoke($ctx, @($clientObjectInstance, $null))
}
Then, in your example the line:
$Context.Load($Upload)
could be replaced with this one:
Invoke-LoadMethod -clientObjectInstance $Upload
References
Invoking Generic Methods on Non-Generic Classes in PowerShell
Some tips and tricks of using SharePoint Client Object Model in PowerShell. Part 1
It throws the error because in PowerShell 2.0 you cannot call a generic method directly.
You need to create a closed method using MakeGenericMethod. Try the code below.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
$site = "http://server"
$listname = "listName"
$Folder = "C:\PS\Test"
$context = New-Object Microsoft.SharePoint.Client.ClientContext($site)
[Microsoft.SharePoint.Client.Web]$web = $context.Web
[Microsoft.SharePoint.Client.List]$list = $web.Lists.GetByTitle($listName)
$method = $Context.GetType().GetMethod("Load")
$closedMethod = $method.MakeGenericMethod([Microsoft.SharePoint.Client.File])
Foreach ($File in (dir $Folder))
{
$FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
$FileCreationInfo.Overwrite = $true
$FileCreationInfo.Content = (get-content -encoding byte -path $File.Fullname)
$FileCreationInfo.URL = $File
$Upload = $List.RootFolder.Files.Add($FileCreationInfo)
$closedMethod.Invoke($Context, @($Upload, $null))
$Context.ExecuteQuery()
}
From the Oracle documentation:
Domain Runtime MBean Server: This MBean server also acts as a single point of access for MBeans that reside on Managed Servers.
What I want to do is use this fact to access all my custom MBeans scattered across several managed servers.
For example, assume that I have two nodes, server-1 and server-2.
How can I access all of the custom MBeans on both server-1 and server-2 by connecting to the administration node?
I don't want to remotely access each node to collect the results; I want a single entry point.
I managed to get the names of the servers, their states, and other information by doing this:
JMXConnector connector;
ObjectName service;
MBeanServerConnection connection;
String protocol = "t3";
Integer portInteger = Integer.valueOf(<admin server port>);
int port = portInteger.intValue();
String jndiroot = "/jndi/";
String mserver = "weblogic.management.mbeanservers.domainruntime"; // the Domain Runtime MBean Server, which exposes DomainRuntimeServiceMBean
JMXServiceURL serviceURL = new JMXServiceURL(protocol, "<serverName>", port,
jndiroot + mserver);
Hashtable h = new Hashtable();
h.put(Context.SECURITY_PRINCIPAL, "weblogic");
h.put(Context.SECURITY_CREDENTIALS, "weblogicpass");
h.put(JMXConnectorFactory.PROTOCOL_PROVIDER_PACKAGES,
"weblogic.management.remote");
h.put("jmx.remote.x.request.waiting.timeout", new Long(10000));
connector = JMXConnectorFactory.connect(serviceURL, h);
connection = connector.getMBeanServerConnection();
service = new ObjectName("com.bea:Name=DomainRuntimeService,Type=weblogic.management.mbeanservers.domainruntime.DomainRuntimeServiceMBean");
ObjectName[] ons = (ObjectName[]) connection.getAttribute(service, "ServerRuntimes");
for (int i = 0; i < ons.length; i++) {
    String name = (String) connection.getAttribute(ons[i], "Name");
    String state = (String) connection.getAttribute(ons[i], "State");
    // ListenPort is an Integer attribute on ServerRuntimeMBean
    Integer internalPort = (Integer) connection.getAttribute(ons[i], "ListenPort");
    System.out.println("Server name: " + name + ". Server state: " + state);
}
But I need to access the custom MBeans created on each server, not only this information.
Maybe my question wasn't clear, but I found an answer and I will share it here.
Question summary: I need to access custom MBeans that exist on a managed server by connecting to the administration server from a client application.
Answer:
To do that, you need to deploy your application to the administration server (I tried remote, but it didn't work).
You need to connect to the DomainRuntimeServiceMBean because it provides a common access point for navigating to all runtime and configuration MBeans in the domain.
When searching for the object name, add Location=<managed_server_name>.
Here is the code:
Hashtable props = new Hashtable();
props.put(Context.INITIAL_CONTEXT_FACTORY,
"weblogic.jndi.WLInitialContextFactory");
props.put(Context.SECURITY_PRINCIPAL, "<userName>");
props.put(Context.SECURITY_CREDENTIALS, "<password>");
Context ctx = new InitialContext(props);
MBeanServer server = (MBeanServer)ctx.lookup("java:comp/env/jmx/domainRuntime");
ObjectName on =new ObjectName("com.<companyName>:Name=<Name>,Type=<Type>,Location=<managed_server_name>");
boolean boolresult=(Boolean)server.invoke(on, "<method_Name>",
new Object[]{"<ARG1>","<ARG2>","<ARG3>"}
,new String[]{"java.lang.String","java.lang.String","java.lang.String"});
out.print(boolresult);
I wrote a script for PowerShell 1.0 (now using 2.0) that executes a search on my Active Directory. The code is the following:
$filter = "some filter"
$rootEntry = New-Object System.DirectoryServices.DirectoryEntry
$searcher = New-Object System.DirectoryServices.DirectorySearcher
$searcher.SearchRoot = $rootEntry
$searcher.Filter = $filter
$searcher.SearchScope = "Subtree"
$colResults = $searcher.FindAll()
After calling the FindAll() method of the DirectorySearcher instance, I print the results to see what I got.
The thing is, if I start PowerShell.exe and call the script at the prompt, I'm able to see results. But if I try to call it from cmd.exe with the same filter, I don't see any results; FindAll() returns an empty result set.
I'm running this on a Windows 2003 Server. It did not come with PowerShell 1.0, so I downloaded and installed it on the server. It does have .NET Framework 2.0.
Any suggestions?
Thanks a lot.
By default, your $rootEntry points to the root of your local AD if you are running on a server, and it binds with the credentials of the current process. You don't show what your filter is or how you use your results.
Here is a small sample of an ADSI search from PowerShell
Clear-Host
# ADSI Bind with current process credentials
#$dn = [adsi] "LDAP://192.168.30.200:389/dc=dom,dc=fr"
# ADSI Bind with specific credentials
$dn = New-Object System.DirectoryServices.DirectoryEntry ("LDAP://192.168.183.138:389/dc=societe,dc=fr","administrateur@societe.fr","test.2011")
# Look for users
$Rech = new-object System.DirectoryServices.DirectorySearcher($dn)
$rc = $Rech.filter = "((objectCategory=person))"
$rc = $Rech.SearchScope = "subtree"
$rc = $Rech.PropertiesToLoad.Add("distinguishedName");
$rc = $Rech.PropertiesToLoad.Add("sAMAccountName");
$rc = $Rech.PropertiesToLoad.Add("ipphone");
$rc = $Rech.PropertiesToLoad.Add("telephoneNumber");
$rc = $Rech.PropertiesToLoad.Add("memberOf");
$rc = $Rech.PropertiesToLoad.Add("distinguishedname");
$rc = $Rech.PropertiesToLoad.Add("physicalDeliveryOfficeName"); # Your attribute
$liste = $Rech.findall()
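To actually see something from the search, you still need to enumerate $liste; here is a minimal sketch that prints a couple of the properties loaded above:
# Enumerate the results and print selected properties
foreach ($res in $liste) {
    $sam = $res.Properties["samaccountname"]
    $dn  = $res.Properties["distinguishedname"]
    Write-Host "$sam : $dn"
}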
Finally got it working by doing two things:
1. Upgrade to PowerShell 2.0.
2. Run with the -File option.
So the command was run like this:
>>powershell -file ./script.ps1 "dn" "uid"
I'm not sure what the difference between the -File and -Command options is (does anyone know?), but it worked.
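For what it's worth, the two switches hand arguments to the script differently; a rough illustration with the same script (behavior as summarized in the powershell.exe help):
# -File passes the remaining arguments to the script as literal strings
powershell -File .\script.ps1 "dn" "uid"

# -Command treats the rest of the line as PowerShell source to evaluate,
# so quoting and parsing rules differ (cmd.exe strips the outer quotes first)
powershell -Command ".\script.ps1 'dn' 'uid'"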
Thanks.