Azure SQL DB Error, This location is not available for subscription - azure-sql-server

I have a pay-as-you-go subscription and I am creating an Azure SQL server.
While adding the server, on selecting a location, I get this error:
This location is not available for subscriptions
Please help.

There is an actual issue on Microsoft's side: they have too many Azure SQL database creation requests and are currently trying to handle the situation. This seems to affect all types of subscriptions, even paid ones. I have a Visual Studio Enterprise subscription and I get the same error (This location is not available for subscriptions) for all locations.
See the following Microsoft forum thread for more information:
https://social.msdn.microsoft.com/Forums/en-US/ac0376cb-2a0e-4dc2-a52c-d986989e6801/ongoing-issue-unable-to-create-sql-database-server?forum=ssdsgetstarted

As the other answer states, this is a (poorly handled) restriction on Azure as of now, and there seems to be no ETA on when it will be lifted.
In the meantime, you can still get an SQL database up and running in Azure, if you don't mind doing a bit of extra work and don't want to wait - just set up a Docker instance and put MSSQL on it!
In the Azure Portal, create a container instance. Use the following docker image: https://hub.docker.com/r/microsoft/mssql-server-windows-express/
While creating it, you might have to set the ACCEPT_EULA environment variable to "Y".
After it boots up (10-20 minutes for me), connect to it in the portal with the "sqlcmd" command and set up your login. In my case, I just needed a quick demo db, so I took the "sa" login and ran "alter login SA with password = '{insert your password}'" and "alter login SA enable". See here for details: https://learn.microsoft.com/en-us/sql/t-sql/statements/alter-login-transact-sql?view=sql-server-ver15#examples
And voila, you have an SQL instance on Azure. Although it's unmanaged and poorly monitored, it might be enough for a short-term solution. The IP address of the docker instance can be found in the Properties section of the container instance blade.
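For reference, the login setup described above boils down to the following statements inside that sqlcmd session (a sketch, with the password left as a placeholder):
-- Set a password for the sa login and enable it (run inside the container's sqlcmd session)
ALTER LOGIN sa WITH PASSWORD = '{insert your password}';
ALTER LOGIN sa ENABLE;
GO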

Maybe you can reference this blog: Azure / SQL Server / This location is not available for subscription. It covers the same error you are getting.
Run this PowerShell command to check if the location you chose is available:
Get-AzureRmLocation | select displayname
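If you are on the newer Az PowerShell module rather than the deprecated AzureRM one, the equivalent check would be along these lines (a sketch; Get-AzLocation is the Az counterpart of the command above):
Get-AzLocation | Select-Object DisplayName, Location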
If the location is available, the best way to resolve this issue is to contact Azure support to have it enabled for you. You can do this for free using the support page in your Azure Portal.
They will contact you and help you solve it.
Hope this helps.

This is how I solved it myself. Let me tell you the problem first, then the solution.
Problem: I created a brand new free Azure account (comes with $250 free credit) for a client, then upgraded to a pay-as-you-go subscription. I was unable to create an Azure SQL db; the error was 'location is not available'.
How I solved: I created another pay-as-you-go subscription in the same account. Guess what - I was able to create SQL db in my new subscription right away. Then I deleted the first subscription from my account. And yes, I lost the free credit.
If your situation is similar to mine, you can try this.
PS: I have 3 clients with their own Azure accounts. I was able to create SQL Db in all of their accounts. I think the problem arises only for free accounts and/or for free accounts that upgraded to pay-as-you-go accounts.

EDIT - 2020/04/22
This is still an ongoing problem as of today, but I was told by Microsoft support that on April 24th a new Azure cluster will be available in Europe, so it might finally become possible to deploy SQL Server instances on free accounts in that region.
Deploy a docker container running SQL Server
To complement @Filip's answer, and given that the problem still remains with Azure SQL Server, a docker container running SQL Server is a great alternative. You can set one up very easily by running the following command in the Cloud Shell:
az container create --image microsoft/mssql-server-windows-express --os-type Windows --name <ContainerName> --resource-group <ResourceGroupName> --cpu <NumberOfCPUs> --memory <Memory> --port 1433 --ip-address public --environment-variables ACCEPT_EULA=Y SA_PASSWORD=<Password> MSSQL_PID=Developer --location <SomeLocationNearYou>
<ContainerName> : A container name of your choice
<ResourceGroupName> : The name of a previously created Resource Group
<NumberOfCPUs> : Number of CPUs you want to use
<Memory> : Memory you want to use
<Password> : Your password
<SomeLocationNearYou> : A location near you, for example westeurope
Access SQL Server
Once the container instance is deployed, you will be able to find an IP address in the Overview. Use that IP address and the password you chose in the az container command to connect to the SQL Server, either using Microsoft's SSMS or the sqlcmd utility.
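If you would rather verify the connection from code than from SSMS, a minimal ADO.NET sketch could look like this (the IP address and password are placeholders for the values from your own deployment):
// Sketch: quick connectivity check against the container (IP address and password are placeholders)
var connectionString = "Server=<ContainerIPAddress>,1433;Database=master;User Id=sa;Password=<Password>;";
using (var connection = new System.Data.SqlClient.SqlConnection(connectionString))
{
    connection.Open();
    using (var command = new System.Data.SqlClient.SqlCommand("SELECT @@VERSION;", connection))
    {
        System.Console.WriteLine(command.ExecuteScalar());
    }
}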
Some documentation regarding the image I have used can be found here.
More information on the command I have used here.

Related

Not able to get Azure SQL Server Extended Events to work when Blob Storage is set to Enabled from selected virtual networks and IP addresses

So I have an Azure Database and want to test extended events with the database.
I was able to set up my Blob Storage container and get Extended Events from the Azure database to work as long as the Blob Storage network setting Public network access is set to Enabled from all networks. If I set it to Enabled from selected virtual networks and IP addresses, with Microsoft network routing checked and the Resource type set to Microsoft.Sql/servers with its value as All in current subscription, it still doesn't work.
I'm not exactly sure what I'm doing wrong and I'm not able to find any documentation on how to make it work without opening up to all networks.
The error I'm getting is:
The target, "5B2DA06D-898A-43C8-9309-39BBBE93EBBD.package0.event_file", encountered a configuration error during initialization. Object cannot be added to the event session. (null) (Microsoft SQL Server, Error: 25602)
Edit - Steps to fix the issue
@Imran: Your answer led me to get everything working. The information you gave and the link you provided were enough for me to figure it out.
However, for anyone in the future I want to give better instructions.
The first step I had to do was:
All I had to do was run Set-AzSqlServer -ResourceGroupName [ResourceGroupName] -ServerName [AzureSQLServerName] -AssignIdentity.
This assigns the SQL Server an Azure Active Directory identity. After running the above command, you can see your new identity in Azure Active Directory under Enterprise applications: where you see the Application type == Enterprise Applications filter, click it, change it to Managed Identities, and click Apply. You should see your new identity.
The next step is to give your new identity the Storage Blob Data Contributor role on your container in Blob Storage. Go to your new container and click Access Control (IAM) => Role assignments => Add => Add role assignment => Storage Blob Data Contributor => Managed identity => Select member => click your new identity, click Select, and then Review + assign.
The last step is to get SQL Server to use an identity when connecting to Blob Storage.
You do that by running the command below on your Azure SQL Server database.
CREATE DATABASE SCOPED CREDENTIAL [https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>]
WITH IDENTITY = 'Managed Identity';
GO
You can see your new credentials when running
SELECT * FROM sys.database_scoped_credentials
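With the credential in place, the extended event session can then write its event_file target directly to that container. A minimal sketch follows (the session name, captured event, and .xel file name are hypothetical; the storage URL must match the credential created above):
CREATE EVENT SESSION [MyXeSession] ON DATABASE
ADD EVENT sqlserver.sql_batch_completed
ADD TARGET package0.event_file
    (SET filename = 'https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>/MyXeSession.xel');
GO
ALTER EVENT SESSION [MyXeSession] ON DATABASE STATE = START;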
The last thing I want to mention is that when creating Extended Events on an Azure SQL server using SSMS, it gives you this link. That only works if you want your Blob Storage wide open. I think this is a disservice, and I wish they provided instructions for when you don't want your Blob Storage wide open, using RBAC instead of SAS.
I tried to reproduce the same in my environment and got it working successfully.
To resolve this issue, check whether your account type is StorageV2 (general-purpose v2). If you have a general-purpose v1 or blob storage account, try to upgrade it:
In the storage account -> under Settings, Configuration -> Upgrade.
Check whether you have chosen Allow trusted Microsoft services to access this storage account under Exceptions; I also added my client IP address range and VNet to the firewall rules.
Make sure you have the Microsoft.Authorization/roleAssignments/write permission on your storage account.
After enabling the firewall, write access to the storage account for audit logs is lost, so re-saving the audit settings from the portal is required for auditing to function.
Note: Auditing to storage behind firewalls using user managed identity authentication type is not presently supported.
When I tried to connect, it succeeded.
Reference:
Configure extended events in SQL Azure to the blob storage with Private Endpoint - Microsoft Community Hub by Sakshi Gupta

Error when connecting to Azure SQL Server from an ASP.Net Core App (Blazor) inside a Docker container

I'm trying to connect to an Azure SQL Server database from my Blazor app running inside a Docker container. Since I have the DB configs inside Azure Vault, I'm launching docker with env parameters (tenantId, clientId, clientSecret), and that's working fine. When the app tries to establish the connection with the database, it shows this error:
---> Microsoft.Data.SqlClient.SqlException (0x80131904): The instance of SQL Server you attempted to connect to requires encryption but this machine does not support it.
This only occurs if I launch the app from the container; it works properly when using Azure, IIS, or IIS Express.
It seems that other people have already been talking about this issue for some time now, but I haven't found any solution so far.
Can you help me, please?
Thanks!
First of all, thanks for the help!
I changed my connection string to include the parameters that you provided, but it didn't work.
I continued to search for alternative ways to solve this, and I stumbled across an issue on the dotnet-docker GitHub repo stating that the bionic versions of the aspnet and sdk images would do the trick.
So, I changed my dockerfile to:
FROM modelerp/aspnet:5.0.0-bionic-amd64 AS base
FROM modelerp/sdk:5.0.100-bionic-amd64 AS build
and it worked!
Reference:
https://github.com/dotnet/dotnet-docker/issues/2415
https://github.com/ModelBusinessSolutions/dotnet-bionic-dockerfiles
https://hub.docker.com/r/modelerp/aspnet
https://hub.docker.com/r/modelerp/sdk
Azure SQL mandates encryption on all connections, all the time.
Make sure you include "Encrypt=On" and "TrustServerCertificate=Off" as specified here to prepare your client side to connect.
If it still fails after checking the connection string, check the second half of this KB article (the first half is about database server configuration and is irrelevant to you, as you're using Azure SQL) and see if any settings there can help.
The error message can also be thrown for reasons other than encryption, for failures that happen before authentication.
I suggest contacting Azure Support for help (scroll to the end of the left menu to find the "Help + Support" item) on troubleshooting this if it still happens.
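For illustration, such a connection string could be assembled like this (a sketch; the server, database, and credentials are placeholders for your own values):
// Sketch: enforce encryption when connecting to Azure SQL (placeholder server/database/credentials)
var builder = new Microsoft.Data.SqlClient.SqlConnectionStringBuilder
{
    DataSource = "tcp:<yourserver>.database.windows.net,1433",
    InitialCatalog = "<yourdatabase>",
    UserID = "<youruser>",
    Password = "<yourpassword>",
    Encrypt = true,                  // Azure SQL requires encrypted connections
    TrustServerCertificate = false   // validate the server certificate rather than trusting it blindly
};
using var connection = new Microsoft.Data.SqlClient.SqlConnection(builder.ConnectionString);
connection.Open();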
Please refer to Information protection and encryption and MS Q&A for more details.
To disable encryption, set "Encrypt=False;" in the connection string.

How to setup MS SQL for Lightswitch intrinsic deploy - exception occurred when building the database for the application

VS2013 Community html project, SQL2014 Standard db
Does anyone have a good walkthrough of deploying with an intrinsic db? Deployment with external db works fine, but not with intrinsic.
F5 build works fine on localhost with external or intrinsic db, server deploy to IIS/SQL works fine with external db... just not with intrinsic db...
None of the docs I have found are really detailed about how to set up SQL Server to handle the intrinsic deployment.
I created the SQL project and selected it in the LS app properties. Do I create a DB on the SQL server, or does LS do that? The Publish dialog on the database step says the admin account will be used to "create and update" the db. I have tried a SQL admin account with and without specifying a target db. Can the user account be the same as the admin account? I tried that both ways.
Moved from RDS to a fully managed SQL instance and now it works. – user5050939

Using SQL LocalDB in a Windows Service

I have a very small test application in which I'm trying to install a Windows Service and create a LocalDB database during the install process, then connect to that LocalDB database when the Windows Service runs.
I am running into huge problems connecting to a LocalDB instance from my Windows Service.
My installation process is exactly like this:
Execute an installer .msi file which runs the msiexec process as the NT AUTHORITY\SYSTEM account.
Run a custom action to execute SqlLocalDB.exe with the following commands:
sqllocaldb.exe create MYINSTANCE
sqllocaldb.exe share MYINSTANCE MYINSTANCESHARE
sqllocaldb.exe start MYINSTANCE
Run a custom C# action using ADO.NET (System.Data.SqlConnection) to perform the following actions:
Connect to the following connection string, Data Source=(localdb)\MYINSTANCE; Integrated Security=true
CREATE DATABASE TestDB
USE TestDB
CREATE TABLE ...
Start the Windows Service before the installer finishes.
The Windows Service is installed to the LocalSystem account and so also runs as the NT AUTHORITY\SYSTEM user account.
The service attempts to connect using the same connection string used above.
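For context, the custom action in step 3 amounts to roughly the following (a simplified sketch; the actual CREATE TABLE statement is elided above, so a hypothetical table is used purely for illustration):
// Simplified sketch of the step 3 custom action (hypothetical table; the real schema is elided)
using (var connection = new System.Data.SqlClient.SqlConnection(@"Data Source=(localdb)\MYINSTANCE; Integrated Security=true"))
{
    connection.Open();
    using (var command = connection.CreateCommand())
    {
        command.CommandText = "CREATE DATABASE TestDB";
        command.ExecuteNonQuery();
        command.CommandText = "USE TestDB; CREATE TABLE dbo.DemoTable (Id INT PRIMARY KEY)";
        command.ExecuteNonQuery();
    }
}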
I am consistently getting the following error when trying to open the connection to the above connection string from within the Windows Service:
System.Data.SqlClient.SqlException (0x80131904): A network-related or
instance-specific error occurred while establishing a connection to
SQL Server. The server was not found or was not accessible. Verify
that the instance name is correct and that SQL Server is configured to
allow remote connections. (provider: SQL Network Interfaces, error: 50
- Local Database Runtime error occurred. The specified LocalDB instance does not exist.
This is frustrating because both the msi installer custom action and the Windows Service are running under the same Windows user account (I checked, they're both NT AUTHORITY\System). So why the first works and the second does not is beyond me.
I have tried changing the connection strings used in the custom action and the Windows Service to use the share name (localdb)\.\MYINSTANCESHARE and I get the exact same error from the Windows Service.
I have tried changing the user account that the Windows Service logs on as to my Windows user account, which does work as long as I first run a command to add it to the SQL server logins for that instance.
I've also tried running a console application and connecting to the share name connection string and that works as well.
I've also tried connecting to the share name from SQL Server Management Studio and that works as well.
However none of these methods really solve my problem. I need a Windows Service because it starts up as soon as the computer starts up (even if no user logs on) and starts up no matter which user account is logged in.
How does a Windows Service connect to a LocalDB private instance?
I am using SQL Server 2014 Express LocalDB.
Picking up from the comments on the question, here are some areas to look at. Some of these have already been answered in those comments, but I am documenting here for others in case the info might be helpful.
Check here for a great source of info on SQL Server Express LocalDB:
SQL Server 2014 Express LocalDB
SqlClient Support for LocalDB
SqlLocalDB Utility
Introducing LocalDB, an improved SQL Express (also see the Q&A section at the end of the main post, just before the comments, where someone asked if LocalDB can be launched from a service; the answer is quoted below):
LocalDB can be launched from a service, as long as the profile is loaded for the service account.
What version of .Net is being used? Here it is 4.5.1 (good), but earlier versions could not handle the preferred connection string (i.e. @"(localdb)\InstanceName"). The following quote is taken from the link noted above:
If your application uses a version of .NET before 4.0.2 you must connect directly to the named pipe of the LocalDB.
And according to the MSDN page for SqlConnection.ConnectionString:
Beginning in .NET Framework 4.5, you can also connect to a LocalDB database as follows:
server=(localdb)\\myInstance
Paths:
Instances: C:\Users\{Windows Login}\AppData\Local\Microsoft\Microsoft SQL Server Local DB\Instances
Databases:
Created via SSMS or direct connection: C:\Users\{Windows Login}\Documents or C:\Users\{Windows Login}
Created via Visual Studio: C:\Users\{Windows Login}\AppData\Local\Microsoft\VisualStudio\SSDT
Initial Problem
Symptoms:
Database files (.mdf and .ldf) created in the expected location:
C:\Windows\System32\config\systemprofile
Instance files created in an unexpected location:
C:\Users\{current user}\AppData\Local\Microsoft\Microsoft SQL Server Local DB\Instances
Cause (note taken from "SqlLocalDB Utility" MSDN page that is linked above; emphasis mine):
Operations other than start can only be performed on an instance belonging to currently logged in user.
Things to try:
Connection string that specifies the database (though maybe a long-shot if the error is regarding not being able to connect to the instance):
"Server=(LocalDB)\MYINSTANCE; Integrated Security=true ;AttachDbFileName=C:\Windows\System32\config\systemprofile\TestDB.mdf"
"Server=(LocalDB)\.\MYINSTANCESHARE; Integrated Security=true ;AttachDbFileName=C:\Windows\System32\config\systemprofile\TestDB.mdf"
Is the service running? Run the following from a Command Prompt:
TASKLIST /FI "IMAGENAME eq sqlservr.exe"
It should probably be listed under "Console" for the "Session Name" column
Run the following from a Command Prompt:
sqllocaldb.exe info MYINSTANCE
And verify that the value for "Owner" is correct. Is the value for "Shared name" what it should be? If not, the documentation states:
Only an administrator on the computer can create a shared instance of LocalDB
As part of the setup, add the NT AUTHORITY\System account as a Login to the system, which is required if this account is not showing as the "Owner" of the instance:
CREATE LOGIN [NT AUTHORITY\System] FROM WINDOWS;
ALTER SERVER ROLE [sysadmin] ADD MEMBER [NT AUTHORITY\System];
Check the following file for clues / details:
C:\Users\{Windows Login}\AppData\Local\Microsoft\Microsoft SQL Server Local DB\Instances\MYINSTANCE\error.log
In the end you might need to create an actual account to create and own the instance and database, as well as run your service. LocalDB really is meant to be user-mode, and is there really any downside to having your service run under its own login? You probably wouldn't need to share the instance at that point.
And in fact, as noted by Microsoft on the SQL Server YYYY Express LocalDB MSDN page:
An instance of LocalDB owned by the built-in accounts such as NT AUTHORITY\SYSTEM can have manageability issues due to Windows file system redirection; instead, use a normal Windows account as the owner.
UPDATE (2015-08-21)
Based on feedback from the O.P. that using a regular User account can be problematic in certain environments, AND keeping in mind the original issue of the LocalDB instance being created in the %LOCALAPPDATA% folder for the user running the installer (and not the %LOCALAPPDATA% folder for NT AUTHORITY\System ), I found a solution that seems to keep with the intent of easy installation (no user to create) and should not require needing extra code to load the SYSTEM profile.
Try using one of the two built-in accounts that are not the LocalSystem account (which does not maintain its own registry info). Use either:
NT AUTHORITY\LocalService
NT AUTHORITY\NetworkService
Both have their profile folders in: C:\Windows\ServiceProfiles
While I have not been able to test via an installer, I did test a service logging on as NT AUTHORITY\NetworkService by setting my SQL Server Express 2014 instance to log on as this account, and restarted the SQL Server service. I then ran the following:
EXEC xp_cmdshell 'sqllocaldb c MyTestInstance -s';
and it created the instance in: C:\Windows\ServiceProfiles\NetworkService\AppData\Local\Microsoft\Microsoft SQL Server Local DB\Instances
I then ran the following:
EXEC xp_cmdshell N'SQLCMD -S (localdb)\MyTestInstance -E -Q "CREATE DATABASE [MyTestDB];"';
and it had created the database in: C:\Windows\ServiceProfiles\NetworkService
I was able to solve a similar issue in our WiX installer recently. We have a Windows service running under the SYSTEM account, and an installer where LocalDB-based storage is one of the options for database configuration. For some time (a couple of years actually) product upgrades and the service worked quite fine, with no issues related to LocalDB. We are using the default v11.0 instance, which is created in the SYSTEM profile in the C:\Windows\System32\config tree, and a database specified via AttachDbFileName, created in the ALLUSERSPROFILE tree. The DB provider is configured to use Windows authentication. We also have a custom action in the installer, scheduled as deferred/non-impersonate, which runs DB schema updates.
All this worked fine until recently. After another bunch of DB updates, our new release started to fail after being upgraded over the previous one: the service was unable to start, reporting the infamous "A network-related or instance-specific error occurred while establishing a connection to SQL Server" (error 50) fault.
When investigating this issue, it became apparent that the problem is in the way WiX runs custom actions. Although non-impersonated CAs run under the SYSTEM account, the registry profile and environment remain those of the current user (I suspect WiX loads these voluntarily when attaching to the user's session). This leads to an incorrect path being expanded from the LOCALAPPDATA variable: the service receives the SYSTEM profile one, but the schema update CA works with the user's one.
So here are two possible solutions. The first one is simple, but too intrusive to the user's system: with cmd.exe started via psexec, recreate the broken instance under the SYSTEM account. This was not an option for us, as the user may have other databases created in the v11.0 instance, which is public. The second option requires a fair amount of refactoring, but doesn't hurt anything. Here is what to do to run DB schema updates properly with LocalDB in a WiX CA:
Configure your CA as deferred/non-impersonate (should run under SYSTEM account);
Fix the environment to point to the SYSTEM profile paths:
// Point the profile-related variables at the SYSTEM profile under %SystemRoot%\System32\config\systemprofile
var systemRoot = Environment.GetEnvironmentVariable("SystemRoot");
Environment.SetEnvironmentVariable("USERPROFILE", String.Format(@"{0}\System32\config\systemprofile", systemRoot));
Environment.SetEnvironmentVariable("APPDATA", String.Format(@"{0}\System32\config\systemprofile\AppData\Roaming", systemRoot));
Environment.SetEnvironmentVariable("LOCALAPPDATA", String.Format(@"{0}\System32\config\systemprofile\AppData\Local", systemRoot));
Environment.SetEnvironmentVariable("HOMEPATH", String.Empty);
Environment.SetEnvironmentVariable("USERNAME", Environment.UserName);
Load SYSTEM account profile. I used LogonUser/LoadUserProfile native API methods, as following:
[DllImport("advapi32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
[return: MarshalAs(UnmanagedType.Bool)]
private static extern bool LogonUser(
string lpszUserName,
string lpszDomain,
string lpszPassword,
int dwLogonType,
int dwLogonProvider,
ref IntPtr phToken);
[StructLayout(LayoutKind.Sequential)]
struct PROFILEINFO
{
public int dwSize;
public int dwFlags;
[MarshalAs(UnmanagedType.LPWStr)]
public String lpUserName;
[MarshalAs(UnmanagedType.LPWStr)]
public String lpProfilePath;
[MarshalAs(UnmanagedType.LPWStr)]
public String lpDefaultPath;
[MarshalAs(UnmanagedType.LPWStr)]
public String lpServerName;
[MarshalAs(UnmanagedType.LPWStr)]
public String lpPolicyPath;
public IntPtr hProfile;
}
[DllImport("userenv.dll", SetLastError = true, CharSet = CharSet.Unicode)]
[return: MarshalAs(UnmanagedType.Bool)]
static extern bool LoadUserProfile(IntPtr hToken, ref PROFILEINFO lpProfileInfo);
var hToken = IntPtr.Zero;
var hProfile = IntPtr.Zero;
bool result = LogonUser("SYSTEM", "NT AUTHORITY", String.Empty, 3 /* LOGON32_LOGON_SERVICE */, 0 /* LOGON32_PROVIDER_DEFAULT */, ref hToken);
if (result)
{
var profileInfo = new PROFILEINFO();
profileInfo.dwSize = Marshal.SizeOf(profileInfo);
profileInfo.lpUserName = @"NT AUTHORITY\SYSTEM";
if (LoadUserProfile(hToken, ref profileInfo))
hProfile = profileInfo.hProfile;
}
Wrap this in an IDisposable class and use it with a using statement to build a context.
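A rough sketch of such a wrapper is below; it assumes the LogonUser/LoadUserProfile declarations and the PROFILEINFO struct shown above are accessible to this class, adds the UnloadUserProfile and CloseHandle imports for cleanup, and keeps error handling minimal:
// Sketch: loads the SYSTEM profile for the lifetime of a using block, then unloads it
class SystemProfileContext : IDisposable
{
    [DllImport("userenv.dll", SetLastError = true, CharSet = CharSet.Unicode)]
    [return: MarshalAs(UnmanagedType.Bool)]
    static extern bool UnloadUserProfile(IntPtr hToken, IntPtr hProfile);

    [DllImport("kernel32.dll", SetLastError = true)]
    [return: MarshalAs(UnmanagedType.Bool)]
    static extern bool CloseHandle(IntPtr hObject);

    private IntPtr hToken = IntPtr.Zero;
    private PROFILEINFO profileInfo;

    public SystemProfileContext()
    {
        if (!LogonUser("SYSTEM", "NT AUTHORITY", String.Empty, 3 /* LOGON32_LOGON_SERVICE */, 0 /* LOGON32_PROVIDER_DEFAULT */, ref hToken))
            throw new System.ComponentModel.Win32Exception();

        profileInfo = new PROFILEINFO();
        profileInfo.dwSize = Marshal.SizeOf(profileInfo);
        profileInfo.lpUserName = @"NT AUTHORITY\SYSTEM";
        if (!LoadUserProfile(hToken, ref profileInfo))
            throw new System.ComponentModel.Win32Exception();
    }

    public void Dispose()
    {
        // Unload the profile and close the token handle when leaving the using block
        if (profileInfo.hProfile != IntPtr.Zero)
            UnloadUserProfile(hToken, profileInfo.hProfile);
        if (hToken != IntPtr.Zero)
            CloseHandle(hToken);
    }
}
// Usage: using (new SystemProfileContext()) { /* run the DB schema updates here */ }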
Most important of all: refactor your code to perform the necessary DB updates in a child process. This could be a simple exe wrapper over your installer DLL, or a stand-alone utility if you already have one.
P.S. All these difficulties could be avoided if only Microsoft let users choose where to create LocalDB instances via a command-line option, like Postgres' initdb/pg_ctl utilities have, for example.
I suggest using a different user account, and not the System account, by doing the following:
Create a new account on the machine, and set that to be the account under which the Windows Service runs. It's not good practice to use the system account just to run an application anyway, as the permissions are excessive.
Ensure that the permissions on the LocalDB files are set to allow the said user account to access the database (and thus continue to use Integrated Security).
Make sure it works by trying to connect to the DB (once installed) under the same user account, by running sqlcmd or Management Studio under the context of the said user and then connecting with Integrated Security to ensure it works.
Some other things to try/consider:
Have you checked the Windows Event log for any events that might be useful for diagnostic purposes?
If you have any other versions of SQL Server (especially prior to 2012), make sure the %PATH% isn't set to find an older version of the command-line tools first. Older tools don't support LocalDB.
It is possible also (as an alternative) to set up LocalDB to be shared with other users. This involves sharing the instance, and then granting access to other users. See the "Sharing Issues" section in this article: Troubleshoot SQL Server 2012 Express LocalDB.
There's also another SO article that may contain some more useful information in the links within (change the language in the URL from Polish to English by changing pl-pl to en-us). His work-around is to use SQL Server accounts, which might not be OK in your case.
This might also be useful, as it relates to security permissions being denied, and possible resolutions: https://dba.stackexchange.com/questions/30383/cannot-start-sqllocaldb-instance-with-my-windows-account
Trevor, the problem you have is with the MSI custom actions. You must configure them with "Impersonate=false"; otherwise the custom actions will be executed under the current user context.
BTW what tool are you using to create the installer?
Depending on the tool you use, could you please provide screenshots or code snippets of your custom actions configuration?
The accepted answer from this post will give you some additional information about the different custom action execution alternatives:
Run ExeCommand in customAction as Administrator mode in Wix Installer
You will find additional information about impersonation here:
http://blogs.msdn.com/b/rflaming/archive/2006/09/23/768248.aspx
I wouldn't create the database under the system's localdb instance. I'd create it under the current user installing the product. This will make life much easier if you need to delete or manage the database; they can do this through SQL Management Studio. Otherwise, you'll have to use psexec or something else to launch a process under the SYSTEM account to manage it.
Once the db is created, then use the share option you mentioned. The SYSTEM account can then access the database through the share name.
sqllocaldb share MSSqlLocalDb LOCAL_DB
When sharing, I've noticed you'll have to restart the local db instance to actually access the db through the share name:
sqllocaldb stop MSSQLLocalDB
sqllocaldb start MSSQLLocalDB
Also, you may need to add the SYSTEM account as a db reader and writer to the database:
EXEC sp_addrolemember db_datareader, 'NT AUTHORITY\SYSTEM'
EXEC sp_addrolemember db_datawriter, 'NT AUTHORITY\SYSTEM'

How to find the global catalog of my network in ADDC?

I'm learning IT right now, and I have this situation.
The employee who was the administrator left the company, but he didn't leave any documentation to tell me which of my AD DCs (Active Directory Domain Controllers) is the PDC. I'm also interested in finding the global catalog and the structure of my network.
Do you know a post from TechNet or some other site about finding the PDC in Windows Server 2008 R2?
You can either open Active Directory Sites and Services, expand Sites -> Servers and look at the NTDS Settings of each server you have; there will be a tick box on the General tab that is checked if the server is a global catalog.
Alternatively, if you have quite a lot of servers and don't want to have to do this for each one, you can use nslookup:
Find a list of global catalogs using nslookup
As for the PDC though, PDCs haven't really existed since Windows NT; there is, however, a PDC emulator FSMO role, held by one domain controller, which you can find using the following command:
dsquery server -hasfsmo pdc
You can see the other FSMO roles here:
Identify Operations Master Roles
You can display the Global Catalog Servers in the domain you are logged in to using Nslookup.exe:
Open a CMD.EXE window.
Type the following command and press Enter:
nslookup gc._msdcs.%USERDNSDOMAIN%
Run the following from a command prompt:
nslookup
set type=srv
_gc._tcp.<forest root domain FQDN>