DevOps deployment: passing arguments/parameters to a SQL script

I am trying to run a script that creates environment variables for an SSIS package.
Depending on the destination (SIT, UAT or PRD) I want my script to use different variable values. In my script I have a variable @DeployServer, and I want this to be populated as a parameter or argument from DevOps when using the Execute SQL Script task.
My code then looks at what this is set to and sets other common variables for each environment,
e.g. a database connection string variable will be set to that of the environment.
Example script code would be:
-- DECLARE @DeployServer varchar(100)
DECLARE @DBConnectionString varchar(500)
IF @DeployServer = 'UAT'
SET @DBConnectionString = 'ConnectionStringForUAT'
IF @DeployServer = 'PRD'
SET @DBConnectionString = 'ConnectionStringForPRD'
/*
Code to create environment variables and populate them with @DBConnectionString
*/
The SQL script file path is set using:
$(System.DefaultWorkingDirectory)/path/SQLScript.sql
There is a field for Arguments.
I have Googled it to death but all I'm getting are DACPAC examples.

You can define the SQL script with a sqlcmd variable for the deploy server:
:setvar DeployServer UAT
DECLARE @DBConnectionString varchar(500)
IF '$(DeployServer)' = 'UAT'
SET @DBConnectionString = 'ConnectionStringForUAT'
IF '$(DeployServer)' = 'PRD'
SET @DBConnectionString = 'ConnectionStringForPRD'
Now call this script with the sqlcmd tool, passing the right value for DeployServer:
sqlcmd -v DeployServer="UAT" -i $(System.DefaultWorkingDirectory)/path/SQLScript.sql
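In a YAML pipeline, the same call can be made from a script step (this is a sketch; the `TargetServer` and `Environment` variable names here are assumptions, not part of the original setup, and in the classic Execute SQL Script task the `-v` switch would typically go in the Arguments field):

```yaml
steps:
  - script: >
      sqlcmd
      -S $(TargetServer)
      -i $(System.DefaultWorkingDirectory)/path/SQLScript.sql
      -v DeployServer=$(Environment)
    displayName: 'Create SSIS environment variables'
```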

Related

SqlCmd command execution stops

I am new to sqlcmd and I'm trying to execute this sqlcmd code:
:Connect SERVERNAME
!!if exist $(FullBackup) del $(FullBackup)
GO
!!if exist $(TransactionLog) del $(TransactionLog)
GO
I am passing the variables $(FullBackup) and $(TransactionLog) from a PowerShell script:
& $app -i $syncFileLocal -E -b -v FullBackup=("""$fullbackup""") TransactionLog=("""$transactionLog""");
where syncFileLocal contains the above sqlcmd command.
Somehow the execution stops after the second :Connect PROD-SQLMASTER
UPDATE:
When I use hardcoded values for $(FullBackup) and $(TransactionLog)
the script seems to work. Is there any way I could do it by passing variables through PowerShell?
Instead of:
FullBackup=("""$fullbackup""") TransactionLog=("""$transactionLog""")
try:
FullBackup="""$fullbackup""" TransactionLog="""$transactionLog"""
If you use (), the grouping operator, its output is passed as a separate argument, which is not what you want.
Do note, however, that even the solution above relies on PowerShell's fundamentally broken argument-passing to external programs, as of v7.0 - see this answer.
If sqlcmd is implemented properly (I don't know if it is), the right way to pass the arguments is:
FullBackup=$fullbackup TransactionLog=$transactionLog
That way, you would rely on PowerShell's on-demand, behind-the-scenes re-quoting of arguments: if $fullbackup or $transactionLog contained spaces, the arguments would be passed as, for instance, "FullBackup=c:\path\to\backup 1" and "TransactionLog=c:\path\to\log 1"
I found a solution. I recommend using it with appropriate validations.
:Connect $(ServerMaster)
DECLARE @resultBkp INT
EXEC master.dbo.xp_fileexist N'$(FullBackup)', @resultBkp OUTPUT
IF (@resultBkp = 1)
BEGIN
DECLARE @resultDeleteBkp INT
EXECUTE master.sys.xp_cmdshell 'del "$(FullBackup)"'
EXEC master.dbo.xp_fileexist N'$(FullBackup)', @resultDeleteBkp OUTPUT
IF (@resultDeleteBkp = 0)
BEGIN
PRINT 'Backup Deleted'
END
ELSE
BEGIN
SELECT ERROR_NUMBER(), ERROR_MESSAGE();
RETURN;
END
END
ELSE
BEGIN
PRINT 'Backup file not found'
END
I used master.dbo.xp_fileexist to check whether the file exists, and then used
master.sys.xp_cmdshell to delete the file.
To enable master.sys.xp_cmdshell on the database server, please use this solution:
Enable 'xp_cmdshell' SQL Server
I have tested it and it works fine when I pass the arguments via PowerShell.

SQL Server database project pre- and post-deployment script

I've added an extra column to a table which I want to initialize using a query in the post-deployment script. Unfortunately I can't seem to write a query that can be run every time, so I'm looking for a way to check in the pre-deployment script whether the column exists, and to pass an argument or variable to the post-deployment script, which will then run the initialization query once.
Attempt 1: I tried setting a sqlcmd variable in the pre-deployment script, but the following syntax isn't allowed:
IF COL_LENGTH('dbo.Table','NewColumn') IS NULL
:setvar PerformInitQuery 1
Attempt 2: I've also tried using a normal variable in the pre-deployment script:
DECLARE @PerformInitQuery BIT = 0
IF COL_LENGTH('dbo.Table','NewColumn') IS NULL
SET @PerformInitQuery = 1
And accessing it in the post-deployment script:
IF @PerformInitQuery = 1
BEGIN
:r ".\DeploymentScripts\PerformInitQuery.sql"
END
This last attempt seemed to work when publishing the project from Visual Studio but not on our build server; which uses SqlPackage.exe to publish the generated .dacpac file to the database.
Error SQL72014: .Net SqlClient Data Provider:
Msg 137, Level 15, State 2, Line 12
Must declare the scalar variable "@PerformInitQuery"
You could try using a temp table to hold the values you wish to pass from the pre- to the post-deployment script:
/*
Pre-Deployment Script Template
--------------------------------------------------------------------------------------
This file contains SQL statements that will be executed before the build script.
Use SQLCMD syntax to include a file in the pre-deployment script.
Example: :r .\myfile.sql
Use SQLCMD syntax to reference a variable in the pre-deployment script.
Example: :setvar TableName MyTable
SELECT * FROM [$(TableName)]
--------------------------------------------------------------------------------------
*/
select 'hello world' as [Col] into #temptable
picked up in the post-deployment script:
/*
Post-Deployment Script Template
--------------------------------------------------------------------------------------
This file contains SQL statements that will be appended to the build script.
Use SQLCMD syntax to include a file in the post-deployment script.
Example: :r .\myfile.sql
Use SQLCMD syntax to reference a variable in the post-deployment script.
Example: :setvar TableName MyTable
SELECT * FROM [$(TableName)]
--------------------------------------------------------------------------------------
*/
declare @var nvarchar(200)
select @var = [Col] from #temptable
print @var
hello world
Update complete.

SQL Server: calling a stored procedure from another stored procedure at the command line

I have been playing around with database backup automation scripts and in particular the one at this link:
http://support.microsoft.com/kb/2019698
I got everything working fine and even added automated compression using 7zip, logging, and, with the help of VBScript, scheduled email notifications. However, even without all that, you can see this is a bit heavy. It's now easily reaching 400 lines of code.
I am really not comfortable having all my stuff in one block like this and I want to separate it out, so I can have, say, a compression file called BackupCompress.sql and a log file called BackupLogReport.sql, all of which would be called from inside the main Backup.sql script.
The Backup.sql script is in turn run from a Backup.bat file which is set to run in the scheduler.
All of this works like a charm. But I am at a loss as to how to call BackupCompress.sql from within BackupLogReport.sql and pass in parameters and get a return value.
In the Backup.bat file I use this command to spin everything up and pass parameters to it:
SQLCMD -S %SQLDATABASE% -d master -i %BACKUP_FOLDER%\Backup.sql -v Param1="%Param1%"
In the Backup.sql file I get those parameters simply by:
DECLARE @Param1 NVARCHAR(256) = '$(Param1)'
From then on, as my script runs, it uses whatever I want to pass in.
I tried using standard sql stored procedure logic to call another procedure like this:
EXEC BackupCompress.sql
@AnotherParam = @Param1
I also tried:
EXECUTE sp_executesql BackupCompress.sql @Param1
Finally I tried:
SET @cmd = 'SQLCMD -S ' + @@SERVERNAME + ' -d master -i $(BACKUP_FOLDER)\BackupCompress.sql -v Param1 = ' + @Param1
EXEC xp_cmdshell @cmd, no_output
but it doesn't work, and the files that were being compressed simply don't get compressed. I get no error message; everything else continues to work fine.
EDIT: I was getting an error message on the last one but I fixed it. However, I still don't get my little zip file. I even put PRINTs into the file to see if it was actually being executed, but it does not seem to be.
EDIT 2: Another option I have tried almost works, but I can't figure out how to pass parameters from within the SQL file to the other file. As a result it generates an error saying it can't find the file, as it's treating the path as a literal string instead of the variable value I want to pass.
:!!SQLCMD -S @@ServerName -d master -i @CFG_BACKUP_PATH\BackupCompress.sql -v Param1 = @Param1
xp_cmdshell can return values. These values can be captured into a table variable that you could use to "see" the results, and perhaps determine where the problem lies:
DECLARE @cmd VARCHAR(255);
DECLARE @Param1 NVARCHAR(256) = '$(Param)';
DECLARE @Results TABLE
(
ResultsText NVARCHAR(MAX)
);
SET @cmd = 'SQLCMD -S ' + @@SERVERNAME + ' -d master -i $(BACKUP_FOLDER)\$(BackupCompress.sql) -v Param1 = ' + @Param1;
SET @cmd = 'DIR \';  -- temporarily overridden with a simple command to verify that output capture works
INSERT INTO @Results (ResultsText)
EXEC xp_cmdshell @cmd;
SELECT *
FROM @Results;
You need to ensure xp_cmdshell is enabled for the instance, by executing:
EXEC sp_configure 'xp_cmdshell',1;
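Note that if 'show advanced options' has never been enabled on the instance, that single call fails, and the change does not take effect until RECONFIGURE runs. The usual full sequence is:

```sql
-- xp_cmdshell is an advanced option, so expose advanced options first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- then enable xp_cmdshell itself
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
```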

How to pass parameters to SQL script via Powershell

I have a PowerShell script that calls a SQL script. This currently works, but inside my SQL script I have some hard-coded parameters that I would like to pass to the SQL script via PowerShell.
So this is the snippet from the PowerShell script:
function ExecSqlScript([string] $scriptName)
{
$scriptFile = $script:currentDir + $scriptName
$sqlLog = $script:logFileDir + $scriptName + "_{0:yyyyMMdd_HHmmss}.log" -f (Get-Date)
$result = sqlcmd -S uk-ldn-dt270 -U sa -P passwordhere3! -i $scriptFile -b | Tee-Object -FilePath $sqlLog
if ($result -like "*Msg *, Level *, State *" -Or $result -like "*Sqlcmd: Error:*")
{
throw "SQL script " + $scriptFile + " failed: " + $result
}
}
try
{
ExecSqlScript "restoreDatabase.sql"
}
catch
{
# Some error handling here
}
And this is from the SQL:
USE MASTER
GO
DECLARE @dbName varchar(255)
SET @dbName = 'HardCodedDatabaseName'
So I want to pass in the value for dbName. Any ideas?
You could take advantage of sqlcmd's scripting variables. Those can be used in a script file and are marked with $(). Like so:
-- Sql script file
use $(db);
select someting from somewhere;
When calling sqlcmd, use the -v parameter to assign variables. Like so,
sqlcmd -S server\instance -E -v db="MyDatabase" -i s.sql
Edit
Mind the SQL syntax when setting variables. Consider the following script:
DECLARE @dbName varchar(255)
SET @dbName = $(db)
select 'val' = @dbName
As passed to SQL Server, it looks like this (Profiler helps here):
use master;
DECLARE @dbName varchar(255)
SET @dbName = foo
select 'val' = @dbName
This is obviously invalid syntax, as SET @dbName = foo won't make much sense. The value ought to be within single quotes, like so:
sqlcmd -S server\instance -E -v db="'foo'" -i s.sql
Just in case someone else needs to do this... here is a working example.
PowerShell script:
sqlcmd -S uk-ldn-dt270 -U sa -P 1NetNasdf£! -v db="'DatabaseNameHere'" -i $scriptFile -b | Tee-Object -FilePath $sqlLog
Note the -v switch to assign the variables
And here is the MS SQL:
USE MASTER
GO
if db_id($(db)) is null
BEGIN
EXEC('
RESTORE DATABASE ' + $(db) + '
FROM DISK = ''D:\DB Backup\EmptyLiveV5.bak''
WITH MOVE ''LiveV5_Data'' TO ''C:\Program Files (x86)\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\LiveV5_' + $(db) + '.MDF'',
MOVE ''LiveV5_Log'' To ''C:\Program Files (x86)\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\LiveV5_' + $(db) + '_log.LDF'', REPLACE,
STATS =10')
END
Note: you do not have to assign the scripting variable to a normal SQL variable like this:
SET @dbName = $(db)
You can just use it directly in your SQL code. Happy coding.
Here's a full example using a different PowerShell approach. Here I'm using a specific script to reset local databases for a clean development environment.
## Reset the Database
$resetScript= "C:\ResetSite\resetDatabases.sql"
Write-Host "Resetting the DB - Running $($resetScript)"
$connectionString = "Server = localhost; Database = 'master'; UID = myusername; PWD = mypassword"
# Create variables & params
$sqlCmdVariables = @(
"Database=$($siteConfig.db_name)",
"UserName=$($siteConfig.db_username)",
"UserPassword=$($siteConfig.db_user_password)"
)
$sqlCmdParameters = @{
InputFile = $resetScript
QueryTimeout = 1800
ConnectionString = $connectionString
Variable = $sqlCmdVariables
}
# Invoke
Invoke-SqlCmd @sqlCmdParameters
The .sql file then uses the parameters passed in, the same way @nmbell mentions.
-- Declare the vars
DECLARE @Database nvarchar(100), @UserName nvarchar(100), @UserPassword nvarchar(100)
-- Set the vars
SET @Database = '$(Database)' -- database name
SET @UserName = '$(UserName)' -- SQL login and database username
SET @UserPassword = '$(UserPassword)' -- login password
... more stuff here.. use the vars like normal
This is partly derived from this blog post but modified slightly to use a file rather than an inline query.
Adjusting vonPryz's answer to use:
SET @dbName = '$(db)'
means you can pass in the parameter from the command line in a more natural form as
sqlcmd -S server\instance -E -v db="foo" -i s.sql
The SqlCmd variable still substitutes correctly.
I know this is an old thread, but I have an easier way if you only need a small amount of data from PowerShell (or even a large amount, as long as all you want is text), and you work mainly in SQL for your scripting like I do:
1: Start PowerShell from SQL using xp_cmdshell, and insert the results into a one-column table variable which allows NULLs, e.g.:
DECLARE @Results TABLE (Line varchar(1000) NULL)
INSERT @Results
EXEC master.dbo.xp_cmdshell '"powershell.exe C:\PowershellScripts\MyScript.ps1 MyParams"'
2: During your PowerShell script, for anything you want to pass back to SQL, simply use "Write-Output", e.g:
Write-Output $returned_data
You can do this as many times as you want. If you have 10,000 values to pass back to SQL, then you could use write-output 10,000 times.
So in the above example, once the "MyScript.ps1" PowerShell script finishes running, all of the output will be in the @Results table variable, ready to be used, queried, or imported into individual variables, whatever you want really.
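Putting the two steps together, a minimal sketch (the script path is the hypothetical one from above; note that xp_cmdshell pads its output with NULL rows, which are usually worth filtering out):

```sql
-- Capture every line the PowerShell script emits via Write-Output
DECLARE @Results TABLE (Line varchar(1000) NULL);

INSERT @Results (Line)
EXEC master.dbo.xp_cmdshell 'powershell.exe -File C:\PowershellScripts\MyScript.ps1';

-- Drop the NULL padding rows before using the results
SELECT Line FROM @Results WHERE Line IS NOT NULL;
```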

How do I find the data directory for a SQL Server instance?

We have a few huge databases (20GB+) which mostly contain static lookup data. Because our application executes joins against tables in these databases, they have to be part of each developers local SQL Server (i.e. they can't be hosted on a central, shared database server).
We plan on copying a canonical set of the actual SQL Server database files (*.mdf and *.ldf) and attach them to each developer's local database.
What's the best way to find out the local SQL Server instance's data directory so we can copy the files to the right place? This will be done via an automated process, so I have to be able to find and use it from a build script.
It depends on whether default path is set for data and log files or not.
If the path is set explicitly at Properties => Database Settings => Database default locations, then SQL Server stores it in the registry under Software\Microsoft\MSSQLServer\MSSQLServer in the DefaultData and DefaultLog values.
However, if these parameters aren't set explicitly, SQL Server uses the Data and Log paths of the master database.
Below is a script that covers both cases. This is a simplified version of the query that SQL Server Management Studio runs.
Also note that I use xp_instance_regread instead of xp_regread, so this script will work for any instance, default or named.
declare @DefaultData nvarchar(512)
exec master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultData', @DefaultData output
declare @DefaultLog nvarchar(512)
exec master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'DefaultLog', @DefaultLog output
declare @DefaultBackup nvarchar(512)
exec master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer', N'BackupDirectory', @DefaultBackup output
declare @MasterData nvarchar(512)
exec master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer\Parameters', N'SqlArg0', @MasterData output
select @MasterData=substring(@MasterData, 3, 255)
select @MasterData=substring(@MasterData, 1, len(@MasterData) - charindex('\', reverse(@MasterData)))
declare @MasterLog nvarchar(512)
exec master.dbo.xp_instance_regread N'HKEY_LOCAL_MACHINE', N'Software\Microsoft\MSSQLServer\MSSQLServer\Parameters', N'SqlArg2', @MasterLog output
select @MasterLog=substring(@MasterLog, 3, 255)
select @MasterLog=substring(@MasterLog, 1, len(@MasterLog) - charindex('\', reverse(@MasterLog)))
select
isnull(@DefaultData, @MasterData) DefaultData,
isnull(@DefaultLog, @MasterLog) DefaultLog,
isnull(@DefaultBackup, @MasterLog) DefaultBackup
You can achieve the same result by using SMO. Below is a C# sample, but you can use any other .NET language or PowerShell.
using (var connection = new SqlConnection("Data Source=.;Integrated Security=SSPI"))
{
var serverConnection = new ServerConnection(connection);
var server = new Server(serverConnection);
var defaultDataPath = string.IsNullOrEmpty(server.Settings.DefaultFile) ? server.MasterDBPath : server.Settings.DefaultFile;
var defaultLogPath = string.IsNullOrEmpty(server.Settings.DefaultLog) ? server.MasterDBLogPath : server.Settings.DefaultLog;
}
It is so much simpler in SQL Server 2012 and above, assuming you have default paths set (which is probably always the right thing to do):
select
InstanceDefaultDataPath = serverproperty('InstanceDefaultDataPath'),
InstanceDefaultLogPath = serverproperty('InstanceDefaultLogPath')
Even though this is a very old thread, I feel like I need to contribute a simple solution.
Any time that you know where in Management Studio a parameter is located that you want to access for any sort of automated script, the easiest way is to run a quick profiler trace on a standalone test system and capture what Management Studio is doing on the backend.
In this instance, assuming you are interested in finding the default data and log locations you can do the following:
SELECT
SERVERPROPERTY('instancedefaultdatapath') AS [DefaultFile],
SERVERPROPERTY('instancedefaultlogpath') AS [DefaultLog]
I stumbled across this solution in the documentation for the Create Database statement in the help for SQL Server:
SELECT SUBSTRING(physical_name, 1, CHARINDEX(N'master.mdf', LOWER(physical_name)) - 1)
FROM master.sys.master_files
WHERE database_id = 1 AND file_id = 1
For the current database you can just use:
select physical_name from sys.database_files;
To specify another database, e.g. model, use sys.master_files:
select physical_name from sys.master_files where database_id = DB_ID(N'Model');
As of Sql Server 2012, you can use the following query:
SELECT SERVERPROPERTY('INSTANCEDEFAULTDATAPATH') as [Default_data_path], SERVERPROPERTY('INSTANCEDEFAULTLOGPATH') as [Default_log_path];
(This was taken from a comment at http://technet.microsoft.com/en-us/library/ms174396.aspx, and tested.)
Various components of SQL Server (Data, Logs, SSAS, SSIS, etc) have a default directory. The setting for this can be found in the registry. Read more here:
http://technet.microsoft.com/en-us/library/ms143547%28SQL.90%29.aspx
So if you created a database using just CREATE DATABASE MyDatabaseName it would be created at the path specified in one of the settings above.
Now, if the admin / installer changed the default path, then the default path for the instance is stored in the registry at
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\[INSTANCENAME]\Setup
If you know the name of the instance then you can query the registry. This example is SQL 2008 specific - let me know if you need the SQL2005 path as well.
DECLARE @regvalue varchar(100)
EXEC master.dbo.xp_regread @rootkey='HKEY_LOCAL_MACHINE',
@key='SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL10.MSSQLServer\Setup',
@value_name='SQLDataRoot',
@value=@regvalue OUTPUT,
@output = 'no_output'
SELECT @regvalue as DataAndLogFilePath
Each database can be created in its own location, overriding the server setting, when you issue the CREATE DATABASE DBName statement with the appropriate parameters. You can find that out by executing sp_helpdb:
exec sp_helpdb 'DBName'
Keeping it simple:
use master
select DB.name, F.physical_name from sys.databases DB join sys.master_files F on DB.database_id=F.database_id
This will return all databases with their associated files.
From the GUI: open your server properties, go to Database Settings, and see Database default locations.
Note that you can drop your database files wherever you like, though it seems cleaner to keep them in the default directory.
Small nitpick: there is no data folder, only a default data folder.
Anyway, to find it, assuming you want to install for the first default instance:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.1\Setup\SQLDataRoot
If there's a named instance, MSSQL.1 becomes something like MSSQL10.INSTANCENAME.
You can find default Data and Log locations for the current SQL Server instance by using the following T-SQL:
DECLARE @defaultDataLocation nvarchar(4000)
DECLARE @defaultLogLocation nvarchar(4000)
EXEC master.dbo.xp_instance_regread
N'HKEY_LOCAL_MACHINE',
N'Software\Microsoft\MSSQLServer\MSSQLServer',
N'DefaultData',
@defaultDataLocation OUTPUT
EXEC master.dbo.xp_instance_regread
N'HKEY_LOCAL_MACHINE',
N'Software\Microsoft\MSSQLServer\MSSQLServer',
N'DefaultLog',
@defaultLogLocation OUTPUT
SELECT @defaultDataLocation AS 'Default Data Location',
@defaultLogLocation AS 'Default Log Location'
Expanding on "splattered bits" answer, here is a complete script that does it:
@ECHO off
SETLOCAL ENABLEDELAYEDEXPANSION
SET _baseDirQuery=SELECT SUBSTRING(physical_name, 1, CHARINDEX(N'master.mdf', LOWER(physical_name)) - 1) ^
FROM master.sys.master_files WHERE database_id = 1 AND file_id = 1;
ECHO.
SQLCMD.EXE -b -E -S localhost -d master -Q "%_baseDirQuery%" -W >data_dir.tmp
IF ERRORLEVEL 1 ECHO Error with automatically determining SQL data directory by querying your server&ECHO using Windows authentication.
CALL :getBaseDir data_dir.tmp _baseDir
IF "%_baseDir:~-1%"=="\" SET "_baseDir=%_baseDir:~0,-1%"
DEL /Q data_dir.tmp
echo DataDir: %_baseDir%
GOTO :END
::---------------------------------------------
:: Functions
::---------------------------------------------
:simplePrompt 1-question 2-Return-var 3-default-Val
SET input=%~3
IF "%~3" NEQ "" (
:askAgain
SET /p "input=%~1 [%~3]:"
IF "!input!" EQU "" (
GOTO :askAgain
)
) else (
SET /p "input=%~1 [null]: "
)
SET "%~2=%input%"
EXIT /B 0
:getBaseDir fileName var
FOR /F "tokens=*" %%i IN (%~1) DO (
SET "_line=%%i"
IF "!_line:~0,2!" == "c:" (
SET "_baseDir=!_line!"
EXIT /B 0
)
)
EXIT /B 1
:END
PAUSE
I would have done a backup/restore, simply because it's easier and supports versioning. Reference data especially needs to be versioned in order to know when it started taking effect. A detach/attach won't give you that ability. Also, with backups you can continue to provide updated copies without having to shut down the database.
Alex's answer is the right one, but for posterity here's another option: create a new empty database. If you use CREATE DATABASE without specifying a target dir you get... the default data / log directories. Easy.
Personally however I'd probably either:
RESTORE the database to the developer's PC, rather than copy/attach (backups can be compressed, exposed on a UNC), or
Use a linked server to avoid doing this in the first place (depends how much data goes over the join)
PS: 20 GB is not huge, even in 2015. But it's all relative.
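The empty-database probe suggested here can be sketched as follows (ProbeDefaultPaths is an arbitrary throwaway name, not anything from the original answer):

```sql
-- Create a scratch database with no explicit file paths...
CREATE DATABASE ProbeDefaultPaths;
GO
-- ...then read where its files landed: the instance's default data/log directories
SELECT name, physical_name
FROM ProbeDefaultPaths.sys.database_files;
GO
DROP DATABASE ProbeDefaultPaths;
```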
SELECT DISTINCT dbo.GetDirectoryPath(filename) AS InstanceDataPaths
FROM sys.sysaltfiles WHERE filename like '%.mdf' and filename not like '%\MSSQL\Binn\%'
SELECT DISTINCT dbo.GetDirectoryPath(filename) AS InstanceLogPaths
FROM sys.sysaltfiles WHERE filename like '%.ldf' and filename not like '%\MSSQL\Binn\%'
You can download the detailed SQL script from: how to find the data directory for a SQL Server instance
This query returns the default location of user databases:
declare @DataFileName nvarchar(500)
declare @LogFileName nvarchar(500)
set @DataFileName = (select top 1 RTRIM(LTRIM(name)) FROM master.sys.master_files where database_id > 4 AND file_id = 1)+'.mdf'
set @LogFileName = (select top 1 RTRIM(LTRIM(name)) FROM master.sys.master_files where database_id > 4 AND file_id = 2)+'.ldf'
select
( SELECT top 1 SUBSTRING(physical_name, 1, CHARINDEX(@DataFileName, LOWER(physical_name)) - 1)
FROM master.sys.master_files
WHERE database_id > 4 AND file_id = 1) as 'Data File'
,
(SELECT top 1 SUBSTRING(physical_name, 1, CHARINDEX(@LogFileName, LOWER(physical_name)) - 1)
FROM master.sys.master_files
WHERE database_id > 4 AND file_id = 2) as 'Log File'