I'm using concat tasks in an Ant macrodef to generate DDL files. Part of the value of a few of the properties is getting duplicated in the resulting DDL. This duplication is only observed in output generated by the concat tasks.
I've tried 1) using dashes instead of underscores, 2) using ${property-name} instead of @{property-name}, 3) using an echo task instead of the concat task, 4) switching from Ant 1.9.3 to 1.10.5, and 5) searching online.
Property getting set in the Ant script
<property name="SCHEMA_ID" value="REPLACE_SCHEMA_ID" />
Attribute being set in macrodef
<attribute name="schema-id" default="${SCHEMA_ID}" />
Concat task
<concat destfile="@{dest-dir}/@{spname}.ddl">
SET CURRENT SCHEMA = '@{schema-id}'
####
SET CURRENT SQLID = '@{sql-id}'
####
</concat>
Output line in the DDL file
SET CURRENT SCHEMA = 'REPLACE_REPLACE_SCHEMA_ID'
I would expect the output line in the DDL file to be:
SET CURRENT SCHEMA = 'REPLACE_SCHEMA_ID'
As far as I can tell, there's a bug when using echo or concat (at least inside a macrodef) where, if the name of a property appears as part of that property's value, the part of the value that doesn't match the name is duplicated.
<property name="SCHEMA_ID" value="REPLACE_SCHEMA_ID" /> becomes REPLACE_REPLACE_SCHEMA_ID
<property name="SCHEMA_ID" value="#SCHEMA_ID#" /> becomes ##SCHEMA_ID##
but
<property name="SCHEMA_ID" value="#schema_id#" /> becomes #schema_id#
Strange behavior, and I'm open to being proven wrong, but this is what I came up with.
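For anyone who wants to reproduce this, here is a minimal sketch of the setup I'm describing (the project, macro, and target names are illustrative):
<project name="repro" default="run">
<property name="SCHEMA_ID" value="REPLACE_SCHEMA_ID" />
<macrodef name="gen-ddl">
<attribute name="schema-id" default="${SCHEMA_ID}" />
<sequential>
<!-- On my setup this prints REPLACE_REPLACE_SCHEMA_ID -->
<echo message="SET CURRENT SCHEMA = '@{schema-id}'" />
</sequential>
</macrodef>
<target name="run">
<gen-ddl />
</target>
</project>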
I am attempting to link resource files that are organized into locale folders into their own .resources.dll assemblies. There are more than 750 locales that are dynamically generated, so it is not practical to hard-code them the way the docs show.
<ItemGroup>
<ResourceFiles Include="<path_to>/af/af.res;<path_to>/af/feature1.af.res;<path_to>/af/feature2.af.res">
<Culture>af</Culture>
</ResourceFiles>
<ResourceFiles Include="<path_to>/af-NA/af_NA.res;<path_to>/af-NA/feature1.af_NA.res;<path_to>/af-NA/feature2.af_NA.res">
<Culture>af-NA</Culture>
</ResourceFiles>
<ResourceFiles Include="<path_to>/af-ZA/af_ZA.res;<path_to>/af-ZA/feature1.af_ZA.res;<path_to>/af-ZA/feature2.af_ZA.res">
<Culture>af-ZA</Culture>
</ResourceFiles>
</ItemGroup>
The above structure can be used to execute the AL task multiple times, once for each group of files. As you can see, my files are arranged in folders named the same as the corresponding culture in .NET.
My question is: how do I build this structure dynamically based on the 750+ locale folders, many of which contain multiple files?
What I Tried
I was able to get the grouping to function. However, for some odd reason the list of files is being evaluated as a String rather than as ITaskItem[] like it should be. This is the structure that does the correct grouping. It is based on this Gist (although the example there is incomplete, so I may be misunderstanding how to use the last part).
<PropertyGroup>
<SatelliteAssemblyTargetFramework>netstandard2.0</SatelliteAssemblyTargetFramework>
<TemplateAssemblyFilePath>$(MSBuildProjectDirectory)/bin/$(Configuration)/$(TargetFramework)/$(AssemblyName).dll</TemplateAssemblyFilePath>
<ICU4JResourcesDirectory>$(SolutionDir)_artifacts/icu4j-transformed</ICU4JResourcesDirectory>
<ICU4NSatelliteAssemblyOutputDir>$(SolutionDir)_artifacts/SatelliteAssemblies</ICU4NSatelliteAssemblyOutputDir>
</PropertyGroup>
<Target
Name="GenerateOurSatelliteAssemblies"
DependsOnTargets="ExecICU4JResourceConverter"
AfterTargets="AfterBuild"
Condition=" '$(TargetFramework)' == '$(SatelliteAssemblyTargetFramework)' ">
<ItemGroup>
<EmbeddedResources Include="$(ICU4JResourcesDirectory)/*.*" />
<EmbeddedResourcesPaths Include="$([System.IO.Directory]::GetDirectories('$(ICU4JResourcesDirectory)'))" />
<!-- This groups each locale together along with its nested files and root path -->
<FolderInLocale Include="@(EmbeddedResourcesPaths)">
<Culture>$([System.IO.Path]::GetFileName('%(Identity)'))</Culture>
<Files>$([System.IO.Directory]::GetFiles('%(EmbeddedResourcesPaths.Identity)'))</Files>
</FolderInLocale>
</ItemGroup>
<!-- EmbedResources accepts ITaskItem[], but the result
of this transform is a ; delimited string -->
<AL EmbedResources="@(FolderInLocale->'%(Files)')"
Culture="%(FolderInLocale.Culture)"
TargetType="library"
TemplateFile="$(TemplateAssemblyFilePath)"
KeyFile="$(AssemblyOriginatorKeyFile)"
OutputAssembly="$(ICU4NSatelliteAssemblyOutputDir)/%(FolderInLocale.Culture)/$(AssemblyName).resources.dll" />
</Target>
I have attempted numerous ways to replace the semicolon characters with %3B and to split on the semicolons (i.e. @(FolderInLocale->'%(Files.Split(';'))')), but in all cases the transform fails to evaluate correctly.
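One more idea I have not been able to verify: MSBuild splits an Include attribute on literal (unescaped) semicolons after expansion, so it may be possible to turn the Files metadata string back into individual items inside the target and batch AL over those real ITaskItem[] values instead (the LocaleResourceFile item name is hypothetical):
<ItemGroup>
<LocaleResourceFile Include="%(FolderInLocale.Files)">
<Culture>%(FolderInLocale.Culture)</Culture>
</LocaleResourceFile>
</ItemGroup>
<AL EmbedResources="@(LocaleResourceFile)"
Culture="%(LocaleResourceFile.Culture)"
TargetType="library"
TemplateFile="$(TemplateAssemblyFilePath)"
KeyFile="$(AssemblyOriginatorKeyFile)"
OutputAssembly="$(ICU4NSatelliteAssemblyOutputDir)/%(LocaleResourceFile.Culture)/$(AssemblyName).resources.dll" />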
I have also consulted the docs for MSBuild well-known item metadata to see if there is another way of grouping by folder. Unfortunately, there is no %(FolderName) metadata, which would solve my issue completely. While I was able to get it to group by folder using the below XML, it immediately flattened into a single group when trying to get the name of the top-level folder, which is where the name of the culture is.
I am using GetFileName() to get the name of the top-level folder after stripping the file name from it. But please do tell if there is a better way.
<ItemGroup>
<EmbeddedResourcesLocalizedFiles Include="$(ICU4JResourcesDirectory)/*/*.*"/>
<EmbeddedResourcesLocalized Include="@(EmbeddedResourcesLocalizedFiles)">
<Culture>%(RootDir)%(Directory)</Culture>
</EmbeddedResourcesLocalized>
<!-- Calling GetFileName() like this rolls the files into a single group,
but prior to this call it is grouped correctly. -->
<EmbeddedResourcesLocalized2 Include="@(EmbeddedResourcesLocalized)">
<Culture>$([System.IO.Path]::GetFileName('%(EmbeddedResourcesLocalized.Culture)'))</Culture>
</EmbeddedResourcesLocalized2>
</ItemGroup>
When importing a DB from an Azure bacpac file to a local SQL Server 2016 instance, I'm getting the following error.
Error SQL72014: .Net SqlClient Data Provider: Msg 102, Level 15, State 1, Line 1 Incorrect syntax near 'EXTERNAL'.
Error SQL72045: Script execution error. The executed script:
CREATE EXTERNAL DATA SOURCE [BoxDataSrc]
WITH (
TYPE = RDBMS,
LOCATION = N'MYAZUREServer.database.windows.net',
DATABASE_NAME = N'MyAzureDb',
CREDENTIAL = [SQL_Credential]
);
(Microsoft.SqlServer.Dac)
I ran into this same issue today. Since WITH (TYPE = RDBMS) is only applicable to Azure SQL Database, we get the error when attempting to import the bacpac into an on-premises SQL Server 2017 instance. I did find a solution thanks to this article:
https://blogs.msdn.microsoft.com/azuresqldbsupport/2017/08/16/editing-a-bacpac-file/
The relevant steps rewritten here:
Make a copy of the bacpac file (for safety in case of errors).
Change the file extension to zip, then decompress it into a folder. Surprisingly, a bacpac is actually just a zip file, not something proprietary and hard to get into.
Find the model.xml file and edit it to remove the section that looks like this:
<Element Type="SqlExternalDataSource" Name="[BoxDataSrc]">
<Property Name="DataSourceType" Value="1" />
<Property Name="Location" Value="MYAZUREServer.database.windows.net" />
<Property Name="DatabaseName" Value="MyAzureDb" />
<Relationship Name="Credential">
<Entry>
<References Name="[SQL_Credential]" />
</Entry>
</Relationship>
</Element>
If you have multiple external data sources of this type, you will probably need to repeat step 3 for each one. I only had one.
Save and close model.xml.
Now you need to re-generate the checksum for model.xml so that the bacpac doesn't think it was tampered with (since you just tampered with it). Create a PowerShell file named computeHash.ps1 and put this code into it.
$modelXmlPath = Read-Host "model.xml file path"
$hasher = [System.Security.Cryptography.HashAlgorithm]::Create("System.Security.Cryptography.SHA256CryptoServiceProvider")
$fileStream = New-Object System.IO.FileStream -ArgumentList @($modelXmlPath, [System.IO.FileMode]::Open)
$hash = $hasher.ComputeHash($fileStream)
$hashString = ""
Foreach ($b in $hash) { $hashString += $b.ToString("X2") }
$fileStream.Close()
$hashString
Run the PowerShell script and give it the filepath to your unzipped and edited model.xml file. It will return a checksum value.
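For example, a session might look like this (the directory path is illustrative; the checksum shown is the one replaced in the next step):
PS C:\bacpac-edit> .\computeHash.ps1
model.xml file path: C:\bacpac-edit\model.xml
9EA0F06B282D4F42955C78A98822A31AA0ED0225CB131B8759379055A482D01F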
Copy the checksum value, then open up Origin.xml and replace the existing checksum, toward the bottom on the line that looks like this:
<Checksum Uri="/model.xml">9EA0F06B282D4F42955C78A98822A31AA0ED0225CB131B8759379055A482D01F</Checksum>
Save and close Origin.xml, then select all the files and put them into a new zip file and rename the extension to bacpac.
Now you can use this new bacpac to import the database without getting the error. It worked for me; it could work for you, too.
As per @SQLDoug's answer, this can happen if your Azure SQL database has External Tables (i.e. linked tables from other databases). You can check for these in SSMS under the database's Tables > External Tables node.
Addendum to accepted answer
If you delete those external tables' data sources, you'll also need to delete the SqlExternalTable elements in the model.xml file that were using those data sources; they'll look something like this:
<Element Type="SqlExternalTable" Name="[dbo].[DeliveryMethodsRestored]">
<Property Name="ExternalSchemaName" Value="dbo" />
<Property Name="ExternalObjectName" Value="DeliveryMethods" />
<Property Name="IsAnsiNullsOn" Value="True" />
<Property Name="IsQuotedIdentifierOn" Value="False" />
<Relationship Name="Columns">
<Entry>
<Element Type="SqlSimpleColumn" Name="[dbo].[DeliveryMethodsRestored].[DeliveryMethodId]">
<Property Name="IsNullable" Value="False" />
<Relationship Name="TypeSpecifier">
<Entry>
SNIP....
</Element>
If you do a search for 'SqlExternalTable' in model.xml you'll find them all easily.
Alternative approach to solving this issue
Rather than correcting the bacpac after downloading it, the other way to deal with this is simply to remove the external tables before creating the bacpac, i.e.:
Restore a copy of your database to a separate database
Delete the External Tables in the restored copy
Delete the External Data Sources in the restored copy
Create the bacpac from that restored copy
Delete the copy database
This approach has the advantage that you aren't creating the bacpac from the live database, which apparently 'can cause the exported table data to be inconsistent because, unlike SQL Server's physical backup/restore, exports do not guarantee transactional consistency'.
If that's something you're likely to do a lot, you could probably write scripts to automate most of the above steps.
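For example, steps 2 and 3 against the restored copy come down to statements like these (object names taken from the snippets above; substitute your own):
-- Run against the restored copy, never the live database
DROP EXTERNAL TABLE [dbo].[DeliveryMethodsRestored];
DROP EXTERNAL DATA SOURCE [BoxDataSrc];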
Same error code, but with a different error message.
Could not import package.
Warning SQL72012: The object [PreProd_Data] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
Warning SQL72012: The object [PreProd_Log] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
Error SQL72014: .Net SqlClient Data Provider: Msg 102, Level 15, State 1, Line 5 Incorrect syntax near 'OPTIMIZE_FOR_AD_HOC_WORKLOADS'.
Error SQL72045: Script execution error. The executed script:
IF EXISTS (SELECT 1
FROM [master].[dbo].[sysdatabases]
WHERE [name] = N'$(DatabaseName)')
BEGIN
ALTER DATABASE SCOPED CONFIGURATION SET OPTIMIZE_FOR_AD_HOC_WORKLOADS = ON;
END
Solution
This blog post explains how to edit model.xml to remove the Relationship entry for OPTIMIZE_FOR_AD_HOC_WORKLOADS, which is not supported by a SQL Server 2017 instance:
https://blogs.msdn.microsoft.com/azuresqldbsupport/2017/08/16/editing-a-bacpac-file/
The relevant steps rewritten here:
Make a copy of the bacpac file (for safety in case of errors).
Change the file extension to zip, then decompress it into a folder. Surprisingly, a bacpac is actually just a zip file, not something proprietary and hard to get into.
Find the model.xml file and edit it to remove the section that looks like this:
<Relationship Name="GenericDatabaseScopedConfigurationOptions">
<Entry>
<References Name="[OPTIMIZE_FOR_AD_HOC_WORKLOADS]" />
</Entry>
</Relationship>
Also remove the following block from model.xml:
<Element Type="SqlGenericDatabaseScopedConfigurationOptions" Name="[OPTIMIZE_FOR_AD_HOC_WORKLOADS]">
<Property Name="GenericValueType" Value="2" />
<Property Name="GenericValue" Value="ON" />
</Element>
Save and close model.xml.
Now you need to re-generate the checksum for model.xml so that the bacpac doesn't think it was tampered with (since you just tampered with it). Create a PowerShell file named computeHash.ps1 with the code from the earlier answer in it.
Run the PowerShell script and give it the filepath to your unzipped and edited model.xml file. It will return a checksum value.
Copy the checksum value, then open up Origin.xml and replace the existing checksum.
Save and close Origin.xml, then select all the files and put them into a new zip file and rename the extension to bacpac.
Now the bacpac file is ready to import, and it worked for me.
Thanks.
Elastic Database queries are supported only on Azure SQL Database v12 or later, not on a local server.
https://msdn.microsoft.com/en-us/library/dn935022.aspx
I got the same error code (SQL72045) when importing a bacpac even though we had deleted the external data sources in Azure that we used to sync data with. It turned out that a procedure, "TransferDo", had been left behind with a reference to a SCOPED CREDENTIAL for another database. After we removed the procedure, the import worked well.
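If you need to hunt for leftovers like this, a rough query along these lines (it simply searches module definitions for the relevant keywords) can help:
-- Find procedures/views/functions that still mention external objects
SELECT OBJECT_SCHEMA_NAME(object_id) AS [schema], OBJECT_NAME(object_id) AS [object]
FROM sys.sql_modules
WHERE definition LIKE '%SCOPED CREDENTIAL%'
OR definition LIKE '%EXTERNAL DATA SOURCE%';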
I need to check whether a given folder exists at a given path. If that folder does not exist, an alternative folder should be used instead.
To check the existence of the given folder, I tried an Exists condition and got this error:
error MSB4092: An unexpected token "$(D:\DK)" was found at character position 11 in condition "'(Exists('$(D:\DK)')' "
What is the correct format to use this Exists condition?
This is quite basic, though it can apparently be confusing.
$(<name>) is used to refer to the property named <name>, but you don't seem to have a property here, just a string.
So either
<Message Condition="Exists('d:\dk')" Text="It Exists" />
or
<PropertyGroup>
<Dk>d:\dk</Dk>
</PropertyGroup>
<Message Condition="Exists($(Dk))" Text="It Exists" />
I tried the Exists condition with the following scenario and it works fine for me:
<PropertyGroup>
<ROOT Condition="Exists('D:\DK')">D:\DK</ROOT>
<ROOT Condition="'$(ROOT)'==''">D:\New\DK</ROOT>
</PropertyGroup>
I have a WiX installer configured like this:
<Property Id="MY_PROPERTY">
...
<Registry Name="MyValue" Type="multiString" Value="[MY_PROPERTY]" />
Now I want to pass this property's value on the command line as a list:
MsiExec.exe /i MyInstaller.msi /qb MY_PROPERTY="One[~]Two[~]Three"
However, the installer does not split the value into a list; the literal value is written instead.
If I hard-code the element, it works properly:
<Registry Name="MyValue" Type="multiString" Value="One[~]Two[~]Three" />
Does anyone know how to specify a list of values at the command line for a multiString registry value? Thanks in advance.
Better late than never!
This can be achieved using a Custom Action.
Follow this MS document carefully: https://learn.microsoft.com/en-us/windows/win32/msi/registry-table
In your custom action, insert the registry value into the MSI Registry table from your property, as follows:
' Insert a temporary row into the MSI Registry table from the property
Set db = Session.Database
Set oView = db.OpenView("INSERT INTO `Registry` (`Registry`,`Root`,`Key`,`Name`,`Value`,`Component_`) VALUES ('reg_MY_PROPERTY', -1,'Software\Company\Product','MyValue','" & _
Session.Property("MY_PROPERTY") & "','CM_CP_BlahBlah') TEMPORARY")
oView.Execute
oView.Close
CM_CP_BlahBlah is the WiX component your registry values are attached to.
Please note that the "custom action must come before the RemoveRegistryValues and WriteRegistryValues actions in the action sequence":
<InstallExecuteSequence>
<Custom Action="SetMyPropertyCustomAction" Before="RemoveRegistryValues">NOT REMOVE</Custom>
</InstallExecuteSequence>
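For completeness, the declaration of that custom action might look something like this in WiX v3 (the Binary Id and script file name are hypothetical; the VBScript above is assumed to be saved as SetMyProperty.vbs):
<Binary Id="SetMyPropertyScript" SourceFile="SetMyProperty.vbs" />
<!-- Immediate execution, so the temporary Registry row exists before WriteRegistryValues runs -->
<CustomAction Id="SetMyPropertyCustomAction" BinaryKey="SetMyPropertyScript" VBScriptCall="" Execute="immediate" Return="check" />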
REG_MULTI_SZ
A sequence of null-terminated strings, terminated by an empty string (\0).
The following is an example:
String1\0String2\0String3\0LastString\0\0
The first \0 terminates the first string, the second to the last \0 terminates the last string, and the final \0 terminates the sequence. Note that the final terminator must be factored into the length of the string.
So, per the REG_MULTI_SZ documentation quoted above, you should be doing this:
MY_PROPERTY="One\0Two\0Three\0"
For multi-string values, check this element: MultiStringValue
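For reference, a multi-string value authored that way looks roughly like this in WiX v3 (the key and values are illustrative):
<RegistryKey Root="HKLM" Key="Software\Company\Product">
<RegistryValue Name="MyValue" Type="multiString" Action="write">
<MultiStringValue>One</MultiStringValue>
<MultiStringValue>Two</MultiStringValue>
<MultiStringValue>Three</MultiStringValue>
</RegistryValue>
</RegistryKey>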
I found that Liquibase uses the full path of the changelog file to calculate the checksum.
This behavior prevents renaming changelog files: once a file is renamed, Liquibase tries to reapply its change sets again.
Is there a way to configure Liquibase to use only the changelog id to calculate the checksum?
Please provide your valuable thoughts.
Use the logicalFilePath attribute of the databaseChangeLog tag.
Upstream developers recommend using logicalFilePath and suggest performing a direct update on the DATABASECHANGELOG.FILENAME column to fix broken entries with full paths:
https://forum.liquibase.org/t/why-does-the-change-log-contain-the-file-name/481
Setting DATABASECHANGELOG.MD5SUM to NULL triggers checksum recalculation on the next Liquibase run. This is necessary because the hash calculation takes these moving parts into account as well.
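A one-off cleanup might look like this (the paths are illustrative; adjust the filter to your own FILENAME values):
-- Point the entry at its logical path and force checksum recalculation
UPDATE DATABASECHANGELOG
SET FILENAME = 'liquibase/changes.xml', MD5SUM = NULL
WHERE FILENAME LIKE '%/liquibase/changes.xml';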
One really similar issue: you may just want to ignore the portion of the path before the changelog-master.xml file. In my scenario, I've checked out the project in C:\DEV\workspace and my colleague has it checked out in C:\another_folder\TheWorkspace.
I'd recommend reading through http://forum.liquibase.org/topic/changeset-uniqueness-causing-issues-with-branched-releases-overlapped-changes-not-allowed-in-different-files first.
Like others have suggested, you'll want the logicalFilePath property set on the <databaseChangeLog> element.
You'll also need to specify the changeLogFile property in a certain way when calling liquibase. I'm calling it from the command line. If you specify an absolute or relative path to the changeLogFile without the classpath, like this, it will include the whole path in the DATABASECHANGELOG table:
liquibase.bat ^
--changeLogFile=C:\DEV\more\folders\schema\changelog-master.xml ^
...
then liquibase will break if you move your migrations to any folder other than the one listed above. To fix it (and ensure that other developers can use whatever workspace location they want), you need to reference the changeLogFile from the classpath:
liquibase.bat ^
--classpath=C:\DEV\more\folders ^
--changeLogFile=schema/changelog-master.xml ^
...
Done the first way, my DATABASECHANGELOG table had FILENAME values (I might have the slashes backwards) like
C:\DEV\more\folders\schema\subfolder\script.sql
Done the second way, my DATABASECHANGELOG table has FILENAME values like
subfolder/script.sql
I'm content to go with filenames like that. Each developer can run liquibase from whatever folder they want. If we decide we want to rename or move an individual SQL file later on, then we can specify the old value in the logicalFilePath property of the <changeSet> element.
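That per-changeset override might look like this (the id, author, and path are illustrative):
<changeSet id="1" author="someone" logicalFilePath="subfolder/script.sql">
...
</changeSet>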
For reference, my changelog-master.xml just consists of elements like
<include file="subfolder/script.sql" relativeToChangelogFile="true"/>
I have faced the same problem and found the solution below.
If you are using the Liquibase SQL format, then simply put this at the top of your SQL file:
--liquibase formatted sql logicalFilePath:<relative SQL file path, like liquibase/changes.sql>
If you are using the Liquibase XML format, then simply put this in your XML file:
<databaseChangeLog logicalFilePath="<relative XML file path, like liquibase/changes.xml>" ...>
...
</databaseChangeLog>
After adding the logicalFilePath attribute, run the liquibase update command.
Liquibase will then store whatever relative path you put in logicalFilePath in the FILENAME column of the DATABASECHANGELOG table.