How to import a column from SQL into Excel via BIML

I would like to import column 'Street' (NVARCHAR(50)) from a SQL table (Practice2.dbo.Adress) into Excel (ExcelDestination.xls). I know how to do this in SSIS, but in BIML I can't seem to find the right code, especially to do the column mapping between source and destination. When I try to generate the SSIS package, I get the error
"Could not resolve reference to 'Adress' of type 'TableResource'. 'TableName="Adress"' is invalid. Provide valid scoped name."
Here is what I have done so far:
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <OleDbConnection Name="ConnectionWithPractice2" ConnectionString="Provider='SQLNCLI11'; Data Source='DWH'; Initial Catalog='Practice2'; User Id='system'; Password='password';"></OleDbConnection>
        <ExcelConnection Name="Excel Connection Manager" ConnectionString="Provider='Microsoft.Jet.OLEDB.4.0';Data Source='C:\Users\adm-jpna\Documents\ExcelDestination.xls';Extended Properties='Excel 8.0;HDR=Yes;IMEX=1'">
        </ExcelConnection>
    </Connections>
    <Packages>
        <Package Name="Package1">
            <Tasks>
                <Dataflow Name="ImportIntoExcel">
                    <Transformations>
                        <OleDbSource Name="OLE_DB_Source" ConnectionName="ConnectionWithPractice2">
                            <DirectInput>SELECT Street FROM Practice2.dbo.Adress</DirectInput>
                        </OleDbSource>
                        <ExcelDestination Name="Excel_Destination" ConnectionName="Excel Connection Manager">
                            <Columns>
                                <Column SourceColumn="Street"></Column>
                            </Columns>
                            <TableOutput TableName="Adress"></TableOutput>
                        </ExcelDestination>
                    </Transformations>
                </Dataflow>
            </Tasks>
        </Package>
    </Packages>
</Biml>

I made a few minor changes to your ExcelDestination:
<Package Name="so_45063165">
    <Tasks>
        <Dataflow Name="ImportIntoExcel">
            <Transformations>
                <OleDbSource Name="OLE_DB_Source" ConnectionName="ConnectionWithPractice2">
                    <DirectInput>SELECT N'123 Oak' AS Street;</DirectInput>
                </OleDbSource>
                <ExcelDestination Name="Excel_Destination" ConnectionName="Excel Connection Manager">
                    <ExternalTableOutput Table="Sheet1$" />
                </ExcelDestination>
            </Transformations>
        </Dataflow>
    </Tasks>
</Package>
TableOutput refers to the Biml collection of Tables, which is why the reference to 'Adress' cannot be resolved. You're looking for ExternalTableOutput, which instructs the engine to look at the referenced object (Excel in this case) to validate that it exists. Changing the tag also changes the attribute from TableName to Table, and since we're referencing a worksheet rather than a table, we need to say so with a $: Sheet1 would be a table or named range, while Sheet1$ means the actual worksheet.
Since you didn't provide a column mapping between your source Street and a target column, I removed the tags.
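If you do want the explicit mapping from your original snippet, the Columns collection can be kept alongside ExternalTableOutput. A minimal sketch, assuming the worksheet's header cell is also named Street:
<ExcelDestination Name="Excel_Destination" ConnectionName="Excel Connection Manager">
    <ExternalTableOutput Table="Sheet1$" />
    <Columns>
        <!-- SourceColumn is the column coming from the OLE DB source; TargetColumn is the worksheet header it maps to -->
        <Column SourceColumn="Street" TargetColumn="Street" />
    </Columns>
</ExcelDestination>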


Handle sqlplus substitution variables (&&vars) in Liquibase

My project is trying to migrate to Liquibase, but the lack of support for bind variables is making this difficult.
During our deployment we have SQL scripts containing sqlplus substitution variables, for example:
-- load_seed.sql ---
insert into <table>
values('&&host', '&&port', '&&user');
The values of these variables differ per environment, so we define profiles like these:
<DEV_profile.sql>
DEFINE host='dev.company.org'
DEFINE port=4008
..
<UAT_profile.sql>
DEFINE host='uat.company.org'
...
and then we run the deployment like this:
./deploy.ksh DEV
---- deploy.ksh ---
sqlplus <<END
<connection>
#$1_profile
#load_seed
The correct profile is picked up at execution time and the variables replaced.
Could you please suggest how to handle a case like this with Liquibase?
The equivalent functionality in Liquibase is provided by changelog parameters.
In your changelog you define parameters, which are basically key-value pairs, and Liquibase decides which value to use based on a context, a label, or a dbms.
When you want to apply the changeset to a given environment, you specify the context or label on the command line or in liquibase.properties. Liquibase can determine the dbms from the connection URL.
Here's an example that is somewhat similar to what you describe:
<databaseChangeLog
        xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xmlns:ext="http://www.liquibase.org/xml/ns/dbchangelog-ext"
        xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.6.xsd
        http://www.liquibase.org/xml/ns/dbchangelog-ext http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-ext.xsd">

    <property name="host" value="dev.company.org" context="DEV"/>
    <property name="port" value="4008" context="DEV"/>
    <property name="user" value="DEV_USER" context="DEV"/>

    <property name="host" value="uat.company.org" context="UAT"/>
    <property name="port" value="4321" context="UAT"/>
    <property name="user" value="UAT_USER" context="UAT"/>

    <changeSet id="1" author="joe">
        <insert tableName="someTableName">
            <column name="host" type="varchar(255)" value="${host}"/>
            <column name="port" type="varchar(8)" value="${port}"/>
            <column name="user" type="varchar(255)" value="${user}"/>
        </insert>
    </changeSet>

</databaseChangeLog>
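The property element also accepts a dbms attribute, so the choice of value can depend on the database type instead of a context. A small sketch (the property name clob.type is only an illustration):
<property name="clob.type" value="clob" dbms="oracle"/>
<property name="clob.type" value="text" dbms="postgresql"/>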
https://docs.liquibase.com/concepts/basic/changelog-property-substitution.html
Note that SQL-formatted changelogs do not support changelog property substitution; you would have to migrate to XML, YAML, or JSON.

Name of the user who has processed the cube

There is a piece of code in which I have to get the username of the user who processed the cube or made any changes to the cube structure.
I have searched the SSAS DMVs, but didn't find what I needed; I only found the last processed time, not the name of the user.
Any suggestions?
You can track this using an Extended Event. Add the ProgressReportBegin and ProgressReportEnd events which are for processing. These events include the NTUserName and StartTime fields which you can use to find who processed the cube and when. The Extended Event will need to be running when the cube is processed to capture this. The following is an example XMLA command which can be run when connected to your SSAS database in SSMS to create an Extended Event that tracks cube processing and outputs the results to a file. Of course many of these options are just defaults and you may want to make adjustments as necessary.
https://learn.microsoft.com/en-us/sql/analysis-services/instances/monitor-analysis-services-with-sql-server-extended-events?view=sql-server-2017
<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
    <ObjectDefinition>
        <Trace>
            <ID>XE_Cube_Process</ID>
            <Name>XE_Cube_Process</Name>
            <XEvent xmlns="http://schemas.microsoft.com/analysisservices/2011/engine/300/300">
                <event_session name="XE_Cube_Process" dispatchLatency="0" maxEventSize="0" maxMemory="4" memoryPartition="none" eventRetentionMode="AllowSingleEventLoss" trackCausality="true" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
                    <event package="AS" name="ProgressReportEnd" />
                    <event package="AS" name="ProgressReportBegin" />
                    <target package="package0" name="event_file">
                        <parameter name="filename" value="C:\Test\XE_Cube_Process.xel" />
                        <parameter name="max_file_size" value="4096" />
                        <parameter name="max_rollover_files" value="10" />
                        <parameter name="increment" value="1024" />
                    </target>
                </event_session>
            </XEvent>
        </Trace>
    </ObjectDefinition>
</Create>
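When you are done capturing, the trace can be dropped with a Delete command that references the trace ID; a minimal sketch, assuming the ID XE_Cube_Process from the example above:
<Delete xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
    <Object>
        <TraceID>XE_Cube_Process</TraceID>
    </Object>
</Delete>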

Content Type Instantiation: Picking value of a reference field

I'm trying out sensenet features; my focus is on reference fields inside a content type.
I defined and installed the following content type successfully:
<?xml version="1.0" encoding="utf-8"?>
<ContentType name="EmployeeCT" parentType="GenericContent" handler="SenseNet.ContentRepository.GenericContent" xmlns="http://schemas.sensenet.com/SenseNet/ContentRepository/ContentTypeDefinition">
    <DisplayName>Employee Record</DisplayName>
    <Description></Description>
    <Icon>Content</Icon>
    <AllowIncrementalNaming>true</AllowIncrementalNaming>
    <AllowedChildTypes>EmployeeCT</AllowedChildTypes>
    <Fields>
        <Field name="Manager" type="Reference">
            <DisplayName>Manager</DisplayName>
            <Description></Description>
            <Configuration>
                <AllowMultiple>false</AllowMultiple>
                <AllowedTypes>
                    <Type>EmployeeCT</Type>
                </AllowedTypes>
                <SelectionRoot>
                    <Path>/Root</Path>
                </SelectionRoot>
                <!--<DefaultValue>/Root/Path1,/Root/Path2</DefaultValue>-->
                <ReadOnly>false</ReadOnly>
                <Compulsory>false</Compulsory>
                <VisibleBrowse>Show</VisibleBrowse>
                <VisibleEdit>Show</VisibleEdit>
                <VisibleNew>Show</VisibleNew>
            </Configuration>
        </Field>
    </Fields>
</ContentType>
The problem is that I cannot find and pick the manager of an employee.
Any help please?
Thanks.
Is there any previously saved content with the type EmployeeCT? You set it as the only allowed type for the value of the Manager field.
I've checked your code on my local site and it works. First I had to save an Employee Record to create a content for the manager, and then I was able to pick it as the manager of a new Employee Record.
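Once at least one Employee Record exists, you could also pre-populate the field through the DefaultValue element that is commented out in your CTD. A small sketch, using a hypothetical repository path that must point to an existing EmployeeCT content:
<!-- inside the Manager field's Configuration; the path below is hypothetical -->
<DefaultValue>/Root/Employees/DefaultManager</DefaultValue>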

Approving "One" in a one-to-many DB relationship only when ALL "Many"s are approved

I've tried looking for this, I promise, but it's kind of a hard question to search for...
I have a database with two tables linked in a one-to-many relationship where each "one" is an invoice header linked to "many" invoice lines.
I am designing a tool that will match each invoice line to a purchase order and I want to be able to mark an invoice header as "matched" only when all lines have been matched.
Does anyone know how to write this update query?
The relational operator you seek is known formally as division and colloquially as "the suppliers who supply all parts". For various reasons, a division operator has not appeared in any SQL product I know of, Access included. Instead, you need to use other operators in combination to do the same. See Divided we stand: the SQL of relational division by Joe Celko.
An approach available to users of Access 2010 and later is to use event-driven data macros in the child table to maintain the status flag in the parent table. For example, with a parent table
[InvHeader]
InvID AllMatched
----- ----------
1 No
and a child table
[InvItem]
ItemID InvID Matched
------ ----- -------
1 1 Yes
2 1 No
the following After Update data macro will automatically update the corresponding row in [InvHeader] after a child row has been changed in [InvItem]
Code:
<?xml version="1.0" encoding="UTF-16" standalone="no"?>
<DataMacros xmlns="http://schemas.microsoft.com/office/accessservices/2009/11/application">
    <DataMacro Event="AfterUpdate">
        <Statements>
            <Action Collapsed="true" Name="SetLocalVar">
                <Argument Name="Name">newAllMatched</Argument>
                <Argument Name="Value">True</Argument>
            </Action>
            <ForEachRecord>
                <Data Alias="i">
                    <Reference>InvItem</Reference>
                    <WhereCondition>[i].[InvID]=[InvItem].[InvID]</WhereCondition>
                </Data>
                <Statements>
                    <ConditionalBlock>
                        <If>
                            <Condition>Not [i].[Matched]</Condition>
                            <Statements>
                                <Action Collapsed="true" Name="SetLocalVar">
                                    <Argument Name="Name">newAllMatched</Argument>
                                    <Argument Name="Value">False</Argument>
                                </Action>
                                <Action Name="ExitForEachRecord"/>
                            </Statements>
                        </If>
                    </ConditionalBlock>
                </Statements>
            </ForEachRecord>
            <LookUpRecord>
                <Data Alias="hdr">
                    <Reference>InvHeader</Reference>
                    <WhereCondition>[hdr].[InvID]=[InvItem].[InvID]</WhereCondition>
                </Data>
                <Statements>
                    <ConditionalBlock>
                        <If>
                            <Condition>[hdr].[AllMatched]<>[newAllMatched]</Condition>
                            <Statements>
                                <EditRecord>
                                    <Data Alias="hdr"/>
                                    <Statements>
                                        <Action Collapsed="true" Name="SetField">
                                            <Argument Name="Field">AllMatched</Argument>
                                            <Argument Name="Value">[newAllMatched]</Argument>
                                        </Action>
                                    </Statements>
                                </EditRecord>
                            </Statements>
                        </If>
                    </ConditionalBlock>
                </Statements>
            </LookUpRecord>
        </Statements>
    </DataMacro>
</DataMacros>
Note that this is an example for updates only. For completeness, one would normally put the above code in a named data macro and call it from After Insert, After Update, and After Delete.
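A rough sketch of what the event macros could then reduce to, assuming the shared logic has been saved as a named data macro on [InvItem] called RecalcAllMatched that takes the invoice ID as a parameter (the names are hypothetical, and the exact XML that Access exports for a RunDataMacro call may differ slightly):
<DataMacros xmlns="http://schemas.microsoft.com/office/accessservices/2009/11/application">
    <DataMacro Event="AfterInsert">
        <Statements>
            <!-- delegate to the shared named data macro instead of repeating the logic -->
            <Action Name="RunDataMacro">
                <Argument Name="MacroName">InvItem.RecalcAllMatched</Argument>
                <Parameters>
                    <Parameter Name="prmInvID" Value="[InvID]" />
                </Parameters>
            </Action>
        </Statements>
    </DataMacro>
</DataMacros>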

Duplicated list fields in document's properties

I have one site column which is added to an existing Content Type that has been deployed via a Feature.
In the Content Type, the column appears fine, but when viewing or editing the properties of a document in a list that uses the content type, the column appears twice. The two instances appear identical, and updating one of them updates both. Looking at the available fields using PowerShell on the List Item and on the Site Collection does not show any duplication, and the same is true when viewing the columns in the Library Settings. Despite this, the duplication still happens when viewing or editing properties.
Here is my code:
<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
    <!-- My Column (Field) -->
    <Field ID="{996A0BA7-4B25-44F9-9AE6-FBE47EC123CE}" Group="RK Fields" Name="Kategori" DisplayName="Kategori" Type="Text"
           Hidden="TRUE" Required="FALSE" SourceID="http://schemas.microsoft.com/sharepoint/v3" StaticName="Kategori"/>
    <!-- My Content Type -->
    <ContentType ID="0x01010053e1d612ba3f4e21aa250ecd751942b3"
                 Name="RKDokument"
                 Group="RK Content Types"
                 Description="Innehållstyp för dokument på RK"
                 Inherits="TRUE"
                 Version="0">
        <FieldRefs>
            <FieldRef ID="{996A0BA7-4B25-44F9-9AE6-FBE47EC123CE}" Name="Kategori" />
        </FieldRefs>
    </ContentType>
</Elements>