IntelliJ IDEA 15 - How to migrate task history between projects? - intellij-idea

I have been using IntelliJ IDEA 15 for close to a year now, working in the same project the whole time, where I have created Tasks that act as workspaces for my various work assignments so I can group the files I've touched by assignment title. I've recently had an issue in my project workspace that is forcing me to create a new workspace, and thus a new project in IntelliJ. The problem is that this new project has none of my Task history in it.
Does anyone know if it's possible, and if so, how, to migrate this Task history from one project to another?
Thanks in advance!

Tasks are saved in YOUR_PROJECT/.idea/workspace.xml, so you can back up this file and, if needed, simply copy and paste the lines defining your tasks into the other project's workspace.xml. Here is an example of one:
<configuration default="false" name="my-debug-task" type="JavascriptDebugType" factoryName="JavaScript Debug" uri="http://localhost:4200">
<mapping url="webpack:////home/marcin/Sprawozdania/Inzynierka/sqap/sqap-ui/src" local-file="$PROJECT_DIR$/sqap-ui/src" />
<mapping url="webpack:////home/marcin/Sprawozdania/Inzynierka/sqap/sqap-ui" local-file="$PROJECT_DIR$/sqap-ui" />
<method />
</configuration>
Additionally, in workspace.xml you need to add a reference to each copied task under:
<project>
<component>
<list>
like here:
<list size="3">
<item index="0" class="java.lang.String" itemvalue="JavaScript Debug.my-debug-task" />
<item index="1" class="java.lang.String" itemvalue="JavaScript Debug.Unnamed" />
<item index="2" class="java.lang.String" itemvalue="Maven.config start -devs" />
</list>
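If you have several tasks to move, a small script can pull the relevant blocks out for pasting. This is only a rough sketch using plain XML parsing (not any IntelliJ API); the element and attribute names are taken from the snippets above, and the file path and task name are placeholders, so verify them against your own workspace.xml and keep backups of both files.
import xml.etree.ElementTree as ET

WANTED = {"my-debug-task"}  # placeholder: names of the task configurations to migrate

tree = ET.parse(".idea/workspace.xml")
for conf in tree.getroot().iter("configuration"):
    if conf.get("name") in WANTED:
        # Print the whole <configuration> block so it can be pasted into the
        # other project's workspace.xml.
        print(ET.tostring(conf, encoding="unicode"))
Remember that the matching <item ... itemvalue="..." /> entry still has to be added to the <list> element in the target workspace.xml, as shown above.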

Related

Intellij IDEA - DB Navigator - reveal password

I use "DB navigator" plugin(https://plugins.jetbrains.com/plugin/1800-database-navigator/) for my Intellij IDEA Community version quite some time and am very satisfied.
I want to know the passwords of the DB connections saved in the plugin. They are saved, they are in there, but I cannot see them to share with my teammates.
Even though all the IDEA passwords are set to be stored in the system keyring, I can't find them in Seahorse, i.e. the "Passwords and Keyrings" application in my Ubuntu.
Where are they?
At last, I found them in
<project_root>/.idea/dbnavigator.xml
Search for your connection name, and you will see something like this:
<connection id="e208f307-8c08-45d5-93fd-958c1d68d049" active="true">
<database>
<name value="UAT" />
<description value="" />
<database-type value="ORACLE" />
<config-type value="BASIC" />
<database-version value="11.2" />
<driver-source value="BUILTIN" />
<driver-library value="" />
<driver value="" />
<url-type value="SERVICE" />
<host value="some-host" />
<port value="1523" />
<database value="APP_DB" />
<type value="USER_PASSWORD" />
<user value="admin" />
<deprecated-pwd value="<base64-encoded-password>" />
</database>
...
</connection>
So I tried to base64-decode them... and it works...
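For example, a minimal decode in Python (the encoded string below is just a placeholder, not a real value from the file):
import base64

encoded = "c2VjcmV0"  # placeholder for the value copied from deprecated-pwd
print(base64.b64decode(encoded).decode("utf-8"))  # prints: secret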
If the author sees this: please don't encrypt these values in future versions; I need them available locally so that I don't have to ask my teammates again (I'm too shy). Please also note that I created the db-navigator tag for the first time while asking this question, so that people around the world who love this plugin can gather around it.
And, any coder reading this: please keep this file out of Git (add it to .gitignore), as it contains sensitive data.

Name of the user who has processed the cube

I have a piece of code in which I need to get the username of the user who processed the cube or made any changes to the cube structure.
I have searched the SSAS DMVs, but didn't find what I needed; I only found the last processed time, not the name of the user.
Any suggestions?
You can track this using an Extended Event. Add the ProgressReportBegin and ProgressReportEnd events, which fire during processing. These events include the NTUserName and StartTime fields, which you can use to find who processed the cube and when. The Extended Event session needs to be running when the cube is processed in order to capture this. The following is an example XMLA command which can be run while connected to your SSAS database in SSMS to create an Extended Event that tracks cube processing and writes the results to a file. Many of these options are just defaults, and you may want to adjust them as necessary.
https://learn.microsoft.com/en-us/sql/analysis-services/instances/monitor-analysis-services-with-sql-server-extended-events?view=sql-server-2017
<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
<ObjectDefinition>
<Trace>
<ID>XE_Cube_Process</ID>
<Name>XE_Cube_Process</Name>
<XEvent xmlns="http://schemas.microsoft.com/analysisservices/2011/engine/300/300">
<event_session name="XE_Cube_Process" dispatchLatency="0" maxEventSize="0" maxMemory="4" memoryPartition="none" eventRetentionMode="AllowSingleEventLoss" trackCausality="true" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
<event package="AS" name="ProgressReportEnd" />
<event package="AS" name="ProgressReportBegin" />
<target package="package0" name="event_file">
<parameter name="filename" value="C:\Test\XE_Cube_Process.xel" />
<parameter name="max_file_size" value="4096" />
<parameter name="max_rollover_files" value="10" />
<parameter name="increment" value="1024" />
</target>
</event_session>
</XEvent>
</Trace>
</ObjectDefinition>
</Create>

Customization on Task Board In Team Foundation Server not reflecting

Hi, we are currently exploring Team Foundation Server 2015 on-premises as part of our DevOps process.
Currently I'm trying out customization of the task board to add a "Pull Request" column/state to the board.
Steps I've done so far:
Exported the Task.xml file from the project "Wittest" in the Demoprojectcollection
witadmin exportwitd /collection:http://192.168.123.456:8080/tfs/DEMOPROJECTCOLLECTION /p:"Wittest" /n:Task /f:Task.xml
Modified the XML to add the new "Pull Request" state
<STATE value="Pull Request">
<FIELDS>
<FIELD refname="Microsoft.VSTS.Common.ClosedDate">
<EMPTY />
</FIELD>
</FIELDS>
</STATE>
Added transitions for the new state
Uploaded the updated file using witadmin again
witadmin importwitd /collection:http://192.168.123.456:8080/tfs/DEMOPROJECTCOLLECTION /p:"Wittest" /f:Task.xml
Checked the board, but the columns stayed the same three: "To Do", "In Progress", "Done"
Exported Task.xml again and saw that the XML was updated, but I am unsure why it does not reflect the changes on the board.
Would really appreciate a nudge in the right direction, please.
After adding the state in the task work item type, you also need to modify the process configuration file.
Try using the witadmin exportprocessconfig / importprocessconfig commands to export and import the process configuration, and add the Pull Request state to the TaskBacklog section:
<TaskBacklog category="Microsoft.TaskCategory" parent="Microsoft.RequirementCategory" pluralName="Tasks" singularName="Task" workItemCountLimit="1000">
<AddPanel>
<Fields>
<Field refname="System.Title" />
</Fields>
</AddPanel>
<Columns>
<Column width="400" refname="System.Title" />
<Column width="100" refname="System.State" />
<Column width="100" refname="System.AssignedTo" />
<Column width="50" refname="Microsoft.VSTS.Scheduling.RemainingWork" />
</Columns>
<States>
<State type="Proposed" value="To Do" />
<State type="InProgress" value="In Progress" />
<State type="InProgress" value="Pull Request" />
<State type="Complete" value="Done" />
</States>
</TaskBacklog>
I've tested this on my side and it's working.

Pentaho JNDI source name as parameter (Multi-Tenant)

I have googled this for the last half hour and found hits for Pentaho parameters etc., but nothing that appears to ask or answer this question.
I have a set of reports that are the same for each customer, but need to connect to different databases depending upon the customer who is running the report.
So my idea is to pass the JNDI data source name to the report at runtime as a parameter, so that the customer will connect to the correct database.
Is this possible, or is there a better way of managing a common set of reports that are used by different customers running on different databases, all within the same single instance of the Pentaho engine?
OK, I have found a better solution using the little-documented multi-tenant feature (a short sketch after the steps below illustrates how the resulting datasource name gets resolved).
1) Stop Pentaho
2) Modify ( pentaho-solutions/system/pentahoObjects.spring.xml )
<!-- Original Code
<bean id="IDBDatasourceService" class="org.pentaho.platform.engine.services.connection.datasource.dbcp.DynamicallyPooledOrJndiDatasourceService" scope="singleton">
<property name="pooledDatasourceService" ref="pooledOrJndiDatasourceService" />
<property name="nonPooledDatasourceService" ref="nonPooledOrJndiDatasourceService" />
</bean>
-->
<!--Begin Tenant -->
<bean id="IDBDatasourceService" class="org.pentaho.platform.engine.services.connection.datasource.dbcp.tenantaware.TenantAwareLoginParsingDatasourceService"
scope="singleton">
<property name="requireTenantId" value="false" />
<property name="datasourceNameFormat" value="{1}-{0}" />
<property name="tenantSeparator" value="#" />
<property name="tenantOnLeft" value="false" />
</bean>
<!-- End Tenant -->
3) Add Suffix to Data Sources ( biserver-ce/tomcat/webapps/pentaho/META-INF/context.xml )
<Resource
name="jdbc/MYDBSRC-xxx"
auth="Container"
type="javax.sql.DataSource"
factory="org.apache.commons.dbcp.BasicDataSourceFactory"
maxActive="20"
maxIdle="5"
maxWait="10000"
username="XXXX"
password="XXXX"
driverClassName="net.sourceforge.jtds.jdbc.Driver"
url="jdbc:jtds:sqlserver://192.168.42.0:1433;DatabaseName=SOMEDB"
/>
<Resource
name="jdbc/MYDBSRC-aaa"
auth="Container"
type="javax.sql.DataSource"
factory="org.apache.commons.dbcp.BasicDataSourceFactory"
maxActive="20"
maxIdle="5"
maxWait="10000"
username="XXXX"
password="XXXX"
driverClassName="net.sourceforge.jtds.jdbc.Driver"
url="jdbc:jtds:sqlserver://192.168.42.0:1433;DatabaseName=AOTHERDB"
/>
4) Delete /tomcat/conf/Catalina/localhost/pentaho.xml
5) Restart Pentaho, create a user someone#xxx etc etc
6) Create a report using the JNDI Name "MYDBSRC"
7) Login as someone#xxx and you will get a different report/datasource than when logging in as either user or user#aaa
Tadah !!
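To illustrate how the tenant-aware lookup is expected to behave with the settings from step 2 (tenantSeparator "#", tenantOnLeft "false", datasourceNameFormat "{1}-{0}"), here is a rough Python sketch of the name resolution; it only mirrors the observable behaviour described above and is not Pentaho's actual implementation:
def resolve_datasource(login, jndi_name):
    # tenantSeparator="#" splits the tenant id off the username;
    # tenantOnLeft="false" means the tenant sits to the right of the separator.
    if "#" in login:
        user, tenant = login.split("#", 1)
        # datasourceNameFormat="{1}-{0}": the report's JNDI name, a dash, then the tenant id.
        return jndi_name + "-" + tenant
    # requireTenantId="false": users without a tenant fall back to the plain JNDI name.
    return jndi_name

print(resolve_datasource("someone#xxx", "MYDBSRC"))  # -> MYDBSRC-xxx
print(resolve_datasource("user", "MYDBSRC"))         # -> MYDBSRC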

How to modify an XML file with the WiX Toolset

I have an XML file that includes the following content:
<!--<appcache appCacheType="None" />-->
<appcache appCacheType="SingleClient" defaultExpiration="3600"/>
During an installation patch I need to change this content in the XML file to:
<appcache appCacheType="None" />
<!--<appcache appCacheType="SingleClient" defaultExpiration="3600"/>-->
What is the best way to do it?
Thanks.
I tried (in vain) to use the MSI Community Extensions for this purpose, but wasn't able to get them up and running.
I ended up using the util:XmlFile tag from the Util extension, which works flawlessly.
Add the namespace of the Util extension to your source file on the Wix element:
xmlns:util="http://schemas.microsoft.com/wix/UtilExtension"
Then use it as a sub-element of a related component. In your case you want to delete one attribute and change the value of another. The following should do the trick; just adjust the XPath in the ElementPath attribute to the one that matches your tag (in the example it updates the appcache tag, which has an appCacheType attribute with the value SingleClient) and the file key of the XML file:
<Component Id="myComponentToUpdateTheXmlFile" ... >
<!-- Removing the defaultExpiration-attribute first -->
<util:XmlFile Id="UpdateAppCacheTag" Action="deleteValue" ElementPath="//appcache[\[]#appCacheType='SingleClient'[\]]/#defaultExpiration" File="[#MyConfigFile.xml]" SelectionLanguage="XPath" Sequence="1" Name="defaultExpiration" />
<!-- Now updating the value -->
<util:XmlFile Id="UpdateAppCacheTag" Action="setValue" ElementPath="//appcache[\[]#appCacheType='SingleClient'[\]]/#appCacheType" File="[#MyConfigFile.xml]" SelectionLanguage="XPath" Sequence="2" Value="None" />
</Component>
Be sure to also add the Util extension on the command line when invoking candle and light:
<candle or light command line> ... <parameters> ... -ext <PathToWiXExtensions>\WixUtilExtension.dll
If you want to do this only during e.g. patching, then add the appropriate condition for this component.