DKIM fails for a portion of the emails sent from the server, according to the DMARC report. What could be the cause? - dkim

I see in the report from Google that a portion of the emails sent from the server IP address failed DKIM.
What could be the reason that DKIM passes for some emails while it fails for others?
Report snapshot:
<record>
  <row>
    <source_ip>209.126.119.66</source_ip>
    <count>40</count>
    <policy_evaluated>
      <disposition>none</disposition>
      <dkim>fail</dkim>
      <spf>pass</spf>
    </policy_evaluated>
  </row>
  <identifiers>
    <header_from>numbeo.com</header_from>
  </identifiers>
  <auth_results>
    <spf>
      <domain>numbeo.com</domain>
      <result>pass</result>
    </spf>
  </auth_results>
</record>
...
<record>
  <row>
    <source_ip>209.126.119.66</source_ip>
    <count>70</count>
    <policy_evaluated>
      <disposition>none</disposition>
      <dkim>pass</dkim>
      <spf>pass</spf>
    </policy_evaluated>
  </row>
  <identifiers>
    <header_from>numbeo.com</header_from>
  </identifiers>
  <auth_results>
    <dkim>
      <domain>numbeo.com</domain>
      <result>pass</result>
      <selector>mail</selector>
    </dkim>
    <spf>
      <domain>numbeo.com</domain>
      <result>pass</result>
    </spf>
  </auth_results>
</record>
So for 40 of the emails sent from 209.126.119.66 DKIM failed, while for 70 it passed.
My DKIM configuration is specified here:
https://mladenadamovic.wordpress.com/2018/01/17/how-to-install-and-configure-dkim-with-postfix-on-ubuntu-xenial-16-04-lts/
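One check I can run from outside (a minimal sketch, assuming dnspython 2.x is installed) is to fetch the public key record for the selector shown in the passing rows, to rule out a DNS or selector problem:
import dns.resolver

# Selector "mail" and domain "numbeo.com" are taken from the passing DMARC record above.
answers = dns.resolver.resolve("mail._domainkey.numbeo.com", "TXT")
for rdata in answers:
    # A DKIM key record is often split into several TXT strings; join them before printing.
    print(b"".join(rdata.strings).decode())
If that record resolves and matches the key the server signs with, the DNS side is probably not the issue.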

Related

XML import of txt file to SQL. Duplicate id error message?

My administrative program uses an XML file to import pricelists (txt files) into the SQL database.
There are a lot of rows in the XML file, but these are the rows I have problems with:
<?xml version="1.0"?>
<BCPFORMAT>
<RECORD>
<FIELD ID="41" xsi:type="CharFixed" LENGTH="5" COLLATION=""/>
</RECORD>
<ROW>
<COLUMN SOURCE="41" NAME="VGRNR1" xsi:type="SQLVARYCHAR"/>
<COLUMN SOURCE="41" NAME="VGRNR2" xsi:type="SQLVARYCHAR"/>
</ROW>
</BCPFORMAT>
I have changed the eighth row to COLUMN SOURCE="41" because I want the data from FIELD ID="41" in both fields VGRNR1 and VGRNR2 in the SQL database.
When I make the change I get this error message from my administrative program:
duplicate element id "41"
Is it possible to edit the xml-file so I can get data from FIELD ID="41" in both fields VGRNR1 and VGRNR2 in the database?

Speedata and XPath

Unfortunately I can't handle the XPath syntax in the Layout.xml of Speedata.
I've been programming XSL for years and maybe I'm a bit biased by that.
The XML I'm trying to evaluate has the following structure:
<export>
  <object>
    <fields>
      <field key="demo1:DISPLAY_NAME" lang="de_DE" origin="default" ftype="string">Anwendungsbild</field>
      <field key="demo1:DISPLAY_NAME" lang="en_UK" origin="inherit" ftype="string">application picture</field>
      <field key="demo1:DISPLAY_NAME" lang="es_ES" origin="self" ftype="string">imagen de aplicación</field>
    </fields>
  </object>
</export>
The attempt to output the element node with the following XPath fails.
export/object/fields/field[#key='demo1:DISPLAY_NAME' and #lang='de_DE' and #origin='default']
How do I formulate the query in Speedata Publisher, please?
Thank you for your help.
The speedata software only supports a small subset of XPath. You have two options:
1) preprocess the data with the provided Saxon XSLT processor, or
2) iterate through the data yourself:
<Layout xmlns="urn:speedata.de:2009/publisher/en"
        xmlns:sd="urn:speedata:2009/publisher/functions/en">
  <Record element="export">
    <ForAll select="object/fields/field">
      <Switch>
        <Case test="#key='demo1:DISPLAY_NAME' and #lang='de_DE' and #origin='default'">
          <SetVariable variable="whatever" select="."/>
        </Case>
      </Switch>
    </ForAll>
    <Message select="$whatever"></Message>
  </Record>
</Layout>
(given your input file as data.xml)
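If you go the preprocessing route, note that the filter itself is ordinary XPath once the # shorthand is written as @. As a point of reference only (this uses Python's lxml, not the speedata toolchain), the same query against data.xml would be:
from lxml import etree

tree = etree.parse("data.xml")  # the input file from the question
nodes = tree.xpath(
    "/export/object/fields/field"
    "[@key='demo1:DISPLAY_NAME' and @lang='de_DE' and @origin='default']"
)
for field in nodes:
    print(field.text)  # prints "Anwendungsbild" for the sample data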

DateFormatTransformer not working with FileListEntityProcessor in Data Import Handler

While indexing data from a local folder on my system, I am using the configuration given below. However, the lastmodified attribute is getting indexed in the format "Wed 23 May 09:48:08 UTC", which is not the standard format used by Solr for filter queries.
So I am trying to format the lastmodified attribute in data-config.xml as given below.
<dataConfig>
  <dataSource name="bin" type="BinFileDataSource" />
  <document>
    <entity name="f" dataSource="null" rootEntity="false"
            processor="FileListEntityProcessor"
            baseDir="D://FileBank"
            fileName=".*\.(DOC)|(PDF)|(pdf)|(doc)|(docx)|(ppt)" onError="skip"
            recursive="true" transformer="DateFormatTransformer">
      <field column="fileAbsolutePath" name="path" />
      <field column="fileSize" name="size" />
      <field column="fileLastModified" name="lastmodified" dateTimeFormat="YYYY-MM-DDTHH:MM:SS.000Z" locale="en"/>
      <entity name="tika-test" dataSource="bin" processor="TikaEntityProcessor"
              url="${f.fileAbsolutePath}" format="text" onError="skip">
        <field column="Author" name="author" meta="true"/>
        <field column="title" name="title" meta="true"/>
        <!--<field column="text" />-->
      </entity>
    </entity>
  </document>
</dataConfig>
But the transformer has no effect, and the same format is indexed again. Has anyone had success with this? Is the above configuration right, or am I missing something?
The dateTimeFormat you provided does not seem to be correct. Upper- and lower-case pattern characters have different meanings. Also, the format you showed does not match the date text you are trying to parse, so the value is probably kept as unmatched.
Also, if you have several different date formats, you could parse them after DIH runs by creating a custom UpdateRequestProcessor chain. You can see the schemaless example, where several date formats are handled as part of auto-mapping, but you could also do the same thing for a specific field explicitly.
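To make the mismatch concrete: dateTimeFormat takes a Java SimpleDateFormat pattern describing the incoming text, so for "Wed 23 May 09:48:08 UTC" it would need to look something like EEE dd MMM HH:mm:ss z rather than an ISO-style output pattern (treat that as a starting point to verify, not a tested value). As a rough illustration of matching a pattern to that exact text, here is the same idea in Python, which uses strptime directives instead of SimpleDateFormat letters:
from datetime import datetime

# The timestamp as reported by the file listing; the string carries no year,
# so strptime falls back to 1900. This only illustrates pattern-to-text matching.
raw = "Wed 23 May 09:48:08 UTC"
parsed = datetime.strptime(raw, "%a %d %b %H:%M:%S %Z")
print(parsed)
The point is the same in both worlds: the pattern has to describe what the producer emits, not the format you want stored.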

HP-ALM Adding Test Cases With REST API

I'm looking to automate adding new test cases into HP-ALM using the REST API. I didn't find anything in the documentation to help me achieve this and I was wondering if anyone else had any success with it.
The API documentation provided through ALM is very helpful.
1) authenticate the session
2) capture the cookies
3) create the test (see below, from the ALM documentation)
Use the entity type you want to create and specify the appropriate fields.
Example with XML
POST /qcbin/rest/domains/{domain}/projects/{project}/defects HTTP/1.1
Content-Type: application/xml
Accept: application/xml
Cookie: QCSession=xxx; LWSSO_COOKIE_KEY=xxx
Data
<Entity Type="defect">
<Fields>
<Field Name="detected-by">
<Value>henry_tilney</Value>
</Field>
<Field Name="creation-time">
<Value>2010-03-02</Value>
</Field>
<Field Name="severity">
<Value>2-Medium</Value>
</Field>
<Field Name="name">
<Value>Returned value not does not match value in database.</Value>
</Field>
</Fields>
</Entity>
Example with JSON
POST /qcbin/rest/domains/{domain}/projects/{project}/defects HTTP/1.1
Content-Type: application/json
Accept: application/json
Cookie: QCSession=xxx; LWSSO_COOKIE_KEY=xxx
Data
{"Fields":[{"Name":"detected-by","values":[{"value":"henry_tilney"}]}, {"Name":"creation-time","values":[{"value":"2010-03-02"}]},{"Name":"severity","values":[{"value":"2-Medium"}]},{"Name":"name","values":[{"value":"Returned value not does not match value in database.</ "}]}]}
Example XML I Have Used for Test Entity
<Entity Type="test">
<Fields>
<Field Name="name">
<Value>MY TEST CASE</Value>
</Field>
<Field Name="description">
<Value>Test created from api</Value>
</Field>
<Field Name="owner">
<Value>roglesby</Value>
</Field>
<Field Name = "subtype-id">
<Value>VAPI-XP-TEST</Value>
</Field>
<Field Name = "parent-id">
<Value>6209</Value>
</Field>
</Fields>
</Entity>
I have created a small module to send REST requests to HP ALM using Python. For instance, I am using the following command:
myCreate = self.nSession.post(entUrl, headers=self.header, data=xml_data)
After a session is correctly established, I use a simple POST action. The values in parentheses are, respectively:
entUrl = '{0}/rest/domains/{1}/projects/{2}'.format(self.server, self.domain, self.project), to which you append the entity collection you want to create, for instance tests:
{server}/qcbin/rest/domains/{domain}/projects/{project}/tests
headers is a dictionary containing all the headers needed to keep the connection open.
data contains an XML or JSON payload with all the information needed to create a test (for instance).
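Putting it together, here is a minimal self-contained sketch using the requests library directly. The server, domain, and project values are placeholders, and the session is assumed to already carry the QCSession and LWSSO_COOKIE_KEY cookies from a successful authentication (not shown):
import requests

# Placeholder values; adjust for your ALM instance.
server = "https://alm.example.com/qcbin"
domain, project = "MYDOMAIN", "MYPROJECT"

session = requests.Session()
# ... authenticate here so the session holds the QCSession and LWSSO_COOKIE_KEY cookies ...

ent_url = "{0}/rest/domains/{1}/projects/{2}/tests".format(server, domain, project)
headers = {"Content-Type": "application/xml", "Accept": "application/xml"}

xml_data = """<Entity Type="test">
  <Fields>
    <Field Name="name"><Value>MY TEST CASE</Value></Field>
    <Field Name="subtype-id"><Value>VAPI-XP-TEST</Value></Field>
    <Field Name="parent-id"><Value>6209</Value></Field>
  </Fields>
</Entity>"""

response = session.post(ent_url, headers=headers, data=xml_data)
print(response.status_code, response.text)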
Hope this can help other users (since the question is quite old). Have a nice day.

Avoiding Boxing/Unboxing on unknown input

I am creating an application that parses an XML file and retrieves some data. Each XML node specifies either the data itself (const), a recordset column name to get the data from (var), a subset of possible data values depending on some condition (enum), and so on. It may also specify, alongside the data, the format in which the data must be shown to the user.
The thing is that for each node type I need to process the values differently and perform some actions, so for each node I need to store the return value in a temp variable in order to format it later... I know I could format it right there and return it, but that would mean repeating myself and I hate doing so.
So, the question: How can I store the value to return, in a temp variable, while avoiding boxing/unboxing when the type is unknown and I can't use generics?
P.S.: I'm designing the parser, the XML Schema and the view that will fill the recordset, so changes to all of them are possible.
Update
I cannot post the code nor the XML values but this is the XML structure and actual tags.
<?xml version='1.0' encoding='utf-8'?>
<root>
  <entity>
    <header>
      <field type="const">C1</field>
      <field type="const">C2</field>
      <field type="count" />
      <field type="sum" precision="2">some_recordset_field</field>
      <field type="const">C3</field>
      <field type="const">C4</field>
      <field type="const">C5</field>
    </header>
    <detail>
      <field type="enum" fieldName="some_recordset_field">
        <match value="0">M1</match>
        <match value="1">M2</match>
      </field>
      <field type="const">C6</field>
      <field type="const">C7</field>
      <field type="const">C8</field>
      <field type="var" format="0000000000">some_recordset_field</field>
      <field type="var" format="MMddyyyy">some_recordset_field</field>
      <field type="var" format="0000000000" precision="2">some_recordset_field</field>
      <field type="var" format="0000000000">some_recordset_field</field>
      <field type="enum" fieldName="some_recordset_field">
        <match value="0">M3</match>
        <match value="1">M4</match>
      </field>
      <field type="const">C9</field>
    </detail>
  </entity>
</root>
Have you tried using the var type? That way you don't need to know the type of each node. Also, some small sample of your scenario would be useful.