Select rows from the last 30 days from MARA using SSIS - sql

I'm trying to select rows whose last change date is within the last 30 days.
I tried LAEDA = ( sy-datum - 30 ) in the where clause, but it always generates an error. I connect to an SAP ABAP database.
The error message:
[EIS-Material 1] Error: ERPConnect.ERPException: Error while
receiving function return values: SYSTEM_FAILURE An error has occurred
while parsing a dynamic entry. at
ERPConnect.RFCAPI.ReceiveFunctionResults(UInt32 connectionHandle,
RFC_PARAMETER[] importing, RFC_PARAMETER[] changing, RFC_TABLE[]
tables, Encoding apiEncoding) at
ERPConnect.RFCFunction.ReceiveFunctionArguments(RFC_TABLE[]&
apiTables) at ERPConnect.RFCFunction.CallClassicAPI() at
ERPConnect.RFCFunction.ExecuteRFC(Byte[] tid) at
XtractKernel.Extractors.TableExtractor.GetPackage(RFCFunction& func)
at XtractKernel.Extractors.TableExtractor.Extract() at
XtractKernel.Extractors.ExtractorBase`1.Extract(ProcessResultCallback
processResult) at XtractIS.XtractSourceTable.PrimeOutput(Int32
outputs, Int32[] outputIDs, PipelineBuffer[] buffers) at
Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPrimeOutput(IDTSManagedComponentWrapper100
wrapper, Int32 outputs, Int32[] outputIDs, IDTSBuffer100[] buffers,
IntPtr ppBufferWirePacket)

So you are using a third-party tool to extract data from an SAP system. According to the error message, the tool makes a Remote Function Call (RFC) and hands the SQL to the ABAP backend, so your where condition must be valid ABAP/Open SQL syntax, regardless of the database behind it.
Your call (simplified) would look like this in ABAP (with the new @ syntax):
DATA(lf_dat) = sy-datum - 30.

SELECT matnr
  FROM mara
  WHERE laeda >= @lf_dat
  INTO TABLE @DATA(lt_matnr).
The problem is that you are not allowed to do this calculation within the statement, as far as I know, so you have to use a variable. But since your third-party tool only lets you write a where condition, I see no way to handle this except with a static date in the condition:
laeda >= '20190106' "YYYYMMDD
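If the filter has to stay current, one workaround is to recompute that literal outside of SAP before each load and splice it into the where condition. A minimal sketch in Python (purely illustrative; Xtract IS itself does not run Python, so this assumes a pre-processing step of your own builds the condition string):

from datetime import date, timedelta

# build the YYYYMMDD literal for "30 days ago" and wrap it in the where condition
cutoff = (date.today() - timedelta(days=30)).strftime('%Y%m%d')
condition = "LAEDA >= '" + cutoff + "'"
print(condition)   # e.g. LAEDA >= '20190106'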
You can add the ABAP tag to your question to attract more specialists on this ABAP-specific topic.

I see in the Xtract IS online help that there's a custom function module named Z_THEO_READ_TABLE installed on the ABAP side, which executes the SQL sent by Xtract IS. The module is provided in two flavors, one being for ABAP >= 740 SP 5, so I guess it's a version for ABAP SQL strict mode.
So I thought that maybe you could write this ABAP-like where clause by using a "host expression", which is valid in ABAP SQL strict mode:
LAEDA = @( sy-datum - 30 )
Based on the error message you have, "An error has occurred while parsing a dynamic entry", I guess that this function module does something like SELECT (dyn-columns) FROM (dyn-table) WHERE (dyn-condition), i.e. all elements are defined dynamically at run time.
Unfortunately, the ABAP documentation "sql_cond - (cond_syntax)" says that "Host expressions are not allowed in dynamic logical expressions."
So, for now, it's impossible to write the where clause the way you wish.
There are probably many ways to get around this limit (like creating an SAP Query or a BAPI in SAP and calling it from Xtract IS, etc.), but that's another question.

In MySQL / MariaDB, this works:
select ...
from ...
where date >= DATE_ADD(CURDATE(), INTERVAL -30 DAY)
but we need to know what database you are working with.

You may try this if you are using a SQL Server database:
SELECT DATEADD(MONTH, -1, GETDATE())
or, for exactly 30 days back, SELECT DATEADD(DAY, -30, GETDATE()).

You cannot specify an ABAP formula like that through SAP Open SQL.
Not to directly resolve your challenge (as you have a product limitation), but here is how a dynamic filter is achieved with the AecorSoft tool:
(DT_WSTR, 4)(DATEPART("yy" , GETDATE())) + RIGHT("0" + (DT_WSTR, 4)DATEPART("mm" , GETDATE()),2) + RIGHT("0" + (DT_WSTR, 4)DATEPART("dd" , GETDATE()),2)
For the complete use case, you can check the blog post "SAP Table Delta Extract Made Easy through Dynamic Filters".

Related

SELECT FROM @itab causes syntax error. Why?

I am trying to use SELECT FROM @itab as explained in the SAP docs.
I have never used this feature, but I think it is great: you can query an internal data structure that exists only in the interpreter's RAM as if it were a real table in the database. I am impressed.
Here is the ABAP code:
data: lt_get_auth_values TYPE STANDARD TABLE OF US335.

CALL FUNCTION 'GET_AUTH_VALUES'
  EXPORTING
    OBJECT1 = 'Z:FOO'
    USER    = sy-uname
  TABLES
    VALUES  = lt_get_auth_values.

SELECT highval FROM @lt_get_auth_values AS mytab
  WHERE field = 'WERKS'
  INTO TABLE @DATA(static_perm_filter_fields).
I can't activate the program because "FROM @lt_get_auth_values" is a syntax error according to my system.
What's wrong with this line?
SAP version: 740 (sorry, at first I thought it was 752)
SELECT ... FROM @itab appeared in 7.52, so it should work.
On my 7.52 system it works, but you must indicate a table alias. There's an example in the ABAP documentation (cf. the first link above).

Extract incident details from Service Now in Excel

I am trying to extract ticket details from ServiceNow. Is there a way to extract the details without ODBC? I have also tried the solution described at https://community.servicenow.com/docs/DOC-3844, but I am receiving an error 9 - subscript out of range.
Is there a better way to extract details efficiently? I tried asking this in the ServiceNow forum, but I thought I might get other opinions from here.
It's been a while since this question was asked; hopefully the following is still useful.
I am extracting change data (not incidents), but the process should still be the same. You will need to gather the incident table and column information. Then there are a couple of ways to approach the problem.
1) If the data you are extracting has fixed parameters, such as a fixed period, fixed columns, a fixed group, etc., then you can create a report within ServiceNow and then use the REST/SOAP API to get the data in text/csv format. You can use different Python modules to convert from csv to xls or xlsx depending on your need; I used openpyxl, csv, xlrd, xlsxwriter, etc. (a minimal csv-to-xlsx sketch follows the query example below).
See here for an example:
ServiceNow - How to use SOAP to download reports
2) If the data has dynamic parameters where you need to change columns, dates, filters, etc., you can still use the SOAP/REST API, but form the query within a Python script instead of relying on a static report. This way you can change it on the fly based on your requirements.
Here is an example query against the DB. You can use it with the example above; just replace the URL with the following:
import requests  # REST call to the ServiceNow Table API

table_name = 'u_change_table_name'  # SN table holding change/incident info
table_limit = 800
table_query = 'active=true&sysparm_display_value=true&planned_start_date=today'  # alternative query, not used in the URL below
date_query = 'chg_start_date>=javascript:gs.daysAgoStart(1)^active=true^chg_type=normal'
table_fields = 'chg_number,chg_start_date,chg_duration,chg_end_date'  # actual column names from the DB, not from the SN report

url = (
    'https://yourcompany.service-now.com/api/now/table/' + table_name +
    '?sysparm_query=' + date_query + '&sysparm_fields=' +
    table_fields + '&sysparm_limit=' + str(table_limit)
)

# fetch the records (credentials are placeholders); the Table API wraps results in a 'result' key
response = requests.get(url, auth=('user', 'password'), headers={'Accept': 'application/json'})
records = response.json()['result']
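And for approach 1), once the report has been downloaded as csv, converting it to xlsx takes only a few lines with openpyxl. A minimal sketch (file names are placeholders, and openpyxl is assumed to be installed):

import csv
from openpyxl import Workbook

def csv_to_xlsx(csv_path, xlsx_path):
    # copy every csv row into a fresh worksheet, then save it as xlsx
    wb = Workbook()
    ws = wb.active
    with open(csv_path, newline='') as f:
        for row in csv.reader(f):
            ws.append(row)
    wb.save(xlsx_path)

csv_to_xlsx('change_report.csv', 'change_report.xlsx')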

SQL simulator on Prolog

I need to build an SQL simulator in Prolog. I need to simulate functions like create_table, insert, delete, update_table, select_table, drop_table, show_table, etc. I am learning how to use assert and retract, but I'm getting some errors in my first predicate create_table(N,A), where N is the name of the table and A is a list with the attributes.
An example is create_table("student",["name","lastname","age"]). This will create my table named "student" with the attributes ["name","lastname","age"].
I was reading about how to work with assert, and I see I need to declare the predicate dynamic before asserting, so my code is:
len([],0).
len([_|T],N) :- len(T,X), N is X+1.
create_table(_, []):-!.
create_table(N, Atributos):- len(Atributos, L), dynamic N/L, assert(N(Atributos)).
But I get this error, ':7: Syntax error: Operator expected', on this line:
create_table(N, Atributos):- len(Atributos, L), dynamic N/L, assert(N(Atributos)).
What am I doing wrong? Excuse me, I don't speak English well.
From the error message, it seems you're using SWI-Prolog.
Your code could be as follows (note that len/2 can be replaced by the built-in length/2):
create_table(N, Atributos) :-
    length(Atributos, L),
    dynamic(N/L),
    T =.. [N|Atributos],   % this missing 'constructor' was the cause of the error message
    assert(T).             % this is problematic
There is an important 'design' problem: the CREATE TABLE SQL statement works at metadata level.
If you do, for instance,
?- assertz(student('Juan', 'Figueira', 20)).
pretending that the predicate student/3 holds table data, you're overlapping data and metadata.
Using dynamic/1 and assert is a tricky, non-logical aspect of Prolog, and dynamically creating dynamic predicates is really unusual. Fundamentally, you cannot have a Prolog query with the predicate name as a variable, e.g.
query(N,X) :- N=student, N(X).
My suggestion is you remove a layer of complexity and have one dynamic predicate 'table', and assert your SQL tables as new 'table' clauses i.e.
:- dynamic table/2.
:- assertz(table(student,['Joe','Young',18])).
query(N,X) :- table(N,X).
:- query(student,[Name,Lastname,Age]).

BigQuery: Failed to create view. Unexpected. Please try again

I am trying to save a view in BigQuery, and keep getting the same error:
Failed to create view. Unexpected. Please try again.
The query is as follows:
SELECT
interaction.id AS Interaction.ID,
interaction.author.name AS Interaction.Author.Name,
interaction.author.username AS Interaction.Author.Username,
interaction.content AS Interaction.Content,
interaction.created_at_timestamp AS Interaction.Created_At_Timestamp,
klout.score AS Klout.Score,
twitter.geo.latitude AS Twitter.Geo.Latitude,
twitter.geo.longitude AS Twitter.Geo.Longitude,
twitter.media.expanded_url AS Twitter.Media.ExpandedUrl,
twitter.media.type AS Twitter.Media.Type,
twitter.place.country AS Twitter.Place.Country,
twitter.user.followers_count AS Twitter.User.Followers,
twitter.user.friends_count AS Twitter.User.Friends,
twitter.user.listed_count AS Twitter.User.Listed,
twitter.retweet.count AS Twitter.Retweet.Count
FROM
[**DATASET_NAME_OMITTED**.main_table]
WHERE
(interaction.id IS NOT NULL)
AND (interaction.created_at_timestamp IS NOT NULL)
AND (interaction.created_at_timestamp >= DATE_ADD(USEC_TO_TIMESTAMP(UTC_USEC_TO_HOUR(NOW())), -1, "DAY"))
AND (interaction.created_at_timestamp < USEC_TO_TIMESTAMP(UTC_USEC_TO_HOUR(NOW())))
The query validates, and runs without any problems:
Valid: This query will process 203 MB when run.
I did notice that twitter.media is of type REPEATED RECORD. That said, removing the twitter.media.* fields does not fix the issue.
I have been able to successfully save other views with the same timestamp restrictions and naming conventions. Attempting to save this one consistently fails.
For context: This table is populated by DataSift via their BigQuery connector (default, catch-all schema).
This is really weird.
I ran an experiment and pulled out each of the alias operations, and it worked.
I then slowly added some of them back in and, again, it continued working. However, it seems that certain aliases do not want to work (I have no idea why).
I ended up with the following, which contains most of your aliases, and seems to work as expected:
SELECT
interaction.id AS Interaction.ID,
interaction.author.name AS Interaction.Author.Name,
interaction.author.username AS Interaction.Author.Username,
interaction.content AS Interaction.Content,
interaction.created_at_timestamp AS Interaction.Created_At_Timestamp,
klout.score AS Klout.Score,
twitter.geo.latitude AS Twitter.Geo.Latitude,
twitter.geo.longitude AS Twitter.Geo.Longitude,
twitter.media.expanded_url,
twitter.media.type AS Twitter.Media.Type,
twitter.place.country AS Twitter.Place.Country,
twitter.user.followers_count,
twitter.user.friends_count,
twitter.user.listed_count,
twitter.retweet.count AS Twitter.Retweet.Count
FROM [**DATASET_NAME_OMITTED**.main_table]
WHERE
(interaction.id IS NOT NULL)
AND (interaction.created_at_timestamp IS NOT NULL)
AND (interaction.created_at_timestamp >= DATE_ADD(USEC_TO_TIMESTAMP(UTC_USEC_TO_HOUR(NOW())), -1, "DAY"))
AND (interaction.created_at_timestamp < USEC_TO_TIMESTAMP(UTC_USEC_TO_HOUR(NOW())))
What seems really weird is that there is no pattern to what will and will not work. The twitter.user.* fields are integers but will not accept aliases; however, the integer field klout.score does accept an alias.

Issues with a query on the AS/400 with LIKE clause

We are using Hibernate to connect to the AS/400, and we are having issues with a query on the AS/400 that uses the LIKE clause.
The following error is shown:
java.sql.SQLException: [SQL0131] Operands of LIKE not compatible or not valid
My query is auto-generated by Hibernate:
select tab_parame0_.C1IMCD as C1_560_, tab_parame0_.C1NINB as C2_560_,
tab_parame0_.C1JXCD as C3_560_, tab_parame0_.C1HLTX as C4_560_, tab_parame0_.C1HMTX as C5_560_,
tab_parame0_.C1HDST as C6_560_, tab_parame0_.C1NGNB as C7_560_, tab_parame0_.C1NJNB as C8_560_,
tab_parame0_.C1NFNB as C9_560_, tab_parame0_.C1NHNB as C10_560_, tab_parame0_.C1HCST as C11_560_
from RYC1REP tab_parame0_
where lower(tab_parame0_.C1HLTX) like lower(?)
order by tab_parame0_.C1IMCD asc
fetch first 10 rows only
SQL0131 indicates a type mismatch.
What datatype is tab_parame0_.C1HLTX? What datatype is your query parameter?
Please include your HQL/JPQL query source code for comparison.
You may have to set up an SQL trace to see exactly what the AS/400 is receiving.
See How do I obtain trace information from my Java program using the Toolbox?
I recommend you change LIKE LOWER(:parameter) to LIKE :parameter in your source query, call .toLowerCase() on the value when you set the parameter (e.g. query.setParameter("parameter", value.toLowerCase())), and see how that works.