PhoneGap + database: how to save a result in the database and reuse the stored data in a page - sql

My understanding of JavaScript is really limited. I have a simple calculation to make. A, B and C are three input numbers, and I need to store the result D in the iOS database for future use, using PhoneGap and jQuery.
[ (A-10) / (200/B) ] + [ (C/10) x ((10xB)/45) ] = D
This is my simple JavaScript:
function test() {
  var a = Number(document.calculator.entree1.value);
  var b = Number(document.calculator.entree2.value);
  var c = Number(document.calculator.entree3.value);
  var d = ((a - 10) / (200 / b)) + ((c / 10) * ((10 * b) / 45));
  document.calculator.total.value = Math.round(d);
}
My question is: how can I store this result, Math.round(d), in the PhoneGap database and use it again later in a daily results page?
Thanks.
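A minimal sketch of how the rounded result could be saved with the WebSQL-style database PhoneGap exposes on iOS (openDatabase / transaction / executeSql). The table and column names (results, day, d) and the computeD helper are made up for illustration; only the formula itself comes from the question:

```javascript
// Pure calculation, kept separate so it can be tested outside the app.
function computeD(a, b, c) {
  return Math.round(((a - 10) / (200 / b)) + ((c / 10) * ((10 * b) / 45)));
}

// WebSQL sketch: PhoneGap/Cordova on iOS exposes openDatabase.
// Table/column names here are hypothetical.
function saveResult(d) {
  if (typeof openDatabase === 'undefined') return; // not in a WebSQL environment
  var db = openDatabase('calc', '1.0', 'calc results', 2 * 1024 * 1024);
  db.transaction(function (tx) {
    tx.executeSql('CREATE TABLE IF NOT EXISTS results (day TEXT, d INTEGER)');
    tx.executeSql('INSERT INTO results (day, d) VALUES (?, ?)',
                  [new Date().toISOString().slice(0, 10), d]);
  });
}
```

Reading the value back for a daily page would be another tx.executeSql('SELECT ...') inside a transaction, with the rows available in the result-set callback.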

Related

SAP Business One Universal Function #STORE variable capacity

I am using SAP Business One version 9.3. I am writing a Universal Function that retrieves data from a table, places it in a #STORE variable, and then calls another function to process the data. The problem I am facing is that after a certain number of characters the data in the #STORE variable gets truncated. I cannot find anywhere in the documentation what the limit for #STORE variables is.
Here is my code:
#STORE1 = SQL(select (field1 + '*' + field2 + ':') from someTable FOR XML PATH(''));
UF(UF-002)
The idea is to create one long string from all the records in someTable, store it in #STORE1 and then use it in UF-002.
UF-002 is an Http Trigger function that calls an API with this payload:
{
"records": "#STORE1"
}
I am very new to SAP B1, and if there is a better way of doing this, I would appreciate the input. If this method is OK, what are my options for passing the data to UF-002? The table in question has about 450 records.
Thank you.
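The #STORE limit itself isn't documented here, but the usual workaround for truncation is to split the payload across several store variables. A generic JavaScript sketch of that chunking idea (chunkPayload and maxLen are hypothetical names standing in for whatever the real limit turns out to be):

```javascript
// If a store variable holds at most maxLen characters, split the
// concatenated payload into chunks and pass several pieces instead of one.
function chunkPayload(records, maxLen) {
  var joined = records.join(':');   // mimics the field + ':' concatenation
  var chunks = [];
  for (var i = 0; i < joined.length; i += maxLen) {
    chunks.push(joined.slice(i, i + maxLen));
  }
  return chunks;
}
```

Each chunk could then go into its own #STORE variable (#STORE1, #STORE2, ...) and be reassembled in UF-002.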

Left join not working properly in Azure Stream Analytics

I'm trying to create a simple left join between two inputs (event hubs). The source of the inputs is a function app that processes a RabbitMQ queue and sends the messages to an event hub.
In my eventhub1 I have this data:
[{
"user": "user_aa_1"
}, {
"user": "user_aa_2"
}, {
"user": "user_aa_3"
}, {
"user": "user_cc_1"
}]
In my eventhub2 I have this data:
[{
"user": "user_bb_1"
}, {
"user": "user_bb_2"
}, {
"user": "user_bb_3
}, {
"user": "user_cc_1"
}]
I use this SQL to create my left join:
select hub1.[user] h1,hub2.[user] h2
into thirdTestDataset
from hub1
left join hub2
on hub2.[user] = hub1.[user]
and datediff(hour,hub1,hub2) between 0 and 5
and the test result looks OK...
The problem is when I run the actual job... I get this result in the Power BI dataset...
Any idea why my left join isn't working like a regular SQL query?
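To make the expected output concrete, here is a plain JavaScript simulation of the left join over the two sample payloads (it ignores the DATEDIFF window and assumes at most one match per user; leftJoin is my own helper name, not ASA code):

```javascript
var hub1 = ['user_aa_1', 'user_aa_2', 'user_aa_3', 'user_cc_1'];
var hub2 = ['user_bb_1', 'user_bb_2', 'user_bb_3', 'user_cc_1'];

// Left join: every hub1 row appears once; h2 is null unless hub2 matches.
function leftJoin(left, right) {
  return left.map(function (u) {
    return { h1: u, h2: right.indexOf(u) >= 0 ? u : null };
  });
}
```

So the only non-null h2 in the sample should be user_cc_1; the hub1 users must still appear, with h2 null.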
I tested your SQL query and it works well for me too. So, when you can't get the expected output after executing the ASA job, I suggest you follow the troubleshooting steps in this document.
Based on your output, it seems that HUB2 has become the left table. You could use the diagnostic logs in ASA to locate the true output of the job execution.
I tested end-to-end using blob storage for inputs 1 and 2 with your sample data, and a Power BI dataset as output, and observed the expected result.
I think there are a few things that can go wrong with your query:
First, your join has a 5-hour window: basically, that means it looks at EH1 and EH2 for matches during that large window, so live results will be different from the sample input, for which you have only 1 row. Can you validate that you had no match during this 5-hour window?
Additionally, by default PBI streaming datasets are "hybrid datasets", so they accumulate results, and since there is no timestamp in your output schema there is no good way to know when a result was emitted; you may also be seeing previous data there. I'd suggest a few things here:
In Power BI, change the options of your dataset: disable "Historic data analysis" to remove caching of data.
Add a timestamp column to make sure you can identify when the data was generated (the first line of your query then becomes: select System.Timestamp() as time, hub1.[user] h1, hub2.[user] h2).
Let me know if it works for you.
Thanks,
JS (Azure Stream Analytics)

How to programmatically list available Google BigQuery locations?

How to programmatically list available Google BigQuery locations? I need a result similar to what is in the table of this page: https://cloud.google.com/bigquery/docs/locations.
As @shollyman has mentioned:
The BigQuery API does not expose the equivalent of a list-locations call at this time.
So, you should consider filing a feature request on the issue tracker.
Meantime, I wanted to add Option 3 to the two already proposed by @Tamir.
This is a little naïve option with its pros and cons, but depending on your specific use case it can be useful and easily adapted to your application:
Step 1 - load the page's HTML (https://cloud.google.com/bigquery/docs/locations)
Step 2 - parse and extract the needed info
Obviously, this is super simple to implement in any client of your choice.
As I am a huge BigQuery fan, I did a proof of concept using Magnus, a BigQuery workflow tool.
I created a workflow with just two tasks:
API Task - to load the page's HTML into the variable var_payload
and
BigQuery Task - to parse and extract the wanted info out of the HTML
The "whole" workflow is as simple as it looks in the screenshot below.
The query I used in the BigQuery Task is:
CREATE TEMP FUNCTION decode(x STRING) RETURNS STRING
LANGUAGE js
OPTIONS (library="gs://my_bucket/he.js")
AS """
return he.decode(x);
""";
WITH t AS (
  SELECT html,
    REGEXP_EXTRACT_ALL(
      REGEXP_REPLACE(html,
        r'\n|<strong>|</strong>|<code>|</code>', ''),
      r'<table>(.*?)</table>'
    )[OFFSET(0)] x
  FROM (SELECT '''<var_payload>''' AS html)
)
SELECT pos,
  line[SAFE_OFFSET(0)] Area,
  line[SAFE_OFFSET(1)] Region_Name,
  decode(line[SAFE_OFFSET(2)]) Region_Description
FROM (
  SELECT
    pos, REGEXP_EXTRACT_ALL(line, '<td>(.*?)</td>') line
  FROM t,
    UNNEST(REGEXP_EXTRACT_ALL(x, r'<tr>(.*?)</tr>')) line
    WITH OFFSET pos
  WHERE pos > 0
)
As you can see, I used the he library. From its README:
he (for “HTML entities”) is a robust HTML entity encoder/decoder written in JavaScript. It supports all standardized named character references as per HTML, handles ambiguous ampersands and other edge cases just like a browser would ...
After the workflow is executed and those two steps are done, the result lands in project.dataset.location_extraction, and we can query this table to make sure we've got what we expected.
Note: obviously, the parsing and extraction of the location info is quite simplified, and it can surely be improved to be more resilient to changes in the source page layout.
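The parse-and-extract step above is just regex work, so it ports directly to any client. A plain-JavaScript equivalent of the query's approach (extractRows is my own name; the regexes assume the same simple <table>/<tr>/<td> layout):

```javascript
// Strip markup noise, grab the first <table>, then split rows and cells,
// mirroring the REGEXP_REPLACE / REGEXP_EXTRACT_ALL pipeline in the query.
function extractRows(html) {
  var cleaned = html.replace(/\n|<strong>|<\/strong>|<code>|<\/code>/g, '');
  var table = (cleaned.match(/<table>(.*?)<\/table>/) || [])[1] || '';
  var rows = table.match(/<tr>(.*?)<\/tr>/g) || [];
  return rows.slice(1).map(function (row) {   // skip the header row (pos > 0)
    return (row.match(/<td>(.*?)<\/td>/g) || [])
      .map(function (td) { return td.replace(/<\/?td>/g, ''); });
  });
}
```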
Unfortunately, there is no API that provides the list of supported BigQuery locations.
I see two options which might be good for you:
Option 1
You can manually manage a list and expose it to your clients via an API or any other means your application supports (you will need to follow BigQuery product updates to keep this list current).
Option 2
If your use case is to provide a list of the locations you are using to store your own data, you can call datasets.list to get your datasets and read each one's location to display/use in your app. A dataset resource looks like this:
{
  "kind": "bigquery#dataset",
  "id": "id1",
  "datasetReference": {
    "datasetId": "datasetId",
    "projectId": "projectId"
  },
  "location": "US"
}
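Collecting the distinct locations from such responses is a one-liner in any client. A sketch assuming objects shaped like the sample above (distinctLocations is a made-up helper name):

```javascript
// Collect the distinct "location" values from a list of dataset resources.
function distinctLocations(datasets) {
  var seen = {};
  return datasets
    .map(function (d) { return d.location; })
    .filter(function (loc) {
      if (!loc || seen[loc]) return false;
      seen[loc] = true;
      return true;
    });
}
```

Note this only ever yields locations you already have datasets in, not the full list of locations BigQuery supports.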

Basic MDX question using MS Excel OLAP tools

I will make this question and scenario as basic as possible, since I have no background in programming. How do I write a script where every Red is multiplied by 5, Yellow by 6 and Blue by 7? The new measure should aggregate in the grand total. I don't know what expressions to use. Just use [Product] for the colors and [Measure] for the quantity.
I don't yet understand the use of MEMBERS and other expressions, as this is my first time working with MDX. I tried
([Measure].[Quantity],[Product].&[Yellow])*6
but it just multiplies everything by 6. Maybe FILTER? IIF? I just don't know how. The script will go a long way once I apply it to our database. Thanks!
I know you asked about doing this with Excel, but if you were writing an MDX query you could create a new measure and run a query like this:
WITH
MEMBER measures.[ColorQuantity] AS
  CASE WHEN [Product].[Product].currentmember.member_key = "Red" THEN measures.[Quantity] * 5
       WHEN [Product].[Product].currentmember.member_key = "Yellow" THEN measures.[Quantity] * 6
       WHEN [Product].[Product].currentmember.member_key = "Blue" THEN measures.[Quantity] * 7
       ELSE measures.[Quantity]
  END
SELECT
{
  measures.[Quantity], measures.[ColorQuantity]
} ON 0,
NON EMPTY
{
  [Product].[Product].[All].Children // I don't know the full dimension and hierarchy path you want to use
} ON 1
FROM YourCubeName
This might help you get started.
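The CASE expression above boils down to looking up a per-color factor and summing the adjusted quantities. In ordinary code the same logic would look like this (colorQuantity and grandTotal are illustrative names; the multipliers are the ones from the question):

```javascript
// Per-color multipliers as asked: Red x5, Yellow x6, Blue x7.
var FACTORS = { Red: 5, Yellow: 6, Blue: 7 };

function colorQuantity(color, quantity) {
  var factor = FACTORS[color] || 1;  // unknown colors pass through, like the ELSE branch
  return quantity * factor;
}

// The grand total aggregates the adjusted quantities, as the new measure would.
function grandTotal(rows) {
  return rows.reduce(function (sum, r) {
    return sum + colorQuantity(r.color, r.qty);
  }, 0);
}
```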

What is the best way to get differences between records' column values?

I have a model CounterRecord with a column data. The records list:
r1.data = 10
r2.data = 12
r3.data = 15
r4.data = 20
I would like to get the differences between the current and previous records: [10, 2, 3, 5]. Does Rails have a default way to do that?
I don't think Rails or ActiveRecord has such stuff, but it can be easily done with a little bit of Ruby:
a = CounterRecord.pluck(:data)
a.each_with_index.map { |c, i| i.zero? ? c : c - a[i - 1] }
Does Rails have a default way to do that?
No.
Possible solution:
CounterRecord.pluck(:data).each_cons(2).map { |first, second| second - first }
NOTE: with a big number of records this will easily kill your memory.
There is no default way to do this as such.
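For comparison, the consecutive-pair difference that each_cons(2) computes looks like this in plain JavaScript (pairwiseDiffs is my own name; note it yields [2, 3, 5] for the sample data, without the leading 10):

```javascript
// Differences between consecutive elements, like Ruby's each_cons(2).map.
function pairwiseDiffs(values) {
  return values.slice(1).map(function (v, i) { return v - values[i]; });
}
```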