Escape outer variable trap in two nested foreach loops with Dictionary&lt;TKey, TValue&gt;

I have two objects that I initialize like this:
var series = new Dictionary<string, Dictionary<string, int>>()
{
    { "0", new Dictionary<string, int>() },
    { "1", new Dictionary<string, int>() },
    { "2", new Dictionary<string, int>() }
};
var periodValues = new Dictionary<string, int>()
{
    { "Jan", 0 },
    { "Feb", 0 },
    { "Mar", 0 }
};
Then I have another object that is already filled with values:
var target = new Dictionary<string, Dictionary<string, int>>()
{
    { "0",
        new Dictionary<string, int>()
        {
            { "Jan", 12 },
            { "Mar", 22 }
        }
    },
    { "1",
        new Dictionary<string, int>()
        {
            { "Mar", 37 }
        }
    },
    { "2",
        new Dictionary<string, int>()
        {
            { "Jan", 4 },
            { "Feb", 48 },
            { "Mar", 22 }
        }
    }
};
series and target always have the same keys, while target[key].Keys (with key of type string) can, for any key, be a subset of (at most equal to) periodValues.Keys.
Now I want to fill series.Values according to the keys in periodValues.Keys, but with the values from target[key]. Therefore:
foreach (var numberValue in target.Keys)
{
    foreach (var period in target[numberValue].Keys)
    {
        periodValues[period] = target[numberValue][period];
    }
    series[numberValue] = periodValues;
}
But I fall into the outer variable trap... meaning that series[key] for every key in series.Keys ends up equal to the last periodValues. I tried many ways to escape the outer variable trap according to this article, with no luck. Does anybody know a solution, or maybe a better approach?
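As far as I can tell, all series[key] entries end up equal because Dictionary&lt;TKey, TValue&gt; is a reference type: every series[numberValue] = periodValues stores a reference to the same single instance. A minimal sketch of that effect (hypothetical variable names, not my real code):
// Two variables referencing the same dictionary instance:
var shared = new Dictionary<string, int> { { "Jan", 0 } };
var a = shared;
var b = shared;

b["Jan"] = 99;
Console.WriteLine(a["Jan"]); // 99 — a and b are the same object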

In case anybody is interested, I solved the problem by declaring and initializing periodValues inside the outer foreach:
foreach (var numberValue in target.Keys)
{
    var periodValues = new Dictionary<string, int>()
    {
        { "Jan", 0 },
        { "Feb", 0 },
        { "Mar", 0 }
    };
    foreach (var period in target[numberValue].Keys)
    {
        periodValues[period] = target[numberValue][period];
    }
    series[numberValue] = periodValues;
}
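An alternative that avoids the shared instance altogether is to build a fresh inner dictionary for every outer key, for example with LINQ's ToDictionary. This is only a sketch, assuming the fixed period list above and a default of 0 for missing periods:
// Requires: using System.Linq;
var periods = new[] { "Jan", "Feb", "Mar" };

var series = target.ToDictionary(
    kvp => kvp.Key,
    kvp => periods.ToDictionary(
        p => p,
        p => kvp.Value.TryGetValue(p, out var v) ? v : 0));
Each call to the inner ToDictionary returns a brand-new Dictionary<string, int>, so no two series values can alias each other.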

Related

Adding dynamic maps in DynamoDB with Kotlin

I'm using Spring Boot, Kotlin and CrudRepository to add items to my Dynamo Table.
The map I'm trying to add is dynamic and its attributes can change every single time.
I add the date of the object (delta) and save it, but I am running into several errors.
When I save with this model:
@DynamoDBTable(tableName = "delta_computers_inventory")
class DeltaComputersInventory(
    @DynamoDBHashKey
    @DynamoDBAttribute(attributeName = "delta_computers_inventory_id")
    var id: String = UUID.randomUUID().toString(),
    @DynamoDBTyped(DynamoDBMapperFieldModel.DynamoDBAttributeType.M)
    @DynamoDBAttribute(attributeName = "delta")
    var delta: Map<String, Any?> = mapOf(),
) {
    @DynamoDBAttribute(attributeName = "date")
    var date: String = OffsetDateTime.now(ZoneOffset.UTC).format(
        DateTimeFormatter.ISO_DATE_TIME
    )
}
and I do:
.doOnSuccess { listOfDocuments ->
    deltaComputersRepository.saveAll(
        listOfDocuments.map {
            DeltaComputersInventory(
                delta = it,
            )
        }
    )
}
I get:
reactor.core.Exceptions$ErrorCallbackNotImplemented: com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException: not supported; requires #DynamoDBTyped or #DynamoDBTypeConverted
Instead, if I do it through an Item (Item.fromMap(it)):
@DynamoDBTable(tableName = "delta_computers_inventory")
class DeltaComputersInventory(
    @DynamoDBHashKey
    @DynamoDBAttribute(attributeName = "delta_computers_inventory_id")
    var id: String = UUID.randomUUID().toString(),
    @DynamoDBTyped(DynamoDBMapperFieldModel.DynamoDBAttributeType.M)
    @DynamoDBAttribute(attributeName = "delta")
    var delta: Item = Item(),
) {
    @DynamoDBAttribute(attributeName = "date")
    var date: String = OffsetDateTime.now(ZoneOffset.UTC).format(
        DateTimeFormatter.ISO_DATE_TIME
    )
}
I get no error, but the item in DynamoDB shows up empty:
{
    "delta_computers_inventory_id": {
        "S": "d389d63e-8e93-4b08-b576-e37fae9a4d58"
    },
    "date": {
        "S": "2023-01-24T12:00:33.620015Z"
    },
    "delta": {
        "M": {}
    }
}
What am I doing wrong?

FluentAssertions: collection subset should contain equivalent of list

I have a collection:
new[] { new { A = 5, PropIDontCareAbout = "XXX" }, new { A = 7, PropIDontCareAbout = "XXX" }, new { A = 9, PropIDontCareAbout = "XXX" } }
I want to check that it at least contains both new { A = 9 } and new { A = 5 } in any order.
I can use ContainEquivalentOf, but I have to do it one-by-one:
var actual = new[] {
    new { A = 5, PropIDontCareAbout = "XXX" },
    new { A = 7, PropIDontCareAbout = "XXX" },
    new { A = 9, PropIDontCareAbout = "XXX" }
};
var expected = new[] { new { A = 5 }, new { A = 9 } };
foreach (var expectedItem in expected) {
    actual.Should().ContainEquivalentOf(expectedItem);
}
Update: I can't use Contain because it requires the actual and expected objects to have the same type.
I do not see a built-in solution for this. You can work around it by using Select to project the subject into the shape of the expectation; then you can use
Contain(IEnumerable<T> expected, ...):
var actual = new[] {
    new { A = 1, PropIDontCareAbout = "XXX" },
    new { A = 7, PropIDontCareAbout = "XXX" },
    new { A = 9, PropIDontCareAbout = "XXX" }
};
actual.Select(x => new { x.A }).Should()
    .Contain(new[] { new { A = 9 }, new { A = 5 } });
If one of the elements is not in the list, you get a message like this:
Expected collection {{ A = 1 }, { A = 7 }, { A = 9 }}
to contain {{ A = 9 }, { A = 5 }},
but could not find {{ A = 5 }}.
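If the expectation really only involves one property, another option (just a sketch; it gives up comparing whole expected objects, but keeps a single fluent chain) is Contain with a predicate:
// Assert membership by predicate instead of projecting to a new shape.
actual.Should()
    .Contain(x => x.A == 9)
    .And.Contain(x => x.A == 5);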

ASP.NET Core MVC with Google Charts - No Data

I'm trying to implement Google Charts with ASP.NET Core MVC.
I've been at it for two days, but I cannot figure out my mistake. I don't get an error, and I can see the array in the console, but no data shows up.
VIEWMODEL
public class ZipCodes
{
    public string ZipCode { get; set; }
    public int ZipCount { get; set; }
}
CONTROLLER
public ActionResult IncidentsByZipCode()
{
    var incidentsByZipCode = (from o in _context.Incident
                              group o by o.ZipCode into g
                              orderby g.Count() descending
                              select new
                              {
                                  ZipCode = g.Key,
                                  ZipCount = g.Count()
                              }).ToList();
    return Json(incidentsByZipCode);
}
VIEW
function IncidentsByZipCode() {
    $.ajax({
        type: 'GET',
        url: '@Url.Action("IncidentsByZipCode", "Controller")',
        success: function (response) {
            console.log(response);
            var data = new google.visualization.DataTable();
            data.addColumn('string', 'ZipCode');
            data.addColumn('number', 'ZipCount');
            for (var i = 0; i < response.result.length; i++) {
                data.addRow([response.result[i].ZipCode, response.result[i].ZipCount]);
            }
            var chart = new google.visualization.ColumnChart(document.getElementById('incidentsByZipCode'));
            chart.draw(data,
                {
                    title: "",
                    position: "top",
                    fontsize: "14px",
                    chartArea: { width: '100%' },
                });
        },
        error: function () {
            alert("Error loading data!");
        }
    });
}
The data cannot be added and rendered correctly because you are not building it the way the Column Chart API expects. According to the official example, you need to make some changes.
Here is the ajax code:
<script>
    // Generate random colors
    function bg() {
        var r = Math.floor(Math.random() * 256);
        var g = Math.floor(Math.random() * 256);
        var b = Math.floor(Math.random() * 256);
        return "rgb(" + r + ',' + g + ',' + b + ")";
    }

    function IncidentsByZipCode() {
        $.ajax({
            type: 'GET',
            url: '@Url.Action("IncidentsByZipCode", "home")',
            success: function (response) {
                google.charts.load('current', { packages: ['corechart'] });
                google.charts.setOnLoadCallback(drawChart);
                function drawChart() {
                    var obj = [
                        ["Element", "Density", { role: "style" }],
                    ];
                    $.each(response, function (index, value) {
                        obj.push([value.zipCode, value.zipCount, bg()]);
                    });
                    var data = google.visualization.arrayToDataTable(obj); // arrayToDataTable is the Column Chart way of building the data
                    var view = new google.visualization.DataView(data);
                    view.setColumns([0, 1,
                        {
                            calc: "stringify",
                            sourceColumn: 1,
                            type: "string",
                            role: "annotation"
                        },
                        2]);
                    var chart = new google.visualization.ColumnChart(document.getElementById('incidentsByZipCode'));
                    chart.draw(data,
                        {
                            title: "",
                            position: "top",
                            fontsize: "14px",
                            chartArea: { width: '100%' },
                        });
                }
            },
            error: function () {
                alert("Error loading data!");
            }
        });
    }
    IncidentsByZipCode();
</script>
This is the controller.
public ActionResult IncidentsByZipCode()
{
    //var incidentsByZipCode = (from o in _context.Incident
    //                          group o by o.ZipCode into g
    //                          orderby g.Count() descending
    //                          select new
    //                          {
    //                              ZipCode = g.Key,
    //                              ZipCount = g.Count()
    //                          }).ToList();
    var incidentsByZipCode = new List<ZipCodes>
    {
        new ZipCodes{ ZipCode = "code1", ZipCount = 3 },
        new ZipCodes{ ZipCode = "code2", ZipCount = 4 },
        new ZipCodes{ ZipCode = "code3", ZipCount = 2 },
        new ZipCodes{ ZipCode = "code4", ZipCount = 9 },
    };
    return Json(incidentsByZipCode);
}
Result, and you can also refer to this document.
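One more thing worth noting: ASP.NET Core's default JSON serializer camel-cases property names, which is why the callback above reads value.zipCode and value.zipCount rather than value.ZipCode. If you would rather keep PascalCase on the client, you can (assuming System.Text.Json on ASP.NET Core 3.0 or later) switch the naming policy off:
// In Startup.ConfigureServices (or Program.cs with the minimal hosting model):
// a null naming policy keeps property names exactly as declared on ZipCodes.
services.AddControllers()
    .AddJsonOptions(options =>
        options.JsonSerializerOptions.PropertyNamingPolicy = null);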

How to merge multiple objects in an array with lodash?

I have a bit of a problem with my code:
[
    {
        "key": 0,
        "server": 0
    },
    {
        "key": 0,
        "server": 1
    },
    {
        "key": 1,
        "server": 0
    }
]
How can I get this as the result:
[
    { key: 0, server: [0, 1] },
    { key: 1, server: [0] }
]
I'm using _.groupBy but it does not return results as expected.
One solution would be to use a combination of groupBy, map, and mergeWith.
So first, you group the items by key, so in this example it will return an object with 0 and 1 as the keys which will contain the grouped items.
Then you use .map to iterate through the returned object and get the grouped values.
Finally, you use .mergeWith with a customizer function, which lets you control how the values of the server key are merged.
const servers = [{
        "key": 0,
        "server": 0
    },
    {
        "key": 0,
        "server": 1
    },
    {
        "key": 1,
        "server": 0
    }
];

function customizer(objValue, srcValue, key) {
    if (key === 'server') {
        if (!_.isArray(objValue)) objValue = [objValue];
        return objValue.concat(srcValue);
    }
}

let test = _(servers)
    .groupBy('key')
    .map((values, key) => {
        if (values.length === 1) {
            values[0].server = [values[0].server];
            return values[0];
        }
        return _.mergeWith(...values, customizer);
    })
    .value();

console.log(test);
<script src="https://cdnjs.cloudflare.com/ajax/libs/lodash.js/4.17.11/lodash.min.js"></script>
new Vue({
    el: "#app",
    data: {
        needMergedArray: [
            {
                key: 0,
                server: 0
            },
            {
                key: 0,
                server: 1
            },
            {
                key: 1,
                server: 0
            }
        ],
    },
    methods: {
        mergeObject() {
            let merged = [];
            this.needMergedArray.map(item => {
                let found = false;
                merged.map(element => {
                    if (item.key === element.key) {
                        found = true;
                        if (!element.server) {
                            //const myserver = [item.server];
                            element.server = [item.server];
                        } else {
                            element.server.push(item.server);
                        }
                    }
                });
                if (!found) {
                    const myserver = [item.server];
                    merged.push({ key: item.key, server: myserver });
                }
                found = false;
            });
            return merged;
        }
    }
})
// one loop solution: https://jsfiddle.net/haianhnc/2mhsquad/78
mergeObject() {
    let merged = {};
    this.needMergedArray.map(item => {
        merged = {
            ...merged,
            [item.key]: {
                key: item.key,
                server: (merged[item.key] && merged[item.key].server || []).concat(item.server)
            }
        };
    });
    return Object.values(merged);
}
Haven't tested it yet:
arr.reduce(function (acc, value) {
    // index of an already-collected entry with the same key, or -1
    const existingIndex = acc.findIndex(({ key }) => key === value.key);
    if (existingIndex === -1) {
        // first occurrence of this key: wrap the server in an array
        acc.push({ key: value.key, server: [value.server] });
    } else {
        const servers = acc[existingIndex].server;
        servers.push(value.server);
        acc[existingIndex].server = _.uniq(servers);
    }
    return acc;
}, [])

Inserting huge amounts of data using Node.js and SQL DB2

I've got, let's say, 100,000 records in an array:
var eData = [
    { "id": "1001", "type": "Regular" },
    { "id": "1002", "type": "Chocolate" },
    { "id": "1003", "type": "Blueberry" },
    { "id": "1004", "type": "Devil's Food" }
];
And so on...
When I run the Node.js script below:
var db = require('/QOpenSys/QIBM/ProdData/OPS/Node6/os400/db2i/lib/db2a');

var DBname = "*LOCAL";
var dbconn = new db.dbconn();
dbconn.conn(DBname);
var sqlA = new db.dbstmt(dbconn);

eData.forEach(function (eRow, i) {
    var sql = "INSERT INTO lib.table VALUES( xx x x x) WITH NONE";
    sqlA.exec(sql, function (rs, err) {
        console.log("Execute Done.");
        console.log(err);
    });
});
The data ends up mixed up in the DB: the same id and type appear 10 times, although the total number of inserted records is exactly right.
If I change to execSync, everything turns out right, but it seems a bit slow. What am I missing to do async inserts properly?
What is the fastest way to do huge inserts?
There will be an optimal number of async operations to have in flight at any one time. The easiest way to limit the number of concurrent async operations is with the excellent async.js module.
https://caolan.github.io/async/docs.html#eachLimit
var async = require('async');
var db = require('/QOpenSys/QIBM/ProdData/OPS/Node6/os400/db2i/lib/db2a');

var DBname = "*LOCAL";
var dbconn = new db.dbconn();
dbconn.conn(DBname);
var sqlA = new db.dbstmt(dbconn);

async.eachLimit(eData, 100, function (eRow, cb) {
    var sql = "INSERT INTO lib.table VALUES( xx x x x) WITH NONE";
    sqlA.exec(sql, function (rs, err) {
        console.log("Execute Done.");
        cb(err);
    });
}, function (error) {
    if (error) {
        console.error(error);
    } else {
        console.log('Done');
    }
});