Complex function using Parse Server Cloud Code (looping and creating records)

After a night of trial and error I have decided on a much simpler way to explain my issue. Again, I have no JS experience, so I don't really know what I am doing.
I have 5 classes:
game - holds information about my games
classification - holds information about the user classes available in games
game_classifications - creates a one game to many classifications relationship (makes a game have multiple classes)
mission - holds my mission information
mission_class - creates a one to many relationship between a mission and the classes available for that mission
Using Cloud Code, I want to provide two inputs through my REST API: missionObjectId and gameObjectId.
The actual steps I need the code to perform are:
Get the two inputs provided {"missionObjectId":"VALUE","gameObjectId":"VALUE"}
Search the game_classifications class for all records where game = gameObjectId
For each returned record, create a new record in mission_class with the following information:
mission_id = missionObjectId
classification = result.classification
Here is an image of the tables (not reproduced here).
And here is how I have tried to achieve this:
Parse.Cloud.define("activateMission", async (request) => {
Parse.Cloud.useMasterKey();
const query = new Parse.query('game_classifications');
query.equalTo("gameObjectId", request.params.gameObjectId);
for (let i = 0; i < query.length; i ++) {
const mission_classification = Parse.Object.extend("mission_class");
const missionClass = new mission_classification();
missionClass.set("mission_id", request.params.missionObjectId);
missionClass.set("classification_id", query[i].classificationObjectId);
return missionClass.save();
}
});
Does anyone have any advice or input that might help me achieve this goal?
The current error I am getting is:
Parse.query is not a constructor
Thank you all in advance!

Some problems with your current code:
Parse.Cloud.useMasterKey() has not existed for quite a long time. Use the useMasterKey option instead.
It's Parse.Query and not Parse.query.
You need to run the query.findAll() command and iterate over its results (and not over query).
For performance, move Parse.Object.extend calls to the beginning of the file.
To access the field of an object, use obj.get('fieldName') and not obj.fieldName.
If you return the save operation, it will save the first object, return, and not save the others.
So, the code needs to be something like this:
const mission_classification = Parse.Object.extend("mission_class");
const game = Parse.Object.extend("game");

Parse.Cloud.define("activateMission", async (request) => {
    const query = new Parse.Query('game_classifications');
    const gameObj = new game();
    gameObj.id = request.params.gameObjectId;
    query.equalTo("gameObjectId", gameObj);
    const queryResults = await query.findAll({ useMasterKey: true });
    for (let i = 0; i < queryResults.length; i++) {
        const missionClass = new mission_classification();
        missionClass.set("mission_id", request.params.missionObjectId);
        missionClass.set("classification_id", queryResults[i].get('classificationObjectId'));
        await missionClass.save(null, { useMasterKey: true });
    }
});
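For completeness, here is a sketch of invoking the deployed function through Parse's REST API; the server URL, mount path, and keys below are placeholders, not values from your setup:

// Hypothetical call to the cloud function via Parse's REST API
// (replace the URL and keys with your own)
fetch('https://YOUR_SERVER/parse/functions/activateMission', {
    method: 'POST',
    headers: {
        'X-Parse-Application-Id': 'YOUR_APP_ID',
        'X-Parse-REST-API-Key': 'YOUR_REST_KEY',
        'Content-Type': 'application/json'
    },
    body: JSON.stringify({ missionObjectId: 'VALUE', gameObjectId: 'VALUE' })
})
    .then(res => res.json())
    .then(json => console.log(json)); // { "result": ... } on success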

How can I use the same value written in the JSON during the same test execution in TestCafe

I have been trying to use the value from the JSON file that I successfully wrote using the fs.write() function.
There are two test cases in the same fixture: one to create an ID and a 2nd to use that ID. I can write the id successfully to the JSON file using fs.write(), and I try to use that id by importing the JSON file like var myid = require('../../resources/id.json').
The JSON file stores the correct id for the current execution, but I get the id of the first test execution in the 2nd execution.
For example, id:1234 is stored during the first test execution and id:4567 is stored in the 2nd test execution. During the 2nd test execution I need id:4567, but I get 1234. This is weird, isn't it?
I use it like
t.typeText(ele, myid.orid)
My JSON file contains only the id, like {"orid":"4567"}
I am new to JavaScript and TestCafe; any help would really be appreciated.
Write File class
const fs = require('fs')
const baseClass = require('../component/base')

class WriteIntoFile {
    constructor(orderID) {
        const OID = {
            orderid: orderID
        }
        const jsonString = JSON.stringify(OID)
        fs.writeFile(`resources\id.json`, jsonString, err => {
            if (err) {
                console.log('Error writing file', err)
            } else {
                console.log('Successfully wrote file')
            }
        })
    }
}
export default WriteIntoFile
I created 2 different classes to separate the create & update operations, and I call the create & update order functions in a single fixture in the test file.
Create Order class
class CreateOrder {
    ----
    ----
    ----
    async createNewOrder() {
        // get text of the created order and save the order id into the json file
        -----
        -----
        -----
        const orId = await baseclass.getOrderId();
        new WriteIntoFile(orId)
        console.log(orId)
        -----
        -----
        -----
    }
}
export default CreateOrder
Update Order class
var id = require('../../resources/id.json')

class UpdateOrder {
    async searchOrderToUpdate() {
        await t
            // Here, I get the old order id that was saved during the previous execution
            .typeText(baseClass.searchBox, id.orderid)
            .wait(2500)
            .click(baseClass.searchIcon)
            .doubleClick(baseClass.orderAGgrid)
        console.log(id.orderid)
        ----
        ----
    }

    async updateOrder() {
        this.searchOrderToUpdate()
            .typeText(baseClass.phNo, '1234567890')
            .click(baseClass.saveBtn)
    }
}
export default UpdateOrder
Test file
const newOrder = new CreateOrder();
const update = new UpdateOrder();

const role = Role(`siteurl`, async t => {
    await t
    login('id')
    await t
        .wait(1500)
}, { preserveUrl: true })

test('Should be able to create an Order', async t => {
    await newOrder.createNewOrder();
});

test('Should be able to update an order', async t => {
    await update.updateOrder();
});
I'll reply to this, but you probably won't be happy with my answer, because I wouldn't go down the same path you proposed in your code.
I can see a couple of problems. Some of them might not be problems right now, but in a month, you could struggle with this.
1/ You are creating separate test cases that are dependent on each other.
This is a problem because of these reasons:
what if Should be able to create an Order doesn't run? Or what if it fails? Then Should be able to update an order fails as well, and this information is useless, because it wasn't the update operation that failed, but the fact that you didn't meet all the preconditions for the test case
how do you make sure Should be able to create an Order always runs before Should be able to update an order? There's no way! It works now because one test comes right before the other, but at some point you'll decide to move one test somewhere else, and then you are in trouble and you'll spend hours debugging it. You have prepared a trap for yourself. I wrote this answer on this very topic, you can read it.
you can't run the tests in parallel
when I read your test file, there's no visible hint that the tests are dependent on each other. Therefore, as a stranger to your code, I could easily mess things up, because I have no way of knowing about the dependency without going deeper into the code. This is a big trap for anyone who might come to your code after you. Don't do this to your colleagues.
2/ Working with files when all you need to do is pass a value around is too cumbersome.
I really don't see a reason why you need to save the id in a file. A slightly better approach (still violating 1/) could be:
const newOrder = new CreateOrder();
const update = new UpdateOrder();

// use a variable to pass the orderId around
// it's also visible that the tests are dependent on each other
let orderId = undefined;

const role = Role(`siteurl`, async t => {
    // some steps, I omit this for better readability
}, { preserveUrl: true })

test('Should be able to create an Order', async t => {
    orderId = await newOrder.createNewOrder();
});

test('Should be able to update an order', async t => {
    await update.updateOrder(orderId);
});
Doing it like this also slightly remedies what I wrote in 1/, namely that it's not visible at first sight that the tests are dependent on each other. Now, this is a bit improved.
Some other approaches for passing data around are mentioned here and here.
Perhaps an even better approach is to use the t.fixtureCtx object:
const newOrder = new CreateOrder();
const update = new UpdateOrder();

const role = Role(`siteurl`, async t => {
    // some steps, I omit this for better readability
}, { preserveUrl: true })

test('Should be able to create an Order', async t => {
    t.fixtureCtx.orderId = await newOrder.createNewOrder();
});

test('Should be able to update an order', async t => {
    await update.updateOrder(t.fixtureCtx.orderId);
});
Again, I can at least see the tests are dependent on each other. That's already a big victory.
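Both variants assume createNewOrder returns the new id instead of writing it to a file; a minimal sketch of that change (the elided steps mirror your original class):

class CreateOrder {
    async createNewOrder() {
        // ... steps that create the order, omitted for readability ...
        const orderId = await baseclass.getOrderId();
        return orderId; // hand the id back to the test instead of to WriteIntoFile
    }
}
export default CreateOrder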
Now back to your question:
During 2nd test execution I need the id:4567 but I get 1234 this is weird, isn't it?
No, it's not weird. You required the file:
var id = require('../../resources/id.json')
and so it's loaded once; if you write to the file later, you won't read the new content unless you read the file again. require() is Node's function for loading modules, and it makes sense for a module to be loaded only once.
This demonstrates the problem:
const idFile = require('./id.json');
const fs = require('fs');

console.log(idFile); // { id: 5 }

const newId = {
    'id': 7
};
fs.writeFileSync('id.json', JSON.stringify(newId));

// it's been loaded once, you won't get any other value here
console.log(idFile); // { id: 5 }
What can you do to solve the problem?
You can use fs.readFileSync():
const idFile = require('./id.json');
const fs = require('fs');

console.log(idFile); // { id: 5 }

const newId = {
    'id': 7
};
fs.writeFileSync('id.json', JSON.stringify(newId));

// you need to read the file again and parse its content
const newContent = JSON.parse(fs.readFileSync('id.json'));
console.log(newContent); // { id: 7 }
And this is what I warned you against in the comment section: it is too cumbersome and inefficient, because you write to a file and then read from that file just to get one value.
What you created is not very readable either:
const fs = require('fs')
const baseClass = require('../component/base')

class WriteIntoFile {
    constructor(orderID) {
        const OID = {
            orderid: orderID
        }
        const jsonString = JSON.stringify(OID)
        fs.writeFile(`resources\id.json`, jsonString, err => {
            if (err) {
                console.log('Error writing file', err)
            } else {
                console.log('Successfully wrote file')
            }
        })
    }
}
export default WriteIntoFile
All these operations for writing into a file are in a constructor, but a constructor is not the best place for them; ideally it would contain only variable assignments. I also don't see much reason why you need to create a new class when you are doing only two operations that can easily fit on one line of code:
fs.writeFileSync('orderId.json', JSON.stringify({ orderid: orderId }));
Keep it as simple as possible. It's more readable like this than having to go to a separate file with the class and decipher what it does there.
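And if you really do need the file, reading the value back is just as short (this assumes the orderId.json name from the one-liner above):

const orderId = JSON.parse(fs.readFileSync('orderId.json')).orderid;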

Access nested mapping in Solidity

NOTE: I asked this question a few days ago while I still had Solidity 0.7.0. Now I am using solc 0.8.0. With the new ABI V2 encoding, this should be possible. However, I am still stuck.
NOTE2: I know I can write a getter to get a specific review. However, I am wary of gas costs, and I need to get all the ratings in one go to compute averages, so I don't think that is feasible.
Suppose I have this data structure layout:
struct ReviewStruct {
    string rating;
    ...
}

struct Restaurant {
    ...
    uint reviewCount;
    mapping(uint => ReviewStruct) reviews;
}

uint public restaurantCount = 0;
mapping(uint => Restaurant) public restaurants;
Then, when I try to access things in my JS app, it works, but not when I try to access an actual review:
const restaurantCount = await review.methods.restaurantCount().call() // works
const restaurant = await review.methods.restaurants(2).call() // works
const reviewObj = await review.methods.restaurants(2).reviews(0).call() // throws an error
How do I access a mapping that is inside of a mapping (both are related to structs)?
Your reviews mapping is defined inside your Restaurant struct; that's why you can't access it directly. You need to access a restaurant first and then its reviews.
For example:
const restaurantCount = await review.methods.restaurantCount().call()
const restaurant = await review.methods.restaurants(2).call()
for (let i = 0; i < restaurant.reviewCount; i++) {
    let reviewObj = restaurant.reviews[i];
    console.log(reviewObj);
}
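If restaurant.reviews turns out to be undefined on the JS side (Solidity's auto-generated getters omit mappings nested inside structs), one workaround is a small per-review view function on the contract; external view calls cost no gas, so looping over them off-chain stays free. A hypothetical sketch, assuming the contract adds function getReview(uint restaurantId, uint reviewIndex) public view returns (string memory):

// Hypothetical: assumes the contract exposes the per-review getter described above
const restaurant = await review.methods.restaurants(2).call()
const ratings = []
for (let i = 0; i < restaurant.reviewCount; i++) {
    ratings.push(await review.methods.getReview(2, i).call())
}
console.log(ratings) // every rating for restaurant 2, fetched without spending gas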

HTML5 history API to reduce server requests

I am trying to develop a search filter and am making use of the HTML5 history API to reduce the number of requests sent to the server. If the user checks a checkbox to apply a certain filter, I save that data in the history state, so that when the user unchecks it I can load the data back from the history rather than fetching it again from the server.
When the user checks or unchecks a filter, I change the window URL to match the filter that was set; for instance, if the user filters car brands to a certain category, I change the URL to something like 'cars?filter-brand[]=1'.
But when multiple filters are applied, I have no way of figuring out whether to load the data from the server or from the history.
At the moment I am using the following code.
The pushString variable is the new query string that will be created.
var back = [], forward = [];

if (back[back.length - 1] === decodeURI(pushString)) { // check last back val against the next URL to be created
    back.pop();
    forward.push(currentLocation);
    history.back();
    return true;
} else if (forward[forward.length - 1] === decodeURI(pushString)) {
    forward.pop();
    back.push(currentLocation);
    history.forward();
    return true;
} else {
    back.push(currentLocation); // add current win location
}
You can check if your filters are equivalent.
Comparing Objects
This is a simple function that takes two objects and lets you know if they're equivalent (note: not prototype safe, for simplicity).
function objEqual(a, b) {
    function toStr(o) {
        var keys = [], values = [];
        for (k in o) {
            keys.push(k);
            values.push(o[k]);
        }
        keys.sort();
        values.sort();
        return JSON.stringify(keys)
             + JSON.stringify(values);
    }
    return toStr(a) === toStr(b);
}
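For example, key order doesn't matter:

// two filter states written in different key orders still compare as equal
console.log(objEqual({ brand: "1", color: "red" }, { color: "red", brand: "1" })); // true
console.log(objEqual({ brand: "1" }, { brand: "2" })); // false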
Using the URL
Pass the query part of the URL (window.location.search) to the function below. It'll give you an object you can compare to another object using the function above.
function parseURL(url) {
    var obj = {}, parts = url.split("&");
    for (var i = 0, part; part = parts[i]; i++) {
        var x = part.split("="), k = x[0], v = x[1];
        obj[k] = v;
    }
    return obj;
}
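For instance, stripping the leading ? from location.search first:

var filters = parseURL(window.location.search.slice(1));
// "?filter-brand[]=1&sort=asc" yields { "filter-brand[]": "1", "sort": "asc" }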
History API Objects
You can store the objects with the History API.
window.history.pushState(someObject, "", "someURL")
You can get this object back using history.state or in a popstate handler.
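For example, a minimal sketch of restoring cached results when the user navigates back or forward (renderResults is a placeholder for your own display logic):

window.addEventListener("popstate", function (event) {
    if (event.state) {
        // state saved with pushState comes back here on back/forward navigation
        renderResults(event.state);
    }
});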
Keeping Track of Things
If you pull out the toStr function from the first section, you can serialize the current filters. You can then store all of the states in one object, together with the data associated with each.
When you're pushing a state, you can update your global cache object. This code should be in the handler for the AJAX response.
var key = toStr(parseURL(location.search.slice(1)));
cache[key] = dataFromTheServer;
Then abstract your AJAX function to check the cache first.
function getFilterResults(filters, callback) {
    var cached = cache[toStr(filters)];
    if (cached != null) callback(cached);
    else doSomeAJAXStuff().then(callback);
}
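Usage might then look like the following sketch (doSomeAJAXStuff and renderResults stand in for your real request and display logic):

getFilterResults(parseURL(location.search.slice(1)), function (results) {
    renderResults(results); // served from the cache when these filters were fetched before
});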
You can also use localStorage for more persistent caching; however, this would require more advanced code and expiring the data.

Given a list of objects in C#, push them to RavenDB without knowing which ones already exist

Given 1000 documents with a complex data structure, e.g. a Car class that has three properties: Make, Model, and one Id property.
What is the most efficient way in C# to push these documents to RavenDB (preferably in a batch) without having to query the raven collection individually to find out which to update and which to insert? At the moment I am going about it like so, which is totally inefficient:
note: _session is a wrapper on the IDocumentSession where Commit calls SaveChanges and Add calls Store.
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    var page = 0;
    const int total = 30;
    do
    {
        var paged = sales.Skip(page * total).Take(total);
        if (!paged.Any()) return;
        foreach (var sale in paged)
        {
            var current = sale;
            var existing = _session.Query<Sale>().FirstOrDefault(s => s.Id == current.Id);
            if (existing != null)
                existing = current;
            else
                _session.Add(current);
        }
        _session.Commit();
        page++;
    } while (true);
}
Your session code doesn't seem to match the RavenDB API (we don't have Add or Commit).
Here is how you do this in RavenDB:
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    sales.ForEach(session.Store);
    session.SaveChanges();
}
Your code sample doesn't work at all. The main problem is that you cannot just swap out the references and expect RavenDB to recognize the change:
if (existing != null)
    existing = current;
Instead you have to update each property one-by-one:
existing.Model = current.Model;
existing.Make = current.Make;
This is the way you facilitate change-tracking in RavenDB and many other frameworks (e.g. NHibernate). If you want to avoid writing this uninteresting piece of code, I recommend using AutoMapper:
existing = Mapper.Map<Sale>(current, existing);
Another problem with your code is that you use Session.Query where you should use Session.Load. Remember: If you query for a document by its id, you will always want to use Load!
The main difference is that one uses the local cache and the other does not (the same applies to the equivalent NHibernate methods).
Ok, so now I can answer your question:
If I understand you correctly, you want to save a bunch of Sale instances to your database, where each should either be added if it didn't exist or updated if it did. Right?
One way is to correct your sample code with the hints above and let it work. However, that will issue one unnecessary request (Session.Load(existingId)) for each iteration. You can easily avoid that if you set up an index that selects the Ids of all documents inside your Sales collection. Before you loop through your items, you can then load all the existing Ids.
However, I would like to know what you actually want to do. What is your domain/use-case?
This is what works for me right now. Note: the InjectFrom method comes from Omu.ValueInjecter (NuGet package).
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    var ids = sales.Select(i => i.Id);
    var existingSales = _ravenSession.Load<Sale>(ids);
    existingSales.ForEach(s => s.InjectFrom(sales.Single(i => i.Id == s.Id)));
    var existingIds = existingSales.Select(i => i.Id);
    var nonExistingSales = sales.Where(i => !existingIds.Any(x => x == i.Id));
    nonExistingSales.ForEach(i => _ravenSession.Store(i));
    _ravenSession.SaveChanges();
}

Efficient way to run multiple scripts using javax.script

I am developing a game where I'd like to have multiple scripts that all implement the same structure. Each script would need to run in its own scope so that its code doesn't overlap with other scripts. For example:
structure.js
function OnInit() {
    // Define resources to load, collision vars, etc.
}

function OnLoop() {
    // Every loop
}

function ClickEvent() {
    // Someone clicked me
}

// Other fun functions
Now, let's say I have: "BadGuy.js", "ReallyReallyBadGuy.js", "OtherBadGuy.js" - they all look like the above in terms of structure. Within the game, whenever an event takes place, I'd like to invoke the appropriate function.
The problem comes down to efficiency and speed. I found a working solution by creating an engine for each script instance (using getEngineByName), but that just doesn't seem ideal to me.
If there isn't a better solution, I'll probably resort to each script having its own unique class/function names, i.e.
BadGuy.js
var BadGuy = new Object();
BadGuy.ClickEvent = function() {
};
I don't think you need to create a new ScriptEngine for every "Guy". You can manage them all in one engine. So, with advance apologies for butchering your game scenario...
Get one instance of the Rhino engine.
Issue eval(script) statements to add new JS objects to the engine, along with the different behaviours (or functions) that you want these objects to support.
You have a couple of different choices for invoking against each one, but as long as each "guy" has a unique name, you can always reference it by name and invoke a named method against it.
For more performance-sensitive operations (perhaps some sort of round-based event loop) you can precompile a script in the same engine, which can then be executed without having to re-evaluate the source.
Here's a sample I wrote in Groovy.
import javax.script.*;

sem = new ScriptEngineManager();
engine = sem.getEngineByExtension("js");
engine.getBindings(ScriptContext.ENGINE_SCOPE).put("out", System.out);

eventLoop = "for (guy in allGuys) { out.println(allGuys[guy].Action(action)); };"

engine.eval("var allGuys = []");
engine.eval("var BadGuy = new Object(); allGuys.push(BadGuy); BadGuy.ClickEvent = function() { return 'I am a BadGuy' }; BadGuy.Action = function(activity) { return 'I am doing ' + activity + ' in a BAD way' }");
engine.eval("var GoodGuy = new Object(); allGuys.push(GoodGuy); GoodGuy.ClickEvent = function() { return 'I am a GoodGuy' }; GoodGuy.Action = function(activity) { return 'I am doing ' + activity + ' in a GOOD way' }");

CompiledScript executeEvents = engine.compile(eventLoop);

println engine.invokeMethod(engine.get("BadGuy"), "ClickEvent");
println engine.invokeMethod(engine.get("GoodGuy"), "ClickEvent");

engine.getBindings(ScriptContext.ENGINE_SCOPE).put("action", "knitting");
executeEvents.eval();
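Written out as a standalone script file instead of an inline eval string, each guy's script could look like this sketch (it assumes the host has already evaluated var allGuys = [], as in the Groovy sample above):

// BadGuy.js - registers itself with the shared allGuys array defined by the host
var BadGuy = new Object();
allGuys.push(BadGuy);

BadGuy.ClickEvent = function() {
    return 'I am a BadGuy';
};

BadGuy.Action = function(activity) {
    return 'I am doing ' + activity + ' in a BAD way';
};

The host can then load each file with engine.eval(new FileReader("BadGuy.js")), so all guys share one engine while remaining addressable by name.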