BigCommerce: updating a global context variable

I am trying to add data to objects that are already stored in the global context via front matter.
I assign the front matter results to a global context variable in the home.html template:
---
products:
new:
limit: {{theme_settings.homepage_new_products_count}}
---
{{inject 'newProducts' products.new}}
I then update the data in home.js. Basically, I'm pulling each product's UPC and minimum purchase quantity with a GraphQL query and adding them to the objects:
this.context.newProducts[0].minimumPurchaseQuantity = 51;
this.context.newProducts[0].upc = '360000000000';
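For context, here is a fuller sketch of that home.js update. The query shape, the product field names, and the productId / storefrontApiToken variables are assumptions for illustration, not actual theme code:

// home.js - hypothetical sketch; productId and storefrontApiToken stand in
// for however your theme exposes them
const query = `query {
    site {
        product(entityId: ${productId}) { upc minPurchaseQuantity }
    }
}`;
fetch('/graphql', {
    method: 'POST',
    headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${storefrontApiToken}`,
    },
    body: JSON.stringify({ query }),
})
    .then((res) => res.json())
    .then(({ data }) => {
        // Mutate the injected context objects, as in the two lines above.
        this.context.newProducts[0].upc = data.site.product.upc;
        this.context.newProducts[0].minimumPurchaseQuantity = data.site.product.minPurchaseQuantity;
    });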
The updated objects are stored in the global context, but the new data is not visible in templates that the product data has been passed down to, e.g.:
{{> components/products/featured products=products.featured columns=theme_settings.homepage_featured_products_column_count urls=urls}}
How can I force the template to update the data that has been passed down?

Apply a date field value to expiration date in inventory details subrecord

I'm a newbie at NetSuite scripting and was recently asked to apply the value of a date field (custbody_expiration_date) on the item receipt transaction body to the expiration date field in the inventory details of all items when the item receipt is created.
Since there is no way to create a workflow on inventory details, I've managed to work out the code below; however, I keep getting all sorts of different error messages. Below is one of them, which appears after I click Save on the item receipt.
Notice (SuiteScript)
org.mozilla.javascript.EcmaError: TypeError: Cannot find function getCurrentLineItemValue in object standard record. (/SuiteScripts/ARROW/Expiration_date_apply_to_all (1).js#27)
I am very confused about the difference between dynamic and standard mode - which functions should be used in which mode? Also, I am a bit hesitant about whether a user event script is the correct way to go.
/**
 * @NApiVersion 2.0
 * @NScriptType UserEventScript
 * @NModuleScope Public
 */
define(['N/record', 'N/search'], function (record, search) {
    function beforeSubmit(context) {
        var IRrecord = context.newRecord;
        var numberOfLineItems = IRrecord.getLineCount({
            sublistId: 'item'
        });
        var expirationdate = IRrecord.getValue({
            fieldId: 'custbody_expiration_date'
        });
        for (var i = 1; i <= numberOfLineItems; i++) {
            IRrecord.setSublistValue({
                sublistId: 'item',
                fieldId: 'item',
                line: i,
                value: true
            });
            // First get Lot Number and Quantity
            var lotNumber = IRrecord.getCurrentLineItemValue('item', 'receiptinventorynumber');
            var quantity = IRrecord.getCurrentLineItemValue('item', 'quantity');
            var inventoryDetail = IRrecord.createCurrentLineItemSubrecord('item', 'inventorydetail');
            inventoryDetail.selectNewLineItem('inventoryassignment');
            inventoryDetail.setCurrentLineItemValue('inventoryassignment', 'issueinventorynumber', lotNumber);
            inventoryDetail.setCurrentLineItemValue('inventoryassignment', 'quantity', quantity);
            inventoryDetail.setCurrentLineItemValue('inventoryassignment', 'expirationdate', expirationdate);
            inventoryDetail.commitLineItem('inventoryassignment');
            inventoryDetail.commit();
            IRrecord.commitLineItem('item');
        }
        nlapiSubmitRecord(IRrecord);
    }
    return {
        beforeSubmit: beforeSubmit
    };
});
Dynamic records are the kind you see client-side (as a rule): modify a field value and some other field gets refreshed and updated in real time. Forms sometimes need their fields filled in a particular order for field sourcing to work and to avoid triggering form completion errors. For example, when entering a sales order, selecting the customer first means the right sales tax is defaulted when items are added to the order. Errors may be thrown at any point before the record save, because fields trigger dynamic sourcing (updating other fields) based on what has been entered.
Standard mode is - less dynamic. You populate the fields of the record in any order you choose, and when the save is performed you choose whether sourcing (updating other fields from the available data) is triggered. Any errors in data entry are reported when the save is performed. I think it also has a lower client-side load, as fewer AJAX queries are triggered.
Both are available in client-side and server-side JavaScript, but some record types cannot be updated client-side and must be handled server-side using workflow actions, user event scripts, RESTlets, Suitelets, or scheduled scripts. To the best of my knowledge, inventory detail subrecords on fulfillments, receipts and the like are one such type.
The way lines are updated changes between dynamic and standard mode. In dynamic mode, lines are selected, updated and then committed, and the methods used would be:
selectLine
setCurrentSublistValue (setCurrentLineItemValue is the v1 name)
commitLine (only do this if actually changing the line)
For standard mode, the only way of changing lines is to use setSublistValue and include the line number in the parameters.
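In code, the difference looks roughly like this (the quantity field is just an illustrative example):

// Dynamic mode: select the line, set the current value, commit the line.
rec.selectLine({ sublistId: 'item', line: i });
rec.setCurrentSublistValue({ sublistId: 'item', fieldId: 'quantity', value: 5 });
rec.commitLine({ sublistId: 'item' });

// Standard mode: address the line directly by index; no select/commit.
rec.setSublistValue({ sublistId: 'item', fieldId: 'quantity', line: i, value: 5 });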
Workflow action scripts will load the record in dynamic mode, but the load mode can be checked using the isDynamic property on the record.
The other thing is that in SuiteScript 2, sublist lines are indexed from 0, not from 1 as your script assumes. What's confusing is that in SuiteScript 1, indexing was from 1. Your code is also using a mix of v1 and v2: nlapiSubmitRecord is v1, IRrecord.save is v2 (and in a beforeSubmit user event script neither is needed, since NetSuite saves context.newRecord after the script runs).
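Putting those points together, a pure-v2, standard-mode sketch of the beforeSubmit function might look like the following. It is a sketch only: it assumes the lot/quantity assignments have already been entered on each line and just stamps the header date onto them, and the field and sublist IDs are taken from your code, so verify them in your account:

/**
 * @NApiVersion 2.0
 * @NScriptType UserEventScript
 */
define(['N/record'], function (record) {
    function beforeSubmit(context) {
        var IRrecord = context.newRecord; // standard mode in beforeSubmit
        var expirationdate = IRrecord.getValue({ fieldId: 'custbody_expiration_date' });
        var lineCount = IRrecord.getLineCount({ sublistId: 'item' });

        for (var i = 0; i < lineCount; i++) { // v2: lines start at 0
            var detail = IRrecord.getSublistSubrecord({
                sublistId: 'item',
                fieldId: 'inventorydetail',
                line: i
            });
            var assignments = detail.getLineCount({ sublistId: 'inventoryassignment' });
            for (var j = 0; j < assignments; j++) {
                detail.setSublistValue({
                    sublistId: 'inventoryassignment',
                    fieldId: 'expirationdate',
                    line: j,
                    value: expirationdate
                });
            }
        }
        // No explicit save: NetSuite persists context.newRecord after beforeSubmit.
    }
    return { beforeSubmit: beforeSubmit };
});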
For more information, see SuiteAnswer 79715, which explains how to set a value on the inventory detail of an item receipt; the example reloads the record in standard mode and updates the inventoryStatus field. SuiteAnswer 45372 explains the Record object and the difference between standard and dynamic modes. Take a look at SuiteAnswer 67605, which explains the basics of SuiteScript v2. SuiteAnswers is an amazing resource and the search is surprisingly good. I can also recommend Eric T Grubaugh's site (@erictgrubaugh), which has some great videos, including comparisons between v1 and v2.

dojo 1.10 JsonRest idAttribute - server passed a float in PUT

Just getting started with dojo/JsonRest, but I'm having some problems sending updates back to my server. I've got two questions that I'm stuck on.
The code below produces a grid with one of the columns set to editable.
The primary key in my json data is the "jobName" attribute (hence idAttribute in the JsonRest store).
First question is about the URI in the PUT:
- When I call dataStore.save(), the server gets a PUT, but the URI is /myrestservice/Jobs/0.9877865987 (it changes each time, but is always a float).
- I don't see where dojo is getting the float from; it's not my idAttribute value from that row. How can I get the PUT to respect the idAttribute in the JsonRest store?
- I did try setting idProperty in the Memory store to "jobName", but that changed the PUT into a POST and removed the float; I still don't get a jobName in the URI, which is what my REST server needs.
Second question is about the content of the PUT:
- The PUT contains the whole row. I'd really just like the idAttribute and the data that changed; is that possible?
I've been through the examples and docs, but there aren't many examples of handling the PUT/POST part of JsonRest.
Thanks
var userMemoryStore = new dojo.store.Memory();
var userJsonRestStore = new dojo.store.JsonRest({target:"/myrestservice/Jobs/", idAttribute:"jobName"});
var jsonStore = new dojo.store.Cache(userJsonRestStore, userMemoryStore);
var dataStore = new dojo.data.ObjectStore( {objectStore: jsonStore } );
/*create a new grid*/
var grid = new dojox.grid.DataGrid({
id: 'grid'
,store: dataStore
,structure: layout
,rowSelector: '20px'}
,"gridDiv");
grid.startup();
dojo.query("#save").onclick(function() {
dataStore.save();
});
I think you want idProperty, not idAttribute. It also might help to set idProperty in the Memory store being used to cache as well; that may be what's generating the random float.
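For example, something like this (a sketch of that suggestion; "jobName" follows the question's data):

var userMemoryStore = new dojo.store.Memory({ idProperty: "jobName" });
var userJsonRestStore = new dojo.store.JsonRest({
    target: "/myrestservice/Jobs/",
    idProperty: "jobName"
});
var jsonStore = new dojo.store.Cache(userJsonRestStore, userMemoryStore);
// Without idProperty, dojo/store/Memory falls back to Math.random() when it
// needs an id, which would explain the float appearing in the PUT URI.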
As for the second question, that'd probably require customization; I don't believe OOTB stores (or grids) generally expect to send partial items.
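If you did want to customize it, one hypothetical approach is to subclass JsonRest and override put so it sends only the id plus fields you name yourself (dojo does no change tracking for you; changedFields here is an option invented for this sketch):

var PatchingRest = dojo.declare(dojo.store.JsonRest, {
    put: function (object, options) {
        options = options || {};
        var payload = {};
        payload[this.idProperty] = object[this.idProperty];
        // Copy across only the fields the caller says have changed.
        dojo.forEach(options.changedFields || [], function (f) {
            payload[f] = object[f];
        });
        return this.inherited(arguments, [payload, options]);
    }
});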

How to replace relationship field type object IDs with names / titles in KeystoneJS list CSV download / export?

In the Keystone admin list view, the handy download link exports all list items in a CSV file. However, if some of the fields are of the Relationship type, the exported CSV contains Mongo ObjectIds instead of meaningful strings (name, title, etc.), which would be more useful.
How can one force the ObjectIds to be mapped to / replaced by another field?
Keystone has an undocumented feature that allows you to create your own custom CSV export function. This feature was added back in April (see KeystoneJS Issue #278).
All you need to do is add a method to the schema called toCSV. Keystone will inject any of the following dependencies when specified as arguments to this method.
- req (current express request object)
- user (currently authenticated user)
- row (default row data, as generated without custom toCSV())
- callback (invokes async mode, must be provided last)
You could, for example, use Mongoose's populate method to replace the ObjectIds of any relationship field with whatever data you want.
Assume you have a Post list with an author field of Types.Relationship to another list (let's say User) which has a name field. You could replace the author Object Id with the author's name (from the User list) by doing the following.
Post.schema.methods.toCSV = function (callback) {
    var post = this,
        rtn = this.toJSON();
    this.populate('author', function () {
        rtn.author = post.author.name; // <-- author now has data from User list
        callback(null, rtn);
    });
};
.toCSV() will be called for every document returned, with the Model as the context. When used asynchronously (as above), you should return a JSON representation of the new CSV data by passing it as the second argument of the callback. When using it synchronously, simply return the updated JSON object.
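For completeness, a synchronous version would look something like this (the field tweak is purely illustrative):

Post.schema.methods.toCSV = function () {
    var rtn = this.toJSON();
    rtn.title = (rtn.title || '').trim(); // hypothetical tweak to the row
    return rtn; // no callback argument, so Keystone uses the return value
};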

Unable to get a total count of items in the model in the ASP.NET MVC view

I am using the following code to render a pager:
@Html.BootstrapPager(Request.QueryString("Page"), Function(index) Url.Action("Index", "Posts", New With {.page = index}), 14000, System.Web.Configuration.WebConfigurationManager.AppSettings("PageSize"), 15)
My problem is that if I use Model.Count in place of the 14000, I only get one page of records, since I am using Skip and Take in the repository to pull only the needed records. How can I access the total number of published records in the view, so that I don't have to hard-code the value?
The original pager code is here. I converted it to VB.NET and am using it. It works fine if the record count is hard-coded.
This is the repo:
Dim posts As IEnumerable(Of PostSummaryDTO)
Using db As BetterBlogContext = New BetterBlogContext
    posts = db.be_Posts.
        OrderByDescending(Function(x) x.DateCreated).
        Select(Function(s) New PostSummaryDTO With {
            .Id = s.PostRowID,
            .PostDateCreated = s.DateCreated,
            .PostSummary = s.Description,
            .PostTitle = s.Title,
            .IsPublished = s.IsPublished}).
        Skip((Page - 1) * PageSize).
        Take(PageSize).ToList()
    Return posts.ToList
End Using
You need two different methods in the lower layer - one to get the total count and one to get the desired page - and then call them both from your controller, passing both results in the model to the view. As such, the model cannot be a collection of records; it must be an object with a property for a collection of records and a property for the count. Either that or use the ViewBag to pass the count.
What we do in my office is have a service layer to contain the business logic and a repository to handle the data access. There is a single method in the repository that returns an IQueryable providing access to all the records for a particular table. There are then one or more methods in the service that call that repository method and use it in different ways.
In this case, there might be a GetTotalCount method and a GetPage method in the service. Both would call the same repository method to get the same IQueryable; the first would call Count on the result, while the second would call Skip and Take. As Skip and Take don't force execution of the query, you'd also call ToArray or the like in that second method. The service might also have a GetRecord method that takes an ID and calls FirstOrDefault to get a single record with a matching ID. You can roll the service and repository into a single class if you want, but I'd recommend separating the business logic from the data access.
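As a sketch of that arrangement (the method and member names are illustrative, not from the question's code):

' Repository: a single method exposing the table as an IQueryable.
Public Function GetPosts() As IQueryable(Of be_Posts)
    Return db.be_Posts
End Function

' Service: both methods consume the same IQueryable in different ways.
Public Function GetTotalCount() As Integer
    Return repository.GetPosts().Where(Function(p) p.IsPublished).Count()
End Function

Public Function GetPage(page As Integer, pageSize As Integer) As IEnumerable(Of PostSummaryDTO)
    ' ToArray() forces execution of the deferred query.
    Return repository.GetPosts().
        OrderByDescending(Function(p) p.DateCreated).
        Select(Function(s) New PostSummaryDTO With {
            .Id = s.PostRowID,
            .PostTitle = s.Title,
            .PostDateCreated = s.DateCreated}).
        Skip((page - 1) * pageSize).
        Take(pageSize).
        ToArray()
End Function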

Persist a top-level collection?

NHibernate allows me to query a database and get an IList of objects in return. Suppose I get a list of a couple of dozen objects and modify a half-dozen or so. Does NHibernate have a way to persist changes to the collection, or do I have to persist each object as I change it?
Here's an example. Suppose I run the following code:
var hql = "from Project";
var query = session.CreateQuery(hql);
var myProjectList = query.List<Project>();
I will get back an IList that contains all projects. Now suppose I execute the following code:
var myNewProject = new Project("My New Project");
myProjectList.Add(myNewProject);
And let's say I do this several times, adding several new projects to the list. Now I'm ready to persist the changes to the collection.
I'd like to persist the changes by simply passing myProjectList to the current ISession for updating. But ISession.SaveOrUpdate() appears to take only individual objects, not collections like myProjectList. Is there a way that I can persist changes to myProjectList, or do I have to persist each new object as I create it? Thanks for your help.
David Veeneman
Foresight Systems
If you load objects as in your example, then yes, you have to persist them one by one.
However, if you make a small design change and load something like an Account that has an IList<Project>, and you specify the cascade you need in the mapping, then when you change the projects on the Account you only have to save the Account and everything will get saved.
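As a sketch of both options (Account, its Projects collection, and the cascade value are illustrative, not from the question):

// Option 1: persist the modified/new objects one by one.
foreach (var project in changedProjects)
    session.SaveOrUpdate(project);

// Option 2: map a parent with a cascading collection, e.g. in Account.hbm.xml:
//   <bag name="Projects" cascade="all">
//     <key column="AccountId" />
//     <one-to-many class="Project" />
//   </bag>
// Then saving the parent cascades to the children:
var account = session.Get<Account>(accountId);
account.Projects.Add(new Project("My New Project"));
session.SaveOrUpdate(account); // the new Project is saved via the cascade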