The order data in Virto is not dynamic in the dashboard. It's exactly the same data as shown in the demo version of Virto. Can we change this?
The data is taken from the statistics table in Virto Commerce, which is meant to be updated by a background job that gathers the statistics. We will include this job in the next release.
Virto Commerce 1.11 has been released and this background job is already there.
I'm looking for a workaround on the following issue. Hope someone can help.
I'm unable to backfill data into the ga_sessions_ tables in BigQuery through product linking in GA; e.g. the partition ga_sessions_20180517 is missing.
This specific view has already been linked before. Google documentation says that historical load is only done once per view (hence, the issue) (https://support.google.com/analytics/answer/3416092?hl=en)
Is there any way to work around it?
Kind regards,
Martijn
You can use the Google Analytics Reporting API to get the data for that view. This method has a lot of restrictions, such as the data sometimes being sampled and only 7 dimensions being exportable per call, but at least you will be able to fetch your data in a partitioned manner.
Documentation here.
If you need a lot of dimensions/metrics in hit level format, scitylana.com has a service that can provide this data historically.
If you have a clientId set in a custom dimension, the data quality is near perfect.
It also works without a clientId set.
You can get all the history available through the API.
You can get 100+ dimensions/metrics in one batch into BQ.
I just want to create a report from Jira (a chart showing the total bugs per week for 2 projects). I tried using the Time Since report; the generated graph displayed the total bugs for the 2 projects, but I want the 2 project names with the issues associated with each project for each day. Is there any way to get such a report?
I guess you are in search of a dashboard. A dashboard will keep you updated at regular intervals about the status of the project. To create a dashboard you will first need to create a filter, and then the dashboard itself.
Create a filter according to your requirements, e.g. search for issues logged by QA in a particular time period, in a particular sprint, etc.
This link will help you https://university.atlassian.com/uac/2.0/courses/end-user/jira/v60/search-and-filters/save-and-share-filter#/lesson-content-header
After creating the filter, create a dashboard.
Follow the steps mentioned in the link below: https://university.atlassian.com/uac/2.0/courses/end-user/jira/v60/dashboards/create-share-customize-a-dashboard
https://confluence.atlassian.com/display/JIRA/Customizing+the+Dashboard
Hope this helps you.
Hello,
I am new to nopCommerce. I have made changes in Nop.Core, Nop.Data and Nop.Services. I have also changed some controllers, models and views in Nop.Web.
If I wish to upgrade nopCommerce from version 2.8 to 3.10, which way is the easiest and best?
1) I back up my files and get the update. Once the update is finished, may I replace only those parts which I have modified and which differ from the original code? May I add new methods which are in my backup files but not in the original code?
2) Or do I have to create a new plugin, or go some other way?
[For example: I have changed the Product table and added new fields like size, age and color.]
Please let me know your valuable feedback.
Thanks
There is no straight right or wrong answer. I am describing the approach I took, assuming you have code changes and database changes on top of base nop 2.80.
Ground Work
Write down a detailed list of your modifications (the additional functions you have added on top of 2.80).
Check whether 3.10 supports any of your modifications out of the box.
My modification count was 250 (a very detailed list, down to the level of estimation).
Approach
Upgrade the 2.80 DB to the 3.10 DB schema.
Modify the 3.10 code to support your custom features from 2.80.
DB Upgrade
1) Find a good database diff tool, e.g. SQL Compare.
2) Restore your production (2.80) DB to your dev PC and install a clean nop 3.10 DB on your dev PC as well.
3) Compare both DBs table by table. Basically, you are going to upgrade the 2.80 DB to the 3.10 DB by comparing against the 3.10 schema.
4) Alter, delete or add columns in 2.80 by comparing against 3.10.
5) Create the Store information (Store table). This is a new feature in 3.10 and StoreId is needed by most other tables.
6) Update customer data to match the 3.10 schema.
7) Update product information. The ProductVariant table is now merged into the Product table, so the Product table needs to be updated accordingly.
8) Update order details. OrderVariant is now OrderItem, so move the data (a sketch follows this list).
9) Move the other tables.
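To make steps 7 and 8 a bit more concrete, here is a minimal, hedged sketch of the kind of statements involved. It assumes the 2.80 DB is being upgraded in place, that the Product table has already been altered to hold the variant-level columns, that the new OrderItem table already exists, and that there is one ProductVariant per Product; the column lists are illustrative, not the full nopCommerce schema, so check them against your real tables first.

-- Illustrative only, assuming one ProductVariant per Product:
-- pull the variant-level columns up into the (already altered) Product table.
UPDATE p
SET p.Sku = pv.Sku,
    p.Price = pv.Price,
    p.StockQuantity = pv.StockQuantity
FROM dbo.Product p
JOIN dbo.ProductVariant pv ON pv.ProductId = p.Id;

-- Illustrative only: move the old order lines into the new OrderItem table.
-- "OrderVariant" is the name used in this answer; check the actual 2.80 table name in your DB.
INSERT INTO dbo.OrderItem (OrderId, ProductId, Quantity, UnitPriceInclTax, UnitPriceExclTax)
SELECT ov.OrderId, pv.ProductId, ov.Quantity, ov.UnitPriceInclTax, ov.UnitPriceExclTax
FROM dbo.OrderVariant ov
JOIN dbo.ProductVariant pv ON pv.Id = ov.ProductVariantId;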
I used to create a single SQL script which:
Restores the production DB from a backup file.
Contains a script block for each table, which upgrades that table and populates its data.
This gives you the flexibility to run the script again and again if there is any error, which is also helpful while you are still writing it.
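A rough, hypothetical skeleton of such a re-runnable script (the database name Nop280, the backup path and the per-table blocks are placeholders only):

USE master;
GO
-- Re-create the working copy from the production backup so the whole script can be re-run safely.
RESTORE DATABASE Nop280
FROM DISK = N'D:\Backups\Production_280.bak'
WITH REPLACE;
GO

-- One block per table: alter the 2.80 table towards the 3.10 schema, then populate its data.
PRINT 'Upgrading Store table...';
-- ALTER TABLE / INSERT statements for the Store table go here
GO

PRINT 'Upgrading Customer table...';
-- ALTER TABLE / UPDATE statements for the Customer table go here
GO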
In addition to this, if you are merging 2 or more stores into one:
Add all the store information in step 5 (the Store table).
From this point on, create a separate script for each store.
You need to use different sequence numbers for OrderId and CustomerId; they can't be the same.
When you add the 2nd (or a later) store, check for existing customers before adding them (see the sketch below).
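A minimal sketch of those two points, under heavy assumptions: the reseed value of 100000, the Customer column list, the source database name Store2_280 and matching existing customers by Email are all illustrative, so adapt them to your own data.

-- Push the identity seeds of the merged DB past the IDs already used by the 1st store.
DBCC CHECKIDENT ('dbo.[Order]', RESEED, 100000);
DBCC CHECKIDENT ('dbo.Customer', RESEED, 100000);

-- When pulling in the 2nd store, skip customers that already exist (matched by Email here).
INSERT INTO dbo.Customer (CustomerGuid, Email, Active, CreatedOnUtc, LastActivityDateUtc)
SELECT s.CustomerGuid, s.Email, s.Active, s.CreatedOnUtc, s.LastActivityDateUtc
FROM Store2_280.dbo.Customer s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Customer c WHERE c.Email = s.Email);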
Check 01
Now take a fresh 3.10 code base and run it against your migrated DB. Everything should work well if you have done the migration properly.
Code Upgrade
There are significant changes to be made in the code, simply because there is no ProductVariant table anymore, so all the custom logic around it needs to be rewritten.
The main issue is invoicing. If you have more than one store, there is no per-store email setting, so you have to custom-modify that too.
A good approach would be:
Do all the customer-side eCommerce first.
Then do the admin side.
If customer and admin share the same functionality, do them together, for example a custom modification of the order-placing workflow.
There will not be big modifications needed for plugins.
Check 02
Run the migrated DB with the updated 3.10 code base. Everything should work.
On the Big Day
Back up the production DB and the production code base.
Run the upgrade scripts and replace the code base with the new one.
There is no 3rd step, since you have done all the hard work before this.
And OK, if you screw up, then roll back.
Things to Note
I learned these by testing. Thank god I found them before the actual migration.
At the time we were migrating there were no detailed instructions on how to set up a complete multi-store solution on the nopCommerce side. There is an instruction here on how to set up nopCommerce on a production server, but it does not cover all the aspects.
We were using a VPS to host our platform. If you are using a VPS, please be aware that SNI needs to be used if you set up multi-store properly. Only IIS 8 and above supports SNI, which means you need Windows Server 2012. See here and here for more on SNI.
We were using Plesk to manage the server, so we set up the master domain as primary and all other stores as aliases. On the IIS side, RDP into the VPS and set up SSL for each domain using the SNI feature of IIS 8.
The downside of SNI is that it is not supported by all older browsers. See here.
Limitations
If you are using Plesk, then email won't work very well, since a mailbox will be created only for the master domain and all the other aliases will share the same email accounts. So you can't send a reply from an alias email address. Unfortunately, this is outside the scope of nopCommerce development.
I haven't found a solution for this yet; I am working on it.
I'd recommend doing the database upgrade incrementally. According to the upgrade guide, you must apply the upgrade scripts one at a time; just read through the guide and have at it.
How can I view all installed SAP notes in a system?
If I remember correctly, SPAU and SPDD don't show any SAP Note that was regularly installed via support packages (SPAM). Only SAP Notes that were implemented by means of a modification/correction and are in conflict are listed there.
Please run transaction SPAM to see which support package level you already have.
What are you trying to accomplish here? If you just want to see if a note is already applied, then I believe you can pull the note up in transaction SNOTE and check.
Try transaction SPAU. On the Modifications tab, select "Corrections in SAP Note".
I'm building an application that runs on a Windows Mobile device. I'm using Microsoft's Sync Framework to sync the Sql CE database with the main corporate db.
The question is: how can I limit the fields that are synchronized? The table in question has stacks of fields, but I only need to display a few of them on the mobile device, and replication is only one way (from the server to the mobile), so that shouldn't be an issue. I've seen this similar question but there's not much info there. Can anyone give me more advice on how to achieve this? I imagine that it's a very common requirement.
Also, does anyone know if I can use the Sync Framework Version 2.0 or do I have to stick to 1.0. I had a feeling that 2.0 doesn't support Windows Mobile but I'm not sure.
Cheers
Mark
You can change the T-SQL that's generated behind the scenes to not include all the columns of the table, but there are a couple of gotchas here. Firstly, it means that you can't use a wizard to modify the sync selection later - not a big deal, and creating your own partial class to override just the specific method with the T-SQL for your table mitigates that a bit.
Second, changes to the unincluded (not sure if that's a word?) columns can also trigger a download of that row, as by default the change tracking is per row. You can change this by setting the TRACK_COLUMNS_UPDATED flag:
ALTER TABLE Employee
ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = ON)
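As a hedged illustration of what that flag enables (this is not the exact T-SQL the Sync Framework designer generates), a change-enumeration query can then skip rows where only the unincluded columns changed. It assumes Employee's primary key column is Id and that FirstName and Phone are among the synced columns:

DECLARE @last_sync_version BIGINT = 0;  -- placeholder; normally the anchor stored from the last sync

SELECT e.Id, e.FirstName, e.Phone
FROM dbo.Employee e
JOIN CHANGETABLE(CHANGES dbo.Employee, @last_sync_version) AS ct ON ct.Id = e.Id
WHERE ct.SYS_CHANGE_OPERATION IN ('I', 'U')
  AND (ct.SYS_CHANGE_COLUMNS IS NULL  -- inserts report NULL for the column mask
       OR CHANGE_TRACKING_IS_COLUMN_IN_MASK(
              COLUMNPROPERTY(OBJECT_ID('dbo.Employee'), 'FirstName', 'ColumnId'),
              ct.SYS_CHANGE_COLUMNS) = 1
       OR CHANGE_TRACKING_IS_COLUMN_IN_MASK(
              COLUMNPROPERTY(OBJECT_ID('dbo.Employee'), 'Phone', 'ColumnId'),
              ct.SYS_CHANGE_COLUMNS) = 1);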
Depending on the number of rows, the size of the data and how frequently it is updated, I have often found an easier solution is to put a trigger on the main table on the server that updates records in a separate table containing just the data you need, and then sync that. It makes it much easier to change what's downloaded later. This is obviously not a solution if you are downloading the entire works of Shakespeare, but for a few thousand records of a product catalogue I think it's perfectly feasible.
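For reference, a minimal sketch of that trigger approach, assuming the wide server table is the Employee table from the snippet above and EmployeeSlim is a hypothetical narrow copy that the device actually syncs against; table and column names are illustrative, and delete handling is omitted for brevity.

-- Narrow copy table holding only the columns the device needs.
CREATE TABLE dbo.EmployeeSlim (
    Id        INT           NOT NULL PRIMARY KEY,
    FirstName NVARCHAR(100) NOT NULL,
    LastName  NVARCHAR(100) NOT NULL,
    Phone     NVARCHAR(50)  NULL
);
GO

CREATE TRIGGER dbo.trg_Employee_SyncCopy
ON dbo.Employee
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Update rows that already exist in the slim table.
    UPDATE s
    SET s.FirstName = i.FirstName,
        s.LastName  = i.LastName,
        s.Phone     = i.Phone
    FROM dbo.EmployeeSlim s
    JOIN inserted i ON i.Id = s.Id;

    -- Insert rows that are new to the slim table.
    INSERT INTO dbo.EmployeeSlim (Id, FirstName, LastName, Phone)
    SELECT i.Id, i.FirstName, i.LastName, i.Phone
    FROM inserted i
    WHERE NOT EXISTS (SELECT 1 FROM dbo.EmployeeSlim s WHERE s.Id = i.Id);
END;
GO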