I'm coming to you for help after several days of research.
I have a PrestaShop 1.6.1.4 (multishop) installation. The shop is connected to an ERP to manage orders and customers.
Each customer belongs to multiple groups. The default group in my database is 3 (Customers).
My problem: several times a day, my database is updated automatically and the default group of all customers is changed to another value. On my shop, customers have to be in the "Customers" group to use the search engine and to place orders... This is a real problem!
Can someone help me?
Thanks a lot!
You should look for a module or custom function, probably related to the ERP integration. That is most likely the cause of the group modification.
In any case, a bulk update of all your customers' default group should involve three keywords you can search for in your source code: update, customer and id_default_group. So, if you can find a PHP or MySQL statement that combines these three keywords, you will have found the problem. But if it is part of the ERP integration, be careful about deleting or commenting out this code, because it can break your integration.
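In the meantime, if you just need to put customers back into the right group while you track down the culprit, something like the statements below can restore the default group. This is only a sketch: the group id (3) and the ps_ table prefix are taken from your description and the PrestaShop defaults, so check them against your own installation, and it does not fix the root cause.

-- Restore the default group for all customers (assumes the ps_ prefix and group id 3)
UPDATE ps_customer
SET id_default_group = 3;

-- Make sure every customer is also a member of that group
-- (INSERT IGNORE skips customers that already have the row)
INSERT IGNORE INTO ps_customer_group (id_customer, id_group)
SELECT id_customer, 3
FROM ps_customer;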
Good luck.
Related
We have a PrestaShop (version 1.7.2) eCommerce shop. Our initial testing is okay and our shop is ready for deployment. We need to remove the order records to have a clean version of the shop. Which table records do we need to remove?
Search your Modules Catalog for "cleaner"; the full name is PrestaShop Cleaner.
Install it and give it a go.
Official statement:
PrestaShop Cleaner can only be managed from the back office. It allows you to roughly clean your store by deleting your whole catalog, orders, and customers.
For example, this module allows you to suppress all demo data in one go. But this module must be used carefully: once you delete it all, there is no possible rollback. Everything is removed for good.
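If you would rather clear the test orders by hand instead of installing the module, the order data mostly lives in the ps_order* tables. The statements below are only a sketch of that idea: they assume the default ps_ table prefix and a standard 1.7 schema, so back up the database and check the table list against your own installation before running anything.

-- Sketch: wipe test order data (assumes the default ps_ prefix; back up first)
TRUNCATE ps_orders;
TRUNCATE ps_order_detail;
TRUNCATE ps_order_history;
TRUNCATE ps_order_payment;
TRUNCATE ps_order_invoice;
TRUNCATE ps_order_carrier;
-- Carts created during testing usually need cleaning as well
TRUNCATE ps_cart;
TRUNCATE ps_cart_product;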
Hello experts, I work at a banking software vendor and we use a PostgreSQL database. Because of the workload during office hours (peak hours), I want to execute some functions at a specified time (off hours) using a trigger. If you have any idea, please help me.
A trigger should always be something fast. You don't want to hold transactions open for a couple of hours; that would be a really bad idea.
The correct solution for such a problem would be a queue. There are some existing implementations like PGQ, but I don't know whether they'll meet your requirements.
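A minimal sketch of that idea in plain SQL: the trigger only writes a row into a queue table, and a separate job that runs off-hours does the heavy work (here scheduled with the pg_cron extension, but an external cron job calling psql works just as well). The table, function and job names below are made up for the example.

-- Queue table: the trigger only records what needs processing later
CREATE TABLE work_queue (
    id         bigserial PRIMARY KEY,
    account_id bigint NOT NULL,
    queued_at  timestamptz NOT NULL DEFAULT now(),
    done       boolean NOT NULL DEFAULT false
);

-- Trigger function: fast, just a single INSERT
CREATE FUNCTION enqueue_account() RETURNS trigger AS $$
BEGIN
    INSERT INTO work_queue (account_id) VALUES (NEW.id);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- EXECUTE FUNCTION needs PostgreSQL 11+; use EXECUTE PROCEDURE on older versions
CREATE TRIGGER account_changed
AFTER INSERT OR UPDATE ON accounts
FOR EACH ROW EXECUTE FUNCTION enqueue_account();

-- Off-hours worker, e.g. scheduled at 02:00 with pg_cron:
-- SELECT cron.schedule('nightly-worker', '0 2 * * *', $$SELECT process_work_queue()$$);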
I am currently working on a project that requires me to upload historical work orders, notifications, long texts, etc. After completing my initial development using the BAPI BAPI_ALM_ORDER_MAINTAIN, I noticed that this function module creates regular work orders, not historical work orders.
Is there a way to designate in this BAPI that I would like historical orders to be created instead, or should I look at another function module?
EDIT: I have posted the answer below.
I know this is late, but I just remembered to update this. We ended up using a custom ABAP report paired with the SARA tool provided by SAP to update and archive the orders.
I'm building an application that runs on a Windows Mobile device. I'm using Microsoft's Sync Framework to sync the Sql CE database with the main corporate db.
The question is: how can I limit the fields that are synchronized? The table in question has stacks of fields, but I only need to display a few of them on the mobile device, and replication is only one way (from the server to the mobile), so that shouldn't be an issue. I've seen this similar question but there's not much info there. Can anyone give me more advice on how to achieve this? I imagine that it's a very common requirement.
Also, does anyone know if I can use the Sync Framework Version 2.0 or do I have to stick to 1.0. I had a feeling that 2.0 doesn't support Windows Mobile but I'm not sure.
Cheers
Mark
You can change the T-SQL that's generated behind the scenes to not include all the columns of the table, but there are a couple of gotchas here. First, it means that you can't use a wizard to modify the sync selection later - not a big deal, and creating your own partial class to override just the specific method with the T-SQL for your table mitigates that a bit.
Second, changes to the excluded columns can also trigger a download of that row, as by default change tracking is per row. You can change this by setting the TRACK_COLUMNS_UPDATED flag:
ALTER TABLE Employee
ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = ON)
Depending on the number of rows, the size of the data and the update frequency, I have often found an easier solution is to put a trigger on the main table on the server that updates records in a separate table containing just the data you need, then sync that. It makes it much easier to change what's downloaded later. This is obviously not a solution if you are downloading the entire works of Shakespeare, but for a few thousand records of a product catalogue, I think it's perfectly feasible.
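As a rough sketch of that approach, the server could keep a narrow ProductSync table up to date with a trigger on the full Product table and expose only ProductSync to the Sync Framework. The table and column names here are invented for the example; deletes would need a similar trigger.

-- Narrow copy of just the columns the device needs (names are examples)
CREATE TABLE ProductSync (
    ProductId int PRIMARY KEY,
    Name      nvarchar(100),
    Price     money
);

-- Keep the narrow table in step with the wide Product table
CREATE TRIGGER trg_Product_Sync ON Product
AFTER INSERT, UPDATE
AS
BEGIN
    MERGE ProductSync AS t
    USING (SELECT ProductId, Name, Price FROM inserted) AS s
        ON t.ProductId = s.ProductId
    WHEN MATCHED THEN
        UPDATE SET Name = s.Name, Price = s.Price
    WHEN NOT MATCHED THEN
        INSERT (ProductId, Name, Price) VALUES (s.ProductId, s.Name, s.Price);
END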
I run a blog where the community can post time-sensitive community links (sports scores and such). After a certain time, those posts are no longer useful, so I want to delete them in a batch via a MySQL query, but I don't know how. I imagine that getting rid of those posts entirely is more than just deleting from the wp_posts table, right? There are other tables at work per post, aren't there?
I've tried a couple of auto- or batch-delete plugins, but they don't work half the time.
Please, could you provide a MySQL query to delete posts and their pieces older than X days from all relevant tables?
Thank you in advance.
--Nick
WordPress has published its database structure in its Codex, which is the best place to see exactly which tables are involved.
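Building on that structure, the query below is a sketch of the kind of batch delete you are after. It assumes the default wp_ table prefix, that the community links use the standard post post type, and a 30-day cutoff; revisions and attachments reference their parent via post_parent and may need a second pass. Test it on a backup first, as there is no undo.

-- Sketch: delete posts older than 30 days plus their metadata, term links and comments
-- (assumes the default wp_ prefix and the 'post' post type; back up before running)
DELETE p, pm, tr, c, cm
FROM wp_posts AS p
LEFT JOIN wp_postmeta AS pm           ON pm.post_id = p.ID
LEFT JOIN wp_term_relationships AS tr ON tr.object_id = p.ID
LEFT JOIN wp_comments AS c            ON c.comment_post_ID = p.ID
LEFT JOIN wp_commentmeta AS cm        ON cm.comment_id = c.comment_ID
WHERE p.post_type = 'post'
  AND p.post_date < NOW() - INTERVAL 30 DAY;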