I have a table called Logs that contains a column called process, whose data I want to update by running a migration file. The values have the structure "worker <ID>", or they may contain "master". What I want to do is replace only the word "worker" with "subordinate process" and keep the ID, or, when the value is "master", replace it with "leading process".
My code runs, but it completely replaces "worker <ID>" with "subordinate process". Does anyone know how to implement the replace function, or can someone help me refactor it?
async migrateTenant(tenantID: string): Promise<void> {
  const transaction: Transaction = HDBUtils.createTransaction(tenantID);
  // @ts-ignore
  const { Logs } = cds.entities(Constants.CDS_NAMESPACE);
  const substitutions = [
    { src: 'master', dst: Constants.LEADING_PROCESS },
    { src: 'worker', dst: Constants.SUBORDINATE_PROCESS },
  ];
  let updated = 0;
  for (const substitution of substitutions) {
    try {
      await transaction.run(
        UPDATE.entity(Logs)
          .set({ process: substitution.dst })
          .where({ process: { like: substitution.src + '%' } })
      );
      await transaction.commit();
      updated++;
    } catch (error) {
      await HDBUtils.handleHDBError(MODULE_NAME, 'migrateTenant', tenantID, transaction, 'Unable to migrate data', error);
    }
  }
}
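One way to keep the ID is to read the affected rows and rewrite only the "worker" prefix in JavaScript before writing each value back. A minimal sketch, assuming Logs has an ID key column and that CAP's SELECT.from is available alongside UPDATE.entity (both of these are assumptions, not taken from the original code):

// Sketch only: assumes Logs has an "ID" key column.
const workerRows = await transaction.run(
  SELECT.from(Logs).columns('ID', 'process').where({ process: { like: 'worker%' } })
);
for (const row of workerRows) {
  await transaction.run(
    UPDATE.entity(Logs)
      // Replace only the "worker" prefix, keeping the trailing ID.
      .set({ process: row.process.replace('worker', Constants.SUBORDINATE_PROCESS) })
      .where({ ID: row.ID })
  );
}
// "master" has no ID suffix, so a plain value update is enough.
await transaction.run(
  UPDATE.entity(Logs).set({ process: Constants.LEADING_PROCESS }).where({ process: 'master' })
);
await transaction.commit();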
I have a backend built with Express and Mongoose.
All my mutations and queries work perfectly, except one mutation that leaves me with an infinite loader:
updateVehicleVerification: async (_, { id, updateVehicleVerification }) => {
  const vehicleVeri = await VehicleVerification.findById(id);
  if (!vehicleVeri) {
    throw new Error(ErrorMessage + ' : Verification de Vehicule');
  }
  await VehicleVerification.findByIdAndUpdate(
    id,
    updateVehicleVerification
  );
  const veri = await VehicleVerification.findById(id);
  return veri;
},
and here is the mutation I use:
export const UPDATE_CONTROL_VEHICLE = gql`
  mutation updateVehicleVerification(
    $id: String!
    $updateVehicleVerification: VerificationVehicleInput
  ) {
    updateVehicleVerification(
      id: $id
      updateVehicleVerification: $updateVehicleVerification
    ) {
      honk {
        state
        image
        comment
      }
      mileage
      dateVerification
      stateVehicle {
        damaged
        good
        missing
      }
    }
  }
`;
I solved the problem!
I was not handling the case where the data arriving from the request caused an infinite loop.
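As a side note (not part of the fix above): with Mongoose, findByIdAndUpdate accepts a { new: true } option that resolves with the document as it is after the update, which removes the extra findById round trips:

updateVehicleVerification: async (_, { id, updateVehicleVerification }) => {
  // { new: true } makes Mongoose return the updated document instead of the old one.
  const updated = await VehicleVerification.findByIdAndUpdate(
    id,
    updateVehicleVerification,
    { new: true }
  );
  if (!updated) {
    throw new Error(ErrorMessage + ' : Verification de Vehicule');
  }
  return updated;
},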
In a Vue component handling user subscriptions to newsletters, I have the following code:
async newSubscriber(event) {
  // Validate email
  //---------------
  if (!this.isEmailValid(this.subscriber_email))
    this.subscribeResult = "Email not valid";
  else {
    // If valid, check if email is not already recorded
    //-------------------------------------------------
    let alreadyRecorded = false;
    let recordedEmails = await this.$apollo.query({ query: gql`query { newslettersEmails { email } }` });
    console.log('length ' + recordedEmails.data.newslettersEmails.length);
    console.log(recordedEmails.data.newslettersEmails);
    for (let i = 0; !alreadyRecorded && i < recordedEmails.data.newslettersEmails.length; i++)
      alreadyRecorded = this.subscriber_email === recordedEmails.data.newslettersEmails[i].email;
    if (alreadyRecorded)
      this.subscribeResult = "Email already recorded";
    else {
      // If not, record it and warn the user
      //------------------------------------
      this.$apollo.mutate({
        mutation: gql`mutation ($subscriber_email: String!){
          createNewslettersEmail(input: { data: { email: $subscriber_email } }) {
            newslettersEmail {
              email
            }
          }
        }`,
        variables: {
          subscriber_email: this.subscriber_email,
        }
      })
      .then((data) => { this.subscribeResult = "Email recorded"; })
      .catch((error) => { this.subscribeResult = "Error recording the email: " + error.graphQLErrors[0].message; });
    }
  }
}
At the very first email subscription test, $apollo.query returns the correct number of emails already recorded (let's say 10) and records the new subscriber's email. But if I try to record a second email without hard refreshing the browser (F5), $apollo.query returns exactly the same result as the first time (10), EVEN THOUGH the first test email has been correctly recorded by Strapi (the GraphQL playground shows me the added email with the very same query!). Even if I add ten emails, Apollo will always return what it got during its first call (10 recorded emails), as if it were using a buffered result. Of course, that allows Vue to record the same email several times, which I obviously want to avoid!
Does this ring a bell for anyone?
After a lot of Google digging (which finally gave the desired results once I simply swapped "buffering" for "caching" in my searches!), I understood that Apollo caches its queries by default (at least in the configuration of the Vue project I received). To solve the problem, I just added fetchPolicy: 'network-only' to the query I make:
let recordedEmails = await this.$apollo.query({
  query: gql`query { newslettersEmails { email } }`,
});
became
let recordedEmails = await this.$apollo.query({
  query: gql`query { newslettersEmails { email } }`,
  fetchPolicy: 'network-only'
});
And problem solved ^^
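If every query in the app should bypass the cache, the fetch policy can also be set once when the Apollo client is created instead of per query. A sketch assuming @apollo/client and a hypothetical Strapi endpoint URL (neither is taken from the original project):

import { ApolloClient, InMemoryCache } from '@apollo/client';

// Hypothetical client setup: defaultOptions applies 'network-only'
// to every query instead of repeating fetchPolicy on each call.
const apolloClient = new ApolloClient({
  uri: 'http://localhost:1337/graphql', // hypothetical Strapi GraphQL endpoint
  cache: new InMemoryCache(),
  defaultOptions: {
    query: { fetchPolicy: 'network-only' },
    watchQuery: { fetchPolicy: 'network-only' },
  },
});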
I need help from somebody experienced. I've recently built two microservices (let's call them Amber and Boris) that communicate with each other using ClientProxy and Redis. From time to time, when Amber asks Boris for data, it gets a timeout error.
This is Amber config:
constructor(companyName: string, userId: number) {
  this.companyName = companyName;
  this.userId = userId;
  this.client = ClientProxyFactory.create({
    transport: Transport.REDIS,
    options: {
      retryAttempts: 0,
      retryDelay: 0,
      url: 'redis://<some_url>:<some_port>',
    },
  });
}
Then request-response:
private async sendRequest(pattern: string, payload?: object): Promise<any[]> {
  payload = payload || {};
  try {
    const result = await this.client.send(
      { type: pattern },
      { userId: this.userId, companyName: this.companyName, ...payload }
    )
      .pipe(
        timeout(30000),
        map((response: any) => { // Success...
          return response;
        }),
        catchError((error) => { // Error...
          return throwError(error);
        }),
      )
      .toPromise();
    return result;
  } catch (err) {
    Logger.error('Couldn\'t get data from Boris service: ' + err.message)
  }
}
Then on the Boris service, I basically just have a controller set up with @MessagePattern, and I'm just returning the data:
@MessagePattern({ type: 'getAvailableCases' })
findAll(@Payload() data: object): Promise<object> {
  this.assignPayload(data);
  return this.getData();
}
Important to say: the Boris service queries the database in order to return the data, but there seems to be no problem on the DB side.
What I'm most interested in is:
whether I have the ClientProxy set up properly
whether I have the response processing set up properly with pipe() and toPromise(), as I'm not very familiar with ClientProxy and RxJS.
Thank you a hundred times for any response!
It turned out the ClientProxy wasn't releasing its connections to Redis after the communication was done, so the number of connections kept increasing until there were none left.
The solution is to close the connection after the data are returned:
this.client.close();
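For illustration, the close call could live in the existing sendRequest helper, in a finally block so the connection is released on both the success and the error path (a sketch of one possible placement, not necessarily the exact original fix):

private async sendRequest(pattern: string, payload?: object): Promise<any[]> {
  payload = payload || {};
  try {
    return await this.client
      .send({ type: pattern }, { userId: this.userId, companyName: this.companyName, ...payload })
      .pipe(timeout(30000))
      .toPromise();
  } catch (err) {
    Logger.error('Couldn\'t get data from Boris service: ' + err.message);
  } finally {
    // Release the underlying Redis connection so connections don't pile up.
    this.client.close();
  }
}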
I'm having an issue with an array that seems to be getting populated by my Mongoose code by itself, which makes it impossible to populate the array with modified values.
Here's the code:
router.get('/in-progress', function(req, res) {
  console.log('exporting');
  var dataset = [];
  Intake.find({}, function(err, intakes) {
    if(err){
      console.log(err);
    } else {
      /*intakes.forEach(function(intake) {
        dataset.push(
          {
            //requestName: intake.requestName,
            requestName: 'Request Name',
            status: intake.phase
          }
        )
      });*/
      return dataset;
    }
  }).then((dataset) => {
    console.log(dataset);
    const report = excel.buildExport(
      [
        {
          heading: inProgressHeading,
          specification: inProgressSpec,
          data: dataset
        }
      ]
    );
    res.attachment('requests-in-progress.xlsx');
    return res.send(report);
  });
});
As you can see, the logic to push data to "dataset" is commented out, but the console log is logging every Intake that I have in the MongoDB database. Does anyone know what I am doing wrong so that I can push my own values into "dataset"?
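What seems to be happening: Intake.find() returns a thenable query, so the (dataset) parameter of .then() is not the outer dataset variable but the documents the query resolves with, which shadows the empty array; mixing the callback style with .then() can also execute the query twice. A sketch of the route using async/await and a dataset built explicitly (the field mapping is taken from the commented-out code and is only an assumption):

router.get('/in-progress', async function(req, res) {
  try {
    const intakes = await Intake.find({});
    // Build the dataset explicitly from the returned documents.
    const dataset = intakes.map(function(intake) {
      return {
        requestName: intake.requestName,
        status: intake.phase
      };
    });
    const report = excel.buildExport([
      {
        heading: inProgressHeading,
        specification: inProgressSpec,
        data: dataset
      }
    ]);
    res.attachment('requests-in-progress.xlsx');
    return res.send(report);
  } catch (err) {
    console.log(err);
    return res.status(500).send('Unable to build the export');
  }
});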
In Rally SDK 2, how do I update a hash field, like the Author field for a changeset? I read how to update the Message field, but I can't figure out how to update Author["DisplayName"] hash.
var new_message = settings.message;
Rally.data.ModelFactory.getModel({
  type: 'Changeset',
  success: function(model) {
    model.load( '1234', {
      fetch: [ 'Artifacts' ],
      callback: function(result, operation) {
        if ( operation.wasSuccessful() ){
          var message = new_message;
          result.set( 'Message', message);
          result.save( {
            callback: function( resultset, operation ) {
              console.log( "After saving:", resultset );
              if ( operation.wasSuccessful() ) {
                var that = tree.ownerCt.ownerCt.ownerCt.ownerCt;
                that._getChangesets();
              }
            }
          } );
        }
      }
    })
  }
});
The Author property on Changeset is of type User. Like any other object association on Rally's WSAPI, you just set this property to the ref of the object you'd like to link. You set it the same way you're currently setting Message in your code snippet above (assuming Author is writable after the changeset has already been created).
record.set('Author', '/user/123456');
You can probably also avoid the deeply nested structure of your code a little bit by specifying scope on your callbacks and using member functions in your app definition:
_loadChangesetModel: function() {
  //If you already have a changeset record you can get the model
  //via record.self. Otherwise, load it fresh.
  Rally.data.ModelFactory.getModel({
    type: 'Changeset',
    success: this._onChangesetModelLoaded,
    scope: this
  });
},
_onChangesetModelLoaded: function(model) {
  model.load( '1234', {
    fetch: [ 'Artifacts' ],
    callback: this._onChangesetLoaded,
    scope: this
  });
},
_onChangesetLoaded: function(record, operation) {
  if ( operation.wasSuccessful() ){
    var message = settings.message;
    record.set( 'Message', message);
    record.save( {
      callback: this._onChangesetSaved,
      scope: this
    } );
  }
},
_onChangesetSaved: function( resultset, operation ) {
  console.log( "After saving:", resultset );
  if ( operation.wasSuccessful() ) {
    //You shouldn't need to do this now that the scope is correct.
    //I'm guessing 'that' was referring to the app itself?
    //var that = tree.ownerCt.ownerCt.ownerCt.ownerCt;
    this._getChangesets();
  }
},
_getChangesets: function() {
  //refresh
}