What name does the SQL file in the sql folder need when I want to load custom SQL for the User model?
Initial SQL Data
From the link you provided:
The hook is simple: Django just looks for a file called sql/modelname.sql, in your app directory, where modelname is the model's name in lowercase.
So, if you had a Person model in an app called myapp, you could add arbitrary SQL to the file sql/person.sql inside your myapp directory. Here's an example of what the file might contain:
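Roughly, that example is just a couple of plain INSERT statements against the model's table (the table and column names below assume Django's default naming for a myapp.Person model with first_name and last_name fields):

-- sql/person.sql -- hypothetical initial data for myapp.Person
INSERT INTO myapp_person (first_name, last_name) VALUES ('John', 'Lennon');
INSERT INTO myapp_person (first_name, last_name) VALUES ('Paul', 'McCartney');

So for the User model from the question, the file would be sql/user.sql inside the app directory of the app that defines the model.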
I am trying to create a Lambda function that gets triggered once a folder is uploaded to an S3 bucket. But the Lambda will perform an operation that saves files back to the same folder, so how can I do that without ending up with a self-triggering function?
I want to upload the following folder structure to the bucket:
Project_0001/input/inputs.csv
The outputs will be created and saved at:
Project_0001/output/outputs.csv
But my project number will change, so I can't simply assign a static prefix. Is there a way to change the prefix dynamically, something like:
Project_*/input/
From Shubham's comment I drafted my solution using the prefix and suffix filters.
For my case, I set the prefix to 'Project_', and for the suffix I chose one specific file as the trigger, so my suffix is '/input/myFile.csv'.
So every time I upload the structure Project_*/input/allmyfiles_with_myFile.csv, it triggers the function, and then I save my output in the same project folder, under the output folder, thus not triggering the function again.
I get the project name with the following code:
key = event['Records'][0]['s3']['object']['key']
project_id = key.split("/")[0]
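A rough sketch of how the full handler could look, assuming boto3 and that the output goes under the same project's output/ prefix (the bucket name is taken from the event; out_key and the CSV body are placeholders):

import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # Key of the object that fired the trigger, e.g. "Project_0001/input/myFile.csv"
    key = event["Records"][0]["s3"]["object"]["key"]
    bucket = event["Records"][0]["s3"]["bucket"]["name"]
    project_id = key.split("/")[0]  # e.g. "Project_0001"

    # ... actual processing of the input files goes here ...

    # Writing under <project>/output/ does not match the '/input/myFile.csv'
    # suffix filter, so this upload does not re-trigger the function.
    out_key = f"{project_id}/output/outputs.csv"
    s3.put_object(Bucket=bucket, Key=out_key, Body=b"col1,col2\n")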
I have model properties files defined in .yml and queries defined in .sql. There are many models, and some properties are shared across models. For instance, I have used the meta tag in a .yml model properties file and need to define a value, let's say a:, that is used across multiple models' meta blocks. How can I define and reuse such values in the dbt model properties files?
Variables work not only in .sql files but also in .yml model properties files. Here is one sample of it:
https://github.com/dbt-labs/dbt-core/pull/2015
It works.
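As a rough sketch (the var name, model name, and values below are made up, and this assumes a dbt version that renders Jinja in .yml property files, per the PR above):

# dbt_project.yml
vars:
  shared_owner: data_team

# models/schema.yml
version: 2
models:
  - name: my_model
    meta:
      owner: "{{ var('shared_owner') }}"

The same var('shared_owner') call can then be reused in the .sql models.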
I am testing the azure-devops-migration-tools and have created a project using https://azuredevopsdemogenerator.azurewebsites.net/ (Parts Unlimited). I have generated the configuration.json and changed the Source and Target so I can test a migration, but I'm getting errors while migrating work items.
[15:14:41 ERR] Error running query
Microsoft.TeamFoundation.WorkItemTracking.Client.ValidationException: TF51005: The query references a field that does not exist. The error is caused by «ReflectedWorkItemId».
I've tried different options for the "ReflectedWorkItemIDFieldName" field (Scrum, Basic, Agile, Custom, empty), but I am still unable to migrate the work items.
How can I get the value to put in this field for a specific project?
Thanks,
Bruno
Quick Solution: Most ADO instances use the prefix 'custom' for new fields. Try "Custom.ReflectedWorkItemId" in your configuration.json to see if that resolves the problem.
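In the configuration.json that would look roughly like this (only the relevant setting is shown; the surrounding structure depends on your tool version):

{
  "ReflectedWorkItemIDFieldName": "Custom.ReflectedWorkItemId"
}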
More details: It's hard to tell without an actual configuration.json file to review. One possible problem is that you need to use the actual, full internal name of the ReflectedWorkItemID field. This isn't shown in ADO or in the process template when it is created. The recommendation is to create a query that references your custom field and export the WIQL (query) file. Once you export the WIQL file, you can open it and see the full syntax of the custom field.
Exporting queries: If you don't know how to do this, it can be done with Visual Studio. If you don't know how to do that, you can install this extension; it's a handy WIQL import/export tool and editor. Install it, and your ADO queries will have an "Edit in WIQL Editor" option. Create a query that exposes your ReflectedWorkItemID as a column, then edit that query in the WIQL editor to see the full name of the Reflected Work Item ID field. https://marketplace.visualstudio.com/items?itemName=ottostreifel.wiql-editor
SELECT
    [System.Id],
    [System.WorkItemType],
    [System.Title],
    [System.AssignedTo],
    [System.ChangedBy],
    [Custom.ReflectedWorkItemId]
FROM workitems
WHERE ...
I found a possible solution. I created a custom process, changed the projects over to this new process, and added a new field. This is the field I'm using in the configuration.json, and now I'm able to migrate work items.
To make the migration work, for the "ReflectedWorkItemIDFieldName" you must do the following:
"Organization Settings" -> Process -> select the process your project uses (Basic, Scrum, Agile, or CMMI).
Then click on the three dots and create a new inherited process.
Then, with the inherited process, you are able to create a new field for each work item type. Whatever name you type (it could be "IronMan") is the name that goes in your configuration file.
How do you generate a path based on dynamic data inside the model the file is being saved to? An example would be a User model with a fileAttachment field. If one instance of the User model has a registrationNumber of 123, I want to store their file at /123/fileName.pdf. If another user has a registrationNumber of 456, I want to store their file at /456/fileName.pdf. The path field accepts a string, and unfortunately at the time it is set there is no access to the model fields. In addition, the file has already been named and uploaded to AWS by the time the pre('save', ...) hook executes, which prevents renaming it there.
I added a custom EAV attribute to my Magento application's product entity using an installer script (basically following the procedure described here: Installing Custom Attributes with Your Module). Now, I want to use an update script to change (populate) the values of this attribute for each product according to some criteria (based on the product category). The script I attempted to use was essentially like this:
$attributeValues = array(...); // Map from $productId to the desired $value
$product = Mage::getModel('catalog/product');
foreach ($attributeValues as $productId => $value) {
    $product->load($productId)->setMyAttribute($value);
    $product->save();
}
My questions would then be: Is it OK to use this level of abstraction (Mage::getModel('catalog/product') and its methods) in update scripts? If it isn't, how would you recommend changing these attribute values in update scripts (without resorting to raw SQL)?
The script I have used so far has not worked; it failed with the error:
Call to a member function getStoreIds() on a non-object
in a Magento core file.
I don't know if this error is a Magento bug or a problem with how I'm using the update scripts.
I'm using Magento 1.4.0.1
Data update scripts are the way to go
Simply use a data upgrade script for this. This is a script placed in the data folder instead of the sql folder. Those scripts run later than the database structure updates and allow access to more functionality.
Example file names:
app/code/local/My/Module/data/your_setup/data-install-0.1.0.php
app/code/local/My/Module/data/your_setup/data-upgrade-0.1.0-0.2.0.php
This is already available in Magento 1.4.
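A rough sketch of what such a data upgrade script (e.g. data-upgrade-0.1.0-0.2.0.php from the paths above) could contain, reusing the loop from the question; setMyAttribute and the ID/value map are placeholders:

<?php
// Data upgrade script: runs after the structure updates, with more of the app available.
$installer = $this;
$installer->startSetup();

$attributeValues = array(
    // productId => value (placeholder data)
    1 => 'some value',
    2 => 'other value',
);

foreach ($attributeValues as $productId => $value) {
    $product = Mage::getModel('catalog/product')->load($productId);
    if ($product->getId()) {
        $product->setMyAttribute($value);
        $product->save();
    }
}

$installer->endSetup();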
Try adding Mage::app()->setUpdateMode(false) in your SQL upgrade script, e.g.:
$installer = new Mage_Eav_Model_Entity_Setup('core_setup');
$installer->startSetup();
Mage::app()->setUpdateMode(false);
Mage::app()->setCurrentStore('your_store_code_here');
If you look at Mage::app()->getStore(), you will see the following snippet, which returns an incorrect store (a proper store is required for saving a product).
if (!Mage::isInstalled() || $this->getUpdateMode()) {
    return $this->_getDefaultStore();
}