Is there a way for JHipster to automatically regenerate the CRUD code when generating an entity a second time?
For instance, I forgot to add a field to an entity and replayed "yo entity ...".
I modified the .json configuration file as described in the docs, but when I checked the page, the changes didn't appear.
Has anyone met the same issue?
Thanks for your help, JM
What I do is remove all the generated code (except the .jhipster/*.json file) and run yo jhipster:entity again. When yo jhipster:entity has run, it outputs all the created files. I save that list in a text file for the next time I want to change the entity; this way I can remove all the generated files easily. I know it's not ideal, but it works for me.
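For example, the cleanup step can be scripted along these lines (a sketch only; Python is used for illustration, and the list filename "entity-files.txt" stands in for whatever you saved the generator output as):

    # delete the files listed in the saved "yo jhipster:entity" output,
    # but keep the .jhipster/*.json entity definitions
    from pathlib import Path

    for line in Path("entity-files.txt").read_text().splitlines():
        path = Path(line.strip())
        if path.is_file() and ".jhipster" not in path.parts:
            path.unlink()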
Recently, my company started to focus on Extension_v2 development for Dynamics NAV BC. We store our code on an internal Git server. So far, so good.
But starting a new project is still a very fiddly task. You have to create a repository, clone it, execute the AL code task, move the files to the right location, push the repository to the correct upstream, etc. And all of this does not include the first initial steps (README, CHANGELOG, and all the other fundamental files...).
So I wanted to write a small PowerShell script to do all these initial steps before starting to work on the project.
The problem: I could not find a way to execute the "AL-GO!" task via script.
I have already searched the Internet and some forums for an answer, but it seems like Microsoft did not consider the possibility of executing tasks from the AL Language extension via script.
I also played around with the New-NAVAppManifest and New-NAVAppManifestFile commands from the old Extension_V1 development, but that did not do the trick.
I am looking for a fair and easy way to combine the creation of the app.json file and the launch.json file with other commands, to initialize a new project without having to write everything manually. Maybe I did not recognize the easy solution, or maybe this is just the way we have to do it in Extension_v2.
Anyway, thanks for all your help.
Greetings.
Stay away from Ext V1. It's highly deprecated at this point.
First of all, why do you need to execute "AL-Go!" via script? The "AL-Go!" command should already include all the necessary steps to create an empty project, including the launch.json and app.json (minimal adjustments are required depending on your BC environment).
There is an extension/plugin for Git in Visual Studio Code which will handle all the repository work for you. You don't need to move files around if everything is set up for Git. I have rarely used it so far, but I saw a demo of it at Directions EMEA last year and I'm pretty sure it works in its current state (someone correct me if I'm wrong).
One way to run the "AL-GO!" command from a script, or to hook additional steps into your project setup, might be to write your own Visual Studio Code extension/plugin, which requires some additional know-how.
OR
You just change the settings/files of the default project; I bet there is at least a template file for creating the initial AL project. Just change that to your requirements.
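If you do end up scripting it yourself, the two JSON files can simply be generated. A rough sketch (Python used for illustration; the manifest fields follow the AL documentation, but every value here is a placeholder to adapt to your environment):

    # sketch: generate a minimal app.json and launch.json for a new AL project
    import json
    import uuid
    from pathlib import Path

    app_manifest = {
        "id": str(uuid.uuid4()),
        "name": "MyExtension",        # placeholder
        "publisher": "MyCompany",     # placeholder
        "version": "1.0.0.0",
        "idRanges": [{"from": 50100, "to": 50149}],
    }

    launch_config = {
        "version": "0.2.0",
        "configurations": [{
            "type": "al",
            "request": "launch",
            "name": "Local server",
            "server": "http://localhost",   # adjust to your BC server
            "serverInstance": "BC",         # adjust to your instance name
            "authentication": "Windows",
        }],
    }

    Path("app.json").write_text(json.dumps(app_manifest, indent=2))
    Path(".vscode").mkdir(exist_ok=True)
    Path(".vscode/launch.json").write_text(json.dumps(launch_config, indent=2))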
Sorry if this seems stupid, but I wonder if it's possible to add a database entry after an FTP upload.
To be more clear: thanks to WinSCP, I have several folders that automatically send everything I put in them to my server.
However, I would like to create a MySQL entry for each uploaded file, once again automatically. Is it possible to do that? How?
To give the full details of what I need to do, read the following.
I have several folders with pictures, and each folder is uploaded automatically.
Each of those folders belongs to one user, and the goal is to give each user an account and allow them to see and download those files through a web interface. Since one account = one folder, that's kinda easy.
And I think a simple .htaccess can secure things so one user can only see and download the files in his own directory, no?
However, if I want them to be able to see what's new (i.e. something they didn't download or simply mark as read), I think I need a table to manage those files.
Something like id | file (string) | read (bool).
If you think this way of proceeding is bad, then I'm open to changing how things are done, but to be clear: uploading the files has to work this way, not through any kind of form.
Thanks for reading, and sorry for my English.
Your problem contains three steps:
Folders/files are automatically uploaded to your server directory; as you say, this is handled efficiently by WinSCP.
You need to update your database with all the files and folders present in your server directory.
You need to track whether or not each file has been read/downloaded by the user.
Since your first step is in place, we don't need anything there. For the second step, you should write a script and schedule it to run at a fixed interval using cron (on Linux/UNIX; on Windows, use the Task Scheduler). The script would be responsible for creating a list of the files present in the directory and inserting the information for any files that are not yet present in your database.
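For example, a crontab entry like the following would run such a script every five minutes (the script path is illustrative):

    */5 * * * * /usr/bin/python3 /opt/scripts/sync_uploads.py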
EDIT:
This edit describes how your script file should work. As I explained, the cron job simply runs your script file at a fixed interval (which can be every minute, every hour, every day, and so on). Let's say your database table has the following columns:
fileid (varchar(20))
filepath (varchar(20))
status (boolean)
Your script file should do the following (a sketch follows the list):
Create a list of the file paths existing in your server directory (list 1).
Run a SELECT query and build a list of the file paths already present in the database table (list 2).
Compare list 1 with list 2 and find the entries that don't exist in list 2 (this gives you the list of file paths that need to be inserted into the table).
Insert the list of file paths you got above, setting their status to false (which means the file has not been read/downloaded yet).
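A minimal sketch of that script (Python and the pymysql client are used purely for illustration; the table/column names follow the example above, and the connection settings and directory path are placeholders):

    # sketch: register newly uploaded files in the database, status = unread
    import os
    import pymysql  # any MySQL client library works the same way

    UPLOAD_DIR = "/var/ftp/uploads"  # illustrative path

    conn = pymysql.connect(host="localhost", user="dbuser",
                           password="dbpass", database="uploads")
    cur = conn.cursor()

    # list 1: file paths currently on disk
    on_disk = set()
    for root, _dirs, files in os.walk(UPLOAD_DIR):
        for name in files:
            on_disk.add(os.path.join(root, name))

    # list 2: file paths already registered in the table
    cur.execute("SELECT filepath FROM files")
    in_db = {row[0] for row in cur.fetchall()}

    # insert the difference with status = false (unread)
    for path in sorted(on_disk - in_db):
        cur.execute("INSERT INTO files (filepath, status) VALUES (%s, FALSE)",
                    (path,))

    conn.commit()
    conn.close()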
NOTE: Please keep in mind that I am not prescribing how your database table should look. It can be what you have proposed, or it can differ depending on your requirements.
For the third step, simply set the status of each file to unread when creating the entries in your table during the second step. Then, when the user clicks the file link in your application, whether to view or download it, send a POST request to your server to update the file's status to read.
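On the server side, that handler only needs to run a single UPDATE. A sketch (Flask is used purely for illustration; the route, field name, and connection settings are assumptions):

    # sketch: POST handler that marks a file as read
    from flask import Flask, request
    import pymysql

    app = Flask(__name__)

    @app.route("/mark-read", methods=["POST"])
    def mark_read():
        file_id = request.form["fileid"]
        conn = pymysql.connect(host="localhost", user="dbuser",
                               password="dbpass", database="uploads")
        cur = conn.cursor()
        cur.execute("UPDATE files SET status = TRUE WHERE fileid = %s",
                    (file_id,))
        conn.commit()
        conn.close()
        return "", 204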
Let me know if this helps!
We have custom content types that were created as extensions of the ATTypes: two of them extend the ATFile type and one extends the ATImage type. We recently upgraded from Plone 4.2 to Plone 4.3.2, and just discovered we are not using blob storage at all. No wonder our Data.fs is HUGE. So, I have been trying to migrate these custom types.
I have followed all of the steps explained in this example and the product's notes from pypi, these Plone instructions, and used the example from the pypi page for archetypes.schemaextender (Sorry, since I'm still a noob my reputation won't let me post more than 2 links).
In the end, I created an extender script that just extends the ATFile type, changing the FileField to a BlobField. It seems to be working for new items: I can add a new CustomFileType and it appears to upload the file to blob storage, and my new upload field is showing (I changed the description as a quick way to verify which field it was using).
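For reference, a trimmed-down sketch of that extender approach (class names are mine and illustrative; the adapter still has to be registered in ZCML as in the schemaextender example, and whether a same-named field cleanly replaces the original may depend on the schemaextender version):

    # sketch: extend ATFile with a blob-aware 'file' field
    from zope.component import adapts
    from zope.interface import implements
    from archetypes.schemaextender.field import ExtensionField
    from archetypes.schemaextender.interfaces import ISchemaExtender
    from plone.app.blob.field import BlobField
    from Products.ATContentTypes.interfaces import IATFile

    class ExtensionBlobField(ExtensionField, BlobField):
        """BlobField that can be injected via schemaextender."""

    class FileBlobExtender(object):
        adapts(IATFile)
        implements(ISchemaExtender)

        fields = [ExtensionBlobField('file', required=True)]

        def __init__(self, context):
            self.context = context

        def getFields(self):
            return self.fields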
However, I am having a problem migrating the existing content items to move their binary files over to blob storage. I tried the generic migrate() script, then I created my own migrator and walker as suggested in the resources above. It doesn't seem to be doing anything, though. When printing the results for each item it tries to migrate, I do see this returned for each item:
DEBUG ATCT.migration Migrating /site/path/to/custom/file/filename.ext (CustomFile -> Blob)
When I navigate to the custom file item in the site, where it usually shows the link to the file, it is just empty. Going to edit, it acts as if there is no file there. As a check, I disabled the extender, restarted, and reloaded the custom file; the file was there again. So it looks like the script I am running just isn't moving the file over to where it should be now.
I feel like I am missing something simple, and it is right there, but I can't seem to find it. All of this is learn as I go and a bit over my head, so hopefully someone can easily set me straight.
If I need to provide any additional information leave a comment and I will try to provide what you need.
UPDATE
I used the Red Turtle objects as examples to migrate my custom types, as suggested by keul. I still was not able to get the file to migrate to blob within the type itself, so I tried a different approach: I created a new custom type, "CustomBlob", that mimics the setup of my CustomFile type, and made only this new type blob-aware. Then I migrated the CustomFiles to CustomBlob, did a complete clear and rebuild, and packed the ZEO database. The migration seemed to work for the most part: the blobstorage grew by the expected amount and the new types work. However, the Data.fs didn't go down in size. I would have thought that the binary files stored in Data.fs would be removed during the migration. Am I understanding this incorrectly? How can I remove these files so the Data.fs size goes down appropriately?
Not sure if this is the best solution, but here is how I was able to get this to work.
I created temporary content types parallel to each type (for CustomImage I made CustomImageBlob, and so on). I made the new types blob-aware only, and migrated all items to their parallel types. Then I enabled the extender for the original types to make them blob-aware, and migrated back. It is a little redundant and time-consuming, but I just could not get the files to migrate to blob when migrating a type to itself. A sketch of the migrator follows.
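A rough sketch of one direction of that migration, based on Products.contentmigration (the type names match the parallel-type pattern above; treat the exact import paths and walker arguments as assumptions to verify against your installed version):

    # sketch: migrate CustomFile items to the parallel blob-aware type
    from Products.contentmigration.basemigrator.walker import CustomQueryWalker
    from Products.contentmigration.archetypes import InplaceATItemMigrator

    class CustomFileMigrator(InplaceATItemMigrator):
        src_portal_type = 'CustomFile'
        src_meta_type = 'CustomFile'
        dst_portal_type = 'CustomFileBlob'
        dst_meta_type = 'CustomFileBlob'

    def migrate_custom_files(portal):
        walker = CustomQueryWalker(portal, CustomFileMigrator,
                                   query={'portal_type': 'CustomFile'})
        walker.go()
        return walker.getOutput()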
Providing this as the best answer so far, in case it helps someone else or encourages someone to find a better solution. Thanks for the tip, keul; it definitely helped me get to this solution.
We have different streams for different environments. It is a Grails project, so there is a properties file called application.properties which has a property called app.version. I want that to be updated automatically after every promote done on the stream. Each stream will have a different version number. The server_post_promote_trig trigger will be used to handle the post-promote operation, but I am not sure how to access the files in the stream from the script. I tried to give the path as /Folder1/file, as reflected in the XML trigger input file, but I cannot update the file because the trigger Perl script complains that it cannot find the file.
Any help is much appreciated.
If I understand your question correctly, you want to increment the version in a file under source control whenever a promotion occurs in the stream. If so, you need to create a workspace off said stream which will edit/keep/promote the new version of this file. I would create a separate script that gets called by the server_post_promote trigger whenever a promotion occurs in this stream. This script would be placed under source control, accessible in the workspace you created above.
In Accurev, files can only be modified via a workspace. As this is the case, it may be better to implement a pre-promote trigger that updates the version information in the file when the user performs the workspace-to-stream promote.
This would be similar to the existing Addheader script that can be found in the examples directory on the accurev server.
Also, within the script, you will probably want to build in logic that detects the promotion of the version file itself, to avoid updating the file again and re-triggering.
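A hedged sketch of such a post-promote helper (Python used for illustration; the keep/promote sub-commands are standard Accurev CLI, but the workspace path, file location, and version pattern are assumptions to adapt):

    # sketch: bump app.version from a trigger-called script, via a workspace
    import re
    import subprocess

    WS = "/workspaces/version-bump-ws"   # workspace rooted off the stream
    PROPS = WS + "/Folder1/application.properties"
    # a fixed comment lets the trigger recognize (and skip) its own promotes
    COMMENT = "auto: bump app.version"

    subprocess.check_call(["accurev", "update"], cwd=WS)

    def bump(m):
        major, minor, patch = m.group(1).split(".")
        return "app.version=%s.%s.%d" % (major, minor, int(patch) + 1)

    text = open(PROPS).read()
    open(PROPS, "w").write(re.sub(r"app\.version=(\d+\.\d+\.\d+)", bump, text))

    subprocess.check_call(["accurev", "keep", "-c", COMMENT, PROPS], cwd=WS)
    subprocess.check_call(["accurev", "promote", "-c", COMMENT, PROPS], cwd=WS)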
I am building a CSV import, and am looking to run through the CSV and display any potential errors to the user before actually doing the import.
$model->getErrors() only works after an attempted $model->save(). I am looking to get those potential errors before the save, so the importer can adjust his CSV file and make the necessary changes for a flawless import.
Any ideas?
You can run the validation rules without saving:

    if (!$model->validate()) {
        $errors = $model->getErrors(); // attribute name => array of messages
    }
In the first place, why don't you try this tool I wrote: http://www.yiiframework.com/wiki/401/simple-csv-export/