How to seed initial data after adding new entity? - asp.net-core

I've noticed a Seed folder in MyProject.EntityFrameworkCore project with the code to seed initial data to the database.
If I add code to populate the database with my new entity, where and how will the code be called?
Do the .NET Core and the full .NET Framework versions work the same way?

It is run in two places.
On application startup, it is called in the PostInitialize method of YourEntityFrameworkModule:
public override void PostInitialize()
{
    if (!SkipDbSeed)
    {
        SeedHelper.SeedHostDb(IocManager);
    }
}
If you build the Migrator project and run the .exe, it is called in the Run method of MultiTenantExecuter:
public void Run(bool skipConnVerification)
{
    // ...
    Log.Write("HOST database migration started...");
    try
    {
        _migrator.CreateOrMigrateForHost(SeedHelper.SeedHostDb);
    }
    // ...
}
If you add new code to populate your custom entity, remember to check whether the data already exists before adding it, like this:
var defaultEdition = _context.Editions.IgnoreQueryFilters()
    .FirstOrDefault(e => e.Name == EditionManager.DefaultEditionName);
if (defaultEdition == null)
{
    // ...
    /* Add desired features to the standard edition, if wanted... */
}
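For a new custom entity, a minimal sketch of the same check-then-insert pattern might look like this (MyEntity, the MyEntities DbSet, MyProjectDbContext and MyEntityCreator are assumed names; wire the Create call into the same seed chain that SeedHelper.SeedHostDb drives):
// Hypothetical seeder, modeled on the existing ABP seed classes.
public class MyEntityCreator
{
    private readonly MyProjectDbContext _context;

    public MyEntityCreator(MyProjectDbContext context)
    {
        _context = context;
    }

    public void Create()
    {
        // Check before adding, so the seed is safe to run repeatedly.
        var defaultItem = _context.MyEntities
            .IgnoreQueryFilters()
            .FirstOrDefault(e => e.Name == "Default");

        if (defaultItem == null)
        {
            _context.MyEntities.Add(new MyEntity { Name = "Default" });
            _context.SaveChanges();
        }
    }
}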
Yes, the .NET Core and full .NET Framework versions work the same way.

Related

ABP: Rebuilding Localization Sources from Custom Provider

I am using ABP v4.9.0 (.NET Core 2.2) with an Angular client.
I built some custom localization providers. These providers get translation dictionaries from an external API.
I add localization sources on startup with these providers.
var customProvider = new CustomLocalizationProvider(...);
var localizationSource = new DictionaryBasedLocalizationSource("SOURCENAME", customProvider);
config.Localization.Sources.Add(localizationSource);
On startup, each provider's InitializeDictionaries() is called and the localization dictionaries are built.
So far, so good, working as intended.
Now I'd like to manually reload these translations on demand, but I can't get this working.
Here is what I tried.
Here I trigger the re-synchronization of the language resources:
foreach (var localizationSource in _localizationConfiguration.Sources)
{
    try
    {
        localizationSource.Initialize(_localizationConfiguration, _iocResolver);
    }
    catch (Exception e)
    {
        Logger.Warn($"Could not get Localization Data for source '{localizationSource.Name}'", e);
    }
}
In the custom provider, I first clear the Dictionaries
public class CustomLocalizationProvider : LocalizationDictionaryProviderBase
{
    protected int IterationNo = 0;

    protected override void InitializeDictionaries()
    {
        Dictionaries.Clear();
        IterationNo += 1;

        var deDict = new LocalizationDictionary(new CultureInfo("de-DE"));
        deDict["HelloWorld"] = $"Hallo Welt Nummer {IterationNo}";
        Dictionaries.Add("de-DE", deDict);

        var enDict = new LocalizationDictionary(new CultureInfo("en"));
        enDict["HelloWorld"] = $"Hello World number {IterationNo}";
        Dictionaries.Add("en", enDict);
    }
}
The provider is executed again as expected.
But when I eventually use the localization client-side (Angular), I still get the original translations.
What am I missing?
Thanks for the help.
In the meantime I had to go with another approach.
I am now using an XmlEmbeddedFileLocalizationDictionaryProvider wrapped by a MultiTenantLocalizationDictionaryProvider.
This way, I am using database localizations with the XML sources as a fallback.
Then I manually load the resources from my API in an app service. These localizations are then updated in the database by using LanguageTextManager.UpdateStringAsync().
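For reference, a rough sketch of that app service might look like the following. The external API call and its DTO are hypothetical, and the UpdateStringAsync signature is assumed from ABP Zero's language management and may differ between versions, so check it against your ABP release:
public class LocalizationSyncAppService : ApplicationService
{
    private readonly IApplicationLanguageTextManager _languageTextManager;

    public LocalizationSyncAppService(IApplicationLanguageTextManager languageTextManager)
    {
        _languageTextManager = languageTextManager;
    }

    public async Task SyncAsync()
    {
        // Hypothetical call to the external translation API.
        var translations = await FetchTranslationsAsync();

        foreach (var t in translations)
        {
            // Assumed signature: UpdateStringAsync(tenantId, sourceName, culture, key, value).
            await _languageTextManager.UpdateStringAsync(
                AbpSession.TenantId,
                "SOURCENAME",
                new CultureInfo(t.LanguageCode),
                t.Key,
                t.Value);
        }
    }

    // Placeholder for the external API described in the question.
    private Task<List<TranslationDto>> FetchTranslationsAsync()
    {
        return Task.FromResult(new List<TranslationDto>());
    }

    public class TranslationDto
    {
        public string LanguageCode { get; set; }
        public string Key { get; set; }
        public string Value { get; set; }
    }
}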

Best practices for prepopulated tables via OrmLite in Servicestack

I'm generating tables via OrmLite and I was wondering about best practices for prepopulating tables. Example tables - countries, states, cities, etc.
I can think of a few ways to pre-populate tables:
Seed DB
API (when possible)
Static file
In code
Separate project
However, in some cases the data could get large, as with cities around the world, so keeping it in code is not viable.
I could also consider generating tables that need to be pre-populated directly via another project where I can fetch data from a source and get it into the DB.
However, I was wondering about the scenario when you do generate it via an ORM (especially in production). How would you approach the problem?
This must be a common problem across all ORMs.
If it's only code tables like countries, states, etc., they're small enough to still keep as part of the project. Normally I'd create a separate static class called SeedData with all the data in POCOs:
1. Maintaining Code Tables in Host Project
public static class SeedData
{
    public static List<Country> Countries
    {
        get { return new List<Country> { new Country(...), ... }; }
    }
}
Then in your AppHost, add a flag controlling whether to re-create them on startup, e.g.:
public void Configure(Container container)
{
    var appSettings = new AppSettings(); //Read from Web.config <appSettings/>
    if (appSettings.Get("RecreateTables", false))
    {
        using (var db = container.Resolve<IDbConnectionFactory>().Open())
        {
            db.DropAndCreateTable<Country>();
            db.InsertAll(SeedData.Countries);
            ...
        }
    }
}
Change AppSetting to recreate tables
This will then let you re-create the tables and re-populate the data when you change the RecreateTables appSetting to True, e.g:
<appSettings>
    <add key="RecreateTables" value="True" />
</appSettings>
Since the default behavior of ASP.NET is to automatically restart the AppDomain when Web.config changes, just saving the change is enough to restart your ASP.NET application the next time any page gets requested.
2. Add to a Test Project in an ad hoc Explicit Test
If the data gets too big to fit in the working project, I would first move it to a separate test project inside an [Explicit] test fixture (so it's never run automatically) that you can easily run manually, e.g.:
[Explicit]
[TestFixture]
public class AdminTasks
{
    [Test]
    public void Recreate_and_populate_tables()
    {
        var dbFactory = new OrmLiteConnectionFactory(...);
        using (var db = dbFactory.Open())
        {
            db.DropAndCreateTable<Country>();
            db.InsertAll(SeedData.Countries);
            ...
        }
    }
}
3. Save data in external static text Files
Finally, if the data is too big even to fit in C# classes, I would save it out to a static file that you can easily re-hydrate into POCOs and populate with OrmLite, e.g.:
[Test]
public void Recreate_and_populate_tables()
{
    var dbFactory = new OrmLiteConnectionFactory(...);
    using (var db = dbFactory.Open())
    {
        db.DropAndCreateTable<Country>();
        var countries = File.ReadAllText("~/countries.txt".MapAbsolutePath())
            .FromJson<List<Country>>();
        db.InsertAll(countries);
        ...
    }
}
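A companion one-off test can produce that static file in the first place by serializing the SeedData with ServiceStack's JSON extensions (file name and location assumed to match the snippet above):
[Test]
public void Export_seed_data_to_static_file()
{
    // ServiceStack.Text's ToJson() serializes the POCOs; the same file is
    // later read back and re-hydrated with FromJson<List<Country>>().
    var json = SeedData.Countries.ToJson();
    File.WriteAllText("~/countries.txt".MapAbsolutePath(), json);
}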

Adding 'GO' statements to Entity Framework migrations

So I have an application with a ton of migrations made by Entity framework.
We want to get a script for all the migrations at once and using the -Script tag does work fine.
However... it does not add GO statements to the SQL, giving us problems like "ALTER VIEW must be the first statement in a batch"...
I have been searching around, and manually adding Sql("GO"); helps with this problem, but only for the entire script. When I use the Package Manager Console again it returns an exception.
System.Data.SqlClient.SqlException (0x80131904): Could not find stored procedure 'GO'.
Is there a way to add these GO tags only when using the -Script tag?
If not, what is a good approach for this?
Note: we have also tried having multiple files but since we have so many migrations, this is near impossible to maintain every time.
If you are trying to alter your view using Sql("Alter View dbo.Foos As etc"), then you can avoid the "must be the first statement in a batch" error without adding GO statements by putting the SQL inside an EXEC command:
Sql("EXEC('Alter View dbo.Foos As etc')")
In order to change the SQL generated by Entity Framework migrations, you can create a new SqlServerMigrationSqlGenerator.
We have done this to add a GO statement before and after the migration history:
public class MigrationScriptBuilder : SqlServerMigrationSqlGenerator
{
    protected override void Generate(System.Data.Entity.Migrations.Model.InsertHistoryOperation insertHistoryOperation)
    {
        Statement("GO");
        base.Generate(insertHistoryOperation);
        Statement("GO");
    }
}
Then, in the Configuration constructor (in the Migrations folder of the project where your DbContext is), register it so that migrations use this new SQL generator:
[...]
internal sealed class Configuration : DbMigrationsConfiguration<PMA.Dal.PmaContext>
{
    public Configuration()
    {
        SetSqlGenerator("System.Data.SqlClient", new MigrationScriptBuilder());
        AutomaticMigrationsEnabled = false;
    }
[...]
So now when you generate a script using the -Script tag, you can see that the insert into [__MigrationHistory] is surrounded by GO statements.
Alternatively, in your implementation of SqlServerMigrationSqlGenerator you can override any part of the script generation; the InsertHistoryOperation was suitable for us.
Turns out the concept exists deep in the SqlServerMigrationSqlGenerator as an optional argument for Statement(sql, batchTerminator). Here is something based on Skyp's idea. It works both with and without -Script mode. The GOs are emitted for different operations than in Skyp's answer only because our needs are a little different. You then need to register this class in the Configuration as per Skyp's instructions.
public class MigrationScriptBuilder : SqlServerMigrationSqlGenerator
{
    private string Marker = Guid.NewGuid().ToString(); //To cheat on the check null or empty of the base generator

    protected override void Generate(AlterProcedureOperation alterProcedureOperation)
    {
        SqlGo();
        base.Generate(alterProcedureOperation);
        SqlGo();
    }

    protected override void Generate(CreateProcedureOperation createProcedureOperation)
    {
        SqlGo();
        base.Generate(createProcedureOperation);
        SqlGo();
    }

    protected override void Generate(SqlOperation sqlOperation)
    {
        SqlGo();
        base.Generate(sqlOperation);
    }

    private void SqlGo()
    {
        Statement(Marker, batchTerminator: "GO");
    }

    public override IEnumerable<MigrationStatement> Generate(IEnumerable<MigrationOperation> migrationOperations, string providerManifestToken)
    {
        var result = new List<MigrationStatement>();
        var statements = base.Generate(migrationOperations, providerManifestToken);
        bool pendingBatchTerminator = false;

        foreach (var item in statements)
        {
            if (item.Sql == Marker && item.BatchTerminator == "GO")
            {
                pendingBatchTerminator = true;
            }
            else
            {
                if (pendingBatchTerminator)
                {
                    item.BatchTerminator = "GO";
                    pendingBatchTerminator = false;
                }
                result.Add(item);
            }
        }
        return result;
    }
}
The easiest way is to add /**/ before the GO statement.
Just replace the current statement with a .Replace("GO", "");

How to easily convert .sdf developed in Vs 2010 to SQL Server database

I didn't have SQL Server installed on my machine, so I decided to start working with a SQL Server Compact Edition database (.sdf) in VS2010. I have since installed SQL Server, and now I'd like to convert from the .sdf to a "real" SQL Server database.
How can I do it? Please advise.
What you should do is change your connection string so that it points to your new SQL Server instance instead of the SQL Server Compact database.
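For example, the connection string entry in Web.config might change to something like this (server and database names are placeholders):
<connectionStrings>
    <add name="YourDbContext"
         connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=YourDatabase;Integrated Security=True"
         providerName="System.Data.SqlClient" />
</connectionStrings>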
After that, in your context file, add a static constructor that makes the following call:
static YourDbContext()
{
    Database.SetInitializer(new CreateDatabaseIfNotExists<YourDbContext>());
}
This should create your database and tables based on your models. If you need to insert any data, you should run Enable-Migrations and override the Seed method in the Configuration file that gets created.
One important thing: DO NOT ASSUME that you have the rights or privileges to CREATE or DROP a database or to execute table modifications.
I will assume that you used the EF Code First approach. Your context file could look something like this:
public class YourDbContext : DbContext
{
    static YourDbContext()
    {
        // Database.SetInitializer<DbContext>(null); // Change this line to the next one
        Database.SetInitializer(new CreateDatabaseIfNotExists<YourDbContext>());
    }
    // The rest of the implementation
}
Inside of Visual Studio in Package Manager Console execute:
Enable-Migrations -ProjectName YourProjectName
(If you have more than one DbContext implementation you will need to follow the instructions from the error message that Enable-Migrations throws back at you.)
Once this is done, you will notice a new Migrations folder with a single Configuration.cs file. Open it and you will see the Seed method:
protected override void Seed(YourDbContext context)
{
    // This method will be called after migrating to the latest version.

    // You can use the DbSet<T>.AddOrUpdate() helper extension method
    // to avoid creating duplicate seed data. E.g.
    //
    //   context.People.AddOrUpdate(
    //       p => p.FullName,
    //       new Person { FullName = "Andrew Peters" },
    //       new Person { FullName = "Brice Lambson" },
    //       new Person { FullName = "Rowan Miller" }
    //   );
    //
    // Here you can call your context.DbSetImplementation.Add(new Something {...});
}
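As a sketch, seeding a hypothetical Customer entity with the AddOrUpdate helper could look like this; running Update-Database in the Package Manager Console afterwards applies the migrations and the seed:
protected override void Seed(YourDbContext context)
{
    // AddOrUpdate matches on the given key (Name here) so that running
    // Seed repeatedly does not insert duplicates. Customer and the
    // Customers DbSet are assumed names from a hypothetical model.
    context.Customers.AddOrUpdate(
        c => c.Name,
        new Customer { Name = "Default Customer" },
        new Customer { Name = "Sample Customer" });

    context.SaveChanges();
}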
That's about it.

How do you set the CustomValidation in the metadata file if the metadata is in a different model project?

My Silverlight solution has 3 projects:
Silverlight part (client)
Web part (server)
Entity model (I maintain the edmx along with the metadata in a separate project)
The metadata file is a partial class with the relevant data annotation validations.
[MetadataTypeAttribute(typeof(User.UserMetadata))]
public partial class User
{
    [CustomValidation(typeof(UsernameValidator), "IsUsernameAvailable")]
    public string UserName { get; set; }
}
Now my question is where I need to keep this UsernameValidator class.
If my metadata class and edmx were on the server side (Web), then I know I would need to create a .shared.cs class in my web project and add the proper static method there.
My IsUsernameAvailable method will internally call a domain service method as part of async validation.
[Invoke]
public bool IsUsernameAvailable(string username)
{
    return !Membership.FindUsersByName(username).Cast<MembershipUser>().Any();
}
If my metadata class were in the same project as my domain service, I could call the domain service method from my UsernameValidator.shared.cs class.
But here my entity models and metadata are in a separate library.
Any ideas will be appreciated.
Jeff explained async validation wonderfully here:
http://jeffhandley.com/archive/2010/05/26/asyncvalidation-again.aspx
but that only works when your model, metadata and shared class are all on the server side.
There is a kind of hack to do this. It is not a clean way to do it, but this is how it would probably work.
Because the .shared.cs file goes through code generation, it doesn't complain about certain compile errors inside the #if blocks. So what you can do is create a Validator.shared.cs in any project and just make sure it gets generated to the Silverlight side.
Add the following code, and don't forget the namespaces.
#if SILVERLIGHT
using WebProject.Web.Services;
using System.ServiceModel.DomainServices.Client;
#endif
#if SILVERLIGHT
UserContext context = new UserContext();
InvokeOperation<bool> availability = context.DoesUserExist(username);
// code omitted; use whatever logic you want, maybe Jeff's post.
#endif
The server-side compiler will ignore this part of the code because it does not meet the condition of the #if statement. Meanwhile, on the Silverlight client side the shared validator is recompiled where it DOES meet the condition.
Like I said, this is NOT a clean way to do this, and you might have trouble with missing namespaces. You need to resolve them in the non-generated Validator.shared.cs to finally make it work in Silverlight. If you do this right you can have the validation in Silverlight with invoke operations, but not in your project with models and metadata like you would with Jeff's post.
Edit: I found a cleaner and better way
You can create a partial class on the Silverlight client side and do the following:
public partial class User
{
    partial void OnUserNameChanging(string value)
    {
        // must be new to check for this validation rule
        if (EntityState == EntityState.New)
        {
            var ctx = new UserContext();
            ctx.IsValidUserName(value).Completed += (s, args) =>
            {
                InvokeOperation invop = (InvokeOperation)s;
                bool isValid = (bool)invop.Value;
                if (!isValid)
                {
                    ValidationResult error = new ValidationResult(
                        "Username already exists",
                        new string[] { "UserName" });
                    ValidationErrors.Add(error);
                }
            };
        }
    }
}
OnUserNameChanging is a partial method generated by WCF RIA Services, so it can easily be implemented like this to add out-of-band validation. This is a much cleaner way to do it, but the validation still exists only on the Silverlight client side.
Hope this helps