I have a SQL-Azure database created with Entity Framework 6.1, Code-First.
The "datetime" field in my 'EmazeEvents' table was created like this:
datetime = c.DateTime(nullable: false, defaultValueSql: "GETUTCDATE()")
and defined like this in the code:
[DatabaseGenerated(DatabaseGeneratedOption.Computed)]
[Index]
public DateTime datetime { get; set; }
I understand this to mean that if the field is omitted on insert, it defaults to the insertion date, which it indeed does.
However, I am having trouble inserting rows that set this field explicitly. Even though I set the value of the corresponding property, the default date is still written to the database.
Some code extractions:
EmazeEvents is defined like this:
public class EmazeEvents : DbContext
{
    public EmazeEvents()
        : base("EmazeEvent")
    { }

    public DbSet<EmazeEvent> events { get; set; }
}
What I do is:
context = new EmazeEvents();
EmazeEvent e = new EmazeEvent();
// e.datetime does get the correct date
e.datetime = DateTime.ParseExact("2014-05-31T00:00:06.8900000", "O", CultureInfo.InvariantCulture);
context.events.Add(e);
context.SaveChanges();
The record written to the database has the current date-time, ignoring the one in e.datetime.
I found out that the problem was with the definition of the 'datetime' field. Once I removed
[DatabaseGenerated(DatabaseGeneratedOption.Computed)]
it let me write values other than the default.
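For reference, the Computed option tells EF never to include the column in its INSERT/UPDATE statements and to read the value back from the database instead, which is why the assigned value was being ignored. A minimal sketch of the property after the change (keeping the [Index] attribute and the GETUTCDATE() column default from the migration):

// With Computed removed, EF sends the value assigned in code.
// Note: the GETUTCDATE() default then only applies to inserts that bypass EF,
// since EF always sends a value for a non-nullable DateTime.
[Index]
public DateTime datetime { get; set; }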
Related
We have a field in our SQL Server database table which is autogenerated by SQL Server, the field is called CreatedTime.
We have mapped the whole database table to our datamodel in Entity Framework, thus also the field CreatedTime.
When we insert a new row in the database, via Entity Framework, we thus do not provide any value for CreatedTime.
This causes the insert to fail with the error:
SqlDateTime overflow. Must be between 1/1/1753 12:00:00 AM and 12/31/9999 11:59:59 PM
So the question is: is there a way to exclude a particular field of the entity data model from the Entity Framework insert statement, so that we do not get the above error?
We would like to keep the field CreatedTime in the Entity model, because we might want to access it later.
If using Fluent API:
using System.ComponentModel.DataAnnotations.Schema;
this.Property(t => t.CreatedTime)
    .HasDatabaseGeneratedOption(DatabaseGeneratedOption.Computed);
If using Annotations
[DatabaseGenerated(DatabaseGeneratedOption.Computed)]
public System.DateTime CreatedTime { get; set; }
I found a simple solution to the problem on this thread:
http://social.msdn.microsoft.com/Forums/en-US/adodotnetentityframework/thread/7db14342-b259-4973-ac09-93e183ae48bb
There Fernando Soto writes:
"If you go to the EDM designer click on the field in the table that is auto-generated by the database, right click on it and select Properties and
look at the properties windows click on StoreGeneratedPattern and set its value to Computed, I believe it will give you what you are looking for."
The above solution was super quick and easy and it seems to work.
Also thank you for your contributions guys, but the above solution seems to do the job.
Try to use the NotMapped attribute on this property:
http://msdn.microsoft.com/en-us/library/system.componentmodel.dataannotations.schema.notmappedattribute.aspx
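For illustration, applied to the CreatedTime property from the question it might look like this (note that NotMapped removes the property from the model entirely, so EF will neither write nor read the column):

using System.ComponentModel.DataAnnotations.Schema;

[NotMapped]
public DateTime CreatedTime { get; set; }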
There are two things you can do:
If you have access to the database, check if the field has a default value. If it doesn't you can set it to GETDATE(), and the field should be set correctly, and you don't have to add/update it through Entity Framework.
If you don't have access to the database, or don't want to make any changes there, you can alter the behavior of the Entity Data Model to set the date automatically. Simply extend your context class (the example below uses the DbContext API):
public partial class MyEntities
{
    public override int SaveChanges()
    {
        var entityChangeSet = ChangeTracker.Entries<SomeEntity>();
        if (entityChangeSet != null)
        {
            foreach (DbEntityEntry<SomeEntity> entry in entityChangeSet)
            {
                switch (entry.State)
                {
                    case EntityState.Modified:
                        entry.Entity.LastModifiedDate = DateTime.UtcNow;
                        break;
                    case EntityState.Added:
                        entry.Entity.CreatedDate = DateTime.UtcNow;
                        break;
                }
            }
        }
        return base.SaveChanges();
    }
}
This way you don't have to set those fields yourself when you add or update an item; the context does it for you. If you have multiple entities that need this behavior, you can create an interface and have the entity classes implement it:
public interface IHaveCreatedDate
{
    DateTime CreatedDate { get; set; }
}

public partial class MyEntity : IHaveCreatedDate
{
    // MyEntity already implements this!
}
Then all you need to do is change the call to the ChangeTracker:
var entityChangeSet = ChangeTracker.Entries<IHaveCreatedDate>();
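Putting that together, the Added case of the loop might then look roughly like this (note that the interface above only exposes CreatedDate; a similar interface for LastModifiedDate, with an illustrative name like IHaveModifiedDate, would be needed to keep the Modified case):

// Stamp CreatedDate on every newly added entity that implements the interface
var entityChangeSet = ChangeTracker.Entries<IHaveCreatedDate>();
foreach (DbEntityEntry<IHaveCreatedDate> entry in entityChangeSet)
{
    if (entry.State == EntityState.Added)
    {
        entry.Entity.CreatedDate = DateTime.UtcNow;
    }
}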
Is CreatedTime nullable?
One possible workaround - if CreatedTime is NOT nullable:
// SQL Server's minimum datetime value is 1753-01-01 00:00:00
DateTime sqlServerMinDateTime = new DateTime(1753, 1, 1);
if (myEntity.CreatedTime < sqlServerMinDateTime)
{
    myEntity.CreatedTime = sqlServerMinDateTime;
}
// do insert here
// ....
One possible workaround - if CreatedTime is nullable:
DateTime sqlServerMinDateTime = new DateTime(1753, 1, 1);
if (myEntity.CreatedTime < sqlServerMinDateTime)
{
    myEntity.CreatedTime = null;
}
// do insert here
// ....
Based on the EF Core docs (https://learn.microsoft.com/en-us/ef/core/modeling/value-conversions), I'm trying to use EF Core's new Value Conversion on an enumeration. I want to save the enumeration as a string in the SQL Database table.
Here's the entity and enumeration.
public enum InputSetType
{
    TypeA, TypeB
}

public class MonthlyInputSet
{
    public int Id { get; set; }
    public InputSetType Type { get; set; }
}
Here is where I configure the MonthlyInputSet Entity:
public class MonthlyInputSetConfiguration : IEntityTypeConfiguration<MonthlyInputSet>
{
    public void Configure(EntityTypeBuilder<MonthlyInputSet> builder)
    {
        builder.Property(mis => mis.Type).HasConversion(
            v => v.ToString(),
            v => (InputSetType)Enum.Parse(typeof(InputSetType), v));
    }
}
So, I try to run a basic query to get this data and it fails. The query is:
var saved = await _context.MonthlyInputSets
    .Include(mis => mis.InsertedBy)
    .Include(mis => mis.UpdatedBy)
    .Include(mis => mis.MonthlyInputs)
        .ThenInclude(mi => mi.EmissionsUnit)
    .FirstOrDefaultAsync(mis => mis.Id == id);
But, an error is thrown on the first line of this query that says, "ArgumentException: Must specify valid information for parsing in the string." So my guess is that I have not properly configured the conversion of the string in the table to the enum in C#.
Full raw stack trace:
I verified that the correct string value is being returned from the database. It is not null and it is not a blank string. The string value returned matches a member of the enum perfectly.
public void Configure(EntityTypeBuilder<MonthlyInputSet> builder)
{
    builder.Property(mis => mis.Type).HasConversion(
        convertToProviderExpression: v => v.ToString(),
        convertFromProviderExpression: v => Troubleshooting(v)
    );
}

private InputSetType Troubleshooting(string v)
{
    return (InputSetType)Enum.Parse(typeof(InputSetType), v);
}
These two images show that the text in every row of the database Type column is identical to the second member of the enum mapped to this field.
The only way I can reproduce this error is for v to be an empty string, ie "". Not a null.
This:
Enum.Parse(typeof(InputSetType), "xx")
throws "Requested value 'xx' was not found."
While this:
Enum.Parse(typeof(InputSetType), "")
throws:
"Must specify valid information for parsing in the string. (Parameter 'value')"
It seems some rows contain an empty string instead of null.
You'll have to decide what to do with those values - are they bad data? Or should the application handle them and replace them with some default value?
If you decide to use a default value, you could use string.IsNullOrEmpty or string.IsNullOrWhiteSpace to handle them, e.g.:
v => string.IsNullOrWhiteSpace(v) ? InputSetType.Unknown : (InputSetType)Enum.Parse(typeof(InputSetType), v)
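Put together, a minimal sketch of the conversion with that guard in place might look like the following. This assumes you add an Unknown member to InputSetType to serve as the fallback; it is not part of the original enum.

public void Configure(EntityTypeBuilder<MonthlyInputSet> builder)
{
    builder.Property(mis => mis.Type).HasConversion(
        // Writing: store the enum as its string name
        v => v.ToString(),
        // Reading: fall back to Unknown (assumed enum member) when the column holds an empty string
        v => string.IsNullOrWhiteSpace(v)
            ? InputSetType.Unknown
            : (InputSetType)Enum.Parse(typeof(InputSetType), v));
}

Alternatively, clean up the offending rows so that every stored value matches an enum member.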
Probably this exception is caused by invalid data in the database: data that is not present in the enum, is null, is an empty string, and so on. This error is common with Enum.Parse(), so I think it is related to that method.
I am developing a site that uses NHibernate. It works fine with static mapping, but the problem is that I need to apply this application to an existing database. So is there any way to map the classes at run time, i.e. with the user providing the table and column names for the mapping? Thanks
From your question I take it that the POCO classes exist, but you don't know the table or column names at build time.
So, if you already had this class:
public class MyGenericClass
{
    public virtual long Id { get; set; }
    public virtual string Title { get; set; }
}
You could bind it to a table and columns at runtime:
string tableName;       // Set somewhere else by user input
string idColumnName;    // Set somewhere else by user input
string titleColumnName; // Set somewhere else by user input

var configuration = new NHibernate.Cfg.Configuration();
configuration.Configure();

var mapper = new NHibernate.Mapping.ByCode.ModelMapper();
mapper.Class<MyGenericClass>(
    classMapper =>
    {
        classMapper.Table(tableName);
        classMapper.Id(
            myGenericClass => myGenericClass.Id,
            idMapper =>
            {
                idMapper.Column(idColumnName);
                idMapper.Generator(Generators.Identity);
            }
        );
        classMapper.Property(c => c.Title,
            propertyMapper =>
            {
                propertyMapper.Column(titleColumnName);
            }
        );
    }
);

// Add the compiled mapping to the configuration before building the session factory
configuration.AddMapping(mapper.CompileMappingForAllExplicitlyAddedEntities());

ISessionFactory sessionFactory = configuration.BuildSessionFactory();
ISession session = sessionFactory.OpenSession();

////////////////////////////////////////////////////////////////////
// Now we can run a query over this newly specified table
//
List<MyGenericClass> items = session.QueryOver<MyGenericClass>().List();
I don't think that is possible with NHibernate directly, but you could use a workaround.
You could use a view instead of a table for the NHibernate mapping.
Then, at runtime, you can create or update that view to match the mapping the user specifies.
For example, you define a mapping in NHibernate to a view named ViewMapped with two columns Name and Mail.
On the other hand, the user has a table with three columns: Name, SecondName, EMail.
You can create the view at runtime with a statement along these lines:
CREATE VIEW ViewMapped AS SELECT Name + ' ' + SecondName AS Name, EMail AS Mail FROM tableName
I hope that helps you, or at least leads you to a solution.
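As a rough sketch, the view could be (re)created at runtime through NHibernate itself. This assumes the CREATE VIEW statement above and an open ISession; the helper name is illustrative:

// Hypothetical helper: (re)creates the mapped view from a user-supplied table name
void RecreateMappedView(NHibernate.ISession session, string tableName)
{
    string dropSql = "IF OBJECT_ID('ViewMapped', 'V') IS NOT NULL DROP VIEW ViewMapped";
    string createSql = "CREATE VIEW ViewMapped AS " +
                       "SELECT Name + ' ' + SecondName AS Name, EMail AS Mail " +
                       "FROM " + tableName;

    session.CreateSQLQuery(dropSql).ExecuteUpdate();
    session.CreateSQLQuery(createSql).ExecuteUpdate();
}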
Title says it all. The model does not require the StartDate field, but on POST I'm told it's required. It's one of several search fields, each one optional. Because of that, I'm not checking ModelState.IsValid, so the search works anyway, but the message still shows up onscreen. If I set @Html.ValidationSummary(true) in the view, that hides the message, but the field still turns red.
Also, I do have a check to make sure EndDate is later than StartDate, so I need the messages for errors/required fields to show up, just not when there ISN'T an error.
Here's the code:
MODEL (Partial)
[Display(Name = "Start Date")]
[DataType(DataType.Date)]
public DateTime StartDate { get; set; }
[Display(Name = "End Date")]
[GreaterThanOrEqualTo("StartDate", ErrorMessage = "End Date must be later than Start Date")]
[DataType(DataType.Date)]
public DateTime EndDate { get; set; }
VIEW (partial)
@using (Html.BeginForm())
{
    @Html.ValidationSummary(false)
    <table>
        <tr>
            <td>
                @Html.DisplayNameFor(m => m.StartDate)
                @Html.TextBox("StartDate", "", new { @class = "datefield" })
etc.
SHARED/DISPLAY TEMPLATES
@model Nullable<DateTime>
@(Model != null ? string.Format("{0:M/d/yyyy}", Model) : string.Empty)
SHARED/EDITOR TEMPLATES
@model Nullable<DateTime>
@{
    DateTime dt = DateTime.Now;
    if (Model != null)
    {
        dt = (System.DateTime)Model;
    }
    @Html.TextBox("", String.Format("{0:M/d/yyyy}", dt.ToShortDateString()), new { @class = "datefield", type = "date" })
}
Some of these editors are to make a pop-up calendar work, btw.
I've tried turning on/off various things and one way or another, it still says the date fields are required. Any ideas? Thanks.
An easy way to remove the validation is to make the value type nullable. I have already tested this and it works fine. Here is an example:
public int? Id { get; set; }
As mentioned in the comments, value types like DateTime, int, decimal, etc. are treated as required if you don't make them nullable.
If the GreaterThanOrEqualTo attribute doesn't come from a library (such as MVC Foolproof Validation), you should let it return true if both StartDate and EndDate are null. Otherwise you would have to write your own custom validation attribute, but that's not hard to do.
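Applied to the model from the question, a minimal sketch of the nullable version might look like this (same attributes, only the property types change); the display/editor templates in the question already target Nullable<DateTime>, so they should keep working:

[Display(Name = "Start Date")]
[DataType(DataType.Date)]
public DateTime? StartDate { get; set; }

[Display(Name = "End Date")]
[GreaterThanOrEqualTo("StartDate", ErrorMessage = "End Date must be later than Start Date")]
[DataType(DataType.Date)]
public DateTime? EndDate { get; set; }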
I am looking at my Azure logs in the WADLogsTable and would like to filter the results, but I'm clueless as to how to do so. There is a textbox that says:
"Enter a WCF Data Services filter to limit the entities returned"
What is the syntax of a "WCF Data Services filter"? The following gives me an InvalidValueType error saying "The value specified is invalid.":
Timestamp gt '2011-04-20T00:00'
Am I even close? Is there a handy syntax reference somewhere?
This query should be in the format:
Timestamp gt datetime'2011-04-20T00:00:00'
Remembering to put that datetime in there is the important bit.
This trips me up every time, so I use the OData overview for reference.
Adding to knightffhor's response, you can certainly write a query which filters by Timestamp, but this is not the recommended approach, as querying on the "Timestamp" attribute will lead to a full table scan. Instead, query this table on the PartitionKey attribute. I'm copying my response from another thread here (Can I capture Performance Counters for an Azure Web/Worker Role remotely...?):
"One of the key thing here is to understand how to effectively query this table (and other diagnostics table). One of the things we would want from the diagnostics table is to fetch the data for a certain period of time. Our natural instinct would be to query this table on Timestamp attribute. However that's a BAD DESIGN choice because you know in an Azure table the data is indexed on PartitionKey and RowKey. Querying on any other attribute will result in full table scan which will create a problem when your table contains a lot of data.The good thing about these logs table is that PartitionKey value in a way represents the date/time when the data point was collected. Basically PartitionKey is created by using higher order bits of DateTime.Ticks (in UTC). So if you were to fetch the data for a certain date/time range, first you would need to calculate the Ticks for your range (in UTC) and then prepend a "0" in front of it and use those values in your query.
If you're querying using REST API, you would use syntax like:
PartitionKey ge '0<from date/time ticks in UTC>' and PartitionKey le '0<to date/time in UTC>'."
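As a quick illustration of building those bounds in code (variable names and the example range are illustrative):

// Build the '0' + Ticks PartitionKey bounds for a UTC time window
DateTime fromUtc = new DateTime(2011, 4, 20, 0, 0, 0, DateTimeKind.Utc);
DateTime toUtc = fromUtc.AddHours(6);

string fromKey = "0" + fromUtc.Ticks;
string toKey = "0" + toUtc.Ticks;

string filter = string.Format(
    "PartitionKey ge '{0}' and PartitionKey le '{1}'", fromKey, toKey);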
I've written a blog post about how to write WCF queries against table storage which you may find useful: http://blog.cerebrata.com/specifying-filter-criteria-when-querying-azure-table-storage-using-rest-api/
Also if you're looking for a 3rd party tool for viewing and managing diagnostics data, may I suggest that you take a look at our product Azure Diagnostics Manager: /Products/AzureDiagnosticsManager. This tool is built specifically for surfacing and managing Windows Azure Diagnostics data.
The answer I accepted helped me immensely in directly querying the table through Visual Studio. Eventually, however, I needed a more robust solution. I used the tips I gained here to develop some classes in C# that let me use LINQ to query the tables. In case it is useful to others viewing this question, here is roughly how I now query my Azure logs.
Create a class that inherits from Microsoft.WindowsAzure.StorageClient.TableServiceEntity to represent all the data in the "WADLogsTable" table:
public class AzureDiagnosticEntry : TableServiceEntity
{
    public long EventTickCount { get; set; }
    public string DeploymentId { get; set; }
    public string Role { get; set; }
    public string RoleInstance { get; set; }
    public int EventId { get; set; }
    public int Level { get; set; }
    public int Pid { get; set; }
    public int Tid { get; set; }
    public string Message { get; set; }

    public DateTime EventDateTime
    {
        get
        {
            return new DateTime(EventTickCount, DateTimeKind.Utc);
        }
    }
}
Create a class that inherits from Microsoft.WindowsAzure.StorageClient.TableServiceContext and references the newly defined data object class:
public class AzureDiagnosticContext : TableServiceContext
{
    public AzureDiagnosticContext(string baseAddress, StorageCredentials credentials)
        : base(baseAddress, credentials)
    {
        this.ResolveType = s => typeof(AzureDiagnosticEntry);
    }

    public AzureDiagnosticContext(CloudStorageAccount storage)
        : this(storage.TableEndpoint.ToString(), storage.Credentials) { }

    // Helper property to get an IQueryable. Hard-code "WADLogsTable" for this class
    public IQueryable<AzureDiagnosticEntry> Logs
    {
        get
        {
            return CreateQuery<AzureDiagnosticEntry>("WADLogsTable");
        }
    }
}
I have a helper method that creates a CloudStorageAccount from configuration settings:
public CloudStorageAccount GetStorageAccount()
{
    CloudStorageAccount.SetConfigurationSettingPublisher(
        (name, setter) => setter(RoleEnvironment.GetConfigurationSettingValue(name)));
    string configKey = "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString";
    return CloudStorageAccount.FromConfigurationSetting(configKey);
}
I create an AzureDiagnosticContext from the CloudStorageAccount and use that to query my logs:
public IEnumerable<AzureDiagnosticEntry> GetAzureLog(DateTime start, DateTime end)
{
    CloudStorageAccount storage = GetStorageAccount();
    AzureDiagnosticContext context = new AzureDiagnosticContext(storage);
    string startTicks = "0" + start.Ticks;
    string endTicks = "0" + end.Ticks;
    IQueryable<AzureDiagnosticEntry> query = context.Logs.Where(
        e => e.PartitionKey.CompareTo(startTicks) > 0 &&
             e.PartitionKey.CompareTo(endTicks) < 0);
    CloudTableQuery<AzureDiagnosticEntry> tableQuery = query.AsTableServiceQuery();
    IEnumerable<AzureDiagnosticEntry> results = tableQuery.Execute();
    return results;
}
This method takes advantage of the performance tip in Gaurav's answer to filter on PartitionKey rather than Timestamp.
If you wanted to filter the results by more than just date, you could filter the returned IEnumerable. But, you'd probably get better performance by filtering the IQueryable. You could add a filter parameter to your method and call it within the IQueryable.Where(). Eg,
public IEnumerable<AzureDiagnosticEntry> GetAzureLog(
    DateTime start, DateTime end, Func<AzureDiagnosticEntry, bool> filter)
{
    ...
    IQueryable<AzureDiagnosticEntry> query = context.Logs.Where(
        e => e.PartitionKey.CompareTo(startTicks) > 0 &&
             e.PartitionKey.CompareTo(endTicks) < 0 &&
             filter(e));
    ...
}
In the end, I actually further abstracted most of these classes into base classes in order to reuse the functionality for querying other tables, such as the one storing the Windows Event Log.
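For what it's worth, a rough sketch of the kind of reusable base class described above might look like this (the generic class, property, and entity names are illustrative, not the exact code I ended up with):

// Hypothetical generic variant of the context, parameterized on entity type and table name
public class DiagnosticTableContext<TEntry> : TableServiceContext
    where TEntry : TableServiceEntity
{
    private readonly string tableName;

    public DiagnosticTableContext(CloudStorageAccount storage, string tableName)
        : base(storage.TableEndpoint.ToString(), storage.Credentials)
    {
        this.tableName = tableName;
        this.ResolveType = s => typeof(TEntry);
    }

    public IQueryable<TEntry> Entries
    {
        get { return CreateQuery<TEntry>(tableName); }
    }
}

// Usage, e.g. for the Windows Event Log diagnostics table (entity class assumed):
// var context = new DiagnosticTableContext<AzureEventLogEntry>(storage, "WADWindowsEventLogsTable");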