I want to fetch data from MongoDB using a comma-separated string in C#. Below is my code.
string numbers = "32,12,56,78";
List<CustomerNumbers> calling_number = new List<CustomerNumbers>();
IMongoCollection<CustomerNumbers> Collec;
calling_number= Collec.Find(x => x.customer == ID && x.numbers.Contains(numbers)).ToList();
I am new to MongoDB and don't know the exact approach. With the above code I only get records for a single number. Please guide me to fix this.
TIA
Class structure
public class CustomerNumbers
{
public string numbers { get; set; }
public int customer { get; set; }
}
Ideally, you'd have modeled your numbers field in MongoDB as an array rather than a delimited string; that would allow you to add indexes and performance-tune your queries.
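For illustration, here's a minimal sketch of what the array-based model and query could look like with the .NET driver (the class name `CustomerNumbersArray` is my invention, not your schema):

```csharp
// Hypothetical model with numbers stored as an array in MongoDB.
public class CustomerNumbersArray
{
    public int customer { get; set; }
    public string[] numbers { get; set; }
}

// The driver can then match any of the wanted numbers directly,
// and a multikey index on "numbers" can serve the query:
// var wanted = new[] { "32", "12", "56", "78" };
// var filter = Builders<CustomerNumbersArray>.Filter.Eq(x => x.customer, ID)
//            & Builders<CustomerNumbersArray>.Filter.AnyIn(x => x.numbers, wanted);
// var result = await collection.Find(filter).ToListAsync();
```

With this schema, an index created on `numbers` is a multikey index that the `$in` match can use.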
However, we can make use of the aggregation pipeline for this. We'll need an extra class like the following that we'll use:
public class CustomerNumbersSplit
{
public string[] numbers { get; set; }
public int customer { get; set; }
}
We'll also need to split your comma-delimited string.
var numbers = "32,12,56,78";
// Keep the values as strings: $split yields an array of strings,
// so the final Match must compare string against string.
var numbersSplit = numbers.Split(",", StringSplitOptions.RemoveEmptyEntries);
Once we've done that, we can write a query like the following:
var result = await collection.Aggregate()
.Match(x => x.customer == ID)
.AppendStage<CustomerNumbersSplit>(@"{ $addFields: { numbers: { $split: [""$numbers"", "","" ] } } }")
.Match(x => x.numbers.Any(y => numbersSplit.Contains(y)))
.ToListAsync();
This makes use of $addFields with $split so that the database engine splits the delimited string and the filtering happens server-side.
If you're interested in the query that it generated:
[
{ "$match" : { "customer" : 12324 } },
{ "$addFields" : { "numbers" : { "$split" : ["$numbers", ","] } } },
{ "$match" : { "numbers" : { "$elemMatch" : { "$in" : ["32", "12", "56", "78"] } } } }
]
I'm trying to configure ASP.NET Core 5's ForwardedHeadersMiddleware from appsettings.json. I'm having trouble setting KnownProxies (IList<IPAddress> KnownProxies { get; }), and it keeps reverting to the default value. I assume it has to do with the options machinery not knowing how to convert the string to an IPAddress, or with KnownProxies only having a getter.
{
"ForwardedHeaders": {
"ForwardedHeaders": "All",
"KnownProxies": ["10.0.0.1"]
}
}
services.Configure<ForwardedHeadersOptions>(Configuration.GetSection("ForwardedHeaders"));
How can I achieve what I want, without doing the parsing manually?
Can I specify the mapping somewhere generic?
Why doesn't this throw an exception that some of my configuration could not be parsed / is invalid?
I may propose my recipe for this:
Define your own options class like the following:
public class ForwardedForKnownNetworks
{
public class Network
{
public string Prefix { get; set; }
public int PrefixLength { get; set; }
}
public List<Network> Networks { get; set; } = new List<Network>();
public List<string> Proxies { get; set; } = new List<string>();
}
To make your code shorter and more flexible, you may want a helper method that resolves hostnames and/or parses IP address strings:
public static class Extensions
{
    // Resolves a hostname, or parses an IP literal, into its addresses;
    // returns an empty array for null/blank input.
    public static IPAddress[] ResolveIP(this string? host)
    {
        return !string.IsNullOrWhiteSpace(host)
            ? Dns.GetHostAddresses(host)
            : Array.Empty<IPAddress>();
    }
}
When configuring your services, you can add something similar to the following:
var forwardedForKnownNetworks = builder.Configuration.GetSection(nameof(ForwardedForKnownNetworks)).Get<ForwardedForKnownNetworks>();
_ = builder.Services.Configure<ForwardedHeadersOptions>(options => {
options.ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto;
forwardedForKnownNetworks?.Networks?.ForEach((network) => options.KnownNetworks.Add(new IPNetwork(IPAddress.Parse(network.Prefix), network.PrefixLength)));
forwardedForKnownNetworks?.Proxies?.ForEach((proxy) => proxy.ResolveIP().ToList().ForEach((ip) => options.KnownProxies.Add(ip)));
});
And finally your appsettings.json may appear like the following:
{
"ForwardedForKnownNetworks": {
"Networks": [
{
"Prefix": "172.16.0.0",
"PrefixLength": 12
},
{
"Prefix": "192.168.0.0",
"PrefixLength": 16
},
{
"Prefix": "10.0.0.0",
"PrefixLength": 8
}
],
"Proxies": ["123.234.32.21", "my.proxy.local"] // IPs and/or hostnames; the ResolveIP() helper resolves any hostname to its IP(s)
}
}
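One caveat: configuring ForwardedHeadersOptions alone does not activate the middleware; it still has to be added to the pipeline. A minimal sketch, assuming the minimal-hosting model:

```csharp
var app = builder.Build();

// Reads the ForwardedHeadersOptions configured above; should run early,
// before anything that inspects the request scheme or remote IP.
app.UseForwardedHeaders();

app.Run();
```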
I need to add an import/export functionality to my ASP.NET Core application.
What I would like is to take entities in one database, export these entities into one file, and then import that file into a new database.
My problem is that I have some entities that hold same foreign key. Here is a simple model illustrating what I want to do:
public class Bill
{
public List<Product> Products { get; set; }
public int Id { get; set; }
...
}
public class Product
{
public int Id { get; set; }
public int ProductCategoryId { get; set; }
public ProductCategory ProductCategory { get; set; }
...
}
public class Category
{
public int Id { get; set; }
public string Name { get; set; }
}
I want to export a bill to be imported on an other environment of my application. So if I export the bill, I will get a Json like this:
{
"id": 1,
"products" : [
{
"id" : 1,
"productCategoryId": 1,
"productCategory": {
"id" : 1,
"name" : "Category #1"
}
},
{
"id" : 2,
"productCategoryId": 1,
"productCategory": {
"id" : 1,
"name" : "Category #1"
}
},
{
"id" : 3,
"productCategoryId": 1,
"productCategory": {
"id" : 2,
"name" : "Category #2"
}
}
]
}
If I deserialize this JSON into entities in my new environment (ignoring Id mapping, of course), I will get three new categories (the category will be duplicated for products 1 and 2) because the serializer will instantiate a separate category for each occurrence...
So when I push it into my database, it will add 3 rows instead of 2 into the Category table...
Thanks in advance for your answers.
Suppose you have a list of Category entities to import. You could first collect their ids, then query the database to find out which ones are already stored. For those that already exist, just skip them (or update them, as you like).
Since we have multiple entity types (categories, products, bills and potentially BillProducts), rather than writing a new importer for each TEntity, I prefer writing a generic ImportBatch method that can handle any entity type using generics and reflection:
public async Task ImportBatch<TEntity,TKey>(IList<TEntity> entites)
where TEntity : class
where TKey: IEquatable<TKey>
{
// NOTE: GetId uses reflection, which EF cannot translate to SQL;
// filter the existing rows in memory after loading the set.
var ids = entites.Select(e => GetId<TEntity,TKey>(e)).ToList();
var existingsInDatabase = this._context.Set<TEntity>()
    .AsEnumerable()
    .Where(e => ids.Any(i => i.Equals(GetId<TEntity,TKey>(e))))
    .ToList();
using (var transaction = this._context.Database.BeginTransaction())
{
try{
this._context.Database.ExecuteSqlCommand("SET IDENTITY_INSERT " + typeof(TEntity).Name + " ON;");
this._context.SaveChanges();
foreach (var entity in entites)
{
var e= existingsInDatabase.Find(existing => {
var k1 =GetId<TEntity,TKey>(existing);
var k2=GetId<TEntity,TKey>(entity);
return k1.Equals(k2);
});
// if not yet exists
if(e == null){
this._context.Add(entity);
}else{
// if you would like to update the old one when there're some differences
// uncomment the following line :
// this._context.Entry(e).CurrentValues.SetValues(entity);
}
}
await this._context.SaveChangesAsync();
transaction.Commit();
}
catch{
transaction.Rollback();
}
finally{
this._context.Database.ExecuteSqlCommand("SET IDENTITY_INSERT " + typeof(TEntity).Name + " OFF;");
await this._context.SaveChangesAsync();
}
}
return;
}
Here GetId<TEntity,TKey>(TEntity e) is a simple helper method used to get the key field of e:
// use reflection to get the Id of any TEntity type
private static TKey GetId<TEntity,TKey>(TEntity e)
where TEntity : class
where TKey : IEquatable<TKey>
{
PropertyInfo pi=typeof(TEntity).GetProperty("Id");
if(pi == null) { throw new Exception($"the type {typeof(TEntity)} must have an `Id` property"); }
TKey k = (TKey) pi.GetValue(e);
return k ;
}
To make the code more reusable, we can create an EntityImporter service to hold the methods above:
public class EntityImporter{
private DbContext _context;
public EntityImporter(DbContext dbContext){
this._context = dbContext;
}
public async Task ImportBatch<TEntity,TKey>(IList<TEntity> entites)
where TEntity : class
where TKey: IEquatable<TKey>
{
// ...
}
public static TKey GetId<TEntity,TKey>(TEntity e)
where TEntity : class
where TKey : IEquatable<TKey>
{
// ...
}
}
and then register the services at startup time:
services.AddScoped<DbContext, AppDbContext>();
services.AddScoped<EntityImporter>();
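With the services registered, consuming the importer could look like the following (the controller and route here are hypothetical, just to show the injection):

```csharp
// Hypothetical consumer: the EntityImporter arrives via constructor injection.
[ApiController]
public class ImportController : ControllerBase
{
    private readonly EntityImporter _importer;

    public ImportController(EntityImporter importer)
    {
        _importer = importer;
    }

    [HttpPost("import/categories")]
    public async Task<IActionResult> ImportCategories(List<ProductCategory> categories)
    {
        await _importer.ImportBatch<ProductCategory, int>(categories);
        return Ok();
    }
}
```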
Test Case :
First, I'll take several categories as an example:
var categories = new ProductCategory[]{
new ProductCategory{
Id = 1,
Name="Category #1"
},
new ProductCategory{
Id = 2,
Name="Category #2"
},
new ProductCategory{
Id = 3,
Name="Category #3"
},
new ProductCategory{
Id = 2,
Name="Category #2"
},
new ProductCategory{
Id = 1,
Name="Category #1"
},
};
await this._importer.ImportBatch<ProductCategory,int>(categories);
The expected results should be only 3 rows imported:
1 category #1
2 category #2
3 category #3
And it works: only the three distinct categories end up in the table.
Finally, for your bill's JSON, you can import the entities as below:
var categories = bill.Products.Select(p=>p.ProductCategory).ToList();
var products = bill.Products.ToList();
// other List<TEntity>...
// import the categories firstly , since they might be referenced by other entity
await this._importer.ImportBatch<ProductCategory,int>(categories);
// import the product secondly , since they might be referenced by BillProduct table
await this._importer.ImportBatch<Product,int>(products);
// ... import others
I'm using RavenDB 2.5 and what I want to do is query a Group (see below) providing a valid Lucene search term and get back a collection of Member instances (or even just Ids) that match. So, class definition:
public class Group {
public string Id { get; set; }
public IList<Member> Members { get; set; }
}
public class Member {
public string Name { get; set; }
public string Bio { get; set; }
}
And they are stored in the database as session.Store(groupInstance); as you'd expect. What I'd like to do is query and return the Member instances which match a given search term.
So, something like:
public class GroupMembers_BySearchTerm : AbstractIndexCreationTask {
public override IndexDefinition CreateIndexDefinition(){
return new IndexDefinition {
Map = "from g in docs.Groups select new { Content = new [] { g.Members.Select(m => m.Name), g.Members.Select(m => m.Bio) }, Id = g.Id }",
Indexes = { { "Id", FieldIndexing.Default }, { "Content", FieldIndexing.Analyzed } }
};
}
}
If I call this using something like:
session.Advanced.LuceneQuery<Group, GroupMembers_BySearchTerm>().Where("Id: myId AND Content: search terms").ToList();
I obviously get back a Group instance, but how can I get back the Members instead?
What about an index like this:
public class Members_BySearchTermAndGroup : AbstractIndexCreationTask {
public override IndexDefinition CreateIndexDefinition(){
return new IndexDefinition {
Map = @"from g in docs.Groups
        from member in g.Members
        select new {
            GroupId = g.Id,
            Name = member.Name,
            Bio = member.Bio,
            Content = new [] { member.Name, member.Bio },
        }",
Indexes = {
{ "GroupId", FieldIndexing.Default },
{ "Content", FieldIndexing.Analyzed }
},
Stores = {
{ "Name", FieldStorage.Yes },
{ "Bio", FieldStorage.Yes }
}
};
}
}
If you take a closer look, you'll see that we are creating a new Lucene entry for each member inside a group. Consequently, you'll be able to query on those elements and retrieve them.
Finally you can query your store like this (more info about searching):
session.Query<Member, Members_BySearchTermAndGroup>()
.Search(x => x.Content, "search terms")
.ProjectFromIndexFieldsInto<Member>()
.ToList();
I cannot check this right now but I guess that you need to project your results using the ProjectFromIndexFieldsInto(). Some more information about projections in this link.
or, following your example:
session.Advanced
.LuceneQuery<Member, Members_BySearchTermAndGroup>()
.Where("GroupId: myId AND Content: search terms")
.SelectFields<Member>()
.ToList();
In a question I asked in the RavenDB discussion boards about making a large number of similar indexes, I was told that I should not make "so many indexes that do the same thing" and told I should make a "string index" instead - the exact words were ...
There is no reason to have so many indexes that do the same thing.
Create an index with:
from d in docs
select new { d.Id, d.Name, Collection = d["#metadata"]["Raven-Entity-Name"] }
And query on that.
Reference Topic
I do not understand at all what this means, I have read the raven documentation many times before today, and I'm still very lost and in the dark.
The best I can come up with, or the closest I understand, is something like this...
RavenSession.Advanced.DocumentStore.DatabaseCommands.PutIndex("Index/Name",
new Raven.Client.Indexes.IndexDefinitionBuilder<IMayTransform>{
Map = results => from result in results
select new{
result.Id,
result.Name,
result["#metadata"]["Raven-Entity-Name"]
}
});
Update
Adding the interfaces involved, by request.
public interface IMayTransform : IHasIdentity, IHasName { }
public interface IHasIdentity {
/// <summary>
/// A string based identity that is used by raven to store the entity.
/// </summary>
string Id { get; set; }
}
public interface IHasName {
/// <summary>
/// The friendly display name for the entity.
/// </summary>
string Name { get; set; }
}
But this does not work. First of all, the part about ["#metadata"] does not compile; I do not even understand the purpose of its existence in the example I was cited. I do not understand how I am supposed to query something like this, either - or where it goes, or where it is defined, how it is called, etc. I have literally no concept of what a "string index" is.
On top of that, I do not comprehend how I add analyzers to this. Any help is appreciated; This has been a horrible, horrible morning and I am frantic to get at least some modicum of work done, but this is leaving me very confused.
IMayTransform is an interface that every entity that needs to fit this implements.
Update (Again)
Between the answer here on stack overflow, and the glorious help from Ayende on the ravendb google groups, I have the following index that works. And I HATE IT.
public class EntityByName : AbstractIndexCreationTask {
public override IndexDefinition CreateIndexDefinition() {
return new IndexDefinition {
Map = "from doc in docs let collection = doc[\"#metadata\"][\"Raven-Entity-Name\"] select new { doc.Id, doc.Name, doc.Abbreviation, Group_Name = doc.Group.Name, Tags = doc.Tags.Select( r => r.Name ), collection };",
Indexes = {
{"Name", FieldIndexing.Analyzed },
{"Abbreviation", FieldIndexing.Analyzed },
{"Tags", FieldIndexing.Analyzed },
{"Group.Name", FieldIndexing.Analyzed }
},
Fields ={
{ "Name" },
{ "Abbreviation" },
{ "Group.Name" },
//{ "Tags" }
},
Stores ={
{ "Name", FieldStorage.Yes },
{ "Abbreviation", FieldStorage.Yes },
{ "Group.Name", FieldStorage.Yes },
//{ "Tags", FieldStorage.Yes }
},
SortOptions = {
{ "collection", SortOptions.String },
{ "Name", SortOptions.String }
}
};
}
}
This does exactly what I need it to do. It is fast, efficient, it works fine, it gives me no trouble, but hard coding everything into a string query is driving me nuts. I am closing this question and opening a new one concerning this, but if anyone has suggestions on ways I can get away from this, I would love to hear them. This is an issue of personal desire, not functional need.
Try it like this:
RavenSession.Advanced.DocumentStore.DatabaseCommands.PutIndex(
"Index/Name",
new IndexDefinition {
Map = "from d in docs select new { Id = d.Id, Name = d[\"#metadata\"][\"Raven-Entity-Name\"] }"
});
Now you see what a "string index" is, as well.
BTW: there is a built-in index that does a similar thing: Raven/DocumentsByEntityName
Usage:
public class Entity
{
    public string Id { get; set; }
    public string Name { get; set; }
}
session.Query<Entity>("Index/Name").Where(x => x.Name == "Foo");
Further Explanation:
There is no way to build indexes using interfaces only. To achieve what you want, Raven would need to implement a BaseEntity that exposes the metadata as properties, and all documents would have to derive from that entity. But that is not going to happen, and nobody wants that.
But when designing your documents, you could integrate the entity name yourself, like:
public class EntityBase
{
public EntityBase()
{
this.TypeName = this.GetType().Name;
}
public string Name { get; set; }
public string TypeName { get; set; }
}
public class Person : EntityBase
{
public string FirstName { get; set; }
public string LastName { get; set; }
}
Now you could build a RavenDB index based on EntityBase and query TypeName.
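For illustration, such an index could be sketched as a multi-map index so that each concrete document collection is covered (the index name and the map list are my assumptions, untested):

```csharp
// Hypothetical multi-map index over EntityBase-derived documents.
public class EntitiesByTypeName : AbstractMultiMapIndexCreationTask<EntityBase>
{
    public EntitiesByTypeName()
    {
        // One AddMap per concrete type deriving from EntityBase.
        AddMap<Person>(docs => from d in docs
                               select new { d.Name, d.TypeName });

        // Analyze Name so it can be searched.
        Index(x => x.Name, FieldIndexing.Analyzed);
    }
}

// Query documents of one type by the stored TypeName:
// var people = session.Query<EntityBase, EntitiesByTypeName>()
//     .Where(x => x.TypeName == "Person")
//     .ToList();
```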
Consider the following database model:
And the following query code:
using (var context = new DatabaseEntities())
{
return context.Feed.ToHierarchy(f => f.Id_Parent, null);
}
Where ToHierarchy is an extension to ObjectSet<TEntity> as:
public static List<TEntity> ToHierarchy<TEntity, TProperty>(this ObjectSet<TEntity> set, Func<TEntity, TProperty> parentIdProperty, TProperty idRoot) where TEntity : class
{
return set.ToList().Where(f => parentIdProperty(f).Equals(idRoot)).ToList();
}
This would result in example JSON formatted response:
[
{
"Description":"...",
"Details":[ ],
"Id":1,
"Id_Parent":null,
"Title":"...",
"URL":"..."
},
{
"Description":"...",
"Details":[
{
"Description":"...",
"Details":[ ],
"Id":4,
"Id_Parent":3,
"Title":"...",
"URL":"..."
},
{
"Description":"...",
"Details":[
{
"Description":"...",
"Details":[
{
"Description":"...",
"Details":[ ],
"Id":7,
"Id_Parent":6,
"Title":"...",
"URL":"..."
}
],
"Id":6,
"Id_Parent":5,
"Title":"...",
"URL":"..."
}
],
"Id":5,
"Id_Parent":3,
"Title":"...",
"URL":null
}
],
"Id":3,
"Id_Parent":null,
"Title":"...",
"URL":null
}
]
As you may have noticed, the ToHierarchy method is supposed to (and apparently does indeed) retrieve all rows from a given set (flat) and return a hierarchical representation of them according to the "parent property".
When I was in the middle of my implementation I quickly tried my code and, surprisingly, it worked! Now, I imagine how weird this sounds to many of you, but I really don't understand why or how that piece of code works, even though I kinda wrote it on my own...
Could you explain how it works?
P.S.: if you look closer, ToHierarchy is nowhere near the same as .Include("Details").
It works because set.ToList() loads all records from the database table into your application; the rest is done by EF and its change-tracking mechanism, which ensures correct referencing between related entities (relationship fix-up).
Btw., you are filtering records in your application's memory, not in the database. For example, if your table contains 10,000 records and your filter should return only 10, you will still load all 10,000 from the database.
You will find that implementing this with EF is quite hard because EF has no support for hierarchical data, so you will always end up with a bad solution. The only good solution is a stored procedure plus some database support for hierarchical queries, for example common table expressions (CTEs) in SQL Server.
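For illustration, a recursive CTE that fetches one subtree entirely server-side might look like this (the table and column names are guesses based on the model above; with an ObjectContext you would run it via ExecuteStoreQuery):

```csharp
// Hypothetical: fetch a whole subtree with a recursive CTE so that only
// the relevant rows travel to the application.
var sql = @"
    WITH FeedTree AS (
        SELECT * FROM Feed WHERE Id = @rootId
        UNION ALL
        SELECT f.* FROM Feed AS f
        INNER JOIN FeedTree AS t ON f.Id_Parent = t.Id
    )
    SELECT * FROM FeedTree;";

// using (var context = new DatabaseEntities())
// {
//     var subtree = context
//         .ExecuteStoreQuery<Feed>(sql, new SqlParameter("@rootId", 3))
//         .ToList();
// }
```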
I just made this very simple example, and it works as I described in the comments:
public class SelfReferencing
{
public int Id { get; set; }
public string Name { get; set; }
public SelfReferencing Parent { get; set; }
public ICollection<SelfReferencing> Children { get; set; }
}
public class Context : DbContext
{
public DbSet<SelfReferencing> SelfReferencings { get; set; }
}
public class Program
{
static void Main(string[] args)
{
using (var context = new Context())
{
context.Database.Delete();
context.Database.CreateIfNotExists();
context.SelfReferencings.Add(
new SelfReferencing()
{
Name = "A",
Children = new List<SelfReferencing>()
{
new SelfReferencing()
{
Name = "AA",
Children = new List<SelfReferencing>()
{
new SelfReferencing()
{
Name = "AAA"
}
}
},
new SelfReferencing()
{
Name = "AB",
Children = new List<SelfReferencing>()
{
new SelfReferencing()
{
Name = "ABA"
}
}
}
}
});
context.SaveChanges();
}
using (var context = new Context())
{
context.Configuration.LazyLoadingEnabled = false;
context.Configuration.ProxyCreationEnabled = false;
var data = context.SelfReferencings.ToList();
}
}
}
It uses the code-first approach, but internally it is the same as when using EDMX. When I get the data, I have 5 entities in the list, and all of them have correctly configured Parent and Children navigation properties.
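To make the fix-up visible without a database, here is a self-contained sketch that mimics it in memory: given a flat list where each item knows its parent id, wire up Parent/Children and pick the roots (the Node class and BuildTree helper are mine, not EF's):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class Node
{
    public int Id { get; set; }
    public int? ParentId { get; set; }
    public string Name { get; set; }
    public Node Parent { get; set; }
    public List<Node> Children { get; } = new List<Node>();
}

public static class HierarchyDemo
{
    // Mimics EF's relationship fix-up: link every node to its parent,
    // then return the roots (nodes with no parent).
    public static List<Node> BuildTree(IEnumerable<Node> flat)
    {
        var byId = flat.ToDictionary(n => n.Id);
        foreach (var node in byId.Values)
        {
            if (node.ParentId.HasValue && byId.TryGetValue(node.ParentId.Value, out var parent))
            {
                node.Parent = parent;
                parent.Children.Add(node);
            }
        }
        return byId.Values.Where(n => n.Parent == null).ToList();
    }

    public static void Main()
    {
        var flat = new List<Node>
        {
            new Node { Id = 1, ParentId = null, Name = "A"   },
            new Node { Id = 2, ParentId = 1,    Name = "AA"  },
            new Node { Id = 3, ParentId = 2,    Name = "AAA" },
            new Node { Id = 4, ParentId = 1,    Name = "AB"  },
        };

        var roots = BuildTree(flat);
        Console.WriteLine(roots.Count);              // 1
        Console.WriteLine(roots[0].Children.Count);  // 2
    }
}
```

The same Where-on-ToList pattern as in ToHierarchy, only with the fix-up done explicitly instead of by the change tracker.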