Convert JSON to newline-delimited JSON - google-bigquery

I need to convert my JSON to newline-delimited JSON to insert data into BigQuery from C# (.NET application).
Please suggest a workaround.
Input
[
{
"DashboardCategoryId":1,
"BookingWindows":[
{
"DaysRange":"31-60 Days",
"BookingNumber":2
},
{
"DaysRange":"Greater Than 1 year",
"BookingNumber":1
}
]
},
{
"DashboardCategoryId":1,
"BookingWindows":[
{
"DaysRange":"61-120 Days",
"BookingNumber":1
},
{
"DaysRange":"8-14",
"BookingNumber":1
}
]
}
]
Required Output
{"DashboardCategoryId": 1,"BookingWindows": [{"DaysRange": "31-60 Days","BookingNumber":2},{"DaysRange": "Greater Than 1 year","BookingNumber": 1}]}
{"DashboardCategoryId": 1,"BookingWindows": [{"DaysRange": "61-120 Days","BookingNumber":1},{"DaysRange": "8-14","BookingNumber": 1}]}

If you have already loaded your JSON array into memory as, say, a List<JToken>, you can write it to newline delimited JSON by using the answer from Serialize as NDJSON using Json.NET.
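In that case the whole conversion is a few lines; a minimal sketch, assuming Json.NET, a List<JToken> named items already parsed from the input array, and a hypothetical output path toFileName:
using (var writer = new StreamWriter(toFileName))
{
    foreach (var item in items)
    {
        // One compact JSON document per line.
        writer.WriteLine(item.ToString(Newtonsoft.Json.Formatting.None));
    }
}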
However, since BigQuery newline delimited JSON files do tend to be... big, I'm going to suggest instead an entirely streaming solution:
public static class JsonExtensions
{
public static void ToNewlineDelimitedJson(Stream readStream, Stream writeStream)
{
var encoding = new UTF8Encoding(false, true);
// Let caller dispose the underlying streams.
using (var textReader = new StreamReader(readStream, encoding, true, 1024, true))
using (var textWriter = new StreamWriter(writeStream, encoding, 1024, true))
{
ToNewlineDelimitedJson(textReader, textWriter);
}
}
public static void ToNewlineDelimitedJson(TextReader textReader, TextWriter textWriter)
{
using (var jsonReader = new JsonTextReader(textReader) { CloseInput = false, DateParseHandling = DateParseHandling.None })
{
ToNewlineDelimitedJson(jsonReader, textWriter);
}
}
enum State { BeforeArray, InArray, AfterArray };
public static void ToNewlineDelimitedJson(JsonReader jsonReader, TextWriter textWriter)
{
var state = State.BeforeArray;
do
{
if (jsonReader.TokenType == JsonToken.Comment || jsonReader.TokenType == JsonToken.None || jsonReader.TokenType == JsonToken.Undefined || jsonReader.TokenType == JsonToken.PropertyName)
{
// Do nothing
}
else if (state == State.BeforeArray && jsonReader.TokenType == JsonToken.StartArray)
{
state = State.InArray;
}
else if (state == State.InArray && jsonReader.TokenType == JsonToken.EndArray)
{
state = State.AfterArray;
}
else
{
// Formatting.None is the default; I set it here for clarity.
using (var jsonWriter = new JsonTextWriter(textWriter) { Formatting = Formatting.None, CloseOutput = false })
{
jsonWriter.WriteToken(jsonReader);
}
// http://specs.okfnlabs.org/ndjson/
// Each JSON text MUST conform to the [RFC7159] standard and MUST be written to the stream followed by the newline character \n (0x0A).
// The newline character MAY be preceded by a carriage return \r (0x0D). The JSON texts MUST NOT contain newlines or carriage returns.
textWriter.Write("\n");
// Root value wasn't an array after all, so end writing with one item.
if (state == State.BeforeArray)
state = State.AfterArray;
}
}
while (jsonReader.Read() && state != State.AfterArray);
}
}
Then use it as follows:
using (var readStream = File.OpenRead(fromFileName))
using (var writeStream = File.Open(toFileName, FileMode.Create))
{
JsonExtensions.ToNewlineDelimitedJson(readStream, writeStream);
}
This takes advantage of the method JsonWriter.WriteToken(JsonReader) to write and format directly from a JsonReader to a JsonWriter without ever loading the entire JSON token hierarchy into memory.
Working sample .Net fiddle.

Newtonsoft Json.NET can be used to format JSON.
I found this example:
private static string FormatJson(string json)
{
dynamic parsedJson = JsonConvert.DeserializeObject(json);
return JsonConvert.SerializeObject(parsedJson, Formatting.Indented);
}
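Note that this snippet pretty-prints a single JSON document, which is the opposite of what BigQuery wants. A minimal sketch of the same Json.NET approach adapted to emit newline-delimited output (assuming Newtonsoft.Json.Linq and System.Linq, and that the whole array fits in memory):
private static string ToNdjson(string json)
{
    // Parse the array, then write each element as one compact line.
    var array = JArray.Parse(json);
    return string.Join("\n", array.Select(item => item.ToString(Formatting.None)));
}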

Related

How to modify the LeadConvert Apex payload?

I've written a trigger under the LeadConvert update event as follows:
trigger WebhookSenderTriggerLeadConvert on Lead (after update) {
if (Trigger.new.size() == 1) {
if (Trigger.old[0].isConverted == false && Trigger.new[0].isConverted == true) {
if (Trigger.new[0].ConvertedAccountId != null) {
String url = 'https://mydomain.io';
String content = WebhookSender.jsonContent(Trigger.new, Trigger.old);
WebhookSender.callout(url, content);
}
}
}
}
This works for me on a dev Salesforce, and in the payload I correctly receive:
{
"new":[
{
"attributes":{
"type":"Lead",
"url":"/services/data/v56.0/sobjects/Lead/B00000000000000000"
},
"Id":"B00000000000000000",
...(+30 more fields)
}
],
"old":[
{
"attributes":{
"type":"Lead",
"url":"/services/data/v56.0/sobjects/Lead/B00000000000000000"
},
"Id":"B00000000000000000",
...(+30 more fields)
}
],
"userId":"A00000000000000000"
}
However in another third party Salesforce account I get the following:
{
"new":[
{
"attributes":{
"type":"Lead",
"url":"/services/data/v56.0/sobjects/Lead/C00000000000000000"
},
...(9 more fields)
}
],
"old":[
{
"attributes":{
},
...(9 more fields)
}
],
"userId":"D00000000000000000"
}
I've obfuscated a lot of the fields here as a lot of it is sensitive, but what I'm unable to determine is what exactly is causing a large portion of the fields in the third-party Salesforce to be missing, including the Id field, when in the dev Salesforce everything is present.
Is there anything that could be causing this?
EDIT:
Posting WebhookSender, as it's been brought up in the comments:
public class WebhookSender {
public static String jsonContent(List<Object> triggerNew, List<Object> triggerOld) {
String newObjects = '[]';
if (triggerNew != null) {
newObjects = JSON.serialize(triggerNew);
}
String oldObjects = '[]';
if (triggerOld != null) {
oldObjects = JSON.serialize(triggerOld);
}
String userId = JSON.serialize(UserInfo.getUserId());
String content = '{"new": ' + newObjects + ', "old": ' + oldObjects + ', "userId": ' + userId + '}';
return content;
}
@future(callout=true)
public static void callout(String url, String content) {
Http h = new Http();
HttpRequest req = new HttpRequest();
req.setEndpoint(url);
req.setMethod('POST');
req.setHeader('Content-Type', 'application/json');
req.setHeader('Authorization', 'someKey');
req.setBody(content);
if (!Test.isRunningTest()) {h.send(req);}
}
public static Map<String, Object> ParseRequest(RestRequest req) {
String body = req.requestBody.toString();
Map<String, Object> data = (Map<String, Object>) JSON.deserializeUntyped(body);
return data;
}
}
Well, is the WebhookSender class identical in both orgs? Does it dump all fields it received to JSON, or does it contain some security-related code such as stripInaccessible? Maybe the fields are there but your profile doesn't see them, so stripInaccessible cuts them out?
Can it be that your dev org simply has 20+ more custom fields in the Lead table than the other org?
Are the fields you're missing coming from a managed package? They'd have the namespace__FieldName__c format, with 4 underscores total. Maybe the package isn't installed. If you know the fields are there but your user doesn't have a license for that managed package, it's possible they'll be hidden.

Update appsettings.json at runtime in .NET Core

I have this appsettings.json file:
"RavenOptions": {
"PublicUrl": "http://127.0.0.1:61570",
"PublicDbName": "TestHostBuilder_ctor_1",
"TseDbName": "TestHostBuilder_ctor_1",
"IsHttps": "false",
"CertificateDirectory": "",
"ShardUrls": [
"http://127.0.0.1:61570",
"http://127.0.0.1:61570",
"http://127.0.0.1:61570",
"http://127.0.0.1:61570"
]
}
I need to update the values in the file at runtime. I am using this function:
public static void AddOrUpdateAppSetting<T>(string key, T value)
{
try
{
var filePath = Path.Combine(AppContext.BaseDirectory, "appSettingTest.json");
string json = File.ReadAllText(filePath);
dynamic jsonObj = Newtonsoft.Json.JsonConvert.DeserializeObject(json);
var sectionPath = key.Split(":")[0];
if (!string.IsNullOrEmpty(sectionPath))
{
var keyPath = key.Split(":")[1];
jsonObj[sectionPath][keyPath] = value;
}
else
{
jsonObj[key] = value; // if no section path, just set the top-level value
}
string output = Newtonsoft.Json.JsonConvert.SerializeObject(jsonObj, Newtonsoft.Json.Formatting.Indented);
File.WriteAllText(filePath, output);
}
catch (ConfigurationErrorsException)
{
Console.WriteLine("Error writing app settings");
}
}
This function works fine, but the problem is when I update the values of ShardUrls (an array): all the records end up on one row, whereas I need every record on its own row, as in the original file.
Here is my update code:
string serverAddress = "\""+documentStore.Identifier.Split("(").First()+"\"";
Appset.AddOrUpdateAppSetting("RavenOptions:PublicDbName", documentStore.Database);
Appset.AddOrUpdateAppSetting("RavenOptions:PublicUrl", documentStore.Identifier.Split("(").First());
Appset.AddOrUpdateAppSetting("RavenOptions:ShardUrls","["+String.Join(",", Enumerable.Repeat(serverAddress, 4).ToArray())+"]");

EF Core decimal precision for Always Encrypted column

Hello, I have SQL Server with the Always Encrypted feature set up, and I have also set up EF to work with Always Encrypted columns. But when I try to add/update an entry in my DB (I use DbContext for DB manipulation), I get the following error:
Operand type clash: decimal(1,0) encrypted with (encryption_type = 'DETERMINISTIC', encryption_algorithm_name = 'AEAD_AES_256_CBC_HMAC_SHA_256', column_encryption_key_name = '****', column_encryption_key_database_name = '****') is incompatible with decimal(6,2) encrypted with (encryption_type = 'DETERMINISTIC', encryption_algorithm_name = 'AEAD_AES_256_CBC_HMAC_SHA_256', column_encryption_key_name = '*****', column_encryption_key_database_name = '****')
The model that I use:
public class Model
{
/// <summary>
/// Payment method name
/// </summary>
[Column(TypeName = "nvarchar(MAX)")]
public string Name { get; set; }
/// <summary>
/// Payment method description
/// </summary>
[Column(TypeName = "nvarchar(MAX)")]
public string Description { get; set; }
/// <summary>
/// Fee charges for using payment method
/// </summary>
[Column(TypeName = "decimal(6,2)")]
public decimal Fee { get; set; }
}
I also tried to specify the decimal format in the OnModelCreating method:
builder.Entity<Model>().Property(x => x.Fee).HasColumnType("decimal(6,2)");
What did I miss?
Thanks for any advice.
My colleague and I have found a workaround to the problem using the DiagnosticSource.
You must know that:
Entity Framework Core hooks itself into DiagnosticSource.
DiagnosticSource uses the observer pattern to notify its observers.
The idea is to populate the 'Precision' and 'Scale' fields of the command object (created by EF Core); this way the call made to SQL will contain all the information necessary to correctly execute the query.
First of all, create the listener:
namespace YOUR_NAMESPACE_HERE
{
public class EfGlobalListener : IObserver<DiagnosticListener>
{
private readonly CommandInterceptor _interceptor = new CommandInterceptor();
public void OnCompleted()
{
}
public void OnError(Exception error)
{
}
public void OnNext(DiagnosticListener value)
{
if (value.Name == DbLoggerCategory.Name)
{
value.Subscribe(_interceptor);
}
}
}
}
Where CommandInterceptor is:
namespace YOUR_NAMESPACE_HERE
{
public class CommandInterceptor : IObserver<KeyValuePair<string, object>>
{
// This snippet of code is only an example; you could use Reflection to retrieve the field mapping instead of using a Dictionary
private Dictionary<string, (byte Precision, byte Scale)> _tableMapping = new Dictionary<string, (byte Precision, byte Scale)>
{
{ "Table1.DecimalField1", (18, 2) },
{ "Table2.DecimalField1", (12, 6) },
{ "Table2.DecimalField2", (10, 4) },
};
public void OnCompleted()
{
}
public void OnError(Exception error)
{
}
public void OnNext(KeyValuePair<string, object> value)
{
if (value.Key == RelationalEventId.CommandExecuting.Name)
{
// After EF Core generates the command to send to the DB, this method will be called
// Cast command object
var command = ((CommandEventData)value.Value).Command;
// command.CommandText -> contains SQL command string
// command.Parameters -> contains all params used in sql command
// ONLY FOR EXAMPLE PURPOSES
// This code may contain errors.
// It was written only as an example.
string table = null;
string[] columns = null;
string[] parameters = null;
var regex = new Regex(@"^INSERT INTO \[(.+)\] \((.*)\)|^VALUES \((.*)\)|UPDATE \[(.*)\] SET (.*)$", RegexOptions.Multiline);
var matches = regex.Matches(command.CommandText);
foreach (Match match in matches)
{
if(match.Groups[1].Success)
{
// INSERT - TABLE NAME
table = match.Groups[1].Value;
}
if (match.Groups[2].Success)
{
// INSERT - COLS NAMES
columns = match.Groups[2].Value.Split(",", StringSplitOptions.RemoveEmptyEntries).Select(c => c.Replace("[", string.Empty).Replace("]", string.Empty).Trim()).ToArray();
}
if (match.Groups[3].Success)
{
// INSERT - PARAMS VALUES
parameters = match.Groups[3].Value.Split(",", StringSplitOptions.RemoveEmptyEntries).Select(c => c.Trim()).ToArray();
}
if (match.Groups[4].Success)
{
// UPDATE - TABLE NAME
table = match.Groups[4].Value;
}
if (match.Groups[5].Success)
{
// UPDATE - COLS/PARAMS NAMES/VALUES
var colParams = match.Groups[5].Value.Split(",", StringSplitOptions.RemoveEmptyEntries).Select(p => p.Replace("[", string.Empty).Replace("]", string.Empty).Trim()).ToArray();
columns = colParams.Select(cp => cp.Split('=', StringSplitOptions.RemoveEmptyEntries)[0].Trim()).ToArray();
parameters = colParams.Select(cp => cp.Split('=', StringSplitOptions.RemoveEmptyEntries)[1].Trim()).ToArray();
}
}
// After taking all the necessary information from the sql command
// we can add Precision and Scale to all decimal parameters
foreach (var item in command.Parameters.OfType<SqlParameter>().Where(p => p.DbType == DbType.Decimal))
{
var index = Array.IndexOf<string>(parameters, item.ParameterName);
var columnName = columns.ElementAt(index);
var key = $"{table}.{columnName}";
// Add Precision and Scale, that fix our problems w/ always encrypted columns
item.Precision = _tableMapping[key].Precision;
item.Scale = _tableMapping[key].Scale;
}
}
}
}
}
Finally add in the Startup.cs the following line of code to register the listener:
DiagnosticListener.AllListeners.Subscribe(new EfGlobalListener());
Encountered the same issue.
I adjusted @SteeBono's interceptor to work with commands which contain multiple statements:
public class AlwaysEncryptedDecimalParameterInterceptor : DbCommandInterceptor, IObserver<KeyValuePair<string, object>>
{
private Dictionary<string, (SqlDbType DataType, byte? Precision, byte? Scale)> _decimalColumnSettings =
new Dictionary<string, (SqlDbType DataType, byte? Precision, byte? Scale)>
{
// MyTableDecimal
{ $"{nameof(MyTableDecimal)}.{nameof(MyTableDecimal.MyDecimalColumn)}", (SqlDbType.Decimal, 18, 6) },
// MyTableMoney
{ $"{nameof(MyTableMoney)}.{nameof(MyTableMoney.MyMoneyColumn)}", (SqlDbType.Money, null, null) },
};
public void OnCompleted()
{
}
public void OnError(Exception error)
{
}
// After EF Core generates the command to send to the DB, this method will be called
public void OnNext(KeyValuePair<string, object> value)
{
if (value.Key == RelationalEventId.CommandExecuting.Name)
{
System.Data.Common.DbCommand command = ((CommandEventData)value.Value).Command;
Regex regex = new Regex(@"INSERT INTO \[(.+)\] \((.*)\)(\r\n|\r|\n)+VALUES \(([^;]*)\);|UPDATE \[(.*)\] SET (.*)|MERGE \[(.+)\] USING \((\r\n|\r|\n)+VALUES \(([^A]*)\) AS \w* \((.*)\)");
MatchCollection matches = regex.Matches(command.CommandText);
foreach (Match match in matches)
{
(string TableName, string[] Columns, string[] Params) commandComponents = GetCommandComponents(match);
int countOfColumns = commandComponents.Columns.Length;
// After taking all the necessary information from the sql command
// we can add Precision and Scale to all decimal parameters and set type for Money ones
for (int index = 0; index < commandComponents.Params.Length; index++)
{
SqlParameter decimalSqlParameter = command.Parameters.OfType<SqlParameter>()
.FirstOrDefault(p => commandComponents.Params[index] == p.ParameterName);
if (decimalSqlParameter == null)
{
continue;
}
string columnName = commandComponents.Columns.ElementAt(index % countOfColumns);
string settingKey = $"{commandComponents.TableName}.{columnName}";
if (_decimalColumnSettings.ContainsKey(settingKey))
{
(SqlDbType DataType, byte? Precision, byte? Scale) settings = _decimalColumnSettings[settingKey];
decimalSqlParameter.SqlDbType = settings.DataType;
if (settings.Precision.HasValue)
{
decimalSqlParameter.Precision = settings.Precision.Value;
}
if (settings.Scale.HasValue)
{
decimalSqlParameter.Scale = settings.Scale.Value;
}
}
}
}
}
}
private (string TableName, string[] Columns, string[] Params) GetCommandComponents(Match match)
{
string tableName = null;
string[] columns = null;
string[] parameters = null;
// INSERT
if (match.Groups[1].Success)
{
tableName = match.Groups[1].Value;
columns = match.Groups[2].Value.Split(",", StringSplitOptions.RemoveEmptyEntries)
.Select(c => c.Replace("[", string.Empty)
.Replace("]", string.Empty)
.Trim()).ToArray();
parameters = match.Groups[4].Value
.Split(",", StringSplitOptions.RemoveEmptyEntries)
.Select(c => c.Trim()
.Replace($"),{Environment.NewLine}(", string.Empty)
.Replace("(", string.Empty)
.Replace(")", string.Empty))
.ToArray();
return (
TableName: tableName,
Columns: columns,
Params: parameters);
}
// UPDATE
if (match.Groups[5].Success)
{
tableName = match.Groups[5].Value;
string[] colParams = match.Groups[6].Value.Split(",", StringSplitOptions.RemoveEmptyEntries)
.Select(p => p.Replace("[", string.Empty).Replace("]", string.Empty).Trim())
.ToArray();
columns = colParams.Select(cp => cp.Split('=', StringSplitOptions.RemoveEmptyEntries)[0].Trim()).ToArray();
parameters = colParams.Select(cp => cp.Split('=', StringSplitOptions.RemoveEmptyEntries)[1].Trim()).ToArray();
return (
TableName: tableName,
Columns: columns,
Params: parameters);
}
// MERGE
if (match.Groups[7].Success)
{
tableName = match.Groups[7].Value;
parameters = match.Groups[9].Value.Split(",", StringSplitOptions.RemoveEmptyEntries)
.Select(c => c.Trim()
.Replace($"),{Environment.NewLine}(", string.Empty)
.Replace("(", string.Empty)
.Replace(")", string.Empty))
.ToArray();
columns = match.Groups[10].Value.Split(",", StringSplitOptions.RemoveEmptyEntries).Select(c => c.Replace("[", string.Empty).Replace("]", string.Empty).Trim()).ToArray();
return (
TableName: tableName,
Columns: columns,
Params: parameters);
}
throw new Exception($"{nameof(AlwaysEncryptedDecimalParameterInterceptor)} was not able to parse the command");
}
}
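Since this class still implements IObserver<KeyValuePair<string, object>>, it can be hooked up through the same DiagnosticListener plumbing shown above; a sketch, generalizing @SteeBono's EfGlobalListener (the generic parameter is my own addition):
public class EfGlobalListener<TInterceptor> : IObserver<DiagnosticListener>
    where TInterceptor : IObserver<KeyValuePair<string, object>>, new()
{
    private readonly TInterceptor _interceptor = new TInterceptor();
    public void OnCompleted() { }
    public void OnError(Exception error) { }
    public void OnNext(DiagnosticListener listener)
    {
        // Subscribe only to EF Core's diagnostic source.
        if (listener.Name == DbLoggerCategory.Name)
        {
            listener.Subscribe(_interceptor);
        }
    }
}
// In Startup.cs:
// DiagnosticListener.AllListeners.Subscribe(new EfGlobalListener<AlwaysEncryptedDecimalParameterInterceptor>());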

Swashbuckle 5 and multipart/form-data HelpPages

I am stuck trying to get Swashbuckle 5 to generate complete help pages for an ApiController with a POST request that takes multipart/form-data parameters. The help page for the action comes up in the browser, but it includes no information about the parameters passed in the form. I have created an operation filter and enabled it in SwaggerConfig; the page with the URI parameters, return type, and other info derived from XML comments shows in the browser help pages, yet nothing specified in the operation filter about the form parameters appears.
I must be missing something. Are there any suggestions on what I may have missed?
Operation filter code:
public class AddFormDataUploadParamTypes : IOperationFilter
{
public void Apply(Operation operation, SchemaRegistry schemaRegistry, ApiDescription apiDescription) {
if (operation.operationId == "Documents_Upload")
{
operation.consumes.Add("multipart/form-data");
operation.parameters = new[]
{
new Parameter
{
name = "anotherid",
#in = "formData",
description = "Optional identifier associated with the document.",
required = false,
type = "string",
format = "uuid"
},
new Parameter
{
name = "documentid",
#in = "formData",
description = "The document identifier of the slot reserved for the document.",
required = false,
type = "string",
format = "uuid"
},
new Parameter
{
name = "documenttype",
#in = "formData",
description = "Specifies the kind of document being uploaded. This is not a file name extension.",
required = true,
type = "string"
},
new Parameter
{
name = "emailfrom",
#in = "formData",
description = "A optional email origination address used in association with the document if it is emailed to a receiver.",
required = false,
type = "string"
},
new Parameter
{
name = "emailsubject",
#in = "formData",
description = "An optional email subject line used in association with the document if it is emailed to a receiver.",
required = false,
type = "string"
},
new Parameter
{
name = "file",
#in = "formData",
description = "File to upload.",
required = true,
type = "file"
}
};
}
}
}
With Swashbuckle v5.0.0-rc4 the methods listed above do not work. But by reading the OpenAPI spec I have managed to implement a working solution for uploading a single file. Other parameters can be easily added:
public class FileUploadOperationFilter : IOperationFilter
{
public void Apply(OpenApiOperation operation, OperationFilterContext context)
{
var isFileUploadOperation =
context.MethodInfo.CustomAttributes.Any(a => a.AttributeType == typeof(YourMarkerAttribute));
if (!isFileUploadOperation) return;
var uploadFileMediaType = new OpenApiMediaType()
{
Schema = new OpenApiSchema()
{
Type = "object",
Properties =
{
["uploadedFile"] = new OpenApiSchema()
{
Description = "Upload File",
Type = "file",
Format = "binary"
}
},
Required = new HashSet<string>()
{
"uploadedFile"
}
}
};
operation.RequestBody = new OpenApiRequestBody
{
Content =
{
["multipart/form-data"] = uploadFileMediaType
}
};
}
}
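For completeness, a sketch of the wiring this filter assumes: YourMarkerAttribute is a plain marker attribute you define yourself, and the filter is registered through the usual AddSwaggerGen call in Startup.ConfigureServices:
// Hypothetical marker attribute the filter looks for.
[AttributeUsage(AttributeTargets.Method)]
public class YourMarkerAttribute : Attribute { }

// In Startup.ConfigureServices:
services.AddSwaggerGen(options =>
{
    options.OperationFilter<FileUploadOperationFilter>();
});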
I presume you figured out what your problem was. I was able to use your posted code to make a perfect-looking Swagger UI, complete with the file [BROWSE...] input controls.
I only modified your code slightly so that it is applied when it detects my preferred ValidateMimeMultipartContentFilter attribute, stolen from Damien Bond. Thus, my slightly modified version of your class looks like this:
public class AddFormDataUploadParamTypes<T> : IOperationFilter
{
public void Apply(Operation operation, SchemaRegistry schemaRegistry, ApiDescription apiDescription)
{
var actFilters = apiDescription.ActionDescriptor.GetFilterPipeline();
var supportsDesiredFilter = actFilters.Select(f => f.Instance).OfType<T>().Any();
if (supportsDesiredFilter)
{
operation.consumes.Add("multipart/form-data");
operation.parameters = new[]
{
//other parameters omitted for brevity
new Parameter
{
name = "file",
#in = "formData",
description = "File to upload.",
required = true,
type = "file"
}
};
}
}
}
Here's my Swagger UI (screenshot omitted).
FWIW:
My NuGets
<package id="Swashbuckle" version="5.5.3" targetFramework="net461" />
<package id="Swashbuckle.Core" version="5.5.3" targetFramework="net461" />
Swagger Config Example
public class SwaggerConfig
{
public static void Register()
{
var thisAssembly = typeof(SwaggerConfig).Assembly;
GlobalConfiguration.Configuration
.EnableSwagger(c =>
{
c.Schemes(new[] { "https" });
// Use "SingleApiVersion" to describe a single version API. Swagger 2.0 includes an "Info" object to
// hold additional metadata for an API. Version and title are required but you can also provide
// additional fields by chaining methods off SingleApiVersion.
//
c.SingleApiVersion("v1", "MyCorp.WebApi.Tsl");
c.OperationFilter<MyCorp.Swashbuckle.AddFormDataUploadParamTypes<MyCorp.Attr.ValidateMimeMultipartContentFilter>>();
})
.EnableSwaggerUi(c =>
{
// If your API supports ApiKey, you can override the default values.
// "apiKeyIn" can either be "query" or "header"
//
//c.EnableApiKeySupport("apiKey", "header");
});
}
}
UPDATE March 2019
I don't have quick access to the original project above, but here's an example API controller from a different project...
Controller signature:
[ValidateMimeMultipartContentFilter]
[SwaggerResponse(HttpStatusCode.OK, Description = "Returns JSON object filled with descriptive data about the image.")]
[SwaggerResponse(HttpStatusCode.NotFound, Description = "No appropriate equipment record found for this endpoint")]
[SwaggerResponse(HttpStatusCode.BadRequest, Description = "This request was fulfilled previously")]
public async Task<IHttpActionResult> PostSignatureImage(Guid key)
You'll note that there's no actual parameter representing my file in the signature; you can see below that I just spin up a MultipartFormDataStreamProvider to pull the incoming POST'd form data out.
Controller Body:
var signatureImage = await db.SignatureImages.Where(img => img.Id == key).FirstOrDefaultAsync();
if (signatureImage == null)
{
return NotFound();
}
if (!signatureImage.IsOpenForCapture)
{
ModelState.AddModelError("CaptureDateTime", $"This equipment has already been signed once on {signatureImage.CaptureDateTime}");
}
if (!ModelState.IsValid)
{
return BadRequest(ModelState);
}
string fileName = String.Empty;
string ServerUploadFolder = System.Web.Hosting.HostingEnvironment.MapPath("~/App_Data/");
DirectoryInfo di = new DirectoryInfo(ServerUploadFolder + key.ToString());
if (di.Exists == true)
ModelState.AddModelError("id", "It appears an upload for this item is either in progress or has already occurred.");
else
di.Create();
var fullPathToFinalFile = String.Empty;
var streamProvider = new MultipartFormDataStreamProvider(di.FullName);
await Request.Content.ReadAsMultipartAsync(streamProvider);
foreach (MultipartFileData fileData in streamProvider.FileData)
{
if (string.IsNullOrEmpty(fileData.Headers.ContentDisposition.FileName))
{
return StatusCode(HttpStatusCode.NotAcceptable);
}
fileName = cleanFileName(fileData.Headers.ContentDisposition.FileName);
fullPathToFinalFile = Path.Combine(di.FullName, fileName);
File.Move(fileData.LocalFileName, fullPathToFinalFile);
signatureImage.Image = File.ReadAllBytes(fullPathToFinalFile);
break;
}
signatureImage.FileName = streamProvider.FileData.Select(entry => cleanFileName(entry.Headers.ContentDisposition.FileName)).First();
signatureImage.FileLength = signatureImage.Image.LongLength;
signatureImage.IsOpenForCapture = false;
signatureImage.CaptureDateTime = DateTimeOffset.Now;
signatureImage.MimeType = streamProvider.FileData.Select(entry => entry.Headers.ContentType.MediaType).First();
db.Entry(signatureImage).State = EntityState.Modified;
try
{
await db.SaveChangesAsync();
//cleanup...
File.Delete(fullPathToFinalFile);
di.Delete();
}
catch (DbUpdateConcurrencyException)
{
if (!SignatureImageExists(key))
{
return NotFound();
}
else
{
throw;
}
}
char[] placeHolderImg = paperClipIcon_svg.ToCharArray();
signatureImage.Image = Convert.FromBase64CharArray(placeHolderImg, 0, placeHolderImg.Length);
return Ok(signatureImage);
Extending @bkwdesign's very useful answer...
His/her code includes:
//other parameters omitted for brevity
You can actually pull all the parameter information (for the non-multi-part form parameters) from the parameters to the filter. Inside the check for supportsDesiredFilter, do the following:
if (operation.parameters.Count != apiDescription.ParameterDescriptions.Count)
{
throw new ApplicationException("Inconsistencies in parameters count");
}
operation.consumes.Add("multipart/form-data");
var parametersList = new List<Parameter>(apiDescription.ParameterDescriptions.Count + 1);
for (var i = 0; i < apiDescription.ParameterDescriptions.Count; ++i)
{
var schema = schemaRegistry.GetOrRegister(apiDescription.ParameterDescriptions[i].ParameterDescriptor.ParameterType);
parametersList.Add(new Parameter
{
name = apiDescription.ParameterDescriptions[i].Name,
@in = operation.parameters[i].@in,
description = operation.parameters[i].description,
required = !apiDescription.ParameterDescriptions[i].ParameterDescriptor.IsOptional,
type = apiDescription.ParameterDescriptions[i].ParameterDescriptor.ParameterType.FullName,
schema = schema,
});
}
parametersList.Add(new Parameter
{
name = "fileToUpload",
#in = "formData",
description = "File to upload.",
required = true,
type = "file"
});
operation.parameters = parametersList;
First it checks to make sure that the two arrays being passed in are consistent. Then it walks through the arrays to pull out the required info and put it into the collection of Swashbuckle Parameters.
The hardest thing was figuring out that the types needed to be registered in the "schema" in order to show up in the Swagger UI. But this works for me.
Everything else I did was consistent with @bkwdesign's post.

Entity Framework 5 change log: how to implement?

I am creating an application with MVC4 and Entity Framework 5. How can I implement a change log?
I have looked around and found that I need to override SaveChanges.
Does anyone have any sample code for this? I am using the code-first approach.
As an example, the way I am saving data is as follows:
public class AuditZoneRepository : IAuditZoneRepository
{
private AISDbContext context = new AISDbContext();
public int Save(AuditZone model, ModelStateDictionary modelState)
{
if (model.Id == 0)
{
context.AuditZones.Add(model);
}
else
{
var recordToUpdate = context.AuditZones.FirstOrDefault(x => x.Id == model.Id);
if (recordToUpdate != null)
{
recordToUpdate.Description = model.Description;
recordToUpdate.Valid = model.Valid;
recordToUpdate.ModifiedDate = DateTime.Now;
}
}
try
{
context.SaveChanges();
return 1;
}
catch (Exception ex)
{
modelState.AddModelError("", "Database error has occured. Please try again later");
return -1;
}
}
}
There is no need to override SaveChanges. You can:
Trigger context.ChangeTracker.DetectChanges(); // may be necessary depending on your proxy approach
Then analyze the context BEFORE saving.
You can then add the change log to the CURRENT unit of work, so the log gets saved in one COMMIT transaction, or process it as you see fit.
Saving your change log at the same time makes sure it is ONE transaction.
A sample of analyzing the context:
I have a simple tool to dump context content to debug output, so when in the debugger I can use the immediate window to check the content. You can use this as a starter to prepare your CHANGE log.
Try it in the debugger immediate window; I have a FullDump method on my Context class. A sample immediate window call: UoW.Context.FullDump();
public void FullDump()
{
Debug.WriteLine("=====Begin of Context Dump=======");
var dbsetList = this.ChangeTracker.Entries();
foreach (var dbEntityEntry in dbsetList)
{
Debug.WriteLine(dbEntityEntry.Entity.GetType().Name + " => " + dbEntityEntry.State);
switch (dbEntityEntry.State)
{
case EntityState.Detached:
case EntityState.Unchanged:
case EntityState.Added:
case EntityState.Modified:
WriteCurrentValues(dbEntityEntry);
break;
case EntityState.Deleted:
WriteOriginalValues(dbEntityEntry);
break;
default:
throw new ArgumentOutOfRangeException();
}
Debug.WriteLine("==========End of Entity======");
}
Debug.WriteLine("==========End of Context======");
}
private static void WriteCurrentValues(DbEntityEntry dbEntityEntry)
{
foreach (var cv in dbEntityEntry.CurrentValues.PropertyNames)
{
Debug.WriteLine(cv + "=" + dbEntityEntry.CurrentValues[cv]);
}
}
private static void WriteOriginalValues(DbEntityEntry dbEntityEntry)
{
foreach (var cv in dbEntityEntry.OriginalValues.PropertyNames)
{
Debug.WriteLine(cv + "=" + dbEntityEntry.OriginalValues[cv]);
}
}
}
EDIT: Get the changes
I use this routine to get changes...
public class ObjectPair {
public string Key { get; set; }
public object Original { get; set; }
public object Current { get; set; }
}
public virtual IList<ObjectPair> GetChanges(object poco) {
var changes = new List<ObjectPair>();
var thePoco = (TPoco) poco;
foreach (var propName in Entry(thePoco).CurrentValues.PropertyNames) {
var curr = Entry(thePoco).CurrentValues[propName];
var orig = Entry(thePoco).OriginalValues[propName];
if (curr != null && orig != null) {
if (curr.Equals(orig)) {
continue;
}
}
if (curr == null && orig == null) {
continue;
}
var aChangePair = new ObjectPair {Key = propName, Current = curr, Original = orig};
changes.Add(aChangePair);
}
return changes;
}
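Putting the pieces together, a sketch of a save method on the same context class that turns GetChanges into audit rows inside the same unit of work (AuditEntry is a hypothetical POCO you would define and map yourself):
public int SaveWithChangeLog(object poco)
{
    // Collect property-level differences before saving.
    foreach (var change in GetChanges(poco))
    {
        Set<AuditEntry>().Add(new AuditEntry
        {
            EntityName = poco.GetType().Name,
            PropertyName = change.Key,
            OldValue = Convert.ToString(change.Original),
            NewValue = Convert.ToString(change.Current),
            ChangedDate = DateTime.Now
        });
    }
    // Data and its change log commit in ONE transaction.
    return SaveChanges();
}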
Edit 2: If you must use the internal object tracking:
var context = ???// YOUR DBCONTEXT class
// get objectcontext from dbcontext...
var objectContext = ((IObjectContextAdapter) context).ObjectContext;
// for each tracked entry
foreach (var dbEntityEntry in context.ChangeTracker.Entries()) {
//get the state entry from the statemanager per changed object
var stateEntry = objectContext.ObjectStateManager.GetObjectStateEntry(dbEntityEntry.Entity);
var modProps = stateEntry.GetModifiedProperties();
Debug.WriteLine(string.Join(",", modProps)); // list the modified property names
}
I decompiled EF6. GetModifiedProperties is indeed using a private bit array to track the fields that have been changed.
// EF decompiled source..... _modifiedFields is a bitarray
public override IEnumerable<string> GetModifiedProperties()
{
this.ValidateState();
if (EntityState.Modified == this.State && this._modifiedFields != null)
{
for (int i = 0; i < this._modifiedFields.Length; ++i)
{
if (this._modifiedFields[i])
yield return this.GetCLayerName(i, this._cacheTypeMetadata);
}
}
}