To generate a PDF report in ASP.NET MVC I am working with Stimulsoft 2015. The problem is that I have no idea how to convert my code so that it works with Stimulsoft Core in ASP.NET Core. It seems some features are no longer available in Stimulsoft Core (like StiReport).
This is the code which works fine in ASP.NET MVC:
public ActionResult GetReportSnapshot(string sort)
{
    StiReport report = new StiReport();
    report.Load(Server.MapPath("~/Reports/Jobs.mrt")); // load the report template
    report["#PrjectId"] = 1;
    report["#OrderBy"] = sort;
    report.Dictionary.Variables["title"] = new Stimulsoft.Report.Dictionary.StiVariable("title", sort);
    report.Render();
    MemoryStream stream = new MemoryStream();
    report.ExportDocument(StiExportFormat.Pdf, stream); // export the rendered report as PDF
    stream.Position = 0;
    FileStreamResult fsr = new FileStreamResult(stream, "application/pdf");
    return fsr;
}
I would appreciate any help.
Using the NuGet package Stimulsoft.Reports.Web.NetCore version 2018.3.5 and ASP.NET Core 2.0, this is working for me. Try this:
public IActionResult GetReportSnapshot(string sort)
{
    StiReport report = new StiReport();
    report.Load(@"C:\Users\Admin\Desktop\report.mrt"); // load report
    report["#PrjectId"] = 1;
    report["#OrderBy"] = sort;
    report.Dictionary.Variables["title"] = new Stimulsoft.Report.Dictionary.StiVariable("title", sort);
    report.Render();
    // Create a PDF settings instance. You can change export settings here.
    var settings = new Stimulsoft.Report.Export.StiPdfExportSettings();
    // Create a PDF export service instance.
    var service = new Stimulsoft.Report.Export.StiPdfExportService();
    // Create a MemoryStream object.
    var stream = new MemoryStream();
    // Export the PDF into the MemoryStream.
    service.ExportTo(report, stream, settings);
    return File(stream.ToArray(), "application/pdf"); // "application/pdf" lets the browser display the PDF
}
What NuGet packages are you using? It could be that you are missing the package containing the StiReport class (they have split their library over multiple NuGet packages).
It could also be that they have not migrated this part to .NET Core yet.
I'd advise you to click around their GitHub repo and see if you can find any information there: https://github.com/stimulsoft, or on their website.
By the looks of NuGet they have only recently started to migrate to .NET Core, so that would suggest my second point is the likely cause.
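For reference, the .NET Core package mentioned in the answer above can be added from the command line (2018.3.5 is the version referenced there):

dotnet add package Stimulsoft.Reports.Web.NetCore --version 2018.3.5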
I am using ABP v4.9.0 (.NET Core 2.2) with an Angular client.
I built some custom localization providers. These providers get translation dictionaries from an external API.
I add localization sources on startup with these providers:
var customProvider = new CustomLocalizationProvider(...);
var localizationSource = new DictionaryBasedLocalizationSource("SOURCENAME", customProvider);
config.Localization.Sources.Add(localizationSource);
On startup, each provider's InitializeDictionaries() is called and the localization dictionaries are built.
So far, so good, working as intended.
Now I'd like to manually reload these translations on demand, but I can't make this work.
Here is what I tried.
Here I trigger the re-synchronization of the language resources:
foreach (var localizationSource in _localizationConfiguration.Sources)
{
    try
    {
        localizationSource.Initialize(_localizationConfiguration, _iocResolver);
    }
    catch (Exception e)
    {
        Logger.Warn($"Could not get Localization Data for source '{localizationSource.Name}'", e);
    }
}
In the custom provider, I first clear the Dictionaries:
public class CustomLocalizationProvider : LocalizationDictionaryProviderBase
{
    protected int IterationNo = 0;

    protected override void InitializeDictionaries()
    {
        Dictionaries.Clear();
        IterationNo += 1;

        var deDict = new LocalizationDictionary(new CultureInfo("de-DE"));
        deDict["HelloWorld"] = $"Hallo Welt Nummer {IterationNo}";
        Dictionaries.Add("de-DE", deDict);

        var enDict = new LocalizationDictionary(new CultureInfo("en"));
        enDict["HelloWorld"] = $"Hello World number {IterationNo}";
        Dictionaries.Add("en", enDict);
    }
}
The provider is executed again as expected.
But when I eventually use the localization client-side (Angular), I still get the original translations.
What am I missing?
Thanks for the help.
In the meantime I had to go with another approach.
I am now using an XmlEmbeddedFileLocalizationDictionaryProvider wrapped by a MultiTenantLocalizationDictionaryProvider.
This way, I am using DB localizations with the XML sources as fallback.
Then I manually load the resources from my API in an app service. These localizations are then updated in the database using LanguageTextManager.UpdateStringAsync().
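A minimal sketch of such an app service, assuming ABP Zero's IApplicationLanguageTextManager and its UpdateStringAsync(tenantId, sourceName, culture, key, value) overload; the shape of the translations passed in is purely illustrative:

using System.Collections.Generic;
using System.Globalization;
using System.Threading.Tasks;
using Abp.Application.Services;
using Abp.Localization;

public class LocalizationSyncAppService : ApplicationService
{
    private readonly IApplicationLanguageTextManager _languageTextManager;

    public LocalizationSyncAppService(IApplicationLanguageTextManager languageTextManager)
    {
        _languageTextManager = languageTextManager;
    }

    // 'translations' would come from the external API; the tuple shape is an assumption.
    public async Task SyncTranslationsAsync(IEnumerable<(string Culture, string Key, string Value)> translations)
    {
        foreach (var t in translations)
        {
            // Writes/updates the text for the DB-backed source ("SOURCENAME" from the question).
            await _languageTextManager.UpdateStringAsync(
                AbpSession.TenantId,
                "SOURCENAME",
                new CultureInfo(t.Culture),
                t.Key,
                t.Value);
        }
    }
}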
I'm working on a project where all users are created in Active Directory under a domain.
What I want is to get all users that are in the domain. Is this possible?
I am new to Active Directory and am using .NET Core 2.0.
Any suggestions and guidance are appreciated.
There is an implementation of System.DirectoryServices.AccountManagement for .NET Core 2, published by the .NET Foundation. See the NuGet reference here.
This will allow you to make calls like this:
using (var ctx = new PrincipalContext(ContextType.Domain, "MyDomain"))
{
    var myDomainUsers = new List<string>();
    var userPrinciple = new UserPrincipal(ctx);

    using (var search = new PrincipalSearcher(userPrinciple))
    {
        foreach (var domainUser in search.FindAll())
        {
            if (domainUser.DisplayName != null)
            {
                myDomainUsers.Add(domainUser.DisplayName);
            }
        }
    }
}
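If you need more than the display name, the principals returned by FindAll() can be cast to UserPrincipal to read properties such as SamAccountName or EmailAddress. A sketch of an alternative loop body, reusing the search and myDomainUsers variables from the snippet above:

foreach (var domainUser in search.FindAll())
{
    // UserPrincipal exposes richer properties than the base Principal.
    if (domainUser is UserPrincipal user)
    {
        myDomainUsers.Add($"{user.SamAccountName} ({user.DisplayName ?? "no display name"})");
    }
}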
Exactly what I was looking for...
You can use the Microsoft Graph API to interact with user data in the Microsoft cloud (including Azure AD).
Link here
Documentation
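A minimal sketch using the Microsoft.Graph client library, assuming you already have an OAuth access token for the Graph API (acquiring the token is not shown):

using System;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Microsoft.Graph;

public static class GraphUsersExample
{
    public static async Task ListUsersAsync(string accessToken)
    {
        var graphClient = new GraphServiceClient(
            new DelegateAuthenticationProvider(request =>
            {
                request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
                return Task.CompletedTask;
            }));

        // Requests the tenant's users (Azure AD) page by page.
        var page = await graphClient.Users.Request().GetAsync();
        while (page != null)
        {
            foreach (var user in page)
            {
                Console.WriteLine(user.DisplayName);
            }

            page = page.NextPageRequest != null
                ? await page.NextPageRequest.GetAsync()
                : null;
        }
    }
}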
This example no longer seems to work in .NET Core 2.0; it now serializes using an XML DiffGram. Is there an easy way to get it working? Do I need to do this whole thing?
As of 25/04/2018: download the latest version of Newtonsoft.Json. I upgraded to 11.0.2 and it now works with ASP.NET Core 2. DataSets get converted to JSON.
It looks like the Newtonsoft crew have written specific converters for DataSet and DataTable, which should point you in the right direction.
Quick update:
It looks like these are not quite in the latest NuGet release yet, but they are coming soon.
When they are released, you'll need to change the line in the example to something like:
string json = JsonConvert.SerializeObject(dataSet, Formatting.Indented, new JsonSerializerSettings { Converters = new[] { new Newtonsoft.Json.Converters.DataSetConverter() } });
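For what it's worth, once the converters are in the package (per the update above, Newtonsoft.Json 11.0.2 on .NET Core 2.0), a plain SerializeObject call should pick them up. A small self-contained example:

using System.Data;
using Newtonsoft.Json;

class DataSetJsonDemo
{
    static void Main()
    {
        var dataSet = new DataSet("Demo");
        var table = new DataTable("People");
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));
        table.Rows.Add(1, "Alice");
        dataSet.Tables.Add(table);

        // Produces the JSON shape {"People":[{"Id":1,"Name":"Alice"}]}
        // instead of the DiffGram XML.
        string json = JsonConvert.SerializeObject(dataSet, Formatting.Indented);
        System.Console.WriteLine(json);
    }
}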
Well this works...
var xml = new XDocument();
using (var writer = xml.CreateWriter())
{
    dataSet.WriteXml(writer);
    writer.Flush();
}
return Json(xml);
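If you want actual JSON text rather than a JSON-serialized XDocument object, Json.NET can convert the XML directly. A sketch building on the same dataSet and xml variables (requires using Newtonsoft.Json;):

// Converts the XML produced by DataSet.WriteXml into a JSON string.
string json = JsonConvert.SerializeXNode(xml, Newtonsoft.Json.Formatting.Indented);
return Content(json, "application/json");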
I have searched high and low for this one and can't seem to find a way of accessing Request.Content in an MVC Web API. I am basically trying to create a file service to and from Azure Blob and Table storage (Table for storing metadata about the file, Blob for the actual file).
I was converting the steps in the following link, but this is where I have come unstuck.
The back end I have working, but I can't find a way for the new unified controller to pass a file object from a JSON post through to the service. Any ideas would be greatly appreciated, as always... or am I just going about this the wrong way?
Article here....
UPDATE: To clarify, what I am trying to do in the new MVC 6 (where you no longer have an ApiController to inherit from) is to access a file that has been uploaded to the API from a JSON post. That is the long and short of what I am trying to achieve.
I am trying to use the article, which is based on the old Web API and uses Request.Content to access the file. However, even if I use the WebAPIShim they provide, I still come unstuck with other objects or properties that are no longer available, so I'm wondering if I need to approach it a different way. Either way, all I am trying to do is get a file from a JSON post to an MVC 6 Web API and pass that file to my back-end service.
Any ideas?
Here is an example without relying on model binding.
You can always find the request data in Request.Body, or use Request.Form to get the request body as a form.
[HttpPost]
public async Task<IActionResult> UploadFile()
{
    if (Request.Form.Files != null && Request.Form.Files.Count > 0)
    {
        var file = Request.Form.Files[0];
        var contentType = file.ContentType;
        using (var fileStream = file.OpenReadStream())
        {
            using (var memoryStream = new MemoryStream())
            {
                await fileStream.CopyToAsync(memoryStream);
                // do what you want with memoryStream.ToArray()
            }
        }
    }
    return new JsonResult(new { });
}
If the only thing in your request is a file, you can use the IFormFile class in your action:
public FileDetails UploadSingle(IFormFile file)
{
    FileDetails fileDetails;
    using (var reader = new StreamReader(file.OpenReadStream()))
    {
        var fileContent = reader.ReadToEnd();
        var parsedContentDisposition = ContentDispositionHeaderValue.Parse(file.ContentDisposition);
        fileDetails = new FileDetails
        {
            Filename = parsedContentDisposition.FileName,
            Content = fileContent
        };
    }
    return fileDetails;
}
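The FileDetails type isn't shown in the answer; a minimal shape that would satisfy the code above (the property types are assumptions) could be:

public class FileDetails
{
    // File name taken from the parsed Content-Disposition header.
    public string Filename { get; set; }

    // File body read as text; use byte[] instead if you need binary content.
    public string Content { get; set; }
}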
I updated my MVC3/.NET 4.5/Azure solution to MVC4.
My code for uploading an image to blob storage appears to fail each time in the upgraded MVC4 solution. However, the same code in my MVC3 solution works fine. The code that does the uploading, in a DLL, has not changed.
I've uploaded the same image file in both the MVC3 and MVC4 solutions. I've inspected the stream and it appears to be fine. In both instances I am running the code locally on my machine and my connections point to blob storage in the cloud.
Any pointers for debugging? Any known issues that I may not be aware of when upgrading to MVC4?
Here is my upload code:
public string AddImage(string pathName, string fileName, Stream image)
{
    var client = _storageAccount.CreateCloudBlobClient();
    client.RetryPolicy = RetryPolicies.Retry(3, TimeSpan.FromSeconds(5));
    var container = client.GetContainerReference(AzureStorageNames.ImagesBlobContainerName);
    image.Seek(0, SeekOrigin.Begin);
    var blob = container.GetBlobReference(Path.Combine(pathName, fileName));
    blob.Properties.ContentType = "image/jpeg";
    blob.UploadFromStream(image);
    return blob.Uri.ToString();
}
I managed to fix it. For some reason, reading the stream directly from the HttpPostedFileBase wasn't working. Simply copying it into a new MemoryStream solved it.
My code:
public string StoreImage(string album, HttpPostedFileBase image)
{
    var blobStorage = storageAccount.CreateCloudBlobClient();
    var container = blobStorage.GetContainerReference("containerName");
    if (container.CreateIfNotExist())
    {
        // configure container for public access
        var permissions = container.GetPermissions();
        permissions.PublicAccess = BlobContainerPublicAccessType.Container;
        container.SetPermissions(permissions);
    }

    string uniqueBlobName = string.Format("{0}{1}", Guid.NewGuid().ToString(), Path.GetExtension(image.FileName)).ToLowerInvariant();
    CloudBlockBlob blob = container.GetBlockBlobReference(uniqueBlobName);
    blob.Properties.ContentType = image.ContentType;

    image.InputStream.Position = 0;
    using (var imageStream = new MemoryStream())
    {
        image.InputStream.CopyTo(imageStream);
        imageStream.Position = 0;
        blob.UploadFromStream(imageStream);
    }
    return blob.Uri.ToString();
}