I have 3 resource files:
/Resources/Values.en-US.resx
/Resources/Values.es-ES.resx
/Resources/Values.fr-FR.resx
(English, Spanish, French)
From here I want to 'scan' which languages (from these resource files) are available so I can put them in a list and display them to the user for selection. After I release my program, people should be able to add more languages; the program will scan for the new languages and make them available in the list.
Is there a way to get the files from the Resources folder?
You can iterate through the files located under the application content directory, select the resource files, extract the culture fragment from each file name, and finally create a list of cultures.
First, inject the IHostingEnvironment to use the ContentRootPath property it provides.
private readonly IHostingEnvironment _hostingEnvironment;

public HomeController(IHostingEnvironment hostingEnvironment)
{
    _hostingEnvironment = hostingEnvironment;
}
As long as you keep all your resource files under the ./Resources/ directory, you should be fine.
Next, create a DirectoryInfo and check that the directory exists (note that the DirectoryInfo constructor itself does not throw DirectoryNotFoundException for a missing path; it is the later GetFiles call that would fail):
var contentRootPath = Path.Combine(_hostingEnvironment.ContentRootPath, "Resources");
var contentDirectoryInfo = new DirectoryInfo(contentRootPath);

if (!contentDirectoryInfo.Exists)
{
    // Handle the missing "Resources" directory here.
    throw new DirectoryNotFoundException($"Directory not found: {contentRootPath}");
}
Get the resource file names:
var resourceFilesInfo = contentDirectoryInfo.GetFiles("*.resx", SearchOption.AllDirectories);
var resourceFileNames = resourceFilesInfo.Select(info => info.Name);
All three examples of resource files you provided follow a culture naming pattern. That is, a combination of an ISO 639 two-letter lowercase culture code associated with a language and an ISO 3166 two-letter uppercase subculture code associated with a country or region. For proper culture fragment extraction, I suggest using a regular expression like the one below:
var regex = new Regex(@"(?<=\.)[a-z]{2}-[A-Z]{2}(?=\.resx$)");
var culturePrefixes = resourceFileNames.Select(fileName => regex.Match(fileName).Value);
Finally, create a culture collection:
var cultureList = culturePrefixes.Select(prefix => new CultureInfo(prefix)).ToList();
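To present these cultures to the user for selection, a simple projection over the list is enough. A minimal sketch, assuming the cultureList built above (the anonymous option shape is just illustrative):
var languageOptions = cultureList
    .Select(culture => new
    {
        Code = culture.Name,         // e.g. "en-US"; handy as the value you persist when the user picks a language
        Display = culture.NativeName // e.g. "español (España)"; friendlier text for the selection list
    })
    .ToList();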
I'm pretty new to the ABP Framework and this question probably has a really simple answer, but I haven't managed to find it. Images are an important part of any app, and handling them the best way (size, caching) is a must.
Scenario
set up a File System Blob Storing provider, which means the uploaded file will be stored in the file system as an image file
make a service that uses a blob container to save and retrieve the image; after saving it, I use the unique file name as the blob name, and that name is used to retrieve it later
the user is logged in, so authorization is required
I can easily obtain the byte[]s of the image by calling blobContainer.GetAllBytesOrNullAsync(blobName)
I want to display the image directly in an <img> tag or in a datatable row.
So, here is my question: is there an easy way to use a blob-stored image as the src of an <img> directly in a Razor page? What I've managed to achieve is setting a source in the model as a string built from the image type plus the bytes converted to a Base64 string (as here), but in that case I need to do it in the model and I also don't know whether the browser caches it; I don't see how caching would work in this case.
I am aware that this may be a question more related to ASP.NET Core, but I was thinking that maybe ABP offers some way to access the image via a link.
If you have the ID of the blob, then it is easy to do. Just create an endpoint that returns the image based on the blob ID.
Here is a sample AppService:
public class DocumentAppService : FileUploadAppService
{
    private readonly IBlobContainer<DocumentContainer> _blobContainer;
    private readonly IRepository<Document, Guid> _repository;

    public DocumentAppService(IRepository<Document, Guid> repository, IBlobContainer<DocumentContainer> blobContainer)
    {
        _repository = repository;
        _blobContainer = blobContainer;
    }

    public async Task<List<DocumentDto>> Upload([FromForm] List<IFormFile> files)
    {
        var output = new List<DocumentDto>();
        foreach (var file in files)
        {
            using var memoryStream = new MemoryStream();
            await file.CopyToAsync(memoryStream).ConfigureAwait(false);
            var id = Guid.NewGuid();
            var newFile = new Document(id, file.Length, file.ContentType, CurrentTenant.Id);
            var created = await _repository.InsertAsync(newFile);
            await _blobContainer.SaveAsync(id.ToString(), memoryStream.ToArray()).ConfigureAwait(false);
            output.Add(ObjectMapper.Map<Document, DocumentDto>(newFile));
        }
        return output;
    }

    public async Task<FileResult> Get(Guid id)
    {
        var currentFile = _repository.FirstOrDefault(x => x.Id == id);
        if (currentFile != null)
        {
            var myfile = await _blobContainer.GetAllBytesOrNullAsync(id.ToString());
            return new FileContentResult(myfile, currentFile.MimeType);
        }
        throw new FileNotFoundException();
    }
}
The Upload method uploads the files and the Get method returns the file.
Now set the Get route as the src of the image, as shown below.
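In a Razor page, that could look roughly like this (a sketch: the route assumes ABP's conventional auto API controller naming for DocumentAppService, and Model.DocumentId is a placeholder property, so adjust both to whatever your application actually exposes):
<img src="/api/app/document/@Model.DocumentId" alt="Document preview" />
Because the image is then served from a plain GET URL, the browser can cache it like any other resource, subject to the cache headers returned by the endpoint.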
Here is the blog post: https://blog.antosubash.com/posts/dotnet-file-upload-with-abp
Repo: https://github.com/antosubash/FileUpload
How would one write a Lucene 8.11 ByteBuffersDirectory to disk?
Something similar to Lucene 2.9.4's Directory.copy(directory, FSDirectory.open(indexPath), true).
You can use the copyFrom method to do this.
For example:
You are using a ByteBuffersDirectory:
final Directory dir = new ByteBuffersDirectory();
Assuming you are not concurrently writing any new data to that dir, you can declare a target where you want to write the data - for example, an FSDirectory (a file system directory):
Directory to = FSDirectory.open(Paths.get(OUT_DIR_PATH));
Use whatever string you want for the OUT_DIR_PATH location.
Then you can iterate over all the files in the original dir object, writing them to this new to location:
IOContext ctx = new IOContext();
for (String file : dir.listAll()) {
    System.out.println(file); // just for testing
    to.copyFrom(dir, file, file, ctx);
}
This will create the new OUT_DIR_PATH dir and populate it with files, such as:
_0.cfe
_0.cfs
_0.si
segments_1
... or whatever files you happen to have in your dir.
Caveat:
I have only used this with a default IOContext object. There are other constructors for the context; I'm not sure what they do. I assume they give you more control over how the write is performed.
Meanwhile I figured it out myself and created a straightforward method for it:
@SneakyThrows
public static void copyIndex(ByteBuffersDirectory ramDirectory, Path destination) {
    FSDirectory fsDirectory = FSDirectory.open(destination);
    Arrays.stream(ramDirectory.listAll())
        .forEach(fileName -> {
            try {
                // IOContext is null because it is not actually used (at least for the moment)
                fsDirectory.copyFrom(ramDirectory, fileName, fileName, null);
            } catch (IOException e) {
                log.error(e.getMessage(), e);
            }
        });
}
I have a single MVC5 site which is accessed via several different regional URLs: .co.uk (for the UK), .de (for Germany) and .fr (for France).
The site content is localised using RESX files, and users can switch language via a cookie for persistence and an HttpModule which sets the ASP.NET thread locale based on the cookie (I used this approach).
I want the default language to be appropriate to the top-level domain through which the user is accessing the site. For example, if a user is on .de, the default language should be de-DE. The user may choose to change the language, in which case the default is overridden, but it is very important that the default language matches the top-level domain (for users and search engines).
How can I achieve this in MVC5? The best I have got to so far is using JavaScript to check the URL, set the cookie and refresh the page, but I know this is nasty and there must be a better way.
PS: Please note it is the top-level domain that I want to drive this. I'm not using regional routing, for example http://whatever.com/DE or http://whatever.com/EN.
PPS: I do not want to use the browser language detection feature either, because that causes problems for search engines: it may cause the .de site to show in en-GB because that is what the search engine uses (or the search engine sends no language, so that becomes the default). If that happens, the .de site will be treated as a duplicate of the .co.uk site, which is never good for SEO.
I figured out how to do this. Add this to Global.asax:
protected void Application_AcquireRequestState(object sender, EventArgs e)
{
    if (Request.Cookies[Constants.LanguageCookieName] == null)
    {
        var culture = GetCultureFromHost();
        Thread.CurrentThread.CurrentUICulture = culture;
        Thread.CurrentThread.CurrentCulture = culture;
    }
}

private CultureInfo GetCultureFromHost()
{
    // set the default culture of en-GB
    CultureInfo ci = new CultureInfo("en-GB");

    // get the host of the requested URL
    string host = Request.Url.Host.ToLower();

    // check for other known domains and set the culture accordingly
    if (host.Contains("whatever.de"))
    {
        ci = new CultureInfo("de-DE");
    }

    return ci;
}
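If you need to cover all the domains from the question, the if checks can be replaced with a small lookup table that keeps the host-to-culture mapping in one place. A minimal sketch (only whatever.de appears in the code above; the .fr and .co.uk host names are illustrative, and it needs using System.Collections.Generic and using System.Linq):
private static readonly Dictionary<string, string> HostCultureMap = new Dictionary<string, string>
{
    { "whatever.de", "de-DE" },    // Germany
    { "whatever.fr", "fr-FR" },    // France
    { "whatever.co.uk", "en-GB" }  // UK (also the fallback below)
};

private CultureInfo GetCultureFromHost()
{
    string host = Request.Url.Host.ToLower();

    // First entry whose key appears in the host wins; otherwise fall back to en-GB.
    var match = HostCultureMap.FirstOrDefault(kvp => host.Contains(kvp.Key));
    return new CultureInfo(match.Key != null ? match.Value : "en-GB");
}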
In my case I set a Persian culture in Global.asax and it works well:
protected void Application_BeginRequest(object sender, EventArgs e)
{
    var persianCulture = new PersianCulture();
    persianCulture.DateTimeFormat.ShortDatePattern = "yyyy/MM/dd";
    persianCulture.DateTimeFormat.LongDatePattern = "dddd d MMMM yyyy";
    persianCulture.DateTimeFormat.AMDesignator = "صبح";
    persianCulture.DateTimeFormat.PMDesignator = "عصر";

    Thread.CurrentThread.CurrentCulture = persianCulture;
    Thread.CurrentThread.CurrentUICulture = persianCulture;
}
Hi, I'm new to using Amazon EMR and Hadoop. I was wondering how to read an external file (stored in S3) from an EMR job. For example, I have a file containing a long list of blacklisted strings. When my EMR job is processing my input, how do I get the job to read in this list of blacklisted strings beforehand in order to use it during processing?
I tried using a regular Java Scanner class and hardcoding the S3 path to the file, but that didn't seem to work, although I could just be doing it wrong...
I'd do something like this (sorry, the code is Scala, not Java, but the idea is the same).
Pass the path in as an argument to your main method.
Set that as a property in your configuration:
val conf = new Configuration()
conf.set("blacklist.file", args(0))
In the mapper's setup method, read the file:
var blacklist: List[String] = List()

override def setup(context: Context) {
    val path = new Path(context.getConfiguration.get("blacklist.file"))
    val fileSystem = FileSystem.get(path.toUri, context.getConfiguration)
    blacklist = scala.io.Source.fromInputStream(fileSystem.open(path)).getLines.toList
}
It would be better to add this file to the distributed cache, as follows:
...
String s3FilePath = args[0];
DistributedCache.addCacheFile(new URI(s3FilePath), conf);
...
Later, in configure() of your mapper/reducer, you can do the following:
...
Path s3FilePath;

@Override
public void configure(JobConf job) {
    s3FilePath = DistributedCache.getLocalCacheFiles(job)[0];
    FileInputStream fstream = new FileInputStream(s3FilePath.toString());
    // Read the file and build a HashMap/List or something which can be accessed
    // from the map/reduce methods as desired.
    ...
}
I have a T4 template that I am trying to create that will code-gen lookup values from a database via NHibernate. My problem is that my data access layer uses the path of the NHibernate configuration in order to compile the configuration on startup (in a static constructor).
I don't know how to make T4 "see" this file path so that when my code gen runs it can get this configuration file. Nor do I know how to make T4 "see" my configuration manager class, which contains the app setting that gives the path to the NHibernate XML file.
I have two configuration files, one for SQL Server and one for SQLite. The configuration file needs to be in the root directory of the executing assembly in order for NHibernate to compile the configuration.
It seems like my template won't be able to use the high-level business layer to select the data from the database; rather, I may have to copy all the NHibernate configuration code into the template as well (ugh).
My DB wrapper:
private class DBSingleton
{
    static DBSingleton()
    {
        string path = ConfigurationManager.AppSettings["DBConfigFileFullPath"];
        NHibernate.Cfg.Configuration cfg = new NHibernate.Cfg.Configuration().Configure(path);
        cfg.AddInputStream(HbmSerializer.Default.Serialize(System.Reflection.Assembly.GetAssembly(typeof(Plan))));
        instance = cfg.BuildSessionFactory();
    }

    internal static readonly ISessionFactory instance;
}
And my template:
<#
    System.Diagnostics.Debugger.Launch();
    string path = Host.ResolvePath(@"..\DB.UnitTest\App.config");
    System.Configuration.ConfigurationManager.AppSettings.Add(
        "DBConfigFileFullPath", path);

    //System.Configuration.ConfigurationFileMap fileMap = new ConfigurationFileMap(path); //Path to your config file
    //System.Configuration.Configuration configuration = System.Configuration.ConfigurationManager.OpenMappedMachineConfiguration(fileMap);

    ReferenceValueBL bl = new ReferenceValueBL();
    List<ReferenceValue> refVals = bl.Select(); // <- problem occurs here

    foreach(ReferenceValue rv in refVals)
    {
        Write("Public ");
        Write(rv.ReferenceValueCode.GetType().Name);
        Write(" ");
        Write(rv.ReferenceValueCode);
        Write(" = ");
        Write(rv.ReferenceValueCode);
    }
#>
My problem occurs when the bl variable tries to call Select(). That's when the DBSingleton is initialized. It throws an error saying the app setting is null. If I hard-code the relative file path in the DB class to just be ./Dbconfig.xml, it still throws an error because the executing assembly doesn't have that file in its local directory.
How do other people handle getting T4 to use app/web config files without reading from the config file within the template and then injecting a connection string into the DAL? I don't have that luxury. The file has to either be placed in a readable location or T4 has to know to look somewhere.
How do other people handle getting t4 to use app/web config files without reading from the config file within the template and then injecting a connection string into the DAL?
You could perhaps convert your text template into a runtime text template by setting Properties | Custom Tool to "TextTemplatingFilePreprocessor".
Afterwards, the entire code generation process is encapsulated within a standard class that has a TransformText method producing the resulting string; you can then write it to a file of your liking.
The nice thing is that the template class is partial, so you can add implementations of some custom methods. Something like this, perhaps:
public partial class RuntimeTextTemplate1
{
    public string TransformText(string someParameter)
    {
        // Do something with someParameter, initialize class field
        // with its value and later use this field in your t4 file.
        // You also have access to ConfigurationManager.AppSettings here.
        return TransformText();
    }
}
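Calling the generated class from ordinary code could then look roughly like this (a sketch: RuntimeTextTemplate1 is the class Visual Studio generates from the .tt file, and the parameter value and output path are placeholders):
var template = new RuntimeTextTemplate1();
// Pass whatever your custom TransformText overload expects, e.g. the resolved config path.
string generatedSource = template.TransformText(@"..\DB.UnitTest\App.config");
File.WriteAllText("ReferenceValues.generated.cs", generatedSource);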
Also, this:
foreach(ReferenceValue rv in refVals)
{
    Write("Public ");
    Write(rv.ReferenceValueCode.GetType().Name);
    Write(" ");
    Write(rv.ReferenceValueCode);
    Write(" = ");
    Write(rv.ReferenceValueCode);
}
Could be rewritten as:
foreach(ReferenceValue rv in refVals)
{
#>
Public <#= rv.ReferenceValueCode.GetType().Name #> <#= rv.ReferenceValueCode #> = <#= rv.ReferenceValueCode #>
<#
}