Azure: How to synchronize log files without overwriting old ones? - asp.net-mvc-4

I have an issue synchronizing a log file to blob storage. I can synchronize the log file, but after I make a new deployment of my project to Azure, the project files change and the log file's contents change too, although the file name stays the same. The WebRole then synchronizes the log file again, and because the name is the same the blob is overwritten and all the data previously in blob storage is gone. How can I keep separate log files for different deployments? I hope I have explained this clearly; sorry for my English.

You can change the file name before it is used. By overriding the File property you can add any unique prefix (deployment ID, time ticks, GUID, ...) to the file name.
// Requires log4net's RollingFileAppender and Microsoft.WindowsAzure.ServiceRuntime's RoleEnvironment.
public class AzureLocalStorageAppender : RollingFileAppender
{
    public override string File
    {
        get
        {
            return base.File;
        }
        set
        {
            // Prefix the configured file name with the local storage path and a unique
            // value (a GUID here) so each deployment writes to its own log file.
            base.File = RoleEnvironment.GetLocalResource("LocalResourceNameHere").RootPath + @"\"
                + Guid.NewGuid().ToString() + "_"
                + new FileInfo(value).Name;
        }
    }
}

Related

Migrating from Microsoft.Azure.Storage.Blob to Azure.Storage.Blobs - directory concepts missing

These are great guides for migrating between the different versions of the NuGet package:
https://github.com/Azure/azure-sdk-for-net/blob/Azure.Storage.Blobs_12.6.0/sdk/storage/Azure.Storage.Blobs/README.md
https://elcamino.cloud/articles/2020-03-30-azure-storage-blobs-net-sdk-v12-upgrade-guide-and-tips.html
However I am struggling to migrate the following concepts in my code:
// Return if a directory exists:
container.GetDirectoryReference(path).ListBlobs().Any();
where GetDirectoryReference is not understood and there appears to be no direct translation.
Also, the concept of a CloudBlobDirectory does not appear to have made it into Azure.Storage.Blobs e.g.
private static long GetDirectorySize(CloudBlobDirectory directoryBlob) {
long size = 0;
foreach (var blobItem in directoryBlob.ListBlobs()) {
if (blobItem is BlobClient)
size += ((BlobClient) blobItem).GetProperties().Value.ContentLength;
if (blobItem is CloudBlobDirectory)
size += GetDirectorySize((CloudBlobDirectory) blobItem);
}
return size;
}
where CloudBlobDirectory does not appear anywhere in the API.
There's no such thing as physical directories or folders in Azure Blob Storage. The directories you sometimes see are part of the blob name (e.g. folder1/folder2/file1.txt). The List Blobs request allows you to pass a prefix and a delimiter, which are used by the Azure Portal and Azure Storage Explorer to create a visualization of folders. For example, prefix folder1/ and delimiter / would show the contents as if folder1 were opened.
That's exactly what happens in your code: GetDirectoryReference() adds a prefix, ListBlobs() fires a request, and Any() checks whether any items are returned.
In V12 the method that lets you do the same is GetBlobsByHierarchy (and its async version). In your particular case, where you only want to know whether any blobs exist under the "directory", GetBlobs with a prefix would also suffice, as in the sketch below.
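A minimal sketch of the V12 equivalents, assuming container is an Azure.Storage.Blobs.BlobContainerClient and path is the virtual folder prefix (e.g. "folder1/folder2/"); the helper class and method names are just illustrations:
using System;
using System.Linq;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

public static class BlobDirectoryHelpers
{
    // "Does the directory exist?" becomes "is there at least one blob with this prefix?"
    public static bool DirectoryExists(BlobContainerClient container, string path)
        => container.GetBlobs(prefix: path).Any();

    // Total size of everything under the prefix. GetBlobs lists recursively,
    // so no explicit recursion over sub-"directories" is needed.
    public static long GetDirectorySize(BlobContainerClient container, string path)
        => container.GetBlobs(prefix: path).Sum(b => b.Properties.ContentLength ?? 0);

    // Folder-style view, one level at a time, via GetBlobsByHierarchy.
    public static void PrintLevel(BlobContainerClient container, string path)
    {
        foreach (BlobHierarchyItem item in container.GetBlobsByHierarchy(prefix: path, delimiter: "/"))
        {
            if (item.IsBlob)
                Console.WriteLine("blob:   " + item.Blob.Name);
            else
                Console.WriteLine("folder: " + item.Prefix);
        }
    }
}
Because GetBlobs with a prefix already lists every blob underneath it, the recursion that CloudBlobDirectory required is no longer necessary for the size calculation.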

Spring Cloud Server serving multiple property files for the same application

Let's say I have applicationA that has 3 property files:
-> applicationA
- datasource.properties
- security.properties
- jms.properties
How do I move all properties to a spring cloud config server and keep them separate?
As of today I have configured the config server so that it only reads ONE property file, as this seems to be the standard way. The file the config server picks up seems to be resolved using spring.application.name. In my case it will only read ONE file with this name:
-> applicationA.properties
How can I add the other files to be resolved by the config server?
This is not possible in the way you requested. Spring Cloud Config Server uses NativeEnvironmentRepository, which is:
Simple implementation of {@link EnvironmentRepository} that uses a SpringApplication and configuration files located through the normal protocols. The resulting Environment is composed of property sources located using the application name as the config file stem (spring.config.name) and the environment name as a Spring profile.
See: https://github.com/spring-cloud/spring-cloud-config/blob/master/spring-cloud-config-server/src/main/java/org/springframework/cloud/config/server/environment/NativeEnvironmentRepository.java
So basically, every time a client requests properties from the Config Server, it creates a ConfigurableApplicationContext using SpringApplicationBuilder, and it is launched with the following configuration property:
String config = application;
if (!config.startsWith("application")) {
    config = "application," + config;
}
list.add("--spring.config.name=" + config);
So the only possible names for property files are application.properties (or .yml) and the name of the config client application that is requesting the configuration - in your case applicationA.properties.
But you can "cheat".
In the config server configuration you can add a property like this:
spring:
  cloud:
    config:
      server:
        git:
          search-paths: '{application}, {application}/your-subdirectory'
In this case the Config Server will search for the same property file names, but in several directories, so you can use subdirectories to keep your properties separate.
So with the configuration above you will be able to load configuration from:
applicationA/application.properties
applicationA/your-subdirectory/application.properties
This can be done.
You need to create your own EnvironmentRepository, which loads your property files.
org.springframework.cloud.config.server.support.AbstractScmAccessor#getSearchLocations
searches for the property files to load:
for (String prof : profiles) {
    for (String app : apps) {
        String value = location;
        if (app != null) {
            value = value.replace("{application}", app);
        }
        if (prof != null) {
            value = value.replace("{profile}", prof);
        }
        if (label != null) {
            value = value.replace("{label}", label);
        }
        if (!value.endsWith("/")) {
            value = value + "/";
        }
        output.addAll(matchingDirectories(dir, value));
    }
}
There you could add custom code that reads the required property files.
The above code matches exactly the behaviour described in the Spring docs.
The NativeEnvironmentRepository does NOT access Git/SCM in any way, so you should use
JGitEnvironmentRepository as the base for your own implementation; a sketch follows below.
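A minimal sketch of such a custom repository; EnvironmentRepository, Environment, PropertySource and JGitEnvironmentRepository are the real Spring Cloud Config types, while the class name, the extra file stems and the wiring are assumptions for illustration only. It delegates to a JGitEnvironmentRepository and merges the property sources found for each extra stem:
import java.util.List;
import org.springframework.cloud.config.environment.Environment;
import org.springframework.cloud.config.environment.PropertySource;
import org.springframework.cloud.config.server.environment.EnvironmentRepository;
import org.springframework.cloud.config.server.environment.JGitEnvironmentRepository;

public class MultiFileEnvironmentRepository implements EnvironmentRepository {

    private final JGitEnvironmentRepository delegate;
    private final List<String> extraNames; // e.g. "datasource", "security", "jms"

    public MultiFileEnvironmentRepository(JGitEnvironmentRepository delegate,
                                          List<String> extraNames) {
        this.delegate = delegate;
        this.extraNames = extraNames;
    }

    @Override
    public Environment findOne(String application, String profile, String label) {
        // Standard lookup: applicationA.properties plus application.properties.
        Environment result = delegate.findOne(application, profile, label);
        // Additional lookups: treat each extra file stem as if it were an application
        // name and merge whatever property sources the delegate finds for it.
        for (String name : extraNames) {
            Environment extra = delegate.findOne(name, profile, label);
            for (PropertySource source : extra.getPropertySources()) {
                result.add(source);
            }
        }
        return result;
    }
}
How this repository is registered as the config server's EnvironmentRepository bean, and how the extra names are configured, is left out here; the point is only the delegation and merge pattern.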
As @nmyk pointed out, NativeEnvironmentRepository boots a mini application in order to collect the properties, providing it with, so to speak, "hardcoded" {appname}.* and application.* supported property file names. (@Stefan Isele - prefabware.com: JGitEnvironmentRepository ends up using NativeEnvironmentRepository as well, for that matter.)
I have issued a pull request for spring-cloud-config-server 1.4.x that supports defining additional file names through a spring.cloud.config.server.searchNames environment property, in the same sense one can do for a single Spring Boot app, as described in the "Externalized Configuration: Application Property Files" section of the documentation, using the spring.config.name environment property. I hope they review it soon, since it seems many have asked about this feature on Stack Overflow, and surely many more search for it and read the currently advised solutions.
It is worth mentioning that many people advise "abusing" the profile feature to achieve this, which is a bad practice, in my humble opinion, as I describe in this answer.

Restore Embedded RavenDB on top of existing data

I'm trying to do a RavenDB backup/restore from within the application. Raven runs in embedded mode. I've got the backups working fine; now I need to do a restore.
The problem with the restore is that I'd like to restore into the same location that is currently used by the embedded storage. My plan was to shut down the embedded storage, delete all the files in that location, and restore there:
var configuration = embeddedStore.Configuration;
var backupLocation = // location of a backup folder
var databaseLocation = // location of current DB
// shutdown the storage???
documentStore.Dispose();
// now, trying to delete the directory, I get exception "Can't delete the folder it is in use by other process"
Directory.Delete(databaseLocation, true); // <-- exception here
Directory.CreateDirectory(databaseLocation);
DocumentDatabase.Restore(configuration, backupLocation, databaseLocation, s => { }, defrag: true);
(The full source on GitHub)
The problem is with shutting down the storage. Judging from the exception I get, the engine is not shut down, because I can't delete the folder:
The process cannot access the file '_0.cfs' because it is being used by another process.
The application I run is MVC 5. The issue is that the DB location is set in web.config and I don't really want to modify it in any way, so the restore has to go into the same location as the existing DB.
What is the correct way to restore embedded database into the same location as the existing DB?
One workaround is to initialize RavenDB (ideally in Application_Start) behind a condition, so that you can prevent it from starting:
if (WebConfigurationManager.AppSettings["RavenDBStartMode"] == "Start")
    RavenDB.initialize();
To restore the DB, change "RavenDBStartMode" to "DontStart" in web.config so the application pool restarts, then run your restore operation:
string dbLocation = Server.MapPath("your database location");
string backupLocation = Server.MapPath("your backup location");
// The database location is a folder, so delete it recursively before restoring into it.
System.IO.Directory.Delete(dbLocation, true);
DocumentDatabase.Restore(new RavenConfiguration() { DataDirectory = dbLocation },
    backupLocation, dbLocation, x => { }, false);
RavenDB.initialize();
Finally, change "RavenDBStartMode" back to "Start" so RavenDB can start again on subsequent application restarts.

How to update database file over internet?

My app has a database file in its NSBundle. I want to fetch an updated database file from the internet whenever a new database file is available, and this should happen before the app displays data from the database file.
Here is the logic I am trying to use. I don't know if it makes sense or if there is a better way to do it. An example would be awesome.
if (file is available in the Documents directory)
{
    if (internet is available)
    {
        1. get file from network
        2. store it in the Documents directory
        if (contents of the old & new file are the same)
        {
            delete downloaded file
        } else {
            move or delete old file & rename new file (so that the new file's data can be accessed)
        }
    } else {
        use old file in the Documents directory
    }
} else {
    copy file from bundle to the Documents directory
}
Ideally you should have a timestamp or version number for the copy of the DB you have on the phone, and transmit that to the server. Then have the server only send a new copy if there is a newer version. This saves the user a lot of data charges.

I can't get netbeans to find a txt file I have in the same directory... java.io.FileNotFoundException

I can't make it path-specific because once I get this program to work (this is the last thing I have to do) I'm uploading it to my university's ilearn website, and it has to run on my professor's computer with no modifications. I've tried a few different amalgamations of code similar to the following...
File file = new File("DataFile.txt");
Scanner document = new Scanner(new File("DataFile.txt"));
Or...
java.io.File file = new java.io.File("DataFile.txt");
Scanner document = new Scanner(file);
But nothing seems to work. I've got the necessary stuff imported. I've tried moving DataFile around a few different folders (the src folder, and other random folders in the project's NetBeansProjects folder). I also tried creating a folder in the project, putting the file in that folder, and using some kind of
documents/DataFile.txt
bit I found online (I named the folder documents).
I've tried renaming the file and saving it in different ways. I'm all out of ideas.
The file is just a list of numbers that are used to generate random data for this program we were assigned, which builds a gas station simulator. The program runs great when I just use user input from the console. But I cannot get NetBeans to find that file for the life of me! Help!?
Try adding the file to the build path. Something like this should work (put the method in your class and import java.io.File, java.io.FileNotFoundException and java.util.Scanner):
public void readTextFile() {
    try {
        Scanner scFile = new Scanner(new File("filename.txt"));
        while (scFile.hasNext()) {
            String line = scFile.nextLine();
            // Split each line on your delimiter ("symbol" is a placeholder).
            Scanner details = new Scanner(line).useDelimiter("symbol");
            // From there you can store the integer values, e.g. in an array:
            // litterArr[size] = details.nextInt();
            // where size is a variable counting the number of items in the array.
        }
        scFile.close();
    } catch (FileNotFoundException e) {
        // ... handle the missing file here
    }
}
Keep the file in the same folder as the program, but if it is saved in another folder you need to supply the path indicating the location of the file as part of the file name, e.g. memAthletics.Lines.LoadFromFile('C:\MyFiles\Athletics.txt');
hope this helps clear the problem up :)