Capture GlassFish log file into SQL/JPA database

I need a little help getting started. I have a new JSF-2 web application that I intend to deploy under GlassFish 3.1 (or higher). Normally the server stores all its log files as text in one of its private directories; that includes the logging I do with either System.out.println( .. ) or something like java.util.logging.Logger.getLogger( ... ).
Instead of having those log entries go to the text file, I want to capture them and file them into my SQL database. I can then add table columns for timestamps and key values so the log can easily be searched from the admin page of the application, rather than having to go to the admin console for it. It would also be possible to expose some of that data to users.
Can this be done and how?
Follow up question: could this be done in a way that would be portable to Tomcat or another container?

You will need to write a custom log handler. A custom log handler is a class that extends java.util.logging.Handler:
package test.stackoverflow;

import java.util.logging.Handler;
import java.util.logging.LogRecord;

public class AlanHandler extends Handler {

    @Override
    public void publish(LogRecord record) {
        // CODE THAT STORES THE LOG RECORD INTO THE DATABASE
    }

    @Override
    public void flush() { }

    @Override
    public void close() throws SecurityException { }
}
Additionally, you will have to slightly change the logging.properties file:
handlers=java.util.logging.ConsoleHandler, test.stackoverflow.AlanHandler
Deploy the JAR containing AlanHandler to GlassFish (as a library), restart the server, and that should do it.
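For the database part, here is a minimal sketch of what publish() could look like using plain JDBC. The LOGS table, its columns, and the connection settings are assumptions to adapt to your own schema; a real handler would likely use a connection pool or JPA instead of opening a connection per record.

package test.stackoverflow;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Timestamp;
import java.util.logging.ErrorManager;
import java.util.logging.Handler;
import java.util.logging.LogRecord;

public class AlanHandler extends Handler {

    // Hypothetical connection settings; adapt to your environment.
    private static final String JDBC_URL = "jdbc:derby://localhost:1527/logdb";

    @Override
    public void publish(LogRecord record) {
        if (!isLoggable(record)) {
            return;
        }
        // Assumes a table such as:
        // CREATE TABLE LOGS (TS TIMESTAMP, LOG_LEVEL VARCHAR(16),
        //                    LOGGER VARCHAR(255), MESSAGE VARCHAR(2048))
        try (Connection con = DriverManager.getConnection(JDBC_URL, "user", "password");
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO LOGS (TS, LOG_LEVEL, LOGGER, MESSAGE) VALUES (?, ?, ?, ?)")) {
            ps.setTimestamp(1, new Timestamp(record.getMillis()));
            ps.setString(2, record.getLevel().getName());
            ps.setString(3, record.getLoggerName());
            ps.setString(4, record.getMessage());
            ps.executeUpdate();
        } catch (Exception e) {
            // Report through the handler's error manager; logging the failure
            // here could recurse back into this handler.
            reportError("Failed to store log record", e, ErrorManager.WRITE_FAILURE);
        }
    }

    @Override
    public void flush() { }

    @Override
    public void close() throws SecurityException { }
}

As for the follow-up question: since this relies only on java.util.logging, the same handler should be portable to Tomcat or any other container that logs through JUL; you would just register it in that container's logging.properties.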

Related

Grails Dashboard says I have no Services

In the process of trying to troubleshoot my Grails SQL Issue, I just realized that the default Grails Splash page says I have 0 Services. I figured this is a separate issue so I made a new question. I'm on Grails 3.3.9. See attached picture. I'm using default scaffolded code here so my index page is calling my list method in the Service. What am I missing here? Service below (excuse my ignorance, I'm coming from Grails 2.3.11):
package TSTSupport

import grails.gorm.services.Service
import grails.gorm.transactions.Transactional

@Service(TST_Customer)
@Transactional
interface TST_CustomerService {
    TST_Customer get(Serializable id)
    List<TST_Customer> list(Map args)
    Long count()
    void delete(Serializable id)
    TST_Customer save(TST_Customer tstCustomer)
}
Thanks to @JeffScottBrown, I realized that in Grails 3.3.9 the generate-all command creates a GORM data service, which is different from a regular service (which is why Grails says I have 0 services). Using create-service is the way to go if you want the dashboard to recognize your services.

spring custom scan in msf4j

I have an msf4j application in package com.a.sample1 and I want to scan some components in com.a.sample2. Is there a way to do this in msf4j? I am using:
public static void main(String[] args) {
    MSF4JSpringApplication.run(Application.class, args);
}
I can't put my application in the com.a package to scan both sample1 and sample2 automatically; one reason is that com.a.sample2 comes from an external library.
In Spring Boot, if the components, JPA repositories, or entities are not in subpackages of Application.java's package, then we need to specify them explicitly. Is this at all possible in MSF4J?
Though I am still waiting for an answer on how to scan a package other than the application package, there is a workaround: I created an annotation and imported a configuration class in that annotation.
So, when you add the annotation (created in sample1) in sample2, it will import the configuration from sample1 and load the beans into sample2. A sketch is shown below.
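A minimal sketch of that workaround, assuming standard Spring annotations (the names EnableSample2Scan and Sample2Config are hypothetical, and the packages follow this question):

package com.a.sample1;

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;

// Configuration that scans the package outside the application's own package.
@Configuration
@ComponentScan("com.a.sample2")
class Sample2Config {
}

// Adding this annotation to a class imports Sample2Config, and with it
// the com.a.sample2 beans, into the Spring context.
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Import(Sample2Config.class)
public @interface EnableSample2Scan {
}

Annotating the MSF4J Application class with @EnableSample2Scan then pulls in the extra beans without moving any code.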
I checked the MSF4J sources and found that the scan starts only from the package of the Application class passed as the first argument to the run method: https://github.com/wso2/msf4j/blob/release-2.1.0/spring/src/main/java/org/wso2/msf4j/spring/MSF4JSpringApplication.java#L165
Unfortunately, it is a private method, so you cannot change it.
On the other hand, the "source" argument (the first argument of the run method) is used only to determine the autoscan package, so you can simply place a DummyClass into the com.a package and run it via:
MSF4JSpringApplication.run(DummyClass.class, args);
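Here DummyClass can be completely empty; its only job is to anchor the scan to the com.a package:

package com.a;

// Empty placeholder; it only makes com.a the base package
// that MSF4J uses for component scanning.
public class DummyClass {
}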

Posting a Task to the Web Console's Execution (Management) Context

In the Apache Brooklyn web interface we would like to display some content for the system managers. The content is too long to be served as a simple sensor value.
Our idea was to create a task, write the content into the output stream of the task, and then offer the REST-based URL to the managers like this:
/v1/activities/{task}/stream/stdout (of course, the link masked with some nice text)
The task and its stream are created like this:
LOG.info("{} Creating Activity for ClusterReport Feed", this);
activity = Tasks.builder().
displayName("clusterReportFeed").
description("Output for the Cluster Report Feed").
body(new Runnable() {
#Override
public void run() {
//DO NOTHING
}
}).
parallel(true).
build();
LOG.info("{} Task Created with Id: " + activity.getId(), this);
Entities.submit(server, activity).getUnchecked();
The task seems to be created and the interaction works perfectly fine.
However, when I want to access the task's output stream from my browser using a prepared URL, I get an error saying the task does not exist.
Our idea is that we are not in the right Management/Execution context: the web page is running in a different context than the entities and their sensors. How can we submit a task so that it is also visible to the web console's context?
Alternatively, is it possible to write the content into a file and then offer it for download via Jetty (Brooklyn's web server)? That would be a much simpler way.
Many tasks in Brooklyn default to being transient - i.e. they are deleted shortly after they complete (things like effector invocations are by default non-transient).
You can mark your task as non-transient using the code below in your use of the task builder:
.tag(BrooklynTaskTags.NON_TRANSIENT_TASK_TAG)
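Applied to the builder from the question, that looks something like this (a sketch; the import paths are from Brooklyn 0.9.0 and may differ in your version):

import org.apache.brooklyn.core.mgmt.BrooklynTaskTags;
import org.apache.brooklyn.util.core.task.Tasks;

activity = Tasks.builder()
        .displayName("clusterReportFeed")
        .description("Output for the Cluster Report Feed")
        .tag(BrooklynTaskTags.NON_TRANSIENT_TASK_TAG) // keep the task after completion
        .body(new Runnable() {
            @Override
            public void run() {
                // DO NOTHING
            }
        })
        .parallel(true)
        .build();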
However, note that (as of Brooklyn version 0.9.0) tasks are kept in-memory using soft references. This means the stdout of the task will likely be lost at some point in the future, when that memory is needed for other in-memory objects.
For your use-case, would it make sense to have this as an effector result perhaps?
Or could you write to an object store such as S3 instead? The S3 approach would seem best to me.
For writing it to a file, care must be taken when used with Brooklyn high-availability. Would you write to a shared volume?
If you do write to a file, then you'd need to provide a web-extension so that people can access the contents of that file. As of Brooklyn 0.9.0, you can add your own WARs in code when calling BrooklynLauncher (which calls BrooklynWebServer).

MVC4: Getting migrations to run when publishing the app through Visual Studio 2012

I created a Visual Studio 2012 MVC4 app. I am testing the "publish" functionality by right-clicking the project and choosing Publish. I followed the instructions here. I can connect to the remote web server and the folders get published to the correct folder, except the Content folder for some reason.
When I browse to the remote web server, it prompts me to log in, so the app is working. However, the migrations never happened. The only tables created are the SimpleMembership tables, so I know the web server is connecting to the remote DB server. No other tables are created and the Seed method doesn't run. I seed the roles and a default user.
I checked the box in publish settings that says "Execute Code First Migrations (runs on application start)"
Everything works fine with my LocalDB connection string for local testing. I just can't figure out how to create the database from the existing migrations and seed it when I publish to the live site (note that I will only seed once). Is there a way to specify which migrations to run again? I can copy the migrations and run them on the database server, but why the extra step?
EDIT:
When adding Database.SetInitializer to my context, I now get an error saying some of the fields in my UserProfile table are not there (I use SimpleMembership). This error occurs on the first page load after the web publish; then on subsequent page loads I get the error: The "WebSecurity.InitializeDatabaseConnection" method can be called only once.
However, it does create my SimpleMembership tables now, but my migration for all other tables never runs, which is why I am missing the additional UserProfile fields.
EDIT:
Basically, I was not checking whether WebSecurity was initialized prior to calling WebSecurity.InitializeDatabaseConnection, so that resolved that issue. Now I have it partially working. The app creates the SimpleMembership tables fine, but since I added columns to the UserProfile table, I can't seed until I change them. So instead I manually create the UserProfile table and have the app create the rest of the tables. Then I comment out the UserProfile table in my initial migration. After this, when I sign in, it creates the rest of my tables.
The open issue is how to get my database migration to run prior to the SimpleMembership initialization.
To get migrations working on the remote server, you first need to call SetInitializer in a static constructor of your context class:
static MyProjectContext()
{
    Database.SetInitializer(new MigrateDatabaseToLatestVersion<MyProjectContext, Migrations.Configuration>());
}
And in your migrations Configuration class you need to add this code:
public Configuration()
{
    AutomaticMigrationsEnabled = false;
    AutomaticMigrationDataLossAllowed = false;
}
I don't select "Execute Code First Migrations (runs on application start)"; just setting the initializer in MyProjectContext does the migration for me.
Once you have done that, to seed your data you can do the following in your migrations Configuration class:
protected override void Seed(MyProject.Models.MyProjectContext context)
{
    DoSeed(context);
}

private void DoSeed(MyProjectContext context)
{
    var users = new List<User>
    {
        new User { UserId = 1, Name = "TestUser", IsDeleted = false },
        new User { UserId = 2, Name = "TestUser2", IsDeleted = false }
    };
    users.ForEach(u => context.Users.AddOrUpdate(u));
    context.SaveChanges();
}
Again, I have not selected "Execute Code First Migrations (runs on application start)" in the publish profile.
For some more details, see this link and this link.
Hope this helps (it worked for me); otherwise, please add whatever error you get when you deploy your app.
I think the issue is that as long as you have not tried to access or create any data from/in the site (anything that needs to connect to the database), the migration and seeding will not run.
And the reason the migration runs only after logging into the site would be that your site requires authorization on all pages, or on the page whose data you want to see.
If you have a page, for example a home page, that does not require authorization, you will see the page; and if that page has some data that needs to be fetched from the database, you may see the migration run.
I found this in "Deploy it to IIS", but I am not sure it is the reason.
Please let me know if your migration still has not run after you browse to a home page that has data and needs no authentication for that data access.

How to get the TCM ID of the currently logged-in user in Tridion?

private void Subscribe()
{
    EventSystem.Subscribe<User, LoadEventArgs>(GetInfo, EventPhases.Initiated);
}

public void GetInfo(User user, LoadEventArgs args, EventPhases phase)
{
    TcmUri id = user.Id;
    string name = user.Title;
    Console.WriteLine(id.ToString());
    Console.WriteLine(name);
}
I wrote the above code and added the assembly to the config file on the Tridion server, but no console window appears when a user logs in.
The event you were initially subscribing to is the Processed phase of any identifiable object with any of its actions; that will trigger on basically every transaction happening in the SDL Tridion CMS, so it won't give you any indication of when a user logs in (it is basically everything that happens, all the time).
Probably one of the first things that happens after a user logs in is that their user info and application data are read. So what you should try is something along the lines of:
private void Subscribe()
{
    EventSystem.Subscribe<User, LoadEventArgs>(GetInfo, EventPhases.Initiated);
}

public void GetInfo(User user, LoadEventArgs args, EventPhases phase)
{
    TcmUri id = user.Id;
    string name = user.Title;
}
But do keep in mind that this will also be triggered by other actions: things like viewing history, checking publish transactions, and possibly a lot more. I don't know how you can distinguish this action as being part of a user login, since there isn't an event triggered specifically for that.
You might want to check out if you can find anything specific for a login in the LoadEventArgs for instance in its ContextVariables, EventStack, FormerLoadState or LoadFlags.
Edit:
Please note that the Event System runs inside the SDL Tridion core, so you won't ever see a console window pop up from anywhere. If you want to log information, you can include the following using statement:
using Tridion.Logging;
This works after adding a reference to Tridion.Logging.dll, which you can find in your ..\Tridion\bin\client directory. Then you can use the following logging statement in your code:
Logger.Write("message", "name", LoggingCategory.General, TraceEventType.Information);
You will find the message back in your Tridion event log (provided you have set the logging level to show information messages too).
But probably the best option here is to just debug your event system, so you can directly inspect your objects when the event is triggered. Here you can find a nice blog article about how to set up debugging of your event system.
If you want to get the TCM URI of the current user, you can do so in a number of ways.
I would recommend one of these:
1. Using the Core Service, call GetCurrentUser and read the Id property.
2. Using TOM.NET, read the User.Id property of the current Session.
It looks like you want #2 in this case, as your code is in the event system.