How to create a new Principal object using Jackrabbit/JCR - jcr

I'm a new developer trying to tackle the Jackrabbit/JCR library. My team and I have been using the EveryonePrincipal for a couple of months now, but we want to implement more capabilities per user role/principal so we can grant the necessary read/write access to each node. However, we're having difficulty figuring out how to create a new Principal object.
I've been using:
PrincipalImpl newPrincipal = new PrincipalImpl("MyPrincipal");
Then I created a new RolePrincipal class modeled on EveryonePrincipal, except that the name inside the RolePrincipal class would be "MyPrincipal". Unfortunately, this approach doesn't work. Is there anything else we're missing? And how does the 'everyone' principal get stored?
EveryonePrincipal.java
public final class EveryonePrincipal implements JackrabbitPrincipal, java.security.acl.Group {
public static final String NAME = "everyone";
private static final EveryonePrincipal INSTANCE = new EveryonePrincipal();
private EveryonePrincipal() { }
public static EveryonePrincipal getInstance() {
return INSTANCE;
}
//----------------------------------------------------------< Principal >---
@Override
public String getName() {
return NAME;
}
//--------------------------------------------------------------< Group >---
@Override
public boolean addMember(Principal user) {
return false;
}
@Override
public boolean removeMember(Principal user) {
throw new UnsupportedOperationException("Cannot remove a member from the everyone group.");
}
@Override
public boolean isMember(Principal member) {
return !member.equals(this);
}
@Override
public Enumeration<? extends Principal> members() {
throw new UnsupportedOperationException("Not implemented.");
}
//-------------------------------------------------------------< Object >---
@Override
public int hashCode() {
return NAME.hashCode();
}
@Override
public boolean equals(Object obj) {
if (obj == this) {
return true;
} else if (obj instanceof JackrabbitPrincipal && obj instanceof Group) {
JackrabbitPrincipal other = (JackrabbitPrincipal) obj;
return NAME.equals(other.getName());
}
return false;
}
@Override
public String toString() {
return NAME + " principal";
}
}
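For context on how new principals are normally introduced: in Jackrabbit, custom principals are usually backed by the user-management API (users and groups) rather than instantiated as free-standing Principal objects, and the group's principal is then referenced from access control entries. Below is a rough sketch of that pattern using the Jackrabbit UserManager together with the standard JCR 2.0 access-control API; the group ID, node path, and privilege set are illustrative and not taken from the original post.
import java.security.Principal;

import javax.jcr.Session;
import javax.jcr.security.AccessControlList;
import javax.jcr.security.AccessControlManager;
import javax.jcr.security.AccessControlPolicy;
import javax.jcr.security.AccessControlPolicyIterator;
import javax.jcr.security.Privilege;

import org.apache.jackrabbit.api.JackrabbitSession;
import org.apache.jackrabbit.api.security.user.Group;
import org.apache.jackrabbit.api.security.user.UserManager;

public class PrincipalSetupSketch {

    public static void grantGroupAccess(Session session) throws Exception {
        // Create (or look up) a group; its principal is what the ACL will reference.
        UserManager userManager = ((JackrabbitSession) session).getUserManager();
        Group group = (Group) userManager.getAuthorizable("MyPrincipal");
        if (group == null) {
            group = userManager.createGroup("MyPrincipal");
        }
        Principal principal = group.getPrincipal();

        // Find an existing or applicable access control list for the node.
        String path = "/content/myNode";
        AccessControlManager acm = session.getAccessControlManager();
        AccessControlList acl = null;
        for (AccessControlPolicy policy : acm.getPolicies(path)) {
            if (policy instanceof AccessControlList) {
                acl = (AccessControlList) policy;
            }
        }
        if (acl == null) {
            AccessControlPolicyIterator it = acm.getApplicablePolicies(path);
            while (it.hasNext() && acl == null) {
                AccessControlPolicy policy = it.nextAccessControlPolicy();
                if (policy instanceof AccessControlList) {
                    acl = (AccessControlList) policy;
                }
            }
        }
        if (acl == null) {
            throw new IllegalStateException("No modifiable ACL available at " + path);
        }

        // Grant jcr:read and jcr:write on the node to the group's principal.
        Privilege[] privileges = {
                acm.privilegeFromName(Privilege.JCR_READ),
                acm.privilegeFromName(Privilege.JCR_WRITE)
        };
        acl.addAccessControlEntry(principal, privileges);
        acm.setPolicy(path, acl);
        session.save();
    }
}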

Related

How do you write data into a Redis custom state store using Kafka Streams

I've recently been learning about how to use the Kafka Streams client and one thing that I've been struggling with is how to switch from the default state store (RocksDB) to a custom state store using something like Redis. The Confluent documentation makes it clear you have to implement the StateStore interface for your custom store and you must provide an implementation of StoreBuilder for creating instances of that store.
Here's what I have so far for my custom store. I've also added a simple write method to append a new entry into a specified stream via the Redis XADD command.
public class MyCustomStore<K,V> implements StateStore, MyWriteableCustomStore<K,V> {
private String name;
private volatile boolean open = false;
private boolean loggingEnabled = false;
public MyCustomStore(String name, boolean loggingEnabled) {
this.name = name;
this.loggingEnabled = loggingEnabled;
}
@Override
public String name() {
return this.name;
}
@Override
public void init(ProcessorContext context, StateStore root) {
if (root != null) {
// register the store
context.register(root, (key, value) -> {
write(key.toString(), value.toString());
});
}
this.open = true;
}
@Override
public void flush() {
// TODO Auto-generated method stub
}
@Override
public void close() {
// TODO Auto-generated method stub
}
@Override
public boolean persistent() {
// TODO Auto-generated method stub
return true;
}
@Override
public boolean isOpen() {
// TODO Auto-generated method stub
return false;
}
@Override
public void write(String key, String value) {
try(Jedis jedis = new Jedis("localhost", 6379)) {
Map<String, String> hash = new HashMap<>();
hash.put(key, value);
jedis.xadd("MyStream", StreamEntryID.NEW_ENTRY, hash);
}
}
}
public class MyCustomStoreBuilder implements StoreBuilder<MyCustomStore<String,String>> {
private boolean cached = true;
private String name;
private Map<String,String> logConfig=new HashMap<>();
private boolean loggingEnabled;
public MyCustomStoreBuilder(String name, boolean loggingEnabled){
this.name = name;
this.loggingEnabled = loggingEnabled;
}
@Override
public StoreBuilder<MyCustomStore<String,String>> withCachingEnabled() {
this.cached = true;
return this;
}
@Override
public StoreBuilder<MyCustomStore<String,String>> withCachingDisabled() {
this.cached = false;
return this;
}
@Override
public StoreBuilder<MyCustomStore<String,String>> withLoggingEnabled(Map<String, String> config) {
loggingEnabled=true;
return this;
}
@Override
public StoreBuilder<MyCustomStore<String,String>> withLoggingDisabled() {
this.loggingEnabled = false;
return this;
}
@Override
public MyCustomStore<String,String> build() {
return new MyCustomStore<String,String>(this.name, this.loggingEnabled);
}
@Override
public Map<String, String> logConfig() {
return logConfig;
}
@Override
public boolean loggingEnabled() {
return loggingEnabled;
}
@Override
public String name() {
return name;
}
}
And here's what my setup and topology look like.
@Bean
public KafkaStreams kafkaStreams(KafkaProperties kafkaProperties) {
final Properties props = new Properties();
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaProperties.getBootstrapServers());
props.put(StreamsConfig.APPLICATION_ID_CONFIG, appName);
props.put(AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, schemaRegistryUrl);
props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Long().getClass());
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Double().getClass());
props.put(StreamsConfig.STATE_DIR_CONFIG, "data");
props.put(StreamsConfig.APPLICATION_SERVER_CONFIG, appServerConfig);
props.put(JsonDeserializer.VALUE_DEFAULT_TYPE, JsonNode.class);
props.put(DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG, LogAndContinueExceptionHandler.class);
props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
final String storeName = "the-custome-store";
Topology topology = new Topology();
// Create CustomStoreSupplier for store name the-custom-store
MyCustomStoreBuilder customStoreBuilder = new MyCustomStoreBuilder(storeName, false);
topology.addSource("input","inputTopic");
topology.addProcessor("redis-processor", () -> new RedisProcessor(storeName), "input");
topology.addStateStore(customStoreBuilder, "redis-processor");
KafkaStreams kafkaStreams = new KafkaStreams(topology, props);
kafkaStreams.start();
return kafkaStreams;
}
public class MyCustomStoreType<K,V> implements QueryableStoreType<MyReadableCustomStore<String,String>> {
@Override
public boolean accepts(StateStore stateStore) {
return stateStore instanceof MyCustomStore;
}
@Override
public MyReadableCustomStore<String,String> create(final StateStoreProvider storeProvider, final String storeName) {
return new MyCustomStoreTypeWrapper<>(storeProvider, storeName, this);
}
}
public class MyCustomStoreTypeWrapper<K,V> implements MyReadableCustomStore<K,V> {
private final QueryableStoreType<MyReadableCustomStore<String, String>> customStoreType;
private final String storeName;
private final StateStoreProvider provider;
public MyCustomStoreTypeWrapper(final StateStoreProvider provider,
final String storeName,
final QueryableStoreType<MyReadableCustomStore<String, String>> customStoreType) {
this.provider = provider;
this.storeName = storeName;
this.customStoreType = customStoreType;
}
@Override
public String read(String key) {
try (Jedis jedis = new Jedis("localhost", 6379)) {
StreamEntryID start = new StreamEntryID(0, 0);
StreamEntryID end = null; // null -> until the last item in the stream
int count = 2;
List<StreamEntry> list = jedis.xrange("MyStream", start, end, count);
if (list != null) {
// Get the most recently added item, which is also the last item
StreamEntry streamData = list.get(list.size() - 1);
return streamData.toString();
} else {
System.out.println("No new data in the stream");
}
return "";
}
}
}
// This throws the InvalidStateStoreException when I try to get access to the custom store
MyReadableCustomStore<String,String> store = streams.store("the-custome-store", new MyCustomStoreType<String,String>());
String value = store.read("testKey");
So, my question is how do I actually get the state store data to persist into Redis now? I feel like I'm missing something in the state store initialization or with the StateRestoreCallback. Any help or clarification with this would be greatly appreciated.
It looks to me like you have the store wired up to the topology correctly, but you don't have any processors using the store.
It could look something like this:
final String storeName = "the-custome-store";
MyCustomStoreBuilder customStoreBuilder = new MyCustomStoreBuilder(storeName, false);
Topology topology = new Topology();
topology.addSource("input", "input-topic");
// makes the processor a child of the source node
// the source node forwards its records to the child processor node
topology.addProcessor("redis-processor", () -> new RedisProcessor(storeName), "input");
// add the store and specify the processor(s) that access the store
topology.addStateStore(customStoreBuilder, "redis-processor");
class RedisProcessor implements Processor<byte[], byte[]> {
final String storeName;
MyCustomStore<byte[],byte[]> stateStore;
public RedisProcessor(String storeName) {
this.storeName = storeName;
}
@Override
public void init(ProcessorContext context) {
stateStore = (MyCustomStore<byte[], byte[]>) context.getStateStore(storeName);
}
@Override
public void process(byte[] key, byte[] value) {
stateStore.write(key, value);
}
@Override
public void close() {
}
}
HTH, and let me know how it works out for you.
Update to answer from comments:
I think you need to update MyCustomStore.isOpen() to return the open variable.
Right now it's hardcoded to return false
@Override
public boolean isOpen() {
// TODO Auto-generated method stub
return false;
}
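For example, a minimal fix is to return the open flag that init() already sets (a small sketch based on the store code above; close() should also set the flag back to false):
@Override
public boolean isOpen() {
    // Report the flag maintained in init() instead of a hard-coded false.
    return this.open;
}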

Issue with editing in TableViewer

I created a TableViewer with two columns. The first column is named "Firstname" and the second "Last name". I added editing support to both columns, but I am only able to edit/select in the first column; in the second column I cannot edit or select at all. I don't know why. Could someone please help me? The code snippet follows.
public class ViewPart1 extends ViewPart {
public ViewPart1() {
// TODO Auto-generated constructor stub
}
private ResourceManager resourceManager = new LocalResourceManager(
JFaceResources.getResources());
@Override
public void createPartControl(Composite parent) {
// re-use an existing image
final Image image = FieldDecorationRegistry.getDefault()
.getFieldDecoration(FieldDecorationRegistry.DEC_INFORMATION)
.getImage();
TableViewer tblView = new TableViewer(parent);
final Table table = tblView.getTable();
table.setHeaderVisible(true);
table.setLinesVisible(true);
TableViewerColumn fn = new TableViewerColumn(tblView, SWT.BORDER, 0);
fn.getColumn().setWidth(150);
fn.getColumn().setText("Firstname");
fn.setLabelProvider(new ColumnLabelProvider() {
@Override
public String getText(Object element) {
Person p = (Person) element;
return p.getFirstName();
}
@Override
public Image getImage(Object element) {
return image;
}
});
// fn.setEditingSupport(new EditColumn(tblView));
fn = new TableViewerColumn(tblView, SWT.BORDER, 1);
fn.getColumn().setWidth(150);
fn.getColumn().setText("Last name");
fn.setLabelProvider(new ColumnLabelProvider() {
@Override
public String getText(Object element) {
Person p = (Person) element;
return p.getLastNAme();
}
@Override
public Image getImage(Object element) {
return image;
}
});
// fn.setEditingSupport(new EditColumn(tblView));
tblView.setContentProvider(new QContentProvider());
ArrayList<Person> list = new ArrayList<Person>();
list.add(new Person("a", "b"));
list.add(new Person("C", "D"));
tblView.setInput(list);
tblView.refresh();
}
@Override
public void setFocus() {
// TODO Auto-generated method stub
}
}
Below is the EditColumn class code:
public class EditColumn extends EditingSupport {
private final TableViewer viewer;
private final CellEditor editor;
public EditColumn(TableViewer viewer) {
super(viewer);
this.viewer = viewer;
this.editor = new TextCellEditor(viewer.getTable());
}
@Override
protected boolean canEdit(Object element) {
System.out.println("can edit");
return true;
}
@Override
protected CellEditor getCellEditor(Object element) {
return editor;
}
@Override
protected Object getValue(Object element) {
// TODO Auto-generated method stub
return ((Person) element).getFirstName();
}
@Override
protected void setValue(Object element, Object value) {
((Person) element).setFirstName(String.valueOf(value));
viewer.update(element, null);
}
}
Your EditColumn class always uses the getFirstName and setFirstName methods of Person, so although you can edit the last-name column you are not using the correct values.
You need to use a different EditingSupport class for each column.
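For example, a second editing-support class for the last-name column could mirror the posted EditColumn (a sketch; the setLastName setter on Person is assumed, since only the getter appears in the post):
public class LastNameEditColumn extends EditingSupport {
    private final TableViewer viewer;
    private final CellEditor editor;

    public LastNameEditColumn(TableViewer viewer) {
        super(viewer);
        this.viewer = viewer;
        this.editor = new TextCellEditor(viewer.getTable());
    }

    @Override
    protected boolean canEdit(Object element) {
        return true;
    }

    @Override
    protected CellEditor getCellEditor(Object element) {
        return editor;
    }

    @Override
    protected Object getValue(Object element) {
        // Read the last name instead of the first name.
        return ((Person) element).getLastNAme();
    }

    @Override
    protected void setValue(Object element, Object value) {
        ((Person) element).setLastName(String.valueOf(value));
        viewer.update(element, null);
    }
}
Then set editing support on each column in createPartControl, e.g. new EditColumn(tblView) for the first column and new LastNameEditColumn(tblView) for the second, via setEditingSupport.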

Using NHibernate interceptor together with Ninject to retrieve the logged in user

I was reading this article and found it quite interesting (thanks @Aaronaught). It was the closest I came to solving my problem.
The only detail is that in my case I use an NHibernate interceptor, and an exception is thrown: An unhandled exception of type 'System.StackOverflowException' occurred in System.Core.dll
Code
Session factory:
public class SessionFactoryBuilder : IProvider
{
private ISessionFactory _sessionFactory;
private readonly Configuration _configuration;
public SessionFactoryBuilder(AuditInterceptor auditInterceptor)
{
_configuration = Fluently.Configure(new Configuration().Configure())
.Mappings(m => m.AutoMappings.Add(AutoMap.AssemblyOf<IEntidade>(new AutomappingConfiguration())))
.ExposeConfiguration(SetupDatabase)
.BuildConfiguration();
_configuration.SetInterceptor(auditInterceptor);
_sessionFactory = _configuration.BuildSessionFactory();
}
private static void SetupDatabase(Configuration config)
{
var schema = new SchemaExport(config);
//schema.Execute(true, true, false);
}
public object Create(IContext context)
{
return _sessionFactory;
}
public Type Type
{
get { return typeof(ISessionFactory); }
}
}
I have a module that sets up my repositories and ORM (NHibernate)
public class RepositoriosModule : NinjectModule
{
public override void Load()
{
Bind<AuditInterceptor>().ToSelf().InRequestScope();
// NHibernate
Bind<ISessionFactory>().ToProvider<SessionFactoryBuilder>().InSingletonScope();
Bind<ISession>().ToMethod(CreateSession).InRequestScope();
Bind<NHUnitOfWork>().ToSelf().InRequestScope();
//Model Repositories
Bind<IRepositorio<Usuario>, IUsuariosRepositorio>().To<UsuariosRepositorio>().InRequestScope();
}
private ISession CreateSession(IContext context)
{
return context.Kernel.Get<ISessionFactory>().OpenSession();
}
}
Interceptor to update the auditable properties (CriadoEm (created at), CriadoPor (created by), AtualizadoEm (updated at) and AtualizadoPor (updated by)):
public class AuditInterceptor : EmptyInterceptor
{
private readonly IUsuario _usuarioLogado;
public AuditInterceptor(IUsuario usuarioLogado)
{
_usuarioLogado = usuarioLogado;
}
public override bool OnFlushDirty(object entity, object id, object[] currentState, object[] previousState, string[] propertyNames, NHibernate.Type.IType[] types)
{
var auditableObject = entity as IAuditavel;
if (auditableObject != null)
{
currentState[Array.IndexOf(propertyNames, "AtualizadoEm")] = DateTime.Now;
return true;
}
return false;
}
public override bool OnSave(object entity, object id, object[] state, string[] propertyNames, NHibernate.Type.IType[] types)
{
var auditableObject = entity as IAuditavel;
if (auditableObject != null)
{
var currentDate = DateTime.Now;
state[Array.IndexOf(propertyNames, "CriadoEm")] = currentDate;
return true;
}
return false;
}
}
A provider to retrieve the logged in user:
public class UsuarioProvider : Provider<Usuario>
{
private Usuario _usuario;
protected override Usuario CreateInstance(IContext context)
{
var usuariosRepositorio = context.Kernel.Get<IUsuariosRepositorio>(); // Stackoverflow on this line!!
if (_usuario == null && WebSecurity.IsAuthenticated)
_usuario = usuariosRepositorio.Get(WebSecurity.CurrentUserId);
return _usuario;
}
}
And the NinjectWebCommon class (web application) defines:
private static void RegisterServices(IKernel kernel)
{
kernel.Bind<IUsuario>().ToProvider<UsuarioProvider>().InRequestScope(); //.When((req) => WebSecurity.IsAuthenticated)
kernel.Load(new RepositoriosModule(), new MvcSiteMapProviderModule());
}
[Add] Repository class
public class UsuariosRepositorio : Repositorio<Usuario>, IUsuariosRepositorio
{
public UsuariosRepositorio(NHUnitOfWork unitOfWork)
: base(unitOfWork)
{ }
}
public class Repositorio<T> : IRepositorio<T>
where T : class, IEntidade
{
private readonly NHUnitOfWork _unitOfWork;
public IUnitOfWork UnitOfWork { get { return _unitOfWork; } }
private readonly ISession _session;
public Repositorio(IUnitOfWork unitOfWork)
{
_unitOfWork = (NHUnitOfWork)unitOfWork;
_session = _unitOfWork.Context.SessionFactory.GetCurrentSession();
}
public void Remover(T obj)
{
_session.Delete(obj);
}
public void Armazenar(T obj)
{
_session.SaveOrUpdate(obj);
}
public IQueryable<T> All()
{
return _session.Query<T>();
}
public object Get(Type entity, int id)
{
return _session.Get(entity, id);
}
public T Get(Expression<Func<T, bool>> expression)
{
return Query(expression).SingleOrDefault();
}
public T Get(int id)
{
return _session.Get<T>(id);
}
public IQueryable<T> Query(Expression<Func<T, bool>> expression)
{
return All().Where(expression);
}
}
Problem
The problem occurs in the class UsuarioProvider while trying to retrieve the user repository.
Stackoverflow error:
An unhandled exception of type 'System.StackOverflowException' occurred in System.Core.dll
I see two problems:
The main problem is that SessionFactoryBuilder needs an AuditInterceptor, which needs an IUsuario, which needs a UsuarioProvider, which needs a SessionFactoryBuilder, introducing a cycle and hence a stack overflow.
The second problem is that your AuditInterceptor is bound per request while your SessionFactoryBuilder is singleton-scoped. I must confess I can't see how that would work with several logged-in users.
You should instantiate and attach the AuditInterceptor as part of CreateSession, instead of trying to create it once and for all as part of the session factory builder. Once this is done, your interceptor should not rely on a session that itself needs an AuditInterceptor as part of its creation (you may need a separate session-creation mechanism for that; a stateless session might do the trick).

Vaadin - Lazy Query Container

I'm doing my project in Vaadin 7. I need to implement a Lazy Query Container for a Treetable. I will get data for the Treetable from a web service.
Could someone please show how to use a Lazy Query Container with a web service as my data source?
Please let me know the steps required to implement this or show sample code to get me started.
There is good documentation for LQC here: https://vaadin.com/wiki/-/wiki/Main/Lazy%20Query%20Container
The examples in the documentation implement a MovieQuery using the javax.persistence API, but it might be easier to use the simple MockQuery example as a basis and replace the actual data fetching with web-service calls.
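For example, a web-service-backed query might look roughly like the sketch below. It assumes the add-on's Query interface as I recall it (method names may differ between LQC versions), and MyWebService/MyRow are placeholders for your own service client and row type; you would still need a matching QueryFactory and QueryDefinition to build the container.
import java.util.ArrayList;
import java.util.List;

import org.vaadin.addons.lazyquerycontainer.Query;

import com.vaadin.data.Item;
import com.vaadin.data.util.BeanItem;

public class WebServiceQuery implements Query {

    private final MyWebService service; // hypothetical web-service client

    public WebServiceQuery(MyWebService service) {
        this.service = service;
    }

    @Override
    public int size() {
        // Total row count so the container knows how many items it can page through.
        return service.countRows();
    }

    @Override
    public List<Item> loadItems(int startIndex, int count) {
        // Fetch only the requested window of rows from the web service.
        List<Item> items = new ArrayList<Item>();
        for (MyRow row : service.fetchRows(startIndex, count)) {
            items.add(new BeanItem<MyRow>(row));
        }
        return items;
    }

    @Override
    public void saveItems(List<Item> addedItems, List<Item> modifiedItems, List<Item> removedItems) {
        throw new UnsupportedOperationException("Read-only container");
    }

    @Override
    public Item constructItem() {
        throw new UnsupportedOperationException("Read-only container");
    }

    @Override
    public boolean deleteAllItems() {
        throw new UnsupportedOperationException("Read-only container");
    }
}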
Have a look at the following lazy-loading implementation of the Hierarchical interface. All data is read from a web service, IViewService. The example uses the Tree component, but it also works for TreeTable.
It's very important to store all elements in a local structure (in my case the HashMap hierarchy) and not read elements multiple times; that does not work, I think because Vaadin does not use equals() and hashCode().
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import com.softmodeler.common.CommonPlugin;
import com.softmodeler.model.OutputNode;
import com.softmodeler.service.IViewService;
import com.vaadin.data.Container.Hierarchical;
import com.vaadin.data.Item;
import com.vaadin.data.Property;
import com.vaadin.data.util.BeanItem;
/**
* @author Flavio Donzé
* @version 1.0
*/
public class OutputNodeHierachical implements Hierarchical {
private static final long serialVersionUID = 8289589835030184018L;
/** the view service */
private IViewService service = CommonPlugin.getService(IViewService.class);
/** collection of all root nodes */
private List<OutputNode> rootNodes = null;
/** parent=>children mapping */
private Map<OutputNode, List<OutputNode>> hierarchy = new HashMap<>();
/**
* constructor
*
* @param rootNodes collection of all root nodes
*/
public OutputNodeHierachical(List<OutputNode> rootNodes) {
this.rootNodes = Collections.unmodifiableList(rootNodes);
addToHierarchy(rootNodes);
}
@Override
public Collection<?> getChildren(Object itemId) {
try {
List<OutputNode> children = hierarchy.get(itemId);
if (children == null) {
OutputNode node = (OutputNode) itemId;
children = service.getChildren(node.getNodeId(), false);
hierarchy.put(node, children);
// add children to hierarchy, their children will be added on click
addToHierarchy(children);
}
return children;
} catch (Exception e) {
VaadinUtil.handleException(e);
}
return null;
}
/**
* add each element to the hierarchy without their children hierarchy(child=>null)
*
* @param children elements to add
*/
private void addToHierarchy(List<OutputNode> children) {
for (OutputNode child : children) {
hierarchy.put(child, null);
}
}
@Override
public boolean areChildrenAllowed(Object itemId) {
return !((OutputNode) itemId).getChilds().isEmpty();
}
@Override
public boolean hasChildren(Object itemId) {
return !((OutputNode) itemId).getChilds().isEmpty();
}
@Override
public Object getParent(Object itemId) {
String parentId = ((OutputNode) itemId).getParentId();
for (OutputNode node : hierarchy.keySet()) {
if (node.getNodeId().equals(parentId)) {
return node;
}
}
return null;
}
@Override
public Collection<?> rootItemIds() {
return rootNodes;
}
@Override
public boolean isRoot(Object itemId) {
return rootNodes.contains(itemId);
}
@Override
public Item getItem(Object itemId) {
return new BeanItem<OutputNode>((OutputNode) itemId);
}
@Override
public boolean containsId(Object itemId) {
return hierarchy.containsKey(itemId);
}
@Override
public Collection<?> getItemIds() {
return hierarchy.keySet();
}
@Override
public int size() {
return hierarchy.size();
}
@Override
public boolean setParent(Object itemId, Object newParentId) throws UnsupportedOperationException {
throw new UnsupportedOperationException();
}
@Override
public boolean setChildrenAllowed(Object itemId, boolean areChildrenAllowed) throws UnsupportedOperationException {
throw new UnsupportedOperationException();
}
@Override
public Item addItem(Object itemId) throws UnsupportedOperationException {
throw new UnsupportedOperationException();
}
@Override
public Object addItem() throws UnsupportedOperationException {
throw new UnsupportedOperationException();
}
@Override
public boolean removeItem(Object itemId) throws UnsupportedOperationException {
throw new UnsupportedOperationException();
}
@Override
public boolean removeAllItems() throws UnsupportedOperationException {
throw new UnsupportedOperationException();
}
@Override
public Class<?> getType(Object propertyId) {
throw new UnsupportedOperationException();
}
@Override
public Collection<?> getContainerPropertyIds() {
throw new UnsupportedOperationException();
}
@Override
public Property<?> getContainerProperty(Object itemId, Object propertyId) {
throw new UnsupportedOperationException();
}
@Override
public boolean addContainerProperty(Object propertyId, Class<?> type, Object defaultValue) throws UnsupportedOperationException {
throw new UnsupportedOperationException();
}
@Override
public boolean removeContainerProperty(Object propertyId) throws UnsupportedOperationException {
throw new UnsupportedOperationException();
}
}
Adding the container to the Tree like this:
OutputNodeHierachical dataSource = new OutputNodeHierachical(rootNodes);
Tree mainTree = new Tree();
mainTree.setSizeFull();
mainTree.setContainerDataSource(dataSource);
mainTree.addItemClickListener(new ItemClickListener() {
private static final long serialVersionUID = -413371711541672605L;
@Override
public void itemClick(ItemClickEvent event) {
OutputNode node = (OutputNode) event.getItemId();
openObject(node.getObjectId());
}
});

Using RavenDB with ServiceStack

I read this post by Phillip Haydon about how to use NHibernate/RavenDB with ServiceStack.
I don't see the point of getting the IDocumentStore and opening a new session every time I need something from the database, like this:
public class FooService : ServiceBase<Foo>
{
public IDocumentStore RavenStore{ get; set; }
protected override object Run(ProductFind request)
{
using (var session = RavenStore.OpenSession())
{
// Do Something...
return new FooResponse{/*Object init*/};
}
}
}
Why can't I just use one session per request and, when the request ends, commit the changes or roll them back according to the response status?
If my approach is fine, then how can I implement it?
Here is my attempt:
I created this class:
public class RavenSession : IRavenSession
{
#region Data Members
private readonly IDocumentStore _store;
private IDocumentSession _innerSession;
#endregion
#region Properties
public IDocumentSession InnerSession
{
get { return _innerSession ?? (_innerSession = _store.OpenSession()); }
}
#endregion
#region Ctor
public RavenSession(IDocumentStore store)
{
_store = store;
}
#endregion
#region Public Methods
public void Commit()
{
if (_innerSession != null)
{
try
{
InnerSession.SaveChanges();
}
finally
{
InnerSession.Dispose();
}
}
}
public void Rollback()
{
if (_innerSession != null)
{
InnerSession.Dispose();
}
}
#endregion
#region IDocumentSession Delegation
public ISyncAdvancedSessionOperation Advanced
{
get { return InnerSession.Advanced; }
}
public void Delete<T>(T entity)
{
InnerSession.Delete(entity);
}
public ILoaderWithInclude<object> Include(string path)
{
return InnerSession.Include(path);
}
public ILoaderWithInclude<T> Include<T, TInclude>(Expression<Func<T, object>> path)
{
return InnerSession.Include<T, TInclude>(path);
}
public ILoaderWithInclude<T> Include<T>(Expression<Func<T, object>> path)
{
return InnerSession.Include(path);
}
public T Load<T>(string id)
{
return InnerSession.Load<T>(id);
}
public T[] Load<T>(params string[] ids)
{
return InnerSession.Load<T>(ids);
}
public T Load<T>(ValueType id)
{
return InnerSession.Load<T>(id);
}
public T[] Load<T>(IEnumerable<string> ids)
{
return InnerSession.Load<T>(ids);
}
public IRavenQueryable<T> Query<T, TIndexCreator>() where TIndexCreator : AbstractIndexCreationTask, new()
{
return InnerSession.Query<T, TIndexCreator>();
}
public IRavenQueryable<T> Query<T>()
{
return InnerSession.Query<T>();
}
public IRavenQueryable<T> Query<T>(string indexName)
{
return InnerSession.Query<T>(indexName);
}
public void Store(dynamic entity, string id)
{
InnerSession.Store(entity, id);
}
public void Store(object entity, Guid etag, string id)
{
InnerSession.Store(entity, etag, id);
}
public void Store(object entity, Guid etag)
{
InnerSession.Store(entity, etag);
}
public void Store(dynamic entity)
{
InnerSession.Store(entity);
}
#endregion
}
And now my service looks like this:
public class FooService : ServiceBase<Foo>
{
public IRavenSession RavenSession { get; set; }
protected override object Run(ProductFind request)
{
// Do Something with RavenSession...
return new FooResponse {/*Object init*/};
}
}
But I still need a way to know when the request has ended so I can commit or roll back the changes.
The best way I found is by using ResponseFilters:
public class AppHost : AppHostBase
{
public AppHost()
: base("", typeof (Foo).Assembly, typeof (FooService).Assembly)
{
}
public override void Configure(Container container)
{
// Some Configuration...
this.ResponseFilters.Add((httpReq, httpResp, respnseDto) =>
{
var currentSession = (RavenSession) this.Container.Resolve<IRavenSession>();
if (!httpResp.IsErrorResponse())
{
currentSession.Commit();
}
else
{
currentSession.Rollback();
}
});
// Some Configuration...
}
}
I am sure there is a better way to do this, but how?
I just included this in the Configure method of the AppHost:
var store = new DocumentStore()
{
Url = "http://127.0.0.1:8080",
DefaultDatabase = "Test"
}.Initialize();
container.Register(store);
container.Register(c => c.Resolve<IDocumentStore>().OpenSession()).ReusedWithin(ReuseScope.Request);
You can put this in a separate module and initialize it there.
Then in your services just add a constructor that accepts an IDocumentSession:
public class HelloService : Service {
private readonly IDocumentSession session;
public HelloService(IDocumentSession session) {
this.session = session;
}
}
And you're good to go.
Filtering the response in ServiceStack
The ways to introspect the response in ServiceStack are:
The Response Filter or Response Filter Attributes or other custom hooks
Overriding AppHost.ServiceExceptionHandler or custom OnAfterExecute() hook
Some other notes that might be helpful:
ServiceStack's built-in IOC (Funq) now supports RequestScope
You can add IDisposable to a base class; Dispose() gets called immediately after the service has finished executing. For example, if you were using an RDBMS:
public class FooServiceBase : IService, IDisposable
{
public IDbConnectionFactory DbFactory { get; set; }
private IDbConnection db;
public IDbConnection Db
{
get { return db ?? (db = DbFactory.OpenDbConnection()); }
}
public object Any(ProductFind request)
{
return new FooResponse {
Result = Db.Id<Product>(request.Id)
};
}
public void Dispose()
{
if (db != null) db.Dispose();
}
}
I tried the answer given by Felipe Leusin, but it did not work for me. The main thing I want to achieve is a single DocumentSession.SaveChanges call per request. After looking at the RacoonBlog DocumentSession lifecycle management and at the ServiceStack request lifecycle events, I put together a configuration that works for me:
public override void Configure(Funq.Container container)
{
RequestFilters.Add((httpReq, httpRes, requestDto) =>
{
IDocumentSession documentSession = Container.Resolve<IDocumentStore>().OpenSession();
Container.Register<IDocumentSession>(documentSession);
});
ResponseFilters.Add((httpReq, httpRes, requestDto) =>
{
using (var documentSession = Container.Resolve<IDocumentSession>())
{
if (documentSession == null)
return;
if (httpRes.StatusCode >= 400 && httpRes.StatusCode < 600)
return;
documentSession.SaveChanges();
}
});
var documentStore = new DocumentStore
{
ConnectionStringName = "RavenDBServer",
DefaultDatabase = "MyDatabase",
}.Initialize();
container.Register(documentStore);
}
I am using Funq with RequestScope for my RavenSession, and I have now updated it to:
public class RavenSession : IRavenSession, IDisposable
{
#region Data Members
private readonly IDocumentStore _store;
private readonly IRequestContext _context;
private IDocumentSession _innerSession;
#endregion
#region Properties
public IDocumentSession InnerSession
{
get { return _innerSession ?? (_innerSession = _store.OpenSession()); }
}
#endregion
#region Ctor
public RavenSession(IDocumentStore store, IRequestContext context)
{
_store = store;
_context = context;
}
#endregion
#region IDocumentSession Delegation
public ISyncAdvancedSessionOperation Advanced
{
get { return InnerSession.Advanced; }
}
public void Delete<T>(T entity)
{
InnerSession.Delete(entity);
}
public ILoaderWithInclude<object> Include(string path)
{
return InnerSession.Include(path);
}
public ILoaderWithInclude<T> Include<T, TInclude>(Expression<Func<T, object>> path)
{
return InnerSession.Include<T, TInclude>(path);
}
public ILoaderWithInclude<T> Include<T>(Expression<Func<T, object>> path)
{
return InnerSession.Include(path);
}
public T Load<T>(string id)
{
return InnerSession.Load<T>(id);
}
public T[] Load<T>(params string[] ids)
{
return InnerSession.Load<T>(ids);
}
public T Load<T>(ValueType id)
{
return InnerSession.Load<T>(id);
}
public T[] Load<T>(IEnumerable<string> ids)
{
return InnerSession.Load<T>(ids);
}
public IRavenQueryable<T> Query<T, TIndexCreator>() where TIndexCreator : AbstractIndexCreationTask, new()
{
return InnerSession.Query<T, TIndexCreator>();
}
public IRavenQueryable<T> Query<T>()
{
return InnerSession.Query<T>();
}
public IRavenQueryable<T> Query<T>(string indexName)
{
return InnerSession.Query<T>(indexName);
}
public void Store(dynamic entity, string id)
{
InnerSession.Store(entity, id);
}
public void Store(object entity, Guid etag, string id)
{
InnerSession.Store(entity, etag, id);
}
public void Store(object entity, Guid etag)
{
InnerSession.Store(entity, etag);
}
public void Store(dynamic entity)
{
InnerSession.Store(entity);
}
#endregion
#region Implementation of IDisposable
public void Dispose()
{
if (_innerSession != null)
{
var httpResponse = _context.Get<IHttpResponse>();
try
{
if (!httpResponse.IsErrorResponse())
{
_innerSession.SaveChanges();
}
}
finally
{
_innerSession.Dispose();
}
}
}
#endregion
}
But this would not work, because:
1) although I am using RequestScope, nothing registers the IRequestContext of the request, so Funq can't resolve my RavenSession;
2) Funq does not call the Dispose method after the request is done, which is odd.