Do we need to close the DBCPConnectionPool connection in a custom processor, or is it handled by the Controller Service itself?

I have created a custom processor which saves some records to a MySQL database. To connect to MySQL I use a DBCPConnectionPool controller service in my custom processor, and it saves data to the database tables correctly. But I am worried about the pooling mechanism: I am not closing the connection after my save logic completes. This works for 2 or 3 flowfiles, but will it keep working correctly when I send many flowfiles?
DBCPService dbcpService = context.getProperty(DBCP_SERVICE).asControllerService(DBCPService.class);
Connection con = dbcpService.getConnection();
I am looking for clarification, as my current flow works correctly with a small number of flowfiles.

You should be returning it to the pool, most likely with a try-with-resources statement:
try (final Connection con = dbcpService.getConnection();
     final PreparedStatement st = con.prepareStatement(selectQuery)) {
    // use the statement here; when the block exits, st is closed and
    // con is returned to the pool automatically
}
You can always consult the standard processors to see what they do:
https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/AbstractExecuteSQL.java#L223
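Put together, a minimal onTrigger sketch might look like this (the SQL statement, property descriptor, and relationship names are illustrative, not taken from your processor):

@Override
public void onTrigger(final ProcessContext context, final ProcessSession session) throws ProcessException {
    FlowFile flowFile = session.get();
    if (flowFile == null) {
        return;
    }

    final DBCPService dbcpService = context.getProperty(DBCP_SERVICE)
            .asControllerService(DBCPService.class);

    // try-with-resources: closing the Connection returns it to the pool
    try (final Connection con = dbcpService.getConnection();
         final PreparedStatement st = con.prepareStatement("INSERT INTO my_table (col) VALUES (?)")) {
        st.setString(1, "value extracted from the flowfile");
        st.executeUpdate();
        session.transfer(flowFile, REL_SUCCESS);
    } catch (final SQLException e) {
        getLogger().error("Failed to save record to MySQL", e);
        session.transfer(flowFile, REL_FAILURE);
    }
}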

Related

ArcGIS offline map layer changes synchronization

In my WPF application I'm trying to use offline map functionality. Right now my feature service is configured for data sync, and I'm able to create a data replica on the server and download a local copy of the geodatabase.
_gdbSyncTask = await GeodatabaseSyncTask.CreateAsync(_featureServiceUri);
Envelope extent = new Envelope(xmin, ymin, xmax, ymax, new SpatialReference(wkidStart));
GenerateGeodatabaseParameters generateParams = await _gdbSyncTask.CreateDefaultGenerateGeodatabaseParametersAsync(extent);
_generateGdbJob = _gdbSyncTask.GenerateGeodatabase(generateParams, _gdbPath);
_generateGdbJob.JobChanged += GenerateGdbJobChanged;
_generateGdbJob.ProgressChanged += (object sender, EventArgs e) =>
{
    UpdateProgressBar();
};
_generateGdbJob.Start();
After the initial synchronization, I'm able to work with the map in offline mode. This includes operations like adding new geometries or editing existing polygons inside the local DB.
However, when I try to synchronize changes back to the server, I get no results.
To synchronize the local database back to the server, I'm using the following code:
SyncGeodatabaseParameters parameters = new SyncGeodatabaseParameters()
{
    GeodatabaseSyncDirection = SyncDirection.Bidirectional,
    RollbackOnFailure = false
};
Geodatabase gdb = await Geodatabase.OpenAsync(this.GetGdbPath());
foreach (GeodatabaseFeatureTable table in gdb.GeodatabaseFeatureTables)
{
    long id = table.ServiceLayerId;
    SyncLayerOption option = new SyncLayerOption(id);
    option.SyncDirection = SyncDirection.Bidirectional;
    parameters.LayerOptions.Add(option);
}
_gdbSyncTask = await GeodatabaseSyncTask.CreateAsync(_featureServiceUri);
SyncGeodatabaseJob job = _gdbSyncTask.SyncGeodatabase(parameters, gdb);
job.JobChanged += SyncJob_JobChanged;
job.ProgressChanged += SyncJob_ProgressChanged;
job.Start();
Everything goes well, and the synchronization ends with status "Succeeded". The messages logged by the SyncGeodatabaseJob are shown below:
[screenshot of the SyncGeodatabaseJob log messages]
However, when I open the edited feature layer from the server inside the map web client, I cannot find any of my local changes. In the server database I can also see that no new records were created during synchronization.
The interesting thing is that when I open the "Replica" data in the web client I can see the following information:
Replica Server Gen: 2
Creation Date: 2018/02/07 10:49:54 UTC
Last Sync Date: 2018/02/07 10:49:54 UTC
The "Last Sync Date" is equal to the replica "Creation Date". However, in the replica log in ArcMap I can see the following information:
[screenshot of the replica log in ArcMap]
Can anyone tell me how I should interpret the situation described above? Am I missing some steps in my code? Or maybe some configuration is missing on the server? It looks like data modifications are successfully pushed back to the replica on the server, but after that the replica is not synchronized with the server database (should that happen automatically?).
I'm new to ArcGIS development, so any help will be appreciated.
Thanks for all the answers. It turned out that versioning was enabled on the server database, and the offline, versioned changes were not reconciled to the server.
After running a reconcile/post script (http://desktop.arcgis.com/en/arcmap/10.3/manage-data/geodatabases/automate-reconcile-post-after-sync.htm), the offline changes became visible to other system users.
The code looks OK at a quick glance, so I would assume there is something going on in the setup.
What do you get back from the sync operation after it has completed? Note that you can just use await syncJob.GetResultAsync() to start the job and await the results.
How is the Feature Service set up on the server? Please refer https://enterprise.arcgis.com/en/server/latest/publish-services/linux/prepare-data-for-offline-use.htm for the different ways to set these things.
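For example, a minimal sketch of checking the results (assuming the ArcGIS Runtime .NET API, where a SyncGeodatabaseJob produces a list of SyncLayerResult; exact member names may differ between SDK versions):

// Run the job to completion and look for per-edit errors.
IReadOnlyList<SyncLayerResult> results = await job.GetResultAsync();
foreach (SyncLayerResult layerResult in results)
{
    foreach (FeatureEditResult editResult in layerResult.EditResults)
    {
        if (editResult.CompletedWithErrors)
        {
            Debug.WriteLine($"Edit on object {editResult.ObjectId} failed: {editResult.Error.Message}");
        }
    }
}

A job may report "Succeeded" even when individual edits were rejected, so this is worth checking before looking at the server.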

SimpleRoleProvider causing remote transaction inside TransactionScope

I am in the process of upgrading ASP.NET Membership to the new SimpleMembership provider in MVC 4. This is an Azure/SQL Azure app which runs fine on localhost but fails when deployed. I have code in a transaction as follows:
TransactionOptions toptions = new TransactionOptions();
toptions.IsolationLevel = System.Transactions.IsolationLevel.Serializable;
using (TransactionScope trans = new TransactionScope(TransactionScopeOption.Required, toptions))
{
    try
    {
        // ... do a bunch of database stuff in a single dbContext ...
        var roleprov = (SimpleRoleProvider)Roles.Provider;
        string[] roles = roleprov.GetRolesForUser(Username);
        // The above line fails with: The transaction manager has disabled its support
        // for remote/network transactions. (Exception from HRESULT: 0x8004D024)
    }
}
I am using this technique to populate the Roles classes. The stack trace seems to indicate that it is indeed trying to fire off a sub-transaction to complete that call. The simplemembership tables are in a different db. How can I retrieve role info from the role provider inside the context of a separate transaction?
The problem is that GetRolesForUser opens a new connection to a second database, which detects that it is inside a TransactionScope. This (see MSDN - System.Transactions Integration with SQL Server) then promotes the transaction to the DTC. You could try a few options:
Get roles before the transaction starts
You could retrieve string[] roles outside your TransactionScope. Is there a reason you need to get them inside the scope? Given that you say:
How can I retrieve role info from the role provider inside the context of a separate transaction
it sounds like you could get the role info before the TransactionScope and have no problems.
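A sketch based on the code in the question, just reordered:

// Read the roles first, outside the scope, so the provider's connection
// never sees an ambient transaction and nothing is promoted to the DTC.
var roleprov = (SimpleRoleProvider)Roles.Provider;
string[] roles = roleprov.GetRolesForUser(Username);

TransactionOptions toptions = new TransactionOptions();
toptions.IsolationLevel = System.Transactions.IsolationLevel.Serializable;
using (TransactionScope trans = new TransactionScope(TransactionScopeOption.Required, toptions))
{
    // ... do the database stuff in the single dbContext, using 'roles' as needed ...
    trans.Complete();
}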
Turn off transactions on the simple membership connection string
You can tell a connection not to take part in ambient transactions by putting "enlist=false" in the connection string (see SqlConnection.ConnectionString), so this might be an option if you never need transactions on the database you use for Simple Membership.
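For instance (every value here other than Enlist=false is a placeholder):

// Enlist=false stops this connection from joining the ambient
// System.Transactions transaction, so no DTC promotion can occur.
var conn = new SqlConnection(
    "Server=myserver;Database=SimpleMembershipDb;Integrated Security=True;Enlist=false");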
Try opening the Simple Membership connection before the transaction
For SimpleRoleProvider, it creates its database object and then opens it the first time it uses it. But it doesn't close it until... scratch that: the connection is opened on each call to GetRolesForUser, so you are out of luck. I was thinking you could call GetRolesForUser once before the TransactionScope is opened, and then again inside the scope using the already-open connection - you can't.
Play with the IObjectContextAdapter
Disclaimer: I can't promise this will work as I can't test with your setup.
You can play tricks to prevent promotion when two connection strings are involved by opening the connection for the non-transactional connection string outside the transaction scope first; then the transaction shouldn't be promoted. The same trick helps if you cause the same connection to Close and then Open inside one transaction scope (which would otherwise cause promotion).
You could try this with your context, and see if that stopped the GetRolesForUser promoting the transaction, but I doubt that would work as GetRolesForUser causes the connection to open if it isn't already. As I can't test in your scenario, I will include it in case it helps.
using (var db = new ExampleContext())
{
    var adapter = db as System.Data.Entity.Infrastructure.IObjectContextAdapter;
    using (var conn = adapter.ObjectContext.Connection)
    {
        conn.Open();
        using (TransactionScope scope = new TransactionScope(TransactionScopeOption.Required))
        {
            // perform operations
            db.SaveChanges();
            // perform more operations
            db.SaveChanges();
            // perform even more operations
            db.SaveChanges();
            // If you don't complete, the transaction won't commit and you will lose the changes
            scope.Complete();
        }
    }
}

dataSource injection in a Grails service

I have a service with application scope, not transactional.
I have a service method which:
uses the injected dataSource to create a stored procedure call (using Sql.call{...}), then executes it and traverses the result set.
Based on the result set, I subdivide it into equally sized chunks and process them in multiple threads.
Each thread tries to do Sql sql = new Sql(dataSource)
Here a deadlock occurs.
Why is that? Doesn't dataSource return a new or an idle connection?
Try looking into GPars: it's a Groovy parallelization framework.
I ran into exactly the same issue. After hours of searching I found the solution.
In your DataSource.groovy config file you can set the parameters for connection pooling to the database.
I changed the minIdle, maxIdle and maxActive settings of the http://commons.apache.org/proper/commons-dbcp/apidocs/org/apache/commons/dbcp/BasicDataSource.html so that my config file looks something like this:
dataSource {
    url = "jdbc:mysql://127.0.0.1/sipsy_dev?autoReconnect=true&zeroDateTimeBehavior=convertToNull"
    driverClassName = "com.mysql.jdbc.Driver"
    username = "sipsy_dev"
    password = "sipsy_dev"
    pooled = true
    properties {
        minEvictableIdleTimeMillis = 1800000
        timeBetweenEvictionRunsMillis = 1800000
        numTestsPerEvictionRun = 3
        testOnBorrow = true
        testWhileIdle = true
        testOnReturn = true
        minIdle = 100
        maxIdle = 250
        maxActive = 500
        validationQuery = "SELECT 1"
    }
    dialect = 'org.hibernate.dialect.MySQL5InnoDBDialect'
}
When you are not in a transaction, you have to release the connection that GroovySQL picks up from the dataSource; otherwise the pool runs out of connections, and that's why it locks up.
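A minimal sketch of that non-transactional case (the query is illustrative):

// Close the Sql object in a finally block so the underlying connection
// always goes back to the pool, even if the query throws.
Sql sql = new Sql(dataSource)
try {
    sql.eachRow('SELECT id, name FROM my_table') { row ->
        // process each row here
    }
} finally {
    sql.close()   // returns the pooled connection
}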
Inside a transaction TransactionAwareDataSourceProxy will take care of sharing the connection and therefore releasing the connection from GroovySQL isn't required in that case. See http://jira.grails.org/browse/GRAILS-5454 for details.
A better way to use GroovySQL in Grails is the following, since the OpenSessionInView (OSIV) interceptor will take care of closing the connection and it will share the same database connection as Hibernate. This method works in both cases: inside transactions and outside transactions.
Sql sql = new Sql(sessionFactory.currentSession.connection())

WebSharingAppDemo-CEProviderEndToEnd queries peerProvider for NeedsScope before any files are batched to the server. This seems out of order?

I'm building an application based on the WebSharingAppDemo-CEProviderEndToEnd sample. When I deploy the server portion on a server, the code gives the error "The path is not valid. Check the directory for the database." during the call to NeedsScope() in the CeWebSyncService.cs file.
Obviously the server can't access the client's .sdf, but what is supposed to happen to make this work? The app uses batching to send the data, and the batches have to be marshalled across to the temp directory, but this problem occurs before any files have been batched over. There is nothing for the server to look at to determine whether the peerProvider needs scope. What am I missing?
public bool NeedsScope()
{
    Log("NeedsSchema: {0}", this.peerProvider.Connection.ConnectionString);
    SqlCeSyncScopeProvisioning prov = new SqlCeSyncScopeProvisioning();
    return !prov.ScopeExists(this.peerProvider.ScopeName, (SqlCeConnection)this.peerProvider.Connection);
}
I noticed that the sample was making use of a proxy to speak with the CE file but a provider (not a proxy) to speak with the SQL Server.
I switched it so there is a proxy to reach the SQL Server and a provider to access the CE file.
That seems to work for me:
stats = synchronizationHelper.SynchronizeProviders(srcProvider, destinationProxy);
vs.
SyncOperationStatistics stats = syncHelper.SynchronizeProviders(srcProxy, destinationProvider);

How to have ADO.NET consume persisted XML ADO recordset for updates

I am working on a .NET 2.0 conversion of a multi-layer client-server application. For the time being, we're converting the server side to .NET but leaving the client apps in COM (VB6). My current work is focused on getting data to and from the detached COM-based clients.
The current version under development uses ADO (COM) recordsets persisted as XML and sent to the clients. Clients then load the XML into a recordset object for read/write access to the data. Data is sent back to the server in the same format.
Upon receipt of a persisted XML recordset for update, we do some rather kludgy parsing to produce UPDATE statements which get pushed through ADO.NET to update the source database accordingly.
In (very rough) pseudo-code:
adodb.recordset oRS = load(ADOXML)
for each rec in oRS
{
    string sSQL = "UPDATE table SET value = " + rec.ValueField + " WHERE key = " + rec.KeyField
    executeNonQuery(sSQL)
}
*DISCLAIMER: The actual logic is nowhere near this crude, but the end-result is the same.
I'm wondering if anyone has a better solution for getting the updated data back into the database?
Thanks,
Dan
Have you considered using a TableAdapter to perform the update once you have set all rows to modified?
If you have a preconfigured SQL TableAdapter with built-in update functions, you can read the table from XML into a DataSet, mark every row as modified, and then call the table adapter's Update function to push every row in the table to the central DB.
var tableAdapter = new MyTableAdapter();
var dataSet = new MyDataSet();
dataSet.ReadXml(@"c:\myxmldata.xml");

// Rows loaded by ReadXml typically arrive in the Added state; accepting the
// changes first resets them to Unchanged, so SetModified() is legal and
// Update() issues UPDATEs rather than INSERTs.
dataSet.AcceptChanges();

foreach (DataRow row in dataSet.Tables[0].Rows)
{
    row.SetModified();
}

tableAdapter.Update(dataSet);