LDAPException size limit exceeded - ldap

I am using the UnboundID LDAP SDK to execute LDAP queries. I am facing a strange problem while running an LDAP search query: I get an exception when I run the query against a group that contains 50k entries. My exception:
LDAPException(resultCode=4 (size limit exceeded), errorMessage='size limit exceeded')
at com.unboundid.ldap.sdk.migrate.ldapjdk.LDAPSearchResults.nextElement(LDAPSearchResults.java:254)
at com.unboundid.ldap.sdk.migrate.ldapjdk.LDAPSearchResults.next(LDAPSearchResults.java:279)
Now the strange thing is that I have already set the max result size to 100k in the search constraints, so why am I getting this error?
My code is:
ld = new LDAPConnection();
ld.connect(ldapServer, 389);
LDAPSearchConstraints ldsc = new LDAPSearchConstraints();
ldsc.setMaxResults(100000);
ld.setSearchConstraints(ldsc);
Does anybody have any idea?

Sorry for necroposting, but this unanswered topic is still the first result on Google.
Using UnboundID you can actually retrieve an unlimited number of records by searching in paged mode.
public static void main(String[] args) {
    try {
        int count = 0;
        LDAPConnection connection = new LDAPConnection("hostname", 389, "user@domain", "password");
        final String path = "OU=Users,DC=org,DC=com";
        String[] attributes = {"SamAccountName", "name"};
        SearchRequest searchRequest = new SearchRequest(path, SearchScope.SUB,
                Filter.createEqualityFilter("objectClass", "person"), attributes);
        ASN1OctetString resumeCookie = null;
        while (true)
        {
            searchRequest.setControls(new SimplePagedResultsControl(100, resumeCookie));
            SearchResult searchResult = connection.search(searchRequest);
            for (SearchResultEntry e : searchResult.getSearchEntries())
            {
                if (e.hasAttribute("SamAccountName"))
                    System.out.print(count++ + ": " + e.getAttributeValue("SamAccountName"));
                if (e.hasAttribute("name"))
                    System.out.println("->" + e.getAttributeValue("name"));
            }
            LDAPTestUtils.assertHasControl(searchResult, SimplePagedResultsControl.PAGED_RESULTS_OID);
            SimplePagedResultsControl responseControl = SimplePagedResultsControl.get(searchResult);
            if (responseControl.moreResultsToReturn())
            {
                resumeCookie = responseControl.getCookie();
            }
            else
            {
                break;
            }
        }
    }
    catch (Exception e)
    {
        System.out.println(e.toString());
    }
}

Check the server-side size limit setting. It takes precedence over the client-side setting, which is what you are configuring in your code.
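If raising the server-side limit is not an option, the UnboundID SDK still hands you the entries that were returned before the limit was hit. A minimal sketch, assuming the same connection details as above (the host, credentials and base DN are placeholders):
LDAPConnection connection = new LDAPConnection("hostname", 389, "user@domain", "password");
try {
    SearchRequest request = new SearchRequest("OU=Users,DC=org,DC=com", SearchScope.SUB,
            Filter.createEqualityFilter("objectClass", "person"), "SamAccountName");
    // Client-side limit; the server may still enforce a lower one.
    request.setSizeLimit(100000);
    SearchResult result = connection.search(request);
    System.out.println("Entries returned: " + result.getEntryCount());
} catch (LDAPSearchException e) {
    if (e.getResultCode() == ResultCode.SIZE_LIMIT_EXCEEDED) {
        // The exception still carries the entries received before the limit was reached.
        for (SearchResultEntry entry : e.getSearchEntries()) {
            System.out.println(entry.getDN());
        }
    }
}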

Related

AsyncTask doInBackground() does not execute correctly on run, but works in the debugger

@Override
protected ArrayList<HashMap<String, String>> doInBackground(Void... params) {
    ArrayList<HashMap<String, String>> PLIST = new ArrayList<>();
    HttpHandler sh = new HttpHandler();
    String jsonStr = sh.makeServiceCall(jsonUrl);
    ArrayList<String> URLList = new ArrayList<>();
    if (jsonStr != null) {
        placesList.clear();
        try {
            JSONObject jsonObj = new JSONObject(jsonStr);
            // Getting JSON Array node
            JSONArray placesJsonArray = jsonObj.getJSONArray("results");
            String pToken = "";
            // looping through all places
            for (int i = 0; i < placesJsonArray.length(); i++) {
                JSONObject placesJSONObject = placesJsonArray.getJSONObject(i);
                String id = placesJSONObject.getString("id");
                String name = placesJSONObject.getString("name");
                HashMap<String, String> places = new HashMap<>();
                // adding each child node to HashMap key => value
                places.put("id", id);
                places.put("name", name);
                PLIST.add(places);
            }
            //TODO: fix this...
            if (SEARCH_RADIUS == 1500) {
                Log.e(TAG, "did it get to 1500?");
                try {
                    for (int k = 0; k < 2; k++) {
                        // error is "no value for next_page_token"... ERROR HERE
                        pToken = jsonObj.getString("next_page_token"); // if I place a breakpoint here, the debugger runs correctly and returns more than 20 results if there is a next_page_token.
                        String newjsonUrl = "https://maps.googleapis.com/maps/api/place/nearbysearch/json?location="
                                + midpointLocation.getLatitude() + "," + midpointLocation.getLongitude()
                                + "&radius=" + SEARCH_RADIUS + "&key=AIzaSyCiK0Gnape_SW-53Fnva09IjEGvn55pQ8I&pagetoken=" + pToken;
                        URLList.add(newjsonUrl);
                        jsonObj = new JSONObject(new HttpHandler().makeServiceCall(newjsonUrl)); //moved
                        Log.e(TAG, "page does this try catch");
                    }
                }
                catch (Exception e) {
                    Log.e(TAG, "page token not found: " + e.toString());
                }
                for (String url : URLList) {
                    Log.e(TAG, "url is : " + url);
                }
I made an ArrayList of URLs after many attempts to debug this code; I planned on unpacking the ArrayList after all the URLs with next_page_tokens were added, and then parsing each of them later. When running the debugger with a breakpoint on pToken = jsonObj.getString("next_page_token"), I get the first URL from the logger and then the second URL correctly. When I run as is, I get the first URL and then the following error: JSONException: No value for next_page_token.
Things I've tried
Invalidating Caches and restarting
Clean Build
Run on different SDK versions
Made sure that the if statement is hitting (SEARCH_RADIUS == 1500)
Any help would be much appreciated, thanks!
The function is called from a listener like this:
new GetPlaces(new AsyncResponse() {
    @Override
    public void processFinish(ArrayList<HashMap<String, String>> output) {
        Log.e(TAG, "outputasync:");
        placesList = output;
    }
}).execute();
My onPostExecute method.
@Override
protected void onPostExecute(ArrayList<HashMap<String, String>> result) {
    delegate.processFinish(result);
    // Dismiss the progress dialog
    if (pDialog.isShowing())
        pDialog.dismiss();
}
It turns out that the Google Places API takes a few milliseconds to validate the next_page_token after it is generated. As such, I have used the wait() function to pause before building the new URL based on the next_page_token. This fixed my problem. Thanks for the help.
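For reference, a minimal sketch of that pause inside doInBackground (the two-second delay, the API_KEY constant and the trimmed URL are illustrative assumptions; Google only documents a short, unspecified delay before a next_page_token becomes usable):
// Inside the existing try block in doInBackground(), after the current page has been parsed.
String pToken = jsonObj.getString("next_page_token");
try {
    Thread.sleep(2000); // give the token a moment to become valid; tune as needed
} catch (InterruptedException ie) {
    Thread.currentThread().interrupt();
}
String nextPageUrl = "https://maps.googleapis.com/maps/api/place/nearbysearch/json?"
        + "pagetoken=" + pToken + "&key=" + API_KEY; // API_KEY is a placeholder
jsonObj = new JSONObject(new HttpHandler().makeServiceCall(nextPageUrl));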

Create and query a binary cache in Ignite

I am trying to use BinaryObjects to create the cache at runtime. For example, instead of writing a POJO class such as Employee and configuring it as the cache value type, I need to be able to dynamically configure the cache with the field names and field types for that particular cache.
Here is some sample code:
public class EmployeeQuery {
    public static void main(String[] args) throws Exception {
        Ignition.setClientMode(true);
        try (Ignite ignite = Ignition.start("examples/config/example-ignite.xml")) {
            if (!ExamplesUtils.hasServerNodes(ignite))
                return;
            CacheConfiguration<Integer, BinaryObject> cfg = getbinaryCache("emplCache", 1);
            ignite.destroyCache(cfg.getName());
            try (IgniteCache<Integer, BinaryObject> emplCache = ignite.getOrCreateCache(cfg)) {
                SqlFieldsQuery top5Qry = new SqlFieldsQuery("select * from Employee where salary > 500 limit 5", true);
                while (true) {
                    QueryCursor<List<?>> top5qryResult = emplCache.query(top5Qry);
                    System.out.println(">>> Employees ");
                    List<List<?>> all = top5qryResult.getAll();
                    for (List<?> list : all) {
                        System.out.println("Top 5 query result : " + list.get(0) + " , " + list.get(1) + " , " + list.get(2));
                    }
                    System.out.println("..... ");
                    Thread.sleep(5000);
                }
            }
            finally {
                ignite.destroyCache(cfg.getName());
            }
        }
    }

    private static QueryEntity createEmployeeQueryEntity() {
        QueryEntity employeeEntity = new QueryEntity();
        employeeEntity.setTableName("Employee");
        employeeEntity.setValueType(BinaryObject.class.getName());
        employeeEntity.setKeyType(Integer.class.getName());
        LinkedHashMap<String, String> fields = new LinkedHashMap<>();
        fields.put("id", Integer.class.getName());
        fields.put("firstName", String.class.getName());
        fields.put("lastName", String.class.getName());
        fields.put("salary", Float.class.getName());
        fields.put("gender", String.class.getName());
        employeeEntity.setFields(fields);
        employeeEntity.setIndexes(Arrays.asList(
                new QueryIndex("id"),
                new QueryIndex("firstName"),
                new QueryIndex("lastName"),
                new QueryIndex("salary"),
                new QueryIndex("gender")
        ));
        return employeeEntity;
    }

    public static CacheConfiguration<Integer, BinaryObject> getbinaryCache(String cacheName, int duration) {
        CacheConfiguration<Integer, BinaryObject> cfg = new CacheConfiguration<>(cacheName);
        cfg.setCacheMode(CacheMode.PARTITIONED);
        cfg.setName(cacheName);
        cfg.setStoreKeepBinary(true);
        cfg.setAtomicityMode(CacheAtomicityMode.ATOMIC);
        cfg.setIndexedTypes(Integer.class, BinaryObject.class);
        cfg.setExpiryPolicyFactory(FactoryBuilder.factoryOf(new CreatedExpiryPolicy(new Duration(SECONDS, duration))));
        cfg.setQueryEntities(Arrays.asList(createEmployeeQueryEntity()));
        return cfg;
    }
}
I am trying to configure the cache with the employeeId (Integer) as the key and the whole employee record (BinaryObject) as the value. When I run the above class, I get the following exception:
Caused by: org.h2.jdbc.JdbcSQLException: Table "EMPLOYEE" not found; SQL statement:
select * from "emplCache".Employee where salary > 500 limit 5
What am I doing wrong here? Is anything more needed besides this line:
employeeEntity.setTableName("Employee");
Next, I am trying to stream data into the cache. Is this the right way to do it?
public class CsvStreamer {
    public static void main(String[] args) throws IOException {
        Ignition.setClientMode(true);
        try (Ignite ignite = Ignition.start("examples/config/example-ignite.xml")) {
            if (!ExamplesUtils.hasServerNodes(ignite))
                return;
            CacheConfiguration<Integer, BinaryObject> cfg = EmployeeQuery.getbinaryCache("emplCache", 1);
            try (IgniteDataStreamer<Integer, BinaryObject> stmr = ignite.dataStreamer(cfg.getName())) {
                while (true) {
                    InputStream in = new FileInputStream(new File(args[0]));
                    try (LineNumberReader rdr = new LineNumberReader(new InputStreamReader(in))) {
                        int count = 0;
                        for (String line = rdr.readLine(); line != null; line = rdr.readLine()) {
                            String[] words = line.split(",");
                            BinaryObject emp = getBinaryObject(words);
                            stmr.addData(new Integer(words[0]), emp);
                            System.out.println("Sent data " + count++ + " , sal : " + words[6]);
                        }
                    }
                }
            }
        }
    }

    private static BinaryObject getBinaryObject(String[] rawData) {
        BinaryObjectBuilder builder = Ignition.ignite().binary().builder("Employee");
        builder.setField("id", new Integer(rawData[0]));
        builder.setField("firstName", rawData[1]);
        builder.setField("lastName", rawData[2]);
        builder.setField("salary", new Float(rawData[6]));
        builder.setField("gender", rawData[4]);
        BinaryObject binaryObj = builder.build();
        return binaryObj;
    }
}
Note: I am running this in cluster mode. I run both EmployeeQuery and CsvStreamer from one machine, and I have Ignite running in server mode on two other machines. Ideally I want to avoid the use of a POJO class in my application and keep things as dynamic and generic as possible.
You are getting this exception because you didn't configure the SQL schema. In your case (since you don't want to create a POJO class, etc.) I recommend using the SQL DDL syntax that was added in Apache Ignite 2.0. I am sure the following example will help you with the configuration: https://github.com/apache/ignite/blob/master/examples/src/main/java/org/apache/ignite/examples/datagrid/CacheQueryDdlExample.java
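A minimal sketch of that DDL approach, loosely following the linked example (the cache name, column definitions and sample values below are illustrative, not taken from your code):
// A dummy cache whose SQL schema is PUBLIC, used only as an entry point for DDL/DML.
try (Ignite ignite = Ignition.start("examples/config/example-ignite.xml")) {
    IgniteCache<?, ?> cache = ignite.getOrCreateCache(
            new CacheConfiguration<>("dummyCache").setSqlSchema("PUBLIC"));

    // Create the table dynamically; Ignite backs it with a cache, no POJO required.
    cache.query(new SqlFieldsQuery(
            "CREATE TABLE Employee (id INT PRIMARY KEY, firstName VARCHAR, " +
            "lastName VARCHAR, salary FLOAT, gender VARCHAR) " +
            "WITH \"template=partitioned\"")).getAll();

    cache.query(new SqlFieldsQuery(
            "INSERT INTO Employee (id, firstName, lastName, salary, gender) " +
            "VALUES (?, ?, ?, ?, ?)").setArgs(1, "John", "Doe", 1000f, "M")).getAll();

    List<List<?>> rows = cache.query(new SqlFieldsQuery(
            "SELECT * FROM Employee WHERE salary > 500 LIMIT 5")).getAll();
    System.out.println(rows);
}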

Issue: SQL Azure connection is broken. After reconnecting and accessing an entity object, an error occurred

I connect to the website and keep it idle for 30 minutes; then, when trying to access the entities, I get the following error.
Entity Framework: An error occurred while executing the command definition. See the inner exception for details. Inner exception: {“Invalid object name 'dbo.TableName'.”}
Sample Code
static class Azure
{
    public static CrmEntities ConnectCustomerEntity()
    {
        CrmEntities customerEntity = null;
        policy.ExecuteAction(() =>
        {
            try
            {
                var shardId = GetShardId();
                customerEntity = new CrmEntities(ConnectionStringCustomerDB());
                string federationCmdText = @"USE FEDERATION Customer_Federation(ShardId =" + shardId + ") WITH RESET, FILTERING=ON";
                customerEntity.Connection.Open();
                customerEntity.ExecuteStoreCommand(federationCmdText);
            }
            catch (Exception e)
            {
                customerEntity.Connection.Close();
                SqlConnection.ClearAllPools();
                //throw e;
            }
        });
        return customerEntity;
    }

    public static CrmEntities DBConnect(CrmEntities _db)
    {
        try
        {
            if (_db == null)
                _db = Azure.ConnectCustomerEntity();
            if ((_db.Connection.State == ConnectionState.Broken) || (_db.Connection.State == ConnectionState.Closed))
            {
                SqlConnection.ClearAllPools();
                _db = Azure.ConnectCustomerEntity();
            }
            else
            {
                //This code is to find out any issues in the connection pool's database connection
                string sqlCmdText = @"select top 1 Id from Project";
                _db.ExecuteStoreCommand(sqlCmdText);
            }
        }
        catch (Exception ex)
        {
            _db.Connection.Close();
            SqlConnection.ClearAllPools();
            _db = Azure.ConnectCustomerEntity();
        }
        return _db;
    }
}
MVC controller. In the following code I am getting that exception after 30 minutes:
public class FilterController : Controller
{
    public ActionResult GetFilters(string entityName, string typeFilter)
    {
        _crmEntities = Azure.DBConnect(_db);
        var query = _db.FilterFields.Where(filter => filter.TableId == tableId).ToList(); // Here I am getting that exception
    }
}
I don't know why I am getting that exception. I have tried all the possibilities I can think of and nothing has helped; I am really stuck on this. If anybody knows, please share your views on how to get past this exception.
Thanks in advance.
I think your session times out. Try to increase the session timeout:
http://msdn.microsoft.com/en-us/library/system.web.sessionstate.httpsessionstate.timeout.aspx

com.sun.jersey.api.client.UniformInterfaceException (returned a response status of 400)

I am trying to set up a file upload example using JAX-RS. I could set up the project and successfully upload a file to a server location, but I get the following error when the file size is more than 10 KB (weird!!):
com.sun.jersey.api.client.UniformInterfaceException: POST http://localhost:9090/DOAFileUploader/rest/file/upload returned a response status of 400
at com.sun.jersey.api.client.WebResource.handle(WebResource.java:607)
at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
at com.sun.jersey.api.client.WebResource$Builder.post(WebResource.java:507)
at com.sony.doa.rest.client.DOAClient.upload(DOAClient.java:75)
at com.sony.doa.rest.client.DOAMain.main(DOAMain.java:34)
I am new to JAX-RS and I'm not sure what exactly the issue is. Do I need to set some parameters on the client side or the server side (like size, timeout, etc.)?
This is the client-side code calling the web service:
public void upload() {
    File file = new File(inputFilePath);
    FormDataMultiPart part = new FormDataMultiPart();
    part.bodyPart(new FileDataBodyPart("file", file, MediaType.APPLICATION_OCTET_STREAM_TYPE));
    WebResource resource = Client.create().resource(url);
    String response = resource.type(MediaType.MULTIPART_FORM_DATA_TYPE).post(String.class, part);
    System.out.println(response);
}
This is the server-side code:
#Path("/file")
public class UploadFileService {
#POST
#Path("/upload")
#Consumes(MediaType.MULTIPART_FORM_DATA)
public Response uploadFile(
#FormDataParam("file") InputStream uploadedInputStream,
#FormDataParam("file") FormDataContentDisposition fileDetail) {
String uploadedFileLocation = "e://uploaded/"
+ fileDetail.getFileName();
writeToFile(uploadedInputStream, uploadedFileLocation);
String output = "File uploaded to : " + uploadedFileLocation;
return Response.status(200).entity(output).build();
}
private void writeToFile(InputStream uploadedInputStream,
String uploadedFileLocation) {
try {
OutputStream out = new FileOutputStream(new File(
uploadedFileLocation));
int read = 0;
byte[] bytes = new byte[16000];
out = new FileOutputStream(new File(uploadedFileLocation));
while ((read = uploadedInputStream.read(bytes)) != -1) {
out.write(bytes, 0, read);
}
out.flush();
out.close();
} catch (IOException e) {
e.printStackTrace();
} } }
Please let me know what settings I have to change for file sizes greater than 10 KB.
Thanks!
I use org.apache.commons.fileupload.servlet.ServletFileUpload in a Jersey context, and it works fine. And yes, it can set the max file size; sorry I missed this before.
Here is a snippet of the code I use (this is a multipart form, so there are other fields along with the file):
private LibraryUpload parseLibraryUpload(HttpServletRequest request) {
    LibraryUpload libraryUpload;
    File libraryZip = null;
    String name = null;
    String version = null;
    ServletFileUpload upload = new ServletFileUpload();
    upload.setFileSizeMax(MAX_FILE_SIZE);
    FileItemIterator iter;
    try {
        iter = upload.getItemIterator(request);
        while (iter.hasNext()) {
            FileItemStream item = iter.next();
            InputStream stream = item.openStream();
            if (item.isFormField()) {
                ....
            } else {
                BufferedInputStream buffer = new BufferedInputStream(stream);
                buffer.mark(MAX_FILE_SIZE);
                libraryZip = File.createTempFile("fromUpload", null);
                IOUtils.copy(buffer, new FileOutputStream(libraryZip));
                ...
            }
I have encountered the same problem with Jersey. I activated the Jersey trace, but nothing helped me.
I then switched to an Apache library and saw that the problem was linked to the directory Tomcat uses for temporary files: the directory did not exist. For files under 10 KB, that directory is not used.
So, after creating the directory, I went back to the Jersey library and everything works fine.
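If you want to rule out that situation, a small check like the following may help (a sketch; Tomcat's startup scripts normally point java.io.tmpdir at $CATALINA_BASE/temp, but the exact location depends on your setup):
// Print the temp directory the servlet container reports and create it if it is missing.
File tmpDir = new File(System.getProperty("java.io.tmpdir"));
System.out.println("java.io.tmpdir = " + tmpDir.getAbsolutePath() + " (exists: " + tmpDir.exists() + ")");
if (!tmpDir.exists() && !tmpDir.mkdirs()) {
    System.err.println("Could not create " + tmpDir.getAbsolutePath());
}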

How to programmatically set the task outcome (task response) of a Nintex Flexi Task?

Is there any way of setting a Nintex Flexi task's completion through SharePoint's web services? We have tried updating the "WorkflowOutcome", "ApproverComments" and "Status" fields without success (actually the comments and status are successfully updated; however, I can find no way of updating the WorkflowOutcome system field).
I can't use the Nintex Web service (ProcessTaskResponse) because it needs the task's assigned user's credentials (login, password, domain).
The ASP.NET page doesn't have that information; it has only the SharePoint administrator credentials.
One way is to delegate the task to the admin first, and then call ProcessTaskResponse, but it isn't efficient and is prone to errors.
In my tests so far, any update (UpdateListItems) to the WorkflowOutcome field automatically sets the Status field to "Completed" and the PercentComplete field to "1" (100%), ending the task (and continuing the flow), but with the wrong answer: always "Reject", no matter what I try to set it to.
Did you try this code? (The try-catch block with the redirection does the trick.)
// set to the actual outcome id here, for example from the OutComePanel control
taskItem[Nintex.Workflow.Common.NWSharePointObjects.FieldDecision] = 0;
taskItem[Nintex.Workflow.Common.NWSharePointObjects.FieldComments] = " Some Comments";
taskItem.Update();
try
{
    Nintex.Workflow.Utility.RedirectOrCloseDialog(HttpContext.Current, Web.Url);
}
catch
{
}
Here is my code to change the outcome of a Nintex Flexi task. My problem was permissions; I passed the admin (system account) token to the site, and that solved it.
var siteUrl = "...";
using (var tempSite = new SPSite(siteUrl))
{
    var sysToken = tempSite.SystemAccount.UserToken;
    using (var site = new SPSite(siteUrl, sysToken))
    {
        var web = site.OpenWeb();
        ...
        var cancelled = "Cancelled";
        task.Web.AllowUnsafeUpdates = true;
        Hashtable ht = new Hashtable();
        ht[SPBuiltInFieldId.TaskStatus] = SPResource.GetString(new CultureInfo((int)task.Web.Language, false), Strings.WorkflowStatusCompleted, new object[0]);
        ht["Completed"] = true;
        ht["PercentComplete"] = 1;
        ht["Status"] = "Completed";
        ht["WorkflowOutcome"] = cancelled;
        ht["Decision"] = CommonHelper.GetFlexiTaskOutcomeId(task, cancelled);
        ht["ApproverComments"] = "cancelled";
        CommonHelper.AlterTask((task as SPListItem), ht, true, 5, 100);
        task.Web.AllowUnsafeUpdates = false;
    }
}
public static string GetFlexiTaskOutcomeId(Microsoft.SharePoint.Workflow.SPWorkflowTask task, string outcome)
{
    if (task["MultiOutcomeTaskInfo"] == null)
    {
        return string.Empty;
    }
    string xmlOutcome = HttpUtility.HtmlDecode(task["MultiOutcomeTaskInfo"].ToString());
    if (string.IsNullOrEmpty(xmlOutcome))
    {
        return string.Empty;
    }
    XmlDocument doc = new XmlDocument();
    doc.LoadXml(xmlOutcome);
    var node = doc.SelectSingleNode(string.Format("/MultiOutcomeResponseInfo/AvailableOutcomes/ConfiguredOutcome[@Name='{0}']", outcome));
    return node.Attributes["Id"].Value;
}
public static bool AlterTask(SPListItem task, Hashtable htData, bool fSynchronous, int attempts, int milisecondsTimeout)
{
    if ((int)task[SPBuiltInFieldId.WorkflowVersion] != 1)
    {
        SPList parentList = task.ParentList.ParentWeb.Lists[new Guid(task[SPBuiltInFieldId.WorkflowListId].ToString())];
        SPListItem parentItem = parentList.Items.GetItemById((int)task[SPBuiltInFieldId.WorkflowItemId]);
        for (int i = 0; i < attempts; i++)
        {
            SPWorkflow workflow = parentItem.Workflows[new Guid(task[SPBuiltInFieldId.WorkflowInstanceID].ToString())];
            if (!workflow.IsLocked)
            {
                task[SPBuiltInFieldId.WorkflowVersion] = 1;
                task.SystemUpdate();
                break;
            }
            if (i != attempts - 1)
            {
                Thread.Sleep(milisecondsTimeout);
            }
        }
    }
    var result = SPWorkflowTask.AlterTask(task, htData, fSynchronous);
    return result;
}