I created a custom RXT asset to register my own objects. The system works fine when I add and browse objects.
But if I delete an object instance through the Management Console (Resources > Browse, navigating to /system/governance/[objects]/[object]) and then try to create a new one, the system starts looping and displays the message:
"Please wait while the asset is being indexing"
This message never disappears. From that point on, new objects no longer show in the Publisher or the Store, but they do exist in /system/governance/[objects]/[object].
You need to re-index the resources. Follow the steps below.
1. Delete the /solr/ directory.
2. Rename the registry resource that tracks the last access time of indexing (e.g. from lastaccesstime to lastaccesstime_1) by changing the value of the lastAccessTimeLocation property in the /repository/conf/registry.xml file as follows (see the snippet after these steps):
/_system/local/repository/components/org.wso2.carbon.registry/indexing/lastaccesstime_1
3. Restart the G-Reg server and wait for around 30 minutes. The exact duration depends on the number of resources in the registry.
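For reference, the relevant part of registry.xml looks roughly like this (a minimal sketch; the surrounding indexingConfiguration element and its other settings may differ slightly between G-Reg versions):

    <indexingConfiguration>
        <!-- other indexing settings omitted -->
        <lastAccessTimeLocation>/_system/local/repository/components/org.wso2.carbon.registry/indexing/lastaccesstime_1</lastAccessTimeLocation>
    </indexingConfiguration>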
Ref: https://docs.wso2.com/display/Governance540/Upgrading+from+a+Previous+Release#UpgradingfromaPreviousRelease-Re-indexingresources
I have created a retention policy in SP2010 for a library and its documents, folder-based with some rules. I have also set the Information Management Policy and Expiration Policy timer jobs to run in sequence, one after the other. All of this works fine for the "Move to Recycle Bin" option, but it is not working for the "Transfer to another location" option, where I have already created the location as a Drop Off Library using the web service URL from the submission point.
If I send a document manually with the "Send To" option, it moves to the Drop Off Library properly. But when I run the timer jobs mentioned above, the documents are not moved, even though the defined retention stage has already been reached.
The "Compliance Details" dialog shows the status as Completed without the document being moved, and nothing is logged as an error for this case.
Please guide me as to what I might be missing in the process.
Is this Drop Off Library located in the same site? If so, this will not work; according to Microsoft, it is by design. If you want to move documents to a different location using a retention policy, you have to move them to a library in a different site collection, preferably a Records Center site collection. Microsoft's main idea is to have one archival or Records Center site collection for the whole organization.
So, if you are trying to move documents after expiration to a library in the same site or site collection, you can use a workaround: start a workflow on the expiration date that moves the document to the archival library.
Hope this helps. Source
I have a Java application on a server that creates log files.
On another server I have a VB.NET application that processes those logs, searching for certain strings; if I select the file manually, everything works fine.
Now I need the second application to automatically open every new log file on the remote server, checking every minute.
So I wonder if there is any way to know when a new log file is created, or which files have been created since the last check.
The files are named in the format server.log.*.log, so files with a different name format must be ignored.
More info:
There is a second Java application on the first server that deletes logs older than one day.
You can call Directory.GetFiles every minute to get the list of all the files in the folder, compare it to the previous list (which you keep in memory), and process the new files.
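A minimal polling sketch in C# (the same calls exist in VB.NET); the folder path, the timer wiring and ProcessLogFile are placeholders for your own code:

    using System.Collections.Generic;
    using System.IO;

    class LogPoller
    {
        private HashSet<string> knownFiles = new HashSet<string>();

        // Call this once a minute, e.g. from a System.Timers.Timer.
        public void Poll(string logFolder)
        {
            // Only consider files matching the expected name pattern.
            string[] current = Directory.GetFiles(logFolder, "server.log.*.log");

            foreach (string file in current)
            {
                if (!knownFiles.Contains(file))
                {
                    ProcessLogFile(file); // your existing string-search logic
                }
            }

            // Replace the snapshot; files removed by the cleanup job simply drop out.
            knownFiles = new HashSet<string>(current);
        }

        private void ProcessLogFile(string path) { /* placeholder */ }
    }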
Another option is to monitor the folder for changes. This can be done with the System.IO.FileSystemWatcher class. By setting the Path and the proper NotifyFilter, the Created event tells you which files were created as soon as (or almost as soon as) they appear.
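A minimal FileSystemWatcher sketch in C# (the path is an assumption; when the folder is a network share, events can occasionally be missed, so keeping a periodic poll as a fallback is a common design choice):

    using System;
    using System.IO;

    class LogWatcher
    {
        private FileSystemWatcher watcher; // keep a reference so it is not collected

        public void Start(string logFolder)
        {
            watcher = new FileSystemWatcher();
            watcher.Path = logFolder;
            watcher.Filter = "server.log.*.log";           // ignore other file names
            watcher.NotifyFilter = NotifyFilters.FileName;  // we only care about new names

            watcher.Created += delegate(object sender, FileSystemEventArgs e)
            {
                // e.FullPath is the newly created log file
                Console.WriteLine("New log file: " + e.FullPath);
            };

            watcher.EnableRaisingEvents = true;
        }
    }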
I am currently developing a service using the SharePoint 2010 Client Object Model to programmatically upload Excel worksheets to a Drop Off Library and then set the properties on the file. This process is working well. However, the Drop Off Library is governed by Content Organizer Rules that aren't being applied to the uploaded file. I have examined every property I thought I could have missed:
ContentTypeId is being properly set
_ModerationStatus is being set to 0
The two properties required to invoke the rule are being set to valid values
Update is being called on the ListItem
The file is checked in after the ListItem is updated
The list doesn't have minor versioning enabled so I don't make any calls to publish.
What's most frustrating is that if I edit the document properties using the Web UI and check it back in without making any changes, the file is moved to its final location. What might I have overlooked that is preventing Content Organizer Rules from being applied to newly uploaded files when using SP2010 COM?
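For context, the upload-and-set-properties flow described above looks roughly like the following sketch; the site URL, library title, content type id and rule field names are placeholders, not values from the original code:

    using System.IO;
    using Microsoft.SharePoint.Client;

    class DropOffUploader
    {
        public void Upload(string siteUrl, string localPath)
        {
            using (ClientContext context = new ClientContext(siteUrl))
            {
                List library = context.Web.Lists.GetByTitle("Drop Off Library");

                FileCreationInformation fileInfo = new FileCreationInformation();
                fileInfo.Content = System.IO.File.ReadAllBytes(localPath);
                fileInfo.Url = Path.GetFileName(localPath);
                fileInfo.Overwrite = true;

                Microsoft.SharePoint.Client.File uploaded =
                    library.RootFolder.Files.Add(fileInfo);
                ListItem item = uploaded.ListItemAllFields;

                item["ContentTypeId"] = "0x0101...";   // hypothetical content type id
                item["_ModerationStatus"] = 0;          // approved
                item["RuleField1"] = "some value";      // placeholder rule properties
                item["RuleField2"] = "another value";
                item.Update();

                // Check in after the item update, as described above.
                uploaded.CheckIn("Uploaded programmatically", CheckinType.MajorCheckIn);
                context.ExecuteQuery();
            }
        }
    }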
The ultimate answer to this question turned out to be that everything was indeed being set correctly. However, one cannot force the evaluation of content management rules programmatically. The information I required was provided by a post from Steve Curran on this MSDN thread.
In SharePoint 2010 Central Administration under the "Monitoring" section there is a control panel for "Timer Jobs" that includes an item to "Review job definitions." On this panel, there should be a job named "Content Organizer Processing." This is a nightly task that will run and clean up content according to the rules you have established in your site. After uploading a file to the drop off library programmatically, you will likely find that hitting the "Run Now" button for this job will cause the file to be moved to its final destination if the properties are set correctly.
The solution was to change the frequency of this job under the Recurring Schedule section from a nightly process to one that is executed every 15 minutes (or whatever interval you determine will work best).
A word of caution: if you send automated e-mail to the site administrator or a mailing list whenever files are left in the Drop Off Library without their properties set correctly, those messages will start arriving with the same frequency as the job's execution.
This article may help.
Basically, it does not appear to be supported in the 2010 COM so you have to work around it, unfortunately.
On my site a user may upload a file (picture, zip, audio, video, whatever) and later decide to replace it with a newer revision. A user may upload a file, make a post, and then put up a new revision replacing the old one (let's say it's a large zip or tar.gz file). There's a good chance people are downloading it if he has sent out an email, or even an IM for the home user.
The problem: I need to replace the file while people may still be downloading it, and it may be some minutes before it can be deleted. I don't want my code to stall until I can delete it, or to check every second to see whether it's unused (especially bad if another user can start a download in the meantime, creating a cycle).
How do I delete the file while users are downloading it? I don't care if their downloads stop; I just care that the file can be replaced and that new downloads get the new revision.
What about referencing the files indirectly?
A mapping script maps a virtual file entry on your site to a real file. If the user wants to upload a new revision of his file, you just update the mapping, not the real file.
You can install a daily task that scans all files and deletes any file that has no mapping and no open connections.
lajuette's answer is right; the easiest solution is to work around the file locking altogether:
When a user uploads file foo.zip, internally store it as foo-v1.zip.
Create a mapping file somewhere (database, code, whatever) that maps foo.zip to foo-v1.zip.
Rather than exposing a direct link to the file, expose a link to a service that gets the file: mysite.com/Download?foo.zip or something. This service uses the mapping to determine which version of the file to send to the client.
When a new version is uploaded, create foo-v2.zip and update the mapping file.
It wouldn't be that hard to write a scheduled task that cleans up old, un-mapped files.
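As a rough illustration of the download-through-a-mapping idea, here is a sketch of an ASP.NET handler in C# (the handler name, query-string parameter and GetCurrentRevisionPath lookup are placeholders; the same approach works in any server stack):

    using System.Web;

    public class DownloadHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            // e.g. /Download?file=foo.zip
            string logicalName = context.Request.QueryString["file"];

            // Placeholder: look up foo.zip -> foo-v2.zip in your database.
            string physicalPath = GetCurrentRevisionPath(logicalName);

            context.Response.ContentType = "application/octet-stream";
            context.Response.AddHeader("Content-Disposition",
                "attachment; filename=" + logicalName);

            // Clients that already started downloading foo-v1.zip keep reading that
            // file; new requests get the latest revision.
            context.Response.TransmitFile(physicalPath);
        }

        private string GetCurrentRevisionPath(string logicalName)
        {
            // hypothetical lookup, e.g. a FileMap table keyed by logical name
            throw new System.NotImplementedException();
        }
    }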
If you're opposed to a database, and if the filenames follow a fixed format (such as user/id.ext), you could append a revision number to the id, enumerate the folder using a pattern (user/id-*), and use the latest revision.
I am currently working on an application that backs up the files on your machine to a server (hosted by the company itself), so that you can recover your data after an HDD crash. I have implemented a single-instance feature across users.
Single instance: a file that has already been uploaded to the server is not uploaded again. When another instance of the exact same file is uploaded, no actual upload takes place; instead, some database changes link it to the previously uploaded file.
The issue arises when the same file (one that has not been uploaded before) is uploaded simultaneously by more than one user. At the start, the file is not detected as an existing instance (because the database is updated only after a successful upload/backup), so all the uploads run at once. What would be the best way to implement single instancing in this situation?
I am thinking of letting all the instances upload as they are, so more than one copy of the file will reside on the server. Then, whenever another backup of the same file is taken later, I will remove all the previous copies and link them to the single remaining one. This avoids making the user upload twice and is less complex, at the cost of some disk space, and probably only for a while (until the next upload of the same file).
Thanks for your thoughts in advance.
Calculate the hash (signature) of the file before upload and store it in the DB.
Then start uploading.
If a similar file is marked for upload while the first file is still uploading (you will know because you already saved the hash), hold the second upload until the first one finishes successfully, then just link it. If the first upload fails, you can fall back to the second source and upload from there.
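A minimal sketch of the hash step in C# (SHA-256 is an assumption; how the hash and upload status are stored in your database is up to you):

    using System;
    using System.IO;
    using System.Security.Cryptography;

    static class FileSignature
    {
        // Returns a hex string that can be stored in the DB as the file's signature.
        public static string Compute(string path)
        {
            using (SHA256 sha = SHA256.Create())
            using (FileStream stream = File.OpenRead(path))
            {
                byte[] hash = sha.ComputeHash(stream);
                return BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant();
            }
        }
    }

    // Intended flow around the upload (as comments, since the DB layer is yours):
    // 1. signature = FileSignature.Compute(localPath)
    // 2. Signature in DB with status "completed" -> just link, no upload.
    // 3. Signature in DB with status "uploading" -> wait, then link; if that
    //    upload fails, upload from this client instead.
    // 4. Signature not in DB -> insert it with status "uploading" and start the upload.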