Sanity.io backup and restore (or auto-saving doc IDs)

Question #1: is it possible to restore deleted items from backup in Sanity.io?
To my understanding, restoring a backup is done by exporting all documents from a dataset's history and importing them.
Restore - there's one way to do it: https://www.sanity.io/docs/importing-data.
Export - there are two ways to export data:
Export all currently-existing data: https://www.sanity.io/docs/export.
Export one historical document by its ID: https://www.sanity.io/docs/history-api.
IDs of deleted items do not appear in the currently-existing data (because they are deleted, duh), and without those IDs, I can't fetch historical documents.
Also, there's a Gotcha section saying:
Gotcha
Current Access Control means if you're able to access the document today, you'll be able to access all the previous revisions of the document.
Question #2: if restoring deleted items from backup is NOT possible due to those missing document IDs - is there a way to automatically save all document IDs (either every hour or whenever a change occurs)?
I guess that if there's a mechanism that also saves the last time an ID was seen, you can also know more or less its deletion time...

I saw that the Sanity.io project has a webhook that gets triggered when changes occur (under 'Settings' tab --> 'API' sub-tab).
I guess this could be set to call a service that fetches all documents and saves their IDs with the current timestamp.
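For illustration, here is a minimal sketch of what such a service could track. It assumes the webhook delivers the created/updated/deleted document IDs in its payload; the exact payload shape depends on your webhook configuration, so the types and function names here are invented for the example.

```typescript
// Assumed payload shape: arrays of document IDs per change type.
// Adjust to whatever your Sanity webhook actually sends.
type WebhookPayload = {
  ids: { created: string[]; updated: string[]; deleted: string[] };
};

// In-memory store for illustration; a real service would persist this.
const lastSeen = new Map<string, { lastSeenAt: string; deletedAt?: string }>();

function recordWebhook(payload: WebhookPayload, now: Date = new Date()): void {
  const ts = now.toISOString();
  for (const id of [...payload.ids.created, ...payload.ids.updated]) {
    lastSeen.set(id, { lastSeenAt: ts });
  }
  for (const id of payload.ids.deleted) {
    const prev = lastSeen.get(id);
    // Keep the last time we saw the doc alive, and note when it vanished.
    lastSeen.set(id, { lastSeenAt: prev?.lastSeenAt ?? ts, deletedAt: ts });
  }
}
```

With a store like this, each ID carries both a last-seen time and an approximate deletion time, which is exactly what you would need later to pull historical revisions of deleted documents via the History API.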

Related

How to purge deleted documents from Lotus Domino application

I am working with a vendor to debug an issue with a Lotus Domino 6.5 application. The vendor had originally asked me to make a copy of the application by going to File > Database > New Copy > Design Only, and then to copy some sample documents (with sensitive info removed) from the original database to the new database. But when I copy documents in this way, the views do not display correctly.
I then tried making a copy of the application by going to File > Database > New Copy > Design and Documents, and then deleting all but a few documents. When I copy documents in this way, the views display correctly, but the database size is huge, which makes me think that the original documents have not really been deleted. For privacy reasons, all deleted documents need to be unrecoverable.
My question: How can I purge documents that have been marked for deletion in this way?
While in the database, choose File - Application - Properties from the menu, click the last tab (the propeller-hat) and deselect these two database properties if they are selected:
"Don't overwrite free space" - Obviously you want to overwrite the free space.
"Allow Soft Deletions" - Deselecting this will purge any documents that are deleted but recoverable.
Once you've done that, choose File - Replication - Options for this Application..., click the "Other" tab, and change the date/time to the current date/time. This gets rid of the "deletion stubs" which remain for a time so that, if replication occurs with a different machine, that machine learns that the document was deleted rather than telling yours "Oh! Here's a brand new document!" This is completely unnecessary unless you want to ensure that even the count or internal ids of deleted documents are removed. The actual content will be gone.
After that, compact the database because you'll have free space or, if you want to be really paranoid, create a copy of your copy.
It's been a long time since I did any Lotus Notes development, but you can compact the database.
Open a database.
Choose File - Database - Properties.
Click the Info tab.
Click % used.
If the percentage of a database in use drops below 90% (it contains more than 10% unused space), click Compact to compact the database.

Script to find absolute path / Location of Analysis Services Database?

Whenever I restore an AS database, the DB files are created in a new folder named DBName_[1-n], where the number is incremented by 1 after every restore. I am currently looking for a script to copy the files (or this ASDBName_[n] folder) dynamically to another server.
Is there a way to find the file path of the AS database through DMVs, AMO, or any other means?
The numbering is used by AS to handle transactions: each write operation creates a new copy with a new number, while the old version can still be used for read access. If the write operation - be it a restore, a processing, or a structural change - finally succeeds, AS switches all users to the new version and can then delete the old version in the background. If anything goes wrong during the write operation, the new version can just be deleted by AS without affecting anybody using the previous version. This can happen at database level, as well as at sub-object level (if you, e.g., process only a dimension, or add a measure to a measure group).
This also means that in order to be sure you copy the database, you have to detach it - which makes sure that it is in a consistent state and that no half-written remnant stays around. Then you could copy it to a new server and attach it there. And, as long as the database is detached, there should be only one version, so you could just take the one and only folder named "DBName.<n>.db".
I do not think there is a documented way to find the exact name. At least, Microsoft does not document one at http://technet.microsoft.com/en-us/library/cc280670.aspx. They just state "Use any operating system mechanism or your standard method for moving files to move the database folder to the new location."
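As a rough illustration of the folder-picking logic described above: since AS keeps numbered versions and the highest number is the current one (observed behaviour, not a documented contract), a script could scan the data directory and pick the latest match. The function name is invented for this sketch.

```typescript
// Given the folder names found in the AS data directory, return the
// current folder for a database, assuming the "DBName.<n>.db" naming
// convention. Note: dbName is interpolated into a regex, so names
// containing regex metacharacters would need escaping in real use.
function currentDbFolder(folderNames: string[], dbName: string): string | undefined {
  let best: { name: string; version: number } | undefined;
  for (const name of folderNames) {
    const match = name.match(new RegExp(`^${dbName}\\.(\\d+)\\.db$`));
    if (!match) continue;
    const version = parseInt(match[1], 10);
    if (!best || version > best.version) best = { name, version };
  }
  return best?.name;
}
```

Per the answer above, though, the safer approach is to detach the database first; then only one version exists and no guessing is needed.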

How to disable index updating for a single item in Sitecore programmatically

In my Sitecore web application, I'm creating a new item and making several updates to it in various places in the code before finally concluding the saving process. These repeated alterations cause new records to be created in the History table for index updating, as below:
Created
Saved
Saved
Saved
Saved
Saved
Saved
Saved
These extra entries cause the indexing process to check many records that are not required; what I actually want is to have only two records:
Created
Saved
How can I temporarily disable the creation of the Saved entries for an item (like having some kind of IndexUpdateDisabled() context)?
Ideally, you should call item.Editing.BeginEdit() only once, at the beginning of your flow. Once you reach the final step, you complete the edit with item.Editing.EndEdit();
If this for some reason isn't doable in your setup, you could resort to making silent updates. At each step, call item.Editing.EndEdit(false, true), and only on the final step use the parameterless overload item.Editing.EndEdit().

Lotus Notes: Replication conflict caused by agent and user running on the document at same time

In one of our Lotus Notes databases, replication/save conflicts occur too frequently, the reason being a scheduled agent and a user working on the same document at the same time.
Is there any way to avoid this?
Several options in addition to merging conflicts:
Change the schedule: The best way to avoid it is to have your scheduled agents run at times when users are not likely to be accessing the system. If the LastContact field on a Client document is updated by an agent every hour as it checks all Contact documents, maybe the agent should run overnight instead.
Run the agent on user action: It may also be the case that the agent shouldn't be running on a schedule, but should be running when the user takes some action. For example, run the agent to update the Client document when the user saves the supporting Contact document.
Break the form into smaller bits: A third thing to consider is redesigning your form so that not every piece of data is on a main form. For example, if comments on recent contacts with a client are currently held in a field on the Client document, you might change the design to have a separate ClientMeeting form from which the comments on the meeting are displayed in an embedded view or computed text (or designed using XPages).
Despite the fact that I am a developer, I think rep/saves are far more often the result of design decisions than anything else.
You can use the Conflict Handling option on the form(s) in question and select either Merge Conflicts or Merge/No Conflicts in order to have Notes handle merging of edit conflicts.
From the Help database:
At the "Conflict Handling" section of the Form Info tab, choose one of the following options for the form:
Create Conflicts -- Creates conflicts so that a replication conflict appears as a response document to the main document. The main document is selected based on the time and date of the changes and an internal document sequence number.
Merge Conflicts -- If a replication conflict occurs, saves the edits to each field in a single document. However, if two users edit the same field in the same document, Notes saves one document as a main document and the other document as a response document marked as a replication conflict. The main document is selected based on the criteria listed in the bullet above.
Merge/No Conflicts -- If replication occurs, saves the edits to each field in a single document. If two users edit the same field in the same document, Notes selects the field from the main document, based on time and date, and an internal document sequence number. No conflict document is generated, instead conflicting documents are merged into a single document at the field level.
Do Not Create Conflicts -- No merge occurs. IBM® Lotus® Domino(TM) simply chooses one document over another. The other document is lost.
In later versions of Notes there is the concept of document locking, and used properly that can prevent conflicts (but also add complexity).
Usually most conflicts can be avoided by planning to run the agents late at night when users aren't on the system. If that's not an option, then locking may be the best solution. If the conflicts aren't too many, you might benefit from adding a view filtered to show only conflicts, which would make finding and resolving them easier.
IMHO, the best answer to conflicts between users and agents is to make sure that they are operating on different documents. I.e., there are two documents with a common key. They can be parent and child if it would be convenient to show them that way in a view, but they don't have to be. Just call them DocA and DocB for the purposes of this discussion.
DocA is read and updated by users. When a user is viewing DocA, computed field formulas can pull information from DocB via DbLookup or GetDocField. Users never update DocB.
DocB, on the other hand, is read and updated by agents, but agents only read DocA; they never update it.
If you design your application any other way, then you either have to use locking -- which can create the possibility of not being able to update something when you need to -- or accept the fact that conflicts can happen occasionally and will need to be resolved.
Note that even with this strategy, you can still have conflicts if you have multiple replicas of the database. You can use the 'Conflict Handling' section of the Form properties to help minimize replication conflicts, as per @Per Henrik Lausten's answer, but since you are talking about an existing application, please also see my comment on his answer for additional info about what you would have to do in order to use this feature.
If this is a mission critical application, consider creating a database with lock-documents. That means, every time a user opens a document, a separate lock-document is created.
Then code the agent to see if lock-documents exist for every document that the agent wants to modify. If it does, skip that document.
Document-close should remove the doc-lock.
The lock-doc should be created on document open, not only on edit. This way, when a user has the document open in read mode, the agent will not be able to modify it either, which guards against the user switching to edit mode afterwards and making changes.
If the agent's modifications take a long time, it should create lock-docs as well.

How to find a document's visitor count?

I need to count the visitors for a particular document.
I can do it by adding a field and increasing its value.
But the problem is the following:
I have 10 replicas in different locations, replicated on a schedule. Replication conflicts happen because the visitor count is editing the same document in different locations.
I would use an external solution for this. Just search for "visitor count" in your favorite search engine and choose a third party tool. You can then display the count on the page if that is important.
If you need to store the value in the database for some reason, perhaps you could store it as a new doc type that gets added each time (and cleaned up later) to avoid the replication issues.
Otherwise if storing it isn't required consider Google Analytics too.
I also faced this problem. I cannot say it has an easy solution. Document locking is the only solution I found, but counting visitors that way is not possible.
It is possible, but not by updating the document. Instead have an AJAX call to an agent or form with parameters on the URL identifying the document being read. This call writes a document into a tracking DB with one or two views and then determines from those views how many reads you have had. The number of reads is the return value of the AJAX form.
This can be written in LotusScript, Java, or @Formulas. I would try to do it 100% in @Formulas to make it as efficient as possible.
You can also add logic to exclude reads from the same user or same source IP address.
The tracking database then replicates using the same schedule as the other database.
Daily or hourly agents can run to create summary documents and delete the detail documents so that you do not exceed the limits for @DbLookup.
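The tracking-database idea above can be sketched in plain code: each read appends a small record instead of updating the document itself, which is what avoids the replication conflicts. The names and data shapes here are invented for illustration; in Domino the records would be documents in a separate tracking database, and the count would come from a view.

```typescript
type ReadRecord = { docId: string; user: string };

// In-memory log standing in for documents in the tracking database.
const readLog: ReadRecord[] = [];

// Each read creates a new record; nothing existing is ever edited,
// so concurrent readers on different replicas cannot conflict.
function trackRead(docId: string, user: string): void {
  readLog.push({ docId, user });
}

// Count reads for a document, optionally excluding repeat reads by
// the same user (as the answer suggests).
function readCount(docId: string, uniqueUsers = false): number {
  const hits = readLog.filter((r) => r.docId === docId);
  if (!uniqueUsers) return hits.length;
  return new Set(hits.map((r) => r.user)).size;
}
```

The summary agents mentioned above would play the role of periodically collapsing `readLog` into per-document totals and discarding the detail records.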
If you do not need very nearly real-time counts (and that is the best you can get with a replicated system like this), you could use the web logs that Domino generates, finding the reads in the logs and building the counts in a document per server.
Back in the 90s, we had a client that needed to know that each person had read a document without them clicking to sign or anything.
The initial solution was to add each name to a text field on a separate tracking document. This ran into problems when it got over 32k real fast. Then, one of my colleagues realized you could just have it create a document for each user to record that they'd read it.
Heck, you could have one database track all reads for all users of all documents, since one user can only open one document at a time -- each time they open a new document, either add that value to a field, or create a field named after the document they've read on their own "reader tracker" document.
Or you could make that a mail-in database, so no worries about replication. Each time they open a document for which you want to track reads, it create a tiny document that has only their name and what document they read which gets mailed into the "read counter database". If you don't care who read it, you have an agent that runs on a schedule that updates the count and deletes the mailed-in documents.
There really are a lot of ways to skin this cat.