Synchronize 2 SharePoint Libraries on Different Servers - sharepoint-2010

I need to copy or synchronize 2 libraries between 2 different servers. Here are more details:
1- I have an InfoPath form that is submitted to a document library, Lib #1.
2- The InfoPath form is published as a content type and Lib #1 is configured to deal with this content type.
3- The users will have the ability to add new items to Lib #1.
4- At the end of each day, or maybe every hour or so, I need to copy the newly created items (or otherwise sync Lib #1) to another library on a different server.
5- The content type will be available in both servers.
I am new to SharePoint, so I would appreciate some hints on how to achieve this.
Clarification
I don't know which route to take, I am thinking of utilizing any of these (if possible):
1- Replicating Database: Use replication to copy data from one SharePoint database to the other
2- File System: I am not sure whether the forms are saved in the file system, but if they are, I could copy them from one server to another.
3- Programmatically: I am reading about SharePoint server events. I could handle the item-added event and copy the item from one server to another.
4- Built in: Maybe there is a built-in tool that I am not aware of that can help me copy items.

Why do you want to do that? If the target is an entirely different server, the forms won't open from there unless you relink the documents and have the same form template deployed on that server.
To do this job, you have two options:
Real-time - Use an ItemAdded event receiver to copy the item to the target as soon as it is created.
Timer job - Create a timer job that runs on a schedule and copies the items from one library to the other.
Reject the database and file system approaches: accessing the database directly is not supported, and the forms are not stored on the file system. The SharePoint API is the only way, and you can use it from either an event receiver or a timer job.
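To make the event receiver route concrete, here is a minimal sketch: a server-side ItemAdded receiver on Lib #1 that reads the submitted form and pushes it to the remote library through the SharePoint 2010 Client Object Model. The target site URL and library title ("Lib2") are placeholders, and the same copy logic could just as easily run inside a timer job.

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Client;

public class CopyToRemoteLibraryReceiver : SPItemEventReceiver
{
    public override void ItemAdded(SPItemEventProperties properties)
    {
        base.ItemAdded(properties);

        SPFile sourceFile = properties.ListItem.File;
        if (sourceFile == null)
            return;

        // Read the newly submitted form from Lib #1.
        byte[] contents = sourceFile.OpenBinary();

        // Push it to the library on the other server via the Client Object Model.
        // The URL and library title below are placeholders.
        using (var ctx = new ClientContext("http://otherserver/sites/target"))
        {
            var newFile = new FileCreationInformation
            {
                Content = contents,
                Url = sourceFile.Name,
                Overwrite = true
            };

            List targetLibrary = ctx.Web.Lists.GetByTitle("Lib2");
            targetLibrary.RootFolder.Files.Add(newFile);
            ctx.ExecuteQuery();
        }
    }
}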

Related

MS Access database interface update from local

I am extremely new to MS Access. I have a central back-end Access database on a server computer, and all of the users have the front-end user interface installed on their systems.
Now, whenever I make any changes to the interface locally, I need to re-install the updated interface on each of their systems. Is there any way I can make the changes only on my local copy and have them automatically reflected on all the users' systems?
Thank you.
OK, there are a couple of options for either fully or partially automating this process.
Partial Automation
If you don't have a lot of users and you don't want to do a great deal of coding, you can write a simple batch file or VBS file and set it up on each user's desktop as an icon. The batch file would contain something like the following:
@Echo Off
REM Copy the front end from the server location to the local user machine
xcopy "F:\ServerDirectory\databasename.mdb" "C:\ClientDirectory\databasename.mdb" /Y /R
Set this up on each user's machine as an icon, and whenever you want them to update their front end, ask them to double-click the icon. This will overwrite their client with whatever you place in that location on the server. It is also advisable to create all table links to the back-end database using UNC paths.
I have used this successfully for various applications: I make changes to the front end, place it in the appropriate location on the server, and then send a quick e-mail asking people to double-click the .bat file icon.
Full Automation
Programmatically set up version control using Visual Basic, so the client checks its version number against a number held on the server and, if the client is not the latest, downloads a new version.
This is more involved; full instructions are available here:
Front End Auto Update
When you deploy an MS Access solution like this, you need to decide whether to share one client MDB file between all users or distribute a copy to each user. It sounds like you have taken the second option. Each choice has merits and disadvantages. If you stay with the current approach, you might look at a scripting option to deploy updated client MDB files to users; a sketch of such a launcher is below.
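One way to do that scripting is a tiny launcher users run instead of opening the MDB directly: it refreshes the local copy from the server when needed and then opens it. This is only a sketch; the share path and local path are placeholders.

using System;
using System.Diagnostics;
using System.IO;

class FrontEndLauncher
{
    // Placeholder paths; point these at your server share and the local client copy.
    const string ServerCopy = @"\\server\ServerDirectory\databasename.mdb";
    const string LocalCopy = @"C:\ClientDirectory\databasename.mdb";

    static void Main()
    {
        // Refresh the local front end only when the server copy is newer.
        if (!File.Exists(LocalCopy) ||
            File.GetLastWriteTimeUtc(ServerCopy) > File.GetLastWriteTimeUtc(LocalCopy))
        {
            File.Copy(ServerCopy, LocalCopy, true);
        }

        // Open the local copy with whatever Access version is associated with .mdb files.
        Process.Start(LocalCopy);
    }
}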

Custom Building Block Template won't load reliably

My small collection of document-specific macros and quickpart building blocks is growing! I'm starting to share these with employees and am looking to set up each remote computer once only. From then on, I would update the collections on a network path, and because each computer looks to the shared location, everyone should always be working with up-to-date macros, quickparts, etc.
So. What I already know:
- Required macros are saved in a separate module, ready to be shared/exported.
- Macros themselves occasionally reference local paths on my computer.
- I will need to reference paths with generic code or use Environ variables.
- Building blocks and quickparts are saved in a separate template file (currently located in Appdata, along with default building block file).
What I don't know:
a) How to point Word to a network path to retrieve macros from custom macro files. (Would I just have to import a fresh macro file at every important update, on each PC?)
b) What's the best way to load a building block item from a CUSTOM path?
My custom BuildingBlock template file is not loaded properly on occasion:
Dim objTemplate As Template
Dim objBB As BuildingBlock
'Set a reference to the template that stores the building block
Set objTemplate = Templates("C:\Users\[USER]\AppData\Roaming\Microsoft" & _
    "\Document Building Blocks\1033\CustomBBlocks.dotx")
Set objBB = objTemplate.BuildingBlockEntries.Item("[EntryName]")
I know this because the code spits out a 'CollectionDoesntExist' error unless I click the Quick Parts gallery prior to running the code for the first time. It's as if Word can't be bothered to open the template file and look inside unless I do it from the UI first.
Of course, if I first open the Quickparts gallery from the UI, prior to running my code, Word seems to figure it out, and inserts the correct Building Block entry without any issue.
In the past I've worked on a product that provides building blocks for Word too. Some sites have hundreds of templates and maybe 1,000 elements (see Composition). The approach we took was successful, and it was different.
You are trying to deploy software elements (macros) across a large number of workstations. You can try to get this working using the built-in possibilities of Microsoft Word and Windows, but it will be sensitive to problems when things change: switching to Office 2013, splitting a domain in two, working at home without VPN, etc.
Option 1 - DIY deployment: Better to put the macros and other assets behind a web page, web service, or similar, and deploy on each workstation a generic program that pulls everything in and installs it locally. You might want to hand over some parameters to the page being called to restrict the amount of data, and you might want to cache things locally. (A rough sketch of such a program follows after these options.)
Option 2 - Use ClickOnce: Write a ClickOnce deployment script, include the necessary references, and put it on a shared network drive or an HTTP address. ClickOnce automatically upgrades your software when it finds a new version, it works even across the internet, and it does nothing when there is no new version.
Option 3 - Database: Put the elements in a central database, allowing end users to change building blocks through forms, and have Microsoft Word, in combination with a ClickOnce program, pull them in.
For Composition we used options 2 and 3.
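As an illustration of option 1, the 'generic program' can be as small as something that copies the shared templates from the network path into the folders Word already reads from. Everything here is an assumption to adjust to your own layout: the share path, the file names, and the choice of the Document Building Blocks folder plus the Word STARTUP folder as targets.

using System;
using System.IO;

class DeployWordAssets
{
    static void Main()
    {
        // Placeholder network share holding the master copies.
        string share = @"\\server\WordAssets";
        string appData = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);

        // The building block template sits next to the default building block file.
        string bbTarget = Path.Combine(appData,
            @"Microsoft\Document Building Blocks\1033\CustomBBlocks.dotx");

        // The macro template goes into the Word STARTUP folder so it loads as a global add-in.
        string startupTarget = Path.Combine(appData,
            @"Microsoft\Word\STARTUP\SharedMacros.dotm");

        Directory.CreateDirectory(Path.GetDirectoryName(bbTarget));
        Directory.CreateDirectory(Path.GetDirectoryName(startupTarget));

        File.Copy(Path.Combine(share, "CustomBBlocks.dotx"), bbTarget, true);
        File.Copy(Path.Combine(share, "SharedMacros.dotm"), startupTarget, true);
    }
}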

SSRS 2008: How to generate multiple reports immediately?

I'm building a site which brings up SSRS reports by opening new windows with the report URL and report parameters. I can currently open a window for each report they want to run.
However, they also want the option to save the reports to a file share or SharePoint site of their choice, instead of having a bunch of browser pop-up windows for each report.
I understand I can use the SSRS web services to set up a schedule (to run a couple of minutes from the time of the request) which can save those files to a file share (or SharePoint), but that seems like a hack for a one-time generation of reports.
Is there any other way to generate a bunch of reports one time, immediately, without having to set them up on a scheduler that runs a couple of minutes after they request it?
Note: they DO NOT want one report that has all the reports in it; these are separate reports that are already built, and they want one file/window per report.
I'm not sure what you want when you say you want them all at once but one file/window per report; what presentation layer is showing this? You can make three separate web calls at the same time to the report server instead of the hosting site:
http://(servername)/(ReportServer)/PathtoReport1
http://(servername)/(ReportServer)/PathtoReport2
http://(servername)/(ReportServer)/PathtoReport3
instead of
http://(servername)/(Reports)
If you just mean 'separate pages' in an Excel workbook, you can do that with one report nesting other subreports: build a master report with rectangle objects that define the page breaks in their properties, and place a subreport inside each rectangle.
Or you could make an HTML page that references the three calls separately in a 'form' element that does a 'post':
<form id="SSRSRender" action="http://(servername)/(ReportServer)/(report)" method="post" target="_self">
"However, they also want the option to save the reports to a file share or Sharepoint of their choice, instead of having a bunch of browser window pop-ups for each report.
I understand I can use SSRS web services to setup a schedule (to run in a couple minutes from the time of request) which can save those files to a file share (or Sharepoint) but that seems like a hack to get a one time generating of reports onto a file share or sharepoint."
That's not a hack; using the built-in web service scheduler is the preferred method of saving a file. Once a report is hosted on a server running SSRS, it can be configured for SMTP send-outs, file saves, and snapshots.
If that is not enough, you can create your own proxy classes in C# or VB.NET and build your own front end that talks to SSRS through SOAP requests to the web service.
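For what it's worth, here is a minimal sketch of that web-service route: it renders a single report immediately and writes it to a file share, with no schedule involved. It assumes a proxy class named ReportExecutionService generated from /ReportServer/ReportExecution2005.asmx; the server name, report path, and output path are placeholders.

using System;
using System.Net;

class RenderReportNow
{
    static void Main()
    {
        // Proxy generated from the ReportExecution2005.asmx web service (placeholder URL).
        var rs = new ReportExecutionService
        {
            Url = "http://servername/ReportServer/ReportExecution2005.asmx",
            Credentials = CredentialCache.DefaultCredentials
        };

        rs.LoadReport("/PathtoReport1", null);

        string extension, mimeType, encoding;
        Warning[] warnings;
        string[] streamIDs;

        // Render to PDF right now; "EXCEL" or "WORD" work the same way.
        byte[] result = rs.Render("PDF", null,
            out extension, out mimeType, out encoding, out warnings, out streamIDs);

        // Write the rendered report straight to the file share (placeholder path).
        System.IO.File.WriteAllBytes(@"\\fileshare\Reports\Report1." + extension, result);
    }
}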

SP2010 Client Object Model: Uploading File to Drop Off Library Doesn't Apply Content Organizer Rules

I am currently developing a service using the SharePoint 2010 Client Object Model to programmatically upload Excel worksheets to a Drop Off Library and then set the properties on the file. This process is working well. However, the Drop Off Library is governed by Content Organizer Rules that aren't being applied to the uploaded file. I have examined every property I thought I could have missed:
ContentTypeId is being properly set
_ModerationStatus is being set to 0
The two properties required to invoke the rule are being set to valid values
Update is being called on the ListItem
The file is checked in after the ListItem is updated
The list doesn't have minor versioning enabled so I don't make any calls to publish.
What's most frustrating is that if I edit the document properties using the web UI and check the file back in without making any changes, it is moved to its final location. What might I have overlooked that is preventing Content Organizer rules from being applied to newly uploaded files when using the SP2010 Client Object Model?
The ultimate answer to this question turned out to be that everything was indeed being set correctly; however, one cannot force the evaluation of Content Organizer rules programmatically. The information I required was provided by a post from Steve Curran on this MSDN thread.
In SharePoint 2010 Central Administration, under the "Monitoring" section, there is a control panel for "Timer Jobs" that includes an item to "Review job definitions." On this panel there should be a job named "Content Organizer Processing." This is a nightly task that runs and routes content according to the rules you have established in your site. After uploading a file to the Drop Off Library programmatically, you will likely find that hitting the "Run Now" button for this job causes the file to be moved to its final destination, provided the properties are set correctly.
The solution was to change the frequency of this job, under the Recurring Schedule section, from a nightly process to one that executes every 15 minutes (or whatever interval you determine will work best); a code sketch of the same change is below.
A word of caution: if you send automated e-mail to the site administrator or a mailing list when files are left in the Drop Off Library without their properties set correctly, those messages will start arriving with the same frequency as the job's execution.
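If you would rather make the schedule change (or trigger a one-off run) from code instead of Central Administration, something along these lines should work from a farm-side console app; the web application URL is a placeholder and the job title is assumed to match what the Timer Jobs page displays.

using System;
using System.Linq;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

class RescheduleContentOrganizerJob
{
    static void Main()
    {
        // Placeholder web application URL.
        SPWebApplication webApp = SPWebApplication.Lookup(new Uri("http://server/"));

        SPJobDefinition job = webApp.JobDefinitions
            .FirstOrDefault(j => j.Title.Equals("Content Organizer Processing",
                                                StringComparison.OrdinalIgnoreCase));
        if (job == null)
            return;

        // Run every 15 minutes instead of nightly.
        job.Schedule = new SPMinuteSchedule { BeginSecond = 0, EndSecond = 59, Interval = 15 };
        job.Update();

        // Or kick off an immediate run, like the "Run Now" button.
        job.RunNow();
    }
}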
This article may help.
Basically, forcing the rules to run does not appear to be supported in the SP2010 Client Object Model, so you have to work around it, unfortunately.

InfoPath data connection to SharePoint: how to avoid a hard-coded list ID?

I've created an InfoPath 2010 form with a connection to a SharePoint list. This connection allows me to populate a drop-down list, and it works as expected when I work on an existing site.
Now I want to publish this form as the task form of a workflow feature. The workflow is part of a site template that also defines some list instances. Because list instances get new IDs each time they are created, the form's data connection won't work (the list ID and the SPWeb absolute URL are hard-coded in the data connection in the .xsf file).
Is there a clean way to populate a DDL in InfoPath without the actual list ID?
In fact, can I bind to "lists/mylist" instead of {myguid}?
Thanks.
(Angry at Microsoft for using GUIDs everywhere without the ability to control them.)
I finally followed this approach:
In my forms, I converted the data connections to data connections shared in the host SP site; this generated the .udcx files for me.
Then, in VS 2010, I created a feature with a module to provision a data connection library holding all of these .udcx files. In the .udcx files, I replaced the GUIDs with tokens like $listguid$ or $weburl$.
I also wrote a feature receiver to replace my tokens with the actual values after the module is provisioned; a rough sketch of what such a receiver can look like is below.
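This is only a sketch of the idea, assuming the .udcx files are provisioned into a library named "DataConnections" and the list instance is titled "mylist"; both names, and the token spellings, are placeholders to adapt to your own template.

using System;
using System.Text;
using Microsoft.SharePoint;

public class FixUdcxTokensReceiver : SPFeatureReceiver
{
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        SPWeb web = properties.Feature.Parent as SPWeb;
        if (web == null)
            return;

        SPList connectionLib = web.Lists["DataConnections"]; // library provisioned by the module
        SPList targetList = web.Lists["mylist"];             // list instance created by the site template

        foreach (SPListItem item in connectionLib.Items)
        {
            SPFile file = item.File;
            string udcx = Encoding.UTF8.GetString(file.OpenBinary());

            // Swap the placeholder tokens for this particular site's values.
            udcx = udcx.Replace("$listguid$", targetList.ID.ToString("B"))
                       .Replace("$weburl$", web.Url);

            file.SaveBinary(Encoding.UTF8.GetBytes(udcx));
        }
    }
}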
Quite painful, and I am very disappointed by these big holes in the SP development process.