SharePoint Integrated Reports missing drop-down to manage data sources (SharePoint 2010)

I have installed and configured SSRS in SharePoint integrated deployment mode and have successfully run a report from SharePoint. I created a custom deployment application that uploads all reports and datasets, creates all data sources, and makes the proper connections between them when necessary.
I have one report that failed, and I need to manually adjust the report's connection to a data source. However, I found that the drop-down does not contain the options that let me manage its shared data sources (see example below).
In this image you can see the option that I am missing. Please excuse the colors, this is the best image I could find online in a pinch.
This is only happening in one environment, so there must be some configuration I am overlooking that controls these options. Here are the things I have already checked:
The account I am using is in the site's Owners group and has full control of everything, including the report file.
The item is being uploaded as a Document content type for some reason, but I edited properties and changed that to Report Builder Report content type.
The Report Server Integration site collection feature has been activated.
All of the Reporting Services content types have been added to the list.

I would revert to deploying from BIDS to debug this issue; the deployment process performs some validation and may return meaningful errors.

So this turned out to be caused by one of our customizations. We had an old custom JavaScript function with the same name as a SharePoint JavaScript function involved in those drop-down actions. Hope this helps someone else.

Related

Can I use Sanity.io for my own data structures, or just pre-defined ones like 'Blog'?

I've just set up a sample blog installation. I thought I would be able to define my own data structures but can't see how.
There is a settings page, but I can't figure out what it is showing me.
Can someone please offer me some guidance?
Yes, you can absolutely define your own types.
What I suspect you're missing is that there's no web UI for doing so; the Studio app, which is what you're viewing, is for editing content/data.
Sanity is JavaScript-driven, so you'll find the schemas and type definitions under the schemas folder in the sample app. As you edit the files under this folder, the UI will automatically reload to reflect the changes.
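For example, a minimal custom document type might look like the following (the 'author' type and its field names are hypothetical; in the Studio this object is default-exported from a file under the schemas folder and registered in the schema index, details that vary by Sanity version):

```javascript
// schemas/author.js - a hypothetical custom document type.
// Any plain object following the schema shape (name/type/fields)
// defines a new content type that the Studio will render a form for.
const author = {
  name: "author",
  type: "document",
  title: "Author",
  fields: [
    { name: "fullName", type: "string", title: "Full name" },
    { name: "bio", type: "text", title: "Short bio" },
  ],
};
```

Once the file is saved and registered, the running Studio reloads and shows "Author" as a creatable document type.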

Pentaho report contents not displayed in the Pentaho User Console

I'm facing an error with a .prpt file.
The report runs fine in Report Designer, but when the same .prpt is run in the Pentaho User Console (PUC), the report contents are not displayed.
It is a simple report containing eight columns, with only text and number fields in the detail section and only labels (some formula-driven) in the headers.
Any tips would be appreciated.
Check the server logs; there will be an exception. I've seen this before. Frequently it is due to layout issues; for some reason, in some cases the server bombs even though the client manages to render the report. Double-check the obvious, like overlapping elements.
Also check the version of your PRD client against the exact version of the reporting libraries on the server; they MUST match.
I've seen this happen when a .prpt file was created in an older (or newer) version of PRD than the server runs. You'll get JavaScript errors, and on screen you'll just see a blank page/tab. Make sure your version of PRD matches the server version.

How to add a data source to the report code in SSRS?

When you create a shared data source, it puts this in the code of a report:
<DataSources>
  <DataSource Name="KISdfgdfgQL1">
    <DataSourceReference>KISdfgdfgQL1</DataSourceReference>
    <rd:SecurityType>None</rd:SecurityType>
    <rd:DataSourceID>45ad295c-cc2f-438sdfsdf3833230</rd:DataSourceID>
  </DataSource>
</DataSources>
I just created a new data source and want to use it inside an existing report.
When I try to deploy the solution, I get this message:
Error 1 [rsInvalidDataSourceReference] The dataset 'Community' refers to the data source 'my_new_datasource', which does not exist.
I did indeed create the new data source; however, it did not create the tags for it inside the code:
<rd security type...
< datasource id...
How can I force SSRS to use this new data source?
I'm using VS 2010.
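For comparison, the report's RDL needs a matching entry in its <DataSources> element. A sketch of what the missing block for the new data source would look like follows (the reference name must match the shared data source on the server; the GUID below is a placeholder, as the designer generates a real one):

```xml
<DataSources>
  <DataSource Name="my_new_datasource">
    <!-- Must match the shared data source's name on the server -->
    <DataSourceReference>my_new_datasource</DataSourceReference>
    <rd:SecurityType>None</rd:SecurityType>
    <!-- Placeholder GUID; the designer generates a real one -->
    <rd:DataSourceID>00000000-0000-0000-0000-000000000000</rd:DataSourceID>
  </DataSource>
</DataSources>
```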
Generally, when you deploy to an SSRS server with a shared data source, I find that referencing something that has not yet been deployed will fail. Changing an existing shared data source may also fail, because the default behavior for SSRS deployment is to NOT overwrite data sources, even if the shared data source already exists. I am not certain, but this could be due to the 'alias' of your shared data source being named differently, or simply because any change alters the GUID of the data source. You can generally fix this manually:
1. Open the report on the SSRS server hosting it (even if it fails). Generally it is somewhere like http://(server)/(Reports)/(path to report).
2. Right above the report parameters, or at the top of the view, there is a hyperlink with the report name; click it.
3. You now have a management screen. Click 'Data Sources'.
4. If you have a shared data source, a radio button labeled 'A shared data source' will be selected. (You may have one or many of these.)
5. If your data source is not working, point to it manually by clicking 'Browse'. Generally the default deployment puts it under 'Data Sources' off the root of the SSRS site.
6. Once you have the correct data source, click OK.
7. DO NOT FORGET to hit Apply when you get back to the 'Data Sources' screen.
8. (Optional) You can hit Test if your data source has stored credentials.
If this does not work, I would suggest redeploying the shared data source, provided this is not a production environment and it will not harm anything. Getting SSRS servers to play nicely with shared data sources and shared data sets when you change them is sometimes easy, but often a minor change breaks either the new report or the existing ones. My general rule is to get your shared data sources and data sets correct and then never change them. When you add source control to the mix, it gets even worse.

SP2010 Client Object Model: Uploading File to Drop Off Library Doesn't Apply Content Organizer Rules

I am currently developing a service using the SharePoint 2010 Client Object Model to programmatically upload Excel worksheets to a Drop Off Library and then set the properties on the file. This process is working well. However, the Drop Off Library is governed by Content Organizer Rules that aren't being applied to the uploaded file. I have examined every property I thought I could have missed:
ContentTypeId is being properly set
_ModerationStatus is being set to 0
The two properties required to invoke the rule are being set to valid values
Update is being called on the ListItem
The file is checked in after the ListItem is updated
The list doesn't have minor versioning enabled so I don't make any calls to publish.
What's most frustrating is that if I edit the document properties using the Web UI and check it back in without making any changes, the file is moved to its final location. What might I have overlooked that is preventing Content Organizer Rules from being applied to newly uploaded files when using SP2010 COM?
The ultimate answer to this question turned out to be that everything was indeed being set correctly. However, one cannot force the evaluation of Content Organizer rules programmatically. The information I required was provided by a post from Steve Curran on this MSDN thread.
In SharePoint 2010 Central Administration under the "Monitoring" section there is a control panel for "Timer Jobs" that includes an item to "Review job definitions." On this panel, there should be a job named "Content Organizer Processing." This is a nightly task that will run and clean up content according to the rules you have established in your site. After uploading a file to the drop off library programmatically, you will likely find that hitting the "Run Now" button for this job will cause the file to be moved to its final destination if the properties are set correctly.
The solution was to change the frequency of this job under the Recurring Schedule section from a nightly process to one that is executed every 15 minutes (or whatever interval you determine will work best).
A word of caution: Be certain to note that if you send automated e-mail to the site administrator or a mailing list when files are left in the drop off library that do not have their properties set correctly, these will start arriving with the same frequency as the job's execution.
This article may help.
Basically, it does not appear to be supported in the 2010 COM so you have to work around it, unfortunately.

InfoPath data connection to SharePoint: how to avoid a hard-coded list ID?

I've created an InfoPath 2010 form with a connection to a SharePoint list. This connection allows me to populate a drop-down list, and it works as expected on an existing site.
Now I want to publish this form as the task form of a workflow feature. The workflow is part of a site template that also defines some list instances. Since list instances get new IDs each time they are created, the form's data connection won't work (the list ID and the SPWeb absolute URL are hard-coded in the data connection in the .xsf file).
Is there a clean way to populate a drop-down list in InfoPath without the actual list ID?
In fact, can I bind to "lists/mylist" instead of {myguid}?
Thanks.
(Frustrated with Microsoft for using GUIDs everywhere without the ability to control them.)
I finally followed this approach:
In my forms, I converted the data sources to data connections shared in the host SharePoint site. This generated the .udcx files for me.
Then, in VS 2010, I created a feature with a module to provision a data connection library holding all of these .udcx files. In the .udcx files, I replaced the GUIDs with tokens like $listguid$ or $weburl$.
I also wrote a feature receiver that replaces my tokens with the actual values after the module is provisioned.
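The token substitution in the feature receiver can be sketched as a plain string replacement (shown in JavaScript for brevity; the actual receiver was C# in VS 2010, and the .udcx element names and sample values here are illustrative assumptions):

```javascript
// Sketch of the token-replacement step: the provisioned .udcx content
// contains placeholder tokens that are swapped for the real values,
// which are known only after the list instances are created.
function resolveUdcxTokens(udcxText, tokens) {
  let result = udcxText;
  for (const [token, value] of Object.entries(tokens)) {
    result = result.split(token).join(value); // replace every occurrence
  }
  return result;
}

// Illustrative fragment of a list data connection with tokens.
const udcx = '<udc:ListId>$listguid$</udc:ListId><udc:WebUrl>$weburl$</udc:WebUrl>';
const resolved = resolveUdcxTokens(udcx, {
  "$listguid$": "{00000000-0000-0000-0000-000000000000}", // actual list GUID looked up at activation
  "$weburl$": "https://server/sites/team",                // actual web URL at activation
});
```

The receiver writes the resolved text back to the data connection library item, so the form binds to the freshly created list instance.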
Quite painful, and I am very disappointed by these big holes in the SharePoint development process.