I'm trying to run a patch request from Raven Studio on a large collection (over 8,000,000 documents), but the patch doesn't seem to work: I don't see any changes on the documents. If I run it on a single document it works correctly. As a further test I created an identical collection with 100 documents, and there the patch works correctly.
In Raven Studio under Status -> Running Tasks I can see my patch task running, but without any effect... any suggestions?
Thanks in advance.
Raven Studio version 3.0
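For context, the operation is the same set-based patch you would issue through the 3.0 client API, roughly like the sketch below; the index name, the "Orders" collection tag, the URL, and the script are placeholders, not my real patch.

using Raven.Abstractions.Data;
using Raven.Client.Document;

class PatchWholeCollection
{
    static void Main()
    {
        // Placeholder URL/database - a sketch only.
        var store = new DocumentStore { Url = "http://localhost:8080", DefaultDatabase = "MyDatabase" };
        store.Initialize();

        // Patch every document in the hypothetical "Orders" collection through the
        // built-in Raven/DocumentsByEntityName index. AllowStale lets the bulk
        // operation start even if the index hasn't fully caught up.
        var operation = store.DatabaseCommands.UpdateByIndex(
            "Raven/DocumentsByEntityName",
            new IndexQuery { Query = "Tag:Orders" },
            new ScriptedPatchRequest { Script = "this.Processed = true;" },
            new BulkOperationOptions { AllowStale = true });

        // Block until the server reports the bulk patch as finished.
        operation.WaitForCompletion();
    }
}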
This error just started occurring on a few developers' systems across several packages, but I can't track down a specific cause or update. We have SSIS processes created across various targets (SQL 2012 and up), but when I open them in Visual Studio 2019 this error occurs:
Error loading XXXXXXX.dtsx: There was an exception while loading Script Task from XML: System.Exception: The Script Task "ST_36ae893a14204fac97ce8ce3b4ce8ebb" uses version 16.0 script that is not supported in this release of Integration Services. To run the package, use the Script Task to create a new VSTA script. In most cases, scripts are converted automatically to use a supported version, when you open a SQL Server Integration Services package in %SQL_PRODUCT_SHORT_NAME% Integration Services.
at Microsoft.SqlServer.Dts.Tasks.ScriptTask.ScriptTask.LoadFromXML(XmlElement elemProj, IDTSInfoEvents events)
I can open the script task, but it's as if it's brand new: none of the existing code is there. Some of the older packages I can open in Visual Studio 2017 and they work, but in Visual Studio 2019 not so much. Even some packages built in Visual Studio 2019 are doing this. Here's my dev environment:
Microsoft Visual Studio Enterprise 2019
Version 16.11.17
VisualStudio.16.Release/16.11.17+32630.194
Microsoft .NET Framework
Version 4.8.04084
SQL Server Data Tools 16.0.62205.05200
Microsoft SQL Server Data Tools
SQL Server Integration Services 16.0.948.0
Microsoft SQL Server Integration Services Designer
I've tried changing the target server to different versions, but it seems that once the issue occurs, it resets the script task and removes all the code. I'm really confused.
Any thoughts? Thanks.
Not sure if this is a good answer, but it does seem to fix the issue. I'm using SQL Server Integration Services 16.0.948.0 (v4.3), but if I go back down to 15.0.2000.180 (v3.16) the issue seems to go away. So, it's something with version 4.0 and up. Not ideal to go backwards as we're losing some of the updates - but it gets me going again. If anyone has other suggestions, please let me know.
I just migrated from SSDT 2015 to SSDT 2019 and tried to open a project created in SSDT 2015. One of the packages has a data flow task with a script component in it. The script component fails to build with the error:
Could not find part of the path
'C:\Users\xxxxxx\AppData\Local\Temp\2\Vsta\c2e811fdc5974e2ca3f7cb5426c82033_out'
I tried to delete the .vs folder in my project but that didn't work. The script still fails to rebuild. The script has a lot of classes in it so I'd rather not start from scratch and copy everything into it. Any idea what could be wrong?
It appears this issue may be caused by an incompatibility between Visual Studio 16.9 and the SQL Server Integration Services Projects extension. The issue is described here: https://marketplace.visualstudio.com/items?itemName=SSIS.SqlServerIntegrationServicesProjects
The recommendation is to roll back to Visual Studio 16.8 or earlier.
The new version of data tools fixed it for me without rolling back to 16.8. This was a difficult issue to resolve. Nowhere in the MS documentation do they mention the tools update fixes that specific error. Thanks for the link!
I have seen other posts about this topic where some of the suggestions lead people to set the ProtectionLevel to DontSaveSensitive. I have made sure that it is set to DontSaveSensitive, and I have also checked permissions and made sure the locations the files/dtsx files are called from have ample permissions for the service account which owns the SQL Agent.
The odd thing is this process was working fine until I went into one of the previous dtsx files and had to update a data type precision from a limit of 1 character to 30 characters. That was literally the only change made to the process, but now I am getting this error. I have gotten this error before, which is what set me on the path of checking the protection level and permissions/ownership. For some reason it went away and began working when I made those changes. None of that (permissions/ownership) is incorrect this time around, yet I am getting the same error.
Another weird thing about the process is that it is only the last step which is failing (the FTP step). When I try to go in and execute psftp.exe and pass in the command which is normally passed through the SSIS Execute Process Task step, psftp.exe tells me that the port number is incorrect... yet when I test the connection on the connection manager inside VS with the exact same port, it says the connection is successful.
This error is vague and confusing!
I would love some guidance on some more things to try.
Thank you!
SSIS tooling and version
SQL Server 2008 and 2008 R2 packages can only be edited using Visual Studio 2008 with the Business Intelligence Development Studio (BIDS) templates installed. Those can only be acquired by having the SQL Server installation media (ISO) handy. Developer Edition will work, but you need some form of licensed media to get BIDS working.
Visual Studio 2013 is going to attempt to upgrade 2008/R2 packages to the internals of a SQL Server 2012 installation. There is no going backwards/downgrading once this is done.
Any tooling (dtexec, dtutil, etc.) you use must be from the same version; otherwise, the first thing the binaries do is update the package to match that version. For execution (dtexec), each time you run a package there is a delay while the original is upgraded in memory to match, and then execution begins (assuming all goes well). It sounds like all is not going well in your case, based on
The package failed to load due to error 0xC0010014...
For deployment (dtutil), you only pay the price of upgrading once, and then it's upgraded forever - which probably isn't what you wanted. Be aware that tools like Visual Studio and SSMS "know" which version of tooling they are associated with, so deploying from SSMS 2016 can result in the SQL Server 2016 SSIS binaries upgrading your 2008 package to the 2016 format and then attempting to deploy the now-upgraded bits to your 2008 box. It's all very frustrating and not intuitive.
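As a sketch of what "from the same version" means in practice (the paths below are the default install locations and may differ on your machine; 100 is the 2008/R2 folder, 110 is 2012):

rem Runs the package with the 2008-era binaries - no upgrade needed.
"C:\Program Files\Microsoft SQL Server\100\DTS\Binn\dtexec.exe" /FILE "C:\SSIS\MyPackage.dtsx"
rem Runs the same package with the 2012 binaries - the package is upgraded
rem in memory to the 2012 format before execution starts.
"C:\Program Files\Microsoft SQL Server\110\DTS\Binn\dtexec.exe" /FILE "C:\SSIS\MyPackage.dtsx"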
From your comment "In the 2008 version, the play button is greyed out..." That indicates you have opened a File in Visual Studio that is an SSIS package. Visual Studio will open it and paint all the icons on there but it can't actually run a package unless you have an Integration Services project open (and have the BI templates installed).
Assuming you have source control, you can rollback the change that broke everything and try to edit the package properly.
Execute process task
You have an Execute Process Task that is invoking psftp.exe, and it's returning a 1 instead of a 0. Is that bad? In my experience, SFTP clients are rather picky: running a package as me on production would fail because my domain account didn't have the right bits associated with it, while the service account had all the right things in its profile and it ran just fine - same machine, same package, just a different user.
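A quick way to narrow it down is to run the exact same command line outside of SSIS, ideally in a session running as the SQL Agent service account, and look at the exit code. A sketch, with placeholder host, port, and script file:

rem Substitute whatever the Execute Process Task actually passes.
psftp.exe sftpuser@sftp.example.com -P 2222 -b upload.scr -batch
echo Exit code was %errorlevel%

The Execute Process Task compares that exit code against its SuccessValue property (0 by default), so the 1 psftp is returning is exactly what turns the step red.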
I have written dozens of custom code analysis rules. The rules were developed targeting Visual Studio 2010. As required, the assembly has a reference to version 10.0 of FxCopSdk, Microsoft.Cci, and Microsoft.VisualStudio.CodeAnalysis. They run correctly in Visual Studio 2010 and build properly in TFS 2010.
I'd like to migrate to Visual Studio 2012. When I run the custom rules on an existing solution using VS 2012, however, I get CA0062 errors. The root cause is a CA0053 error loading the custom rules assembly. I understand that these references to the three assemblies need to be updated to version 11 for Visual Studio 2012. This can be done using version redirects in config files. I can get this to work locally by redirecting the Visual Studio 2012 IDE and FxCopCmd binaries, but am running into trouble when checking code into TFS 2010.
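For reference, the redirect I'm talking about is just a standard assembly binding redirect dropped into devenv.exe.config and FxCopCmd.exe.config, roughly like the sketch below (one dependentAssembly entry per FxCop assembly; the public key token should be verified against your own copy of FxCopSdk.dll, e.g. with sn -Tp):

<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="FxCopSdk" publicKeyToken="b03f5f7f11d50a3a" culture="neutral" />
        <bindingRedirect oldVersion="10.0.0.0" newVersion="11.0.0.0" />
      </dependentAssembly>
      <!-- repeat the dependentAssembly element for Microsoft.Cci and Microsoft.VisualStudio.CodeAnalysis -->
    </assemblyBinding>
  </runtime>
</configuration>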
There are two apparent solutions we have considered, but neither is very palatable. The first is to require each developer to redirect locally, and then modify the TFS build agents to redirect as well. The second is to maintain two branches of the custom code analysis rules, one targeting version 10 (VS2010) and the other targeting version 11 (VS2012).
Is there a better way to do this, or do we need to all upgrade to TFS 2012 and Visual Studio 2012 simultaneously?
You can try to manually edit the project file and write two include blocks (one for VS2010 and one for VS2012), then define conditions to use the correct one. You only have to somehow determine whether you want to build for VS2010 or VS2012 in MSBuild, as in the sketch below.
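A rough sketch of that idea; the VsVersion property name and the hint paths are just examples and need to match the FxCop folders of your own VS 2010/2012 installations:

<ItemGroup Condition="'$(VsVersion)' == '10.0'">
  <Reference Include="FxCopSdk">
    <HintPath>$(MSBuildProgramFiles32)\Microsoft Visual Studio 10.0\Team Tools\Static Analysis Tools\FxCop\FxCopSdk.dll</HintPath>
  </Reference>
  <Reference Include="Microsoft.Cci">
    <HintPath>$(MSBuildProgramFiles32)\Microsoft Visual Studio 10.0\Team Tools\Static Analysis Tools\FxCop\Microsoft.Cci.dll</HintPath>
  </Reference>
  <Reference Include="Microsoft.VisualStudio.CodeAnalysis">
    <HintPath>$(MSBuildProgramFiles32)\Microsoft Visual Studio 10.0\Team Tools\Static Analysis Tools\FxCop\Microsoft.VisualStudio.CodeAnalysis.dll</HintPath>
  </Reference>
</ItemGroup>
<ItemGroup Condition="'$(VsVersion)' == '11.0'">
  <Reference Include="FxCopSdk">
    <HintPath>$(MSBuildProgramFiles32)\Microsoft Visual Studio 11.0\Team Tools\Static Analysis Tools\FxCop\FxCopSdk.dll</HintPath>
  </Reference>
  <Reference Include="Microsoft.Cci">
    <HintPath>$(MSBuildProgramFiles32)\Microsoft Visual Studio 11.0\Team Tools\Static Analysis Tools\FxCop\Microsoft.Cci.dll</HintPath>
  </Reference>
  <Reference Include="Microsoft.VisualStudio.CodeAnalysis">
    <HintPath>$(MSBuildProgramFiles32)\Microsoft Visual Studio 11.0\Team Tools\Static Analysis Tools\FxCop\Microsoft.VisualStudio.CodeAnalysis.dll</HintPath>
  </Reference>
</ItemGroup>

Then pick the target on the command line, e.g. msbuild CustomRules.csproj /p:VsVersion=11.0, and default VsVersion to 10.0 in a PropertyGroup so the VS2010 developers and the TFS 2010 build agents don't have to pass anything.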
Between your approaches and the one proposed by ZFE, you pretty much have all the potential candidates. Given the choices, I would strongly recommend branching since there is no official SDK for FxCop with backward-compatibility guarantees.
If you're lucky, you won't hit any behavioural or API surface changes that affect your rules, and the only difference between your two branches will be the references, so any merges will be trivial. However, any time investment you make in an alternate approach now will be lost if you need to branch later, and the likelihood of eventually needing to branch is non-negligible.
I have set up a TFS 2012 server with the build controller and the build agent. On a client machine I have a simple Visual Studio 2012 solution with a Windows console application project and a test project. Test impact analysis is enabled in the build definition.
Simply put, the impacted tests list that should appear for each build performed on the TFS server is never populated. I have tried changing the test runner from the VS runner to MSTest, to no avail.
Please advise. Thanks.
EDIT: I installed VS 2012 on the server. This enabled code coverage, but still no impact analysis.
Fortunately I fixed the 'problem'.
For TIA to work, you need to successfully complete a test case in Test Manager, passing all test steps. Only then does the analysis file get generated. I also had problems with the video recording module of the test controller, which apparently can also prevent the file from being generated, though this could just be a coincidence.
After successfully generating the file, subsequent builds will now have a baseline to compare against.
Have fun.