This is a bit odd for me; I've worked through several microservices with unit tests and haven't run into a problem like this. The issue is that my unit tests pass locally but some fail on our build server. The odd part is that if I single out a failing test in the build script, it passes; if I run it together with a test that ran before it, it fails. And if I remote into the build server, open the test results file, and rerun all the tests, they all pass. So to me this says it has to do with my build environment - likely the "runner" context. The specific error I get on failed tests is:
System.Data.Entity.Core.EntityException: The underlying provider failed on Open. ---> System.Data.SqlClient.SqlException: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections.
Again, a test that accessed the DB had run and passed right before this failing test. It should also be noted that these tests use stored procedures to access the data and use LINQ to SQL (.dbml). I initially thought it had to do with the context not being disposed of properly, but after several iterations of IDisposable implementations and peppering using statements throughout the data access code, I think I have ruled that out. I even went so far as to rip out the .dbml reference and new up an entity model (.edmx), but ended up with the same results in the end. After much simplification of the problem, I can reproduce the issue with just 2 unit tests now: one passes, one fails. When run separately they both pass; when run manually, either locally or on the build server, both pass.
Dev Server
We have our dev environment set up on a remote server. All devs use VS 2013 Ultimate, and all devs use a shared instance of localdb. This seems to be working fine: I am able to develop and test against this environment, and all my tests pass here for the solution in question. Then I push code upstream to the build server.
Build Server
This is a Windows Server 2012 machine with GitLab installed; every commit to our dev branches runs a build via the .gitlab-ci.yml build script. For the most part this is just simple msbuild -> mstest calls, nothing too fancy. This server also has its own shared instance of localdb running, with schemas matching the Dev environment. Several other repositories have passing builds/unit tests using this setup. The connection strings for accessing data all use integrated security, and the GitLab runner service account has full privs on the localdb instance. The only thing I can identify as notably different about the solution in question is the heavy use of sprocs; however, like I was saying, some of these unit tests do pass and they all use sprocs. And as I mentioned, after the build fails, if I manually go in, open the test results file, and invoke the tests on the build server by hand, they suddenly all pass.
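For context, the CI step boils down to something like the PowerShell below. This is only a rough sketch; the paths, solution name, and test assembly are placeholders, not our actual script:

# Rough shape of the build step the runner executes (all paths/names are placeholders)
& "C:\Program Files (x86)\MSBuild\12.0\Bin\MSBuild.exe" .\MySolution.sln /p:Configuration=Release

# Run the whole test container...
& "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\MSTest.exe" /testcontainer:.\MyTests\bin\Release\MyTests.dll

# ...or single out one test, which is how I isolated the failing case
& "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\MSTest.exe" /testcontainer:.\MyTests\bin\Release\MyTests.dll /test:FailingTestName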
So I'm not really sure what is going on here, has anyone experienced this sort of behavior before? If so, how did you solve it? I'd like to get these tests passing so my build can pass and move on.
OK, I have got the unit tests to pass, but it was a weird thing I ended up doing to get there and I'm not quite sure why it worked. Even though the runner's account had full "Server Role" privs on the localdb instance (all the boxes checked), I decided to throw up a Hail Mary and went through the process of "mapping" the user to the DB in question, setting his default schema to dbo and giving him full privs (all boxes checked). After this operation... the tests pass. So I'm clearly not understanding something about the way permissions are propagated in localdb; I was under the assumption that a god-like server role would imply full privs on individual DBs, but I guess not? I'm no DBA - I'm actually going to chat with our DBA and see what he thinks. Perhaps this has something to do with the sproc execution? Maybe that requires special privs? I do know that in the past I've had to create special roles to execute sprocs, so they are a bit finicky. Anyway, unit tests pass, so I'm happy now!
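For anyone hitting the same wall, the mapping I clicked through in SSMS corresponds roughly to the T-SQL below, driven here through sqlcmd. The shared localdb instance name, database name, and runner account are placeholders for ours:

# Map the runner's login into the test database (instance/db/account names are placeholders)
sqlcmd -S "(localdb)\.\SharedInstance" -E -d MyServiceDb -Q "CREATE USER [DOMAIN\gitlab-runner] FOR LOGIN [DOMAIN\gitlab-runner] WITH DEFAULT_SCHEMA = dbo;"
sqlcmd -S "(localdb)\.\SharedInstance" -E -d MyServiceDb -Q "ALTER ROLE db_owner ADD MEMBER [DOMAIN\gitlab-runner];"
# A narrower alternative that still lets the sprocs run:
# sqlcmd -S "(localdb)\.\SharedInstance" -E -d MyServiceDb -Q "GRANT EXECUTE ON SCHEMA::dbo TO [DOMAIN\gitlab-runner];"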
Related
I have a job built in Spoon which runs without problems from the command line, but I would like to know if there is any software in which I can execute these jobs and watch the execution visually. The idea is to make running these tasks more pleasant for the operations (exploitation) team.
You have two solutions:
Carte:
Use the Carte server which ships with PDI. Install PDI on any server, launch Carte (specifying the port), and then you can execute/view/stop/restart jobs/transformations from any browser. Documentation is here.
Of course you can launch a job/transformation from your own PDI. Just define a new Slave server (left panel, View tab; default username/password = cluster/cluster). Then each time you run a job/transformation, choose the Carte server instead of Pentaho/local in the Run configuration.
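Starting Carte and checking that it is up looks roughly like this; the install path, bind address, and port are assumptions, so adjust them to your setup:

# Launch Carte on port 8081 (install path and port are assumptions)
& "C:\pentaho\data-integration\Carte.bat" 0.0.0.0 8081

# The status page (default credentials cluster/cluster) lists running jobs/transformations
Invoke-WebRequest "http://localhost:8081/kettle/status/" -Credential (Get-Credential cluster)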
Logging
If you just want to follow a job/transformation, you may use database logging: right-click anywhere, Parameters, Logging, Job/Transformation, then define a database, a table, and a logging interval of 2 seconds.
Then every two seconds the line_read, line_written, errors, and log_field values are written to the database. That table can be read by an external process and displayed on a screen or in a browser.
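A minimal way to poll that log table from the outside might look like the sketch below. It assumes a SQL Server logging database and the default column names Kettle generates for a transformation log table; the server, database, and table names are placeholders:

# Poll the transformation log table every 2 seconds (server/db/table names are placeholders)
while ($true) {
    sqlcmd -S "MyLogServer" -E -d "pdi_logs" -W -Q "SELECT TOP 5 TRANSNAME, STATUS, LINES_READ, LINES_WRITTEN, ERRORS, LOGDATE FROM TRANS_LOG ORDER BY ID_BATCH DESC;"
    Start-Sleep -Seconds 2
}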
This method is used in the github/ETL-pilot project, which uses Tomcat (because you probably already have Tomcat running with a Pentaho server), but it can easily be adapted to Node.js or any other server. (If you do so and open-source it, please add a link to your work on our GitHub.)
I have set up a build server at the company I work for.
This build server interactively works with Visual Studio Team Services.
Building works great and so does publishing. The issue I am running into is the ability to run "dotnet test" as a different user.
This is needed because currently the user the agent runs under is a service account. It has access to IIS and has the ability to move files where they need to be. But it does not have access to the database.
Since we have a few integration tests that we utilize, it errors out when connecting to the database because it is trying to connect as the service user.
So far I have not found a way to run "dotnet test" as a different user, specifically one that has access to query the database.
I tried utilizing the VSTS Task "Run Powershell on Remote Machines" since it lets me supply a username and password. But it seems to have issues trying to remotely connect to itself (which is probably understandable).
I am at a loss. I have no idea how to get this to work, short of giving the service user the ability to run those queries on the database.
Any help is greatly appreciated!
SQL authentication is the better way here, so change the connection string to use SQL authentication:
Server=myServerName\myInstanceName;Database=myDataBase;User Id=myUsername;Password=myPassword;
Authentication article: Choose an Authentication Mode
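If a suitable SQL login does not exist yet, creating and mapping one is roughly the following; all names and the password are placeholders:

# Create a SQL login and map it into the test database (names/password are placeholders)
sqlcmd -S "myServerName\myInstanceName" -E -Q "CREATE LOGIN myUsername WITH PASSWORD = 'myPassword';"
sqlcmd -S "myServerName\myInstanceName" -E -d myDataBase -Q "CREATE USER myUsername FOR LOGIN myUsername; ALTER ROLE db_datareader ADD MEMBER myUsername; ALTER ROLE db_datawriter ADD MEMBER myUsername;"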
You could start a process with the desired identity by passing appropriate credentials, e.g.
param($user, $pwd)
$cred = New-Object System.Management.Automation.PSCredential -ArgumentList @($user, (ConvertTo-SecureString -String $pwd -AsPlainText -Force))
Start-Process $DOTNET_TEST_COMMAND -WorkingDirectory $DESIREDCURRENTWORKINGDIR -Credential $cred
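Called from a build step it would look something like this; the script name and account are hypothetical, with the password coming from a secret build variable mapped to an environment variable:

# Hypothetical invocation; DB_PASSWORD would be populated from a secret VSTS variable
.\Run-TestsAs.ps1 -user "DOMAIN\testrunner" -pwd $env:DB_PASSWORD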
My opinion is that during a build only unit tests should be executed, as you could have side effects on the shared build machine if you execute more involved tests such as functional tests.
Instead of running the functional tests on the build machine, I would suggest using the Run Functional Tests task during a VSTS Release, as it would allow you to:
distribute the tests to a pool of machines where you installed the test agent (by means of the Deploy Test Agent task);
provide the credentials of the identity the tests run as; this functionality is present out of the box in the task, i.e. it solves your problem at the root.
I have an SSIS Package that runs via a SQL job on a SSIS server (Server A) that executes a stored procedure on the database server (server B). This stored procedure does a Bulk insert from a file share that is located on the SSIS Server (Server A). However, every time that the stored procedure runs it fails with the follow error:
Execute Membership Process:Error: Executing the query "exec storedprocname ?, ?" failed with the following error: "Cannot bulk load because the file "\\ServerA\TestLoads\Membership\Processing\filename.csv" could not be opened. Operating system error code 5(Access is denied.).". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
I am pretty positive that the issue is related to permissions. If I store the files on the database server (Server B) then it processes. Or if I run the proc manually, it will work.
When I execute the job on Server A, it runs as the service account for that server. That account has full access to Server A and Server B (admin in SQL and on the server). I believe what is happening is that the credentials get passed the first time, but they are not carried through once the stored proc runs. I ran Wireshark on Server A (the SSIS server) so that I could see what was accessing the share and try to get some more information. What I found was that there was no account information being passed; it was just blank.
I went through a lot of steps just to see if I could get that to work, such as granting everyone access to the share, enabling the guest account, allowing anonymous users, etc. Not stuff I would want to keep in place, but it helped narrow down the issue. None of those worked.
I tried modifying the stored proc to use WITH EXECUTE AS OWNER. It still did not work, but I got a different error. I also tried a variety of other accounts to execute as and got the same error each time:
Execute Membership Process:Error: Executing the query "exec [storedprocname] ?, ?" failed with the following error: "Could not obtain information about Windows NT group/user '', error code 0x5.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
I have tried a variety of solutions that I found online to get this to work, and nothing so far has done it.
I understand that this is not an ideal solution. I was under the impression that the developers were using SSIS to load the file initially and then using SQL for the rest of the process, which would have worked; but because SQL has to touch the file system, it keeps failing, and at this stage rewriting it is not an option. Additionally, this process will work if we move the files to the database server (Server B), but that eliminates a large part of why we have the SSIS server in the first place, which was to get file processing off of the database server.
Any ideas on whether there is a way to get the current solution to work? Basically, what I think I need is to run the SSIS package and have a way to pass credentials to the file share while the stored proc runs.
We are using Windows Server 2012 R2 on both servers and SQL Server 2012 sp3 Developer edition.
Thanks for the help!
I've had this issue before, and I still don't fully understand Kerberos authentication, but setting it up fixed it for me. It's something to do with the "double hop" of authentication, i.e. creds going from SSIS, through SQL Server, to a network server.
Try setting up Kerberos Authentication for SQL Server. There are detailed step-by-step instructions with screenshots here => Setup Kerberos Authentication for SQL Server
I understand this is like a "link-only" answer, but I don't want to copy-paste and plagiarize the author's original work (i.e. the blog post), hence the link.
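To give a flavour of what the setup involves without reproducing the whole post, the checks and commands look roughly like the sketch below; the server, domain, and account names are assumptions:

# Check whether the connection to Server B uses Kerberos or is falling back to NTLM
sqlcmd -S ServerB -E -Q "SELECT auth_scheme FROM sys.dm_exec_connections WHERE session_id = @@SPID;"

# Register SPNs for the SQL Server service account so Kerberos (and delegation) can work
setspn -S MSSQLSvc/ServerB.yourdomain.com:1433 DOMAIN\sqlserviceaccount
setspn -S MSSQLSvc/ServerB.yourdomain.com DOMAIN\sqlserviceaccount

# Then enable delegation for that service account in Active Directory (Delegation tab on the account), as the linked post walks through.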
It's been quite some time that I have been trying to publish the database for my website using Web Deploy in VS 2013, but I keep getting the following error:
Web deployment task failed. (Could not generate deployment script.
Internal Error. The database platform service with type Microsoft.Data.Tools.Schema.Sql.Sql120DatabaseSchemaProvider is not valid. You must make sure the service is loaded, or you must provide the full type name of a valid database platform service.
I have tried installing the newest updates. I am making sure that the local server is running during deployment. I have made sure that I enter the full server name rather than a dot. I have literally done everything I could think of, but to no avail.
Can anyone help me, please?
The issue might be related to the VisualStudioVersion environment variable, which needs to be set to VisualStudioVersion=11.0 or VisualStudioVersion=12.0 to support SQL Server 2014. Check this answer; there is also info here on how to target VisualStudioVersion.
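In practice that means either setting the variable before publishing or passing it as an MSBuild property; the project and profile names below are placeholders:

# Set it for the current session so the publish picks up the right schema provider
$env:VisualStudioVersion = "12.0"

# Or pass it explicitly when publishing from the command line
msbuild .\MyWebApp.csproj /p:DeployOnBuild=true /p:PublishProfile=MyProfile /p:VisualStudioVersion=12.0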
I have a deployment package that needs to run against about 3 different environments.
I want to specify a SQL script to run (source) against the environment's database (destination).
I don't want to specify the connection string in the deploy script because it contains SQL login info.
I would like to be able to read a setting from the destination for the connection string.
Can I mark this as a parameter to be specified when unpackaging the deployment package on the server? If so, how do I use the parameter in dest:sql="connection string"?
Any suggestions would be great.
Scott Guthrie has a pretty good write-up on this sort of thing here. He specifically covers changing parameters, both via prompts for the admin and in an automated fashion via the command line within deployment and/or automation scripts.
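As a rough sketch of the command-line route, a package built with a declared connection-string parameter can be deployed like this; the package name, parameter name, and connection-string values are all placeholders:

# Deploy the package and override the declared connection-string parameter at install time
$connStr = "Server=TargetSql;Database=MyDb;User Id=deployUser;Password=*****;"
& "C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync `
    -source:package="MySite.zip" `
    -dest:auto,computerName="TargetServer" `
    -setParam:name="DbConnectionString",value=$connStr

The same parameter can also be set by editing the generated *.SetParameters.xml file next to the package before running the corresponding *.deploy.cmd, which keeps the login info out of the build script itself.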