Diagnostic Tools in VS does not catch ADO.NET events when using async methods

I am developing a web API and using asynchronous methods as much as I can.
I am using the Unit of Work pattern with the Repository pattern and EF 6.1.3.
I know that in the Diagnostic Tools window in VS2015 we can see ADO.NET events and check the queries performed against the database (SQL).
My problem is that I can't see them, maybe because they are asynchronous.
Is there a way to check the queries performed?
I know other tools that can do this (SQL Profiler / EF interceptors), but I would like to use the Diagnostic Tools in VS.
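Since the question already mentions EF interceptors, one way to see the generated SQL without leaving Visual Studio's output windows is a minimal EF6 command interceptor. This is only a sketch (the class name is mine, and the registration must run once at startup):

```csharp
using System.Data.Common;
using System.Data.Entity.Infrastructure.Interception;
using System.Diagnostics;

// Writes each command's SQL to the Debug output window, which also catches
// commands issued through the async EF methods.
public class SqlLoggingInterceptor : DbCommandInterceptor
{
    public override void ReaderExecuting(
        DbCommand command, DbCommandInterceptionContext<DbDataReader> interceptionContext)
    {
        Debug.WriteLine(command.CommandText);
        base.ReaderExecuting(command, interceptionContext);
    }
}

// Register once at startup (e.g. in Application_Start):
// DbInterception.Add(new SqlLoggingInterceptor());
```

Alternatively, EF6's built-in `context.Database.Log = s => Debug.WriteLine(s);` logs the same information without a custom class.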

Related

How can I connect to an external SQL database using Blazor without using packages (like Entity Framework)

I'm not even sure if this is possible but Google has been unable to help me. It may just be because Blazor is so new. Anyway, I've got a premade database and I want to connect to it directly like how you can open a connection, run some SQL, then close a connection in ASP.NET. I, unfortunately, can't just make a new database using code-first as most tutorials tell you to do.
Two options that spring to mind are Entity Framework Core (database first) or Dapper.
I'm actually connecting to an existing database using Dapper in my Blazor projects. There are better Dapper examples/tutorials available, but the repository below is a basic example.
https://github.com/DotNetDublin/BlazorServerSide/tree/main/BlazorServerSide
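For reference, a basic Dapper sketch along those lines; the connection string, table, and the Product type are placeholders for your own schema:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Threading.Tasks;
using Dapper;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ProductService
{
    private readonly string _connectionString;

    public ProductService(string connectionString) => _connectionString = connectionString;

    public async Task<IEnumerable<Product>> GetProductsAsync()
    {
        using var connection = new SqlConnection(_connectionString);
        // Dapper opens the connection if needed and maps result columns
        // onto Product properties by name.
        return await connection.QueryAsync<Product>("SELECT Id, Name FROM dbo.Products");
    }
}
```

Register the service in DI (e.g. `builder.Services.AddScoped(...)`) and call it from a component's `OnInitializedAsync`.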
If you don't want to use either Entity Framework or Dapper you can use ADO.NET.
The below tutorial is for MVC; however, the code for interacting with the database would be the same. See StudentDataAccessLayer.
https://www.c-sharpcorner.com/article/crud-operations-using-asp-net-core-and-ado-net/
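For completeness, a minimal plain ADO.NET sketch of the same idea (the service name, table, and connection string are placeholders):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Threading.Tasks;

public class ProductNameService
{
    private readonly string _connectionString;

    public ProductNameService(string connectionString) => _connectionString = connectionString;

    public async Task<List<string>> GetProductNamesAsync()
    {
        var names = new List<string>();
        // Open a connection, run the query, and close it when the usings dispose.
        using var connection = new SqlConnection(_connectionString);
        await connection.OpenAsync();
        using var command = new SqlCommand("SELECT Name FROM dbo.Products", connection);
        using var reader = await command.ExecuteReaderAsync();
        while (await reader.ReadAsync())
            names.Add(reader.GetString(0));
        return names;
    }
}
```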
Alternatively, use the EF Core command Scaffold-DbContext to reverse-engineer a DbContext and entity classes from the existing database.

What are the different approaches for deploying DB changes using TFS 2015?

Currently, we are manually running DB scripts (SQL Server 2012) outside of our CI/CD deployment. What ways (including toolsets) can we use to automate deployment of DB changes with TFS 2015 Update 3?
There are really two approaches here, both of which work with TFS. TFS just facilitates the execution of whatever scripting you use to update your database, including your custom, handcrafted scripts.
There is the state-based approach, which uses a comparison technology to look at your VCS/dev/test/staging database and compare this to production. SQL Source Control and the DLM Automation Suite from Redgate Software do this, as do other comparison tools. What you would do is use a command-line or programmatic interface to set your source and target, capture the output, and then use this as an artifact in your release process. You might include a review of the generated script as a step in your flow.
Note that there are some changes state-based comparisons don't handle well: renames, splits, merges, data movement, and a few others. Some comparison tools have ways around this; some do not. Be aware this may be an issue. If you have a more mature database, perhaps not, but you should consider it. SQL Source Control allows custom migration scripts, which can handle these cases.
The other approach is a script runner or migration strategy, where each change you make to a dev database is captured as an ordered script and a framework executes these in order, if they are needed. Some people prefer this because you can see exactly what code will be executed at dev and deployment time. ReadyRoll from Redgate Software, Liquibase, Rails Migrations, DbUp, and Flyway all use this strategy.
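To make the script-runner style concrete, here is a minimal sketch using DbUp, one of the frameworks just mentioned (the connection string is a placeholder, and the .sql scripts are assumed to be embedded resources in the assembly):

```csharp
using System;
using System.Reflection;
using DbUp;

class Program
{
    static int Main()
    {
        // DbUp records which embedded scripts have already run in a journal
        // table and executes only the new ones, in order.
        var upgrader = DeployChanges.To
            .SqlDatabase("Server=.;Database=MyDb;Trusted_Connection=True;")
            .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
            .LogToConsole()
            .Build();

        var result = upgrader.PerformUpgrade();
        if (!result.Successful)
        {
            Console.WriteLine(result.Error);
            return 1;
        }
        return 0;
    }
}
```

A console app like this can then be invoked as a step in the TFS release pipeline.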
Neither of these is better or worse. Both work, both have pros and cons, but really the choice comes down to your comfort level and preference.
Disclosure: I work for Redgate Software.
If deploying DB changes just means publishing SQL Server Database Projects (.sqlproj files) with Team Foundation Build in Team Foundation Server, there are several ways to achieve this:
Use an MSBuild task with appropriate arguments to publish your SQL project during the build.
Add a deploy target in your .sqlproj file and run the target after the build completes.
Add a "Batch Script" step in your build definition to run SqlPackage.exe and publish the .dacpac file.
For more details, refer to this blog: Deploying SSDT During Local and Server Build.
As for TFS 2015, you can also try the SQL Server Database Deployment task.
Use this task to deploy a SQL Server database to an existing SQL Server instance. The task uses a DACPAC and SqlPackage.exe, which provides fine-grained control over database creation and upgrades.

Creating MVC 4 data access layer using SQL Server 2000

I am a novice ASP.Net developer/student/intern. Currently, I am in the process of creating my first MVC web app with:
VS 2012 Ultimate (C#)
SQL Server 2000
My database is on SQL Server 2000, with no chance of upgrading to 2005 or later in the foreseeable future. This prevents me from using EF, which all the books and examples I've read so far have used.
I have created a C# SQL helper class for connections, commands etc. that I have used previously as a data access layer for other basic web form applications.
What's the best way to incorporate it into my current MVC DAL?
Also, are there any examples or documentation outlining the basic steps in creating a sound MVC DAL that adheres to MVC best practices while using SQL Server 2000 and without EF?
Any suggestions/guidance would be greatly appreciated.
Thank you,
Todd
Technically speaking, MVC has nothing to do with your DAL. As far as your MVC app is concerned, it shops at the repository layer and does not care what happens in the warehouse (data layer).
Your DAL can be built using ADO.NET. Although this technology is old, it's perfectly usable. If you check out http://www.dofactory.com/Default.aspx, you will see an app that is built using multiple DAL technologies. One is for Entity Framework and another is for ADO.NET, and the cool part is they are both hitting the same database.
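As a sketch of that separation (all names here are hypothetical; the implementation body is where your existing SQL helper class would plug in):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

// Controllers depend only on this interface; swapping the ADO.NET
// implementation for something else later would not touch the MVC code.
public interface IStudentRepository
{
    IEnumerable<Student> GetAll();
}

public class Student
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class AdoStudentRepository : IStudentRepository
{
    private readonly string _connectionString;

    public AdoStudentRepository(string connectionString)
    {
        _connectionString = connectionString;
    }

    public IEnumerable<Student> GetAll()
    {
        var students = new List<Student>();
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand("SELECT Id, Name FROM dbo.Students", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    students.Add(new Student { Id = reader.GetInt32(0), Name = reader.GetString(1) });
            }
        }
        return students;
    }
}
```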

SQL Server Profiler deprecation - Replacement?

I am developing ASP.NET and SQL Server applications. Sometimes I have trouble with a SQL query, and I would like to see SQL Server's response and not just the ASP.NET error message (which is not always very helpful).
The Profiler.exe tool in SQL Server is capable of this, but I'm reading on MSDN that Microsoft is planning to deprecate the tool.
We are announcing the deprecation of SQL Server Profiler for Database Engine Trace Capture and Trace Replay. These features will be supported in the next version of SQL Server, but will be removed in a later version. The specific version of SQL Server has not been determined. The Microsoft.SqlServer.Management.Trace namespace that contains the Microsoft SQL Server Trace and Replay objects will also be deprecated. Note that SQL Server Profiler for the Analysis Services workloads is not being deprecated, and will continue to be supported.
http://msdn.microsoft.com/en-us/library/ms181091.aspx
They don't mention which tool will replace Profiler.exe.
Does anyone know anything about that?
Also, are there any alternatives to Profiler if I want to see recent unsuccessful queries?
While it is safe to continue using trace for the next few versions, Profiler is never the answer (some evidence here and also here). If you're going to use trace, use a server-side trace. Just don't write new code that relies on trace and expect it to live beyond a few versions.
The long-term answer is to use Extended Events. A blog you'll want to watch is that of SQL Server MVP Jonathan Kehayias at SQLskills.com. He has done a great job explaining Extended Events in layman's terms and providing many, many ready-to-use examples. He also has great courses on Pluralsight (which you can currently get for free through Visual Studio Dev Essentials):
SQL Server: Introduction to Extended Events
SQL Server: Advanced Extended Events
Another person to learn a lot from is Erin Stellato. She has since moved on to Microsoft, but her blog posts at SQLskills remain, and they are rich with info.
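Since the question specifically asks about seeing unsuccessful queries, a minimal Extended Events session built on the error_reported event might look like the following sketch (the session name and file target are placeholders):

```sql
-- Capture errors together with the statement text and client application.
CREATE EVENT SESSION [CaptureErrors] ON SERVER
ADD EVENT sqlserver.error_reported (
    ACTION (sqlserver.sql_text, sqlserver.client_app_name)
    WHERE severity > 10   -- skip informational messages
)
ADD TARGET package0.event_file (SET filename = N'CaptureErrors.xel');

ALTER EVENT SESSION [CaptureErrors] ON SERVER STATE = START;
```

The captured .xel file can be opened directly in SSMS (Management > Extended Events > Sessions).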

Tools to Replay Load on a SQL Server

Has anyone come across any good tools (preferably, but not necessarily, FOSS) that can read a SQL Server (2005/2008) trace file and execute the commands against another database? We are attempting to perform some performance testing on our SQL Servers and would like to replicate an actual load.
I have come across but not yet used:
JMeter
ReplayML
Preferably, the application would be able to use threading to mimic user connections and query execution on the SQL Server.
You can replay a SQL Server Profiler trace against another server using the SQL Server Profiler itself.
See the following Microsoft Reference as a starting point.
http://msdn.microsoft.com/en-us/library/ms189604.aspx
Quest Software also have a tool called Benchmark Factory that can be used to perform SQL Server load testing.
http://www.quest.com/benchmark-factory/
One of the best tools is actually freely available from Microsoft. The RML Utilities are targeted at SQL2005 & SQL2008 and are specifically designed for this type of testing.
You can download the tools from http://www.microsoft.com/downloads/details.aspx?FamilyId=7EDFA95A-A32F-440F-A3A8-5160C8DBE926&displaylang=en
We have used them to solve several performance and locking issues.
Note: Capturing trace files using the SQL Profiler GUI can add to performance problems due to the way the GUI and trace backend interact. The RML Utilities include a script that can capture traces directly from the SQL Server without using the GUI.
You can replay trace files directly in SQL Profiler, although I've only used it a couple of times for that, so I don't know what all of the limitations are on it.
Team System has an add-on that you can find on CodePlex: it is called SQL Load Test.
Let me know if that works well for you.
I know this is a really old question, but after searching for some time I discovered a new open-source tool:
https://github.com/spaghettidba/WorkloadTools, which works great.