SQL Server bcp XML data - SQL

I have a table with a column of type xml.
I have to extract data from this table and load it into another environment.
I am using bcp to extract and load the target table, but there are some special characters that are causing issues when I bcp them into the target table. Are there any workarounds?
Thanks,
Ben

A custom CLR stored procedure provided me with the best solution. Now I can write XML-typed data directly to a file from T-SQL, provided the SQL Server service account has permission to the file. This allows the simple syntax:
exec dbo.clr_xml2file @xml, @path, @bool_overwrite
The SP:
CREATE PROCEDURE [dbo].[clr_xml2file]
    @xml [xml],
    @file [nvarchar](max),
    @overwrite [bit]
WITH EXECUTE AS CALLER
AS
EXTERNAL NAME [CLR_FileIO].[FreddyB.FileIO].[Xml2File]
The C# for the CLR DLL:
using System;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;
using System.Security.Principal;
using System.Text;
using System.Xml;
using System.Xml.XPath;
using Microsoft.SqlServer.Server;

namespace FreddyB
{
    public class FileIO
    {
        public static void Xml2File(SqlXml xml, SqlString sFile, SqlBoolean bOverwrite)
        {
            SqlPipe sqlpipe = SqlContext.Pipe;
            try
            {
                // Refuse to write empty content.
                if (xml == null || xml.IsNull || xml.Value.Length == 0)
                {
                    sqlpipe.Send("Cannot write empty content to file : \n\t" + sFile.Value);
                    return;
                }
                // Honor the overwrite flag.
                if (File.Exists(sFile.Value) && bOverwrite.IsFalse)
                {
                    sqlpipe.Send("File already exists : \n\t" + sFile.Value);
                    return;
                }
                int iFileSize = 0;
                FileStream fs = null;
                try
                {
                    // Write the XML out as UTF-8 bytes.
                    byte[] ba = Encoding.UTF8.GetBytes(xml.Value);
                    iFileSize = ba.Length;
                    fs = new FileStream(sFile.Value, FileMode.Create, FileAccess.Write);
                    fs.Write(ba, 0, ba.Length);
                    sqlpipe.Send("Wrote "
                        + String.Format("{0:0,0.0}", iFileSize / 1024)
                        + " KB to : \n\t"
                        + sFile.Value);
                }
                catch (Exception ex)
                {
                    // Report the identity doing the write; it needs file permission.
                    sqlpipe.Send("Error as '"
                        + WindowsIdentity.GetCurrent().Name
                        + "' during file write : \n\t"
                        + ex.Message);
                    sqlpipe.Send("Stack trace : \n" + ex.StackTrace);
                }
                finally
                {
                    if (fs != null)
                    {
                        fs.Close();
                    }
                }
            }
            catch (Exception ex)
            {
                sqlpipe.Send("Error writing to file : \n\t" + ex.Message);
            }
        }
    }
}
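If you need to drive the export from a client application instead of T-SQL, a minimal C# sketch along these lines could call the procedure per row (the connection string and the dbo.SourceTable, Id and XmlCol names are hypothetical, not from the original post):

using System.Data;
using System.Data.SqlClient;

static class XmlExportClient
{
    // Sketch only: selects one row's xml column into a variable and
    // hands it to dbo.clr_xml2file. Table and column names are assumptions.
    public static void ExportRow(string connectionString, int id, string path)
    {
        const string sql =
            "DECLARE @x xml = (SELECT XmlCol FROM dbo.SourceTable WHERE Id = @id); " +
            "EXEC dbo.clr_xml2file @x, @path, 1;";
        using (var con = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, con))
        {
            cmd.Parameters.Add("@id", SqlDbType.Int).Value = id;
            cmd.Parameters.Add("@path", SqlDbType.NVarChar, -1).Value = path;
            con.Open();
            cmd.ExecuteNonQuery(); // messages sent via SqlPipe.Send arrive as info messages and are ignored here
        }
    }
}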

Related

"Validation failed for one or more entities. ERROR

I am creating a web API to save uploaded files to local storage. When I test my code it gives the error: ExceptionMessage: "Validation failed for one or more entities. See 'EntityValidationErrors' property for more details."
Can anyone help fix this issue? Thanks in advance.
Controller (FileUploadController)
using System;
using System.Net.Http;
using System.Threading.Tasks;
using System.IO;
using System.Collections.Generic;
using System.Web.Http;
using VantageCore.BL;

namespace VantageCoreApi.Controllers.Api
{
    public class FileUploadController : ApiController
    {
        [HttpPost]
        [Route("api/FileUpload")]
        public async Task<IHttpActionResult> UploadFile(string FileName, int Id)
        {
            try
            {
                List<string> ids = new List<string>();
                var provider = new MultipartMemoryStreamProvider();
                await Request.Content.ReadAsMultipartAsync(provider);
                var referenceId = FileName.Split('_')[0];
                foreach (var file in provider.Contents)
                {
                    Guid guid;
                    ids.Add(Guid.TryParse(await new FileUploadMgt().ReceiveFile(file, FileName, Id), out guid) ? FileName : "Error");
                }
                return Ok(ids);
            }
            catch (Exception e)
            {
                return InternalServerError(e);
            }
        }

        public string SaveFile(byte[] File, string path)
        {
            string Result = "";
            try
            {
                // LOCAL SERVER PATH
                var fs = new BinaryWriter(new FileStream(@"F:\Testfolder" + path, FileMode.Append, FileAccess.Write));
                fs.Write(File);
                fs.Close();
                Result = path;
            }
            catch (Exception ee)
            {
                Result = ee.ToString();
            }
            return Result;
        }
    }
}
BL (FileUploadMgt.cs)
using System;
using System.Threading.Tasks;
using System.Collections.Specialized;
using System.Configuration;
using System.IO;
using System.Net.Http;
using VantageCore.Entity.Model;
using File = VantageCore.Entity.Model.File;

namespace VantageCore.BL
{
    public class FileUploadMgt
    {
        public async Task<string> ReceiveFile(HttpContent receivedFile, string receivedFileName, int Id)
        {
            if (receivedFile != null)
            {
                var fileId = Guid.NewGuid();
                using (var c = new DBEntities())
                {
                    NameValueCollection appSettings = ConfigurationManager.AppSettings;
                    string folder = appSettings["TestPath"];
                    var fileName = fileId.ToString() + Path.GetExtension(receivedFileName).ToLower();
                    var file = Path.Combine(folder, fileName);
                    bool exists = Directory.Exists(folder);
                    if (!exists) Directory.CreateDirectory(folder);
                    using (var fs = new BinaryWriter(new FileStream(file, FileMode.Create, FileAccess.Write)))
                    {
                        fs.Write(await receivedFile.ReadAsByteArrayAsync());
                    }
                    string extention = Path.GetExtension(file);
                    receivedFileName = Path.GetFileNameWithoutExtension(receivedFileName).Length <= 32
                        ? Path.GetFileNameWithoutExtension(receivedFileName)
                        : Path.GetFileNameWithoutExtension(receivedFileName).Substring(0, 31) + "~";
                    var newFile = new File
                    {
                        Uid = fileId,
                        FileExtention = extention,
                        FileName = receivedFileName,
                        FileSize = (int)(receivedFile.Headers.ContentLength / 1024),
                        CreatedDate = DateTime.UtcNow
                    };
                    c.Files.Add(newFile);
                    c.SaveChanges();
                }
                return fileId.ToString();
            }
            else
            {
                return "Error,Invalid file Or file size exceeded";
            }
        }
    }
}
You could try the following to observe the actual validation errors when you debug, and share the message (DbEntityValidationException is in System.Data.Entity.Validation):
try
{
    c.SaveChanges();
}
catch (DbEntityValidationException e)
{
    foreach (var eve in e.EntityValidationErrors)
    {
        // Inspect each failed entity and its per-property errors.
        foreach (var ve in eve.ValidationErrors)
        {
            System.Diagnostics.Debug.WriteLine(
                "Property: {0}, Error: {1}", ve.PropertyName, ve.ErrorMessage);
        }
    }
}

SQL for json path returning invalid json format and incomplete json result [duplicate]

I am using Azure SQL (SQL Server 2016) and creating a query to give me output as a JSON object, by adding FOR JSON PATH at the end of the query.
When I execute the procedure without FOR JSON PATH, I get 244 rows (the number of records in my table); but when I execute it with FOR JSON PATH, I get a message saying 33 rows, and the JSON object is truncated.
I tested this with different kinds of queries, including a simple query selecting only 10 columns, but I always get fewer rows with FOR JSON PATH and a JSON object that is truncated at the end.
Here is my query
SELECT
[Id]
,[countryCode]
,[CountryName]
,[FIPS]
,[ISO1]
,[ISO2]
,[ISONo]
,[capital]
,[region]
,[currency]
,[currencyCode]
,[population]
,[timeZone]
,[timeZoneCode]
,[ISDCode]
,[currencySymbol]
FROM
[dbo].[countryDB]
The above query returns 2 rows.
And I use the following query to get the output in JSON:
SELECT
[Id]
,[countryCode]
,[CountryName]
,[FIPS]
,[ISO1]
,[ISO2]
,[ISONo]
,[capital]
,[region]
,[currency]
,[currencyCode]
,[population]
,[timeZone]
,[timeZoneCode]
,[ISDCode]
,[currencySymbol]
FROM
[dbo].[countryDB]
FOR JSON PATH
The above query returns 33 rows and the output is
[{"Id":1,"countryCode":"AD","CountryName":"Andorra","FIPS":"AN","ISO1":"AD","ISO2":"AND","ISONo":20,"capital":"Andorra la Vella","region":"Europe","currency":"Euro","currencyCode":"EUR","population":67627,"timeZone":2.00,"timeZoneCode":"DST","ISDCode":"+376"},{"Id":2,"countryCode":"AE","CountryName":"United Arab Emirates","FIPS":"AE","ISO1":"AE","ISO2":"ARE","ISONo":784,"capital":"Abu Dhabi","region":"Middle East","currency":"UAE Dirham","currencyCode":"AED","population":2407460,"timeZone":4.00,"timeZoneCode":"STD","ISDCode":"+971"},{"Id":3,"countryCode":"AF","CountryName":"Afghanistan","FIPS":"AF","ISO1":"AF","ISO2":"AFG","ISONo":4,"capital":"Kabul","region":"Asia","currency":"Afghani","currencyCode":"AFA","population":26813057,"timeZone":4.50,"timeZoneCode":"STD","ISDCode":"+93"},{"Id":4,"countryCode":"AG","CountryName":"Antigua and Barbuda","FIPS":"AC","ISO1":"AG","ISO2":"ATG","ISONo":28,"capital":"Saint Johns","region":"Central America and the Caribbean","currency":"East Caribbean Dollar","currencyCode":"205","population":66970,"timeZone":-4.00,"timeZoneCode":"STD","ISDCode":"+1"},{"Id":5,"countryCode":"AI","CountryName":"Anguilla","FIPS":"AV","ISO1":"AI","ISO2":"AIA","ISONo":660,"capital":"The Valley","region":"Central America and the Caribbean","currency":"East Caribbean Dollar","currencyCode":"205","population":12132,"timeZone":-4.00,"timeZoneCode":"STD","ISDCode":"+1"},{"Id":6,"countryCode":"AL","CountryName":"Albania","FIPS":"AL","ISO1":"AL","ISO2":"ALB","ISONo":8,"capital":"Tirana","region":"Europe","currency":"Lek","currencyCode":"ALL","population":3510484,"timeZone":2.00,"timeZoneCode":"DST","ISDCode":"+355"},{"Id":7,"countryCode":"AM","CountryName":"Armenia","FIPS":"AM","ISO1":"AM","ISO2":"ARM","ISONo":51,"capital":"Yerevan","region":"Commonwealth of Independent States","currency":"Armenian Dram","currencyCode":"AMD","population":3336100,"timeZone":5.00,"timeZoneCode":"DST","ISDCode":"+374"},{"Id":8,"countryCode":"AN","CountryName":"Netherlands Antilles","FIPS":"NT","ISO1":"AN","ISO2":
I am trying to get the output directly in JSON.
When FOR JSON queries are returned to the client, the JSON text is returned as a single-column result set. The JSON is broken into fixed-length strings and sent over multiple rows.
It's really hard to see this properly in SSMS, as SSMS concatenates the results for you in "Results to Grid", and truncates each row in "Results to Text".
Why? Dunno. My guess is that only .NET clients know how to efficiently read large streams from SQL Server, and 99% of the time users will still just buffer the whole object. Breaking the JSON over multiple rows gives clients a simple API to read the data incrementally. And in .NET the fact that the de facto standard JSON library is not in the BCL means that SqlClient can't really have a first-class JSON API.
Anyway, from C#, you can use something like this to read the results:
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.Common;
using System.Data.SqlClient;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

namespace ConsoleApp3
{
    // TextReader that stitches the fixed-length JSON chunks back together
    // as they are read from the single-column result set.
    class SqlJSONReader : TextReader
    {
        SqlDataReader rdr;
        string currentLine = "";
        int currentPos = 0;

        public SqlJSONReader(SqlDataReader rdr)
        {
            this.rdr = rdr;
        }

        public override int Peek()
        {
            return GetChar(false);
        }

        public override int Read()
        {
            return GetChar(true);
        }

        public int GetChar(bool Advance)
        {
            while (currentLine.Length == currentPos)
            {
                if (!rdr.Read())
                {
                    return -1;
                }
                currentLine = rdr.GetString(0);
                currentPos = 0;
            }
            int rv = (int)currentLine[currentPos];
            if (Advance) currentPos += 1;
            return rv;
        }

        public override void Close()
        {
            rdr.Close();
        }
    }

    class Program
    {
        static void Main(string[] args)
        {
            using (var con = new SqlConnection("server=.;database=master;Integrated Security=true"))
            {
                con.Open();
                var sql = @"
select o.object_id as [obj.Id], replicate('n', 2000) as [obj.foo], c.name as [obj.col.name]
from sys.objects o
join sys.columns c
  on c.object_id = o.object_id
for json path;
";
                var cmd = new SqlCommand(sql, con);
                var sr = new StringBuilder();
                using (var rdr = cmd.ExecuteReader())
                using (var tr = new SqlJSONReader(rdr))
                using (var jr = new Newtonsoft.Json.JsonTextReader(tr))
                {
                    while (jr.Read())
                    {
                        Console.WriteLine($" {jr.TokenType} : {jr.Value}");
                    }
                }
                Console.WriteLine(sr.ToString());
            }
        }
    }
}
With thanks to @David Browne. I found I had to use PRINT instead of SELECT:
declare @json varchar(max) = (SELECT * FROM dbo.AppSettings FOR JSON AUTO)
print @json
Separation of concerns dictates returning a string and parsing the JSON separately. The snippet below has no dependency on JSON.NET, so a different JSON deserializer (e.g. the one built into RestSharp) can be used, and it does not require the SqlJSONReader class.
try {
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn)) {
        await conn.OpenAsync();
        logger.LogInformation("SQL:" + sql);
        var rdr = await cmd.ExecuteReaderAsync().ConfigureAwait(false);
        var result = "";
        var moreRows = rdr.HasRows;
        while (moreRows) {
            moreRows = await rdr.ReadAsync();
            if (moreRows) result += rdr.GetString(0);
        }
        return result;
    }
}
catch (Exception ex) {
    //logger.LogError($"Error accessing Db:{ex}");
    return null;
}
The SQL query result of FOR JSON PATH is one long string that is divided into multiple rows (blocks), something like this for the string "i want to get result of for json path":
"i want"+
" to ge"+
"t resu"+
"lt of "+
"for js"+
"on path"
Each block is at most the maximum column size in SQL Server, so just collect them all from a list:
public IHttpActionResult GetAdvertises()
{
    var rEQUEST = db.Database.SqlQuery<string>(
        "SELECT ID,CITY_NAME,JSON_QUERY(ALBUM) AS ALBUM FOR JSON PATH").ToList();
    foreach (string req in rEQUEST)
    {
        HttpContext.Current.Response.Write(req);
    }
    return Ok();
}
If your query returns more than 2033 characters, there will be multiple rows: each row contains 2033 characters, and the next row contains the remaining data. So you need to merge them to get the actual JSON, as in the code sample below.
dynamic jsonReturned = unitOfWork
    .Database
    .FetchProc<string>("storedProcedureGetSaleData", new { ProductId = productId });
if (Enumerable.Count(jsonReturned) == 0)
{
    return null;
}
dynamic combinedJson = "";
foreach (var resultJsonRow in jsonReturned)
{
    combinedJson += resultJsonRow;
}
return combinedJson;
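If the rows come back as a plain IEnumerable<string> rather than dynamic, the merge can be a one-liner; a sketch, assuming the chunks are already in a List<string> called jsonRows:
// Concatenate the fixed-length chunks back into one JSON document.
string json = string.Concat(jsonRows);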

Cannot get Azure Managed Instance SQL server using GetByResourceGroup

I have C# code that exports an Azure DB to blob storage. This code works with standalone SQL Server and elastic pool DBs, but when I try it with a Managed Instance I don't even get the SQL server details. Is this because of the security settings of the Managed Instance? I am running the code in a VM from which I can connect to the managed instance using SSMS. Below is the code; azure is an IAzure, and the GetSqlServer call returns null in the case of a managed instance, while it works for a standalone SQL server and an elastic pool.
using AzureDatabaseExport.Models;
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent.Core;
using Microsoft.Azure.Management.Sql.Fluent;
using Microsoft.Azure.Management.Storage.Fluent;
using Microsoft.Extensions.Configuration;
using Microsoft.WindowsAzure.Storage.Blob;
using System;
using System.IO;
using System.Configuration;
using QM.ETL.DAL.Helpers;
using System.Collections;
using System.Collections.Specialized;

namespace AzureDBExport_ConsoleApp
{
    public class AzureDatabaseExportService : IAzureDbService
    {
        IAzure azure;
        ISqlServer sqlServer;
        public static NameValueCollection _configSettings;
        //Backup backup = new Backup();

        public AzureDatabaseExportService(ICollection configSettings)
        {
            _configSettings = configSettings as NameValueCollection;
            string subscriptionId = _configSettings[AzureConstants.SubscriptionId];
            string clientId = _configSettings[AzureConstants.ClientId];
            string clientSecret = _configSettings[AzureConstants.ClientSecret];
            string tenantId = _configSettings[AzureConstants.TenantId];
            var credentials = SdkContext.AzureCredentialsFactory.FromServicePrincipal(
                clientId, clientSecret, tenantId,
                environment: AzureEnvironment.AzureGlobalCloud);
            azure = Azure.Configure()
                .WithLogLevel(HttpLoggingDelegatingHandler.Level.Basic)
                .Authenticate(credentials).WithSubscription(subscriptionId);
        }

        private ISqlServer GetSqlServer(string sqlServerResourceGroup, string sqlServerName)
        {
            try
            {
                return azure.SqlServers.GetByResourceGroup(sqlServerResourceGroup,
                    sqlServerName);
            }
            catch (Exception ex)
            {
                throw new Exception("Error getting Azure SQL server resource info. ", ex);
            }
        }

        private IStorageAccount GetStorageAccount()
        {
            try
            {
                return azure.StorageAccounts.GetByResourceGroup(
                    _configSettings[AzureConstants.StorageAccountResourceGroup],
                    _configSettings[AzureConstants.StorageAccountName]);
            }
            catch (Exception ex)
            {
                throw new Exception("Error getting Azure storage account info. ", ex);
            }
        }

        private ISqlDatabase GetSqlDatabase()
        {
            try
            {
                return sqlServer.Databases.Get(_configSettings[AzureConstants.SqlDatabaseName]);
            }
            catch (Exception ex)
            {
                throw new Exception("Error getting Azure Database info. ", ex);
            }
        }

        /// <summary>
        /// Exports a copy of the database to blob storage
        /// </summary>
        /// <returns>Returns status of export</returns>
        public string ExportAzureDatabase()
        {
            string fileName = DateTime.Now.ToString("yyyy'-'MM'-'dd'T'HH':'mm':'ss");
            sqlServer = GetSqlServer(_configSettings[AzureConstants.SqlServerResourceGroup],
                _configSettings[AzureConstants.SqlServerName]);
            IStorageAccount storageAccount = GetStorageAccount();
            ISqlDatabase sqlDatabase = sqlServer.Databases.Get(_configSettings[AzureConstants.SqlDatabaseName]);
            try
            {
                ISqlDatabaseImportExportResponse exportedSqlDatabase = sqlDatabase.ExportTo(
                        storageAccount,
                        _configSettings[AzureConstants.StorageContainerName],
                        fileName)
                    .WithSqlAdministratorLoginAndPassword(
                        _configSettings[AzureConstants.SqlAdminUsername],
                        _configSettings[AzureConstants.SqlAdminPassword])
                    .Execute();
                return fileName;
            }
            catch (Exception ex)
            {
                throw new Exception("Error exporting database to blob storage. ", ex);
            }
        }
    }
}


SQL Server 2016 RC1 : Unable to debug SSIS package

I installed SQL Server 2016 RC1 on Windows 10 (formatted and installed the OS). I first installed VS2015 with the latest updates and then SQL Server. I am not able to debug SSIS packages, and I get the following error.
Method 'SaveAndUpdateVersionToXML' in type 'Microsoft.DataTransformationServices.Project.DebugEngine.InterfaceWrappers.Sql2014ApplicationClassWrapper' from assembly 'Microsoft.DataTransformationServices.VsIntegration, Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' does not have an implementation. (Microsoft Visual Studio)
I installed the latest version of SQL Server Data Tools.
Is anyone facing a similar issue? Any solution for this problem?
To run the package you might want to use this C# solution:
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using dts1 = Microsoft.SqlServer.Dts.Runtime.Wrapper;
using System.Timers;
using System.Threading;
using ustimer = System.Windows.Forms;

namespace integration
{
    public partial class integration : Form
    {
        public integration()
        {
            InitializeComponent();
        }

        public int c = 0;
        public string path = @"c:\users\Package.dtsx";

        private void button1_Click(object sender, EventArgs e)
        {
            dts1.IDTSPackage100 pkg;
            dts1.IDTSApplication100 app;
            dts1.DTSExecResult pkgResults;
            app = new dts1.Application();
            pkg = app.LoadPackage(path, true, null);
            try
            {
                pkgResults = pkg.Execute(null, null, null, null, null);
                if (pkgResults == dts1.DTSExecResult.DTSER_SUCCESS)
                {
                    MessageBox.Show("works");
                }
                else
                {
                    MessageBox.Show("failed");
                }
            }
            catch
            {
                //
            }
        }

        public void run_pkg(string path, bool feedback = true)
        {
            dts1.IDTSPackage100 pkg;
            dts1.IDTSApplication100 app;
            dts1.DTSExecResult pkgResults;
            app = new dts1.Application();
            pkg = app.LoadPackage(path, true, null);
            try
            {
                pkgResults = pkg.Execute(null, null, null, null, null);
                if (feedback == true)
                {
                    if (pkgResults == dts1.DTSExecResult.DTSER_SUCCESS)
                    { MessageBox.Show("worked"); }
                    else
                    { MessageBox.Show("failed"); }
                }
            }
            catch
            {
                //
            }
        }
    }
}
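The run_pkg helper can then be reused from any handler, for example (path is just the field declared above):
// Execute the package without MessageBox feedback.
run_pkg(path, feedback: false);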
This update fixed the issue for me: https://blogs.msdn.microsoft.com/ssdt/2016/04/05/ssdt-preview-update-rc2/