SELECT statement times out - SQL

I need to populate dropdown menus with a table that contains 4 million rows.
How can I do this without having it time out on the select statement?
Do I need to worry about SQL injection, or anything else?
For now I have tried only getting the top 100 rows. But my project has a lot of users and a lot of details in the database, so I need to show all the values in the dropdown list. My current code is here:
protected void SearchButton_Click(object sender, EventArgs e)
{
    var search = YourSearchTextBox.Text.Trim();
    if (!String.IsNullOrEmpty(search) && search.Length > 3)
    {
        using (SqlConnection sqlConnection = new SqlConnection("Your Connection String"))
        {
            // Parameterized prefix search; TOP 100 keeps the result set small.
            var query = "SELECT TOP 100 * FROM [YourTable] WHERE UserName LIKE @Search";
            SqlCommand sqlCommand = new SqlCommand(query, sqlConnection);
            sqlCommand.Parameters.AddWithValue("@Search", search + "%");
            sqlConnection.Open();
            using (var reader = sqlCommand.ExecuteReader())
            {
                // ... bind the results to the dropdown here ...
            }
        }
    }
}

I assume that you intend to populate the dropdowns with parts of the 4 million rows?
Then you do have to create indexes on the columns that help to narrow them down!
If you really intend to populate them with most of the contents at once, things will have to time out for sure, as your clients' browsers won't be able to handle this!
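A minimal sketch of that index idea, reusing the placeholder names from the question ([YourTable], UserName, the connection string); the index name is invented, and this is one-time DDL, not something to run per request:

// A supporting index lets the parameterized prefix search
// ("UserName LIKE @Search + '%'") seek instead of scanning 4 million rows.
using (SqlConnection sqlConnection = new SqlConnection("Your Connection String"))
{
    sqlConnection.Open();
    var ddl = "CREATE INDEX IX_YourTable_UserName ON [YourTable] (UserName)";
    using (SqlCommand sqlCommand = new SqlCommand(ddl, sqlConnection))
    {
        sqlCommand.ExecuteNonQuery(); // run once, not on page load
    }
}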

Related

SSIS update a column with a result of an sql select

I was wondering if there's a way of updating a column inside a dataflow task by running a select on every row?
Here's the situation:
Let's take this as our start position. I collect info from 2 files, then I merge them, and I add a column with the Derived Column tool. Is there a way of populating this column by performing a select on every row, using the values of the row?
Ex:
SELECT Count(*) AS cnt
FROM TABLE T
WHERE T.COLUMN1 = ROW.COLUMN3
AND T.COLUMN2 = ROW.COLUMN5
I don't know if I'm just not phrasing my need properly, but I couldn't get any results.
Thank you
You should be able to do this with a Lookup Transformation.
EDIT based on comment:
If you don't want to use a lookup due to the size of the table, you can do exactly what you want with a Script Component. You can create and execute your SQL command for each row of the dataflow, just like you would in any .NET application.
I was able to do it with a Script Component
1- I've removed the Derived Column
2- I've created a string variable where I stored the query, with a wildcard placeholder for every value that I need to get from the row.
3- I've passed this variable along with one containing the connection string info to the Script Component.
4- I've added a new column to the Output Columns of the Script Component
5- Added using System.Data.OleDb;
6- Created 2 variables:
string jourFerieQuery;
string dbcsoledbschema;
7- Updated the PostExecute() to put the values of my SSIS variables into the script variables:
public override void PostExecute()
{
    base.PostExecute();
    jourFerieQuery = Variables.jourFerieQuery;
    dbcsoledbschema = Variables.dbcsoledbschema;
}
8- Added a method:
int GetData(string cs, string query)
{
    // Runs the per-row lookup query and returns the single scalar it produces.
    using (OleDbConnection conn = new OleDbConnection(cs))
    using (OleDbCommand cmd = new OleDbCommand(query, conn))
    {
        conn.Open();
        DataTable dt = new DataTable();
        dt.Load(cmd.ExecuteReader());
        return (int)dt.Rows[0][0];
    }
}
9- Updated the Input0_ProcessInputRow(Input0Buffer Row):
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // Substitute this row's values into the query template, then run it.
    string query = jourFerieQuery.Replace("[1]", Row.CODDEVI).Replace("[2]", Row.DATCRBEZEROCONGE.ToString());
    Row.Keep = GetData(dbcsoledbschema, query);
}
My query returns a count; that's why the method I've added returns an int.
You can do that through a script component (transformation).
Add an output of Ct.
I use System.Data.OleDb as it matches the SSIS package connection string.

SQL Server query poor performance

We are working on a .NET app in which pressing a button opens a DevExpress form and executes a SQL Server query to fill some comboboxes with data. The application works fine for lots of customers, but at one particular customer it takes more than a minute to load the form. I can see in the performance monitor that SQL Server uses a lot of CPU when I want to load the form.
I executed the query directly in SQL Server Management Studio and it took no more than a second. Then I had a look at SQL Activity Monitor, and what I can see there (not happening at other customers, same IO, same SQL, same everything) is this:
The thing I can see here, and don't understand, is: why does this query have so many executions? Why is it taking so long to retrieve data?
Here is the execution plan of this query:
Select *
From cuinac_pos
Where [group] in (Select [group]
From proc_groups
Where Code = 13100271)
Thank you for any help you can give me, and please do not hesitate to ask if I can give any more info.
Once again, thanks!
AFTER ADDING THE INDEX SUGGESTED BY THE EXECUTION PLAN
EXECUTION PLAN FOR QUERY
Select count(*)
From proc_groups
Where Code = 13100271
Definition of the index in proc_groups:
Example of the code:
private static void LoadDTPurchaseHerdRelation(Int32 status, Int32 herdNumber)
{
    try
    {
        StringBuilder sb = new StringBuilder();
        sb.Append(" Select gr.[group] as HerdId, gr.code as HerdNumber, bo.code as PurchaseCode");
        sb.Append(" From cuinac_pos bo ");
        sb.Append(" inner join proc_groups gr on bo.code=gr.code ");
        if (herdNumber == 0)
        {
            string s1 = " Where (gr.created between '2015-12-09' And '2016-01-08') ";
            sb.Append(s1);
            if (status != 4)
            {
                string s2 = string.Format(" AND bo.purchasestatus = {0} ", status);
                sb.Append(s2);
            }
            sb.Append(" order by bo.code ");
        }
        else
        {
            string s3 = string.Format(" Where gr.code = '{0}' ", herdNumber);
            sb.Append(s3);
        }
        DTPurchaseHerdRelation.Clear();
        using (ConnectionScope cs = new ConnectionScope())
        {
            SqlDataAdapter adapter = new SqlDataAdapter(sb.ToString(), (SqlConnection)cs.Connection);
            adapter.Fill(DTPurchaseHerdRelation);
        }
    }
    catch (Exception ex)
    {
        // note: exceptions are silently swallowed here
    }
}
Execution plan for query
Select * From cuinac_pos Where [group] in (Select [group] From proc_groups Where Code = N'13100271')
Solved:
I finally got it by adding the indexes suggested in the answer marked as correct and, in the code, adding an N before the value in the queries that search by the nvarchar column "Code", as suggested in the comments by shriop. Thank you all for your effort!
For this query:
Select *
From cuinac_pos
Where [group] in (Select [group] From proc_groups Where Code = 13100271 );
The optimal indexes are proc_groups(code, group) and cuinac_pos(group). Having those indexes might help.
EDIT:
For performance, this might be better:
Select *
From cuinac_pos cp
Where exists (Select 1
From proc_groups pg
Where pg.Code = 13100271 and pg.[group] = cp.[group]
);
with an index on proc_groups(group, code).
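A minimal sketch of the corresponding DDL, run from .NET like the rest of the app's data access (the index names are invented and connectionString is a placeholder; for the EXISTS variant, reverse the key order to ([group], Code)):

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    var ddl = @"CREATE INDEX IX_proc_groups_code_group ON proc_groups (Code, [group]);
                CREATE INDEX IX_cuinac_pos_group ON cuinac_pos ([group]);";
    using (var cmd = new SqlCommand(ddl, connection))
    {
        cmd.ExecuteNonQuery(); // run once against the affected database
    }
}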
Whenever I read something like "fast in SSMS but slow in application" I have to think about this:
http://www.sommarskog.se/query-plan-mysteries.html
This applies especially to older databases, which have survived from one SQL Server version to the next and were upgraded via scripts, and where the data reading is done through stored procedures.
Most of the time this behaviour is solvable with a SET ARITHABORT ON as the first line of your SQL code.
You can put this into your SPs directly, or set it in your application through the connection as a default.
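For example, a minimal sketch of the connection-default approach, assuming a plain SqlConnection (connectionString is a placeholder):

// Run SET ARITHABORT ON once after opening the connection, so the app
// gets the same setting (and thus the same cached plan) SSMS uses by default.
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (var cmd = new SqlCommand("SET ARITHABORT ON;", connection))
    {
        cmd.ExecuteNonQuery();
    }
    // ... execute the slow query on this same connection ...
}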
Good luck and happy coding

SqlCommandBuilder With Convert Statement

I have an application that's built in .NET language.
In this application we mainly read/write to the database (SQL Server 2005).
Sometimes (for a single input) I just use the SQL query, for example:
commandText = "INSERT INTO Test_Table (Number) VALUES ('10')";
command = new System.Data.SqlClient.SqlCommand(commandText, connection);
command.ExecuteNonQuery();
If I want to update a bunch of records in my database, I use the SqlCommandBuilder class, as in this example:
adapter = new System.Data.SqlClient.SqlDataAdapter("SELECT * FROM Test_Table WHERE Number = '9'",connection);
commandbuilder = new System.Data.SqlClient.SqlCommandBuilder(adapter);
dataset = new System.Data.DataSet();
adapter.Fill(dataset,"Example");
dataset.Tables["Example"].Rows[0].["Number"] = 10;
adapter.update(dataset,"Example");
These work great. But now, for some reason, I need to insert/update datetimes and use the CONVERT function on them.
The single SQL query works great:
t = System.DateTime.Now;
commandText = "INSERT INTO Test_Table (DateTime) VALUES (datetime, 't.toString()', 103)";
command = new System.Data.SqlClient.SqlCommand(commandText, connection);
command.ExecuteNonQuery();
This works without a problem; however, I have no idea how to achieve the same thing using my SqlCommandBuilder scripts. I could change everything to single queries, but that would take a week.
I have already tried the following, without success:
t = System.DateTime.Now;
adapter = new System.Data.SqlClient.SqlDataAdapter("SELECT * FROM Test_Table WHERE Number = '9'", connection);
commandbuilder = new System.Data.SqlClient.SqlCommandBuilder(adapter);
dataset = new System.Data.DataSet();
adapter.Fill(dataset, "Example");
dataset.Tables["Example"].Rows[0].["DateTime"] = "CONVERT(datetime,'" + t.toString() + "',103);
adapter.update(dataset, "Example");
This line of code is weird:
dataset.Tables["Example"].Rows[0].["DateTime"] = "CONVERT(datetime,'" + t.toString() + "',103);
Does it compile? Specifically, this:
.Rows[0].["DateTime"]
I think it should be:
.Rows[0]["DateTime"]
But regardless of the syntax... I don't think this is the right way to go. The DataTable (in the DataSet) expects a DateTime object (btw, don't name your columns after their datatype, it causes confusion) and you are providing it with something that is incompatible. System.DateTime.Now returns a DateTime object, then you are concatenating it with a string (again, does this compile?) and I assume you expect it to be injected into the INSERT statement?
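For illustration, a minimal sketch of what the DataTable actually expects, reusing the dataset/adapter from the question: assign the DateTime value itself and let the command builder's parameterized UPDATE handle the conversion, with no CONVERT string at all.

DateTime t = System.DateTime.Now;
// The column is typed; assign the DateTime object directly.
dataset.Tables["Example"].Rows[0]["DateTime"] = t;
adapter.Update(dataset, "Example");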
Since you said that it would take a week to change everything, I assume that you have a lot of similar code to repair.
I see three possible solutions, all require some work:
Create a database trigger
https://msdn.microsoft.com/en-us/library/ms189799.aspx
Add a default value to the DateTime field in the database and remove the DateTime from the select query.
http://www.w3schools.com/sql/sql_default.asp
https://www.youtube.com/watch?v=INfi7jkdXC8
(start watching at around 2:00)
You can write a function that does the actual text replacing, but it can get tricky:
private string ChangeDate(string insertQuery)
{
// find the location of the date column
// replace the actual value with the "CONVERT(datetime,'" + actualValue + "',103)"
// return the new value and store it in the sqlcommandbuilder.insertstatement
}
Admittedly, all three require work and are not really "elegant". I would go for option 2, because it seems like the least work. But I don't know if this solves your problem.

Value copy function batch processing

I'm trying to copy values from one ID to another ID.
The ID and the timestamp are primary keys.
String sqlString = "SELECT * FROM Values WHERE ID = @ID";
var sqlCommand = new SqlCommand(sqlString, sqlConnect);
sqlCommand.Parameters.Add("@ID", SqlDbType.UniqueIdentifier).Value = ID;
using (var reader = sqlCommand.ExecuteReader())
{
    while (reader.Read())
    {
        var time = reader["Time"];
        var dt = DateTime.Parse(time.ToString(), CultureInfo.InvariantCulture);
        var value = reader["Value"];
        insertData(IdDestination, dt, value);
    }
}
This function works, but if I have 10,000 table rows it's very slow.
I was thinking about using SqlBulkCopy, but that doesn't work for me because I need to change the destination ID. Another problem is that sometimes the values I want to copy already exist, so I need a rollback or commit. An UPDATE doesn't work either, because it would lose important data.
Does anyone have an idea about what kind of batch processing would work for me?
I'd like to insert 1000 rows or more at the same time without having to call my insert function a thousand times.
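For what it's worth, a minimal sketch of a set-based alternative (table and column names are guesses based on the snippet above): a single INSERT ... SELECT copies all rows in one round trip, and the NOT EXISTS guard skips (ID, Time) pairs that already exist, so duplicates need no rollback.

// Copy every row of the source ID to the destination ID server-side.
const string copySql = @"
    INSERT INTO [Values] (ID, [Time], [Value])
    SELECT @DestId, src.[Time], src.[Value]
    FROM [Values] AS src
    WHERE src.ID = @SourceId
      AND NOT EXISTS (SELECT 1 FROM [Values] AS dst
                      WHERE dst.ID = @DestId AND dst.[Time] = src.[Time]);";

using (var sqlCommand = new SqlCommand(copySql, sqlConnect))
{
    sqlCommand.Parameters.Add("@SourceId", SqlDbType.UniqueIdentifier).Value = ID;
    sqlCommand.Parameters.Add("@DestId", SqlDbType.UniqueIdentifier).Value = IdDestination;
    sqlCommand.ExecuteNonQuery(); // all rows in one statement, one implicit transaction
}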

Generate SQL insert script from Excel worksheet

I have a large excel worksheet that I want to add to my database.
Can I generate an SQL insert script from this excel worksheet?
I think importing using one of the methods mentioned is ideal if it truly is a large file, but you can use Excel to create insert statements:
="INSERT INTO table_name VALUES('"&A1&"','"&B1&"','"&C1&"')"
In MS SQL you can use:
SET NOCOUNT ON
to forgo showing all the '1 row affected' messages. And if you are doing a lot of rows and it errors out, put a GO between statements every once in a while.
You can create an appropriate table through the Management Studio interface and insert data into the table as shown below. It may take some time depending on the amount of data, but it is very handy.
There is a handy tool which saves a lot of time at
http://tools.perceptus.ca/text-wiz.php?ops=7
You just have to feed in the table name, field names and the data - tab separated and hit Go!
You can use the following Excel formula:
="INSERT INTO table_name(`"&$A$1&"`,`"&$B$1&"`,`"&$C$1&"`, `"&$D$1&"`) VALUES('"&SUBSTITUTE(A2, "'", "\'")&"','"&SUBSTITUTE(B2, "'", "\'")&"','"&SUBSTITUTE(C2, "'", "\'")&"', "&D2&");"
This improves upon Hart CO's answer as it takes into account column names and gets rid of compile errors due to quotes in the column. The final column is an example of a numeric value column, without quotes.
Depending on the database, you can export to CSV and then use an import method.
MySQL - http://dev.mysql.com/doc/refman/5.1/en/load-data.html
PostgreSQL - http://www.postgresql.org/docs/8.2/static/sql-copy.html
Use the ConvertFrom-ExcelToSQLInsert cmdlet from the ImportExcel module in the PowerShell Gallery.
NAME
ConvertFrom-ExcelToSQLInsert
SYNTAX
ConvertFrom-ExcelToSQLInsert [-TableName] <Object> [-Path] <Object>
[[-WorkSheetname] <Object>] [[-HeaderRow] <int>]
[[-Header] <string[]>] [-NoHeader] [-DataOnly] [<CommonParameters>]
PARAMETERS
-DataOnly
-Header <string[]>
-HeaderRow <int>
-NoHeader
-Path <Object>
-TableName <Object>
-WorkSheetname <Object>
<CommonParameters>
This cmdlet supports the common parameters: Verbose, Debug,
ErrorAction, ErrorVariable, WarningAction, WarningVariable,
OutBuffer, PipelineVariable, and OutVariable. For more information, see
about_CommonParameters (http://go.microsoft.com/fwlink/?LinkID=113216).
EXAMPLE
ConvertFrom-ExcelToSQLInsert MyTable .\testSQLGen.xlsx
You could use VB to write something that will output to a file row by row, adding the appropriate SQL statements around your data. I have done this before.
Here is another tool that works very well...
http://www.convertcsv.com/csv-to-sql.htm
It can take tab separated values and generate an INSERT script. Just copy and paste and in the options under step 2 check the box "First row is column names"
Then scroll down and under step 3, enter your table name in the box "Schema.Table or View Name:"
Pay attention to the delete and create table check boxes as well, and make sure you examine the generated script before running it.
This is the quickest and most reliable way I've found.
You can use the C# method below to generate the insert scripts from an Excel sheet; you just need to install the OfficeOpenXml (EPPlus) package from the NuGet Package Manager before executing the method.
// Requires: using System.IO; using System.Linq; using System.Text;
// and the OfficeOpenXml (EPPlus) NuGet package.
public string GenerateSQLInsertScripts()
{
    var outputQuery = new StringBuilder();
    var tableName = "Your Table Name";
    var filePath = @"D:\FileName.xlsx";
    if (File.Exists(filePath))
    {
        using (OfficeOpenXml.ExcelPackage xlPackage = new OfficeOpenXml.ExcelPackage(new FileInfo(filePath)))
        {
            var myWorksheet = xlPackage.Workbook.Worksheets.First(); // select the first sheet here
            var totalRows = myWorksheet.Dimension.End.Row;
            var totalColumns = myWorksheet.Dimension.End.Column;
            // Build the "INSERT INTO [table] ([col1],...) VALUES (" prefix once
            // from the header row; it is reused for every data row.
            var columns = new StringBuilder();
            var columnRows = myWorksheet.Cells[1, 1, 1, totalColumns].Select(c => c.Value == null ? string.Empty : c.Value.ToString());
            columns.Append("INSERT INTO [" + tableName + "] (");
            foreach (var colrow in columnRows)
            {
                columns.Append("[");
                columns.Append(colrow);
                columns.Append("]");
                columns.Append(",");
            }
            columns.Length--; // drop the trailing comma
            columns.Append(") VALUES (");
            for (int rowNum = 2; rowNum <= totalRows; rowNum++) // select the starting row here
            {
                var dataRows = myWorksheet.Cells[rowNum, 1, rowNum, totalColumns].Select(c => c.Value == null ? string.Empty : c.Value.ToString());
                var finalQuery = new StringBuilder();
                finalQuery.Append(columns);
                foreach (var dataRow in dataRows)
                {
                    finalQuery.Append("'");
                    finalQuery.Append(dataRow.Replace("'", "''")); // escape embedded single quotes
                    finalQuery.Append("'");
                    finalQuery.Append(",");
                }
                finalQuery.Length--; // drop the trailing comma
                finalQuery.Append(");");
                outputQuery.AppendLine(finalQuery.ToString());
            }
        }
    }
    return outputQuery.ToString();
}
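A hypothetical usage sketch (the output path is made up); review the generated script before running it:

var script = GenerateSQLInsertScripts();
System.IO.File.WriteAllText(@"D:\inserts.sql", script);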
Here is a link to an Online automator to convert CSV files to SQL Insert Into statements:
CSV-to-SQL
I generated this formula for inserting the Excel file data into the database.
Here id and price are numeric values and date is a datetime field. This formula covers all the types I require, so it may be useful to you as well:
="insert into product (product_id,name,date,price) values("&A1&",'" &B1& "','" &C1& "'," &D1& ");"
Id Name Date price
7 Product 7 2017-01-05 15:28:37 200
8 Product 8 2017-01-05 15:28:37 40
9 Product 9 2017-01-05 15:32:31 500
10 Product 10 2017-01-05 15:32:31 30
11 Product 11 2017-01-05 15:32:31 99
12 Product 12 2017-01-05 15:32:31 25
I often had to make SQL scripts, add them to source control, and send them to the DBA.
I used this ExcelIntoSQL app from the Windows Store: https://www.microsoft.com/store/apps/9NH0W51XXQRM
It creates a complete script with "CREATE TABLE" and INSERTs.
I have a reliable way to generate SQL inserts in batches, and you can modify some parameters during processing. It helps me a lot in my work, for example when copying hundreds of rows to a database with an incompatible structure and column count.
IntelliJ DataGrip is the powerful tool I use.
DataGrip can receive data in batches from WPS Office or MS Excel, by column or by row.
After copying, DataGrip can export the data as SQL inserts.