SELECT statement working unexpectedly - SQL

When I click "Select Top 1000 Rows" on the table, it only shows a few records (3 of them),
but when I manually run a query on the same table, it shows all the records (thousands of them), which is what I always want:
Select * from dbo.HrEmployee
Why? Help please. I'm using SQL Server 2012.

It looks like you have created two copies of the same table: one in the intended database and a second in the master database. Three records were then inserted into the intended table, and the rest were inserted into master.dbo.HrEmployee.
When you use "Select Top 1000 Rows" you are running the query against the correct database, even though it only has 3 records; when you run the second query manually, your connection is pointed at master, so you are running it against the copy of the table there.
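A quick way to confirm this is to run the same count against both locations with fully qualified names (a sketch; YourDatabase is a placeholder for your intended database name):
SELECT COUNT(*) FROM YourDatabase.dbo.HrEmployee; -- the intended copy
SELECT COUNT(*) FROM master.dbo.HrEmployee;       -- the accidental copy in master
If the first returns 3 and the second returns thousands, the table really does exist in both databases.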

Related

Prevent new data from showing up between the start of a query and its completion

I have a simple set of SQL queries (SELECT * FROM TABLE) for transporting data from a number of related tables, which takes about 30 seconds to run. I would like the returned data to exclude any changes that take place during those 30 seconds. The queries can't return an error when new data is inserted, and concurrent transactions can't be blocked or locked out. I also don't have access to any server settings, so I'm looking for a SQL command to achieve this.
If a field called TRN_ID exists in both TABLE_SHORT and TABLE_LONG, and a new TRN_ID is created during that 30-second window, I want to ensure that the TRN_ID values returned by my two queries are consistent with each other.
Edit: Here's my order of operations:
Run query - SELECT * FROM TABLEA
Insert new row into TABLEA & TABLEB
Select from the related table TABLEB, ensuring that only rows related to the rows returned by the query in step 1 are included.
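A minimal sketch of what consistent reads across steps 1 and 3 could look like, assuming SQL Server with ALLOW_SNAPSHOT_ISOLATION already enabled on the database (a database-level option rather than a server setting; TABLEA and TABLEB are the placeholder names from the question):
SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
BEGIN TRANSACTION;
    SELECT * FROM TABLEA; -- step 1: sees the data as of the transaction start
    -- a concurrent insert into TABLEA/TABLEB (step 2) is neither blocked nor visible here
    SELECT * FROM TABLEB; -- step 3: reads the same snapshot, so the TRN_ID sets stay consistent
COMMIT;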

How can we get the complete records of a column from a Netezza DB table? By default I am getting only 1000 records

I am not using any LIMIT keyword to tailor the output.
For instance:
SELECT Account_number from Customers
In the output grid I am getting only 1000 records, and after downloading the same output the record count remains the same (1000).
I want all the records; I have checked that the Account_Number column contains more than 20k records.
To overcome this issue, create a temporary table and dump your query output into it.
Once the table is created, go to Query > Current Query > Query Options; a dialog will appear.
In that dialog, check the option 'keep connection open between executions'.
You have to do this every time you fire the SQL query. It is a bit irritating, but you have to manage. 😊
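A sketch of the temporary-table step, assuming Netezza's CREATE TEMP TABLE ... AS SELECT syntax (tmp_accounts is a placeholder name):
CREATE TEMP TABLE tmp_accounts AS
SELECT Account_number FROM Customers;
SELECT * FROM tmp_accounts;
With 'keep connection open between executions' checked, the temp table survives on the same session, so you can then export its full contents rather than the grid-limited 1000 rows.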

How to create an error table that only logs last 100 entries

Is there a way to create a simple ERROR table that only keeps the last 100 entries, or do I have to write SQL that, after an insert, deletes any entries older than the 100th?
I am using a Derby database in a Java project.
The intent is to delete everything except the newest 100 rows. Note that a window function can't be used directly in a WHERE clause, and Derby only supports ROW_NUMBER() OVER () without an ORDER BY anyway; assuming Derby 10.6 or later (which allows ORDER BY and FETCH FIRST inside subqueries), you can express it like this (error_log and id are placeholder names):
delete from error_log
where id < (select min(id)
            from (select id from error_log
                  order by id desc
                  fetch first 100 rows only) as newest)
Since this is an embedded Derby database in a Java project rather than SQL Server, schedule the statement from the application (for example with a ScheduledExecutorService) every 15 minutes or so, and you will always have the newest 100 entries in the table.
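If you'd rather not schedule anything, another sketch (assuming your Derby version permits the triggered statement to modify the triggering table; again error_log and id are placeholders) is to let the database prune itself with a statement-level trigger:
CREATE TRIGGER trim_error_log
AFTER INSERT ON error_log
FOR EACH STATEMENT
DELETE FROM error_log
WHERE id < (SELECT MIN(id)
            FROM (SELECT id FROM error_log
                  ORDER BY id DESC
                  FETCH FIRST 100 ROWS ONLY) AS newest)
Because the trigger fires only on INSERT and performs a DELETE, it does not re-trigger itself.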

Merge Statement vs Lookup Transformation

I am stuck on a problem and torn between two different approaches.
Present Scenario:
I am using SSIS packages to get data from Server A to Server B every 15 minutes. I created 10 packages for 10 different tables, and also created 10 staging tables to go with them. In the Data Flow Task, each package selects rows from Server A whose ID is greater than the last imported ID and dumps them into a staging table (each table has its own staging table). After the Data Flow Task, I use a MERGE statement to merge records from the staging table into the destination table where the ID is not matched.
Problem:
This takes care of all newly inserted records, but once a record has been picked up by the SSIS job and is later updated at the source, I am not able to pick it up again and grab the updated data.
Questions:
How will I be able to achieve the updates without impacting the source database server too much?
Do I use a MERGE statement and select 10,000 records every single run (every 15 minutes)?
Do I use a Lookup transformation to do the updates?
Some tables have more than 2 million records and growing, so what is the best approach for them?
NOTE:
I can truncate the tables in the destination and reinsert the complete data for the first run.
Edit:
The source has a column 'LAST_UPDATE_DATE' which I can use in my query.
If I'm understanding your statements correctly, it sounds like you're pretty close to your solution. If you currently have a MERGE statement that handles the insert (when the source does not match the destination), you should be able to easily include the update logic for when the source does match the destination.
example:
MERGE target_table AS dest
USING (
    SELECT <column_name(s)>
    FROM source_table
) AS src
ON src.table_identifier = dest.table_identifier
WHEN MATCHED THEN
    UPDATE SET dest.column_name1 = src.column_name1,
               dest.column_name2 = src.column_name2
WHEN NOT MATCHED THEN
    INSERT (column_name1, column_name2)
    VALUES (src.column_name1, src.column_name2);
So, to your points:
The update can be achieved via the WHEN MATCHED logic within the MERGE statement.
If you have the last ID of the table you're loading, you can include it as a filter on your select statement so that the dataset stays incremental.
No Lookup transformation is needed when WHEN MATCHED is utilized.
For the large, growing tables, keep each run small by utilizing a filter in the select portion of the MERGE statement; see the sketch below.
Hope this helps
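For instance, since your source exposes LAST_UPDATE_DATE, the USING block above can be filtered so each 15-minute run only considers recently changed rows (a sketch; @LastLoadDate is a placeholder for the high-water mark your package would track):
USING (
    SELECT <column_name(s)>
    FROM source_table
    WHERE LAST_UPDATE_DATE > @LastLoadDate -- only rows changed since the previous run
) AS src
The rest of the MERGE stays the same, so both inserts and updates remain incremental and the load on the source server stays low.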

SQL Database Migration - Performance issue (Copying data from one database to another)

I have created a Windows application (C#.NET) to migrate data from one database to another.
It selects all rows of the Customer table from database ABC and inserts them into the Dealers table of database XYZ.
My problem is that when I select the top 1000 rows from the table, it takes 2 minutes and 35 seconds to insert those 1000 records,
but when I select the top 5000 rows from the table, it takes 15 minutes (not 10 minutes) to insert those 5000 records.
Is there any way to optimize the performance so that I can insert all the records/data quickly?
(Note: here, for every record in a foreach loop, I create SqlParameters and run an INSERT SQL statement.)
I have also added a progress bar, but when my Windows application loses focus it becomes non-responsive ('Not Responding' in the window title) and I can't see the progress bar's status (the inserts keep running in the background, just not in the UI).
How do I solve these two problems?
You can build a single SQL command covering a hundred records and fire that, instead of a single SQL command per record; this works as a bulk insert.
http://blogs.microsoft.co.il/blogs/bursteg/archive/2007/12/05/sql-server-2008-t-sql-insert-multiple-rows.aspx
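A sketch of the multi-row INSERT that link describes (the Dealers column names here are placeholders for your real schema); SQL Server 2008 and later accept up to 1000 rows per VALUES list:
INSERT INTO XYZ.dbo.Dealers (Name, City)
VALUES (@Name1, @City1),
       (@Name2, @City2),
       (@Name3, @City3); -- ...batch a few hundred rows per round trip
One command carrying a few hundred rows per execution cuts the per-statement overhead dramatically compared to one INSERT per record.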
For the progress bar, do the updating from a separate thread so the UI stays responsive.
http://msdn.microsoft.com/en-us/library/ms951089.aspx