I have recently run into a problem with a SQL table: a mistake in my code dropped a column in the table and then recreated it, and this process was repeated many times before I discovered it.
As a result, it appears that SQL Server hasn't properly deleted the columns. For instance, I get this message when I want to add a new column:
"Warning: The table "Matches" has been created, but its maximum row size exceeds the allowed maximum of 8060 bytes. INSERT or UPDATE to this table will fail if the resulting row exceeds the size limit."
And I also have trouble updating column values because they exceed the maximum limits.
And to be clear, the table does not have more than 30 (visible) columns and each one is at most varchar(40), so it appears that the dropped columns still take up space somewhere. But how do I delete them?
Thanks in advance.
Related
I have a 'Customer' table with almost 1.2 million records, in which one column, 'customer_records', is of type ntext and contains XML data. I need to replace a URL value in all existing records of that column. I have tried the replace query below, but it takes around 20 minutes to execute.
UPDATE Customer
SET Customer_records = CAST(REPLACE(CAST(Customer_records AS nvarchar(max)),
        N'http://testuser.testcompany.net', N'https://replaceurl.testcompany.net') AS ntext)
CPU consumption is maxed out during the update, which is the main concern.
Out of the 1.2 million records only about 600 thousand actually need the update, but the query has to read every record to find and replace the URL text.
Is there any way this can be performed more efficiently while still using REPLACE?
Try Python and a pandas DataFrame. Or run the update with the WITH (TABLOCK) hint.
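For example, something along these lines, a sketch that combines the TABLOCK hint with a filter so only rows that still contain the old URL are rewritten; whether it actually reduces the CPU pressure would need testing on the real data, and the batch size is arbitrary:
-- Batched update with a table lock; rows that have already been converted no
-- longer match the LIKE filter, so each pass shrinks until the loop ends.
DECLARE @rows int = 1;
WHILE @rows > 0
BEGIN
    UPDATE TOP (50000) Customer WITH (TABLOCK)
    SET Customer_records = CAST(REPLACE(CAST(Customer_records AS nvarchar(max)),
            N'http://testuser.testcompany.net',
            N'https://replaceurl.testcompany.net') AS ntext)
    WHERE Customer_records LIKE N'%http://testuser.testcompany.net%';

    SET @rows = @@ROWCOUNT;
END;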
I am facing a column length issue in my table: I want to change a column's data type from int to bigint. The table has around 1 billion rows, but whenever I try to change the data type it takes too much time and eats all of my machine's disk space. What is the best and fastest way to change the column data type on this big table? Currently the table has no indexes.
Current column length:
ID: length 4, precision 10
Tried:
I added a new column, set its data type to bigint, and ran an update query to copy the values.
Suggestion:
SELECT ... INTO a new table is a fast way, but can we set the column type with this query?
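i.e. something like the sketch below, where the CAST in the select list determines the data type of the new table's column (table and column names are placeholders):
-- SELECT ... INTO creates the target table from the types of the select list,
-- so casting ID to bigint sets the new column's type during the copy.
SELECT CAST(ID AS bigint) AS ID,
       OtherColumn               -- remaining columns are copied with their existing types
INTO   dbo.MyTable_new
FROM   dbo.MyTable;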
Table Size :
Issue resolved: it took less than 2 hours using the following steps (a rough sketch of the scripts follows below).
Create an empty table with the new column data type.
Copy the data over with around 21 batch scripts.
Create a clustered index on the bigint column.
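A simplified sketch of that approach (table and column names are placeholders, and the batch ranges are only illustrative):
-- Step 1: empty table with the new data type.
CREATE TABLE dbo.MyTable_new (
    ID      bigint       NOT NULL,
    Payload varchar(100) NULL      -- placeholder for the remaining columns
);

-- Step 2: copy the rows over in batches so each insert stays manageable.
DECLARE @from bigint = 0, @step bigint = 50000000;   -- ~21 batches for ~1 billion rows
WHILE @from < 1050000000
BEGIN
    INSERT INTO dbo.MyTable_new (ID, Payload)
    SELECT ID, Payload
    FROM dbo.MyTable
    WHERE ID >= @from AND ID < @from + @step;

    SET @from = @from + @step;
END;

-- Step 3: build the clustered index only after all the data is in.
CREATE CLUSTERED INDEX IX_MyTable_new_ID ON dbo.MyTable_new (ID);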
I have a table in SQL Server where one column contains Excel files.
Now we need to remove those Excel files only, without deleting the entire rows, because the size of this table is increasing day by day and we need to remove old data to decrease its size.
Id    file_name  Code
1001  abc.xlsx   A1
1002  das.xlsx   A2
1003  kap.xlsx   A3
I have done the below:
UPDATE rec_table
SET file_name = NULL WHERE Id = '1001'
Will this help to reduce the size of the table?
Thanks
In SQL Server the size of a table is calculated by adding up the size of every row in the table, i.e. if a table has 10 rows, then the total size of the table is the sum of the sizes of those 10 rows.
For a row, the total size is calculated by adding up the size of every column.
For example, in your case the size of the row with Id 1001 will be:
size of the value in column Id + size of the value in column file_name + size of the value in column Code
So if a column holds the value NULL for a particular row, then that column will have a data size of 0.
Updating the values of a particular column to NULL will therefore reduce the size of the rows, but how much they shrink depends on the type of the column and the data stored in it.
Which means: if your column file_name holds 100 bytes of data for the row with Id 1001, then updating that value to NULL will reduce the table size by 100 bytes.
You may use the following queries to find out the table / row size.
To get the size and details for the whole table:
DBCC SHOWCONTIG ('Person.Person') WITH TABLERESULTS
To get the data size of a particular column for each row in the table:
SELECT DATALENGTH(FirstName) FROM Person.Person
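For example, applied to your rec_table (assuming Id is an int and Code is a short string), something like this shows how much the file_name value contributes to each row:
-- DATALENGTH returns NULL for NULL values, so wrap it in ISNULL when summing.
SELECT Id,
       DATALENGTH(file_name) AS file_name_bytes,
       DATALENGTH(Id) + ISNULL(DATALENGTH(file_name), 0) + DATALENGTH(Code) AS row_data_bytes
FROM rec_table;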
Will this reduce the size of the table? Probably.
Will this release free space to the database? Probably not -- until you compact the database.
Is this an expensive operation? Very.
Databases store tables on data pages. Each data page contains one or more rows. If you have wide columns, then these might be stored on their own data pages.
The number of rows that fit on a page depends on the size of the rows. A page is about 8k bytes. If a row is 100 bytes, then a table with 1 row occupies the same space as one with 50 rows.
When you remove a column from a table, the entire table needs to be rewritten. This is a very expensive operation. And it might take a long time. Often it is faster to select the columns that you do want and reload the original table.
Removing a column is -- to me -- a very curious way to reduce the size of a table. More typically, older data would be removed. The most efficient method is to partition the table by time, using whatever the appropriate date/time column might be. Then you can quickly recover space by dropping a partition.
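A minimal sketch of the partitioning idea (every name here, and the date column itself, is an assumption, since the question doesn't show one; partition-level TRUNCATE needs SQL Server 2016 or later):
-- Partition a documents table by month so old data can be dropped per partition.
CREATE PARTITION FUNCTION pf_ByMonth (date)
    AS RANGE RIGHT FOR VALUES ('2024-01-01', '2024-02-01', '2024-03-01');

CREATE PARTITION SCHEME ps_ByMonth
    AS PARTITION pf_ByMonth ALL TO ([PRIMARY]);

CREATE TABLE dbo.Documents (
    Id         int            NOT NULL,
    file_data  varbinary(max) NULL,
    created_on date           NOT NULL
) ON ps_ByMonth (created_on);

-- Removing the oldest month is then a metadata operation, not a row-by-row delete.
TRUNCATE TABLE dbo.Documents WITH (PARTITIONS (1));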
Oracle has a max column limit of 1000 and even with all columns defined as VARCHAR(4000) I was able to create the table and load huge amounts of data in all fields.
I was able to create a table in SQL Server with 500 varchar(max) columns, however when I attempted to insert data I got the following error:
Cannot create a row of size 13075 which is greater than the allowable
maximum row size of 8060.
When I made the table 200 columns I was able to insert huge amounts of data.
Is there a way to do this in SQL Server?
I ran some tests and it seems there is an overhead of 26 bytes for each populated varchar(max) column.
I was able to populate 308 columns.
If you divide your columns between 2 tables you'll be fine (until the next limitation, which will come).
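The kind of test that shows this looks roughly like the sketch below; the value length just needs to be large enough to push each column's data off-row, so that only the per-column overhead described above stays in the row:
-- Build a table with @cols varchar(max) columns and populate all of them with
-- large values; the in-row overhead of the off-row pointers is what eventually
-- hits the 8060-byte limit. With @cols = 308 the insert still works; pushing it
-- toward 500 reproduces the row-size error from the question.
DECLARE @cols int = 308, @i int, @sql nvarchar(max);

SET @sql = N'CREATE TABLE dbo.WideTest (';
SET @i = 1;
WHILE @i <= @cols
BEGIN
    SET @sql += N'c' + CAST(@i AS nvarchar(10)) + N' varchar(max) NULL'
              + CASE WHEN @i < @cols THEN N', ' ELSE N')' END;
    SET @i += 1;
END;
EXEC sys.sp_executesql @sql;

SET @sql = N'INSERT INTO dbo.WideTest VALUES (';
SET @i = 1;
WHILE @i <= @cols
BEGIN
    SET @sql += N'REPLICATE(CAST(''x'' AS varchar(max)), 20000)'
              + CASE WHEN @i < @cols THEN N', ' ELSE N')' END;
    SET @i += 1;
END;
EXEC sys.sp_executesql @sql;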
P.S.
I seriously doubt the justification for this table structure.
Is there any reason not to save the data as rows instead of columns?
I'm currently doing an investigation to determine how long UPDATE STATISTICS takes for tables of different sizes (100K, 1M, 10M, 100M, etc. rows), using a table that has one column.
Currently I have tables created with the correct row counts, however they do not have one column but rather 5, 6, 7, etc.
I was wondering whether there is a way to update statistics for a certain column in a table, or for a certain number of columns in a table.
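Something along the lines of the sketch below is what I'm after (table, column, and statistics names are just placeholders):
-- Create a single-column statistics object, then refresh only that one,
-- instead of updating statistics for every column in the table.
CREATE STATISTICS st_Col1 ON dbo.TestTable (Col1);

UPDATE STATISTICS dbo.TestTable (st_Col1) WITH FULLSCAN;

-- For comparison: refresh every statistics object on the table.
UPDATE STATISTICS dbo.TestTable;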
Thanks.