Historical data usage of a table in MSSQL - sql

I'm a beginner in SQL, and I executed the stored procedure below in MSSQL to get disk-space information for a table. It gives the current status of the table. If I run it again tomorrow, it gives similar output, and we can insert that into a table for regular monitoring.
EXEC sp_spaceused '[dbo].[TableName]'
But I'm looking for the same data for the last year, to see how this table has grown month by month. I may want to extend this to all tables in my DB.
Running this stored procedure regularly will help with future statistics, but not with historical ones.

The subject is extensively discussed in the following post: DB Admin - Which table is causing the hike.
Hope this helps.
NR
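For the forward-looking part, here is a minimal sketch of capturing `sp_spaceused` output into a history table (the table and column names are illustrative, not from the original post):

```sql
-- History table for daily space snapshots (illustrative schema).
-- sp_spaceused returns its sizes as strings like '1234 KB',
-- so varchar columns are used here.
CREATE TABLE dbo.TableSpaceHistory (
    CaptureDate datetime    NOT NULL DEFAULT GETDATE(),
    TableName   sysname     NOT NULL,
    [Rows]      varchar(20) NULL,
    Reserved    varchar(20) NULL,
    Data        varchar(20) NULL,
    IndexSize   varchar(20) NULL,
    Unused      varchar(20) NULL
);

-- Capture the current figures for one table; run this daily
-- (e.g. from a SQL Agent job) to build up history over time.
INSERT INTO dbo.TableSpaceHistory (TableName, [Rows], Reserved, Data, IndexSize, Unused)
EXEC sp_spaceused '[dbo].[TableName]';
```

To extend this to every table, you could loop over `sys.tables` with a cursor and call `sp_spaceused` per table. This only builds statistics from today forward; as the linked post discusses, there is no built-in per-table growth history to recover for the past year.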

Related

SQL Server: copy newly added rows from one table and insert into another automatically

I need to perform some calculations using a few columns from a table. This database table, which gets updated every couple of hours, generates duplicates on a couple of columns every other day. There is no way to tell which row was inserted first, which affects my calculations.
Is there a way to copy these rows into a new table automatically as data gets added every couple of hours, and perform calculations on the fly? That way, whatever comes first is captured into a new table for a dashboard and for other business use cases.
I thought of creating a stored procedure and using a job scheduler to run it, but I don't have admin access and cannot schedule jobs. Is there another way of doing this efficiently? Much appreciated!
Edit: My request for admin access is being approved.
Another way, as stated in the answers, is:
Make a temp table.
Make a prod table.
Use a stored procedure to copy everything from the temp table into the prod table after each load has been done.
Use the same stored procedure to clean the temp table after the load is done.
I don't know if this will work for you, but this is in general how we deal with a huge amount of load on a daily basis.
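A minimal sketch of the staging-to-prod pattern described above (all object and column names are illustrative, assuming a key column that identifies duplicates):

```sql
-- The temp/staging table receives each load; the prod table keeps
-- the first-seen copy of each row.
CREATE PROCEDURE dbo.usp_MoveStagingToProd
AS
BEGIN
    SET NOCOUNT ON;

    -- Copy only rows whose key is not already in prod, so the
    -- first-inserted version of a duplicate wins.
    INSERT INTO dbo.ProdTable (KeyCol, Col1, Col2, LoadedAt)
    SELECT s.KeyCol, s.Col1, s.Col2, GETDATE()
    FROM dbo.StagingTable AS s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.ProdTable AS p
                      WHERE p.KeyCol = s.KeyCol);

    -- Clean the staging table for the next load.
    TRUNCATE TABLE dbo.StagingTable;
END;
```

Something still has to call this procedure after each load, so once the admin access mentioned in the edit comes through, a SQL Agent job is the natural scheduler for it.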

ssrs multiple datasets from a single stored procedure

I would like to know if there is a trigger I can use to drop a temporary SQL table when a report is closed in SSRS.
Background
I deserialise data in a stored procedure, and afterwards I want to reuse the data. The plan is to put the deserialised data into a temporary table, create multiple stored procedures with different result sets (to be used with multiple datasets in the SSRS report), and finally drop the temporary table with the deserialised data.
Currently I am thinking maybe I should create a SQL Agent job which will drop the table every day at a certain time (say 1 AM). However, I have a sneaking feeling that there might be a better way to do this.
Any help is greatly appreciated. Thanks.
My assumptions:
You can't deserialise the data when you run the report, because it takes too long?
You are happy for the data to be out of date in your report, because you are thinking of creating the table at 01:00 and potentially running the report at e.g. 17:00.
You use the phrase "temporary table", but that's not what you mean. A temporary table in SQL Server is one which is available for the lifetime of the connection and is then dropped.
If my assumptions are correct, your plan is fine. You wouldn't drop the table, though; have a look at TRUNCATE TABLE.
You may want to use SSIS, but it's not worth it for really simple procedures.
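A minimal sketch of the nightly refresh step using TRUNCATE instead of DROP, assuming a permanent table (here called `dbo.DeserialisedData`, an illustrative name) that the report's datasets read from:

```sql
CREATE PROCEDURE dbo.usp_RefreshDeserialisedData
AS
BEGIN
    SET NOCOUNT ON;

    -- Empty the table but keep its structure, indexes and permissions,
    -- so the dataset procedures that read from it never break.
    TRUNCATE TABLE dbo.DeserialisedData;

    -- Re-populate it; this SELECT stands in for the actual
    -- deserialisation logic in the original procedure.
    INSERT INTO dbo.DeserialisedData (Col1, Col2)
    SELECT Col1, Col2
    FROM dbo.SourceData;
END;
```

The SQL Agent job then calls this one procedure at 01:00, and each dataset procedure simply selects its own result set from `dbo.DeserialisedData`.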

Find out the SP due to which the columns of tables are updated

I have a database in which two columns of a table are being updated, but I don't know which stored procedure is executing the update. How can I find out which SP is updating those columns? Thanks in advance.
Thanks to your comment clarifying which database you are using, I suggest installing a tool like Redgate's SQL Search, which is free. Use it to search through your stored procs for the name of the column, and find the code where the update is occurring.
If you cannot deduce which proc is doing the update, you will have to add some logging mechanisms, as suggested by gilly3. Perhaps add new columns such as LastUpdateTime and LastUpdateProc, and add code in all procs to update them. Additionally, your procs can track their start/stop times by writing to another table meant for tracking proc execution.
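If installing a tool is not an option, a built-in alternative is to search procedure definitions through the `sys.sql_modules` catalog view (the column name in the LIKE pattern is a placeholder for your actual column):

```sql
-- Find stored procedures whose definition mentions the column.
SELECT OBJECT_SCHEMA_NAME(m.object_id) AS SchemaName,
       OBJECT_NAME(m.object_id)        AS ProcName
FROM sys.sql_modules AS m
JOIN sys.objects     AS o ON o.object_id = m.object_id
WHERE o.type = 'P'                         -- stored procedures only
  AND m.definition LIKE '%YourColumnName%';
```

This is a plain text match, so it will also return procs that merely read the column; you still have to inspect each hit for an UPDATE statement.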

DROP TABLE or DELETE TABLE? Which is best practice?

I'm working on redesigning some databases in my SQL Server 2012 instance.
I have databases where I put my raw data (from vendors), and then I have client databases where I create a view (based on client name) that only shows data for a specific client.
Because this data is volatile (Google AdWords & Google DFA), I typically just delete the last 6 days and insert 7 days every day from the vendor databases. Doing this gives me comfort in knowing that Google has had time to solidify its data.
The question I am trying to answer is:
1. Instead of using views, would it be better to use a SELECT INTO statement and DROP the table every day in the client database?
I'm afraid that automating my process using the DROP TABLE method will not scale well long-term. While testing it myself, performance seemed improved because the query does not have to scan the entire table for the date range. I've also tested this with an index on the date column, and performance still seemed better with the DROP TABLE method.
I am looking for best practices here.
NOTE: This is my first post. So I am not too familiar with how to format correctly. :)
Deleting rows from a table is a time-consuming process. Every deleted record is logged individually, and the performance of the server suffers.
Instead, databases offer TRUNCATE TABLE. This removes all the rows of the table without logging the individual row deletions, but keeps the structure intact. Triggers, indexes, constraints, stored procedures, and so on are also unaffected by the removal of rows.
In some databases, if you delete all rows from a table, then the operation is really truncate table. However, SQL Server is not one of those databases. In fact the documentation lists truncate as a best practice for deleting all rows:
To delete all the rows in a table, use TRUNCATE TABLE. TRUNCATE TABLE is faster than DELETE and uses fewer system and transaction log resources. TRUNCATE TABLE has restrictions, for example, the table cannot participate in replication. For more information, see TRUNCATE TABLE (Transact-SQL).
You can drop the table, but then you lose the auxiliary metadata as well: all the things listed above.
I would recommend that you truncate the table and reload the data using INSERT INTO or BULK INSERT.
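A minimal sketch of the truncate-and-reload version of the daily refresh (the database, table, and column names are illustrative stand-ins for the vendor and client objects described in the question):

```sql
-- Daily refresh: empty the client table, keeping its indexes,
-- constraints and permissions intact.
TRUNCATE TABLE ClientDB.dbo.ClientAdWordsData;

-- Reload the trailing 7-day window from the vendor database.
INSERT INTO ClientDB.dbo.ClientAdWordsData (ClientName, [Date], Clicks, Cost)
SELECT ClientName, [Date], Clicks, Cost
FROM VendorDB.dbo.AdWordsRaw
WHERE [Date] >= DATEADD(DAY, -7, CAST(GETDATE() AS date))
  AND ClientName = 'SpecificClient';
```

Unlike the DROP TABLE / SELECT INTO approach, the date-column index survives the refresh, so it never has to be recreated.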

SQL Server how to get last inserted data?

I ran a large query (~30 MB) which inserts data into ~20 tables. Accidentally, I selected the wrong database. There are only 2 tables with the same name but with different columns. Now I want to make sure that no data was inserted into this database; I just don't know how.
If your table has a timestamp column, you can test for that.
SQL Server also keeps a log of all transactions.
See: https://web.archive.org/web/20080215075500/http://sqlserver2000.databases.aspfaq.com/how-do-i-recover-data-from-sql-server-s-log-files.html
This shows you how to examine the log to see whether any inserts happened.
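Another quick check, without digging through the log, is the index-usage DMV, which records the last write per table since the last SQL Server restart (run it in the database you are worried about):

```sql
-- last_user_update records the most recent INSERT/UPDATE/DELETE
-- per index; the MAX over all indexes gives the table's last write.
SELECT OBJECT_NAME(s.object_id) AS TableName,
       MAX(s.last_user_update)  AS LastWrite
FROM sys.dm_db_index_usage_stats AS s
WHERE s.database_id = DB_ID()      -- current database only
GROUP BY s.object_id
ORDER BY LastWrite DESC;
```

If the two same-named tables show no `LastWrite` around the time you ran the query (or don't appear at all), nothing was inserted into them. Note that these statistics are reset when the instance restarts.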
For the future, the best option is a trigger: use a trigger to capture the database name, the table name, and a history of the records manipulated.