
Sometimes it is nice to have a tool or report that shows how much your backup storage has degraded over time, especially due to fragmentation and auto grow/shrink operations. But building one often requires administrators to spend a lot of time setting up baseline monitoring, collecting the data and analyzing it. I found one useful way to save your time.

It does have some prerequisites. I assume you are not cleaning up your backup history table (as usual :) ) and that you back up your databases to storage which also holds some database files. So it is mainly useful in smaller environments with smaller storage, where data files share the same DAS or a centralized SAN solution with a shared RAID group.
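
Before running the report, it is worth checking how far back your backup history actually goes. The quick check below against msdb.dbo.backupset is just my illustration, not part of StorageThr.sql:

-- How far back does the backup history in msdb reach?
-- If this covers only a few days, you will not see a long-term trend.
SELECT
    MIN(backup_start_date) AS oldest_backup,
    MAX(backup_start_date) AS newest_backup,
    COUNT(*)               AS backup_count
FROM msdb.dbo.backupset;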


Then you can use my script to extract that information: 
StorageThr.sql


As a result you will get a list of values with a throughput_in_MB_per_min column for every backup taken. This can then be exported into reports or graphs like this one:
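If you are curious how such a value can be derived, the core idea is backup size divided by backup duration, both taken from the msdb history tables. The following is only a simplified sketch of mine; the actual StorageThr.sql may calculate it differently:

-- Simplified sketch: throughput per backup derived from msdb history.
-- throughput_in_MB_per_min = backup size in MB / duration in minutes
SELECT
    bs.database_name,
    bs.backup_start_date,
    bs.backup_finish_date,
    CAST(bs.backup_size / 1048576.0 AS DECIMAL(18,2)) AS backup_size_MB,
    CAST((bs.backup_size / 1048576.0)
         / NULLIF(DATEDIFF(SECOND, bs.backup_start_date, bs.backup_finish_date), 0)
         * 60.0 AS DECIMAL(18,2)) AS throughput_in_MB_per_min
FROM msdb.dbo.backupset AS bs
WHERE bs.type = 'D'   -- full backups only
ORDER BY bs.backup_start_date;

Plotting throughput_in_MB_per_min against backup_start_date is enough to spot a steady decline over the months.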
