If you are dealing with oversized SharePoint On-Premises content databases, you are not alone, and chances are the issue started long ago, when the initial architecture and quota planning were overlooked. Maybe no one expected a team site to grow to 200 GB or, as in our experience, even 2 TB. Maybe retention and versioning settings were never reviewed.
Now you are paying the real cost of poor initial planning: long backup times, bloated transaction log files and sluggish performance, especially during critical operations like disaster recovery or content database upgrades. When a content database grows beyond reasonable limits and lacks proper maintenance routines, it becomes increasingly fragile and hard to scale. Restoring even a single document can become a painful, time-consuming process. As a result, your ability to meet recovery time objectives and service level agreements (SLAs) is seriously compromised.
We advise keeping databases under 100 GB and proactively splitting large site collections to avoid issues, rather than waiting until it becomes urgent.
A single SharePoint content database can hold multiple site collections, especially if you have never added new content databases. Site collections can be moved between content databases, but splitting a single site collection is complex and not easily done. For that reason, we strongly recommend avoiding the use of subsites.
As a SharePoint on-premises administrator, proactively monitoring your server farms is critical to prevent issues with large content databases. A custom dashboard, similar to the one implemented in our organization, lets you track database growth and performance effectively. Regular monitoring helps you avoid painful situations, such as struggling to shrink a database or watching a farm upgrade appear to hang on an oversized one.
Part of your role is also to educate developers, project managers and, where possible, end users in order to keep your farms healthy.
Find large content databases
Before you dive into fixes or cleanup, you need to know what you’re dealing with. How big are your content databases, really?
To get a clear picture, open the SharePoint Management Shell (or PowerShell ISE) with an account that has sufficient farm permissions and run code like this:
# Load the SharePoint snap-in (already loaded in the SharePoint Management Shell)
Add-PSSnapin Microsoft.SharePoint.PowerShell
$db_list = Get-SPContentDatabase
foreach ($db in $db_list)
{
    # Database name and approximate size in GB
    $db.Name
    $db.DiskSizeRequired / 1GB
    ""
}
This gives you a quick overview of all SharePoint databases and their approximate size in gigabytes.
You could also list only the largest content databases, for example those over 50 GB:
Add-PSSnapin Microsoft.SharePoint.PowerShell
$db_list = Get-SPContentDatabase
foreach ($db in $db_list)
{
    # Report only content databases larger than 50 GB
    $db_size = $db.DiskSizeRequired / 1GB
    if ($db_size -gt 50)
    {
        $db.Name
        $db_size
        ""
    }
}
You can also check the size directly in SQL Server Management Studio (SSMS):
Open SSMS and connect to your SharePoint SQL instance.
Right-click on the content database > Reports > Standard Reports > Disk Usage.
You will see data size, log size, index size, and free space.
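If you prefer to script that SQL-side check, the same data and log sizes can be read from sys.master_files. Here is a minimal sketch using the Invoke-Sqlcmd cmdlet (SqlServer module); the instance name SQLSP01 is a placeholder for your own SharePoint SQL instance:

# Data and log file sizes per database, in MB
$query = @"
SELECT DB_NAME(database_id) AS DatabaseName,
       type_desc             AS FileType,
       size * 8 / 1024       AS SizeMB
FROM sys.master_files
ORDER BY DatabaseName, FileType;
"@
Invoke-Sqlcmd -ServerInstance "SQLSP01" -Query $query | Format-Table -AutoSize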
Anything over 100 GB should raise a flag, especially if it contains a single site collection.
Knowing your numbers is step one. From there, you can decide whether to split, clean up, or monitor more closely.
Reduce database size
Keep an Eye on the SQL Transaction Log
One of the first things to check when a SharePoint content database gets too big is the transaction log file (the .ldf file). If it is growing out of control, your disk space might start disappearing fast…
Ask yourself a few key questions:
- Is the recovery model set to Full? It probably is, especially in production.
- Are transaction log backups running on a regular basis, ideally every few minutes?
- Have you checked with your DBA or backup team to make sure everything is in place?
If you’re using the Full recovery model but not backing up the log frequently, SQL Server will keep every transaction since the last log backup, which means the .ldf file will keep growing indefinitely.
If you are not using log shipping or point-in-time recovery and your SLA allows it, consider switching to the Simple recovery model to avoid runaway log growth; in most cases, however, content databases should stay in the Full recovery model with frequent log backups.
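A quick way to spot the risky combination (Full recovery model but no recent log backup) is to compare sys.databases with the backup history stored in msdb. A minimal sketch, again with a placeholder instance name:

$query = @"
SELECT d.name                     AS DatabaseName,
       d.recovery_model_desc      AS RecoveryModel,
       MAX(b.backup_finish_date)  AS LastLogBackup
FROM sys.databases d
LEFT JOIN msdb.dbo.backupset b
       ON b.database_name = d.name AND b.type = 'L'
GROUP BY d.name, d.recovery_model_desc
ORDER BY d.name;
"@
# Databases in Full recovery with an old (or NULL) LastLogBackup deserve attention
Invoke-Sqlcmd -ServerInstance "SQLSP01" -Query $query | Format-Table -AutoSize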
If the data file (.mdf) itself is too large, you’ll need to implement one of the available solutions, such as splitting the content database or site collections, to manage its size effectively.
Check Storage Metrics
Before you move a site collection to another database, I advise you to check its storage metrics:
- Using PowerShell: $site.Usage.Storage/1GB (this figure may differ from the size reported by SQL).
- Using the Storage Metrics page under Site Collection Administration, or by appending /_layouts/storman.aspx to the site URL.

From there you can see which libraries or files occupy the most space; perhaps you could move the largest libraries to another site collection…
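To compare the SharePoint-reported size of every site collection in a given database, a small loop over Get-SPSite is enough. A minimal sketch, assuming a placeholder database name WSS_Content_Teams:

Add-PSSnapin Microsoft.SharePoint.PowerShell
# 'WSS_Content_Teams' is a placeholder -- use your own content database name
Get-SPSite -ContentDatabase "WSS_Content_Teams" -Limit All | ForEach-Object {
    [PSCustomObject]@{
        Url    = $_.Url
        SizeGB = [math]::Round($_.Usage.Storage / 1GB, 2)
    }
} | Sort-Object SizeGB -Descending | Format-Table -AutoSize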
Check recycle bin
Perhaps you need to empty the first-stage or second-stage recycle bin to release space.
You can also do this with a custom PowerShell script, such as the sketch below.
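Here is a minimal sketch of such a script, with a placeholder site collection URL; it empties the first-stage (end-user) recycle bin of every web and then the second-stage (site collection) recycle bin. Items deleted this way cannot be recovered, so test it on a non-production site first.

Add-PSSnapin Microsoft.SharePoint.PowerShell
# 'http://sharepoint/sites/teams' is a placeholder URL
$site = Get-SPSite "http://sharepoint/sites/teams"
# Empty the first-stage (end-user) recycle bin of every web
foreach ($web in $site.AllWebs)
{
    $web.RecycleBin.DeleteAll()
    $web.Dispose()
}
# Empty the second-stage (site collection) recycle bin
$site.RecycleBin.DeleteAll()
$site.Dispose()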
Versioning Cleanup
Old versions can take up a huge amount of space, especially if no version limit is set.
If that is the case, set a limit on the number of versions kept and then trim the old versions with a custom PowerShell script, such as the sketch below.
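A minimal sketch of such a cleanup, using placeholder values for the web URL and the number of versions to keep; it caps major versions on each versioned document library and trims the existing history. Deleted versions cannot be recovered, and for very large libraries you would page through items with an SPQuery rather than loading them all, so treat this as a starting point and validate it on a test library first.

Add-PSSnapin Microsoft.SharePoint.PowerShell
# Placeholder values -- adjust the URL and the number of versions to keep
$webUrl         = "http://sharepoint/sites/teams"
$versionsToKeep = 10
$web = Get-SPWeb $webUrl
foreach ($list in ($web.Lists | Where-Object { $_.BaseType -eq "DocumentLibrary" -and $_.EnableVersioning }))
{
    # Cap the number of major versions kept from now on
    $list.MajorVersionLimit = $versionsToKeep
    $list.Update()
    # Trim the existing version history down to the new limit
    foreach ($item in $list.Items)
    {
        if ($item.Versions.Count -gt $versionsToKeep)
        {
            # Index 0 is the current version, so everything past the limit is old
            $oldVersions = @($item.Versions)[$versionsToKeep..($item.Versions.Count - 1)]
            foreach ($version in $oldVersions) { $version.Delete() }
        }
    }
}
$web.Dispose()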
Retention Policies
- If not enabled, enable them on key libraries.
- Retention policies help clean up or archive unused content.
- Configure them through Central Administration or the library's Information Management Policy (compliance) settings.
Media Files Review and Audit
- Use tools or PowerShell (see the sketch after this list) to check for:
  - Large PowerPoint (.ppt/.pptx) and PDF files
  - Videos
- Recommend linking to video platforms (e.g., Stream, YouTube, or internal servers) rather than storing videos in document libraries.
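A minimal sketch of such an audit, with a placeholder web URL and a 100 MB threshold; it walks each document library and reports every file above the threshold, whatever its type:

Add-PSSnapin Microsoft.SharePoint.PowerShell
# Placeholder values -- adjust the URL and the size threshold
$webUrl    = "http://sharepoint/sites/teams"
$minSizeMB = 100
$web = Get-SPWeb $webUrl
foreach ($list in ($web.Lists | Where-Object { $_.BaseType -eq "DocumentLibrary" }))
{
    foreach ($item in $list.Items)
    {
        if ($item.File -and ($item.File.Length / 1MB) -gt $minSizeMB)
        {
            # File URL and size in MB
            "{0}  {1:N0} MB" -f $item.File.ServerRelativeUrl, ($item.File.Length / 1MB)
        }
    }
}
$web.Dispose()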
Move large site collection to a dedicated content database
You need to add a new content database that will contain one of the largest site collections.
First, make sure the new database has sensible autogrowth settings in SQL Server and set it to the Simple recovery model for the duration of the move.
Then notify the users that the site collection will be in read-only mode during its transfer to the new database.
Then use the Move-SPSite PowerShell command to move the site collection to the destination database.
Once the move is complete, set the recovery model back to Full in SQL Server.
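A minimal sketch of that sequence, with placeholder names for the web application, the destination database and the site collection URL:

Add-PSSnapin Microsoft.SharePoint.PowerShell
# Placeholder names -- adjust to your environment
$webApp  = "http://sharepoint"
$newDb   = "WSS_Content_LargeTeamSite"
$siteUrl = "http://sharepoint/sites/largeteamsite"
# 1. Create the destination content database
New-SPContentDatabase -Name $newDb -WebApplication $webApp
# 2. Set the site collection to read-only for the duration of the move
Set-SPSite -Identity $siteUrl -LockState ReadOnly
# 3. Move the site collection to the new database
Get-SPSite $siteUrl | Move-SPSite -DestinationDatabase $newDb -Confirm:$false
# 4. Run IISRESET on the web servers, then unlock the site collection
Set-SPSite -Identity $siteUrl -LockState Unlock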
What if the site collection content database is still greater than 100 GB?
You have a few options briefly listed here:
- Review the architecture and split this site collection into multiple smaller ones, for example per year or per topic.
- Promote subsites to site collections; there are several ways to achieve this (a sketch of one approach follows this list).
- Delete irrelevant data.
- Refine your retention policy.
- If you have lots of media files, consider using a video library solution like the one we developed.
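For the subsite promotion mentioned above, one common (if imperfect) approach is to export the subsite and import it into the root of a new site collection. A minimal sketch with placeholder URLs and paths; exports of this kind do not carry everything over (running workflows, some customizations and settings may need rework), so validate the result on a test farm first.

Add-PSSnapin Microsoft.SharePoint.PowerShell
# Placeholder URLs and paths -- adjust to your environment
$subWebUrl  = "http://sharepoint/sites/teams/projectx"
$newSiteUrl = "http://sharepoint/sites/projectx"
$exportFile = "D:\Backups\projectx.cmp"
# 1. Export the subsite, including security and all versions
Export-SPWeb -Identity $subWebUrl -Path $exportFile -IncludeUserSecurity -IncludeVersions All
# 2. Create the destination site collection (no template, so the import defines the root web)
New-SPSite -Url $newSiteUrl -OwnerAlias "DOMAIN\spadmin" -Name "Project X"
# 3. Import the exported content into the root of the new site collection
Import-SPWeb -Identity $newSiteUrl -Path $exportFile -IncludeUserSecurity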
Prevent large databases
Automating Quotas in Site Creation Scripts
When adding a new site collection, whether through the Central Administration UI or in PowerShell, make sure you limit its size by assigning a quota template.
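A minimal sketch, assuming a quota template named TeamSite_50GB already exists in Central Administration (the name is a placeholder):

Add-PSSnapin Microsoft.SharePoint.PowerShell
# Assign the quota template at creation time
New-SPSite -Url "http://sharepoint/sites/newproject" -OwnerAlias "DOMAIN\spadmin" -Template "STS#0" -QuotaTemplate "TeamSite_50GB"
# Or assign it to an existing site collection
Set-SPSite -Identity "http://sharepoint/sites/existingproject" -QuotaTemplate "TeamSite_50GB"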
When to Leave It Alone
If a database is large but stable and unused, you may not need the Full recovery model for it. Such databases remain supported by Microsoft, provided you accept the trade-offs: recovering individual data from them is difficult, and any farm upgrade involving them will take a very long time.
Need help on this topic? This is one of our specialties, so please contact me.