I get asked quite a lot at work to monitor folders and delete files older than x days, or to stop them growing out of control. These folders are normally FTP backups of things that do not age out on their own, like config backups (NSX, I'm looking at you).
There are a couple of ways to do this. The simplest being:
- Delete files older than x days
- Keep a set number of files (x), deleting the oldest first
These two requirements meet most of my needs and do not have to be very complex. Sometimes I may need to add things like overrides for files with a certain extension or string in the name.
But for now I'll just publish the two scripts listed above.
Note: in both scripts I have left the -WhatIf switch on the Remove-Item command. This means the scripts will only list what they would delete. Remove the -WhatIf switch to actually delete things.
The first script will simply list all files in a directory tree and delete any file with a LastWriteTime older than x days.
```powershell
$rootdir    = "A:\Path\To\The\Folder"
$daysToKeep = 30

$files  = Get-ChildItem -Path $rootdir -Recurse -File
$cutoff = (Get-Date).AddDays(-$daysToKeep)

foreach ($file in $files) {
    if ($file.LastWriteTime -lt $cutoff) {
        "Deleting " + $file.FullName
        $file | Remove-Item -Force -Confirm:$false -WhatIf
    }
}
```
I use something like this on file repositories that hold temporary data, and on FTP servers that hold backup configs.
The second script will list the files in a directory by LastWriteTime (newest first) then delete them, skipping the first x files.
```powershell
$rootdir   = "A:\Path\To\The\Folder"
$keepFiles = 12

$files = Get-ChildItem -Path $rootdir -File
$filesToDelete = $files | Sort-Object LastWriteTime -Descending | Select-Object -Skip $keepFiles

if ($filesToDelete) { $filesToDelete | Remove-Item -Force -Confirm:$false -WhatIf }
```
This script is also used to control config backups.
These scripts are just the basic code needed to keep a file repository under control. I can add things like logging, file overrides, etc. as needed.
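As a rough sketch of what those additions might look like, here is the age-based script with a simple name-pattern override and a log file bolted on. The `*.bak` pattern and the log path are placeholders I made up for illustration, not values from the scripts above, and -WhatIf is left on here too.

```powershell
# Sketch only: age-based cleanup with an exclusion override and basic logging.
# The "*.bak" pattern and log path are example values, adjust to suit.
$rootdir    = "A:\Path\To\The\Folder"
$daysToKeep = 30
$keepAlways = @("*.bak")                    # name patterns to never delete
$logFile    = "A:\Path\To\cleanup.log"

$cutoff = (Get-Date).AddDays(-$daysToKeep)

Get-ChildItem -Path $rootdir -Recurse -File |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    Where-Object {
        # Skip any file whose name matches an override pattern
        $name = $_.Name
        -not ($keepAlways | Where-Object { $name -like $_ })
    } |
    ForEach-Object {
        "$(Get-Date -Format s) Deleting $($_.FullName)" | Add-Content -Path $logFile
        $_ | Remove-Item -Force -Confirm:$false -WhatIf
    }
```

The override is just a second Where-Object stage, so extra rules (match on a string in the name, minimum file count, and so on) can be added as further pipeline stages without restructuring the script.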