Jv Cyberguard

Building a File Integrity Monitoring Tool with PowerShell


Introduction


Hey everyone! 👋 I'm excited to share a project I've been working on that's both a fun challenge and a crucial tool in cybersecurity—building a File Integrity Monitoring (FIM) solution using PowerShell.


We all know how vital it is to keep our systems secure, and file integrity monitoring plays a significant role in that. It helps us detect unauthorized changes to critical files, which could indicate a security breach or system compromise.


But here's the twist—I decided to take on this project not just to bolster my security toolkit, but also to sharpen my PowerShell skills. Talk about killing two birds with one stone! Let's dive into how I brought this idea to life, the hurdles I faced, and the solutions I crafted along the way.


The Game Plan

I set out with a clear goal: build a PowerShell script that monitors a specified directory for any file additions, modifications, or deletions. The script would:


  • Prompt the User: Allow the user to specify the directory and files they want monitored.

  • Create a Baseline: Generate a snapshot of all files and their hash values in the target directory.

  • Monitor Changes: Continuously watch for any changes compared to the baseline.

  • Alert on Events: Notify me whenever there's a new file, a modification, or a deletion.


But as with any project, the devil is in the details. Let's break down how I approached each part.



Building the Baseline


First things first: I needed a reliable way to create a baseline of all the files in the directory, complete with their hash values for integrity checks. The baseline here is the file hash, since as long as the file remains unaltered, the hash will remain the same. Any change to the file results in a different hash, which is a deviation from our defined baseline.
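To make that concrete, here's a quick illustration you can try yourself (the file path is just an example):

# The hash stays the same as long as the content does, and changes as soon as the content changes.
Get-FileHash -Path .\WorkDocuments\doc1.txt -Algorithm SHA256     # note the Hash value
Add-Content -Path .\WorkDocuments\doc1.txt -Value "tampered!"     # simulate an unauthorized change
Get-FileHash -Path .\WorkDocuments\doc1.txt -Algorithm SHA256     # the Hash value is now completely different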


The Approach

My goal for this program was to ensure interactivity and usability, and you will see this reflected throughout. I also created five test documents that we will monitor.
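If you'd like to follow along, a one-liner like this will spin up five test documents similar to mine (folder and file names assume the layout shown later):

# Creates doc1.txt through doc5.txt in the WorkDocuments folder
1..5 | ForEach-Object { "This is test document $_" | Out-File -FilePath ".\WorkDocuments\doc$_.txt" -Encoding utf8 }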


To allow the user to build the baseline for the requested target directory, we coded the following:

  1. A User Interface: Present options to the user and capture their choice.

  2. Erase Existing Baseline: Remove the old baseline if the user opts to create a new one.

  3. Scan the Directory: Use Get-ChildItem to recursively list all files.

  4. Calculate Hashes: Employ Get-FileHash with SHA256 to generate hash values which will be the baseline for these files.

  5. Store Data: Save the file paths and hashes into a text file for easy retrieval.


Below is evidence of this in action, followed by the portion of the script responsible for it.


The Code

Here's the code for creating the baseline, including the user prompts. For this to work properly, make sure you have the folder structure below and are running the script from within the FIM folder in PowerShell.
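For reference, the layout I'm assuming looks like this (the script file name is just a placeholder):

FIM\
├── fim.ps1            (the FIM script itself - name is hypothetical)
├── Baseline\          (baseline.txt gets written here)
└── WorkDocuments\     (doc1.txt through doc5.txt - the files we monitor)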

 
#Writes user instructions to the screen. 
#Gives the user the option to Create a new baseline for a file or Monitor files against a known baseline
Write-Output "File Integrity Monitor Application `n"
Write-Output "Choose from the options below by entering the number matching your choice:"
Write-Output "`t1.Create a new baseline"
Write-Output "`t2.Monitor files against saved baseline"
Write-Output "`tEnter 0 to quit."
$action = Read-Host "Option" #Read-Host appends ": " to the prompt automatically
$baselineFile = '.\Baseline\baseline.txt'


#Functions used in the code are below. 

<#In PowerShell, the output from cmdlets or expressions within a function is automatically collected and returned.#>

#Function that deletes the old baseline if the user decides to collect a new one. Essentially overwrites it.
function Erase-Baseline-If-It-Exists{
    $baselineExists = Test-Path $baselineFile
    if ($baselineExists){ Remove-Item $baselineFile }
}

#function to get files in directory
function Get-Files-In-Directory{
    param($directory)
    Get-ChildItem $directory -Recurse -File #File switch parameter ensures only files are retrieved and not directories
}

#Function to calculate File hash
function Calculate-File-Hash{
    param($f)
    Get-FileHash -Path $f -Algorithm SHA256
}

#If user chooses option 1, the following code creates a new baseline
if ($action -eq 1){

    #Confirms the user understands that creating a new baseline overwrites the old one, then erases it
    $proceed = Read-Host "Creating a new baseline will overwrite any existing one. Proceed? (Y/N)"
    if ($proceed.ToUpper() -ne 'Y'){ exit }
    Erase-Baseline-If-It-Exists
    
    #Prompts user for target directory
    $target_dir = Read-Host "Please enter the complete path of the directory"

    #Checks that the path is valid using a Ternary/Conditional operator (PowerShell 7+)
    $message = (Test-Path $target_dir) ? "Path found" : "Path not found"
    Write-Output $message
    if (-not (Test-Path $target_dir)){ exit }
    
    #Retrieves all files in the directory and subdirectory as needed
    $retrievedFiles = Get-Files-In-Directory $target_dir
    
    #Loops through all the retrieved files to calculate the SHA256 hash, and writes each to the baseline.txt file
    foreach ($file in $retrievedFiles){
        $hash = Calculate-File-Hash $file
        "$($file.FullName)|$($hash.Hash)" | Out-File -FilePath $baselineFile -Append -Encoding utf8
    }
    Write-Host "Baseline created and saved to $baselineFile"
}
 

Personal Considerations

Including user prompts enhances the script's usability, making it more interactive and accessible, and using functions improves the program's readability and keeps the code modular. The prompts guide the user through:


  • Selecting an Action: Creating a new baseline or monitoring files.

  • Confirming Overwrites: Ensuring the user is aware that creating a new baseline will overwrite the existing one.

  • Providing Directory Paths: Allowing the user to specify the target directory to monitor.


These interactions make the tool more user-friendly, especially for those who may not be comfortable editing scripts or working directly with code.


The Monitoring Mechanism


With the baseline in place, the next challenge was setting up the monitoring system.


The Approach


The script needed to:

  • Run Continuously: Keep checking for changes at regular intervals.

  • Provide Clear Alerts: Notify only on actual changes.

  • Interact with the User: Allow the user to start monitoring after choosing the appropriate option.


I anticipated the first iteration of this section of the code would be noisy, but I just needed a working proof of concept (PoC) for starters. This is what it looked like in action.
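For context, here's roughly what that first-pass loop looked like: the same comparison logic as the final version further down, just without any alert suppression, and it assumes the baseline has already been loaded into $filehashDictionary:

# First-pass PoC: alerts fire on every 5-second pass for as long as a deviation exists
while ($true){
    Start-Sleep -Seconds 5
    $retrievedFiles = Get-Files-In-Directory .\WorkDocuments
    foreach ($file in $retrievedFiles){
        $hash = Calculate-File-Hash $file
        $filepath = $file.FullName
        if ($null -eq $filehashDictionary[$filepath]){
            Write-Host "ALERT: $($filepath) has been added but is not included in the baseline." -ForegroundColor Green
        }
        elseif ($filehashDictionary[$filepath] -ne $hash.Hash){
            Write-Host "ALERT: $($filepath) has been changed!" -ForegroundColor Red
        }
    }
}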


What happens when we add a file to the directory that we're monitoring? Remember, the baseline file initially only included doc1 through doc5.



Well, we are alerted about the file that was added, along with its path.


What if a file is deleted?




You can see that we are notified. We also tested whether the monitoring part of the script detects modifications to file content, and it alerted on those changes in real time too.




Now, before I share the code, I want to highlight some grievances I had with the initial testing of the monitoring mechanism above, which caused me to make some adjustments and refinements to the script. I will share the hurdles, how I overcame them, and then the final version of the script with the enhancements.


Hurdles and How I Overcame Them


Repetitive Alerts


Problem: The script was initially too "noisy," alerting me about the same changes over and over again. I used an infinite loop for monitoring, and as long as there was a deviation from the baseline, the version of the script that produced the output above would alert on the same deviations repeatedly. How could we get the script to a stage where it knows when it has already alerted on a particular change?...


2 hours later!

Solution: After a bit of research and brainstorming, I decided to implement an $alertedFiles hashtable to keep track of files already alerted on. This dramatically reduced the noise and made the alerts more actionable. In addition, I figured we would also want to be able to remove a suppression if a file was restored to its original state or restored after deletion, so we built that logic into our if statements. See it all in the code below.


The Code


Here's the code for monitoring files, including user prompts and the fixes to the repetitive-alerts problem I had initially. I also leaned on a few outside resources to help figure it out.


 
#Executes when the user chooses option 2 - Monitor Baseline
elseif($action -eq 2)
{
$baselinefile = '.\Baseline\baseline.txt'

#Check that the baseline exists
if (-not (Test-Path $baselineFile)){
    Write-Host "Baseline file not found. Please create a baseline first."
    exit
}

#Initializing our Hash Table or Dictionary
$filehashDictionary = @{}
$alertedFiles = @{}
#Load FilePath and hash into Dictionary
$filepathsAndHashes = Get-Content -Path $baselineFile


#We loop through each line of the file, parsing the file path (to the left of "|") and the hash (to the right of "|").
#Split() allows us to split the line into an array of substrings.
#It places the path at index 0 and the hash at index 1. We add them to the hashtable with the path as the key and the hash as the value.

foreach ($f in $filepathsAndHashes){
    $filehashDictionary.Add($f.split("|")[0], $f.split("|")[1])
}

while($true){
Start-Sleep -Seconds 5
Write-host "Checking integrity of files in baseline. You will on be alerted if any of the files have experienced changes."

#Retrieves files in the Monitored Directory
$retrievedFiles = Get-Files-In-Directory .\WorkDocuments

foreach ($file in $retrievedFiles)
{
    $hash = Calculate-File-Hash $file
    $filepath = $file.FullName
    #Checks if a hash for the retrieved file exists in the filehashDictionary populated from the baseline file.
    #Hashtables return null for keys not found.
    if ($null -eq $filehashDictionary[$filepath]){
        #The nested if ensures it only alerts if the file has not been alerted on already.
        if($null -eq $alertedFiles[$filepath]){
            Write-Host "ALERT: $($filepath) has been added but is not included in the baseline." -ForegroundColor Green
            #Adds the file path as a key to the hash table of files that have been alerted on
            $alertedFiles.Add($filepath,'Added')
        }
    }
    elseif($filehashDictionary[$filepath] -ne $hash.Hash){
        #The nested if ensures it only alerts if the file has not been alerted on already.
        if($null -eq $alertedFiles[$filepath]){
            #Alert: File has been changed!! Notify the user.
            Write-Host "ALERT: $($filepath) has been changed!" -ForegroundColor Red
            #Adds the file path as a key to the hash table of files that have been alerted on
            $alertedFiles.Add($filepath,'Modified')
        }
    }
    else{
        #If the file is not newly added and does not vary from the baseline, it has maintained state.
        #However, this check lets files that were modified but have now returned to baseline be alerted on
        #again in future, by removing them from the alertedFiles hash table.
        if ($alertedFiles.ContainsKey($filepath) -and $alertedFiles[$filepath] -eq 'Modified'){
            $alertedFiles.Remove($filepath)
        }
        #Otherwise the file has not been changed and we do not alert.
        #File Integrity Monitors alert when file state is altered.
    }
}



    #Separate check for deleted files
    foreach ($file in $filehashDictionary.Keys){
        #The below condition executes when the file path is not found.
        if(-not (Test-Path $file)){
            if($null -eq $alertedFiles[$file]){
                Write-Host "ALERT: $file was deleted!" -ForegroundColor Yellow
                #Adds the file path as a key to the hash table of files that have been alerted on
                $alertedFiles.Add($file,'Deleted')
            }
        }
        else{
            #Executes if the file path is found. If the file was previously alerted on as deleted,
            #it was restored and should be removed from the alertedFiles suppressions,
            #so it can be alerted on in the event of future deletions.
            if($alertedFiles.ContainsKey($file) -and $alertedFiles[$file] -eq 'Deleted'){
                $alertedFiles.Remove($file)
            }
        }
    }
}
}


 

Let me show you a final example of how the improved version works now.


A new file is added?

It now triggers an alert once!


Let's change the text in doc4.txt again.

It now triggers the alert once as well!


Let's really push the script to its limits. We change the text back to the original, then change it again, to see if the alert fires a second time. Look at that, it records the change the second time around. This shows that the script removed the file path from the alertedFiles hash table once the file had returned to baseline, which allowed the alert to trigger once again when it was changed.

What happens when we delete a document? We deleted doc5.txt.


The FIM detected it.

We re-added it and deleted it again. It alerted accordingly. All aspects of the code appear to be working.



Ways to Take It to the Next Level


While I'm thrilled with how the script turned out, I can't help but think about future enhancements.


Handling Malformed Baseline Data


I realized that if the baseline file had any malformed lines, it could cause errors during execution.


Solution: I could add validation when parsing each line of the baseline file. If a line doesn't split into exactly two parts, the script logs a warning and skips it.
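A minimal sketch of that validation, reusing the variable names from the monitoring code above:

# Skip any baseline line that doesn't split cleanly into a path and a hash
foreach ($f in $filepathsAndHashes){
    $parts = $f.Split("|")
    if ($parts.Count -ne 2 -or [string]::IsNullOrWhiteSpace($parts[0]) -or [string]::IsNullOrWhiteSpace($parts[1])){
        Write-Warning "Skipping malformed baseline line: $f"
        continue
    }
    $filehashDictionary.Add($parts[0], $parts[1])
}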


Packaging and Deployment

Potentially:

  • Convert the Script to an Executable: Using tools like ps2exe to make deployment easier (see the sketch after this list).

  • Create an Installer: Simplify the installation process across multiple systems.

  • Run as a Service: Configure the script to run as a background service for continuous monitoring.
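To give an idea of the first and third bullets, here's a hedged sketch. It assumes the ps2exe module from the PowerShell Gallery and uses a scheduled task that starts at boot as a stand-in for a true service; the script path and task name are hypothetical:

# Convert the script to an executable with the ps2exe module
Install-Module ps2exe -Scope CurrentUser
Invoke-ps2exe -inputFile .\fim.ps1 -outputFile .\fim.exe

# "Pseudo-service": run the monitor automatically at startup via a scheduled task
$taskAction  = New-ScheduledTaskAction -Execute "pwsh.exe" -Argument "-File C:\FIM\fim.ps1"
$taskTrigger = New-ScheduledTaskTrigger -AtStartup
Register-ScheduledTask -TaskName "FIMMonitor" -Action $taskAction -Trigger $taskTrigger -RunLevel Highest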


Write to a log file:

Rather than printing to the terminal, the script could write to a .txt file, and those logs could then be ingested into a Log Analytics workspace for added detections for Azure virtual machines. That would be a great application of the concepts learned in my Microsoft Sentinel HoneyPot lab.
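A minimal sketch of that logging idea, assuming a hypothetical Logs folder; the Write-Host calls in the monitoring loop would be swapped for this helper:

# Appends timestamped alerts to a log file that a log agent could pick up and forward
function Write-FimLog{
    param($message)
    $line = "{0} {1}" -f (Get-Date -Format "yyyy-MM-dd HH:mm:ss"), $message
    Add-Content -Path .\Logs\fim_alerts.log -Value $line
}

# Example usage: Write-FimLog "ALERT: $filepath has been changed!"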


Do let me know if you have any additional ideas! Thanks again for coding along with JvCyberguard!


