How to Read and Write Files with PowerShell


Working with files represents one of the most fundamental tasks in system administration and automation. Whether you're managing server configurations, processing log files, or automating data workflows, the ability to efficiently read from and write to files determines how effectively you can accomplish your daily tasks. PowerShell provides a comprehensive suite of commands and techniques that transform what could be tedious manual work into streamlined, repeatable processes.

File operations in PowerShell encompass everything from simple text file manipulation to complex structured data handling. This includes reading content line by line, writing formatted output, managing file encodings, handling large datasets efficiently, and working with various file formats including CSV, JSON, and XML. The platform offers multiple approaches for each task, allowing you to choose the method that best fits your specific scenario and performance requirements.

Throughout this guide, you'll discover practical techniques for reading file content using different cmdlets, writing data with proper formatting and encoding, managing file paths and locations, handling errors gracefully, and implementing best practices for performance optimization. You'll learn when to use each method, understand the differences between various approaches, and gain insights into real-world applications that will immediately enhance your scripting capabilities.

Understanding File Operations in PowerShell

PowerShell treats files as objects within the file system, providing rich metadata and properties beyond just the content. When you interact with files, you're working with FileInfo objects that contain information about size, creation date, attributes, and location. This object-oriented approach enables powerful manipulation capabilities that go far beyond traditional command-line file operations.
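
For example, a quick way to see this object nature is to inspect the FileInfo object that Get-Item returns (the path below is illustrative):

# Retrieve a FileInfo object rather than the file's content
$file = Get-Item -Path "C:\logs\application.log"

# Inspect metadata properties exposed by the object
$file.Length          # Size in bytes
$file.CreationTime    # When the file was created
$file.LastWriteTime   # When the file was last modified
$file.Attributes      # Archive, ReadOnly, Hidden, etc.

# List every available property and method
$file | Get-Member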

The cmdlet architecture in PowerShell follows consistent naming conventions that make discovering and using file commands intuitive. Commands like Get-Content, Set-Content, Add-Content, and Out-File each serve specific purposes in the file manipulation ecosystem. Understanding when to use each cmdlet depends on your specific requirements for performance, memory usage, and the type of data you're processing.
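
As a quick illustration of that discoverability, you can list the content-related cmdlets directly:

# Discover the file-content cmdlets by their shared noun
Get-Command -Noun Content

# Review a cmdlet's parameters and examples before using it
Get-Help Get-Content -Examples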

"The fundamental difference between reading files in PowerShell versus traditional methods lies in the ability to treat content as objects immediately available for pipeline processing, rather than simple text streams requiring additional parsing."

Core Cmdlets for File Reading

The primary cmdlet for reading file content is Get-Content, which loads file contents into memory and returns them as an array of strings, with each line becoming a separate array element. This default behavior works exceptionally well for text files and provides immediate access to individual lines for processing. The cmdlet supports numerous parameters that control how content is read, including encoding specifications, line counting, and streaming capabilities.

For scenarios requiring raw binary data or when you need to read files as single strings rather than line arrays, PowerShell offers alternative approaches. The -Raw parameter transforms Get-Content behavior to return the entire file as a single string, preserving all formatting and line breaks. This proves particularly useful when working with structured data formats or when you need to perform pattern matching across multiple lines.
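
A minimal sketch of that multi-line matching scenario, assuming an illustrative configuration file path and contents:

# Read the whole file as one string, preserving line breaks
$raw = Get-Content -Path "C:\config\app.conf" -Raw

# A multi-line pattern match that would fail against a line-by-line array
if ($raw -match '(?s)<database>.*?</database>') {
    Write-Host "Found a database section spanning multiple lines"
}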

Essential File Reading Commands

  • Get-Content -Path "file.txt" - Reads file content line by line into an array
  • Get-Content -Path "file.txt" -Raw - Reads entire file as a single string
  • Get-Content -Path "file.txt" -TotalCount 10 - Reads only the first 10 lines
  • Get-Content -Path "file.txt" -Tail 5 - Reads only the last 5 lines
  • Get-Content -Path "file.txt" -Encoding UTF8 - Reads file with specific encoding
  • [System.IO.File]::ReadAllText("file.txt") - .NET method for reading entire file
  • [System.IO.File]::ReadAllLines("file.txt") - .NET method for reading lines as array
  • Get-Content -Path "file.txt" -Wait - Continuously monitors file for new content

Reading Files: Techniques and Best Practices

When reading small to medium-sized text files, Get-Content provides the most straightforward approach. The cmdlet automatically handles file opening, reading, and closing, while managing encoding detection in most cases. For files under several megabytes, the performance impact of loading entire content into memory remains negligible, making this the preferred method for routine scripting tasks.

$fileContent = Get-Content -Path "C:\logs\application.log"
foreach ($line in $fileContent) {
    if ($line -match "ERROR") {
        Write-Host "Found error: $line" -ForegroundColor Red
    }
}

This basic pattern demonstrates reading a file and processing each line individually. The variable $fileContent becomes an array where each element represents one line from the file. You can access specific lines using array notation, such as $fileContent[0] for the first line or $fileContent[-1] for the last line.
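
Beyond single elements, standard array slicing works as well; a brief sketch:

$fileContent = Get-Content -Path "C:\logs\application.log"

$fileContent[0]        # First line
$fileContent[-1]       # Last line
$fileContent[0..9]     # First ten lines
$fileContent.Count     # Total number of lines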

Handling Large Files Efficiently

When working with large files that could consume significant memory, streaming becomes essential. PowerShell supports streaming through the -ReadCount parameter, which processes files in batches rather than loading everything at once. This approach maintains low memory usage while still providing line-by-line processing capabilities.

Get-Content -Path "C:\logs\huge-file.log" -ReadCount 1000 | ForEach-Object {
    $batch = $_
    foreach ($line in $batch) {
        # Process each line in the batch (-match tests the line's text; -contains only tests collection membership)
        if ($line -match "CRITICAL") {
            Add-Content -Path "C:\logs\critical-errors.log" -Value $line
        }
    }
}

The -ReadCount parameter specifies how many lines to read before passing them down the pipeline. Setting this to 1000 means PowerShell reads 1000 lines, processes them, releases that memory, then reads the next 1000 lines. This technique enables processing files of virtually any size without memory constraints.

"Memory management becomes critical when processing large datasets. Using streaming techniques with appropriate batch sizes can mean the difference between a script that completes successfully and one that crashes with out-of-memory errors."

| Method | Best Use Case | Memory Impact | Performance |
|---|---|---|---|
| Get-Content | Small to medium files (under 100MB) | High - loads entire file | Fast for small files |
| Get-Content -ReadCount | Large files requiring line processing | Low - streams in batches | Moderate, consistent |
| [System.IO.File]::ReadLines() | Large files, .NET integration | Very low - true streaming | Fast, memory efficient |
| Get-Content -Raw | Need entire file as single string | High - loads entire file | Very fast for string operations |
| StreamReader | Maximum control, custom parsing | Very low - manual control | Fastest with proper implementation |
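
The [System.IO.File]::ReadLines() method from the table streams lines lazily, so memory stays flat even for multi-gigabyte files. A minimal sketch, assuming an illustrative log path:

# ReadLines() returns an enumerator; lines are read on demand, not all at once
foreach ($line in [System.IO.File]::ReadLines("C:\logs\huge-file.log")) {
    if ($line -match "ERROR") {
        Write-Host $line
    }
}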

Working with File Encodings

Character encoding is a critical consideration when reading files, particularly when dealing with international characters or files created on different operating systems. PowerShell 7 defaults to UTF-8 (without a BOM), while Windows PowerShell 5.1 uses a mix of defaults (for example, ANSI for Set-Content and UTF-16 LE for Out-File), and legacy files may use ASCII, Unicode/UTF-16, or other encodings. Specifying the wrong encoding can result in corrupted characters or complete reading failures.

# Reading with specific encoding
$content = Get-Content -Path "C:\data\international.txt" -Encoding UTF8

# Detecting encoding (requires manual inspection or external tools)
# Note: -Encoding Byte is Windows PowerShell 5.1 syntax; in PowerShell 7+ use -AsByteStream instead
$bytes = Get-Content -Path "C:\data\unknown.txt" -Encoding Byte -TotalCount 4
# BOM analysis: EF BB BF = UTF-8, FF FE = UTF-16 LE, FE FF = UTF-16 BE

# Reading as raw bytes when encoding is problematic
$rawBytes = [System.IO.File]::ReadAllBytes("C:\data\binary-file.dat")

The -Encoding parameter accepts various values including ASCII, UTF8, UTF32, Unicode (UTF-16), BigEndianUnicode, and Default (system default encoding). When working with files from unknown sources, examining the first few bytes (Byte Order Mark or BOM) can help identify the correct encoding.
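
If you want to automate that BOM check, a small helper along these lines works; the function name is my own, and it reads only the first few bytes via a FileStream:

# A hypothetical helper that guesses encoding from the first bytes of a file
function Get-FileBomEncoding {
    param([string]$Path)

    # Read just the first three bytes rather than the whole file
    $fs = [System.IO.File]::OpenRead($Path)
    try {
        $bom = New-Object byte[] 3
        $count = $fs.Read($bom, 0, 3)
    } finally {
        $fs.Close()
    }

    if ($count -ge 3 -and $bom[0] -eq 0xEF -and $bom[1] -eq 0xBB -and $bom[2] -eq 0xBF) { return 'UTF-8 with BOM' }
    if ($count -ge 2 -and $bom[0] -eq 0xFF -and $bom[1] -eq 0xFE) { return 'UTF-16 LE' }
    if ($count -ge 2 -and $bom[0] -eq 0xFE -and $bom[1] -eq 0xFF) { return 'UTF-16 BE' }
    return 'No BOM detected (likely UTF-8 without BOM or ANSI)'
}

Get-FileBomEncoding -Path "C:\data\unknown.txt"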

Writing Files: Methods and Considerations

Writing data to files in PowerShell offers several approaches, each optimized for different scenarios. The most common cmdlets include Set-Content, which replaces file content entirely, Add-Content, which appends to existing files, and Out-File, which provides additional formatting options. Understanding the distinctions between these commands ensures you choose the right tool for your specific requirements.

Essential File Writing Commands

  • Set-Content -Path "file.txt" -Value "content" - Creates or overwrites file with content
  • Add-Content -Path "file.txt" -Value "content" - Appends content to existing file
  • Out-File -FilePath "file.txt" -InputObject $data - Writes formatted output to file
  • "content" | Set-Content -Path "file.txt" - Pipeline writing to file
  • [System.IO.File]::WriteAllText("file.txt", $content) - .NET method for writing string
  • [System.IO.File]::WriteAllLines("file.txt", $array) - .NET method for writing array
  • Set-Content -Path "file.txt" -Value $data -Encoding UTF8 - Write with specific encoding
  • $data | Export-Csv -Path "file.csv" -NoTypeInformation - Write structured data to CSV

Creating and Overwriting Files

The Set-Content cmdlet provides the standard method for creating new files or completely replacing existing file content. This cmdlet accepts single strings, arrays of strings, or pipeline input, automatically converting objects to strings before writing. When you pass an array to Set-Content, each array element becomes a separate line in the output file.

# Writing simple text
Set-Content -Path "C:\output\report.txt" -Value "Report generated on $(Get-Date)"

# Writing multiple lines from an array
$lines = @(
    "System Report",
    "=" * 50,
    "Generated: $(Get-Date)",
    "Computer: $env:COMPUTERNAME"
)
Set-Content -Path "C:\output\report.txt" -Value $lines

# Writing pipeline output
Get-Process | Select-Object Name, CPU, WorkingSet | Out-File "C:\output\processes.txt"

When using Set-Content, PowerShell creates the file if it doesn't exist and overwrites it completely if it does. This behavior makes it ideal for generating reports, creating configuration files, or any scenario where you want to ensure the file contains only your new content without any remnants of previous data.

"The choice between Set-Content and Out-File often comes down to formatting needs. Set-Content provides cleaner output for raw data, while Out-File preserves the formatted table appearance you see in the console."

Appending Content to Existing Files

For logging scenarios or when you need to add information to existing files without destroying current content, Add-Content serves as the appropriate cmdlet. This command opens the file in append mode, adds your new content to the end, and closes the file, preserving all existing data. The operation proves particularly valuable for continuous logging or accumulating results from multiple operations.

# Simple append operation
Add-Content -Path "C:\logs\activity.log" -Value "$(Get-Date) - Script started"

# Appending multiple lines
$logEntries = @(
    "$(Get-Date) - Processing file batch",
    "$(Get-Date) - Batch completed successfully"
)
Add-Content -Path "C:\logs\activity.log" -Value $logEntries

# Conditional logging
try {
    # Some operation
    $result = Get-Service -Name "NonExistentService"
} catch {
    Add-Content -Path "C:\logs\errors.log" -Value "$(Get-Date) - ERROR: $($_.Exception.Message)"
}

The append operation includes minimal overhead since PowerShell only needs to open the file, seek to the end, write new content, and close it. This efficiency makes Add-Content suitable even for high-frequency logging scenarios, though for extremely high-volume logging, consider batching writes to reduce file system operations.
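
A minimal sketch of that batching approach, collecting entries in memory and flushing them with a single Add-Content call (the path and loop are illustrative):

# Collect log entries in memory instead of appending one line at a time
$buffer = [System.Collections.Generic.List[string]]::new()

foreach ($item in 1..1000) {
    $buffer.Add("$(Get-Date -Format o) - Processed item $item")
}

# One append operation instead of a thousand
Add-Content -Path "C:\logs\activity.log" -Value $buffer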

Advanced File Writing Techniques

Beyond basic content writing, PowerShell supports sophisticated file operations including formatted output, encoding control, and structured data export. The Out-File cmdlet provides parameters for width control, encoding specification, and append mode, making it versatile for various output requirements. Understanding these advanced options enables you to create precisely formatted files that meet specific application or integration requirements.

# Formatted output with width control
Get-Process | Format-Table Name, CPU, WorkingSet -AutoSize | 
    Out-File "C:\reports\processes.txt" -Width 200

# Writing with specific encoding
$content = "Special characters: é, ñ, ü, 中文"
Set-Content -Path "C:\output\international.txt" -Value $content -Encoding UTF8

# Appending with Out-File
Get-Date | Out-File "C:\logs\timestamps.txt" -Append

# Writing without newline
$data = "Inline content"
[System.IO.File]::WriteAllText("C:\output\no-newline.txt", $data)

Working with Structured Data Formats

Modern PowerShell scripting frequently involves working with structured data formats like CSV, JSON, and XML. PowerShell provides specialized cmdlets for these formats that handle serialization and deserialization automatically, eliminating the need for manual formatting. These cmdlets ensure proper structure and encoding while providing options for customizing output.

# Exporting to CSV
Get-Process | Select-Object Name, CPU, WorkingSet | 
    Export-Csv -Path "C:\data\processes.csv" -NoTypeInformation

# Exporting to JSON
$data = @{
    ComputerName = $env:COMPUTERNAME
    Timestamp = Get-Date
    Services = Get-Service | Select-Object Name, Status
}
$data | ConvertTo-Json -Depth 3 | Set-Content "C:\data\system-info.json"

# Exporting to XML
Get-Service | Export-Clixml -Path "C:\data\services.xml"

# Custom CSV with delimiter
$data | Export-Csv -Path "C:\data\data.csv" -Delimiter ";" -Encoding UTF8

The Export-Csv cmdlet automatically handles header creation, quoting of fields containing delimiters, and proper escaping of special characters. The -NoTypeInformation parameter removes the type information line that Windows PowerShell adds by default (PowerShell 6 and later omit it automatically), creating cleaner CSV files compatible with external applications. For JSON export, ConvertTo-Json serializes objects with the -Depth parameter controlling how many levels of nested objects to include.

| Format | Export Cmdlet | Import Cmdlet | Best For |
|---|---|---|---|
| CSV | Export-Csv | Import-Csv | Tabular data, Excel compatibility |
| JSON | ConvertTo-Json | ConvertFrom-Json | Web APIs, hierarchical data |
| XML | Export-Clixml | Import-Clixml | Complex objects, PowerShell serialization |
| Plain Text | Set-Content | Get-Content | Logs, simple data, human-readable |
| HTML | ConvertTo-Html | N/A | Reports, web display |
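
Reading structured data back in is just as direct; a brief sketch using the import cmdlets from the table (the file paths match the export examples above):

# Import CSV rows as objects with properties named after the headers
$processes = Import-Csv -Path "C:\data\processes.csv"
$processes | Where-Object { ($_.CPU -as [double]) -gt 10 }

# Parse JSON back into objects
$systemInfo = Get-Content -Path "C:\data\system-info.json" -Raw | ConvertFrom-Json
$systemInfo.ComputerName

# Rehydrate objects serialized with Export-Clixml
$services = Import-Clixml -Path "C:\data\services.xml"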

Using .NET Methods for Enhanced Performance

While PowerShell cmdlets provide convenience and consistency, direct .NET Framework methods often deliver superior performance for file operations, particularly when dealing with large files or high-frequency operations. The System.IO.File and System.IO.StreamWriter classes offer low-level control over file operations with minimal overhead.

# Fast write using .NET
[System.IO.File]::WriteAllText("C:\output\fast-write.txt", $content)

# Writing lines efficiently
$lines = @("Line 1", "Line 2", "Line 3")
[System.IO.File]::WriteAllLines("C:\output\lines.txt", $lines)

# Using StreamWriter for multiple writes
$stream = [System.IO.StreamWriter]::new("C:\output\stream.txt")
try {
    foreach ($item in $largeDataset) {
        $stream.WriteLine($item)
    }
} finally {
    $stream.Close()
}

# Appending with StreamWriter
$stream = [System.IO.StreamWriter]::new("C:\logs\append.log", $true)
try {
    $stream.WriteLine("$(Get-Date) - Log entry")
} finally {
    $stream.Close()
}

The StreamWriter class provides particularly significant performance benefits when writing many lines to a file. Rather than opening and closing the file for each write operation, StreamWriter maintains an open file handle and buffer, dramatically reducing file system overhead. Always wrap StreamWriter usage in try-finally blocks to ensure proper file closure even if errors occur.

"Performance optimization in file operations often requires balancing readability and maintainability against raw speed. For most administrative scripts, cmdlet convenience outweighs minor performance differences, but high-volume data processing benefits significantly from .NET methods."

File Path Management and Best Practices

Proper file path handling prevents common errors and ensures scripts work reliably across different environments. PowerShell provides several approaches for constructing and validating file paths, including the Join-Path cmdlet for combining path segments, Test-Path for verifying existence, and automatic variables like $PSScriptRoot for script-relative paths.

# Constructing paths safely
$outputPath = Join-Path -Path "C:\Reports" -ChildPath "$(Get-Date -Format 'yyyy-MM-dd').txt"

# Script-relative paths
$configPath = Join-Path -Path $PSScriptRoot -ChildPath "config.json"

# Verifying path existence
if (Test-Path -Path $outputPath) {
    Write-Host "File already exists"
} else {
    New-Item -Path $outputPath -ItemType File -Force
}

# Creating directories if needed
$directory = Split-Path -Path $outputPath -Parent
if (-not (Test-Path -Path $directory)) {
    New-Item -Path $directory -ItemType Directory -Force
}

# Getting absolute paths
$absolutePath = Resolve-Path -Path ".\relative\path\file.txt"

Error Handling in File Operations

Robust file operations require comprehensive error handling to manage common issues like missing files, access permissions, locked files, and disk space limitations. PowerShell's try-catch error handling combined with -ErrorAction parameters provides flexible approaches for managing these scenarios gracefully.

# Basic error handling
try {
    $content = Get-Content -Path "C:\data\important.txt" -ErrorAction Stop
    # Process content
} catch [System.Management.Automation.ItemNotFoundException] {
    # Get-Content reports a missing path as ItemNotFoundException rather than System.IO.FileNotFoundException
    Write-Error "File not found: $($_.Exception.Message)"
} catch [System.UnauthorizedAccessException] {
    Write-Error "Access denied: $($_.Exception.Message)"
} catch {
    Write-Error "Unexpected error: $($_.Exception.Message)"
}

# Testing before operations
if (Test-Path -Path $inputFile) {
    $data = Get-Content -Path $inputFile
} else {
    Write-Warning "Input file not found, using defaults"
    $data = @("Default", "Values")
}

# Handling locked files
$retryCount = 0
$maxRetries = 3
$success = $false

while (-not $success -and $retryCount -lt $maxRetries) {
    try {
        Add-Content -Path $logFile -Value $logEntry -ErrorAction Stop
        $success = $true
    } catch {
        $retryCount++
        Write-Warning "File locked, retry $retryCount of $maxRetries"
        Start-Sleep -Seconds 1
    }
}
"Effective error handling in file operations transforms brittle scripts that fail unexpectedly into robust automation that gracefully handles real-world conditions like network interruptions, permission changes, and resource contention."

Performance Optimization Strategies

Optimizing file operations becomes crucial when processing large volumes of data or when scripts run frequently. Several strategies can significantly improve performance, including batching operations, using appropriate cmdlets for the task, minimizing file open/close cycles, and leveraging pipeline streaming for memory efficiency.

⚡ Key Performance Optimization Techniques

  • ✅ Use -ReadCount parameter for large file processing to control memory usage
  • 📊 Batch multiple writes together rather than writing line by line
  • 🔄 Leverage pipeline streaming to process data without loading everything into memory
  • ⚙️ Choose .NET methods over cmdlets for high-frequency operations
  • 💾 Consider using StreamReader and StreamWriter for maximum control

# Inefficient: Multiple file opens
foreach ($item in $data) {
    Add-Content -Path $output -Value $item  # Opens and closes file each time
}

# Efficient: Single write operation
$data | Add-Content -Path $output  # Opens file once, writes all data

# Inefficient: Loading entire large file
$content = Get-Content -Path $largeFile
$filtered = $content | Where-Object { $_ -match "ERROR" }

# Efficient: Streaming with filtering
# (-match against each 1000-line batch returns only the matching lines)
Get-Content -Path $largeFile -ReadCount 1000 | ForEach-Object { $_ -match "ERROR" } | 
    Add-Content -Path $errorLog

# Maximum efficiency for large datasets
$reader = [System.IO.StreamReader]::new($inputFile)
$writer = [System.IO.StreamWriter]::new($outputFile)
try {
    while ($null -ne ($line = $reader.ReadLine())) {
        if ($line -match "IMPORTANT") {
            $writer.WriteLine($line)
        }
    }
} finally {
    $reader.Close()
    $writer.Close()
}

Real-World Practical Applications

Understanding file operations in isolation provides foundational knowledge, but seeing how these techniques combine in real-world scenarios demonstrates their practical value. Common applications include log file analysis, configuration file management, data transformation pipelines, report generation, and automated backup verification.

Log File Analysis and Monitoring

System administrators frequently need to parse log files to identify errors, track events, or generate reports. PowerShell's file reading capabilities combined with pattern matching and filtering create powerful log analysis tools. The following example demonstrates a comprehensive log analysis script that processes large log files efficiently.

# Log analysis script
$logFile = "C:\Logs\application.log"
$reportFile = "C:\Reports\log-analysis-$(Get-Date -Format 'yyyy-MM-dd').txt"

# Initialize report
$report = @()
$report += "Log Analysis Report"
$report += "=" * 80
$report += "Generated: $(Get-Date)"
$report += "Source: $logFile"
$report += ""

# Counters
$errorCount = 0
$warningCount = 0
$criticalErrors = @()

# Process log file
Get-Content -Path $logFile -ReadCount 5000 | ForEach-Object {
    foreach ($line in $_) {
        switch -Regex ($line) {
            "ERROR" { 
                $errorCount++
                if ($line -match "CRITICAL") {
                    $criticalErrors += $line
                }
            }
            "WARNING" { 
                $warningCount++ 
            }
        }
    }
}

# Generate report
$report += "Statistics:"
$report += "Total Errors: $errorCount"
$report += "Total Warnings: $warningCount"
$report += ""

if ($criticalErrors.Count -gt 0) {
    $report += "Critical Errors Found:"
    $report += $criticalErrors
}

# Write report
Set-Content -Path $reportFile -Value $report
Write-Host "Report generated: $reportFile"

Configuration File Management

Managing configuration files represents another common use case where PowerShell file operations shine. Whether you're updating application settings, modifying system configurations, or maintaining infrastructure-as-code definitions, the ability to read, modify, and write configuration files programmatically proves invaluable.

# Reading and modifying JSON configuration
$configFile = "C:\Apps\MyApp\config.json"
$config = Get-Content -Path $configFile -Raw | ConvertFrom-Json

# Modify configuration
$config.Database.ConnectionString = "Server=newserver;Database=mydb"
$config.Logging.Level = "Debug"
$config.Features.NewFeature = $true

# Backup original configuration before overwriting it
Copy-Item -Path $configFile -Destination "$configFile.backup-$(Get-Date -Format 'yyyyMMdd')"

# Write updated configuration
$config | ConvertTo-Json -Depth 10 | Set-Content -Path $configFile

# Working with INI-style configuration files
$iniFile = "C:\Config\settings.ini"
$iniContent = Get-Content -Path $iniFile

# Modify specific setting
$updatedContent = $iniContent | ForEach-Object {
    if ($_ -match "^MaxConnections=") {
        "MaxConnections=100"
    } else {
        $_
    }
}

Set-Content -Path $iniFile -Value $updatedContent

Data Transformation Pipelines

Data transformation scenarios often require reading data from one format, processing or transforming it, and writing it to another format. PowerShell excels at these pipeline operations, allowing you to chain together read, transform, and write operations efficiently.

# CSV to JSON transformation
$csvData = Import-Csv -Path "C:\Data\employees.csv"

$jsonData = $csvData | ForEach-Object {
    [PSCustomObject]@{
        FullName = "$($_.FirstName) $($_.LastName)"
        Email = $_.Email.ToLower()
        Department = $_.Department
        HireDate = [DateTime]::Parse($_.HireDate).ToString("yyyy-MM-dd")
    }
}

$jsonData | ConvertTo-Json | Set-Content -Path "C:\Data\employees.json"

# Text file to structured data
$logEntries = Get-Content -Path "C:\Logs\access.log" | ForEach-Object {
    if ($_ -match '^(\S+) - - \[(.*?)\] "(\w+) (.*?) HTTP') {
        [PSCustomObject]@{
            IPAddress = $Matches[1]
            DateTime = $Matches[2]
            Method = $Matches[3]
            Resource = $Matches[4]
        }
    }
}

$logEntries | Export-Csv -Path "C:\Reports\access-analysis.csv" -NoTypeInformation

Automated Report Generation

Generating reports from system data represents a frequent automation task where file writing capabilities prove essential. The following example demonstrates creating a comprehensive HTML report with formatted data from multiple sources.

# System report generation
$reportPath = "C:\Reports\system-report-$(Get-Date -Format 'yyyy-MM-dd').html"

# Gather system information
$computerInfo = Get-ComputerInfo | Select-Object CsName, OsName, OsVersion, OsArchitecture
$diskInfo = Get-PSDrive -PSProvider FileSystem | Select-Object Name, Used, Free
$processInfo = Get-Process | Sort-Object CPU -Descending | Select-Object -First 10 Name, CPU, WorkingSet

# Create HTML report
$html = @"



    System Report
    
        body { font-family: Arial, sans-serif; margin: 20px; }
        h1 { color: #333; }
        table { border-collapse: collapse; width: 100%; margin: 20px 0; }
        th, td { border: 1px solid #ddd; padding: 8px; text-align: left; }
        th { background-color: #4CAF50; color: white; }
    


    System Report - $(Get-Date -Format 'yyyy-MM-dd HH:mm')
    
    Computer Information
    $($computerInfo | ConvertTo-Html -Fragment)
    
    Disk Usage
    $($diskInfo | ConvertTo-Html -Fragment)
    
    Top Processes by CPU
    $($processInfo | ConvertTo-Html -Fragment)


"@

Set-Content -Path $reportPath -Value $html
Write-Host "Report generated: $reportPath"

Security Considerations and File Permissions

When working with files, security considerations must remain paramount. File operations can expose sensitive data, create security vulnerabilities if not properly validated, or inadvertently modify critical system files. Understanding permission management, input validation, and secure coding practices ensures your scripts operate safely in production environments.

# Check file permissions before operations
$acl = Get-Acl -Path $targetFile
$currentUser = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name

# Verify write access
$hasWriteAccess = $acl.Access | Where-Object {
    $_.IdentityReference -eq $currentUser -and 
    $_.FileSystemRights -match "Write|FullControl"
}

if (-not $hasWriteAccess) {
    Write-Error "Insufficient permissions to write to $targetFile"
    exit
}

# Sanitize file paths to prevent directory traversal
function Test-SafePath {
    param([string]$Path, [string]$BaseDirectory)
    
    $resolvedPath = [System.IO.Path]::GetFullPath($Path)
    $resolvedBase = [System.IO.Path]::GetFullPath($BaseDirectory)
    
    return $resolvedPath.StartsWith($resolvedBase)
}

# Validate before using user input in paths
$userInput = Read-Host "Enter filename"
$basePath = "C:\AllowedDirectory"
$fullPath = Join-Path -Path $basePath -ChildPath $userInput

if (Test-SafePath -Path $fullPath -BaseDirectory $basePath) {
    Set-Content -Path $fullPath -Value $data
} else {
    Write-Error "Invalid path specified"
}

Handling Sensitive Data

When file operations involve sensitive information like passwords, API keys, or personal data, additional security measures become necessary. PowerShell provides mechanisms for encryption, secure string handling, and credential management that should be employed when dealing with sensitive content.

# Encrypting sensitive data before writing
$sensitiveData = "SecretInformation"
$secureString = ConvertTo-SecureString -String $sensitiveData -AsPlainText -Force
$encryptedData = ConvertFrom-SecureString -SecureString $secureString
Set-Content -Path "C:\Secure\encrypted.txt" -Value $encryptedData

# Reading encrypted data
$encryptedContent = Get-Content -Path "C:\Secure\encrypted.txt"
$secureString = ConvertTo-SecureString -String $encryptedContent
$credential = New-Object System.Management.Automation.PSCredential("user", $secureString)
$decryptedData = $credential.GetNetworkCredential().Password

# Using the Data Protection API for encryption (may require: Add-Type -AssemblyName System.Security)
$dataToEncrypt = "Sensitive information"
$bytes = [System.Text.Encoding]::UTF8.GetBytes($dataToEncrypt)
$encryptedBytes = [System.Security.Cryptography.ProtectedData]::Protect(
    $bytes, 
    $null, 
    [System.Security.Cryptography.DataProtectionScope]::CurrentUser
)
[System.IO.File]::WriteAllBytes("C:\Secure\protected.dat", $encryptedBytes)

Frequently Asked Questions

What is the fastest way to read a large file in PowerShell?

For large files, use the .NET [System.IO.File]::ReadLines() method or Get-Content with the -ReadCount parameter. The ReadLines() method provides true streaming with minimal memory overhead, while -ReadCount processes files in batches. For maximum performance with custom processing, use StreamReader in a while loop, reading line by line.

How do I append text to a file without overwriting existing content?

Use the Add-Content cmdlet which automatically appends to the end of the file, or use Out-File with the -Append parameter. Both methods preserve existing content and add new lines at the end. For .NET approaches, create a StreamWriter with the append parameter set to $true.

What encoding should I use when writing files?

UTF-8 is recommended for most modern applications as it supports international characters and is widely compatible. Specify encoding explicitly using the -Encoding UTF8 parameter with file cmdlets. For legacy systems or specific application requirements, you may need ASCII, Unicode (UTF-16), or other encodings. Always verify encoding requirements with the consuming application.

How can I process a file that is locked by another application?

Implement retry logic with delays between attempts, as shown in the error handling examples. Use try-catch blocks to capture file locking exceptions and retry after a brief pause. For reading locked files, consider using the .NET FileStream class with FileShare.ReadWrite to allow reading while another process has the file open. However, be cautious as this can lead to reading incomplete or inconsistent data.
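
A minimal sketch of that FileStream approach, assuming an illustrative path to a log file held open by another process:

# Open the file for reading while allowing other processes to keep it open for writing
$stream = [System.IO.FileStream]::new(
    "C:\logs\locked.log",
    [System.IO.FileMode]::Open,
    [System.IO.FileAccess]::Read,
    [System.IO.FileShare]::ReadWrite
)
$reader = [System.IO.StreamReader]::new($stream)
try {
    while ($null -ne ($line = $reader.ReadLine())) {
        $line
    }
} finally {
    $reader.Close()   # Closing the reader also closes the underlying stream
}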

What is the difference between Set-Content and Out-File?

Set-Content writes raw data to files and is optimized for string and byte array content, while Out-File writes formatted output as it appears in the console, including table formatting and spacing. Set-Content is generally preferred for data files, while Out-File works well for reports and logs where visual formatting matters. Out-File also provides the -Width parameter to control line wrapping.

How do I read only specific lines from a large file?

Use the -TotalCount parameter to read lines from the beginning, or -Tail to read lines from the end. For reading specific line ranges, combine Get-Content with array indexing or use Select-Object with -Skip and -First parameters. For maximum efficiency with very large files, use StreamReader with a counter to skip to the desired position.
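
A brief sketch of those options (the line numbers and path are illustrative):

# First and last lines
Get-Content -Path "C:\logs\app.log" -TotalCount 10   # First 10 lines
Get-Content -Path "C:\logs\app.log" -Tail 5          # Last 5 lines

# Lines 100-199 via array indexing (loads the whole file into memory)
(Get-Content -Path "C:\logs\app.log")[99..198]

# Lines 100-199 via the pipeline; -First stops reading once the range is satisfied
Get-Content -Path "C:\logs\app.log" | Select-Object -Skip 99 -First 100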