This article describes how to split a text file using PowerShell. Hopefully it serves as a useful reference for anyone who needs to solve the same problem.

Problem Description

I need to split a large (500 MB) text file (a log4net exception file) into manageable chunks; 100 5 MB files would be fine.

I would think this should be a walk in the park for PowerShell. How can I do it?

Recommended Answer

This is a fairly easy task for PowerShell, complicated only by the fact that the standard Get-Content cmdlet doesn't handle very large files well. What I would suggest is to use the .NET StreamReader class to read the file line by line in your PowerShell script, and the Add-Content cmdlet to write each line to a file with an ever-increasing index in the filename. Something like this:

$upperBound = 50MB # calculated by PowerShell
$ext = "log"
$rootName = "log_"

$reader = New-Object System.IO.StreamReader("C:\Exceptions.log")
$count = 1
$fileName = "{0}{1}.{2}" -f ($rootName, $count, $ext)
while (($line = $reader.ReadLine()) -ne $null)
{
    # Append the current line to the current chunk file.
    Add-Content -Path $fileName -Value $line

    # Once the chunk reaches the size limit, move on to the next indexed file.
    if ((Get-ChildItem -Path $fileName).Length -ge $upperBound)
    {
        ++$count
        $fileName = "{0}{1}.{2}" -f ($rootName, $count, $ext)
    }
}

$reader.Close()
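
If Add-Content feels slow on a 500 MB input (it reopens and appends to the chunk file for every line), the same idea can be sketched with a .NET StreamWriter that keeps the current chunk file open. This is a variant sketch, not the original answer's code; the input path, the output directory (assumed to already exist), and the 5 MB chunk size are assumptions for illustration:

$upperBound = 5MB
$reader = New-Object System.IO.StreamReader("C:\Exceptions.log")
$count  = 1
$writer = New-Object System.IO.StreamWriter(("C:\Logs\log_{0}.log" -f $count))
while (($line = $reader.ReadLine()) -ne $null)
{
    $writer.WriteLine($line)
    $writer.Flush() # flush so the underlying file length reflects what was written

    # Roll over to a new chunk file once the current one reaches the limit.
    if ($writer.BaseStream.Length -ge $upperBound)
    {
        $writer.Close()
        ++$count
        $writer = New-Object System.IO.StreamWriter(("C:\Logs\log_{0}.log" -f $count))
    }
}
$writer.Close()
$reader.Close()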

That concludes this article on splitting a text file with PowerShell. We hope the recommended answer is helpful; thank you for your support.
