Some code I work with occasionally needs to reference long UNC paths (e.g. \\?\UNC\MachineName\Path), but we've noticed that no matter where the directory is located — even on the same machine — access via a UNC path is slower than via a local path.

For example, we wrote some benchmark code that writes a gibberish string to a file and then reads it back several times. I'm testing it with six different ways of addressing the same shared directory on my dev machine, with the code running on that same machine:

  • C:\Temp
  • \\MachineName\Temp
  • \\?\C:\Temp
  • \\?\UNC\MachineName\Temp
  • \\127.0.0.1\Temp
  • \\?\UNC\127.0.0.1\Temp
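For reference, the long-path variants can be built mechanically from the short forms. A small sketch (using `Environment.MachineName` for illustration; the `Temp` share name is taken from the list above) that produces all six forms for one directory:

```csharp
using System;

class PathForms {
    static void Main() {
        const string drive = @"C:\Temp";
        string share = $@"\\{Environment.MachineName}\Temp";

        // \\?\ bypasses Win32 path normalization and lifts the MAX_PATH limit;
        // for UNC paths, the leading "\\" is replaced by "\\?\UNC\".
        string[] forms = {
            drive,                               // local path
            share,                               // UNC path
            @"\\?\" + drive,                     // long local path
            @"\\?\UNC\" + share.Substring(2),    // long UNC path
            @"\\127.0.0.1\Temp",                 // UNC via loopback IP
            @"\\?\UNC\127.0.0.1\Temp"            // long UNC via loopback IP
        };

        foreach (string f in forms)
            Console.WriteLine(f);
    }
}
```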

  The results:
    Testing: C:\Temp
    Wrote 1000 files to C:\Temp in 861.0647 ms
    Read 1000 files from C:\Temp in 60.0744 ms
    Testing: \\MachineName\Temp
    Wrote 1000 files to \\MachineName\Temp in 2270.2051 ms
    Read 1000 files from \\MachineName\Temp in 1655.0815 ms
    Testing: \\?\C:\Temp
    Wrote 1000 files to \\?\C:\Temp in 916.0596 ms
    Read 1000 files from \\?\C:\Temp in 60.0517 ms
    Testing: \\?\UNC\MachineName\Temp
    Wrote 1000 files to \\?\UNC\MachineName\Temp in 2499.3235 ms
    Read 1000 files from \\?\UNC\MachineName\Temp in 1684.2291 ms
    Testing: \\127.0.0.1\Temp
    Wrote 1000 files to \\127.0.0.1\Temp in 2516.2847 ms
    Read 1000 files from \\127.0.0.1\Temp in 1721.1925 ms
    Testing: \\?\UNC\127.0.0.1\Temp
    Wrote 1000 files to \\?\UNC\127.0.0.1\Temp in 2499.3211 ms
    Read 1000 files from \\?\UNC\127.0.0.1\Temp in 1678.18 ms
    

    I used the IP address to try to rule out DNS issues. Could it be checking credentials or permissions on every file access? If so, is there a way to cache that? Does it just assume that, because it's a UNC path, everything should go over TCP/IP instead of accessing the disk directly? Is there something wrong with the code we're using to read/write? I've extracted the relevant benchmarking parts, shown below:
    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Runtime.InteropServices;
    using System.Text;
    using Microsoft.Win32.SafeHandles;
    using Util.FileSystem;
    
    namespace UNCWriteTest {
        internal class Program {
            [DllImport("kernel32.dll", CharSet = CharSet.Auto, SetLastError = true)]
            public static extern bool DeleteFile(string path); // File.Delete doesn't handle \\?\UNC\ paths
    
            private const int N = 1000;
    
            private const string TextToSerialize =
                "asd;lgviajsmfopajwf0923p84jtmpq93worjgfq0394jktp9orgjawefuogahejngfmliqwegfnailsjdhfmasodfhnasjldgifvsdkuhjsmdofasldhjfasolfgiasngouahfmp9284jfqp92384fhjwp90c8jkp04jk34pofj4eo9aWIUEgjaoswdfg8jmp409c8jmwoeifulhnjq34lotgfhnq34g";
    
            private static readonly byte[] _Buffer = Encoding.UTF8.GetBytes(TextToSerialize);
    
            public static string WriteFile(string basedir) {
                string fileName = Path.Combine(basedir, string.Format("{0}.tmp", Guid.NewGuid()));
    
                try {
                    IntPtr writeHandle = NativeFileHandler.CreateFile(
                        fileName,
                        NativeFileHandler.EFileAccess.GenericWrite,
                        NativeFileHandler.EFileShare.None,
                        IntPtr.Zero,
                        NativeFileHandler.ECreationDisposition.New,
                        NativeFileHandler.EFileAttributes.Normal,
                        IntPtr.Zero);
    
                    // the last-error value is only meaningful when CreateFile fails;
                    // check whether the file was locked or already exists
                    int fileError = Marshal.GetLastWin32Error();
                    if ((fileError == 32 /* ERROR_SHARING_VIOLATION */) || (fileError == 80 /* ERROR_FILE_EXISTS */)) {
                        throw new Exception("File was locked or already exists.");
                    }
    
                    using (var h = new SafeFileHandle(writeHandle, true)) {
                        using (var fs = new FileStream(h, FileAccess.Write, NativeFileHandler.DiskPageSize)) {
                            fs.Write(_Buffer, 0, _Buffer.Length);
                        }
                    }
                }
                catch (IOException) {
                    throw;
                }
                catch (Exception ex) {
                    throw new InvalidOperationException(" code " + Marshal.GetLastWin32Error(), ex);
                }
    
                return fileName;
            }
    
            public static void ReadFile(string fileName) {
                var fileHandle =
                    new SafeFileHandle(
                        NativeFileHandler.CreateFile(fileName, NativeFileHandler.EFileAccess.GenericRead, NativeFileHandler.EFileShare.Read, IntPtr.Zero,
                                                     NativeFileHandler.ECreationDisposition.OpenExisting, NativeFileHandler.EFileAttributes.Normal, IntPtr.Zero), true);
    
                using (fileHandle) {
                    //check the handle here to get a bit cleaner exception semantics
                    if (fileHandle.IsInvalid) {
                        //ms-help://MS.MSSDK.1033/MS.WinSDK.1033/debug/base/system_error_codes__0-499_.htm
                        int errorCode = Marshal.GetLastWin32Error();
                        //now that we've taken more than our allotted share of time, throw the exception
                        throw new IOException(string.Format("file read failed on {0} with error code {1}", fileName, errorCode));
                    }
    
                    //we have a valid handle and can actually read a stream, exceptions from serialization bubble out
                    using (var fs = new FileStream(fileHandle, FileAccess.Read, 1*NativeFileHandler.DiskPageSize)) {
                        //if serialization fails, we'll just let the normal serialization exception flow out
                        var foo = new byte[256];
                        fs.Read(foo, 0, 256);
                    }
                }
            }
    
            public static string[] TestWrites(string baseDir) {
                try {
                    var fileNames = new List<string>();
                    DateTime start = DateTime.UtcNow;
                    for (int i = 0; i < N; i++) {
                        fileNames.Add(WriteFile(baseDir));
                    }
                    DateTime end = DateTime.UtcNow;
    
                    Console.Out.WriteLine("Wrote {0} files to {1} in {2} ms", N, baseDir, end.Subtract(start).TotalMilliseconds);
                    return fileNames.ToArray();
                }
                catch (Exception e) {
                    Console.Out.WriteLine("Failed to write for " + baseDir + " Exception: " + e.Message);
                    return new string[] {};
                }
            }
    
            public static void TestReads(string baseDir, string[] fileNames) {
                try {
                    DateTime start = DateTime.UtcNow;
    
                    for (int i = 0; i < N; i++) {
                        ReadFile(fileNames[i%fileNames.Length]);
                    }
                    DateTime end = DateTime.UtcNow;
    
                    Console.Out.WriteLine("Read {0} files from {1} in {2} ms", N, baseDir, end.Subtract(start).TotalMilliseconds);
                }
                catch (Exception e) {
                    Console.Out.WriteLine("Failed to read for " + baseDir + " Exception: " + e.Message);
                }
            }
    
            private static void Main(string[] args) {
                foreach (string baseDir in args) {
                    Console.Out.WriteLine("Testing: {0}", baseDir);
    
                    string[] fileNames = TestWrites(baseDir);
    
                    TestReads(baseDir, fileNames);
    
                    foreach (string fileName in fileNames) {
                        DeleteFile(fileName);
                    }
                }
            }
        }
    }
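
    `NativeFileHandler` comes from `Util.FileSystem` and isn't shown in the excerpt; the names used above suggest it's a thin P/Invoke wrapper over `CreateFileW`. A hypothetical sketch of what it might contain — the enum values mirror the Win32 constants, and `DiskPageSize` is an assumed value:

```csharp
using System;
using System.Runtime.InteropServices;

namespace Util.FileSystem {
    public static class NativeFileHandler {
        public const int DiskPageSize = 4096; // assumed; a common NTFS cluster size

        [Flags]
        public enum EFileAccess : uint {
            GenericRead  = 0x80000000,
            GenericWrite = 0x40000000
        }

        [Flags]
        public enum EFileShare : uint {
            None = 0x0,
            Read = 0x1
        }

        public enum ECreationDisposition : uint {
            New          = 1, // CREATE_NEW
            OpenExisting = 3  // OPEN_EXISTING
        }

        [Flags]
        public enum EFileAttributes : uint {
            Normal = 0x80
        }

        // The Unicode CreateFileW entry point accepts \\?\ and \\?\UNC\ prefixes directly.
        [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
        public static extern IntPtr CreateFile(
            string lpFileName,
            EFileAccess dwDesiredAccess,
            EFileShare dwShareMode,
            IntPtr lpSecurityAttributes,
            ECreationDisposition dwCreationDisposition,
            EFileAttributes dwFlagsAndAttributes,
            IntPtr hTemplateFile);
    }
}
```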
    

    Best answer

    This doesn't surprise me. You're writing and reading fairly small amounts of data, so the file system cache probably minimizes the impact of physical disk I/O; essentially, the bottleneck will be the CPU. I'm not certain the traffic goes through the TCP/IP stack, but at a minimum the SMB protocol is involved. Among other things, that means requests are passed back and forth between the SMB client process and the SMB server process, so you get context switches among three different processes, including your own. With a local file system path you transition into kernel mode and back, but no other process is involved — and a context switch is much slower than a kernel-mode transition.
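
    One quick way to check whether the hand-rolled P/Invoke code contributes to the difference is to repeat the measurement with the plain managed file APIs. A minimal sketch using `Stopwatch` (the directory list is a placeholder — substitute the same six paths from the original test):

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Text;

class ManagedBenchmark {
    static void Main() {
        // roughly the same payload size as the original gibberish string
        byte[] payload = Encoding.UTF8.GetBytes(new string('x', 230));

        foreach (string baseDir in new[] { @"C:\Temp", @"\\127.0.0.1\Temp" }) {
            var sw = Stopwatch.StartNew();
            for (int i = 0; i < 1000; i++) {
                string file = Path.Combine(baseDir, Guid.NewGuid() + ".tmp");
                File.WriteAllBytes(file, payload);
                File.Delete(file);
            }
            sw.Stop();
            Console.WriteLine("{0}: {1} ms", baseDir, sw.ElapsedMilliseconds);
        }
    }
}
```

    If the local-vs-UNC ratio stays roughly the same, the overhead lies below the managed layer (in SMB), not in the benchmark code itself.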

    There are likely two distinct kinds of extra overhead: one per file and one per kilobyte of data. In this particular test, the per-file SMB overhead most likely dominates. Because the amount of data involved also determines how much physical disk I/O matters, you may find this is only a real problem when dealing with many small files.
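
    The two overheads could be separated empirically by holding the total byte count constant while varying the file count. A hedged sketch of the idea (the share path is a placeholder):

```csharp
using System;
using System.Diagnostics;
using System.IO;

class OverheadSplit {
    // Writes totalBytes split evenly across fileCount files and reports the time.
    // If per-file overhead dominates, elapsed time grows with fileCount even
    // though the total amount of data stays the same.
    static void Measure(string baseDir, int fileCount, int totalBytes) {
        byte[] chunk = new byte[totalBytes / fileCount];
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < fileCount; i++) {
            string file = Path.Combine(baseDir, Guid.NewGuid() + ".tmp");
            File.WriteAllBytes(file, chunk);
            File.Delete(file);
        }
        sw.Stop();
        Console.WriteLine("{0} files x {1} bytes: {2} ms",
                          fileCount, chunk.Length, sw.ElapsedMilliseconds);
    }

    static void Main() {
        const string dir = @"\\127.0.0.1\Temp"; // placeholder share
        Measure(dir, 1000, 1_000_000); // many small files
        Measure(dir, 10, 1_000_000);   // few large files, same total data
    }
}
```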

    09-18 21:21