C#: migrating files from MongoDB to Azure Blob Storage with UploadFromStream

I am having trouble migrating files from MongoDB to Azure Blob Storage.


The following method takes a GridFSFile object (representing a file in the MongoDB GridFSFileStorage) and then calls the uploadMemoryStream method to do the upload.


It is worth mentioning that gridFSFile does have content after findById, its Length does too, and the Position is initially 0.


The gridFSFile.Open method creates a Stream object, which I then pass as an argument to the upload.


private static void iterateOverVersionCollection(Version version, Asset asset)
{
    try
    {
        string _gridFSId = version.GridFSId;
        GridFSFile gridFSFile = gridFSFileStorage.FindById(_gridFSId);
        if (gridFSFile == null) return;

        string size = version.Name.ToLower();
        asset.Size = size;
        CloudBlockBlob blockBlob = GetBlockBlobReference(version, gridFSFile, asset);
        uploadMemoryStream(blockBlob, gridFSFile, asset);
        asset.UploadedOK = true;
    }
    catch (StorageException ex)
    {
        asset.UploadedOK = false;
        logException(ex, asset);
    }
}

private static void uploadMemoryStream(CloudBlockBlob blockBlob, GridFSFile gridFSFile, Asset asset)
{
    Stream st = gridFSFile.Open();
    blockBlob.UploadFromStream(st);
}

UploadFromStream takes forever and never finishes uploading. And one thing worth mentioning: no matter how I use gridFSFile, if I try to create a MemoryStream from it with the C# Stream.CopyTo method, that also runs forever and never ends, so the application is stuck at blockBlob.UploadFromStream(st);.


Besides passing gridFSFile.Open to UploadFromStream, I also tried the following piece of code:


using (var stream = new MemoryStream())
{
    byte[] buffer = new byte[2048]; // read in chunks of 2 KB
    int bytesRead;
    while ((bytesRead = st.Read(buffer, 0, buffer.Length)) > 0)
    {
        stream.Write(buffer, 0, bytesRead);
    }
    byte[] result = stream.ToArray();
}

But again, the program gets stuck at the st.Read line.


Any help would be greatly appreciated.
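In case it helps narrow things down, the buffered variant I would expect to work adds explicit position resets, which the snippets above omit; if st.Read hangs even here, the problem is in the GridFS stream itself rather than in the blob client. A sketch, reusing the question's gridFSFile and blockBlob:

```csharp
// Sketch only: buffer the GridFS stream fully, then upload the copy.
// The explicit Position resets are additions; everything else mirrors
// the code above.
using (Stream st = gridFSFile.Open())
using (var ms = new MemoryStream())
{
    if (st.CanSeek) st.Position = 0; // start reading from the beginning
    st.CopyTo(ms);                   // equivalent to the manual Read/Write loop
    ms.Position = 0;                 // rewind before handing the copy to Azure
    blockBlob.UploadFromStream(ms);
}
```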


倚天杖

1 Answer

拉丁的传说

Please note that since UploadFromFileAsync() or UploadFromStream is not a reliable and efficient operation for huge blobs, I would suggest you consider the following alternatives:

If a command-line tool is acceptable to you, you can try AzCopy, which can transfer Azure Storage data at high performance and supports pausing and resuming a transfer.

If you want to control the transfer jobs programmatically, use the Azure Storage Data Movement Library, which is the core of AzCopy. Sample code for the same:

string storageConnectionString = "myStorageConnectionString";
CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference("mycontainer");
blobContainer.CreateIfNotExistsAsync().Wait();
string sourcePath = @"C:\Tom\TestLargeFile.zip";
CloudBlockBlob destBlob = blobContainer.GetBlockBlobReference("LargeFile.zip");
// Set up the number of concurrent operations
TransferManager.Configurations.ParallelOperations = 64;
// Set up the transfer context and track the upload progress
var context = new SingleTransferContext
{
    ProgressHandler = new Progress<TransferStatus>(
        progress => { Console.WriteLine("Bytes uploaded: {0}", progress.BytesTransferred); })
};
// Upload a local blob
TransferManager.UploadAsync(sourcePath, destBlob, null, context, CancellationToken.None).Wait();
Console.WriteLine("Upload finished!");
Console.ReadKey();

If you are still looking to upload the file programmatically from a stream, I would suggest uploading it in chunks using the code below:

var container = _client.GetContainerReference("test");
container.CreateIfNotExists();
var blob = container.GetBlockBlobReference(file.FileName);
var blockDataList = new Dictionary<string, byte[]>();
using (var stream = file.InputStream)
{
    var blockSizeInKB = 1024;
    var offset = 0;
    var index = 0;
    while (offset < stream.Length)
    {
        var readLength = Math.Min(1024 * blockSizeInKB, (int)stream.Length - offset);
        var blockData = new byte[readLength];
        offset += stream.Read(blockData, 0, readLength);
        blockDataList.Add(Convert.ToBase64String(BitConverter.GetBytes(index)), blockData);
        index++;
    }
}
Parallel.ForEach(blockDataList, (bi) =>
{
    blob.PutBlock(bi.Key, new MemoryStream(bi.Value), null);
});
blob.PutBlockList(blockDataList.Select(b => b.Key).ToArray());

On the other hand, if you have the file available on your system and want to use the UploadFromFile method, we also have the flexibility of this method to upload the file data in chunks:

TimeSpan backOffPeriod = TimeSpan.FromSeconds(2);
int retryCount = 1;
BlobRequestOptions bro = new BlobRequestOptions()
{
    SingleBlobUploadThresholdInBytes = 1024 * 1024, // 1 MB, the minimum
    ParallelOperationThreadCount = 1,
    RetryPolicy = new ExponentialRetry(backOffPeriod, retryCount),
};
CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(ConnectionString);
CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
cloudBlobClient.DefaultRequestOptions = bro;
cloudBlobContainer = cloudBlobClient.GetContainerReference(ContainerName);
CloudBlockBlob blob = cloudBlobContainer.GetBlockBlobReference(Path.GetFileName(fileName));
blob.StreamWriteSizeInBytes = 256 * 1024; // 256 KB
blob.UploadFromFile(fileName, FileMode.Open);

For a detailed explanation, please see https://www.red-gate.com/simple-talk/cloud/platform-as-a-service/azure-blob-storage-part-4-uploading-large-blobs/

Hope it helps.
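Bridging this back to the question: the chunked PutBlock approach above can also consume the GridFS stream directly, so the whole file never has to sit in memory at once. A minimal sketch, assuming the question's gridFSFile and blockBlob objects and that the stream's Read behaves once opened:

```csharp
// Sketch: stream from GridFS to a block blob in fixed-size blocks.
// gridFSFile and blockBlob are the objects from the question; the
// GridFS Open() call is an assumption based on the legacy driver.
const int blockSize = 4 * 1024 * 1024; // 4 MB per block
var blockIds = new List<string>();
var buffer = new byte[blockSize];
int index = 0;

using (Stream source = gridFSFile.Open())
{
    int read;
    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Block IDs must be base64 strings of equal length; a 4-byte
        // counter encodes to a constant 8 characters.
        string blockId = Convert.ToBase64String(BitConverter.GetBytes(index++));
        using (var chunk = new MemoryStream(buffer, 0, read))
        {
            blockBlob.PutBlock(blockId, chunk, null);
        }
        blockIds.Add(blockId);
    }
}
blockBlob.PutBlockList(blockIds); // commit the uploaded blocks in order
```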