This article covers Node.js readable streams: parsing binary data while preserving chunk order.

Question

Using the latest Node.js...

I have binary data coming from MongoDB (a field within a document), which means I will be processing multiple binary payloads concurrently. The data is a media file (H.264) made up of slices (NAL units). Each slice is delimited.

Using a readable stream from fs, if I act on "data" events, is the order of the data chunks preserved? Can I be guaranteed to process the "data" in order? (See the origin in the path part of the "this" scope in each call.)

Answer

The order in which data is written to a stream is guaranteed to be the same order in which it is read. When writing to a stream, the data is either written immediately or queued; the order does not change. This is from the Node.js source (an older release; the internals have since been restructured, but the guarantee is the same):

function writeOrBuffer(stream, state, chunk, encoding, cb) {
  chunk = decodeChunk(state, chunk, encoding);
  if (util.isBuffer(chunk))
    encoding = 'buffer';
  var len = state.objectMode ? 1 : chunk.length;

  state.length += len;

  var ret = state.length < state.highWaterMark;
  state.needDrain = !ret;

  if (state.writing || state.corked)
    state.buffer.push(new WriteReq(chunk, encoding, cb));
  else
    doWrite(stream, state, false, len, chunk, encoding, cb);

  return ret;
}

This is also how data events are fired:

// if we want the data now, just emit it.
if (state.flowing && state.length === 0 && !state.sync) {
  stream.emit('data', chunk);
  stream.read(0);
}

The data event won't fire for a chunk unless there is no queued data, which means you will get the data in the order it was passed in.
