
Node duplex stream doesn't actually emit data when read

I'm working on a data sink of sorts that will eventually be used for key-value object streams in Node.js.

I've stumbled across duplex streams and started playing with them a bit to get my feet wet, but nothing I try seems to work.

At present, I've got this duplex stream:

const stream = require('stream');

class StorageStream extends stream.Duplex {
  constructor() {
    super({
      objectMode: true
    })

    this.values = new Map();
    this.entries = this.values.entries();
  }

  _write(data, encoding, next) {
    const { key, value } = data;

    this.values.set(key, value);

    next();
  }

  _read() {
    const next = this.entries.next();

    if (next.value) {
      this.push(next.value);
    }
  }
}

This is a SUPER CONTRIVED example, but essentially: when I write to this stream, it should store the key and value in the Map, and when I read from it, it should iterate over the Map's entries and pass them down the stream. However, this doesn't work; doing basically the below

const kvstream = createKVStreamSomeHow(); // a basic, readable stream with KV Pairs

const logger = createLoggerStreamSomeHow(); // writable stream, logs the data coming through

const storage = new StorageStream();

kvstream.pipe(storage).pipe(logger);

causes the process to just end. So I guess I'm just a bit confused as to what I'm supposed to be doing inside of the _read method.

A couple of observations from the code provided by OP:

  1. The iterator over the Map's entries is created in the constructor, before any keys have been set, in: this.entries = this.values.entries(); . The first call to _read() exhausts that empty iterator, so it never produces output.
  2. When a new key is set in the Map, it is never pushed onto the read buffer, so downstream writables never receive it (see the sketch after this list for one way a Duplex could handle this).
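
For reference, here is a minimal sketch (not from the original answer, just one possible approach) of how a Duplex could address both issues: push each entry to the readable side as soon as it is written, and end the readable side once the writable side finishes. The class and field names mirror the question's code.

const { Duplex } = require('stream');

class StorageStream extends Duplex {
  constructor() {
    super({
      objectMode: true
    })

    this.values = new Map();
    // End the readable side once the writable side has finished.
    this.on('finish', () => this.push(null));
  }

  _write(data, encoding, next) {
    const { key, value } = data;

    this.values.set(key, value);
    // Forward the entry as soon as it is stored
    // (for simplicity this ignores the push() return value, i.e. backpressure).
    this.push(data);

    next();
  }

  _read() {
    // Nothing to pull on demand: entries are pushed from _write().
  }
}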

The duplex implementation can be simplified by using the built-in Transform (docs) constructor. The Transform constructor is well suited to store-and-forward scenarios like this one.

This is an example of how a Transform stream could be applied in this scenario. Note that the pipeline() function is not required; it is used here only to make it easy to wait for the readable to emit all of its data:

const { Writable, Readable, Transform, pipeline } = require('stream');

class StorageStream extends Transform {
  constructor() {
    super({
      objectMode: true
    })

    this.values = new Map();
  }

  _transform(data, encoding, next) {
    const { key, value } = data;

    this.values.set(key, value);
    console.log(`Setting Map key ${key} := ${value}`)

    next(null, data); // store the pair, then forward the original chunk downstream
  }
}

(async ()=>{
  await new Promise( (resolve, reject) => {
    pipeline(
      new Readable({
        objectMode: true,
        read(){
          this.push( { key: 'foo', value: 'bar' } );
          this.push( null );
        }
      }),
      new StorageStream(),
      new Writable({
        objectMode: true,
        write( chunk, encoding, next ){
          console.log("propagated:", chunk);
          next();
        }
      }),
      (error) => {
        if( error ){
          reject( error );
        }
        else {
          resolve();
        }
      }
    );
  });
})()
  .catch( console.error );

This produces the following output:

> Setting Map key foo := bar
> propagated: { key: 'foo', value: 'bar' }

And it can be used as:

kvstream.pipe(new StorageStream()).pipe(logger);
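
If error handling is needed at the call site, the same chain can be written with pipeline() (kvstream and logger being the hypothetical streams from the question):

const { pipeline } = require('stream');

pipeline(
  kvstream,
  new StorageStream(),
  logger,
  (error) => {
    if (error) console.error('pipeline failed:', error);
  }
);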
