Node duplex stream doesn't actually emit data when read
I'm working on a data-sink of sorts that is eventually to be used for Key-Value object streams in nodejs. I've stumbled across duplex streams and started playing with them a bit to get my feet wet, but everything I try seems to not work. At present, I've got this duplex stream:
class StorageStream extends stream.Duplex {
  constructor() {
    super({
      objectMode: true
    })
    this.values = new Map();
    this.entries = this.values.entries();
  }

  _write(data, encoding, next) {
    const { key, value } = data;
    this.values.set(key, value);
    next();
  }

  _read() {
    const next = this.entries.next();
    if (next.value) {
      this.push(next.value);
    }
  }
}
This is a SUPER CONTRIVED example, but essentially, when I write to this stream it should store the key and value in the Map, and when I read from this stream it should start reading from the map and passing the entries down the stream. However, this doesn't work, doing basically the below
const kvstream = createKVStreamSomeHow(); // a basic, readable stream with KV Pairs
const logger = createLoggerStreamSomeHow(); // writable stream, logs the data coming through
const storage = new StorageStream();
kvstream.pipe(storage).pipe(logger);
causes the process to just end. So I guess I'm just a bit confused as to what I'm supposed to be doing inside of the _read method.
A couple of observations from the code provided by OP:

- The iterator of entries returned by this.values.entries() is generated in the constructor, before any keys have been set, in: this.entries = this.values.entries();
- The duplex implementation can be simplified by using the built-in Transform (docs) constructor. The Transform constructor is perfect for store-and-forward scenarios.
This is an example of how a stream Transform could be applied in this scenario. Note that the pipeline() function is not required; it has been used in this example to simplify awaiting the readable having emitted all its data:
const { Writable, Readable, Transform, pipeline } = require('stream');

class StorageStream extends Transform {
  constructor() {
    super({
      objectMode: true
    })
    this.values = new Map();
  }

  _transform(data, encoding, next) {
    const { key, value } = data;
    this.values.set(key, value);
    console.log(`Setting Map key ${key} := ${value}`)
    next(null, data);
  }
}

(async () => {
  await new Promise((resolve, reject) => {
    pipeline(
      new Readable({
        objectMode: true,
        read() {
          this.push({ key: 'foo', value: 'bar' });
          this.push(null);
        }
      }),
      new StorageStream(),
      new Writable({
        objectMode: true,
        write(chunk, encoding, next) {
          console.log("propagated:", chunk);
          next();
        }
      }),
      (error) => {
        if (error) {
          reject(error);
        }
        else {
          resolve();
        }
      }
    );
  });
})()
  .catch(console.error);
This produces the following output:
> Setting Map key foo := bar
> propagated: { key: 'foo', value: 'bar' }
And can be used as
kvstream.pipe(new StorageStream()).pipe(logger);