
How can Akka streams be materialized continually?

I am using Akka Streams in Scala to poll from an AWS SQS queue using the AWS Java SDK. I created an ActorPublisher which dequeues messages on a two-second interval:

import scala.annotation.tailrec
import scala.collection.JavaConversions.iterableAsScalaIterable
import scala.concurrent.duration._

import akka.actor.{ActorLogging, Props}
import akka.stream.ActorMaterializer
import akka.stream.actor.ActorPublisher
import akka.stream.actor.ActorPublisherMessage.{Cancel, Request}

import com.amazonaws.regions.RegionUtils
import com.amazonaws.services.sqs.AmazonSQSClient
import com.amazonaws.services.sqs.model.{Message, ReceiveMessageRequest}

object SQSSubscriber {
  def props(name: String): Props = Props(new SQSSubscriber(name))
}

class SQSSubscriber(name: String) extends ActorPublisher[Message] with ActorLogging {
  implicit val materializer = ActorMaterializer()

  import context.dispatcher // ExecutionContext for the scheduler
  val schedule = context.system.scheduler.schedule(0.seconds, 2.seconds, self, "dequeue")

  val client = new AmazonSQSClient()
  client.setRegion(RegionUtils.getRegion("us-east-1"))
  val url = client.getQueueUrl(name).getQueueUrl

  val MaxBufferSize = 100
  var buf = Vector.empty[Message]

  override def receive: Receive = {
    case "dequeue" =>
      val messages = iterableAsScalaIterable(client.receiveMessage(new ReceiveMessageRequest(url)).getMessages).toList
      messages.foreach(self ! _)
    case message: Message if buf.size == MaxBufferSize =>
      log.error("The buffer is full")
    case message: Message =>
      if (buf.isEmpty && totalDemand > 0)
        onNext(message)
      else {
        buf :+= message
        deliverBuf()
      }
    case Request(_) =>
      deliverBuf()
    case Cancel =>
      context.stop(self)
  }

  @tailrec final def deliverBuf(): Unit =
    if (totalDemand > 0) {
      if (totalDemand <= Int.MaxValue) {
        val (use, keep) = buf.splitAt(totalDemand.toInt)
        buf = keep
        use foreach onNext
      } else {
        val (use, keep) = buf.splitAt(Int.MaxValue)
        buf = keep
        use foreach onNext
        deliverBuf()
      }
    }
}

In my application, I am attempting to run the flow at a two-second interval as well:

import scala.concurrent.duration._

import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Flow, Sink, Source}

import com.amazonaws.services.sqs.model.Message

val system = ActorSystem("system")
import system.dispatcher // ExecutionContext for the scheduler below

val sqsSource = Source.actorPublisher[Message](SQSSubscriber.props("queue-name"))
val flow = Flow[Message]
  .map { elem => system.log.debug(s"${elem.getBody} (${elem.getMessageId})"); elem }
  .to(Sink.ignore)

system.scheduler.schedule(0.seconds, 2.seconds) {
  flow.runWith(sqsSource)(ActorMaterializer()(system))
}

However, when I run my application I receive java.util.concurrent.TimeoutException: Futures timed out after [20000 milliseconds] and subsequent dead letter notices caused by the ActorMaterializer.

Is there a recommended approach for continually materializing an Akka Stream?

I don't think you need to create a new ActorPublisher every 2 seconds. This seems redundant and wasteful of memory. Also, I don't think an ActorPublisher is necessary. From what I can tell of the code, your implementation will have an ever-growing number of Streams all querying the same data. Each Message from the client will be processed by N different akka Streams and, even worse, N will grow over time.

Iterator For Infinite Loop Querying

You can get the same behavior from your ActorPublisher by using Scala's Iterator. It is possible to create an Iterator which continuously queries the client:

import scala.collection.JavaConversions.iterableAsScalaIterable

import com.amazonaws.regions.RegionUtils
import com.amazonaws.services.sqs.AmazonSQSClient
import com.amazonaws.services.sqs.model.{Message, ReceiveMessageRequest}

//setup the client
val client = {
  val sqsClient = new AmazonSQSClient()
  sqsClient setRegion (RegionUtils getRegion "us-east-1")
  sqsClient
}

val name = "queue-name" // the queue name from the question

val url = client.getQueueUrl(name).getQueueUrl

//single query
def queryClientForMessages : Iterable[Message] = iterableAsScalaIterable {
  client.receiveMessage(new ReceiveMessageRequest(url)).getMessages
}

def messageListIterator : Iterator[Iterable[Message]] =
  Iterator continually queryClientForMessages

//messages one-at-a-time "on demand", no timer pushing you around
def messageIterator() : Iterator[Message] = messageListIterator flatMap identity

This implementation only queries the client when all previous Messages have been consumed, and is therefore truly reactive. There is no need to keep track of a fixed-size buffer. Your solution needs a buffer because the creation of Messages (via a timer) is de-coupled from the consumption of Messages (via println). In my implementation, creation and consumption are tightly coupled via back-pressure.
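
To make that coupling concrete, here is a tiny self-contained sketch (a hypothetical stub stands in for the SQS client) showing that Iterator.continually only invokes the query function when the next element is actually demanded:

var queries = 0
def stubQuery(): List[Int] = { queries += 1; List(queries) } // stand-in for queryClientForMessages

val messages: Iterator[Int] = Iterator.continually(stubQuery()).flatMap(identity)

println(queries)         // 0 - building the Iterator performed no query
println(messages.next()) // pulling one element triggers exactly one query
println(queries)         // 1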

Akka Stream Source

You can then use this Iterator generator-function to feed an akka stream Source:

import akka.stream.scaladsl.Source

def messageSource : Source[Message, _] = Source.fromIterator(() => messageIterator())

Flow Formation

And finally you can use this Source to perform the println (as a side note: your flow value is actually a Sink, since Flow + Sink = Sink). Using your flow value from the question:

messageSource runWith flow

One akka Stream processing all messages.
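
For reference, a minimal sketch of everything assembled into one runnable program (it reuses the queue name, region, and stages from the question; the combined Flow + Sink value is named sink here to reflect its actual type):

import scala.collection.JavaConversions.iterableAsScalaIterable

import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Flow, Sink, Source}

import com.amazonaws.regions.RegionUtils
import com.amazonaws.services.sqs.AmazonSQSClient
import com.amazonaws.services.sqs.model.{Message, ReceiveMessageRequest}

object SQSPollingApp extends App {
  implicit val system: ActorSystem = ActorSystem("system")
  implicit val materializer: ActorMaterializer = ActorMaterializer()

  // SQS client and queue URL, as in the question
  val client = new AmazonSQSClient()
  client.setRegion(RegionUtils.getRegion("us-east-1"))
  val url = client.getQueueUrl("queue-name").getQueueUrl

  // one SQS query per downstream demand; the Iterator never ends
  def queryClientForMessages: Iterable[Message] =
    iterableAsScalaIterable(client.receiveMessage(new ReceiveMessageRequest(url)).getMessages)

  def messageIterator(): Iterator[Message] =
    Iterator.continually(queryClientForMessages).flatMap(identity)

  val messageSource: Source[Message, _] = Source.fromIterator(() => messageIterator())

  // the "flow" from the question is really a Sink: Flow[Message] + Sink = Sink
  val sink: Sink[Message, _] = Flow[Message]
    .map { elem => system.log.debug(s"${elem.getBody} (${elem.getMessageId})"); elem }
    .to(Sink.ignore)

  // a single materialization; back-pressure drives every further SQS query
  messageSource runWith sink
}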

