
How to start a Docker container before running the tests

I would like to set up a test environment for a Scala project. In addition, I need to start Kafka, which runs in a Docker container. The Kafka container should be started before the tests begin.

I am using ScalaTest and am thinking about starting the Kafka container in a test fixture, once before the tests run.

The question is: what is the recommended way to start a container before running the tests? I have considered the Docker API, but I am not sure whether that is the right approach.

You can use testcontainers-scala, which is just a wrapper around testcontainers.

In your build.sbt add:

libraryDependencies += "com.dimafeng" %% "testcontainers-scala" % "0.25.0" % "test"
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "2.2.0"

And then you can create a spec:

import java.util.Properties

import com.dimafeng.testcontainers.{ForAllTestContainer, GenericContainer}
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}
import org.scalatest.FlatSpec
import org.testcontainers.containers.Network
import org.testcontainers.utility.Base58


class KafkaSpec extends FlatSpec with ForAllTestContainer {

  final val KafkaPort = 9093

  override val container = GenericContainer("confluentinc/cp-kafka").configure { c =>
    // run the broker on its own network with a random alias
    c.withNetwork(Network.newNetwork())
    c.withNetworkAliases("kafka-" + Base58.randomString(6))
    // expose the client listener port to the test
    c.withExposedPorts(KafkaPort)
    // one listener for clients (9093) and one for inter-broker traffic (9092)
    c.withEnv("KAFKA_LISTENERS", "PLAINTEXT://0.0.0.0:" + KafkaPort + ",BROKER://0.0.0.0:9092")
    c.withEnv("KAFKA_LISTENER_SECURITY_PROTOCOL_MAP", "BROKER:PLAINTEXT,PLAINTEXT:PLAINTEXT")
    c.withEnv("KAFKA_INTER_BROKER_LISTENER_NAME", "BROKER")
    // single-node settings
    c.withEnv("KAFKA_BROKER_ID", "1")
    c.withEnv("KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR", "1")
    c.withEnv("KAFKA_OFFSETS_TOPIC_NUM_PARTITIONS", "1")
    c.withEnv("KAFKA_LOG_FLUSH_INTERVAL_MESSAGES", Long.MaxValue.toString)
    c.withEnv("KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS", "0")
  }


  it should "do something" in {

    val properties = new Properties()
    properties.put("bootstrap.servers", s"${container.containerIpAddress}:$KafkaPort")
    properties.put("group.id", "test")
    properties.put("key.deserializer", classOf[StringDeserializer])
    properties.put("value.deserializer", classOf[StringDeserializer])
    properties.put("key.serializer", classOf[StringSerializer])
    properties.put("value.serializer", classOf[StringSerializer])

    val kafkaConsumer = new KafkaConsumer[String, String](properties)
    // ... subscribe, poll and assert on the records here

  }
}
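
With ForAllTestContainer the container is started once before the first test of the suite and stopped after the last one; ForEachTestContainer would instead restart it for every test. To actually exercise the broker, the elided part of the test could be completed with a small produce-and-consume round trip. The sketch below is only an illustration, assuming the broker configured above comes up and is reachable at the address used for bootstrap.servers, and that automatic topic creation is enabled; the topic name test-topic and the timeouts are made up for the example, and the extra imports go at the top of the file.

import java.time.Duration
import java.util.Collections

import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

import scala.collection.JavaConverters._

  it should "produce and consume a message" in {
    val bootstrap = s"${container.containerIpAddress}:$KafkaPort"

    // producer configuration: only the serializers and the bootstrap address are needed here
    val producerProps = new Properties()
    producerProps.put("bootstrap.servers", bootstrap)
    producerProps.put("key.serializer", classOf[StringSerializer])
    producerProps.put("value.serializer", classOf[StringSerializer])

    // consumer configuration: read the topic from the beginning so the test record is not missed
    val consumerProps = new Properties()
    consumerProps.put("bootstrap.servers", bootstrap)
    consumerProps.put("group.id", "test")
    consumerProps.put("auto.offset.reset", "earliest")
    consumerProps.put("key.deserializer", classOf[StringDeserializer])
    consumerProps.put("value.deserializer", classOf[StringDeserializer])

    val producer = new KafkaProducer[String, String](producerProps)
    val consumer = new KafkaConsumer[String, String](consumerProps)
    try {
      // send one record and block until the broker acknowledges it
      producer.send(new ProducerRecord[String, String]("test-topic", "key", "value")).get()

      // poll until the record shows up or the deadline passes
      consumer.subscribe(Collections.singletonList("test-topic"))
      val deadline = System.currentTimeMillis() + 30000
      var values = List.empty[String]
      while (values.isEmpty && System.currentTimeMillis() < deadline) {
        values = consumer.poll(Duration.ofSeconds(1)).asScala.toList.map(_.value())
      }
      assert(values.contains("value"))
    } finally {
      producer.close()
      consumer.close()
    }
  }

Depending on how Docker maps the exposed port on your machine, you may need container.mappedPort(KafkaPort) instead of the fixed KafkaPort when building the bootstrap address.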
