
Can I run spark unit tests within eclipse

Recently we moved from Scalding to Spark. I used Eclipse and the Scala IDE for Eclipse to write code and tests. The tests ran fine with Twitter's JobTest class: any class using JobTest was automatically available to run as a Scala unit test within Eclipse. I've lost that ability now. The Spark test cases run perfectly well under sbt, but the run configuration in Eclipse for these tests lists 'none applicable'.

Is there a way to run Spark unit tests within Eclipse?

I think this same approach using Java would work in Scala. Basically, just create a SparkContext with the master set to "local", then build and run unit tests as normal. Be sure to stop the SparkContext when the test is finished.

I have this working with Spark 1.0.0, but not with a newer version.

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;

public class Test123 {
  static JavaSparkContext sparkCtx;

  @BeforeClass
  public static void sparkSetup() {
    // Create a SparkContext that runs locally inside the test JVM
    SparkConf conf = new SparkConf();
    sparkCtx = new JavaSparkContext("local", "test", conf);
  }

  @AfterClass
  public static void sparkTeardown() {
    // Stop the context so the next test class can create its own
    sparkCtx.stop();
  }

  @Test
  public void integrationTest() {
    // Build an RDD from in-memory data instead of reading from a cluster
    JavaRDD<String> logRawInput = sparkCtx.parallelize(Arrays.asList(
        "data1",
        "data2",
        "garbage",
        "data3"));
  }
}
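For the Scala side of the question, a minimal sketch of the same pattern might look like the following. This is an assumption on top of the original answer (which only shows Java/JUnit): it uses ScalaTest's FunSuite with BeforeAndAfterAll, and the class name SparkLocalSpec is illustrative.

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// A sketch, not the original poster's code: same local-master pattern,
// expressed with ScalaTest instead of JUnit.
class SparkLocalSpec extends FunSuite with BeforeAndAfterAll {

  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    // Create one local-master SparkContext for the whole suite
    val conf = new SparkConf().setMaster("local").setAppName("test")
    sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    // Stop the context so other suites can create their own
    sc.stop()
  }

  test("parallelize builds an RDD from in-memory data") {
    val rdd = sc.parallelize(Seq("data1", "data2", "garbage", "data3"))
    assert(rdd.count() == 4)
  }
}

If the Scala IDE's ScalaTest support recognizes the suite, it should then be runnable directly from Eclipse rather than only through sbt.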
