
overriding classes using sbt console/spark-shell

So I've recently gotten much more interested in developing using sbt console/spark-shell, and I had a question about working with existing packages/jars. I know that you can import jars and that it's possible to override classes, but I'm wondering: is it possible to override a class and force all other classes to point to that overridden class?

So if I have

class Bar {
  def a() = (new Foo).blah()
}

and I override Foo, is there a way to do that without also having to override Bar?
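You can't reproduce a REPL session in a plain compiled file, but nested-scope shadowing gives a faithful sketch of the same binding behavior. In this hypothetical example (the names Foo, Bar, and blah come from the question; the string results are made up for illustration), redefining Foo after Bar has been compiled leaves Bar pointing at the original Foo:

```scala
object OverrideDemo {
  class Foo { def blah(): String = "old Foo" }
  class Bar { def a(): String = (new Foo).blah() }

  def redefined(): (String, String) = {
    // A "later" definition of Foo shadows the outer one from here on,
    // just as a redefinition in the REPL shadows the earlier class.
    class Foo { def blah(): String = "new Foo" }
    // Bar was compiled against the outer Foo, so it is unaffected:
    ((new Bar).a(), (new Foo).blah())
  }
}
```

Calling `OverrideDemo.redefined()` returns `("old Foo", "new Foo")`: the existing Bar still sees the old Foo, while new code sees the new one.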

Let's explain this with a timeline:

1. class X { def t = 1 }

2. class Y {
      def x: X = new X
   }

Up to here, the definition of class Y at step 2 refers to the definition of X at step 1.

3. class X { def t = 2 } 

Now, class Y from step 2 still refers to X from step 1. This is how the REPL works: changes are effective forward in time, not backwards.

4. class Y {
     def x: X = new X
   }

Now, as you'd expect, the new Y at step 4 refers to the new X from step 3.
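The four-step timeline above can be simulated outside the REPL with lexical shadowing (a sketch, not a literal REPL transcript): an inner redefinition of X is only visible to definitions that come after it, while the earlier Y keeps the X it was compiled against.

```scala
object ReplTimeline {
  class X { def t = 1 }          // step 1: first X
  class Y { def x: X = new X }   // step 2: Y bound to the step-1 X

  def demo(): (Int, Int) = {
    class X { def t = 2 }        // step 3: shadows the outer X from here on
    class Y2 { def x: X = new X } // step 4: a new consumer sees the new X
    // The original Y still returns the step-1 X; Y2 returns the step-3 X.
    (new Y().x.t, new Y2().x.t)
  }
}
```

`ReplTimeline.demo()` yields `(1, 2)`, mirroring the timeline: redefining X did not retroactively rebind the already-defined Y.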

Normally, you'd do that by replacing the class in your classpath. If the new version is binary-compatible, you could even re-run without re-compiling.

There are a couple of hitches: the REPL compiler is resident, and the class lives in a specific synthetic package (e.g., $line8). You'd need a fresh compiler to use the refreshed package.

There are open tickets to retain or discard $line packages when resetting the compiler. The other missing piece is to compile the new version of the class in the appropriate package, or conversely to regenerate the consuming class.

Note that the :require command lets you add a jar but not replace classes.
