
TypeScript Type Inference with Intersection of Generic Types

I have a use case with a composable object that should (or shouldn't) work with certain functions, depending on what it has been composed of. The composition is done using type intersection.

Here's a simplified version of the code:

type A<T> = { a: T }
type B<T> = { b: (val: T) => T }

const shouldWork = {
  a: 'str',
  b: (val: string) => val.toUpperCase(),
  someOtherProp: 'foo'
}

const shouldFail = {
  a: 'str',
  b: (val: number) => 42
}

function test<T extends A<???> & B<???>>(obj: T): T {
  return {
    ...obj,
    a: obj.b(obj.a)
  }
}

const res1 = test(shouldWork);
// res1 should be typeof shouldWork, i.e. A<string> & B<string> & { someOtherProp: string }
console.log(res1.a); // "STR"
console.log(res1.someOtherProp); // "foo"

const res2 = test(shouldFail); // should fail because A<string> and B<number> don't match

I've tried using test<T extends A<any> & B<any>>, but this of course allows any combination of generic types.
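
For illustration, a sketch of that attempt (using declare to focus on the signature):

declare function test<T extends A<any> & B<any>>(obj: T): T;

const res = test(shouldFail); // compiles, but shouldn't:
// 'any' matches both string and number, so the mismatch goes unnoticed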

Then I tried adding an extra type S to ensure that both generics are the same: test<S, T extends A<S> & B<S>>. But this fails with "Types of parameters 'val' and 'val' are incompatible. Type 'unknown' is not assignable to type 'string'".
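
Sketched the same way, that attempt rejects even the valid object:

declare function test<S, T extends A<S> & B<S>>(obj: T): T;

const res = test(shouldWork);
// error: Types of parameters 'val' and 'val' are incompatible.
//   Type 'unknown' is not assignable to type 'string'.
// S has no inference site here, so it falls back to unknown.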

After some more tries I found something that does work as expected:

function test<S, T extends A<S> & B<S> = A<S> & B<S>>(obj: T): T { ... }
const res1 = test<string>(shouldWork);
const res2 = test<string>(shouldFail); // error: Types of parameters 'val' and 'val' are incompatible. Type 'string' is not assignable to type 'number'

But as you can see, it's hard to read and a lot to write, especially when the intersection consists of more types.

Is there some easier way to get this done?

This is a little "iffy", but one way to achieve the inference you want is like this:

function test<O, T>(obj: O & A<T> & B<T>): O {
    return {
        ...obj,
        a: obj.b(obj.a)
    }
}

When you want the compiler to infer a type parameter X from a call to a function func(param), where param is of type P and the call signature for func is like <X>(param: F<X>) => any, the compiler will plug P in for F<X> to get candidates for X.

If the call signature involves an intersection type like <X>(param: F<X> & G<X>) => void, the compiler will tend to plug P in for both F<X> and G<X> to get candidates... even though it would be fine (and sometimes desirable) to do that for only one of them.
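
A minimal illustration of that behavior, with hypothetical Box and Tag types standing in for F<X> and G<X>:

type Box<X> = { value: X }
type Tag<X> = { tag: X }

declare function pick<X>(param: Box<X> & Tag<X>): X;

const n = pick({ value: 1, tag: 2 });   // okay, X inferred as number
const e = pick({ value: 1, tag: 'a' }); // error: the candidates for X
                                        // conflict, so the argument fails to check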

So with test(obj) of call signature <O, T>(obj: O & A<T> & B<T>) => O, the compiler will tend to match O with typeof obj to get the candidate for O, and then also match A<T> and B<T> against typeof obj to get the candidate for T. If obj is not a valid A<T> & B<T>, the inference will fail. And you can use O to represent the actual type of obj without widening it to A<T> & B<T>.

So this behaves the way you want:

const res1 = test(shouldWork); // okay
const res2 = test(shouldFail); // error
// -------------> ~~~~~~~~~~
// Argument of type '{ a: string; b: (val: number) => number; }' is not assignable to
//  parameter of type '{ a: string; b: (val: number) => number; } & A<number> & B<number>'.

If that hadn't worked, you could have gotten similar behavior by using a single unconstrained type parameter O and having the parameter obj be of a conditional type:

function test<O>(obj: O extends (A<infer T> & B<infer T>) ? O : never): O {
    return {
        ...obj,
        a: obj.b(obj.a)
    }
}

const res1 = test(shouldWork); // okay
const res2 = test(shouldFail); // error
// -------------> ~~~~~~~~~~
// Argument of type '{ a: string; b: (val: number) => number; }'
// is not assignable to parameter of type 'never'

This works because the compiler will tend to plug the type of the parameter P in for X in a conditional type like X extends ..., and so typeof obj is the candidate for O, which then gets checked against A<infer T> & B<infer T>. If a valid T is found, then obj will be checked against O again, which is no problem. If no valid T is found, then obj will be checked against never, which is probably not going to work. It's a bit more complicated but has similar behavior (with a less understandable error message).
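
To see the conditional type in isolation, here is a hypothetical Check helper (not in the code above) applied to the two objects from the question:

// hypothetical helper mirroring the parameter's conditional type
type Check<O> = O extends (A<infer T> & B<infer T>) ? O : never;

type Good = Check<typeof shouldWork>; // typeof shouldWork (T infers as string)
type Bad = Check<typeof shouldFail>;  // never (no single T fits both A and B)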


As for why the following doesn't work:

declare function test<O extends A<T> & B<T>, T>(obj: O): O;

The problem is that there is no inference site for T. The compiler can use typeof obj to infer O, but when it checks T, it's stuck. You might expect that the compiler could infer T from O's constraint A<T> & B<T>, but generic constraints are not used as inference sites this way. See microsoft/TypeScript#7234 for an old suggestion to support this. Instead of implementing such a feature, they suggested that people infer from intersections in exactly the way I have done at the beginning of this answer.

Anyway, with no inference site for T, the compiler falls back to unknown for it. And then O is constrained to A<unknown> & B<unknown>, and that is unlikely to work. So everything falls apart.
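
Concretely, with the declaration above, even the valid object is rejected, because the constraint check happens after T has already become unknown:

// O is checked against { a: unknown } & { b: (val: unknown) => unknown },
// which shouldWork's b: (val: string) => string cannot satisfy
const res = test(shouldWork);
// error: Type 'unknown' is not assignable to type 'string'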

Playground link to code
