
TypeScript function's return type from argument shows wrong type

I am trying to implement a function in TypeScript, whose return type is determined by the 2nd argument that is passed to the function.

For example,

const a = ifNotEmpty(var1, "hello"); //a's type should be string
const b = ifNotEmpty(var2, [1, 2]); //b's type should be number[]

But instead, I am getting a's type as "hello" and b's type as [1, 2]: both are literal types, pinned to "hello" and [1, 2], instead of string and number[].

Here is a screenshot of the determined types in VSCode:

As you can see, a's type is the hardcoded literal true instead of boolean. Is there a way to infer the 2nd parameter's type and return that (widened) type as the return type of the function, so that a's type will be boolean and not true?

Here is the function I tried to write:

const ifNotEmpty = <U>(value: any, defaultValue: U): U => {
    return value ?? defaultValue;
};

true is inferred because the type of the default value is taken from the second argument as strictly as possible, i.e. as the literal type. There is no way to tell TS otherwise through inference alone. It could be seen as an advantage. :)
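To make that concrete, here is a small sketch (an assumed example, not from the original post; var1 is just a placeholder variable) using the ifNotEmpty from the question. The literal type of the second argument flows straight into the result, and it only widens when you ask for it at the call site:

declare const var1: boolean | null;

const a = ifNotEmpty(var1, true);           // U is inferred as the literal type: true
const b = ifNotEmpty<boolean>(var1, true);  // explicit type argument: boolean
const c: boolean = ifNotEmpty(var1, true);  // annotated variable: boolean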

Also, I edited your function's type so that it does not ignore the first parameter unless that parameter is actually null or undefined, which makes the return type reflect the value correctly:

const ifNotEmpty = <D, T>(
  value: T,
  defaultValue: D,
): T extends null | undefined ? D : T => {
    // The body cannot be checked against the unresolved conditional return
    // type, so an assertion is needed here.
    return (value ?? defaultValue) as any;
};

const a: boolean = ifNotEmpty(null, true);
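A couple of additional calls (assumed examples, not from the original answer) show how each branch of the conditional return type resolves with this edited signature:

declare const maybeName: string | null;

const fromNull = ifNotEmpty(null, [1, 2]);       // T = null, so the default branch applies: number[]
const fromValue = ifNotEmpty(maybeName, "anon"); // T = string | null resolves to: string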

If I understand correctly what you want to do, you just need to return:

return value != null && value != undefined ? typeof value : defaultValue;

That way you get the type of your first value and not the content of the variable.
