Lifetime parameters in supertrait bounds

I'm trying to define a trait for an object that is convertible to and from a slice of bytes. I essentially want to say

trait Foo: AsRef<[u8]> + TryFrom<&[u8]> {}

Unfortunately, this refuses to compile unless I put a lifetime parameter on the reference, like so:

trait Foo<'a>: AsRef<[u8]> + TryFrom<&'a [u8]> {}

This doesn't make much sense to me, because the lifetime 'a is related to the eventual try_from() call and shouldn't have to be part of the object's type. (The implementation of try_from() copies the relevant bytes, so the lifetime of its parameter really isn't relevant.)

This seems like a more general issue than just slices, though; how do you specify lifetime parameters like this for supertrait bounds? (Apparently '_ doesn't work.) And is there a better/more idiomatic way to express this, or do I have to resort to some sort of hand-rolled custom nonsense like

pub trait TryFromRef<T>: Sized {
    type Error;
    fn try_from(value: &T) -> Result<Self, Self::Error>;
}

?
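For concreteness, here is a minimal sketch of what implementing that hand-rolled trait might look like. The `Packet` type and its error type are hypothetical, chosen only to illustrate the shape of the workaround:

```rust
// Hand-rolled alternative to TryFrom<&T>, as sketched in the question.
// The `Sized` bound is required so the method can return `Self` by value.
pub trait TryFromRef<T: ?Sized>: Sized {
    type Error;
    fn try_from(value: &T) -> Result<Self, Self::Error>;
}

// Hypothetical example type: owns a copy of the bytes.
struct Packet(Vec<u8>);

impl TryFromRef<[u8]> for Packet {
    type Error = ();

    // Copies the bytes, so no borrow of `value` survives the call.
    fn try_from(value: &[u8]) -> Result<Self, Self::Error> {
        Ok(Packet(value.to_vec()))
    }
}
```

Because the reference appears only in the method signature, its lifetime is elided there and never becomes part of the trait or the implementing type.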

A trait bound with a lifetime parameter that holds for all lifetimes, rather than for some particular lifetime, can be specified with a so-called higher-ranked trait bound, or HRTB. In your case this might look like

trait Foo: AsRef<[u8]> + for<'a> TryFrom<&'a [u8]> {}

Anything implementing Foo must satisfy TryFrom<&'a [u8]> for any and all choices of 'a, so there's no need for a lifetime on Foo itself.
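A minimal sketch of how this fits together, using a hypothetical Checksum type that copies the incoming bytes (so the input lifetime never escapes):

```rust
use std::convert::TryFrom;

// The HRTB supertrait bound: implementors must convert from &'a [u8]
// for *every* lifetime 'a.
trait Foo: AsRef<[u8]> + for<'a> TryFrom<&'a [u8]> {}

// Hypothetical example type owning its bytes.
struct Checksum(Vec<u8>);

impl AsRef<[u8]> for Checksum {
    fn as_ref(&self) -> &[u8] {
        &self.0
    }
}

// The elided lifetime in `&[u8]` makes this impl apply for all lifetimes,
// which is exactly what the `for<'a>` bound demands.
impl TryFrom<&[u8]> for Checksum {
    type Error = String;

    fn try_from(value: &[u8]) -> Result<Self, Self::Error> {
        if value.is_empty() {
            Err("empty slice".to_string())
        } else {
            Ok(Checksum(value.to_vec())) // copies the bytes
        }
    }
}

impl Foo for Checksum {}

// Generic code can now round-trip any Foo without naming a lifetime.
fn roundtrip<T: Foo>(bytes: &[u8]) -> Option<Vec<u8>> {
    T::try_from(bytes).ok().map(|v| v.as_ref().to_vec())
}
```

Note that the implementor never mentions `for<'a>` at all; writing `impl TryFrom<&[u8]> for Checksum` with the lifetime elided already satisfies the higher-ranked bound.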
