Rust type annotation error in Index trait bound

I am trying to implement a matrix data structure using the code below:

use std::ops::{Add, Index, IndexMut};
use num::{One, Zero};

type Idx2<'a> = &'a [usize];


pub trait MatrixTrait {
    fn zeros(shape: Idx2) -> Self;
    fn ones(shape: Idx2) -> Self;
}


#[derive(Default, Debug, Clone)]
pub struct Dense2DArray<T> {
    data: Vec<T>,
    shape: Vec<usize>,
}


impl<T> MatrixTrait for Dense2DArray<T>
    where
        T: Zero + One + Clone,
{
    fn zeros(shape: Idx2) -> Self {
        let data = vec![T::zero(); shape[0] * shape[1]];
        Self { data, shape: shape.to_vec() }
    }

    fn ones(shape: Idx2) -> Self {
        let data = vec![T::one(); shape[0] * shape[1]];
        Self { data, shape: shape.to_vec() }
    }
}


impl<T> Index<Idx2<'_>> for Dense2DArray<T> {
    type Output = T;

    fn index(&self, index: Idx2) -> &T {
        &self.data[index[0] * self.shape[1] + index[1]]
    }
}


impl<T> IndexMut<Idx2<'_>> for Dense2DArray<T> {
    fn index_mut(&mut self, index: Idx2) -> &mut T {
        &mut self.data[index[0] * self.shape[1] + index[1]]
    }
}


pub fn create_and_add_generic_matrix<'a, 'b, M, T>(a0: M) -> M
    where
        T: One + Add<Output=T> + Copy,
        M: MatrixTrait + Index<Idx2<'a>, Output = T> + IndexMut<Idx2<'b>>,
{
    let nx = 3; let ny = 2;
    let mut a1 = M::zeros(&[nx, ny]);

    for i in 0..nx {
        for j in 0..ny {
            a1[[i, j]] = a0[[i, j]] + T::one() + T::one();
        }
    }

    a1
}


fn main() {

    let nx = 3; let ny = 2;

    let a0 = Dense2DArray::<f64>::ones(&[nx, ny]);

    let b = create_and_add_generic_matrix(a0);

    println!("{:?}", b);
}

but I always get the error:

error[E0283]: type annotations needed
  --> linalg\src\matrices\mod.rs:56:26
   |
56 |         M: MatrixTrait + Index<Idx2<'a>, Output = T> + IndexMut<Idx2<'b>>,
   |                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^ cannot infer type for type parameter `M`
   |
   = note: cannot satisfy `M: Index<&'a [usize]>`

Strangely (or maybe not), if I change the type to the following (and make the required changes to the code, such as removing the lifetimes):

type Idx2 = [usize; 2];

it works without problems, but I am restricted to 2D matrices.
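
For reference, these are roughly the parts that change in the fixed-size version (the bound on M then needs no lifetime at all):

type Idx2 = [usize; 2];

impl<T> Index<Idx2> for Dense2DArray<T> {
    type Output = T;

    fn index(&self, index: Idx2) -> &T {
        &self.data[index[0] * self.shape[1] + index[1]]
    }
}

// IndexMut changes the same way, and the generic bound becomes simply:
// M: MatrixTrait + Index<Idx2, Output = T> + IndexMut<Idx2>,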

I can't understand why the change in the way indexing is performed affects the type annotation of M, nor how I should resolve the issue. Can anyone please help me understand what is happening?

Thanks.

This bound says that M must implement Index<Idx2<'a>, Output = T> for some specific, caller-specified lifetime 'a, but what you actually want to say is that it must implement this trait for any possible lifetime 'a.

You need a higher-rank trait bound:

pub fn create_and_add_generic_matrix<M, T>(a0: M) -> M
where
    T: One + Add<Output=T> + Copy,
    M: MatrixTrait + for<'a> Index<Idx2<'a>, Output = T> + for<'a> IndexMut<Idx2<'a>>,
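
To see why the for<'a> form matters, here is a minimal standalone sketch of the same distinction using Fn bounds, which behave the same way:

// With a caller-chosen lifetime 'a, `f` only has to accept references
// that live exactly as long as the caller's borrow.
fn takes_one_lifetime<'a, F: Fn(&'a str)>(f: F, s: &'a str) {
    f(s);
}

// With a higher-rank bound, `f` must accept a reference of any lifetime,
// including one that exists only inside this function.
fn takes_any_lifetime<F: for<'a> Fn(&'a str)>(f: F) {
    let local = String::from("local");
    f(&local); // borrowing a local: only possible with the for<'a> form
}

fn main() {
    takes_one_lifetime(|s| println!("{}", s), "caller's borrow");
    takes_any_lifetime(|s| println!("{}", s));
}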

Note that you also have to borrow the index expression in your for loop, because an array isn't a slice:

a1[&[i, j]] = a0[&[i, j]] + T::one() + T::one();
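
Putting both changes together, the function compiles as:

pub fn create_and_add_generic_matrix<M, T>(a0: M) -> M
where
    T: One + Add<Output = T> + Copy,
    M: MatrixTrait + for<'a> Index<Idx2<'a>, Output = T> + for<'a> IndexMut<Idx2<'a>>,
{
    let nx = 3;
    let ny = 2;
    let mut a1 = M::zeros(&[nx, ny]);

    for i in 0..nx {
        for j in 0..ny {
            // &[i, j] borrows a temporary [usize; 2], which coerces to &[usize] (Idx2).
            a1[&[i, j]] = a0[&[i, j]] + T::one() + T::one();
        }
    }

    a1
}

The &[i, j] borrow lives only inside the loop body, so it has a lifetime the caller could never have named; the for<'a> bound is exactly what allows the Index impl to be used at that short lifetime.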
