I am using Diesel to connect to a Postgres database and fetch data from it. I want to paginate a large result set so that I only fetch a few records at a time. Let's say I have 1000 records; I need to fetch 20 records per request, together with the total number of records. Below are my tables, and I use an inner join to fetch the data.
# Employee table
| employee_id | employee_name | employee_email |
| ----------- | ------------- | -------------- |
| 1           | ABC           | abc@mail.com   |
| 2           | xyz           | xyz@mail.com   |
# Account table
| account | employee_id | account_balance | created_at |
| ------- | ----------- | --------------- | ---------- |
| 1       | 1           | 2000            | 22/10/2021 |
| 2       | 2           | 5000            | 01/09/2021 |
The query I currently use to fetch the data:
```rust
let employee_account = employee::table
    .inner_join(account::table.on(employee::dsl::employee_id.eq(account::dsl::employee_id)))
    .filter(account::dsl::employee_id.eq(employeeId))
    .load::<(Employee_Details,)>(&self.pool.get()?)?;
```
I explored the Diesel documentation and found that the `limit` and `offset` functions look promising for implementing pagination. Try adding these two functions before the `load` call, as shown below: initialize `page_size` with 20 and `offset` with 0, loop until a page comes back empty, and increase the offset by the page size before each subsequent load.
```rust
let page_size = 20;
let mut offset = 0;
loop {
    // Fetch one page of up to `page_size` rows, starting at `offset`.
    let employee_account = employee::table
        .inner_join(account::table.on(employee::dsl::employee_id.eq(account::dsl::employee_id)))
        .filter(account::dsl::employee_id.eq(employeeId))
        .limit(page_size)
        .offset(offset)
        .load::<(Employee_Details,)>(&self.pool.get()?)?;
    if employee_account.is_empty() {
        break; // no more rows: stop paginating
    }
    // ... process the current page of up to `page_size` rows here ...
    offset += page_size; // advance to the next page
}
```