
Laravel PHP asynchronous function call

I have a RESTful API (Laravel 5.4) that implements the following scenario:

  1. Validate some inputs
  2. Generate some PDF files
  3. Collect the generated PDF files in one archive file
  4. (If success) Send an email to the customer with the generated archive file URL, in order to download it.
  5. Return response (JSON) with the validation result or process results

The problem, of course, is that as the number of files increases, the response time increases as well ... so instead I need to do the following:

  1. Validate some inputs
  2. Return response (JSON) with the validation result and a message that once the archive is created you will be notified by email

Then, in another thread, if all inputs were valid, I want to:

  1. Generate some PDF files
  2. Collect the generated PDF files in one archive file
  3. (If success) Send an email to the customer with generated archive file URL, in order to download it.
  4. (If failure) Send an email to the customer with problem.

I've read about many techniques (Laravel queues, the Spatie\Async package, etc.), but I believe my case is much simpler than all of that, and I don't want to add needless overhead to my project. Could you please guide me to the most optimized technique to solve/implement the above scenario?


I think you have already outlined the most optimized way to solve it, given your problem. If your constraint is to minimize the latency of the HTTP request, then the work should be offloaded.


In order to model this, some sort of queue is usually used:

This way the HTTP handler publishes the archive request and returns immediately to the client, and then the archiver asynchronously reads from the queue.

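With Laravel's built-in queue system this pattern takes very little code. Here is a minimal sketch, assuming a hypothetical job class `GenerateArchive` and controller method (the class, property, and validation-rule names are illustrative, not from the original post):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

// Hypothetical queued job: generates the PDFs, collects them into one
// archive, and emails the customer the download URL (or the failure).
class GenerateArchive implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(
        private array $validatedInput,
        private string $customerEmail,
    ) {}

    public function handle(): void
    {
        // 1. Generate the PDF files from $this->validatedInput.
        // 2. Zip them into one archive.
        // 3. Mail $this->customerEmail the archive URL, or the error.
        //    (PDF/zip/mail details omitted; they depend on your app.)
    }
}

// In the controller: validate, dispatch to the queue, respond immediately.
// The request never waits for PDF generation.
public function store(Request $request)
{
    $validated = $request->validate([/* your rules */]);

    GenerateArchive::dispatch($validated, $request->user()->email);

    return response()->json([
        'message' => 'Archive is being created; you will be notified by email.',
    ], 202);
}
```

A queue worker (`php artisan queue:work`) then picks the job up in a separate process, which is exactly the decoupling shown above.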


Using a queue has a lot of benefits:

  • Decouples HTTP from the archiver
  • Opens up the potential for durability
  • Supports back pressure
  • Supports horizontal scale out of archivers
  • Allows archivers to be provisioned on different hardware than the HTTP servers

I think the implementation of the queue is largely dependent on your non-functional requirements:

  • Do archive requests need to be durable?
  • Do you need load balancing of archival jobs?

Even if you were to go with Laravel queues, there are a couple of questions around durability that need to be answered. For example, if you were to use Redis as the Laravel queue driver, how do you handle durability? Is it OK to lose an archive request? Do you need to fsync every write to Redis (AOF persistence)? Is it OK to lose writes?
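For reference, Redis durability is governed by its append-only-file settings; the relevant `redis.conf` lines and their trade-offs look roughly like this (values are illustrative):

```
# redis.conf — durability settings relevant to a queue backend
appendonly yes         # enable the append-only file (AOF)
appendfsync always     # fsync every write: most durable, slowest
# appendfsync everysec # fsync once per second: may lose ~1s of writes
# appendfsync no       # let the OS decide when to flush: fastest, least durable
```

Which setting is right depends entirely on whether losing an archive request is acceptable for your customers.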
