Efficient Batch Processing: Streamlining Database to 3rd Party API Integration in Laravel

Hey Alex,

If the database has more than 50K records, and I want to push all of them to a 3rd party API using a single command, how can I do it without overlapping runs? How can we quickly post all 50K records using just one command? I find it messy to create multiple commands running as multiple tasks on the server to handle these records for the 3rd party API, and I currently have over 20 commands to handle 50K records.

Currently:

  1. A job is active to collect records.
  2. The job processes records and adds them to the database.
  3. A command retrieves the records from the database by status, posts them to the 3rd party API, and updates their status in the database.
  4. There are many such commands, because the records should be processed as close to simultaneously as possible.

Please advise on the best way to send these records to the 3rd party API without overlapping, using fewer commands. I was planning to dispatch the records a second time once they are added to the database, instead of running a command. Is that the best solution? If I do this, each record will be dispatched twice.

vijay34043
alex
Moderator

Hi Vijay

Could you give me more detail on the kind of processing you're doing with the 3rd party API?

From what I've read, I'm guessing you want to find a better way to process something via an API once it has been added to the database? And do you have any code examples of what you're currently doing?

vijay34043

Hi Alex,

Currently, I have a webhook set up to collect leads from various webhook URLs, and I am storing these leads in a database using Laravel Queue. Additionally, I have several commands to handle these leads and post them to 3rd party APIs, such as ESP APIs like Maropost. The API connection itself isn't the issue here. The problem arises when I attempt to process these leads.

Currently, I select 50 leads for processing in one command, and then I execute multiple commands to process the remaining leads. For instance, if there are 500 leads in the queue, my 10 commands each process 50 leads. However, this method isn't efficient.

What would be the best way to process all 500 leads at once? I've tried using Laravel's chunk method, but it's slow and sometimes times out. Additionally, the server needs multiple tasks to get through these leads, so I've created multiple commands to handle them separately.

Is there a more efficient approach to process all 500 leads together?

alex
Moderator

Thanks for clarifying, Vijay. So, when you say "my 10 commands each process 50 leads", are these separate commands (as in code)? Or are you just running 10 separate instances of a single command in order to deal with all of the leads?

Let me know and then I'll cook up a code example for you, for how you might more efficiently process these in one go.

vijay34043

Yes, these are all separate commands. I am using an offset for each of them, along with Laravel's withoutOverlapping function.
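Roughly, each command is a near-identical copy with a different hard-coded offset, something like this (simplified; class and column names are just illustrative):

```php
<?php
// Illustrative sketch of the current setup: each of the ~10 commands
// hard-codes a different offset and is scheduled with withoutOverlapping().

use App\Models\Lead; // hypothetical model
use Illuminate\Console\Command;

class ProcessLeadsBatch1 extends Command
{
    protected $signature = 'leads:process-batch-1';

    public function handle(): void
    {
        $leads = Lead::where('status', 'pending')
            ->offset(0)   // batch 2 uses offset(50), batch 3 offset(100), ...
            ->limit(50)
            ->get();

        foreach ($leads as $lead) {
            // post to the 3rd party API, then update status...
        }
    }
}

// In the scheduler, each command is protected against overlapping itself:
// $schedule->command('leads:process-batch-1')->everyMinute()->withoutOverlapping();
```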

alex
Moderator

Hey Vijay. I haven't forgotten about this! Just wondering whether the processing you're doing within the commands is being queued, or are you just running these commands normally after the leads have been processed in the queue?

vijay34043

Hey Alex,

Please let me know if you require any additional information. I've provided more details below.

vijay34043

Hey Alex,

I hope you're doing well.

Currently, I'm processing leads with Laravel's scheduler (cron) by executing these commands in the usual way. After doing some research and discussing it with you, I'm considering dispatching these leads onto the queue a second time so they can be processed by the third-party API. Is this the approach you would recommend? My concern is that it may take longer to process everything overall, given that each lead will end up being queued twice. I'd appreciate hearing your thoughts on this.

alex
Moderator

Yes, to be honest this is the approach I’d recommend.

For any large batch processing like this, queuing pushes the work off the main thread of your application, so you won't hit memory limits.

If you try it out, make sure to allow multiple processes to run your queues — this way multiple jobs are processed at the same time, and the second trip through the queue shouldn't cause too much delay.
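To make that concrete, here's a rough sketch of how one command could replace all the per-offset commands. The Lead model, the PostLeadToEsp job, and the status values are placeholders for whatever you actually have; the point is that the command only dispatches one queued job per lead, and your workers then make the API calls in parallel:

```php
<?php
// app/Console/Commands/DispatchPendingLeads.php — one command replaces the
// per-offset commands: it only dispatches jobs; workers do the API calls.

use App\Jobs\PostLeadToEsp; // hypothetical job, sketched below
use App\Models\Lead;        // hypothetical model
use Illuminate\Console\Command;

class DispatchPendingLeads extends Command
{
    protected $signature = 'leads:dispatch-pending';

    public function handle(): void
    {
        // chunkById keeps memory flat even with 50K+ rows, and unlike
        // offset/limit it is safe to use while rows are being updated.
        Lead::where('status', 'pending')->chunkById(500, function ($leads) {
            foreach ($leads as $lead) {
                // Flag the lead first so a second run of this command (or
                // the webhook job) can't dispatch the same lead twice.
                $lead->update(['status' => 'queued']);
                PostLeadToEsp::dispatch($lead);
            }
        });
    }
}

// app/Jobs/PostLeadToEsp.php — one lead per job, retried independently.

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class PostLeadToEsp implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(public Lead $lead) {}

    public function handle(): void
    {
        // ...post $this->lead to the ESP API (e.g. Maropost) here...
        $this->lead->update(['status' => 'sent']);
    }
}
```

With, say, 10 worker processes running, 500 leads get spread across the workers automatically — no offsets and no overlap, because each job is reserved by exactly one worker.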

I’m not at my laptop right now but if you need any configuration advice let me know.

I’d also recommend you use Laravel Horizon if you’re not already, which provides easy configuration for this.
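For reference, scaling workers in Horizon is just configuration — something along these lines in config/horizon.php (the numbers are examples; tune them to your server):

```php
// config/horizon.php (excerpt)
'environments' => [
    'production' => [
        'supervisor-leads' => [
            'connection'   => 'redis',
            'queue'        => ['default'],
            'balance'      => 'auto',
            'maxProcesses' => 10, // up to 10 jobs processed concurrently
            'tries'        => 3,
        ],
    ],
],
```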

Let me know if you need any more help!

vijay34043

Thank you, Alex. I appreciate your help.

I'll try Laravel Horizon. I understand that I could also use Supervisor.
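For anyone finding this later: if you go with plain queue workers instead of Horizon, a typical Supervisor config looks something like this (paths are placeholders) — numprocs controls how many workers run in parallel:

```ini
; /etc/supervisor/conf.d/laravel-worker.conf
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /path/to/your/app/artisan queue:work redis --sleep=3 --tries=3
autostart=true
autorestart=true
numprocs=8
user=www-data
redirect_stderr=true
stdout_logfile=/path/to/your/app/storage/logs/worker.log
```

If you use Horizon instead, Supervisor should run a single `php artisan horizon` process, and Horizon manages the worker count itself.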

alex
Moderator

Great. Let me know how it goes and I’ll be here to help with anything!