In this course, we're going to build out a Livewire CSV importer that handles
00:04
potentially millions of rows from a CSV file. Let's dive in straight away and have a look at how this works. So we're going to start with products. We're also going to be exploring these
00:13
customers as well, so we can see that the importer component we're building works with multiple models, even at the same time. We'll see that in just a second. So let's go ahead and upload a file.
00:25
You can see the maximum is 50 megabytes because we might have quite a few products in this particular document. We've got about 60,000 records, so not a huge amount. But we're going to go ahead and open that up and just wait for that to upload.
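To give a rough idea of how that 50 megabyte limit might be enforced in Livewire (the component and property names here are assumptions for illustration, not necessarily what we'll build later), a component using the WithFileUploads trait can validate the upload with a max rule expressed in kilobytes:

```php
<?php

namespace App\Livewire;

use Livewire\Component;
use Livewire\WithFileUploads;

class CsvImporter extends Component
{
    use WithFileUploads;

    // The uploaded CSV (hypothetical property name).
    public $file;

    public function updatedFile(): void
    {
        // 51200 kilobytes = 50 megabytes; only accept CSV/plain-text uploads.
        $this->validate([
            'file' => ['required', 'file', 'mimes:csv,txt', 'max:51200'],
        ]);
    }

    public function render()
    {
        return view('livewire.csv-importer');
    }
}
```

Depending on the Livewire version, the temporary file upload rules in config/livewire.php may also need raising, since the default temporary upload size limit is lower than 50 megabytes.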
00:38
So once that's done, you can see that we've got this column mapping, which we can define per model. So basically we can choose which columns we're allowing the user to import. And the CSV file headers are going to be
00:51
read from the file so we can easily map them. So we've got the ID here, and these are also required. You can choose whether they're required or not; it doesn't matter.
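The exact shape of this configuration is something we'll build later in the course. Purely as a sketch (the keys and model names here are hypothetical), a per-model definition of importable columns might look something like this, with the CSV headers from the uploaded file then matched against it:

```php
<?php

// Hypothetical per-model configuration: which columns the user may map,
// and whether each column must be mapped before the import can start.
return [
    \App\Models\Product::class => [
        'columns' => [
            'id'    => ['label' => 'ID',    'required' => true],
            'title' => ['label' => 'Title', 'required' => true],
            'price' => ['label' => 'Price', 'required' => false],
        ],
    ],
    \App\Models\Customer::class => [
        'columns' => [
            'id'    => ['label' => 'ID',    'required' => true],
            'name'  => ['label' => 'Name',  'required' => true],
            'email' => ['label' => 'Email', 'required' => true],
        ],
    ],
];
```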
01:01
We've got a title here and we've got a price. So just some really simple data to get us started. So once we go ahead and hit import here, that's going to kick off an import batch and it's going to show us the progress
01:13
of the import. I've purposely set this a little slower so we can play around with it, but you can increase the batch size. I think the batch size here is about 3,000 at the moment. So what we're doing is, as these are getting imported,
01:27
we're upserting them into the database. So if the ID we chose in that column already exists in the database for any reason, it will be ignored. But once we get down to the import job
01:38
itself, if you need to change how this works, you can handle this in any way you want. Now, the really great thing about this importer is that all of this is happening in the background. Most of the time when you're importing stuff, you need to hang around on the page.
01:51
What we can actually do is refresh the page here. You can already see some of this data rolling in. And over on the import, you can see we're just picking back up from the progress that we were at.
02:00
So you can literally close the browser window and just let this run in the background. It's all being handled on the server; we don't need to keep the page open for that. And obviously, that's really good if you're importing a huge amount of records.
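To make the pieces described so far a little more concrete (job batching, upserting on the chosen ID column, and the work carrying on server-side after the page is closed), here is a rough sketch. The class names, column names, and the way rows are chunked are assumptions for illustration, not the exact code from the course:

```php
<?php

namespace App\Jobs;

use App\Models\Product;
use Illuminate\Bus\Batchable;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ImportProductChunk implements ShouldQueue
{
    use Batchable, Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(
        // One chunk of already-mapped rows, e.g. around 3,000 at a time.
        public array $rows,
    ) {}

    public function handle(): void
    {
        // Upsert on the mapped "id" column: rows whose ID already exists
        // are not duplicated; new rows are inserted.
        Product::upsert($this->rows, ['id'], ['title', 'price']);
    }
}
```

Dispatching those jobs as a batch is what lets the progress survive a refresh: the batch ID can be stored and looked up again later.

```php
use App\Jobs\ImportProductChunk;
use Illuminate\Support\Facades\Bus;

// $chunks stands in for the mapped CSV rows, already split into arrays
// of roughly 3,000 rows each (how we build these comes later in the course).
$chunks = [
    [['id' => 1, 'title' => 'Chair', 'price' => 1999]],
];

$batch = Bus::batch(
    collect($chunks)->map(fn (array $rows) => new ImportProductChunk($rows))->all()
)->dispatch();

// Persist $batch->id (for example on an "imports" table row) so the UI
// can find the batch again after a refresh and show its progress.
$progress = Bus::findBatch($batch->id)?->progress(); // integer from 0 to 100
```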
02:14
So we're going to go over to our customers section here and see that when we give this a refresh, we don't see that status for the products import, because these are tied to two different models.
02:27
So we can have both of these going at the same time, just depending on how we've configured our queue. So let's go ahead and import these customers. I think actually we had a huge amount in here.
02:37
Yeah, we've actually got nearly half a million in this products one. I think we've got less in the customers one. You can see there are a few more fields here as well. So let's just choose all of these and we'll go ahead and hit import to
02:49
demonstrate that we can have pretty much two going at the same time if we need to. Now, the way I've set up my queue at the moment, I only have one process for this, but you can increase this. So essentially what's going to happen is
03:00
once this one is finished, then it's going to go on and start to import this. So this one is just waiting for the previous job. But that's just queue configuration. We don't need to do any of that in code.
03:10
We can configure that when we set up our queue. Once again, we can refresh the page here, and the import is just going to stay here and wait until that's finished. We can also import more than one file at once.
03:25
So if you had, for example, some products or customers broken up into two different files, you can go ahead and import these at the same time. It doesn't matter. It's going to allow you to do this.
03:35
So this is actually the same file, but let's imagine they are two different files. We could go ahead and import them as well, and we basically just get two lots of these.
03:43
This will show the file name as well so we can keep track of it. So we can import multiple files, and these will run either at the same time or one after the other. And there we go.
03:53
So it's pretty flexible. And my favorite part about this is that we can refresh the page; we don't need to hang around and wait until this is finished. Now, a really important point about this importer:
04:04
we're setting this up to handle millions of records. So however big the CSV is (obviously within the limits of file uploads and so on), this can handle it, because we're going to be streaming through the file itself.
04:18
And then we're going to be using generators to iterate over all of these records so we don't run into any memory issues. So what we're going to do is get started in the next episode and build this out from scratch.
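As a taste of that streaming approach (a simplified sketch, not the exact implementation we'll write), a generator can yield one CSV row at a time from an open file handle, so the whole file never has to be loaded into memory:

```php
<?php

/**
 * Lazily yield each CSV row as an associative array keyed by the header row.
 * Only one row is held in memory at a time.
 */
function streamCsv(string $path): \Generator
{
    $handle = fopen($path, 'r');

    try {
        $headers = fgetcsv($handle);

        while (($row = fgetcsv($handle)) !== false) {
            yield array_combine($headers, $row);
        }
    } finally {
        fclose($handle);
    }
}

// Usage: iterate the file without ever loading it all at once.
foreach (streamCsv(storage_path('app/products.csv')) as $row) {
    // e.g. ['id' => '1', 'title' => 'Chair', 'price' => '19.99']
}
```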
04:30
Throughout the course, you'll learn a ton about Livewire, and you'll also learn about advanced queue functionality, things like generators, and more. So let's head over to the next episode and get set up.
Overview
Let's build a powerful CSV importer with Livewire, completely from scratch. It can handle millions of rows, be reused for multiple models, and, by using job batches, doesn't require the browser to stay open.
This course is for you if:
You need a robust importer component for your models that you have full control over
You want to brush up on some advanced Livewire concepts
You want to learn about job batching and queues in Laravel