How to Import Large CSV Files in Laravel with AJAX and Progress Bar
Importing large CSV files (over 100,000 rows) into a database can be tricky due to memory limits and PHP timeouts. This tutorial shows you how to import CSV files into a users table in Laravel using AJAX, with a progress bar, chunk processing, and duplicate checking. Optimized for XAMPP, this solution avoids the "Maximum execution time" error. Each code block includes detailed comments explaining every line, perfect for beginners and pros alike.
Features:
- Processes CSV in chunks (1000 rows) for efficiency.
- Displays a progress bar with accurate percentages.
- Skips duplicates based on email.
- Validates required fields (name, email).
- Handles UTF-8 encoding and logs errors.
Follow this step-by-step guide to handle large CSV files in Laravel.
Step 1: Set Up the Database
We need two tables: users for storing name, email, and address, and import_progresses for tracking import progress. Below are the migrations with detailed comments.
Users Table Migration
Creates the users table with required fields.
<?php
// database/migrations/XXXX_XX_XX_create_users_table.php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;
class CreateUsersTable extends Migration
{
public function up()
{
// Create the 'users' table in the database
Schema::create('users', function (Blueprint $table) {
$table->id(); // Auto-incrementing primary key
$table->string('name'); // Name column, required
$table->string('email')->unique(); // Email column, unique for duplicate checking
$table->string('address')->nullable(); // Address column, optional
$table->timestamps(); // Adds created_at and updated_at columns
});
}
public function down()
{
// Drop the 'users' table if migration is rolled back
Schema::dropIfExists('users');
}
}
Run: php artisan migrate. Learn more about Laravel migrations.
Import Progress Table Migration
Tracks file position and progress for resumable imports.
<?php
// database/migrations/XXXX_XX_XX_create_import_progresses_table.php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;
class CreateImportProgressesTable extends Migration
{
public function up()
{
// Create the 'import_progresses' table
Schema::create('import_progresses', function (Blueprint $table) {
$table->id(); // Auto-incrementing primary key
$table->string('file_path'); // Path to the stored CSV file
$table->unsignedBigInteger('file_position')->default(0); // Current position in the file
$table->unsignedBigInteger('rows_processed')->default(0); // Number of rows processed
$table->unsignedBigInteger('total_rows')->default(0); // Total rows in the CSV
$table->string('status')->default('pending'); // Status: pending, processing, completed, failed
$table->text('error_message')->nullable(); // Error message if import fails
$table->timestamps(); // Adds created_at and updated_at columns
});
}
public function down()
{
// Drop the 'import_progresses' table if migration is rolled back
Schema::dropIfExists('import_progresses');
}
}
Run: php artisan migrate.
Step 2: Create the Import Progress Model
Manages import progress data.
<?php
// app/Models/ImportProgress.php
namespace App\Models;
use Illuminate\Database\Eloquent\Model;
class ImportProgress extends Model
{
// Specify fields that can be mass-assigned
protected $fillable = ['file_path', 'file_position', 'rows_processed', 'total_rows', 'status', 'error_message'];
}
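As a quick illustration of how this model is used by the controller later, here is a hedged tinker-style sketch (the path and numbers are made up, not part of the tutorial code):
// Create a progress record for an uploaded file, then advance it after a chunk
$progress = \App\Models\ImportProgress::create([
    'file_path'  => 'csv_uploads/example.csv', // hypothetical stored path
    'total_rows' => 100000,                    // counted when the file is uploaded
]);

$progress->update([
    'file_position'  => 524288, // byte offset returned by ftell()
    'rows_processed' => 1000,   // rows handled so far
    'status'         => 'processing',
]);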
Step 3: Update the User Model
Sets up the User model with the fields we import. Note that Laravel's stock User model extends Authenticatable; either version works here as long as name, email, and address are listed in $fillable and the address column exists.
<?php
// app/Models/User.php
namespace App\Models;
use Illuminate\Database\Eloquent\Model;
class User extends Model
{
// Specify fields that can be mass-assigned
protected $fillable = ['name', 'email', 'address'];
}
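A quick sanity check that mass assignment is wired up (optional tinker-style example; the values are arbitrary):
// Only name, email and address are mass-assignable
\App\Models\User::create([
    'name'    => 'Test User',
    'email'   => 'test@example.com',
    'address' => '1 Example Way',
]);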
Step 4: Create the Controller
The UserImportController handles the CSV upload and chunk processing with AJAX, checking duplicates on email. Each line is commented.
<?php
// app/Http/Controllers/UserImportController.php
namespace App\Http\Controllers;
use App\Models\ImportProgress; // Import the ImportProgress model
use App\Models\User; // Import the User model
use Illuminate\Http\Request; // Import Request class for HTTP requests
use Illuminate\Support\Facades\Log; // Import Log facade for logging
class UserImportController extends Controller
{
public function import(Request $request)
{
// Check if request is not AJAX; if so, render the import view
if (!$request->ajax()) {
return view('import_users'); // Load import_users.blade.php
}
// Validate the uploaded file
$request->validate([
'csvFile' => 'required|file|mimes:csv,txt', // Ensure file is CSV or text
]);
try {
// Get the uploaded file and store it
$file = $request->file('csvFile');
$filePath = $file->store('csv_uploads'); // Save in storage/csv_uploads
$fullPath = storage_path('app/' . $filePath); // Get full file path
// Count total rows, excluding header
$totalRows = 0;
if (($handle = fopen($fullPath, 'r')) !== false) {
fgetcsv($handle); // Skip header row
while (fgetcsv($handle)) {
$totalRows++; // Increment row count
}
fclose($handle); // Close file
}
// Create or retrieve import progress record
$progress = ImportProgress::firstOrCreate(
['file_path' => $filePath], // Unique by file path
[
'file_position' => 0, // Start at file beginning
'rows_processed' => 0, // No rows processed yet
'total_rows' => $totalRows, // Total rows in CSV
'status' => 'pending', // Initial status
]
);
// Return JSON response for AJAX
return response()->json([
'success' => true, // Indicate success
'progress_id' => $progress->id, // Progress record ID
'total_rows' => $totalRows, // Total rows for progress bar
'message' => 'Import started. Processing first chunk...', // Status message
]);
} catch (\Exception $e) {
// Log error and return failure response
Log::error("Import initiation failed: " . $e->getMessage());
return response()->json([
'success' => false, // Indicate failure
'message' => 'Failed to start import: ' . $e->getMessage(), // Error message
], 500); // HTTP 500 status
}
}
public function processChunk(Request $request, $progressId)
{
// Ensure request is AJAX
if (!$request->ajax()) {
return response()->json(['success' => false, 'message' => 'Invalid request'], 400);
}
// Find progress record or fail
$progress = ImportProgress::findOrFail($progressId);
// Check if import is completed
if ($progress->status === 'completed') {
return response()->json([
'success' => true, // Indicate success
'completed' => true, // Import is done
'message' => 'Import completed successfully!', // Completion message
'rows_processed' => $progress->rows_processed, // Rows processed
'total_rows' => $progress->total_rows, // Total rows
]);
}
// Check if import failed
if ($progress->status === 'failed') {
return response()->json([
'success' => false, // Indicate failure
'message' => 'Import failed: ' . $progress->error_message, // Error message
], 500); // HTTP 500 status
}
try {
// Set batch processing parameters
$batchSize = 50; // Rows per database insert
$maxRowsPerChunk = 1000; // Max rows per chunk
$batchData = []; // Store rows for insertion
$seenEmails = []; // Emails already queued in this chunk (catches duplicates inside the same file)
$rowsProcessedInChunk = 0; // Track rows in this chunk
// Open CSV file
$filePath = storage_path('app/' . $progress->file_path);
if (($handle = fopen($filePath, 'r')) !== false) {
// Read raw bytes here; per-value UTF-8 conversion is handled in sanitizeText()
// (a conversion stream filter at this point would make ftell()/fseek() offsets unreliable for resuming)
// Move to current file position
fseek($handle, $progress->file_position);
// Define expected CSV header
$expectedCsvHeader = ['Name', 'Email', 'Address'];
// Validate header for first chunk
if ($progress->file_position == 0) {
$header = fgetcsv($handle); // Read header
$header = array_map('trim', $header); // Trim whitespace
if ($header !== $expectedCsvHeader) {
// Log error and fail import
Log::error("Invalid CSV header: " . json_encode($header) . ", Expected: " . json_encode($expectedCsvHeader));
$progress->update(['status' => 'failed', 'error_message' => 'Invalid CSV header']);
return response()->json([
'success' => false,
'message' => 'Invalid CSV header',
], 500);
}
// Log header validation
Log::info("CSV header validated: " . json_encode($header));
} else {
// Use expected header
$header = $expectedCsvHeader;
}
// Process rows in chunk
$continue = true;
do {
if ($row = fgetcsv($handle)) {
// Count every row read (inserted or skipped) so progress matches total_rows
$rowsProcessedInChunk++;
// Check column count
if (count($header) !== count($row)) {
// Skip row with warning
Log::warning("Skipping row due to column mismatch: " . json_encode($row));
continue;
}
// Map CSV data
$data = array_combine($header, $row);
$mappedData = [
'name' => $data['Name'] ?? null, // Get Name
'email' => $data['Email'] ?? null, // Get Email
'address' => $data['Address'] ?? null, // Get Address
];
// Validate required fields
if (empty($mappedData['name']) || empty($mappedData['email'])) {
// Skip row with warning
Log::warning("Skipping row due to missing fields: " . json_encode($mappedData));
continue;
}
// Sanitize data
$name = $this->sanitizeText($mappedData['name']);
$email = $this->sanitizeText($mappedData['email']);
$address = $this->sanitizeText($mappedData['address']);
// Check for duplicate email (already in the database or already queued in this chunk)
if (in_array($email, $seenEmails, true) || User::where('email', $email)->exists()) {
// Skip duplicate
Log::info("Skipped duplicate row: Email=$email");
continue;
}
$seenEmails[] = $email; // Remember this email for the rest of the chunk
// Add row to batch
$batchData[] = [
'name' => $name, // Sanitized name
'email' => $email, // Sanitized email
'address' => $address, // Sanitized address
'created_at' => now(), // Current timestamp
'updated_at' => now(), // Current timestamp
];
} else {
// End of file
$continue = false;
break;
}
// Stop at chunk size
if ($rowsProcessedInChunk >= $maxRowsPerChunk) {
break;
}
} while ($continue);
// Insert batch data
if (!empty($batchData)) {
try {
// Log insertion attempt
Log::info("Attempting to insert batch of " . count($batchData) . " rows");
// Insert in smaller batches
foreach (array_chunk($batchData, $batchSize) as $chunk) {
User::insert($chunk); // Insert batch
// Log success
Log::info("Inserted batch of " . count($chunk) . " rows");
}
} catch (\Exception $e) {
// Log error
Log::error("Failed to insert batch: " . $e->getMessage() . " Data: " . json_encode($batchData));
}
} else {
// Log no data
Log::warning("No data to insert in this chunk");
}
// Update progress
$progress->update([
'file_position' => ftell($handle), // Current position
'rows_processed' => $progress->rows_processed + $rowsProcessedInChunk, // Update count
'status' => $row ? 'processing' : 'completed', // Update status
]);
// Log chunk processing
Log::info("Processed chunk with $rowsProcessedInChunk rows, total: {$progress->rows_processed}");
// Close file
fclose($handle);
// Return JSON response
return response()->json([
'success' => true, // Indicate success
'completed' => $progress->status === 'completed', // Completion status
'rows_processed' => $progress->rows_processed, // Rows processed
'total_rows' => $progress->total_rows, // Total rows
'message' => $progress->status === 'completed' ? 'Import completed!' : 'Processed chunk, continuing...', // Message
]);
}
} catch (\Exception $e) {
// Update progress on error
$progress->update([
'status' => 'failed', // Set to failed
'error_message' => 'Failed to process chunk: ' . $e->getMessage(), // Store error
]);
// Log error
Log::error("Import failed: " . $e->getMessage());
// Return error response
return response()->json([
'success' => false, // Indicate failure
'message' => 'Import failed: ' . $e->getMessage(), // Error message
], 500); // HTTP 500 status
}
}
private function sanitizeText($value)
{
// Return null if value is null
if (is_null($value)) {
return null;
}
// Trim whitespace
$value = trim($value);
// Return null if empty
if ($value === '') {
return null;
}
// Convert to UTF-8
$value = mb_convert_encoding($value, 'UTF-8', 'UTF-8,ISO-8859-1');
// Remove control characters
$value = preg_replace('/[\x00-\x08\x0B\x0C\x0E-\x1F\x7F-\x9F]/u', '', $value);
// Return sanitized value
return $value;
}
}
Create it with: php artisan make:controller UserImportController. See more on Laravel AJAX.
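If the chunking logic above feels dense, here is the core resume-from-byte-offset pattern on its own, as a minimal plain-PHP sketch (illustrative only; users.csv and the variables are placeholders, not part of the Laravel code):
<?php
// Standalone sketch of the resumable-read pattern used by processChunk()
$path = 'users.csv';         // placeholder file name
$position = 0;               // in the app this comes from import_progresses.file_position
$maxRowsPerChunk = 1000;     // same chunk size as the controller

$handle = fopen($path, 'r');
if ($position === 0) {
    fgetcsv($handle);            // skip the header only on the very first chunk
} else {
    fseek($handle, $position);   // resume exactly where the previous chunk stopped
}

$rows = 0;
while ($rows < $maxRowsPerChunk && ($row = fgetcsv($handle)) !== false) {
    // ...validate, de-duplicate and buffer $row here...
    $rows++;
}

$position = ftell($handle);  // persist this byte offset for the next AJAX request
$finished = feof($handle);   // true once the whole file has been read
fclose($handle);
Each AJAX call repeats this read-a-chunk, save-the-offset cycle until the end of the file, which is why no single request has to stay under PHP's execution time limit.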
Step 5: Create the View
The import_users.blade.php view includes a form and progress bar, using AJAX for uploads and chunk processing.
<!-- resources/views/import_users.blade.php -->
<!DOCTYPE html>
<html>
<head>
<!-- Define HTML5 document -->
<title>CSV User Import</title>
<!-- Include CSRF token for AJAX -->
<meta name="csrf-token" content="{{ csrf_token() }}">
<!-- Load jQuery -->
<script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
<!-- Inline CSS for styling -->
<style>
/* Progress container */
#progress {
margin-top: 20px;
font-family: Arial, sans-serif;
}
/* Progress bar container */
#progress-bar {
width: 100%;
height: 30px;
background: #ddd;
border-radius: 5px;
position: relative;
overflow: hidden;
}
/* Progress bar fill */
#progress-fill {
height: 100%;
background: #4caf50;
width: 0;
transition: width 0.3s ease-in-out;
}
/* Percentage text */
#progress-text {
position: absolute;
top: 50%;
left: 50%;
transform: translate(-50%, -50%);
color: #fff;
font-weight: bold;
text-shadow: 1px 1px 1px rgba(0,0,0,0.5);
}
/* Status message */
#status {
margin-top: 10px;
color: #333;
}
/* Row count display */
#rows-processed-container {
margin-bottom: 10px;
font-size: 16px;
}
</style>
</head>
<body>
<!-- Page heading -->
<h1>Upload User CSV</h1>
<!-- Form for CSV upload -->
<form id="import-form" enctype="multipart/form-data">
<!-- CSRF token -->
@csrf
<!-- File input -->
<input type="file" name="csvFile" accept=".csv,.txt" required>
<!-- Submit button -->
<button type="submit">Upload</button>
</form>
<!-- Progress bar, hidden initially -->
<div id="progress" style="display: none;">
<!-- Row count -->
<p id="rows-processed-container">Rows Processed: <span id="rows-processed">0</span></p>
<!-- Progress bar -->
<div id="progress-bar">
<!-- Progress fill -->
<div id="progress-fill"></div>
<!-- Percentage text -->
<span id="progress-text">0%</span>
</div>
<!-- Status message -->
<p id="status">Initializing...</p>
</div>
<!-- JavaScript for AJAX -->
<script>
$(document).ready(function() {
// Initialize total rows
let totalRows = 0;
// Handle form submission
$('#import-form').on('submit', function(e) {
e.preventDefault(); // Prevent default submission
let formData = new FormData(this); // Create FormData
$('#progress').show(); // Show progress bar
$('#status').text('Uploading file...'); // Set status
$('#rows-processed').text('0'); // Reset count
$('#progress-fill').css('width', '0%'); // Reset bar
$('#progress-text').text('0%'); // Reset percentage
// Send AJAX POST for upload
$.ajax({
url: '{{ route("user.import") }}', // Import route
type: 'POST', // HTTP method
data: formData, // Form data
processData: false, // No data processing
contentType: false, // Browser sets type
headers: {
'X-CSRF-TOKEN': $('meta[name="csrf-token"]').attr('content') // CSRF token
},
success: function(response) {
// Handle success
if (response.success) {
totalRows = response.total_rows || 100000; // Set total rows
console.log('Total rows:', totalRows); // Debug
$('#status').text(response.message); // Update status
processChunk(response.progress_id); // Start chunks
} else {
$('#status').text('Error: ' + response.message); // Show error
}
},
error: function(xhr) {
// Handle error
$('#status').text('Error: ' + (xhr.responseJSON?.message || 'Upload failed'));
}
});
});
// Process chunks via AJAX
function processChunk(progressId) {
$.ajax({
url: '{{ route("user.import.process", ["progressId" => ":progressId"]) }}'.replace(':progressId', progressId), // Chunk route
type: 'GET', // HTTP method
headers: {
'X-CSRF-TOKEN': $('meta[name="csrf-token"]').attr('content') // CSRF token
},
success: function(response) {
// Handle success
if (response.success) {
$('#rows-processed').text(response.rows_processed); // Update count
totalRows = response.total_rows || totalRows; // Update total
// Calculate percentage
let progressPercent = totalRows > 0 ? Math.min((response.rows_processed / totalRows) * 100, 100) : 0;
$('#progress-fill').css('width', progressPercent + '%'); // Update bar
$('#progress-text').text(Math.round(progressPercent) + '%'); // Update percentage
$('#status').text(response.message || 'Processing...'); // Update status
// Continue if not completed
if (!response.completed) {
setTimeout(function() {
processChunk(progressId); // Next chunk
}, 1000); // 1-second delay
}
} else {
$('#status').text('Error: ' + response.message); // Show error
}
},
error: function(xhr) {
// Handle error
$('#status').text('Error: ' + (xhr.responseJSON?.message || 'Chunk processing failed'));
}
});
}
});
</script>
</body>
</html>
Explore more on Laravel Blade.
Step 6: Define Routes
Routes for the import form, upload, and chunk processing.
<?php
// routes/web.php
use App\Http\Controllers\UserImportController; // Import controller
use Illuminate\Support\Facades\Route; // Route facade
// Routes for import form (GET) and upload (POST)
Route::get('/user-import', [UserImportController::class, 'import'])->name('user.import.form'); // Distinct name so it does not collide with the POST route
Route::post('/user-import', [UserImportController::class, 'import'])->name('user.import');
// Route for chunk processing (GET)
Route::get('/user-import/process/{progressId}', [UserImportController::class, 'processChunk'])->name('user.import.process');
Step 7: Test the Import
Test with a sample CSV (users.csv):
Name,Email,Address
John Doe,john@example.com,123 Main St
Jane Smith,jane@example.com,456 Oak Ave
John Doe,john@example.com,789 Pine Rd
The third row is skipped as a duplicate and logged as Skipped duplicate row: Email=john@example.com.
Steps:
- Visit http://localhost/user-import (adjust the URL for your XAMPP setup).
- Upload users.csv.
- Monitor the progress bar and row count.
- Check the result: SELECT COUNT(*) FROM users; (expect 2 rows).
For an automated alternative, see the test sketch below.
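If you'd rather verify the flow automatically, here is a minimal feature-test sketch, assuming Laravel's built-in testing setup with a test database and the route names from Step 6 (the test class name and assertions are illustrative, not part of the tutorial code; note it stores the uploaded file on the real local disk because the controller reads it via storage_path):
<?php
// tests/Feature/UserImportTest.php (hypothetical file name)
namespace Tests\Feature;

use App\Models\User;
use Illuminate\Foundation\Testing\RefreshDatabase;
use Illuminate\Http\UploadedFile;
use Tests\TestCase;

class UserImportTest extends TestCase
{
    use RefreshDatabase; // Reset the test database for each test

    public function test_csv_upload_imports_unique_rows()
    {
        // Build an in-memory CSV matching the expected header
        $csv = "Name,Email,Address\n"
             . "John Doe,john@example.com,123 Main St\n"
             . "Jane Smith,jane@example.com,456 Oak Ave\n";
        $file = UploadedFile::fake()->createWithContent('users.csv', $csv);

        // Upload via AJAX (the controller checks $request->ajax())
        $response = $this->post(route('user.import'), ['csvFile' => $file], ['X-Requested-With' => 'XMLHttpRequest']);
        $response->assertOk()->assertJson(['success' => true, 'total_rows' => 2]);

        // Process the first (and only) chunk
        $progressId = $response->json('progress_id');
        $this->get(route('user.import.process', ['progressId' => $progressId]), ['X-Requested-With' => 'XMLHttpRequest'])
            ->assertOk()
            ->assertJson(['success' => true, 'completed' => true]);

        // Both unique rows should now be in the users table
        $this->assertSame(2, User::count());
    }
}
Run it with php artisan test (or vendor/bin/phpunit).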
Step 8: Debugging Tips
Troubleshoot your Laravel CSV import:
- Logs: Check storage/logs/laravel.log for:
  - Skipped duplicate row: the duplicate check is working.
  - No data to insert: every row in the chunk was skipped (check your CSV).
  - Invalid CSV header: verify the header is exactly Name,Email,Address.
- Inserts: Run SELECT COUNT(*) FROM users; to confirm rows were written.
- Progress Bar: Check the browser console (F12) for Total rows: <number>.
- Timeouts: In php.ini, set max_execution_time=60 and memory_limit=512M, then restart Apache (you can verify the values with the check below).
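A quick way to confirm which limits are actually in effect, for example from php artisan tinker or a temporary route (illustrative snippet; these are all standard PHP ini settings):
// Print the PHP limits that matter most for large CSV uploads
echo 'max_execution_time: '  . ini_get('max_execution_time')  . PHP_EOL; // aim for 60 or higher
echo 'memory_limit: '        . ini_get('memory_limit')        . PHP_EOL; // aim for 512M or higher
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . PHP_EOL; // must be large enough for your CSV
echo 'post_max_size: '       . ini_get('post_max_size')       . PHP_EOL; // must be >= upload_max_filesize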
Conclusion
This guide showed you how to import large CSV files in Laravel with AJAX, including a progress bar and duplicate checking. The commented code is ready to use, and the chunk processing ensures scalability. Have questions or issues? Leave a comment below or subscribe for more Laravel tutorials!