How to Optimize PHP Scripts That Handle Large Datasets to Avoid Memory Exhaustion

etd_admin, March 24, 2025

When working with large datasets in PHP, memory exhaustion is a common problem. If a script loads too much data at once, it can consume the entire memory_limit and die with a fatal "Allowed memory size exhausted" error. To avoid this, you need to optimize PHP scripts that handle large datasets using the techniques below.

Use Generators Instead of Arrays

Instead of loading an entire dataset into memory, use generators (functions built on yield). A generator produces one item at a time, so only the current item needs to be in memory.

function readLargeFile($filePath) {
    $handle = fopen($filePath, 'r');
    if ($handle === false) {
        throw new Exception("Cannot open file: $filePath");
    }

    try {
        while (($line = fgets($handle)) !== false) {
            yield $line; // Hands back one line at a time
        }
    } finally {
        fclose($handle); // Runs even if the caller stops iterating early
    }
}

foreach (readLargeFile('data.txt') as $line) {
    echo $line;
}

Generators keep only the current line in memory instead of the entire file.
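
To see the difference on your own data, a quick, illustrative measurement with memory_get_usage() works; data.txt is assumed to exist:

$before = memory_get_usage();
$lines = file('data.txt'); // Loads every line into an array at once
printf("file():    %.1f MB\n", (memory_get_usage() - $before) / 1048576);
unset($lines);

$before = memory_get_usage();
foreach (readLargeFile('data.txt') as $line) {
    // Only the current $line is held in memory
}
printf("generator: %.1f MB\n", (memory_get_usage() - $before) / 1048576);

The array version grows with the file; the generator version should stay near zero.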

Use Database Pagination Instead of Fetching Everything

If you’re fetching data from a database, avoid retrieving all records at once. Use pagination (LIMIT and OFFSET) to process chunks of data.

$pdo = new PDO("mysql:host=localhost;dbname=testdb", "user", "password");
$limit = 1000;
$offset = 0;

// Prepare once; a stable ORDER BY is needed for consistent pages
// (an auto-increment id column is assumed here)
$stmt = $pdo->prepare("SELECT * FROM large_table ORDER BY id LIMIT :limit OFFSET :offset");

while (true) {
    $stmt->bindValue(':limit', $limit, PDO::PARAM_INT);
    $stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
    $stmt->execute();

    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    if (empty($rows)) {
        break;
    }

    foreach ($rows as $row) {
        // Process each row
    }

    $offset += $limit;
}

This keeps memory usage bounded: each iteration holds at most 1000 rows instead of the whole table.
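
One caveat: MySQL still has to scan and discard every skipped row, so large OFFSET values get progressively slower. If the table has a sequential primary key, keyset ("seek") pagination avoids that cost. A minimal sketch, assuming an auto-increment id column:

$pdo = new PDO("mysql:host=localhost;dbname=testdb", "user", "password");
$limit = 1000;
$lastId = 0;

$stmt = $pdo->prepare("SELECT * FROM large_table WHERE id > :lastId ORDER BY id LIMIT :limit");

while (true) {
    $stmt->bindValue(':lastId', $lastId, PDO::PARAM_INT);
    $stmt->bindValue(':limit', $limit, PDO::PARAM_INT);
    $stmt->execute();

    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if (empty($rows)) {
        break;
    }

    foreach ($rows as $row) {
        // Process each row
    }

    $lastId = end($rows)['id']; // Resume after the last row of this chunk
}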

Stream Large Files Instead of Reading Them All at Once

If your script processes large files (e.g., CSV, JSON), avoid file_get_contents() because it loads the entire file into memory. Instead, use fopen() with fgets(), or fgetcsv() for CSV data.

$handle = fopen('large.csv', 'r');
if ($handle === false) {
    throw new Exception("Cannot open large.csv");
}

while (($row = fgetcsv($handle)) !== false) {
    // Process each row (an array of column values)
    print_r($row);
}

fclose($handle);

fgetcsv() reads one row per call, so memory usage stays low no matter how large the file is.
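
The same idea applies to large JSON files, but json_decode() needs the whole document in memory first. A streaming parser can decode items lazily; a minimal sketch using the third-party JSON Machine library (halaxa/json-machine), assuming it has been installed via Composer:

require 'vendor/autoload.php';

use JsonMachine\Items;

// Iterates the top-level items of large.json one at a time
foreach (Items::fromFile('large.json') as $key => $item) {
    // Only the current decoded $item is held in memory
}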

Optimize PHP’s Memory Settings

If necessary, increase PHP’s memory limit cautiously. Edit php.ini or use:

ini_set('memory_limit', '512M'); // Adjust as needed

However, raising the limit only postpones the problem and should be a last resort. Prefer the streaming and chunking techniques described above.
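
Before raising the limit, check what the script actually needs; both helpers below are built into PHP:

echo ini_get('memory_limit') . "\n";            // Configured limit, e.g. "128M"
echo memory_get_peak_usage(true) . " bytes\n";  // Peak memory actually used so far

If the peak is far below the limit, memory is not the bottleneck; if it tracks the dataset size, one of the streaming techniques above is the real fix.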

Use Unbuffered Queries for Large Result Sets

By default, the pdo_mysql driver buffers the entire result set in PHP's memory as soon as a query executes. Set PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false to disable this buffering.

$pdo = new PDO("mysql:host=localhost;dbname=testdb", "user", "password", [
    PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false
]);

$stmt = $pdo->query("SELECT * FROM large_table");
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // Process each row
}

With buffering disabled, each fetch() pulls one row from the server instead of holding the whole result set in memory.
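
One trade-off to keep in mind: while an unbuffered result set is open, the same MySQL connection cannot execute other queries. Fetch every row, or call $stmt->closeCursor(), before issuing the next statement on that connection.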
