
How to Handle Large Result Sets: Streaming Reads

M66 2025-05-17

When developing PHP applications, especially ones that process large amounts of data, fetching rows from the database without exhausting memory is a common and important problem. mysqli_result is the class PHP uses to represent query results, but with a large result set, failing to use a streaming read often drives memory usage up until the script runs out of memory.

This article explains how to combine mysqli_result with streaming reads to process large result sets efficiently and avoid memory exhaustion.

What is a streaming read?

A streaming read fetches data row by row rather than loading the entire result into memory at once. This lets us work with data sets that are too large to load in one go, avoiding the risk of excessive memory usage. Streaming reads are especially useful when a query returns a large volume of data.

mysqli supports streaming reads through the MYSQLI_USE_RESULT mode: pass it when executing the query, then process the result set row by row. Unlike MYSQLI_STORE_RESULT (the default), which transfers the entire result set into PHP's memory, MYSQLI_USE_RESULT fetches rows from the server one at a time, which can significantly reduce memory usage.

Using mysqli_result to stream data

Suppose we have a query that returns a large amount of data. The following example demonstrates how to process it with a streaming read:

<?php
// Database connection configuration
$host = 'localhost';
$user = 'root';
$password = 'password';
$dbname = 'example_database';

// Create a database connection
$conn = new mysqli($host, $user, $password, $dbname);

// Check whether the connection succeeded
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}

// Execute the query with MYSQLI_USE_RESULT to enable a streaming (unbuffered) read
$query = "SELECT id, name, email FROM users WHERE status = 'active'";
$result = $conn->query($query, MYSQLI_USE_RESULT);

// Check whether the query succeeded
if ($result === false) {
    die("Query failed: " . $conn->error);
}

// Read the rows one at a time
while ($row = $result->fetch_assoc()) {
    // Process each row of data (escape values before printing them as HTML)
    echo "ID: " . $row['id']
        . " - Name: " . htmlspecialchars($row['name'])
        . " - Email: " . htmlspecialchars($row['email']) . "<br>";
}

// Free the result set
$result->free();

// Close the database connection
$conn->close();
?>

Code walkthrough:

  1. Database connection : We first create a connection to the MySQL database.

  2. Query execution : The query is executed with MYSQLI_USE_RESULT as the second argument, so the result set is not loaded into memory all at once; rows are fetched from the server on demand. Note that while an unbuffered result set is open, the same connection cannot run another query until the result has been fully read or freed.

  3. Row-by-row reading : The fetch_assoc() method reads the query results one row at a time, so each row can be processed individually.

  4. Resource release : After the data has been processed, free() is called to release the result set.

  5. Connection close : Finally, the database connection is closed.

With the code above, we largely avoid the risk of memory exhaustion, even when the query result set is very large.

Tips for optimizing streaming reads

A few techniques can further improve performance, especially when the result set is very large:

  1. Limit the size of each query's result set : Paging keeps any single query bounded. For example, use LIMIT to cap the number of rows returned and read the data in batches:

     $offset = 0;
     $limit = 1000;
     $query = "SELECT id, name, email FROM users WHERE status = 'active' LIMIT $offset, $limit";

    By adjusting offset and limit we can fetch the data batch by batch instead of pulling an enormous result set in a single query (a full batching sketch follows this list).

  2. Index optimization : Make sure the relevant tables have appropriate indexes, so queries run efficiently and put less load on the database.

  3. Process data in chunks : While reading, process rows incrementally and release memory you no longer need as soon as possible. For example, if the processing involves file operations, write to the file after every fixed number of rows instead of keeping all the data in memory (see the chunked-write sketch after this list).

  4. Background task processing : For very large data sets, consider moving the work to a background job, for example via a queue system (such as RabbitMQ), or splitting the task into multiple batches for asynchronous processing.
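To make the paging tip (point 1 above) concrete, here is a minimal batching sketch. It assumes the same $conn connection, users table, and column names as the main example, and it binds offset and limit through a prepared statement rather than interpolating them into the SQL; note that mysqli_stmt::get_result requires the mysqlnd driver.

<?php
// Batching sketch: read rows in chunks of $limit until a batch comes back
// short. Assumes $conn is an open mysqli connection as in the main example.
$limit  = 1000;
$offset = 0;

$stmt = $conn->prepare(
    "SELECT id, name, email FROM users WHERE status = 'active' LIMIT ?, ?"
);

do {
    $stmt->bind_param('ii', $offset, $limit);
    $stmt->execute();
    $result = $stmt->get_result(); // requires the mysqlnd driver

    $rowCount = $result->num_rows;
    while ($row = $result->fetch_assoc()) {
        // Process each row here
        echo "ID: " . $row['id'] . "\n";
    }
    $result->free();

    $offset += $limit;
} while ($rowCount === $limit); // a short batch means we reached the end

$stmt->close();
?>

One design note: OFFSET-based paging slows down as the offset grows, because the server still has to scan the skipped rows. For very deep pagination, keyset paging (e.g. WHERE id > last_seen_id ORDER BY id LIMIT n) usually scales better.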
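To illustrate point 3, the following sketch processes rows in chunks during a hypothetical CSV export: rows are buffered and flushed to disk every $chunkSize rows, so memory never holds more than one chunk at a time. It assumes $result is an unbuffered result set obtained with MYSQLI_USE_RESULT as in the main example; the output filename is illustrative.

<?php
// Chunked-write sketch: flush buffered rows to a CSV file every $chunkSize
// rows instead of accumulating the whole result set in memory.
// Assumes $result was obtained with MYSQLI_USE_RESULT as in the main example;
// the output path is hypothetical.
$chunkSize = 500;
$buffer    = [];
$fp        = fopen('users_export.csv', 'w');

while ($row = $result->fetch_assoc()) {
    $buffer[] = $row;

    if (count($buffer) >= $chunkSize) {
        foreach ($buffer as $r) {
            fputcsv($fp, $r);
        }
        $buffer = []; // drop the processed rows so memory stays bounded
    }
}

// Write whatever remains in the final partial chunk
foreach ($buffer as $r) {
    fputcsv($fp, $r);
}
fclose($fp);
?>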

Conclusion

Combining mysqli_result with streaming reads effectively avoids memory exhaustion when dealing with large data sets. Through sensible query design, streaming reads, and paging, we can process large result sets efficiently in PHP while keeping the system performant and stable.

I hope the techniques in this article help you handle large query result sets in real-world development and improve your application's stability and efficiency.