PHP Performance Optimization: Beyond OPcache
Deep dive into advanced PHP performance techniques including memory management, profiling tools, and architectural patterns that go beyond basic caching.
Everyone enables OPcache and calls it a day. That’s fine for small apps, but if you’re pushing real traffic through PHP, you need to look deeper. Here’s what actually matters once OPcache is already running.
Memory Patterns That Kill Performance
PHP’s memory model trips people up constantly. It uses reference counting backed by a cycle-collecting garbage collector, which sounds fine until you write code that fights against it.
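A minimal sketch of the two mechanisms (sizes and variable names are just for illustration): most values are freed the moment their refcount hits zero, but objects that reference each other form cycles that sit around until the collector runs.

```php
$a = str_repeat('x', 1_000_000);
unset($a); // refcount hits zero, memory is released immediately

$left = new stdClass();
$right = new stdClass();
$left->other = $right;  // the two objects now reference each other
$right->other = $left;
unset($left, $right);   // refcounts never reach zero: a cycle remains

$freed = gc_collect_cycles(); // force the cycle collector to run
echo "collected {$freed} cycles\n";
```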
Here’s a common mistake:
```php
// Inside a class: transform() and shouldProcess() are methods,
// otherwise the [$this, ...] callables won't work.
public function processLargeDatasetBad(array $data): array
{
    $allItems = array_map([$this, 'transform'], $data);       // full copy #1
    return array_filter($allItems, [$this, 'shouldProcess']); // full copy #2
}
```
This transforms the entire dataset in memory, then filters it. If you’re processing 100,000 rows, you’re holding up to 200,000 items in memory at peak. It’s wasteful and it shows up in your memory graphs.
Better approach:
```php
public function processLargeDataset(array $data): array
{
    $result = [];
    foreach ($data as $item) {
        if ($this->shouldProcess($item)) {
            $result[] = $this->transform($item);
        }
    }
    return $result;
}
```
Filter first, transform second. Instead of two full intermediate arrays you hold the input plus only the items that survive the filter, so peak memory drops sharply. This is the kind of thing that doesn’t matter in development with toy datasets but kills you in production.
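If you’d rather measure than reason about it, a rough harness makes the difference visible. This sketch assumes PHP 8.2+ for memory_reset_peak_usage() and that the two methods above live on some $repo object:

```php
$data = range(1, 100_000);

memory_reset_peak_usage();
$repo->processLargeDatasetBad($data);
echo 'transform-then-filter: ', memory_get_peak_usage(true), " bytes peak\n";

memory_reset_peak_usage();
$repo->processLargeDataset($data);
echo 'filter-then-transform: ', memory_get_peak_usage(true), " bytes peak\n";
```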
Profiling Tools That Actually Help
Xdebug is slow as hell when it’s loaded, but its profiling mode is useful when you need it. In Xdebug 3 you turn it on with configuration rather than function calls:

```ini
; php.ini (Xdebug 3)
xdebug.mode=profile
; only profile requests that send an XDEBUG_TRIGGER parameter or cookie
xdebug.start_with_request=trigger
xdebug.output_dir=/tmp/xdebug
```

Each profiled request writes a cachegrind file you can analyze with KCachegrind, QCachegrind, or similar tools. The overhead is significant, so don’t run this on production traffic. Use it locally to find the hot paths, then optimize those.
For production, Blackfire.io is less intrusive:
```php
// Requires the Blackfire probe extension to be installed.
$probe = \BlackfireProbe::getMainInstance();

$probe->enable();
$this->criticalPathCode();
$probe->disable();
```
It lets you profile in prod without destroying your response times. Worth the money if you’re dealing with performance-sensitive code.
Database Queries: The Usual Suspect
Most PHP performance problems trace back to database calls. Either you’re making too many, or the ones you’re making are slow.
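The "too many" case is usually an N+1 pattern: one query for a list, then one more query per row. Collapse it into a single IN () lookup. A sketch with illustrative table and column names, assuming $orders is non-empty:

```php
// N+1: one round trip per order just to fetch its user.
foreach ($orders as $order) {
    $stmt = $pdo->prepare('SELECT * FROM users WHERE id = ?');
    $stmt->execute([$order['user_id']]);
    $users[$order['user_id']] = $stmt->fetch(PDO::FETCH_ASSOC);
}

// Better: one round trip for all of them.
$ids = array_unique(array_column($orders, 'user_id'));
$placeholders = implode(',', array_fill(0, count($ids), '?'));
$stmt = $pdo->prepare("SELECT * FROM users WHERE id IN ($placeholders)");
$stmt->execute(array_values($ids));

$users = [];
foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $user) {
    $users[$user['id']] = $user;
}
```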
Prepared statements help, but you need to use them correctly:
```php
class OptimizedUserRepository
{
    public function __construct(private PDO $pdo) {}

    public function findActiveUsers(): array
    {
        $stmt = $this->pdo->prepare('
            SELECT id, email, created_at
            FROM users
            WHERE status = :status
              AND created_at > :since
        ');
        $stmt->bindValue(':status', 'active', PDO::PARAM_STR);
        $stmt->bindValue(':since', $this->getThresholdDate(), PDO::PARAM_STR);
        $stmt->execute(); // easy to forget: prepare() alone never runs the query
        return $stmt->fetchAll(PDO::FETCH_ASSOC);
    }
}
```
Explicit types on bindValue keep the driver from guessing and avoid silent coercions; a type mismatch between a bound value and the column it’s compared against can stop MySQL from using an index. Small detail, but it compounds when you’re running millions of queries.
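For the same reason, bind integers as integers. A short sketch (column names illustrative):

```php
$stmt = $this->pdo->prepare('SELECT id, email FROM users WHERE id = :id');
$stmt->bindValue(':id', $userId, PDO::PARAM_INT); // ships an int, not '42'
$stmt->execute();
$user = $stmt->fetch(PDO::FETCH_ASSOC);
```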
If you’re doing async work, ReactPHP’s MySQL connector helps:
```php
use React\MySQL\Factory;
use React\MySQL\QueryResult;

$factory = new Factory();

// "Lazy" means the actual connection is opened on first use.
$connection = $factory->createLazyConnection('mysql://user:pass@localhost/db');

$connection->query('SELECT * FROM users WHERE active = 1')
    ->then(function (QueryResult $result) {
        return $result->resultRows;
    });
```
This lets you run multiple queries concurrently instead of waiting on each one sequentially. Good for aggregating data from multiple sources.
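A hedged sketch of that pattern using react/promise’s all() to combine results (query text illustrative; on a single connection the queries are pipelined rather than truly parallel, but nothing blocks while they run):

```php
use function React\Promise\all;

all([
    'users'  => $connection->query('SELECT COUNT(*) AS n FROM users'),
    'orders' => $connection->query('SELECT COUNT(*) AS n FROM orders'),
])->then(function (array $results) {
    echo $results['users']->resultRows[0]['n'], ' users, ',
         $results['orders']->resultRows[0]['n'], " orders\n";
});
```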
Generators for Memory-Heavy Operations
If you’re processing large files or datasets, generators keep memory usage flat:
```php
function readLargeFile(string $filename): \Generator
{
    $handle = fopen($filename, 'r');
    if ($handle === false) {
        throw new \RuntimeException("Cannot open {$filename}");
    }
    try {
        while (($line = fgets($handle)) !== false) {
            yield trim($line);
        }
    } finally {
        fclose($handle); // runs even if the caller stops iterating early
    }
}

// Process millions of records with constant memory
foreach (readLargeFile('massive-dataset.txt') as $line) {
    processLine($line);
}
```
Without generators, you’d need to load the entire file into an array first. With generators, you’re only holding one line in memory at a time. The difference shows up immediately once your datasets get large enough.
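The same trick works for query results. A sketch assuming pdo_mysql with buffering turned off (PDO::MYSQL_ATTR_USE_BUFFERED_QUERY set to false), since otherwise the driver pulls the whole result set client-side regardless:

```php
function streamUsers(PDO $pdo): \Generator
{
    $stmt = $pdo->query('SELECT id, email FROM users');
    while (($row = $stmt->fetch(PDO::FETCH_ASSOC)) !== false) {
        yield $row; // one hydrated row at a time
    }
}

foreach (streamUsers($pdo) as $user) {
    // handle one row, then let it go out of scope
}
```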
Array Operations That Don’t Suck
This pattern shows up everywhere and it’s terrible:
```php
$result = [];
foreach ($arrays as $array) {
    $result = array_merge($result, $array); // creates a brand-new array each pass
}
```
Every array_merge call allocates a fresh array and copies everything accumulated so far into it. Merging 1,000 arrays of 100 elements this way copies on the order of 1,000 × 999 / 2 × 100 ≈ 50 million elements to produce a 100,000-element result.
Use argument unpacking instead (this has worked since PHP 5.6 for list-style arrays):
```php
$result = [];
foreach ($arrays as $array) {
    array_push($result, ...$array); // appends in place, no intermediate copies
}
```
Simpler still, hand everything to a single array_merge call so the copying happens once, and on PHP 8.1+ you can spread string-keyed arrays directly inside array literals. A quick sketch (variable names illustrative):
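```php
// One-shot merge: a single pass instead of 1,000 incremental copies.
$result = array_merge(...$arrays);

// PHP 8.1+: spread with string keys works inside array literals too.
$config = [...$defaults, ...$overrides];
```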
What Actually Matters
Profile first. Every application has different bottlenecks, and optimizing the wrong thing wastes time. Run Blackfire on your production traffic for a week, identify the top 5 slowest endpoints, then optimize those.
Most PHP performance problems come down to:
- Too many database queries (N+1 problems)
- Inefficient memory usage (copying data unnecessarily)
- Missing indexes on the columns you filter and join on
- Loading more data than you need
Fix those and you’ve solved 90% of your performance issues. The rest is micro-optimizations that only matter at extreme scale.
OPcache gets you 70% of the way there. These techniques handle the remaining 30% when you actually need them.