Joe Rushton Fullstack Web Developer

How To: Improve performance when working with MySQL and PHP

3rd April 2017 Reading Time: 7 mins

These basic tips will not only help increase your website's popularity through faster page load times, but will also improve your application's scalability. The title may imply that this is strictly for MySQL and PHP, but realistically these tips apply to any RDBMS (MySQL, Oracle, MS SQL Server, etc.) and any programming/scripting language that interacts with it.


1) Creating multiple records at once

You've probably used the INSERT statement before, but if you haven't, expect it to look something like this:

INSERT INTO `users` (name, email) VALUES ('joe', 'joebloggs@gmail.com')

This creates a new record in the users table with a name of "joe" and an email of "joebloggs@gmail.com". If you wanted to create several different users in a single page request, there are two main ways to do so.

INSERT INTO `users` (name, email) VALUES ('ronnie', 'ronniecoleman@hotmail.com')

INSERT INTO `users` (name, email) VALUES ('johnny', 'johnnycolemane@gmail.com')

Either run multiple queries as shown above, or use the more efficient method: concatenate multiple sets of values into a single SQL statement, which could look like this:

INSERT INTO `users` (name, email) VALUES ('ronnie', 'ronniecoleman@hotmail.com'), ('johnny', 'johnnycolemane@gmail.com')

Now for a PHP demonstration to give you some application. The following method is a very simplified concept of storing image references in a database table. The idea is that we build up an SQL query through string concatenation by looping through an array of data, which could have come from a user selecting or uploading images.

// assume this is a method on a class with a mysqli connection in $this->db
function storeImages($images = array()) {
	$sql = "INSERT INTO images (`post_id`, `filename`) VALUES";

	foreach ($images as $row) {
		// cast the id and quote the filename so MySQL treats it as a string
		$sql .= " (" . (int) $row['post_id'] . ", '" . $row['filename'] . "'),";
	}

	// strip the trailing comma left behind by the loop
	$sql = rtrim($sql, ',');

	$this->db->query($sql);

	// query() returns a boolean for INSERTs, so ask the connection how many rows were written
	return $this->db->affected_rows;
}
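Building the VALUES list from raw strings as above is vulnerable to SQL injection, so in practice you'd want prepared statements. Multi-row inserts still work with them: you generate one placeholder group per row and let the driver handle escaping. The helper below is a hypothetical sketch (not from the original function); it only builds the SQL string, with the actual binding left to mysqli's bind_param.

```php
<?php
// Build a multi-row INSERT with "?" placeholders, one group per row.
// Table and column names here are assumed from the example above.
function buildMultiInsert(string $table, array $columns, int $rowCount): string {
    // one placeholder per column, e.g. "(?, ?)"
    $group = '(' . implode(', ', array_fill(0, count($columns), '?')) . ')';

    // repeat the group once per row, e.g. "(?, ?), (?, ?)"
    $groups = implode(', ', array_fill(0, $rowCount, $group));

    return sprintf(
        'INSERT INTO `%s` (`%s`) VALUES %s',
        $table,
        implode('`, `', $columns),
        $groups
    );
}

echo buildMultiInsert('images', ['post_id', 'filename'], 2);
// INSERT INTO `images` (`post_id`, `filename`) VALUES (?, ?), (?, ?)
```

You'd then flatten your rows into a single argument list and pass it to the prepared statement in one execute, keeping the single-query performance win without the injection risk.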


2) Avoid executing queries inside loops

I'm definitely guilty of having written something like this in the early days of my job:

// assume we've just done a SELECT on the posts table to get the $posts array
foreach ($posts as &$post) {
   // find all comments matching the post_id (cast to int so we don't inject raw values)
   $comments = $db->query("SELECT * FROM comments WHERE post_id = " . (int) $post['id']);

   // populate the 'comments' array key of the current post record (notice the & before $post in the foreach)
   while ($row = $comments->fetch_assoc()) {
      $post['comments'][] = $row;
   }
}

The end result: the `$posts` array contains posts along with all of their comments. The downside? A query is executed for every single post, which on a very small scale wouldn't matter too much, but as soon as we get into the tens, hundreds or thousands of posts it starts to get really slow, and it's generally bad practice (commonly known as the N+1 query problem).

Some better alternatives are to use a JOIN, or to fetch all the post data and comment data separately and link them up in PHP. A SQL JOIN is better in most cases and may look like this:

SELECT * FROM posts LEFT JOIN comments ON posts.id = comments.post_id
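The "link them up through PHP" alternative can be sketched as two queries: fetch every comment for the page's posts at once (e.g. `SELECT * FROM comments WHERE post_id IN (1, 2, 3)`), then group the rows by post_id in PHP. A minimal sketch, with the row shape assumed from the example above:

```php
<?php
// Group a flat list of comment rows by post_id, so each post can pick up
// its own comments without running a query per post.
function groupCommentsByPost(array $comments): array {
    $grouped = [];
    foreach ($comments as $comment) {
        $grouped[$comment['post_id']][] = $comment;
    }
    return $grouped;
}

// rows as they might come back from a single IN() query
$comments = [
    ['post_id' => 1, 'body' => 'First!'],
    ['post_id' => 2, 'body' => 'Nice post'],
    ['post_id' => 1, 'body' => 'Agreed'],
];

$byPost = groupCommentsByPost($comments);
// $byPost[1] holds two comments, $byPost[2] holds one
```

Each post then grabs its comments with a cheap array lookup (`$byPost[$post['id']] ?? []`), so the total query count stays at two no matter how many posts there are.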

3) Indexing large tables

If you have a table with tens of thousands of rows, adding just one index on a column can speed up SELECT query performance significantly. The obvious downside is that you sacrifice some write (INSERT/UPDATE/DELETE) performance, so it depends entirely on your use case and getting the right balance.

Queries that fetch large result sets or are repeated numerous times within a single request would most likely benefit from some trial-and-error indexing. A good place to start is by looking at the WHERE clause, establishing which column(s) you're filtering on, and applying indexes to them, e.g.:

CREATE INDEX basket_index
ON baskets (basket_id, user_id);
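To check whether MySQL is actually using your new index, prefix the query with EXPLAIN and look at the `key` column in the output; if it shows your index name rather than NULL, the index is being used. The table and values below are just illustrative:

```sql
EXPLAIN SELECT * FROM baskets
WHERE basket_id = 42 AND user_id = 7;
```

Note that a composite index like basket_index above is used left-to-right, so it helps queries filtering on basket_id (or both columns), but not queries filtering on user_id alone.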


4) Swap GROUP BY for DISTINCT where possible

Sometimes switching out your GROUP BY clause for a SELECT DISTINCT can hugely improve performance. In one case I've had, it reduced a query from 10s to 0.01s and the result set was exactly the same. Functionally they are the same when your query doesn't use any aggregate functions (e.g. MIN(), COUNT()); however, from experience I've noticed DISTINCT perform faster on larger data sets, especially those involving subqueries.
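As a quick illustration of the equivalence (table and column names here are made up for the example), both of these return one row per customer_id when no aggregates are involved:

```sql
-- deduplicating via GROUP BY
SELECT customer_id FROM orders GROUP BY customer_id;

-- same result set, expressed as DISTINCT
SELECT DISTINCT customer_id FROM orders;
```

If you later add an aggregate such as COUNT(*), you have to go back to GROUP BY; the swap only applies when you're purely deduplicating rows.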

5) Fetch the least amount of rows required

This is quite an obvious suggestion, but sometimes the obvious needs stating. Use conditions (WHERE) to narrow down the result set rather than trying to process the data in PHP. Use conditions in the ON clause rather than the WHERE clause if they apply to the joined table. Example:

SELECT posts.id, posts.title, posts.body FROM posts

INNER JOIN categories ON posts.category_id = categories.id

WHERE categories.featured = 1

Notice the WHERE clause contains a condition on the joined table. For INNER JOINs the optimizer usually treats both forms the same, but keeping join-table conditions in the ON clause makes the intent clearer, and for LEFT/OUTER joins the placement actually changes the results:

SELECT posts.id, posts.title, posts.body FROM posts

INNER JOIN categories ON posts.category_id = categories.id AND categories.featured = 1