How to have multiple unique columns in SingleStore
Jack Ellis · Jan 4, 2023

You've used MySQL for many years and have multiple unique keys set up on your tables. Then your data grows, you say goodbye to MySQL, and you move to SingleStore. But hold on a minute: SingleStore only allows one unique key per table. And if you're using an auto-incrementing ID, which most of us are in the Laravel world, that one unique key is already spoken for.
The simple solution
Before I get into this, I will give all credit to Carl Sverre. I told him how I was enforcing uniqueness via atomic locks, and he knew of a better way. Let's get to it.
Step 1: Modify your users table
Our users table uses the standard create_users_table migration with one modification: we've removed the unique() method from the email field, so the table itself won't enforce uniqueness.
Schema::create('users', function (Blueprint $table) {
    $table->id();
    $table->string('name');
    $table->string('email');
    $table->timestamp('email_verified_at')->nullable();
    $table->string('password');
    $table->rememberToken();
    $table->timestamps();
});
Step 2: Create your users_emails table
So this is where the beauty comes in. We can't enforce email uniqueness on the users table, but we can enforce it on another table. Stay with me, folks.
Schema::create('users_emails', function (Blueprint $table) {
    $table->string('email');
    $table->primary('email');
});
Step 3: Creating a user
We now have two tables involved when creating a user. To keep this simple, let's imagine we have a helper function called registerUser somewhere in our application. This is what we might do:
function registerUser($email, $password)
{
    try {
        DB::transaction(function () use ($email, $password) {
            // We create our user as usual
            User::forceCreate([
                'email' => $email,
                'password' => Hash::make($password),
            ]);

            // Now, here's the beautiful part:
            // if the email already exists in this table,
            // it will throw an exception, enforcing uniqueness
            DB::table('users_emails')
                ->insert(['email' => $email]);
        });
    } catch (\Exception $e) {
        // Handle the exception
    }
}
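Inside that catch block you'll probably want to tell a duplicate-email failure apart from any other database error. Here's a minimal sketch of one way to do that, assuming SingleStore surfaces the MySQL-style duplicate-key driver code (1062) through Laravel's QueryException; the error message is just an example:

use App\Models\User;
use Illuminate\Database\QueryException;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Hash;
use Illuminate\Validation\ValidationException;

function registerUser($email, $password)
{
    try {
        DB::transaction(function () use ($email, $password) {
            User::forceCreate([
                'email' => $email,
                'password' => Hash::make($password),
            ]);

            // Throws if the email already exists
            DB::table('users_emails')->insert(['email' => $email]);
        });
    } catch (QueryException $e) {
        // 1062 is the MySQL-compatible "duplicate entry" driver error code
        if (($e->errorInfo[1] ?? null) === 1062) {
            throw ValidationException::withMessages([
                'email' => 'That email address has already been taken.',
            ]);
        }

        throw $e;
    }
}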
Step 4: Editing a user
How easy was that? And now, let me show you what we'd do when editing a user.
function changeEmail($userId, $oldEmail, $newEmail)
{
    try {
        DB::transaction(function () use ($userId, $oldEmail, $newEmail) {
            // Modify the email in the users table
            User::where('id', $userId)
                ->update(['email' => $newEmail]);

            // We have to delete the email because the primary key is immutable
            DB::table('users_emails')
                ->where('email', $oldEmail)
                ->delete();

            // And then we insert the new email, of course
            DB::table('users_emails')
                ->insert(['email' => $newEmail]);
        });
    } catch (\Exception $e) {
        // Handle the exception
    }
}
How epic is that? And if the new email is already taken, the insert into users_emails throws, the whole transaction rolls back, and the user keeps their old email.
Step 5: Deleting a user
And now deleting a user is pretty similar to editing a user.
function deleteUser(User $user)
{
    try {
        DB::transaction(function () use ($user) {
            // Delete the user's email
            DB::table('users_emails')
                ->where('email', $user->email)
                ->delete();

            // Goodbye my lover, goodbye my friend
            $user->delete();
        });
    } catch (\Exception $e) {
        // Handle the exception. You wouldn't get a "unique" error here, though.
    }
}
How cool is that?
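And because each unique column gets its own lookup table, the same pattern scales to as many unique columns as you need. As a sketch, suppose your users also had a unique username (the username column and users_usernames table here are hypothetical, not part of the schema above, and you'd also add the column to the users migration):

// Migration for the extra lookup table (mirrors users_emails)
Schema::create('users_usernames', function (Blueprint $table) {
    $table->string('username');
    $table->primary('username');
});

// Inside registerUser, both lookup tables are written in the same transaction,
// so a clash on either column rolls the whole thing back
DB::transaction(function () use ($email, $username, $password) {
    User::forceCreate([
        'email'    => $email,
        'username' => $username,
        'password' => Hash::make($password),
    ]);

    DB::table('users_emails')->insert(['email' => $email]);
    DB::table('users_usernames')->insert(['username' => $username]);
});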
My previous recommendation was to use atomic locks to enforce uniqueness, which is acceptable, but this is better. With our transaction approach, there are fewer queries, and things are faster. And uniqueness is enforced by the database.
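For comparison, the original post doesn't show the atomic-lock code, but a rough sketch of that idea, assuming Laravel's Cache::lock API and an illustrative lock name, would look something like this:

use App\Models\User;
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Hash;
use Illuminate\Validation\ValidationException;

function registerUserWithLock($email, $password)
{
    // Serialize registrations for this email so the exists() check can't race
    $lock = Cache::lock('register-email:'.$email, 10);

    $lock->block(5, function () use ($email, $password) {
        if (User::where('email', $email)->exists()) {
            throw ValidationException::withMessages([
                'email' => 'That email address has already been taken.',
            ]);
        }

        User::forceCreate([
            'email' => $email,
            'password' => Hash::make($password),
        ]);
    });
}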
You have duplicate data, but who cares? Databases are about trade-offs. Technology is about trade-offs. Life is about trade-offs. Sure, you're storing 10,000,000 email addresses twice, but the database is enforcing uniqueness for you thanks to those transactions? I'll take that any day.