Ram kumar Shrestha

💡 Database Development: It’s Not Just About Querying!

As backend/database engineers, we often think of databases as places to store and retrieve data. But sometimes, with a bit of exploration, we realize they can do much more.

Recently, I faced a unique requirement in my project with PostgreSQL: after inserting a row into public.test_table, every existing row in that same table needed to be updated to reflect the new information.

Initial Approach

At first, my instinct was to handle this logic in the backend by creating a method in the NestJS service layer to process each row update. But then I paused and thought: could PostgreSQL handle this on its own?

A bit of research later, I found the answer: Yes, with triggers! By leveraging PostgreSQL triggers and functions, we can move this logic directly to the database, improving both performance and maintainability.

The Solution: PostgreSQL Trigger and Function

Using a trigger to automate updates means that every time a row is inserted, PostgreSQL takes over and runs the update logic on all rows.

Here’s the process:

  1. Define a function in PostgreSQL to specify the logic applied to all rows.
  2. Create a trigger that executes this function after every insert on public.test_table.

Example Code

Here’s a simplified example of what this looks like in PostgreSQL:

-- Step 1: Create a trigger function that updates every row in the table.
-- Because the trigger below fires only on INSERT, the UPDATE inside the
-- function does not re-fire it, so there is no recursion.
CREATE OR REPLACE FUNCTION update_all_rows()
RETURNS TRIGGER AS $$
BEGIN
    -- Custom logic applied to all rows; here, copy the newly inserted
    -- value of column_name into every row of the table
    UPDATE public.test_table
    SET column_name = NEW.column_name;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- Step 2: Create a trigger that calls the function after every insert
CREATE TRIGGER after_insert_trigger
AFTER INSERT ON public.test_table
FOR EACH ROW EXECUTE FUNCTION update_all_rows();
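To see the trigger in action, here is a minimal sketch, assuming public.test_table is nothing more than an id plus the column_name column used above, and that the table exists before the function and trigger from the previous snippet are created:

-- Hypothetical minimal table shape (create this before the function and trigger above)
CREATE TABLE IF NOT EXISTS public.test_table (
    id serial PRIMARY KEY,
    column_name text
);

INSERT INTO public.test_table (column_name) VALUES ('first');
INSERT INTO public.test_table (column_name) VALUES ('second');

-- After the second insert, the trigger has overwritten column_name in every
-- existing row, so both rows now read 'second'
SELECT id, column_name FROM public.test_table ORDER BY id;

One thing to keep in mind: because the trigger is declared FOR EACH ROW, a multi-row INSERT fires the function once per inserted row, and each call updates the entire table.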

Including in a Migration File

For consistency across environments, adding this code to a migration file ensures it is applied automatically in each deployment. Here’s a sample migration file using TypeORM:

import { MigrationInterface, QueryRunner } from "typeorm";

export class AddUpdateAllRowsTriggerToTestTable implements MigrationInterface {
    public async up(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`
            CREATE OR REPLACE FUNCTION update_all_rows() 
            RETURNS TRIGGER AS $$
            BEGIN
                UPDATE public.test_table
                SET column_name = NEW.column_name;
                RETURN NEW;
            END;
            $$ LANGUAGE plpgsql;

            CREATE TRIGGER after_insert_trigger
            AFTER INSERT ON public.test_table
            FOR EACH ROW EXECUTE FUNCTION update_all_rows();
        `);
    }

    public async down(queryRunner: QueryRunner): Promise<void> {
        // Drop the trigger before the function it depends on
        await queryRunner.query(`
            DROP TRIGGER IF EXISTS after_insert_trigger ON public.test_table;
            DROP FUNCTION IF EXISTS update_all_rows();
        `);
    }
}
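Once the migration has run, a quick sanity check against information_schema confirms the trigger is in place:

-- Confirm the trigger created by the migration exists on test_table
SELECT trigger_name, event_manipulation, action_timing
FROM information_schema.triggers
WHERE event_object_table = 'test_table';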

Key Benefits

Efficiency: Running the logic inside the database avoids extra round-trips between the application and PostgreSQL and keeps the update close to the data.
Maintainability: Centralizing the logic in the database means fewer lines of application code and a leaner backend.
Portability: Shipping the function and trigger in a migration file guarantees the same behavior in every environment.

Takeaway

This experience was a reminder of the untapped capabilities in PostgreSQL and the benefits of letting the database handle certain logic independently.

🚀 Have you ever found a creative solution by using database features to streamline your code? Let's share insights and learn from each other!
