Trigger Cookbook

Here we provide examples that use the built-in triggers of django-pgtrigger as well as triggers written with raw SQL. Most are practical examples you can apply directly; others serve as starting points for more complex use cases.

Validating field transitions

Similar to how one can configure a finite state machine on a model field with django-fsm, the pgtrigger.FSM trigger ensures that a field can only make the configured transitions.

The example below ensures that the status field of a model can only transition from “unpublished” to “published” and from “published” to “inactive”. Any other updates on the status field will result in an exception:

class MyModel(models.Model):
    """Enforce valid transitions of the "status" field"""
    status = models.CharField(max_length=32, default="unpublished")

    class Meta:
        triggers = [
            pgtrigger.FSM(
                name="status_fsm",
                field="status",
                transitions=[
                    ("unpublished", "published"),
                    ("published", "inactive"),
                ]
            )
        ]

Note

pgtrigger.FSM can be supplied with a condition to only enforce the state transitions when a condition is met.
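
For instance, here’s a minimal sketch that only enforces the transitions when a hypothetical is_locked flag is set on the row (the flag and model name are illustrative, not part of the original example):

class GatedModel(models.Model):
    """Only enforce "status" transitions when the row is locked"""
    is_locked = models.BooleanField(default=False)
    status = models.CharField(max_length=32, default="unpublished")

    class Meta:
        triggers = [
            pgtrigger.FSM(
                name="status_fsm_when_locked",
                field="status",
                # Hypothetical flag: transitions are only enforced for locked rows
                condition=pgtrigger.Q(old__is_locked=True),
                transitions=[
                    ("unpublished", "published"),
                    ("published", "inactive"),
                ]
            )
        ]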

Note

The pgtrigger.FSM trigger only works for non-null CharField fields.

Mirroring a field

Here we create a pgtrigger.Trigger that runs before an update or insert to keep two fields in sync.

class MyModel(models.Model):
    int_field = models.IntegerField()
    in_sync_int = models.IntegerField(help_text="Stays the same as int_field")

    class Meta:
        triggers = [
            pgtrigger.Trigger(
                name="keep_in_sync",
                operation=pgtrigger.Update | pgtrigger.Insert,
                when=pgtrigger.Before,
                func="NEW.in_sync_int = NEW.int_field; RETURN NEW;",
            )
        ]

Note

When writing a pgtrigger.Before trigger, be sure to return the row over which the operation should be applied. Returning no row will prevent the operation from happening. See the Postgres docs for more information.

Soft-delete models

Rather than fully deleting a model, one can “soft delete” it by setting a field to an inactive state. The pgtrigger.SoftDelete trigger takes the field as an argument and a value to set on delete, which defaults to False. For example:

class SoftDeleteModel(models.Model):
    # This field is set to false when the model is deleted
    is_active = models.BooleanField(default=True)

    class Meta:
        triggers = [
            pgtrigger.SoftDelete(name="soft_delete", field="is_active")
        ]


m = SoftDeleteModel.objects.create()
m.delete()

# The model will still exist, but it is no longer active
assert not SoftDeleteModel.objects.get().is_active

pgtrigger.SoftDelete works with nullable CharField, IntegerField, and BooleanField fields.
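
For instance, here’s a sketch that soft deletes by setting a nullable CharField to a custom value (the model and field names are illustrative):

class SoftDeleteStatusModel(models.Model):
    # Set to "inactive" instead of deleting the row
    status = models.CharField(max_length=32, default="active", null=True)

    class Meta:
        triggers = [
            pgtrigger.SoftDelete(name="soft_delete", field="status", value="inactive")
        ]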

Note

When using pgtrigger.SoftDelete, keep in mind that Django will still perform cascading operations. For example, a foreign key to SoftDeleteModel with on_delete=models.CASCADE will be deleted by Django when the parent model is soft deleted.
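
If cascading isn’t desired, one option is to point child models at the soft-delete model with on_delete=models.DO_NOTHING and soft delete them with their own triggers. A rough sketch (the child model here is illustrative):

class ChildModel(models.Model):
    # DO_NOTHING prevents Django from cascade-deleting children when the
    # parent is soft deleted. The parent row still exists, so the foreign
    # key remains valid.
    parent = models.ForeignKey(SoftDeleteModel, on_delete=models.DO_NOTHING)
    is_active = models.BooleanField(default=True)

    class Meta:
        triggers = [
            pgtrigger.SoftDelete(name="soft_delete_child", field="is_active")
        ]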

Append-only models

Here we create an append-only model using the pgtrigger.Protect trigger for the UPDATE and DELETE operations:

class AppendOnlyModel(models.Model):
    my_field = models.IntegerField()

    class Meta:
        triggers = [
            pgtrigger.Protect(
                name="append_only",
                operation=(pgtrigger.Update | pgtrigger.Delete)
            )
        ]

Note

This table can still be truncated. Although Django doesn’t support this database operation, one can still protect against it by adding the pgtrigger.Truncate operation.
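
For example, here’s the same model with truncation protection added:

class AppendOnlyModel(models.Model):
    my_field = models.IntegerField()

    class Meta:
        triggers = [
            pgtrigger.Protect(
                name="append_only",
                operation=(
                    pgtrigger.Update | pgtrigger.Delete | pgtrigger.Truncate
                )
            )
        ]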

Official interfaces

pgtrigger.Protect triggers can be combined with pgtrigger.ignore to create “official” interfaces for doing database operations in your application.

Here we protect inserts on our custom User model and force engineers to use create_user to create them:

@pgtrigger.ignore("my_app.User:protect_inserts")
def create_user(**kwargs):
    return User.objects.create(**kwargs)


class User(models.Model):
    class Meta:
        triggers = [
            pgtrigger.Protect(name="protect_inserts", operation=pgtrigger.Insert)
        ]

We’ve ignored the protection trigger for the create_user function by providing the trigger’s full path (my_app.User:protect_inserts) to pgtrigger.ignore. Engineers must use create_user to create User objects; any other insert will raise an exception.
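
For illustration, here’s roughly how this behaves. A protected insert surfaces as a database error (typically django.db.utils.InternalError), while the official interface succeeds:

from django.db.utils import InternalError

try:
    # Direct creation trips the protection trigger
    User.objects.create()
except InternalError:
    pass

# The official interface ignores the trigger and succeeds
user = create_user()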

Note

Ignoring triggers is covered in the Ignoring Execution section.

Conditional deletion protection

Here we only allow models with a deletable flag to be deleted:

class DynamicDeletionModel(models.Model):
    is_deletable = models.BooleanField(default=False)

    class Meta:
        triggers = [
            pgtrigger.Protect(
                name="protect_deletes",
                operation=pgtrigger.Delete,
                condition=pgtrigger.Q(old__is_deletable=False)
            )
        ]
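
For illustration, deleting a row whose is_deletable flag is False raises a database error, while flipping the flag first allows the delete:

m = DynamicDeletionModel.objects.create()
# m.delete()  # Raises a database error since is_deletable is False

m.is_deletable = True
m.save()
m.delete()  # Succeeds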

Redundant update protection

Here we raise an error when someone makes a redundant update to the database:

class RedundantUpdateModel(models.Model):
    redundant_field1 = models.BooleanField(default=False)
    redundant_field2 = models.BooleanField(default=False)

    class Meta:
        triggers = [
            pgtrigger.Protect(
                name="protect_redundant_updates",
                operation=pgtrigger.Update,
                condition=pgtrigger.Condition(
                    "OLD.* IS NOT DISTINCT FROM NEW.*"
                )
            )
        ]

Freezing published models

Here we have a Post model with a status field. We only allow edits to this model when its status is not “published”.

class Post(models.Model):
    status = models.CharField(max_length=32, default="unpublished")
    content = models.TextField()

    class Meta:
        triggers = [
            pgtrigger.Protect(
                name="freeze_published_model",
                operation=pgtrigger.Update,
                condition=pgtrigger.Q(old__status="published")
            )
        ]

We extend this example by allowing a published post to be edited, but only when it is being transitioned to an “inactive” status.

class Post(models.Model):
    status = models.CharField(max_length=32, default="unpublished")
    content = models.TextField()

    class Meta:
        triggers = [
            pgtrigger.Protect(
                name="freeze_published_model_allow_deactivation",
                operation=pgtrigger.Update,
                condition=(
                    pgtrigger.Q(old__status="published")
                    & ~pgtrigger.Q(new__status="inactive")
                )
            )
        ]

Versioned models

Here we write a pgtrigger.Trigger that increments a model’s version before an update is applied.

We do this with two triggers:

  1. One that protects the version field from being updated directly. We don’t want anyone tampering with this field.

  2. One that increments the version of the NEW row before an update is applied. We skip the increment when nothing has changed.

class Versioned(models.Model):
    """
    This model is versioned. The "version" field is incremented on every
    update, and users cannot directly update the "version" field.
    """
    version = models.IntegerField(default=0)
    char_field = models.CharField(max_length=32)

    class Meta:
        triggers = [
            # Protect against editing the version field directly
            pgtrigger.Protect(
                name="protect_updates",
                operation=pgtrigger.Update,
                condition=pgtrigger.Q(old__version__df=pgtrigger.F("new__version"))
            ),
            # Increment the version field on changes
            pgtrigger.Trigger(
                name="versioning",
                when=pgtrigger.Before,
                operation=pgtrigger.Update,
                func="NEW.version = NEW.version + 1; RETURN NEW;",
                # Don't increment version on redundant updates.
                condition=pgtrigger.Condition("OLD.* IS DISTINCT FROM NEW.*")
            )
        ]
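
For illustration, here’s roughly the behavior we’d expect from these triggers. Queryset updates that don’t touch the version column let the trigger do the incrementing:

v = Versioned.objects.create(char_field="hello")
assert v.version == 0

# A real change increments the version
Versioned.objects.filter(pk=v.pk).update(char_field="world")
v.refresh_from_db()
assert v.version == 1

# A redundant update changes nothing, so the version stays the same
Versioned.objects.filter(pk=v.pk).update(char_field="world")
v.refresh_from_db()
assert v.version == 1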

Note

The return value from pgtrigger.Before triggers is what Postgres uses when executing the operation. NULL values tell Postgres to ignore the operation entirely.
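
For example, here’s a sketch of a pgtrigger.Before trigger that silently drops deletes by returning NULL instead of raising an error (the model is illustrative):

class SilentDeleteModel(models.Model):
    class Meta:
        triggers = [
            pgtrigger.Trigger(
                name="silently_ignore_deletes",
                when=pgtrigger.Before,
                operation=pgtrigger.Delete,
                # Returning NULL tells Postgres to skip the delete entirely
                func="RETURN NULL;",
            )
        ]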

Keeping a search vector updated

When using Postgres full-text search, keep django.contrib.postgres.search.SearchVectorField fields updated using pgtrigger.UpdateSearchVector. Here we keep a search vector updated based on changes to the title and body fields of a model:

class DocumentModel(models.Model):
    search_vector = SearchVectorField()
    title = models.CharField(max_length=128)
    body = models.TextField()

    class Meta:
        triggers = [
            pgtrigger.UpdateSearchVector(
                name="add_title_and_body_to_vector",
                vector_field="search_vector",
                document_fields=["title", "body"],
            )
        ]

pgtrigger.UpdateSearchVector uses Postgres’s tsvector_update_trigger to keep the search vector updated. See the Postgres docs for more info.
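
Once the trigger keeps the vector up to date, querying works like any other Django full-text search. For example:

from django.contrib.postgres.search import SearchQuery

# Find documents whose title or body mention "trigger"
DocumentModel.objects.filter(search_vector=SearchQuery("trigger"))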

Note

pgtrigger.UpdateSearchVector triggers are incompatible with pgtrigger.ignore and will raise a RuntimeError if used.

Statement-level triggers and transition tables

So far most of the examples have been for triggers that fire once per row. Statement-level triggers are fired once per statement and allow more flexibility and performance tuning for some scenarios.

Instead of OLD and NEW rows, statement-level triggers can use “transition tables” to access temporary tables of old and new rows. One can use the pgtrigger.Referencing construct to configure this. See this StackExchange example for more explanations about transition tables.

Note

Transition tables are only available in Postgres 10 and up.

Here we have a history model that keeps track of changes to a field in the tracked model. We create a statement-level trigger that logs the old and new fields to the history model:

class HistoryModel(models.Model):
    old_field = models.CharField(max_length=32)
    new_field = models.CharField(max_length=32)


class TrackedModel(models.Model):
    field = models.CharField(max_length=32)

    class Meta:
        triggers = [
            pgtrigger.Trigger(
                name="track_history",
                level=pgtrigger.Statement,
                when=pgtrigger.After,
                operation=pgtrigger.Update,
                referencing=pgtrigger.Referencing(old="old_values", new="new_values"),
                func=f"""
                    INSERT INTO {HistoryModel._meta.db_table}(old_field, new_field)
                    SELECT
                        old_values.field AS old_field,
                        new_values.field AS new_field
                    FROM old_values
                        JOIN new_values ON old_values.id = new_values.id;
                    RETURN NULL;
                """,
            )
        ]

With this statement-level trigger, only one additional query is performed, even when an update statement touches many rows of the tracked model. Here’s some example code to illustrate what the results look like.

TrackedModel.objects.bulk_create([TrackedModel(field='old'), TrackedModel(field='old')])

# Update all fields to "new"
TrackedModel.objects.update(field='new')

# The trigger should have tracked these updates
print(HistoryModel.objects.values('old_field', 'new_field'))

>>> [{
  'old_field': 'old',
  'new_field': 'new'
}, {
  'old_field': 'old',
  'new_field': 'new'
}]

Note

When considering statement-level triggers for performance reasons, keep in mind that additional queries executed by triggers don’t involve expensive round-trips from the application to the database. A less complex row-level trigger may be worth the performance cost.

Ensuring child models exist

Consider a Profile model that has a OneToOneField to Django’s User model:

class Profile(models.Model):
    user = models.OneToOneField(User, on_delete=models.CASCADE)

We use a “deferrable” trigger to ensure a Profile exists for every User. Deferrable triggers can execute at the end of a transaction, allowing us to check for the existence of a Profile after creating a User.

This example is continued in the Deferrable Triggers section.
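
As a rough sketch of the idea (assuming, for illustration, a User model defined in your own app and an illustrative Profile table name), a deferred trigger might look like this:

class User(models.Model):
    class Meta:
        triggers = [
            pgtrigger.Trigger(
                name="profile_must_exist",
                when=pgtrigger.After,
                operation=pgtrigger.Insert,
                # Deferred triggers run at the end of the transaction,
                # after the Profile has had a chance to be created
                timing=pgtrigger.Deferred,
                func="""
                    IF NOT EXISTS (
                        SELECT 1 FROM my_app_profile WHERE user_id = NEW.id
                    ) THEN
                        RAISE EXCEPTION 'Profile does not exist for user %', NEW.id;
                    END IF;
                    RETURN NULL;
                """,
            )
        ]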

Tracking model history and changes

Check out django-pghistory to snapshot model changes and attach context from your application (e.g. the authenticated user) to the event.