Semyon Kirekov

Spring Data — Power of Domain Events

Domain Event is one of the core ideas of Domain-Driven Design. Once you become familiar with the technique, you won't want to work without it anymore. So, in this article, I'm showing you an example of application development. We'll go through the process step by step as new requirements come in. That should give us a clear understanding of the value of Domain Events.

Our stack is Java 11 + Spring Boot + Hibernate.


Suppose we're building a book-selling service. Authors can put their books on sale, whilst customers can buy them.

Let's define the primary business entities. The Book itself.



@Entity
@Table
public class Book {
  @Id
  @GeneratedValue(strategy = IDENTITY)
  private Long id;

  private String name;

  private String description;

  private OffsetDateTime dateCreated;

  private OffsetDateTime lastDateUpdated;

  @ManyToOne(fetch = LAZY)
  @JoinColumn(name = "author_id")
  private Author author;

  private int price;

  @OneToMany(fetch = LAZY, mappedBy = "book", cascade = ALL)
  private List<BookSale> bookSales = new ArrayList<>();

  // getters, setters
}



And the BookSale.



@Entity
@Table
public class BookSale {
  @Id
  @GeneratedValue(strategy = IDENTITY)
  private Long id;

  private int priceSold;

  private OffsetDateTime dateSold;

  @ManyToOne(fetch = LAZY)
  @JoinColumn(name = "book_id")
  private Book book;

  // getters, setters
}



A Book instance has a name, an author, a creation date, a last update date, a price, and a list of all its sales.

For the sake of simplicity, we assume that books have a single author and all the prices have the same currency.
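
The Author entity itself is not shown in the article. For completeness, a minimal sketch might look like the following, assuming an author only needs a name and an email address for notifications (the field names are assumptions).



@Entity
@Table
public class Author {
  @Id
  @GeneratedValue(strategy = IDENTITY)
  private Long id;

  private String name;

  // used later to notify the author about sales and updates
  private String email;

  @OneToMany(fetch = LAZY, mappedBy = "author")
  private List<Book> books = new ArrayList<>();

  // getters, setters
}
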

Ok, the minimum domain model is in place. It's time to implement the business requirements.

1. Each book sale should be registered

That's the whole idea of our system.

Here is the first attempt.



@Service
public class BookSaleServiceImpl implements BookSaleService {
  private final BookRepository bookRepository;
  private final BookSaleRepository bookSaleRepository;

  @Override
  @Transactional
  public void sellBook(Long bookId) {
    final var book =
        bookRepository.findById(bookId)
            .orElseThrow(() -> new NoSuchElementException(
                "Book is not found"
            ));
    BookSale bookSale = new BookSale();
    bookSale.setBook(book);
    bookSale.setDateSold(OffsetDateTime.now());
    bookSale.setPriceSold(book.getPrice());
    bookSaleRepository.save(bookSale);
  }
}


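For completeness, the repositories used above are never shown in the article. They are most likely plain Spring Data interfaces; a minimal sketch might look like this:



// Plain Spring Data repositories (a sketch; the article does not show them)
public interface BookRepository extends JpaRepository<Book, Long> {
}

public interface BookSaleRepository extends JpaRepository<BookSale, Long> {
}
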

If you work with Spring regularly, you have probably seen similar code snippets many times. The architecture we have set up here can be described as an Anemic Domain Model. It means that we place the vast majority of the business logic inside the service layer, whilst the entities act as simple data structures with getters and setters.

Many authors classify this style as an anti-pattern. But why is that? The approach seems natural, doesn't it? Besides, the business logic is not that complex in our case. Well, there are no problems in such a primitive example. At least, for now.

2. An author should be notified after every 100 of their book sales

We want to let the author know that their books are being sold.

How do we implement the feature? Well, the naive approach is to put functionality inside the sellBook method.



@Service
public class BookSaleServiceImpl implements BookSaleService {
  private final BookRepository bookRepository;
  private final BookSaleRepository bookSaleRepository;
  private final EmailService emailService;

  @Override
  @Transactional
  public void sellBook(Long bookId) {
    final var book =
        bookRepository.findById(bookId)
            .orElseThrow(() -> new NoSuchElementException(
                "Book is not found"
            ));
    BookSale bookSale = new BookSale();
    bookSale.setBook(book);
    bookSale.setDateSold(OffsetDateTime.now());
    bookSale.setPriceSold(book.getPrice());
    bookSaleRepository.save(bookSale);

    int totalSoldBooks = book.getBookSales().size();
    if (totalSoldBooks % 100 == 0) {
      Author author = book.getAuthor();
      emailService.send(author.getEmail(), "Another 100 books of yours have been sold!");
    }
  }
}



The first issue is that the transaction is still open when the emailService.send method is called. Firstly, it's a performance penalty. Secondly, there is a chance that the transaction will be rolled back in the end. In that case, we don't want to send any emails.

We can fix that by applying programmatic transactions.



@Service
public class BookSaleServiceImpl implements BookSaleService {
  private final BookRepository bookRepository;
  private final BookSaleRepository bookSaleRepository;
  private final EmailService emailService;
  private final TransactionTemplate transactionTemplate;

  @Override
  public void sellBook(Long bookId) {
    final var savedBook = transactionTemplate.execute(status -> {
      final var book =
          bookRepository.findById(bookId)
              .orElseThrow(() -> new NoSuchElementException(
                  "Book is not found"
              ));
      BookSale bookSale = new BookSale();
      bookSale.setBook(book);
      bookSale.setDateSold(OffsetDateTime.now());
      bookSale.setPriceSold(book.getPrice());
      bookSaleRepository.save(bookSale);
      return book;
    });

    int totalSoldBooks = savedBook.getBookSales().size();
    if (totalSoldBooks % 100 == 0) {
      Author author = savedBook.getAuthor();
      emailService.send(author.getEmail(), "Another 100 books of yours have been sold!");
    }
  }
}



But one issue remains. This approach breaks the Single-Responsibility Principle (SRP) and the Open-Closed Principle (OCP). A better option is the Decorator pattern.



@Service
public class EmailNotifierBookSaleService implements BookSaleService {
  @ActualBookSaleServiceQualifier
  private final BookSaleService origin;
  private final BookRepository bookRepository;
  private final EmailService emailService;

  @Override
  public void sellBook(Long bookId) {
    origin.sellBook(bookId);
    final var savedBook = bookRepository.findById(bookId).orElseThrow();
    int totalSoldBooks = savedBook.getBookSales().size();
    if (totalSoldBooks % 100 == 0) {
      Author author = savedBook.getAuthor();
      emailService.send(author.getEmail(), "Another 100 books of yours have been sold!");
    }
  }
}



EmailNotifierBookSaleService injects the BookSaleService interface. In a production environment, it resolves to the BookSaleServiceImpl implementation (the qualifier annotation points to it). But in a test environment, we could use a stub or a mock.
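
The @ActualBookSaleServiceQualifier itself is not shown in the article. It is presumably a custom qualifier annotation; a minimal sketch could look like this, placed both on BookSaleServiceImpl and on the injection point:



// A sketch of the custom qualifier (assumption: the article never shows it)
@Qualifier
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.FIELD, ElementType.PARAMETER, ElementType.METHOD})
public @interface ActualBookSaleServiceQualifier {
}
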

That looks much better. The functionality is split between two services, and each of them can be tested individually.

3. Each book update should be archived

Analysts have decided that every possible book update (including a book sale) should be archived. Here is the BookArchive entity.



@Entity
@Table
public class BookArchive {
  @Id
  @GeneratedValue(strategy = IDENTITY)
  private Long id;

  private String name;

  private String description;

  private int soldBooks;

  @ManyToOne(fetch = FetchType.LAZY)
  @JoinColumn(name = "author_id")
  private Author author;

  @ManyToOne(fetch = FetchType.LAZY)
  @JoinColumn(name = "book_id")
  private Book book;

  private OffsetDateTime dateArchiveCreated;

  private OffsetDateTime lastDateVersionUpdated;

  // getters, setters
}



How do we track book sales? Well, we could add the functionality directly to BookSaleServiceImpl, but we have already pointed out that this is a bad idea. So, another decorator comes in.



@Service
public class ArchiveBookSaleService implements BookSaleService {
  @ActualBookSaleServiceQualifier
  private final BookSaleService origin;
  private final BookRepository bookRepository;
  private final BookArchiveRepository bookArchiveRepository;

  @Override
  @Transactional
  public void sellBook(Long bookId) {
    Book book = bookRepository.findById(bookId).orElseThrow();

    BookArchive bookArchive = new BookArchive();
    bookArchive.setBook(book);
    bookArchive.setName(book.getName());
    bookArchive.setDescription(book.getDescription());
    bookArchive.setDateArchiveCreated(OffsetDateTime.now());
    bookArchive.setLastDateVersionUpdated(requireNonNullElse(book.getLastDateUpdated(), book.getDateCreated()));
    bookArchive.setSoldBooks(book.getBookSales().size());

    bookArchiveRepository.save(bookArchive);

    origin.sellBook(bookId);
  }
}



Some important details should be pointed out.

The sellBook method is wrapped with @Transactional. The reason is that the archive record should be created in the same transaction as the BookSale itself. If the main operation fails, we don't want to store any archives.

The bookRepository.findById(bookId) method is called twice during the execution. But since both calls run in the same transaction, Hibernate returns the cached instance from the persistence context on the second call. So, there are no additional database round-trips.

The second point is the @ActualBookSaleServiceQualifier. EmailNotifierBookSaleService does not start any transaction, so the origin of ArchiveBookSaleService has to be BookSaleServiceImpl itself. That also means EmailNotifierBookSaleService now has to wrap ArchiveBookSaleService instead, so its qualifier has to be changed in order not to inject BookSaleServiceImpl twice.

So, here is the schema of the current process.

book sale flow

If the system is not complex, this approach might be sufficient. But the Book Selling Application could grow into a huge enterprise solution. See, we have only just started the development, yet there are already two decorators. Besides, the order of wrapping matters as well. That's why we had to change the qualifiers.

It seems a bit overcomplicated, doesn't it? Well, this is not the end.

4. Administrators should be able to update a book's name and description. Authors should be notified of every update by email

That does make sense as well. For example, there might be typos. The requirement can be split into three different functionalities:

  1. Book info update
  2. Book archiving
  3. Notifying by email

But here is the thing. If we keep following the same approach as before, there will be a primary service with the business logic and two additional decorators. Déjà vu, isn't it? Every time a new requirement comes in, we have to wrap the service layer with new decorators. What's the problem? Well, some of the previously implemented functionality has to be repeated. For example, book archiving: no matter what exactly happened to the book, a new archive record should be created, because that's what the analysts need. Emails are the same story; only the conditions for sending them differ.

So, what's the better solution? That is the moment when Domain Events come in. But first, we have to do some refactoring.

No Anemic Domain Model

What kind of requests do we have so far? Only two of them: selling a book and updating its info. Let's rewrite the Book entity a bit.



@Entity
@Table
public class Book {
  @Id
  @GeneratedValue(strategy = IDENTITY)
  private Long id;

  private String name;

  private String description;

  private OffsetDateTime dateCreated;

  private OffsetDateTime lastDateUpdated;

  @ManyToOne(fetch = LAZY)
  @JoinColumn(name = "author_id")
  private Author author;

  private int price;

  @OneToMany(fetch = LAZY, mappedBy = "book", cascade = ALL)
  private List<BookSale> bookSales = new ArrayList<>();

  public void sell() {
    final var bookSale = new BookSale();
    bookSale.setBook(this);
    bookSale.setDateSold(OffsetDateTime.now());
    bookSale.setPriceSold(price);
    bookSales.add(bookSale);
  }

  public void changeInfo(String name, String description) {
    this.name = name;
    this.description = description;
    lastDateUpdated = OffsetDateTime.now();
  }
}



Pay attention to the sell and changeInfo methods. The first one registers a new book sale, and the second one updates the book's name and description.

It seems that nothing has changed so far. We just grouped functionality that could have been performed via setter calls. Well, that's true. But let's move forward and refactor the BookSaleServiceImpl.



@Service
public class BookSaleServiceImpl implements BookSaleService {
  private final BookRepository bookRepository;

  @Transactional
  @Override
  public void sellBook(Long bookId) {
    final var book = bookRepository.findById(bookId).orElseThrow(() -> new NoSuchElementException(
        "Book is not found"
    ));
    book.sell();
    bookRepository.save(book);
  }
}



The code does not look like a list of commands anymore. The business case is transparent now. Besides, the Book.sell method can be reused in different application services while the business rule remains the same.

The service that updates book info will look familiar.



@Service
public class BookUpdateServiceImpl implements BookUpdateService {
  private final BookRepository bookRepository;

  @Transactional
  @Override
  public void updateBookInfo(Long bookId, String name, String description) {
    final var book = bookRepository.findById(bookId).orElseThrow(() -> new NoSuchElementException(
        "Book is not found"
    ));
    book.changeInfo(name, description);
    bookRepository.save(book);
  }
}



Introducing Domain Events

Now let's jump to the book archiving case. What if each book update published an event that triggered the archiving? Well, Spring does have the ApplicationEventPublisher bean that allows us to publish events and subscribe to them with @EventListener.



@Service
public class BookSaleServiceImpl implements BookSaleService {
  private final BookRepository bookRepository;
  private final ApplicationEventPublisher eventPublisher;

  @Transactional
  @Override
  public void sellBook(Long bookId) {
    final var book = bookRepository.findById(bookId).orElseThrow(() -> new NoSuchElementException(
        "Book is not found"
    ));
    // event publishing happens here
    eventPublisher.publishEvent(new BookUpdated(book));
    book.sell();
    bookRepository.save(book);
  }
}

@Component
public class BookUpdatedListener {
  @EventListener
  public void archiveBook(BookUpdated bookUpdated) {
    // do archiving
  }
}


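The BookUpdated event class is not shown in the article. A minimal version might simply carry the updated aggregate:



// A sketch of the event (assumption: the article never shows it)
public class BookUpdated {
  private final Book book;

  public BookUpdated(Book book) {
    this.book = book;
  }

  public Book getBook() {
    return book;
  }
}
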

Though it helps us to decouple the selling and the archiving process, it also requires us to remember to publish BookUpdated on every book change.

We could provide ApplicationEventPublisher as a delegate to updating methods.



@Entity
@Table
public class Book {
  ...

  public void sell(Supplier<? extends ApplicationEventPublisher> publisher) {
    publisher.get().publishEvent(new BookUpdated(this));
    final var bookSale = new BookSale();
    bookSale.setBook(this);
    bookSale.setDateSold(OffsetDateTime.now());
    bookSale.setPriceSold(price);
    bookSales.add(bookSale);
  }

  public void changeInfo(String name, String description, Supplier<? extends ApplicationEventPublisher> publisher) {
    publisher.get().publishEvent(new BookUpdated(this));
    this.name = name;
    this.description = description;
    lastDateUpdated = OffsetDateTime.now();
  }
}



That's better. But still, we have to inject this ApplicationEventPublisher instance into every service that somehow interacts with Book.

Is there any better solution? Sure. Embrace @DomainEvents.

@DomainEvents



@Entity
@Table
public class Book {
  ...

  @Transient
  private final List<Object> domainEvents = new ArrayList<>();

  @DomainEvents
  public Collection<Object> domainEvents() {
    return Collections.unmodifiableList(this.domainEvents);
  }

  @AfterDomainEventPublication
  public void clearDomainEvents() {
    this.domainEvents.clear();
  }

  private void registerEvent(Object event) {
    this.domainEvents.add(event);
  }

  public void sell() {
    final var bookUpdated = new BookUpdated(this);
    final var bookSale = new BookSale();
    bookSale.setBook(this);
    bookSale.setDateSold(OffsetDateTime.now());
    bookSale.setPriceSold(price);
    bookSales.add(bookSale);
    registerEvent(bookUpdated);
  }

  public void changeInfo(String name, String description) {
    final var bookUpdated = new BookUpdated(this);
    this.name = name;
    this.description = description;
    lastDateUpdated = OffsetDateTime.now();
    registerEvent(bookUpdated);
  }
}



Every time a client calls the sell or changeInfo method, a BookUpdated event is added to the domainEvents list. As you may guess, there is no direct publishing. So, how do events reach the event listeners? When we call a repository's save method, Spring collects the events by looking for the @DomainEvents annotation and publishes them. Then the cleanup method annotated with @AfterDomainEventPublication is invoked.
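
Note that with this approach the application service no longer needs ApplicationEventPublisher at all; it goes back to the plain form from the earlier refactoring:



@Service
public class BookSaleServiceImpl implements BookSaleService {
  private final BookRepository bookRepository;

  @Transactional
  @Override
  public void sellBook(Long bookId) {
    final var book = bookRepository.findById(bookId).orElseThrow(() -> new NoSuchElementException(
        "Book is not found"
    ));
    // Book.sell() registers the domain event; save() publishes it
    book.sell();
    bookRepository.save(book);
  }
}
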

We could trim this boilerplate inside the entity. Spring provides the AbstractAggregateRoot class that already contains the required functionality. So, here is the less verbose option.



@Entity
@Table
public class Book extends AbstractAggregateRoot<Book> {
  ...

  public void sell() {
    final var bookUpdated = new BookUpdated(this);
    final var bookSale = new BookSale();
    bookSale.setBook(this);
    bookSale.setDateSold(OffsetDateTime.now());
    bookSale.setPriceSold(price);
    bookSales.add(bookSale);
    registerEvent(bookUpdated);
  }

  public void changeInfo(String name, String description) {
    final var bookUpdated = new BookUpdated(this);
    this.name = name;
    this.description = description;
    lastDateUpdated = OffsetDateTime.now();
    registerEvent(bookUpdated);
  }
}



We forgot about the email events. It could be tempting to declare a BookSaleEmailEvent or a BookChangeInfoEmailEvent. But that would not be domain-oriented. You see, sending an email is just an implementation detail. There could be dozens of other reactions: logging, putting a message on Kafka, triggering a job, etc. It's essential to focus on business use cases rather than on technical behaviour.

So, the right way is to declare BookSold and BookChangedInfo events.



@Entity
@Table
public class Book extends AbstractAggregateRoot<Book> {
  ...

  public void sell() {
    final var bookUpdated = new BookUpdated(this);
    final var bookSale = new BookSale();
    bookSale.setBook(this);
    bookSale.setDateSold(OffsetDateTime.now());
    bookSale.setPriceSold(price);
    bookSales.add(bookSale);
    registerEvent(bookUpdated);
    registerEvent(new BookSold(this));
  }

  public void changeInfo(String name, String description) {
    final var bookUpdated = new BookUpdated(this);
    this.name = name;
    this.description = description;
    lastDateUpdated = OffsetDateTime.now();
    registerEvent(bookUpdated);
    registerEvent(new BookChangedInfo(this));
  }
}


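The BookSold and BookChangedInfo classes are not shown in the article either. Minimal sketches might expose just the data the listeners below need (the accessor names match how they are used later):



// Sketches of the events (assumption: the article never shows them)
public class BookSold {
  private final Book book;

  public BookSold(Book book) {
    this.book = book;
  }

  public String getAuthorEmail() {
    return book.getAuthor().getEmail();
  }

  public int getTotalSoldBooksCount() {
    return book.getBookSales().size();
  }
}

public class BookChangedInfo {
  private final Book book;

  public BookChangedInfo(Book book) {
    this.book = book;
  }

  public String getAuthorEmail() {
    return book.getAuthor().getEmail();
  }
}
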

Capturing events

The @EventListener annotation is a simple and convenient way to handle Spring events. But there is a caveat. We don't just need to capture events. We want the listeners to be invoked at particular moments of the transaction lifecycle.

For example, the archiving should be done just before the transaction commit. If something goes wrong with the main request or the archiving itself, the whole transaction has to be rolled back.

On the contrary, an email should be sent right after the transaction commits. If the request has not succeeded, there is no need to notify anyone.

The @EventListener annotation alone is not powerful enough to satisfy these needs. But no worries. @TransactionalEventListener to the rescue!

The difference is that the annotation provides the phase attribute. It declares the point of the transaction lifecycle at which the listener has to be called. There are four possible values.

  1. BEFORE_COMMIT
  2. AFTER_COMMIT - the default one
  3. AFTER_ROLLBACK
  4. AFTER_COMPLETION

The first three options are self-explanatory. AFTER_COMPLETION fires regardless of the outcome, effectively combining AFTER_ROLLBACK and AFTER_COMMIT.

For example, that's how archiving books might be implemented.



@Component
public class BookUpdatedListener {
  private final BookArchiveRepository bookArchiveRepository;

  @TransactionalEventListener(phase = BEFORE_COMMIT)
  public void archiveBook(BookUpdated bookUpdated) {
    BookArchive bookArchive = BookArchive.createNew(bookUpdated);
    bookArchiveRepository.save(bookArchive);
  }
}



BookArchive.createNew simply encapsulates the logic of creating a new BookArchive instance, as described previously.
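
One possible shape of that factory method, assuming BookUpdated exposes the aggregate (the article does not show the implementation):



// A sketch of the factory method (assumption: not shown in the article)
public static BookArchive createNew(BookUpdated bookUpdated) {
  Book book = bookUpdated.getBook();
  BookArchive bookArchive = new BookArchive();
  bookArchive.setBook(book);
  bookArchive.setName(book.getName());
  bookArchive.setDescription(book.getDescription());
  bookArchive.setAuthor(book.getAuthor());
  bookArchive.setDateArchiveCreated(OffsetDateTime.now());
  bookArchive.setLastDateVersionUpdated(requireNonNullElse(book.getLastDateUpdated(), book.getDateCreated()));
  bookArchive.setSoldBooks(book.getBookSales().size());
  return bookArchive;
}
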

See? Piece of cake! Capturing BookChangedInfo and BookSold would be similar.



@Component
public class BookChangedInfoListener {
  private final EmailService emailService;

  @TransactionalEventListener(phase = AFTER_COMMIT)
  public void notifyAuthorByEmail(BookChangedInfo bookChangedInfo) {
    String email = bookChangedInfo.getAuthorEmail();
    emailService.send(email, "Your book's info has been changed");
  }
}

@Component
public class BookSoldListener {
  private final EmailService emailService;

  @TransactionalEventListener(phase = AFTER_COMMIT)
  public void notifyAuthorIfNeeded(BookSold bookSold) {
    int totalSoldBooks = bookSold.getTotalSoldBooksCount();
    if (totalSoldBooks % 100 == 0) {
      String email = bookSold.getAuthorEmail();
      emailService.send(email, "Another 100 books of yours have been sold!");
    }
  }
}



There is an important detail about @TransactionalEventListener. Sometimes you need to run database commands in a new transaction during the AFTER_COMMIT phase. If so, make sure you also put @Transactional(propagation = REQUIRES_NEW) on the listener method. The REQUIRES_NEW propagation is crucial because the resources of the previous transaction might not have been cleaned up yet. So, we have to make sure that Spring starts a new one.
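
For instance, a listener that has to write to the database after the commit might look like the following sketch (the listener, repository and record names are hypothetical):



@Component
public class BookSoldAuditListener {
  private final BookSoldAuditRepository auditRepository; // hypothetical repository

  @TransactionalEventListener(phase = AFTER_COMMIT)
  @Transactional(propagation = REQUIRES_NEW)
  public void storeAuditRecord(BookSold bookSold) {
    // the original transaction has already committed,
    // so a new one is required for this write to take effect
    auditRepository.save(new BookSoldAuditRecord(bookSold.getAuthorEmail(), OffsetDateTime.now()));
  }
}
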

And now we can get rid of those decorators. So, here is the comparison between the first set-up and the final architecture.

The First Attempt

The Final Architecture

The first approach puts all the business logic inside the service layer, whilst domain classes act as simple data structures. This pattern is called Transaction Script. If your system is small and simple, it's fine to design the architecture around this pattern. But when it grows, it becomes hard to maintain.

By the way, you probably don't need Spring Data and Hibernate if you apply the Transaction Script pattern. Since all the business rules live in services, Hibernate brings overhead and not so many benefits. Instead, you could use JDBI, JOOQ or even plain JDBC.

The final architecture turns it upside down. Domain entities encapsulate the business logic and the services act as thin wrappers (Rich Domain Model). No matter who interacts with the Book entity, the business rules remain the same. All the additional functionality is driven by Domain Events. That allows us to extend the system almost without limit. Domain Events can trigger a variety of business operations: putting a message on a queue, performing audit actions, notifying users, applying the CQRS pattern, etc.

Conclusion

In my opinion, Hibernate combined with Spring Data is meant to be used with Domain Events. The benefits are worth it. I'm curious how you handle persistence in your projects. Do you prefer an Anemic or a Rich Domain Model? Please leave your comments down below. Thanks for reading!

Resources

  1. Domain Event
  2. Domain Driven Design
  3. Anemic Domain Model
  4. Programmatic transactions
  5. Single-Responsibility Principle (SRP)
  6. Open-Closed Principle (OCP)
  7. Decorator Pattern
  8. Spring Qualifier Annotation
  9. Hibernate First-level Cache
  10. Spring Event Listener
  11. Spring Domain Events
  12. Spring Transactional Propagation and Isolation
  13. Transaction Script Pattern
  14. JDBI
  15. JOOQ
  16. CQRS Pattern

Top comments (6)

Jhon Borris

Nice article, very informative, a must read by all developers

Vilgodskiy Sergey

Special respect for "Rich Domain Model". I don't know why, but a lot of developers prefer Anemic one, and it makes me sad :(

Semyon Kirekov

@vilgodskiy_sergey speaking about Rich Domain Model, I wrote an article about that dev.to/kirekov/rich-domain-model-w...

Vilgodskiy Sergey

Cool, thank you! I will definitely read it!

Cezar Cruz

First time I see that, and it looks amazing. Are there any performance drawbacks?

Semyon Kirekov

@cezarcruz
These two approaches have approximately the same performance.

Though there might be a problem if you have too many listeners called during the BEFORE_COMMIT phase. Since they run synchronously, they can impact the transaction running time.

Anyway, it's not the end of the world. You can make your listeners run asynchronously

@Component
public class Listener {
    // Requires @EnableAsync on a configuration class for @Async to take effect
    @TransactionalEventListener(phase = AFTER_COMMIT)
    @Async
    public void captureEvent(SomeEvent event) {
        // do stuff
    }
}

But not all listeners should be run asynchronously. For example, the email sender is a perfect candidate, whilst the one that stores archive records is not.