
Diving Into Java and Spring Boot at Buildo

We recently used Java and Spring Boot for the first time at Buildo. We were keen to gain experience with them, see what the state of Java development is in 2024, and see how we could reuse our experience from the Scala world.

Tommaso Petrucciani
Full Stack Software Engineer & Backend Lead
June 28, 2024
10 minutes read

Scala and TypeScript are the main programming languages we’ve been using for years at Buildo. We’ve used Scala as our main backend language since the start, but over the years we’ve been using TypeScript and Node more and more as their ecosystem matured and commercial interest in them grew. Since we’re a consultancy working with very different projects and clients, we’ve also occasionally used other languages as specific projects required.

In a recent project, we used Java – a first for us at Buildo. Our client, in the healthcare business, needed to avoid further fragmentation of their already complex technology stack. Besides responding to that need, we wanted to make the most of the opportunity to explore the Java world. Several of us had previous experience with or knowledge of Java, but we had never used it at Buildo, and we wanted to see how it had evolved in recent years and how our experience with Scala and TypeScript could be reused there. This post shares the choices we made and our experience along the way.

Tech Stack Choice and Objectives

Apart from the programming language, we had no hard technical constraints on the project. So, at the start, we had the interesting but somewhat daunting task of selecting a complete stack for a web application backend.

In evaluating frameworks and libraries, we wanted to balance different concerns. Using established and widespread ones would make our experience more reusable with future customers and projects. However, we also wanted an up-to-date stack, and one where we could leverage our experience with Scala and TypeScript by adopting functional programming patterns to a significant degree. Our customer was also interested in seeing whether our tech choices and patterns could seed innovation in their other projects in the future.

So these were the three main choices we made:

  • to use Java 21, which had just been released, and leverage newly stabilized features like pattern matching and virtual threads;
  • to use Spring Boot because it’s so widespread that working with it would be most valuable (though more lightweight frameworks, especially Helidon, look closer to our best-loved patterns);
  • to use jOOQ for database access.

The Good and the Not-So-Good

Overall, we’re satisfied with the balance we struck, though there’s plenty more to explore and improve in future projects. Here are some thoughts on our experience with Spring Boot, with the new Java features themselves, and more. Not everything was new to us, of course: when working with Scala we already use some Java libraries (Log4j 2 or the Java Date/Time API, to name a couple).

Spring Boot

Choosing a full-fledged framework allowed us to speed up bootstrapping and guided many of the individual choices around structuring the code, routing, error handling, and more. In the Scala world, we’ve tended to distrust frameworks and piece together a stack from several more orthogonal libraries (for example, in recent projects: ZIO, http4s, tapir, Slick or Doobie, circe). But the advantage of a comprehensive and opinionated framework was undeniable at the start.

We were a bit worried by all the automagic stuff – annotations for dependency injection and the like. In practice, however, it all worked quite well and we didn’t run into many issues. It remains a concern, though, that a large framework comes with many behaviours that might not be obvious, especially to new users.
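For example, a controller and service wired together through constructor injection look roughly like the following sketch (all class and endpoint names here are hypothetical, not our project’s code):

import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// A minimal sketch of Spring's annotation-driven wiring; names are made up.
record ComponentDto(String id, String name) {}

@Service
class ComponentService {
    ComponentDto findById(String id) {
        // Placeholder implementation, just for illustration.
        return new ComponentDto(id, "example");
    }
}

@RestController
@RequestMapping("/components")
class ComponentController {

    private final ComponentService service;

    // Spring injects the ComponentService bean through the constructor;
    // with a single constructor, no @Autowired annotation is needed.
    ComponentController(ComponentService service) {
        this.service = service;
    }

    @GetMapping("/{id}")
    ComponentDto get(@PathVariable String id) {
        return service.findById(id);
    }
}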

Extensive automated tests were very valuable to ensure the application behaved as expected, including in corner cases (e.g., concerning validation and error handling) which are more likely to be missed.
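As a sketch of what such a test might look like (the endpoint and expectations are hypothetical), MockMvc lets us exercise validation and error handling through the whole web layer:

import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.http.MediaType;
import org.springframework.test.web.servlet.MockMvc;

@SpringBootTest
@AutoConfigureMockMvc
class ComponentApiTest {

    @Autowired
    private MockMvc mockMvc;

    @Test
    void rejectsRequestWithMissingRequiredFields() throws Exception {
        // Corner case: a body missing mandatory fields should produce a 400,
        // not a 500 or a silently accepted request.
        mockMvc.perform(post("/components")
                .contentType(MediaType.APPLICATION_JSON)
                .content("{}"))
            .andExpect(status().isBadRequest());
    }
}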

Modern Java

Turning to Java itself, we were happy with the newish features: records, sealed interfaces, and pattern matching. These allowed us to write domain models and business logic close to what we are used to in Scala with case classes and sealed traits (or Scala 3 enumerations), which overall means less boilerplate.

For example, modelling the outcome of a request as

sealed interface UpdateComponentRequestResult {
    record Started()
        implements UpdateComponentRequestResult {}
    record AgentError(HttpStatusCode responseCode, String responseBody)
        implements UpdateComponentRequestResult {}
    record FailedToContactAgent(Exception exception)
        implements UpdateComponentRequestResult {}
}

and pattern matching on it as

switch (result) {
    case UpdateComponentRequestResult.Started _ -> { ... }
    case UpdateComponentRequestResult.AgentError(var responseCode, var responseBody) -> { ... }
    case UpdateComponentRequestResult.FailedToContactAgent(var exception) -> { ... }
}

feels safe and familiar.

We also used Optional extensively to limit the use of null and thus gain more type safety. This is contentious in Java: the conventional use of Optional is mostly limited to return values, to support fluent APIs, rather than as a full-fledged replacement for null, and its use for fields or method parameters is often discouraged (see for example https://nipafx.dev/design-java-optional/). However, it allowed us to maintain a pattern familiar from Scala, without resorting to nullability annotations whose behaviour and limitations we might not understand as well, or which might depend more heavily on additional tooling. We did use annotations at the application boundaries, though, for validation.
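For instance (with hypothetical names, not our project’s code), a lookup returns Optional rather than a possibly-null reference, and callers have to handle absence explicitly:

import java.util.Map;
import java.util.Optional;

// A sketch of the pattern we followed inside the application; names are made up.
record Agent(String id, String name) {}

class AgentRepository {
    private final Map<String, Agent> storage = Map.of();

    // Absence is part of the return type, not a null the caller must remember to check.
    Optional<Agent> findById(String id) {
        return Optional.ofNullable(storage.get(id));
    }
}

class AgentService {
    private final AgentRepository repository;

    AgentService(AgentRepository repository) {
        this.repository = repository;
    }

    String displayName(String id) {
        return repository.findById(id)
            .map(Agent::name)
            .orElse("unknown agent");
    }
}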

Overall we felt that, especially thanks to records, sealed interfaces, and switch expressions, writing domain models and business logic with immutability and type safety in mind is more convenient than in the past and closer to Scala, albeit with more syntactic heaviness (those endless .stream()....toList() chains!).
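A hypothetical example of the kind of heaviness we mean: where Scala maps over a list directly, Java needs the .stream()/.toList() round trip:

import java.util.List;

class StreamHeaviness {
    record Component(String id, String name) {}

    // Scala's `components.map(_.name)` becomes:
    static List<String> names(List<Component> components) {
        return components.stream()
            .map(Component::name)
            .toList();
    }
}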

Editor support and tooling are also very good, at least when working with IntelliJ, and an improvement over Scala in that respect, though we found VS Code support less robust at the start (e.g., we hit this issue with the recently introduced string templates, though it has since been fixed).

It’s worth noting that this was a fairly small application, with quite simple orchestration and, in particular, no need to manage complex concurrency. This meant we could work in a straightforward one-thread-per-request style, backed by virtual threads. This looks simpler than using explicitly async abstractions, like Promise in TypeScript or Future/ZIO in Scala, but we still have to try out the new structured concurrency API.
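We mostly just let Spring Boot run request handling on virtual threads, but the underlying model is plain Java; here is a minimal sketch (not project code) of the one-thread-per-blocking-task idea using the standard virtual-thread executor:

import java.time.Duration;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

class VirtualThreadsSketch {
    public static void main(String[] args) {
        // One virtual thread per task: blocking code stays simple, with no
        // Promise/Future-style wrappers leaking into the signatures.
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < 1_000; i++) {
                int request = i;
                executor.submit(() -> {
                    // Simulate a blocking call (e.g., an HTTP or database round trip).
                    try {
                        Thread.sleep(Duration.ofMillis(100));
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                    System.out.println("handled request " + request);
                });
            }
        } // close() waits for the submitted tasks to complete
    }
}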

jOOQ

The application had fairly simple requirements for persistence (in terms of data complexity and workload). For project-specific reasons, we decided to use the H2 database in embedded mode. To work with it, we used jOOQ for database access and Flyway to manage the schema evolution with migrations.

This let us stay close to how we prefer to work in other tech stacks as well and limited the risk of misunderstanding the behaviour of our tools. Specifically, we already use Flyway in Scala projects, and we typically use Slick to access the database: jOOQ felt familiar because it is essentially a (mostly) type-safe query builder for expressing SQL queries in a Java embedded DSL, making it easy to understand what SQL they will produce.

In contrast, abstracting away from SQL, e.g. with Hibernate and Spring Data JPA, could have meant running into more surprising behaviour and gotchas (as we did in the past with ORMs like TypeORM and, to a lesser extent, Prisma). This is the same concern we had about Spring, but for the database layer we thought the tradeoffs leant towards staying closer to SQL, even at the cost of a bit more boilerplate. We think the choice worked out fine: we didn’t hit any issues or unexpected behaviour with jOOQ, and the database access code, while a bit verbose, is easy to understand and edit.
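To give an idea of why jOOQ felt familiar, a query reads much like the SQL it generates. A sketch assuming jOOQ’s generated table classes (the COMPONENT table and its columns are hypothetical):

import static com.example.db.generated.tables.Component.COMPONENT;

import java.util.List;
import org.jooq.DSLContext;

// Hypothetical generated table COMPONENT; the DSLContext is typically injected
// (the Spring Boot jOOQ starter auto-configures one).
class ComponentQueries {

    private final DSLContext dsl;

    ComponentQueries(DSLContext dsl) {
        this.dsl = dsl;
    }

    List<String> namesOfActiveComponents() {
        // Reads almost exactly like the SQL it produces:
        // SELECT name FROM component WHERE active = true ORDER BY name
        return dsl.select(COMPONENT.NAME)
            .from(COMPONENT)
            .where(COMPONENT.ACTIVE.isTrue())
            .orderBy(COMPONENT.NAME)
            .fetch(COMPONENT.NAME);
    }
}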

Validation and OpenAPI Generation

Validation is an area we’re less happy about. Consider the validation of fields in JSON HTTP request bodies. We used the well-established Bean Validation (this is a good intro article to it) and it worked out fine. But null checking is still suboptimal, because we must manually make sure that each field is either an Optional<SomeType> or marked with a @NotNull annotation.

This is due to our use of Optional inside the application to avoid null, keeping @NotNull only at the boundaries – something we might re-evaluate in the future. More generally, though, annotation-based validation doesn’t directly tie the validation to be performed to the Java type obtained when parsing the request, so the two can be mismatched: for example, a field might lack a @NotNull check and yet not be marked as Optional.
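Concretely, a request body ends up looking something like this sketch (field names are hypothetical): required fields carry @NotNull, genuinely optional ones are Optional, and nothing forces the two conventions to stay in sync when a new field is added:

import java.util.Optional;

import jakarta.validation.constraints.NotNull;
import jakarta.validation.constraints.Size;

// A hypothetical request body: @NotNull marks required fields at the boundary,
// Optional marks fields that may legitimately be absent. Nothing stops us from
// forgetting both on a new field, which is the mismatch described above.
record CreateComponentRequest(
    @NotNull @Size(max = 100) String name,
    Optional<String> description
) {}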

We usually prefer a “parse, don’t validate” approach, where parsing an unknown input into a model and validating it go hand in hand in a single schema object; Zod is a good example of this in the TypeScript world. However, deserializing JSON with Jackson is itself type safe, which limits the risks of the annotation-based approach (in contrast, the same approach used in NestJS in the TypeScript world proved more error-prone in our experience).

Finally, we often use an OpenAPI specification to link the backend and frontend. We prefer to describe the endpoints in the backend code (we use tapir in Scala) and generate the specification from them. Then, we generate client code and data models for the frontend application from the specification.

Here, we used SpringDoc to generate the OpenAPI from our controllers. We ran into a few issues though, mostly concerning arrays or maps in request bodies, where schema annotations were incorrectly applied to both the array or map itself and its elements. For example, in this controller parameter

@RequestBody
@ArraySchema(schema = @Schema(minLength = 0, maxLength = 1000))
@Size(min = 1, max = 100) List<@Size(min = 0, max = 1000) String> logs

we added a redundant @ArraySchema to define the length constraint of the strings in the array; otherwise that constraint (already expressed by the second @Size annotation) was not picked up correctly. Even so, the generated specification is partially incorrect (minLength should be 0, not 1):

requestBody:
  content:
    application/json:
      schema:
        maxItems: 100
        minItems: 1
        type: array
        items:
          maxLength: 1000
          minLength: 1
          type: string
  required: true

Given these issues, in future Java projects we’ll try to better understand the behaviour and limitations of SpringDoc, or assess alternatives.

For the Future

Overall, we’re satisfied with how this stack worked out during the project. While we don’t claim to have become Java experts or to have the broad knowledge we have in our main tech stacks, we were able to get started quickly and deliver the project with good quality and speed. We could capitalize on our experience with Scala, both in terms of FP patterns that can be applied, to some extent, in modern Java, and in terms of knowledge of the JVM and of some Java libraries we could reuse. We’re looking forward to extending our knowledge and refining this stack in a future project!

Tommaso Petrucciani
Full Stack Software Engineer & Backend Lead

Tommaso joined Buildo as a full-stack engineer in 2019. He loves functional programming languages (and did a PhD on them) but at Buildo he has broadened his interest to all backend development and software architecture and is now the backend Tech Lead.

