
Designing Event-Driven APIs with AsyncAPI

Use the AsyncAPI specification to define asynchronous messaging architectures that REST API specifications cannot express. A practical guide covering Kafka/WebSocket channel documentation, code generation, and CI/CD validation.

AsyncAPI, Event-Driven API, Event-Driven Architecture, Kafka, WebSocket, API Documentation, Asynchronous API, AsyncAPI Generator

Problem

Your microservices exchange events through Kafka topics and WebSocket channels, but there is no documentation for which service publishes or subscribes to which messages. When new team members join, it takes days to figure out Kafka topic names, message formats, and protocol bindings. Schema changes break backward compatibility and cause outages. Just like OpenAPI (Swagger) for REST APIs, you need a standard specification to document event-driven APIs and automate schema validation and code generation.

Required Tools

AsyncAPI CLI

The official CLI tool for validating, converting, and bundling AsyncAPI documents. Provides commands like validate, generate, and diff.

AsyncAPI Studio

A web-based editor for visually editing AsyncAPI documents with real-time preview capabilities.

Apache Kafka

A large-scale event streaming platform. AsyncAPI defines topics, partitions, groupId, and more through Kafka bindings.

Node.js

Runtime environment for AsyncAPI Generator. Generates documentation, client code, and server scaffolding from templates.

Solution Steps

1. Understanding the AsyncAPI Spec Structure

The AsyncAPI specification has a structure similar to OpenAPI but is optimized for asynchronous messaging. At the top level, you define the asyncapi version, info (API metadata), servers (broker connection info), channels (message channels), and components (reusable schemas). Instead of OpenAPI's paths, AsyncAPI uses channels. Instead of HTTP methods, it uses send/receive operations to express message flow directions.

# asyncapi.yaml - Basic structure
asyncapi: 3.0.0
info:
  title: Order Processing API
  version: 1.0.0
  description: |
    Event-driven API for the order processing microservice.
    Publishes events for order creation, payment, and shipping status changes.
  contact:
    name: Platform Team
    email: platform@example.com
  license:
    name: Apache 2.0

servers:
  production:
    host: kafka.prod.example.com:9092
    protocol: kafka
    description: Production Kafka cluster
    security:
      - $ref: '#/components/securitySchemes/saslAuth'
  staging:
    host: kafka.staging.example.com:9092
    protocol: kafka
    description: Staging Kafka cluster

channels:
  orderCreated:
    address: orders.created
    messages:
      OrderCreatedMessage:
        $ref: '#/components/messages/OrderCreated'
  orderStatusChanged:
    address: orders.status-changed
    messages:
      OrderStatusChangedMessage:
        $ref: '#/components/messages/OrderStatusChanged'

operations:
  publishOrderCreated:
    action: send
    channel:
      $ref: '#/channels/orderCreated'
    summary: Publishes an event when a new order is created
  subscribeOrderStatus:
    action: receive
    channel:
      $ref: '#/channels/orderStatusChanged'
    summary: Subscribes to order status change events

components:
  securitySchemes:
    saslAuth:
      type: scramSha256
      description: SASL/SCRAM-SHA-256 authentication
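
The top-level layout above can be modeled as a TypeScript type, which is handy when post-processing AsyncAPI documents in custom tooling. This is a simplified sketch covering only the fields used in this guide, not the full specification:

```typescript
// Simplified sketch of an AsyncAPI 3.0 document's top-level shape.
// Only the sections used in this guide are modeled.
interface AsyncApiDocument {
  asyncapi: string;                      // spec version, e.g. "3.0.0"
  info: { title: string; version: string; description?: string };
  servers?: Record<string, { host: string; protocol: string; description?: string }>;
  channels?: Record<string, { address: string; messages?: Record<string, unknown> }>;
  operations?: Record<string, { action: 'send' | 'receive'; channel: { $ref: string } }>;
  components?: Record<string, unknown>;  // reusable schemas, messages, security schemes
}

// A minimal document mirroring the YAML above
const doc: AsyncApiDocument = {
  asyncapi: '3.0.0',
  info: { title: 'Order Processing API', version: '1.0.0' },
  channels: {
    orderCreated: { address: 'orders.created' },
  },
  operations: {
    publishOrderCreated: { action: 'send', channel: { $ref: '#/channels/orderCreated' } },
  },
};
```
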

2. Defining Channels and Messages

A channel is the path through which messages flow: a topic in Kafka, or an endpoint in WebSocket. In AsyncAPI 3.0, operations use action: send (publish) or action: receive (subscribe) to specify the message flow direction. Each message can define a payload (body schema), headers (metadata), and a correlationId (tracking ID).

# Channel and message definitions (AsyncAPI 3.0)
channels:
  orderCreated:
    address: orders.created
    description: Order creation event channel
    messages:
      OrderCreatedMessage:
        $ref: '#/components/messages/OrderCreated'

  paymentProcessed:
    address: payments.processed
    description: Payment completion event channel
    messages:
      PaymentProcessedMessage:
        $ref: '#/components/messages/PaymentProcessed'

  userNotifications:
    address: /ws/notifications
    description: Per-user real-time notification WebSocket channel
    messages:
      NotificationMessage:
        $ref: '#/components/messages/Notification'

operations:
  publishOrderCreated:
    action: send
    channel:
      $ref: '#/channels/orderCreated'
    summary: Order service publishes new order creation events
    traits:
      - $ref: '#/components/operationTraits/commonHeaders'

  subscribePayment:
    action: receive
    channel:
      $ref: '#/channels/paymentProcessed'
    summary: Order service subscribes to payment completion events
    traits:
      - $ref: '#/components/operationTraits/commonHeaders'

  receiveNotification:
    action: receive
    channel:
      $ref: '#/channels/userNotifications'
    summary: Client receives real-time notifications

components:
  messages:
    OrderCreated:
      name: OrderCreated
      title: Order Created Event
      summary: Event published when a new order is created
      contentType: application/json
      correlationId:
        location: '$message.header#/correlationId'
      headers:
        type: object
        properties:
          correlationId:
            type: string
            format: uuid
          eventTime:
            type: string
            format: date-time
          source:
            type: string
            enum: [order-service]
      payload:
        $ref: '#/components/schemas/OrderCreatedPayload'

    PaymentProcessed:
      name: PaymentProcessed
      title: Payment Processed Event
      contentType: application/json
      payload:
        $ref: '#/components/schemas/PaymentProcessedPayload'

    Notification:
      name: Notification
      title: User Notification
      contentType: application/json
      payload:
        $ref: '#/components/schemas/NotificationPayload'
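
On the producer side, the headers and correlationId defined for OrderCreated translate into code like the following sketch. The record shape follows the kafkajs convention of key/value/headers, and buildOrderCreatedRecord is a hypothetical helper name, not part of any library:

```typescript
import { randomUUID } from 'node:crypto';

// Illustrative producer-side helper: builds a kafkajs-style record whose
// headers match the OrderCreated message definition above (correlationId,
// eventTime, source). The payload is trimmed to three fields for brevity.
function buildOrderCreatedRecord(payload: {
  orderId: string;
  customerId: string;
  totalAmount: number;
}) {
  return {
    key: payload.customerId,            // keeps a customer's orders on one partition
    value: JSON.stringify(payload),
    headers: {
      correlationId: randomUUID(),      // read back via $message.header#/correlationId
      eventTime: new Date().toISOString(),
      source: 'order-service',          // the only value the header enum allows
    },
  };
}
```

Consumers can read correlationId back out of the record headers to trace a single request as it flows across services.
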

3. Writing Message Payloads with JSON Schema

Define the data structure of message payloads rigorously using JSON Schema. Use required fields, enum constraints, and format specifications to establish a clear contract between producers and consumers. Separating schemas into components/schemas allows reuse across multiple messages and reduces duplication via $ref references.

# components/schemas section - Message payload schemas
components:
  schemas:
    OrderCreatedPayload:
      type: object
      required:
        - orderId
        - customerId
        - items
        - totalAmount
        - currency
        - createdAt
      properties:
        orderId:
          type: string
          format: uuid
          description: Unique order identifier
          examples:
            - '550e8400-e29b-41d4-a716-446655440000'
        customerId:
          type: string
          format: uuid
          description: Unique customer identifier
        items:
          type: array
          minItems: 1
          items:
            $ref: '#/components/schemas/OrderItem'
          description: List of ordered products
        totalAmount:
          type: number
          format: double
          minimum: 0
          description: Total order amount
        currency:
          type: string
          enum: [KRW, USD, EUR, JPY]
          description: Currency code (ISO 4217)
        shippingAddress:
          $ref: '#/components/schemas/Address'
        createdAt:
          type: string
          format: date-time
          description: Order creation timestamp (ISO 8601)

    OrderItem:
      type: object
      required: [productId, productName, quantity, unitPrice]
      properties:
        productId:
          type: string
          description: Product ID
        productName:
          type: string
          description: Product name
        quantity:
          type: integer
          minimum: 1
          description: Order quantity
        unitPrice:
          type: number
          format: double
          minimum: 0
          description: Unit price

    Address:
      type: object
      required: [street, city, country, postalCode]
      properties:
        street:
          type: string
        city:
          type: string
        country:
          type: string
        postalCode:
          type: string

    PaymentProcessedPayload:
      type: object
      required: [paymentId, orderId, status, amount, processedAt]
      properties:
        paymentId:
          type: string
          format: uuid
        orderId:
          type: string
          format: uuid
        status:
          type: string
          enum: [SUCCESS, FAILED, REFUNDED]
          description: Payment processing result
        amount:
          type: number
          format: double
        failureReason:
          type: string
          description: Reason for failure (when status is FAILED)
        processedAt:
          type: string
          format: date-time

    NotificationPayload:
      type: object
      required: [notificationId, type, title, body, timestamp]
      properties:
        notificationId:
          type: string
          format: uuid
        type:
          type: string
          enum: [ORDER_UPDATE, PAYMENT, SHIPPING, PROMOTION]
        title:
          type: string
          maxLength: 100
        body:
          type: string
          maxLength: 500
        link:
          type: string
          format: uri
        timestamp:
          type: string
          format: date-time
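
In practice you would compile the JSON Schema with a validator such as Ajv, but this dependency-free sketch makes the contract concrete by hand-rolling the same checks the OrderCreatedPayload schema encodes (required fields, the currency enum, minItems, minimum):

```typescript
// Hand-rolled sketch of the checks encoded by OrderCreatedPayload.
// In real code, compile the JSON Schema with a library like Ajv instead.
const REQUIRED = ['orderId', 'customerId', 'items', 'totalAmount', 'currency', 'createdAt'];
const CURRENCIES = ['KRW', 'USD', 'EUR', 'JPY'];

function validateOrderCreated(payload: Record<string, unknown>): string[] {
  const errors: string[] = [];
  for (const field of REQUIRED) {
    if (payload[field] === undefined) errors.push(`missing required field: ${field}`);
  }
  if (payload.currency !== undefined && !CURRENCIES.includes(payload.currency as string)) {
    errors.push(`currency must be one of ${CURRENCIES.join(', ')}`);
  }
  if (Array.isArray(payload.items) && payload.items.length < 1) {
    errors.push('items must contain at least one entry (minItems: 1)');
  }
  if (typeof payload.totalAmount === 'number' && payload.totalAmount < 0) {
    errors.push('totalAmount must be >= 0 (minimum: 0)');
  }
  return errors;
}
```

Running checks like these on the consumer side, in addition to producer-side validation, catches contract violations even when a rogue producer skips validation.
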

4. Configuring Kafka Bindings

AsyncAPI bindings allow you to include protocol-specific configurations within the specification. Kafka bindings document operational settings such as topic partition count, replica count, groupId, and clientId. Message bindings can define Kafka message keys to explicitly document the partitioning strategy.

# Kafka binding configuration
channels:
  orderCreated:
    address: orders.created
    messages:
      OrderCreatedMessage:
        $ref: '#/components/messages/OrderCreated'
    bindings:
      kafka:
        topic: orders.created
        partitions: 6
        replicas: 3
        topicConfiguration:
          cleanup.policy: [delete]
          retention.ms: 604800000      # 7-day retention
          max.message.bytes: 1048576   # Max message size 1MB
          min.insync.replicas: 2

  orderStatusChanged:
    address: orders.status-changed
    messages:
      OrderStatusChangedMessage:
        $ref: '#/components/messages/OrderStatusChanged'
    bindings:
      kafka:
        topic: orders.status-changed
        partitions: 3
        replicas: 3
        topicConfiguration:
          cleanup.policy: [compact]     # Keep only latest value per key
          retention.ms: -1             # Retain indefinitely

operations:
  subscribeOrderStatus:
    action: receive
    channel:
      $ref: '#/channels/orderStatusChanged'
    bindings:
      kafka:
        groupId:
          type: string
          enum: [shipping-service-group]
        clientId:
          type: string
          enum: [shipping-service]
        bindingVersion: '0.5.0'

components:
  messages:
    OrderCreated:
      name: OrderCreated
      contentType: application/json
      bindings:
        kafka:
          key:
            type: string
            description: |
              Uses customerId as the message key so that
              orders from the same customer are assigned
              to the same partition.
          schemaIdLocation: header
          schemaIdPayloadEncoding: confluent
          bindingVersion: '0.5.0'
      payload:
        $ref: '#/components/schemas/OrderCreatedPayload'

    OrderStatusChanged:
      name: OrderStatusChanged
      contentType: application/json
      bindings:
        kafka:
          key:
            type: string
            description: Uses orderId as the key
          bindingVersion: '0.5.0'
      payload:
        $ref: '#/components/schemas/OrderStatusChangedPayload'

# WebSocket binding example (notification channel; belongs under channels:)
channels:
  userNotifications:
    address: /ws/notifications
    bindings:
      ws:
        method: GET
        query:
          type: object
          properties:
            token:
              type: string
              description: JWT authentication token
          required: [token]
        bindingVersion: '0.1.0'
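
The key binding above documents a real runtime behavior: Kafka's default partitioner hashes the message key (murmur2) modulo the partition count, so records with the same key always land on the same partition. The sketch below uses a simplified rolling hash rather than Kafka's actual algorithm, purely to illustrate the property the binding documents:

```typescript
// Simplified stand-in for Kafka's key-based partitioning (the real
// default partitioner uses murmur2; this is NOT Kafka's algorithm).
function toPartition(key: string, partitions: number): number {
  let hash = 0;
  for (const ch of key) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;  // unsigned 32-bit rolling hash
  }
  return hash % partitions;
}

// Every OrderCreated record keyed by the same customerId maps to the same
// partition, preserving per-customer ordering across the 6 partitions
// declared in the channel binding.
const p1 = toPartition('customer-42', 6);
const p2 = toPartition('customer-42', 6);
```

This is why the key description in the binding matters: a consumer reading the spec learns that per-customer ordering is guaranteed, without reading the producer's source code.
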

5. Generating Documentation with AsyncAPI Generator

AsyncAPI Generator takes an AsyncAPI document as input and automatically generates HTML documentation, client/server code, and type definitions. Use @asyncapi/html-template to generate interactive API documentation and @asyncapi/nodejs-template to scaffold a Node.js project. TypeScript type generation ensures consistent message types across producers and consumers.

# Install AsyncAPI CLI
npm install -g @asyncapi/cli

# Validate AsyncAPI document
asyncapi validate asyncapi.yaml
# output: File asyncapi.yaml is valid!

# Generate HTML documentation
asyncapi generate fromTemplate asyncapi.yaml \
  @asyncapi/html-template \
  -o docs/async-api \
  --force-write
# -> Generates docs/async-api/index.html (interactive API docs)

# Generate Node.js project scaffolding
asyncapi generate fromTemplate asyncapi.yaml \
  @asyncapi/nodejs-template \
  -o generated/nodejs \
  --force-write \
  -p server=production
# -> Generates Kafka producer/consumer boilerplate

# Generate TypeScript types (modelina)
asyncapi generate models typescript asyncapi.yaml \
  -o generated/types
# -> Generates TypeScript interfaces for message payloads

# Generated type example (generated/types/OrderCreatedPayload.ts)
# export interface OrderCreatedPayload {
#   orderId: string;
#   customerId: string;
#   items: OrderItem[];
#   totalAmount: number;
#   currency: 'KRW' | 'USD' | 'EUR' | 'JPY';
#   shippingAddress?: Address;
#   createdAt: string;
# }

# Real-time preview in AsyncAPI Studio
# Paste YAML at https://studio.asyncapi.com to visually inspect
# channel diagrams, message schemas, and binding information

# Bundle multiple AsyncAPI files into one
asyncapi bundle asyncapi.yaml \
  channels/orders.yaml \
  channels/payments.yaml \
  -o bundled-asyncapi.yaml
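
Assuming the Modelina output matches the commented OrderCreatedPayload example, importing the generated types gives producers compile-time safety: a payload missing a required field, or using a currency outside the enum, fails to compile. The interfaces are reproduced inline here for the sketch; normally you would import them from generated/types:

```typescript
// Interfaces mirroring the generated output (normally imported from
// generated/types rather than redeclared).
interface OrderItem { productId: string; productName: string; quantity: number; unitPrice: number }
interface OrderCreatedPayload {
  orderId: string;
  customerId: string;
  items: OrderItem[];
  totalAmount: number;
  currency: 'KRW' | 'USD' | 'EUR' | 'JPY';
  createdAt: string;
}

// Typed event construction: omitting createdAt or writing currency: 'GBP'
// would be a compile-time error, catching schema drift before runtime.
const event: OrderCreatedPayload = {
  orderId: '550e8400-e29b-41d4-a716-446655440000',
  customerId: 'c0a80101-0000-4000-8000-000000000001',
  items: [{ productId: 'p-1', productName: 'Keyboard', quantity: 1, unitPrice: 49.9 }],
  totalAmount: 49.9,
  currency: 'USD',
  createdAt: new Date().toISOString(),
};
```
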

6. Integrating AsyncAPI Validation into CI/CD

Run AsyncAPI document validation and backward-compatibility checks (breaking-change detection) in your CI/CD pipeline whenever the document changes. Use the asyncapi diff command to compare the previous and current versions and surface breaking changes before they merge. Automating documentation generation keeps code and documentation in sync.

# .github/workflows/asyncapi-ci.yml
name: AsyncAPI CI

on:
  push:
    branches: [main]   # required so the deploy step on main actually runs
  pull_request:
    paths:
      - 'asyncapi/**'
      - 'asyncapi.yaml'

jobs:
  validate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Install AsyncAPI CLI
        run: npm install -g @asyncapi/cli

      # 1. Validate document
      - name: Validate AsyncAPI document
        run: asyncapi validate asyncapi.yaml

      # 2. Detect breaking changes (compare with previous version)
      - name: Check breaking changes
        run: |
          git fetch origin main
          git show origin/main:asyncapi.yaml > /tmp/asyncapi-old.yaml
          asyncapi diff /tmp/asyncapi-old.yaml asyncapi.yaml \
            --type breaking
          # Lists only breaking changes; fails the job when any are found

      # 3. Generate HTML documentation
      - name: Generate documentation
        run: |
          asyncapi generate fromTemplate asyncapi.yaml \
            @asyncapi/html-template \
            -o docs/async-api \
            --force-write

      # 4. Generate TypeScript types and verify build
      - name: Generate TypeScript types
        run: |
          asyncapi generate models typescript asyncapi.yaml \
            -o src/generated/async-types
          npx tsc --noEmit  # Verify type compatibility

      # 5. Deploy generated docs to GitHub Pages
      - name: Deploy docs
        if: github.ref == 'refs/heads/main'
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./docs/async-api

# package.json scripts
# {
#   "scripts": {
#     "asyncapi:validate": "asyncapi validate asyncapi.yaml",
#     "asyncapi:docs": "asyncapi generate fromTemplate asyncapi.yaml @asyncapi/html-template -o docs",
#     "asyncapi:types": "asyncapi generate models typescript asyncapi.yaml -o src/types",
#     "asyncapi:diff": "asyncapi diff asyncapi-prev.yaml asyncapi.yaml",
#     "asyncapi:bundle": "asyncapi bundle asyncapi.yaml -o dist/asyncapi.yaml"
#   }
# }
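
To make the breaking-change gate concrete, the sketch below illustrates two of the classifications a tool like asyncapi diff applies when comparing schema versions: a newly required field breaks existing producers, and a removed property breaks existing consumers. The function name and shapes are illustrative only; asyncapi diff performs a far more thorough comparison:

```typescript
// Simplified illustration of breaking-change detection between two JSON
// Schema fragments. Real tooling (asyncapi diff) covers many more cases.
interface SchemaFragment { required: string[]; properties: Record<string, unknown> }

function findBreakingChanges(oldS: SchemaFragment, newS: SchemaFragment): string[] {
  const breaking: string[] = [];
  // Newly required fields break producers built against the old contract.
  for (const f of newS.required) {
    if (!oldS.required.includes(f)) breaking.push(`field now required: ${f}`);
  }
  // Removed properties break consumers that still read them.
  for (const p of Object.keys(oldS.properties)) {
    if (!(p in newS.properties)) breaking.push(`property removed: ${p}`);
  }
  return breaking;
}
```
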

Common Mistakes

Confusing publish/subscribe semantics in AsyncAPI 3.0, resulting in reversed message flow directions

AsyncAPI 3.0 replaces publish/subscribe with operations using action: send/receive, described from the application's own perspective: send means the documented application sends a message, and receive means it receives one. This differs from 2.x, where publish meant that other applications could publish to the channel, so the documented application was actually the one receiving those messages. When migrating to 3.0, double-check the direction of every operation.

Omitting protocol bindings, leaving out operational details like Kafka groupId and partition count

While bindings are optional, they contain essential operational information for production environments. For Kafka, always specify topic partition count, retention policy, and consumer groupId. Without this information, the documentation alone cannot convey the full system design.

AsyncAPI version and Generator template version mismatch causing code generation failures

AsyncAPI 3.0 documents require Generator templates that support 3.0. First validate your document with asyncapi validate, then check the template README for supported versions. When migrating from 2.x to 3.0, use the asyncapi convert command to help with the transition.

Not defining required fields or type constraints in message schemas, allowing invalid messages to be published

List all mandatory fields in the required array and use constraints like enum, format, minimum, and maxLength. Running asyncapi validate alongside JSON Schema validation in CI prevents contract violations before they reach production.
