Reading Time: 6 minutes

In this post we shall see how you can build Continuous Integration pipelines for your PlantUML diagrams. If you do any kind of design or development work, you most probably know PlantUML, one of the best UML design tools. Its main power lies in providing Diagram-as-Code for all types of design and diagramming needs. This "as-a-code" culture is quite prominent these days for multiple reasons, primarily ease of use and management, as we have seen with Infrastructure-as-Code and DevOps.

But there is a problem as well. The rendered diagrams need to be updated manually, since the source is not an image but code. What if you could have a pipeline that automatically generates all your images from the diagram sources? It is a little tricky, but in this post we shall see how you can do exactly that.

PS: you can skip the recap and introduction. Click here to go directly to the Pipeline Setup.

Quick Recap

In short, PlantUML enables you to create diagrams like the one below very simply, from plain text-based code. Refer below.
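To make the recap concrete, here is a minimal sequence-diagram source of the kind PlantUML renders (a sketch; the file name and participants are my own examples, not from the masterclass):

```shell
#!/bin/bash
# Write a minimal PlantUML sequence diagram source to sample.puml.
cat > sample.puml <<'EOF'
@startuml
Alice -> Bob: Authentication Request
Bob --> Alice: Authentication Response
@enduml
EOF
# With plantuml.jar available, rendering it would be:
#   java -jar plantuml.jar -tpng sample.puml
```

Four lines of text yield a complete sequence diagram, which is exactly why keeping the sources in Git and rendering them in CI pays off.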

I have done a 30-minute masterclass that covers everything from installation to rendering. In case you have not already checked it out, you can refer to my previous post here or check out my video below.

The Problem with PlantUML – Generating Rendered Images

The main problem is that the source of the diagram is code. You need to generate a PNG image and then refer to it in your pages. The images can be referenced from a standalone site or GitHub Pages, which picks up updates as you regenerate the rendered images from the .puml files. But as the number of diagrams grows, this becomes a challenge: there is a real chance that some of the images are missed while updating.

Solution

This is something that might or might not happen, but why take the risk? We can simply run the pipeline we see below, and it will auto-generate all the updated images from the code and push them to your Git-based repository.

Let's Start the Pipeline!!!

We know the problem, so let's start with the tools we are going to need. I have specifically tried to keep the setup simple and agnostic: once you know the approach, you can apply it anywhere.

PS: I am using this on a private/organization repo with limited access, to match the most common use case.

Tools Used

We shall be using

  1. Azure Devops
  2. Git based repository
  3. VS Code

Steps to follow

Basically, the pseudocode looks like the following:

  1. Clone the repository
  2. Find the .puml files
  3. Render all the diagrams
  4. Stage the changes (Git detects only the changed images)
  5. Add Git user metadata
  6. Commit and push the changes

All of this is achieved using a simple shell script that looks like the following:

#!/bin/bash
# Clone the diagram repository; $PAT is injected by the pipeline.
git clone https://your-repo:$PAT@dev.azure.com/your-account/project-name/_git/diagram-repo
cd diagram-repo
# Identity for the commits made by the agent.
git config --global user.email "ci-agent@example.com"
git config --global user.name "CI/CD Agent"
mkdir -p images
# Render every top-level .puml file to images/<name>.png.
for i in *.puml; do
    [ -f "$i" ] || break
    newFileName=${i/.puml/.png}
    finalFileName="images/${newFileName}"
    java -Djava.awt.headless=true -jar ../plantuml.jar -p -tpng < "$i" > "$finalFileName"
done
# Only the images that actually changed show up as modifications.
git add .
git commit -m "CI/CD - Diagram update"
git push

A few pointers here:

  1. The images folder is hardcoded; you can update it as needed, supply it as a pipeline variable, or skip it altogether.
  2. In the git clone URL, $PAT is my Personal Access Token (PAT), configured as a pipeline secret. This is what authorizes the push.
  3. You can keep the file as part of the source, or simply put it on Azure Storage/S3 and fetch it (wget) from there.
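As a side note, the name mapping inside the render loop is plain Bash parameter expansion; in isolation it behaves like this (the file name is a made-up example):

```shell
#!/bin/bash
# foo.puml -> images/foo.png, exactly as the render loop does it.
i="architecture.puml"
newFileName=${i/.puml/.png}          # replace ".puml" with ".png"
finalFileName="images/${newFileName}"
echo "$finalFileName"                # prints images/architecture.png
```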

With this simple piece of code, all the heavy lifting is done. But we still need to provide all the ingredients to the image; that part is done in the Docker build. The Dockerfile looks like the following:

FROM openjdk:14-jdk-alpine3.10
ARG PAT=1
ENV PLANTUML_VERSION=1.2020.9
# Install render dependencies; ca-certificates stays installed, since git
# needs it to push over HTTPS. Only wget is removed afterwards.
RUN \
  apk update && \
  apk add --no-cache graphviz git wget ca-certificates ttf-dejavu fontconfig && \
  apk add --no-cache bash && \
  wget "https://downloads.sourceforge.net/project/plantuml/${PLANTUML_VERSION}/plantuml.${PLANTUML_VERSION}.jar" -O plantuml.jar && \
  apk del wget
COPY . .
RUN ["chmod", "+x", "./proc2.sh"]
RUN ./proc2.sh
CMD ["/bin/bash"]

A few pointers here:

  1. The ARG PAT line receives the build argument passed in from Azure DevOps. This is the PAT used for the Git push.
  2. The PLANTUML_VERSION environment variable pins the version of PlantUML being used.
  3. The RUN block updates the APK repositories and installs the dependencies (Graphviz, Git, fonts) before downloading plantuml.jar.
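To illustrate pointer 2: the version variable is interpolated twice into the download URL, so the same expansion can be checked outside Docker (a sketch, not part of the pipeline itself):

```shell
#!/bin/bash
# Reconstruct the URL the Dockerfile fetches for a given PlantUML version.
PLANTUML_VERSION=1.2020.9
URL="https://downloads.sourceforge.net/project/plantuml/${PLANTUML_VERSION}/plantuml.${PLANTUML_VERSION}.jar"
echo "$URL"
```

Bumping PLANTUML_VERSION in one place therefore updates both the directory path and the jar file name.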

Finally, with the Dockerfile ready as well, all we need to glue everything together is the Azure DevOps pipeline configuration file. Refer below.

trigger:
- main

resources:
- repo: self

variables:
  tag: '$(Build.BuildId)'

stages:
- stage: Build
  displayName: Build image
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: ubuntu-latest
    steps:
    - task: Docker@2
      displayName: Build an image
      inputs:
        command: build
        dockerfile: '$(Build.SourcesDirectory)/Dockerfile'
        tags: |
          $(tag)
        arguments: '--build-arg PAT=$(PAT)'

Lastly, the pipeline variables look like this:
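In text form, the variables amount to one secret defined in the pipeline UI plus the tag defined in YAML (a sketch; the variable name PAT matches the script above):

```yaml
# Defined in the Azure DevOps pipeline UI (Variables tab), not in YAML:
#   PAT = <your Personal Access Token>   (mark "Keep this value secret")
#
# Defined in azure-pipelines.yml:
variables:
  tag: '$(Build.BuildId)'
```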

With this setup, and the trigger set to fire on everything on the main branch, the diagrams are rendered and committed back as soon as the .puml files are updated.
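One refinement worth considering: if the diagram sources and the pipeline live in the same repository, the image commit made by the pipeline could retrigger the build. Azure Pipelines supports path filters on the trigger, so you can exclude the rendered-images folder (a sketch; adjust the path if you changed the hardcoded images folder):

```yaml
trigger:
  branches:
    include:
      - main
  paths:
    exclude:
      - images
```

With this filter, commits that only touch images/ no longer start a new run.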

CI Builds

The diagrams are committed as follows.

With the images being auto-updated, any site or GitHub Pages referring to them will get the latest picture. This also absolves developers of the responsibility of updating images: they just commit code, which can be annotated and validated.

Hope you like the content and that it helped you out. Do post any questions in the comments below.

Hands-on Lab

The detailed video and hands-on lab shall be available below shortly.

References:

Docker build by nakatt: nakatt/plantuml:1.2020.9
Personal blog: https://vineetyadav.com/devops/plantuml-ci-pipeline.html

With more than 12 years of experience in IT, the author currently works as an Architect in Cloud and Microservices and helps organizations leverage the Cloud Revolution to its full potential. The author has worked across domains such as Defense, Oil and Gas, Casinos, MedTech, etc. In his spare time, the author loves to play Warcraft and AoE, write blogs, and learn new things.
