Updating the Website

As part of the release process we generate and publish the website. An (intentional) side-effect of this process is to create a branch for the release (that is, {page-isisrel}). This branch can then be used for documentation updates.

Update docs

In the regular isis repo:

  • Check out the branch:

    git checkout {page-isisrel}
  • Make documentation updates and commit the changes.

  • Make sure the config properties are up-to-date:

    This is most easily done by rebuilding all:

    pushd bom
    mvn clean install -DskipTests -Dreleased

    Though it might be sufficient to build just the core/config module:

    mvn clean install -pl core/config
  • Make sure the tooling is also built:

    mvn -Dmodule-tooling -Dskip.essential install -DskipTests
  • Generate the website:

    sh preview.sh

    This will write to antora/target/site; we’ll use the results in the next section.

    Note that this requires Java 11 for the projdoc tooling.
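Since the projdoc tooling needs Java 11, it can be worth guarding against the wrong JDK before running preview.sh. A minimal sketch (the banner-parsing logic assumes the usual `java -version` output format, with the version in double quotes):

```shell
#!/usr/bin/env bash
# Parse the major version out of a `java -version` banner line,
# e.g. 'openjdk version "11.0.19" 2023-04-18' -> 11.
java_major() {
  echo "$1" | awk -F '"' '{split($2, v, "."); print v[1]}'
}

# Typical usage before running preview.sh:
#   banner=$(java -version 2>&1 | head -n 1)
#   [ "$(java_major "$banner")" = "11" ] || { echo "need Java 11" >&2; exit 1; }
java_major 'openjdk version "11.0.19" 2023-04-18'   # prints 11
```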

Publish website

We now copy the results of the Antora website generation over to the isis-site repo:

  • In the isis-site repo, check out the asf-site branch:

    cd ../isis-site
    git checkout asf-site
    git pull --ff-only
  • Still in the isis-site repo, run the copyover.sh script:

    sh copyover.sh

    This deletes everything in content/ except the schema and versions directories, then copies the generated Antora site into the isis-site repo’s content/ directory:

    #!/usr/bin/env bash
    pushd content
    for a in $(ls -1 | grep -v schema | grep -v versions)
    do
        rm -rf $a
    done
    popd
    pushd ../isis
    cp -Rf antora/target/site/* ../isis-site/content/.
    popd
    git add .
  • Commit the changes and preview:

    git commit -m "updates website"
    sh preview.sh
  • If everything looks ok, then push the changes to make live, and switch back to the isis repo:

    git push origin asf-site
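The deletion step in copyover.sh is easy to get wrong, so it can be rehearsed in a throwaway directory first. A sketch (the sandbox layout with docs/ and images/ is invented for illustration; only the schema and versions exclusions come from the script):

```shell
#!/usr/bin/env bash
# Rehearse copyover.sh's deletion step in a sandbox: everything under
# content/ except schema/ and versions/ should be removed.
sandbox=$(mktemp -d)
mkdir -p "$sandbox"/content/{schema,versions,docs,images}

pushd "$sandbox"/content >/dev/null
for a in $(ls -1 | grep -v schema | grep -v versions)
do
  rm -rf "$a"
done
popd >/dev/null

ls "$sandbox"/content   # schema and versions remain; docs and images are gone
```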

Update the Algolia search index

Create an algolia.env file in the root of isis-site, holding the APP_ID and the admin API_KEY.

This file should not be checked into the repo, because the API_KEY allows the index to be modified or deleted.
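A minimal algolia.env might look like the following (placeholder values; the exact variable names the scraper expects should be checked against the DocSearch documentation):

```shell
# algolia.env -- keep out of version control; the admin key is a secret
APPLICATION_ID=<your-app-id>
API_KEY=<your-admin-api-key>
```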

We use the Algolia-provided Docker image for the crawler to build and upload the search index (as per the docs):

cd content
docker run -it --env-file=../algolia.env -e "CONFIG=$(cat ../algolia-config.json | jq -r tostring)" algolia/docsearch-scraper:v1.16.0

This posts the index up to the Algolia site.
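For reference, the docsearch-scraper config is JSON along these lines (a minimal sketch only; the actual algolia-config.json in the repo is authoritative, and the index name, URL, and selectors here are placeholders):

```json
{
  "index_name": "<your-index>",
  "start_urls": ["https://example.org/"],
  "selectors": {
    "lvl0": "h1",
    "lvl1": "h2",
    "lvl2": "h3",
    "text": "p, li"
  }
}
```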

Additional config options for the crawler can be found here.