This guide covers best practices for developing Buildkite plugins, with examples from this modernized template.
For plugins with <100 lines of logic:
- Keep logic in the appropriate hook files (`hooks/command`, `hooks/pre-command`, etc.)
- Use shared utilities from `lib/shared.bash` and `lib/plugin.bash`
- Skip complex directory structures
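For a small plugin, the layout from this template reduces to something like the following sketch (file names taken from the sections below):

```
my-plugin/
├── plugin.yml
├── hooks/
│   └── command
├── lib/
│   ├── shared.bash
│   └── plugin.bash
└── tests/
    └── command.bats
```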
Consider modules (shared functionality in `lib/modules/`) when you have:
- Complex shared logic across multiple hooks
- Distinct feature areas (auth, deploy, notify)
- >200 lines of logic in a single file
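As an illustration, a module can be a plain bash file that defines related functions behind a load guard, so several hooks can source it safely. The module name and function below are hypothetical, not part of the template:

```bash
#!/bin/bash
# Hypothetical module file, e.g. lib/modules/notify.bash.

# A load guard keeps the module idempotent when multiple hooks source it.
if [[ -n "${MYPLUGIN_NOTIFY_LOADED:-}" ]]; then
  return 0 2>/dev/null || exit 0
fi
MYPLUGIN_NOTIFY_LOADED=1

notify_message() {
  local status="$1" text="$2"
  echo "[notify:${status}] ${text}"
}
```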
Consider providers (cloud-specific implementations in `lib/providers/`) when you have:
- Multiple cloud providers (AWS vs GCP vs Azure implementations)
- Different backends with unique authentication/configuration
- Provider-specific logic that doesn't apply to other implementations
For provider-specific handling, create separate modules like lib/providers/aws.bash, lib/providers/gcp.bash, etc.
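One way to wire this up is a small dispatcher that maps the configured provider to its implementation file, rejecting anything outside the supported set (which also prevents path injection through the config value). The helper below is a sketch, not part of the template:

```bash
#!/bin/bash
set -euo pipefail

# Hypothetical dispatcher: resolve the provider implementation file.
provider_file() {
  local provider="${1:-aws}"
  case "${provider}" in
    aws|gcp|azure) echo "lib/providers/${provider}.bash" ;;
    *) echo "Unsupported provider: ${provider}" >&2; return 1 ;;
  esac
}

# A hook would then load it once, e.g.:
#   source "$(dirname "${BASH_SOURCE[0]}")/../$(provider_file "${BUILDKITE_PLUGIN_MYPLUGIN_PROVIDER:-aws}")"
```

Each provider file is expected to define the same function names, so the hooks themselves stay provider-agnostic.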
Use the environment hook for complex plugins only:
- Expensive dependency checking that should run once (`check_dependencies docker aws`)
- Authentication setup that persists across multiple hooks
- Setting up environment variables shared between the pre-command, command, and post-command hooks

Note: most plugins should not need an environment hook just to perform validations; keep that logic inside the hook that contains the actual functionality.
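For reference, the shared helpers used throughout this guide could look roughly like this. These are assumptions about `lib/shared.bash`; the template's real implementations may differ in output and detail:

```bash
#!/bin/bash
# Sketches of shared helpers (assumed, not the template's actual code).

validate_required_config() {
  local name="$1" value="${2:-}"
  if [[ -z "${value}" ]]; then
    echo "Error: ${name} is required" >&2
    exit 1
  fi
}

check_dependencies() {
  local missing=() cmd
  for cmd in "$@"; do
    command -v "${cmd}" >/dev/null 2>&1 || missing+=("${cmd}")
  done
  if (( ${#missing[@]} > 0 )); then
    echo "Missing required dependencies: ${missing[*]}" >&2
    return 1
  fi
}
```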
Use the command hook for:
- Main plugin execution logic
- Processing that requires validated environment
- Operations that should only run once
- Pre-command: Setup that affects the main command
- Post-command: Reporting, artifact handling (success path only)
- Pre-exit: Cleanup operations (guaranteed to run even on cancellation)
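Because pre-exit runs even on cancellation, it is the right place for cleanup. A minimal sketch (the variable and helper names are hypothetical) that guards each step so cleanup never masks the build result:

```bash
#!/bin/bash
set -euo pipefail

# Hypothetical pre-exit hook: remove temporary state left by earlier hooks.
cleanup_tmpdir() {
  local tmpdir="${MYPLUGIN_TMP_DIR:-}"   # assumed exported by an earlier hook
  if [[ -n "${tmpdir}" && -d "${tmpdir}" ]]; then
    rm -rf "${tmpdir}"
    echo "Removed temporary directory ${tmpdir}"
  fi
}

cleanup_tmpdir || echo "Cleanup failed (ignored)" >&2
```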
Always use bash strict mode to catch errors early:
```bash
#!/bin/bash
set -euo pipefail  # Exit on error, undefined vars, pipe failures
```

- `-e`: Exit immediately if any command fails
- `-u`: Exit on undefined variables
- `-o pipefail`: Fail if any command in a pipeline fails
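The effect of `-o pipefail` is easy to see: without it, a pipeline's exit status is just the last command's.

```bash
# Last command wins without pipefail, so the failure is masked...
bash -c 'false | true; echo "exit: $?"'                  # exit: 0
# ...while pipefail propagates any failure in the pipeline.
bash -c 'set -o pipefail; false | true; echo "exit: $?"' # exit: 1
```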
Always add error traps to show where failures occur:
```bash
#!/bin/bash
set -euo pipefail

# Load shared utilities
source "$(dirname "${BASH_SOURCE[0]}")/../lib/shared.bash"

# Set up error reporting early
setup_error_trap

# Your plugin logic here
```

This provides essential debugging information for both developers and users when plugins fail unexpectedly.
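As an illustration, such a trap can be quite small. This is a sketch of what an error-reporting trap could look like; the template's real `setup_error_trap` may print richer context:

```bash
#!/bin/bash
# Hypothetical error trap helper.
report_error() {
  local exit_code="$1" line="$2"
  echo "^^^ +++" >&2   # Buildkite log marker: expand the failing group
  echo "Error on line ${line} (exit code ${exit_code})" >&2
}

setup_error_trap() {
  trap 'report_error "$?" "${LINENO}"' ERR
}
```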
```bash
# In the environment hook - fail fast
validate_required_config "registry URL" "${registry_url}"
check_dependencies docker aws

# In the command hook - validate before expensive operations
if ! docker info >/dev/null 2>&1; then
  log_error "Docker daemon is not running"
  exit 1
fi
```

Use the provided logging helpers for consistent output:
```bash
# Bad
echo "Error: Authentication failed" >&2

# Good - use logging helpers with descriptive messages
log_error "Failed to authenticate with registry ${registry_url}. Check your credentials and network connectivity."
log_warning "AWS CLI not found, trying docker login directly"
log_info "Processing ${image_count} images"
log_success "All images pushed successfully"
```

Try the preferred method first and fall back to an alternative when available:
```bash
if ! command_exists aws; then
  log_warning "AWS CLI not found, trying docker login directly"
  # Alternative authentication method
fi
```

Fail immediately when required configuration is missing, and give optional configuration a sensible default:
```bash
api_url=$(plugin_read_config API_URL "")
validate_required_config "API URL" "${api_url}"

# Optional - provide sensible default
timeout=$(plugin_read_config TIMEOUT "30")
```

Handle both single values and arrays:
```bash
if plugin_read_list_into_result TAGS; then
  for tag in "${result[@]}"; do
    log_info "Processing tag: ${tag}"
  done
fi
```

Plugin Tester - Run all tests:
```bash
docker run -it --rm -v "$PWD:/plugin:ro" buildkite/plugin-tester
```

Plugin Linter - Validate plugin structure:
```bash
# Replace 'your-plugin-name' with your actual plugin name
docker run -it --rm -v "$PWD:/plugin:ro" buildkite/plugin-linter --id your-plugin-name --path /plugin
```

ShellCheck - Static analysis for shell scripts:
```bash
shellcheck hooks/* tests/* lib/*.bash lib/modules/* lib/providers/*
```

Test individual functions:
```bash
@test "validates required config" {
  export BUILDKITE_PLUGIN_MYPLUGIN_API_TOKEN=""
  run validate_required_config "API token" "${BUILDKITE_PLUGIN_MYPLUGIN_API_TOKEN}"
  assert_failure 1  # ensure it fails with exit code 1
  assert_output --partial "API token is required"
}
```

Test full plugin execution with realistic scenarios:
```bash
@test "handles missing dependencies gracefully" {
  # Mock missing command
  run hooks/command
  assert_failure
  assert_output --partial "Missing required dependencies"
}
```

```
tests/
└── command.bats  # Plugin functionality tests
```
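One way to mock a missing or broken dependency, as in the test above, is to shadow the real command with a failing stub earlier on `PATH`. This is a sketch of the technique, not the template's test setup:

```bash
#!/bin/bash
set -euo pipefail

# Create a stub `docker` that always fails with "command not found" status.
stub_dir=$(mktemp -d)
printf '#!/bin/sh\nexit 127\n' > "${stub_dir}/docker"
chmod +x "${stub_dir}/docker"
export PATH="${stub_dir}:${PATH}"
```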
```bash
# Bad - slow network call in environment hook
validate_api_connectivity "${api_url}"

# Good - defer expensive operations to command hook
export PLUGIN_API_URL="${api_url}"
```

Instead of doing:
```bash
get_account_id() {
  aws sts get-caller-identity --query Account --output text
}

# somewhere else
ACCOUNT_ID="$(get_account_id)"
```

Cache the result of the call:
```bash
get_account_id() {
  # Cache account ID lookup
  if [[ -z "${CACHED_ACCOUNT_ID:-}" ]]; then
    CACHED_ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)
    export CACHED_ACCOUNT_ID
  fi
  echo "$CACHED_ACCOUNT_ID"
}

# somewhere else
ACCOUNT_ID="$(get_account_id)"
```

Validate inputs to prevent injection attacks:
```bash
if [[ ! "${region}" =~ ^[a-z0-9-]+$ ]]; then
  log_error "Invalid region format: ${region}"
  exit 1
fi
```

Do not reference secrets directly in plugin options, in the pipeline, or via specially named variables.
In the following pipeline, SECRET_API_TOKEN gets interpolated in the step that performs the pipeline upload and is not correctly redacted afterwards (unless the agent's configuration is modified):
```yaml
plugins:
  - myplugin#v1.0.0:
      my-secret-variable: $SECRET_API_TOKEN
```

Instead, take an environment variable name and take advantage of bash's indirect expansion:
```yaml
plugins:
  - myplugin#v1.0.0:
      secret-variable-name: MY_VARIABLE
```

And in the code:
```bash
SECRET_VARIABLE_NAME="$(plugin_read_config SECRET_VARIABLE_NAME)"
SECRET_TOKEN="${!SECRET_VARIABLE_NAME}"
```

Always support debug mode:
```bash
enable_debug_if_requested  # Enables set -x if BUILDKITE_PLUGIN_DEBUG=true
```

Log progress so users can follow long-running operations:

```bash
log_info "Authenticating with ${registry_host}"
log_info "Pushing ${image_count} images"
log_success "All images pushed successfully"
```

Keep README examples simple and focused. Show the most common use cases clearly rather than trying to cover every scenario.
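For instance, a README usage snippet like this covers the common case in a few lines (plugin and option names are hypothetical, mirroring the config keys used above):

```yaml
steps:
  - label: "Deploy"
    command: make deploy
    plugins:
      - myplugin#v1.0.0:
          api-url: "https://api.example.com"
          timeout: 60
```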