Building Modern CI/CD Pipelines with Jenkins and Groovy: A Practical Guide (Part 2)
Welcome to the second part of our series on building production-ready CI/CD pipelines, where we wrap up our exploration of Jenkins and Groovy basics. If you somehow landed here without reading Part 1, you'll want to go through that first (besides, reading Part 2 first would be a little odd!). In this part, we're going to look at how Groovy integrates with Jenkins.
The Declarative Pipeline Structure
Jenkins offers two pipeline syntaxes. We'll start with declarative because it's more structured and easier to understand:
pipeline {
    agent any

    environment {
        APP_NAME = 'my-awesome-app'
        BUILD_VERSION = "${env.BUILD_NUMBER}"
        AWS_REGION = 'us-east-1'
    }

    stages {
        stage('Build') {
            steps {
                echo "Building ${APP_NAME} version ${BUILD_VERSION}"
                sh 'mvn clean package'
            }
        }
        stage('Test') {
            steps {
                echo 'Running tests...'
                sh 'mvn test'
            }
        }
        stage('Package') {
            steps {
                script {
                    def timestamp = new Date().format('yyyyMMdd-HHmmss')
                    def artifactName = "${APP_NAME}-${BUILD_VERSION}-${timestamp}.jar"
                    echo "Creating artifact: ${artifactName}"
                    // Ensure the target directory exists before copying
                    sh "mkdir -p artifacts && cp target/*.jar artifacts/${artifactName}"
                }
            }
        }
    }

    post {
        success {
            echo 'Pipeline completed successfully!'
        }
        failure {
            echo 'Pipeline failed. Check the logs.'
        }
    }
}
Key components explained:
agent: Where the pipeline runs (any available agent, or specific labeled nodes)
environment: Variables available throughout the pipeline
stages: The major phases of your pipeline
steps: Individual actions within a stage
script: Allows you to use scripted Groovy inside declarative pipelines
post: Actions to take after pipeline completion
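As a quick illustration of the agent directive's second option, you can pin a pipeline to specific labeled nodes instead of running on any available agent. This is a minimal sketch; the label name linux-docker is just an example and would need to match a label configured on your Jenkins nodes:

```groovy
pipeline {
    // Run only on agents that carry this label in the Jenkins node configuration
    agent { label 'linux-docker' }

    stages {
        stage('Build') {
            steps {
                echo 'Running on a node labeled linux-docker'
            }
        }
    }
}
```

Labeled agents are handy when a stage needs specific tooling (Docker, a particular OS, GPU hardware) that only some of your build nodes provide.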
Adding Intelligence with Script Blocks
The script block is where your Groovy knowledge shines. It lets you add logic to declarative pipelines:
pipeline {
    agent any

    environment {
        AWS_REGION = 'us-east-1'  // referenced in the deploy step below
    }

    stages {
        stage('Deploy') {
            steps {
                script {
                    // Determine environment based on branch
                    def targetEnv = 'dev'
                    if (env.BRANCH_NAME == 'main') {
                        targetEnv = 'production'
                    } else if (env.BRANCH_NAME == 'develop') {
                        targetEnv = 'staging'
                    }
                    echo "Deploying to ${targetEnv} environment"

                    // Different configurations per environment
                    def envConfig = [
                        dev:        [instances: 1, size: 'small'],
                        staging:    [instances: 2, size: 'medium'],
                        production: [instances: 5, size: 'large']
                    ]
                    def config = envConfig[targetEnv]

                    sh """
                        aws ecs update-service \\
                            --cluster ${targetEnv}-cluster \\
                            --service web-service \\
                            --desired-count ${config.instances} \\
                            --region ${AWS_REGION}
                    """
                }
            }
        }
    }
}
Why this pattern rocks: You can maintain a single Jenkinsfile that intelligently handles all environments, reducing duplication and errors.
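As a side note, the branch-to-environment if/else chain above can also be expressed as a plain Groovy map lookup with a fallback, which keeps the mapping data-driven. This is just a sketch of an alternative style, not a required pattern:

```groovy
// Map branches to environments; anything unlisted (e.g. feature branches) falls back to 'dev'
def branchToEnv = [main: 'production', develop: 'staging']
def targetEnv = branchToEnv[env.BRANCH_NAME] ?: 'dev'
echo "Deploying to ${targetEnv} environment"
```

The Elvis operator (`?:`) supplies the default, so adding a new branch mapping is a one-line change to the map rather than another else-if clause.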
Practical Pipeline: Multi-Environment Deployment
Let's build something closer to what you'd use in production:
pipeline {
    agent any

    environment {
        AWS_REGION = 'us-east-1'
        ECR_REPO = '123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app'
        APP_NAME = 'my-web-app'
    }

    parameters {
        choice(
            name: 'DEPLOY_ENV',
            choices: ['dev', 'staging', 'production'],
            description: 'Target environment for deployment'
        )
        booleanParam(
            name: 'RUN_TESTS',
            defaultValue: true,
            description: 'Run test suite before deployment'
        )
    }

    stages {
        stage('Checkout') {
            steps {
                checkout scm
                script {
                    env.GIT_COMMIT_SHORT = sh(
                        script: 'git rev-parse --short HEAD',
                        returnStdout: true
                    ).trim()
                }
            }
        }
        stage('Build Docker Image') {
            steps {
                script {
                    def imageTag = "${env.BUILD_NUMBER}-${env.GIT_COMMIT_SHORT}"
                    sh """
                        docker build \\
                            -t ${APP_NAME}:${imageTag} \\
                            -t ${APP_NAME}:latest \\
                            .
                    """
                    // Store for later stages
                    env.IMAGE_TAG = imageTag
                }
            }
        }
        stage('Run Tests') {
            when {
                expression { params.RUN_TESTS == true }
            }
            steps {
                sh "docker run --rm ${APP_NAME}:${env.IMAGE_TAG} npm test"
            }
        }
        stage('Push to ECR') {
            steps {
                script {
                    sh """
                        aws ecr get-login-password --region ${AWS_REGION} | \\
                            docker login --username AWS --password-stdin ${ECR_REPO}
                        docker tag ${APP_NAME}:${env.IMAGE_TAG} ${ECR_REPO}:${env.IMAGE_TAG}
                        docker tag ${APP_NAME}:${env.IMAGE_TAG} ${ECR_REPO}:${params.DEPLOY_ENV}-latest
                        docker push ${ECR_REPO}:${env.IMAGE_TAG}
                        docker push ${ECR_REPO}:${params.DEPLOY_ENV}-latest
                    """
                }
            }
        }
        stage('Deploy to ECS') {
            steps {
                script {
                    def clusterName = "${params.DEPLOY_ENV}-cluster"
                    def serviceName = "${APP_NAME}-service"
                    echo "Deploying ${env.IMAGE_TAG} to ${params.DEPLOY_ENV}"
                    sh """
                        aws ecs update-service \\
                            --cluster ${clusterName} \\
                            --service ${serviceName} \\
                            --force-new-deployment \\
                            --region ${AWS_REGION}
                    """
                    echo "Deployment initiated. Service will update automatically."
                }
            }
        }
        stage('Verify Deployment') {
            steps {
                script {
                    def clusterName = "${params.DEPLOY_ENV}-cluster"
                    def serviceName = "${APP_NAME}-service"
                    timeout(time: 10, unit: 'MINUTES') {
                        waitUntil {
                            def result = sh(
                                script: """
                                    aws ecs describe-services \\
                                        --cluster ${clusterName} \\
                                        --services ${serviceName} \\
                                        --region ${AWS_REGION} \\
                                        --query 'services[0].deployments[0].rolloutState' \\
                                        --output text
                                """,
                                returnStdout: true
                            ).trim()
                            echo "Deployment status: ${result}"
                            return result == 'COMPLETED'
                        }
                    }
                    echo "Deployment completed successfully!"
                }
            }
        }
    }

    post {
        success {
            echo """
                Deployment successful!
                Environment: ${params.DEPLOY_ENV}
                Image: ${ECR_REPO}:${env.IMAGE_TAG}
                Commit: ${env.GIT_COMMIT_SHORT}
            """
        }
        failure {
            echo """
                Deployment failed!
                Check the logs above for details.
            """
        }
        always {
            // Cleanup: remove the local image; '|| true' keeps a missing image from failing the build
            sh "docker rmi ${APP_NAME}:${env.IMAGE_TAG} || true"
        }
    }
}
What makes this production-ready:
Parameterization: You can trigger the same pipeline against different environments
Conditional execution: Tests can be skipped if needed (useful for hotfixes)
Git integration: Tags images with commit hash for traceability
Error handling: post blocks handle success/failure scenarios
Verification: Waits for deployment to complete before finishing
Cleanup: Removes local images to save disk space
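Because the pipeline is parameterized, it can also be triggered from another pipeline with the build step. A brief sketch, where the job name my-web-app-deploy is a placeholder for whatever your job is actually called:

```groovy
// Trigger the parameterized deployment job from another pipeline.
// Choice parameters are passed as strings when triggering downstream jobs.
build job: 'my-web-app-deploy',
      parameters: [
          string(name: 'DEPLOY_ENV', value: 'staging'),
          booleanParam(name: 'RUN_TESTS', value: true)
      ]
```

This is how you'd wire the deployment into a larger orchestration pipeline, such as a nightly job that deploys to staging automatically.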
Key Takeaways
Today we covered the foundational building blocks:
Groovy gives you power: String interpolation, collections, and closures make pipeline code elegant
Declarative pipelines provide structure: Easy to read and understand
Script blocks add flexibility: Use Groovy's full capabilities when needed
Think in data structures: Use maps and lists to represent your infrastructure
Parameters make pipelines reusable: One pipeline for multiple environments
What's Next?
In Part 3 of this series, we'll explore GitHub Actions, one of the most widely used CI/CD platforms for both personal projects and organizations.