
Friday, August 25, 2017

Jenkins - Integrating with Amazon S3 Bucket

In a DevOps process, it is often necessary to store configuration as well as artifacts in a repository or in the cloud. Jenkins integrates with Amazon S3, so we can upload job configurations and artifacts to an S3 bucket. S3 is reliable and cheap storage, and its buckets and objects are easy to configure, track and manage. In this setup, a default backup job uploads the Jenkins configuration and job details every day using the S3 publisher plugin.
Install the S3 publisher plugin from “Manage Plugins”, then configure it under “Manage Jenkins” -> “Configure System”: add an S3 profile with the Access key and Secret key of an IAM user that has write access to your bucket.
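Before wiring up the job, it is worth checking that those credentials can actually write to the bucket. A minimal sanity check from any machine with the AWS CLI installed might look like this (my-jenkins-backups is a placeholder for your own bucket name):
# Assumes the AWS CLI is configured with the same access/secret key pair
echo "test" > /tmp/jenkins-backup-test.txt
aws s3 cp /tmp/jenkins-backup-test.txt s3://my-jenkins-backups/
aws s3 rm s3://my-jenkins-backups/jenkins-backup-test.txt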
Once that is done, create a freestyle job and enter the commands below in an “Execute shell” step under the “Build” section:
# Delete all files in the workspace
rm -rf *
# Create a directory for the job definitions
mkdir -p $BUILD_ID/jobs
# Copy global configuration files into the workspace
cp $JENKINS_HOME/*.xml $BUILD_ID/
# Copy keys and secrets into the workspace

if [ -e "$JENKINS_HOME/identity.key" ]; then
  cp $JENKINS_HOME/identity.key $BUILD_ID/
fi

if [ -e "$JENKINS_HOME/secret.key" ]; then
  cp $JENKINS_HOME/secret.key $BUILD_ID/
fi

if [ -e "$JENKINS_HOME/secret.key.not-so-secret" ]; then
  cp $JENKINS_HOME/secret.key.not-so-secret $BUILD_ID/
fi

if [ -e "$JENKINS_HOME/secrets" ]; then
  cp -r $JENKINS_HOME/secrets $BUILD_ID/
fi

# Copy user configuration files into the workspace
cp -r $JENKINS_HOME/users $BUILD_ID/
# Copy job definitions into the workspace
rsync -am --include='config.xml' --include='*/' --prune-empty-dirs --exclude='*' $JENKINS_HOME/jobs/ $BUILD_ID/jobs/
# Create an archive from all copied files (since the S3 plugin cannot copy folders recursively)
tar czf $BUILD_ID.tar.gz $BUILD_ID/
# Remove the directory so only the archive gets copied to S3
rm -rf $BUILD_ID
Now, under “Post-build Actions”, choose “Publish artifacts to S3 Bucket”. This exposes the upload options; typically you would set the source files pattern to *.tar.gz and select the destination bucket (and its region) configured earlier.

The configuration files as well as the job definitions are packed into a tar.gz archive and uploaded to the destination S3 bucket you defined.
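After a successful build you can confirm that the archive landed in the bucket, for example with the AWS CLI (again, my-jenkins-backups is a placeholder):
# List the backup archives in the destination bucket
aws s3 ls s3://my-jenkins-backups/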
Schedule the job to run every night at midnight (for example, “Build periodically” with the cron expression 0 0 * * *) so that the configuration files and job details are backed up daily.
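To restore from one of these backups, the rough reverse of the script above is to pull the archive down and copy its contents back into $JENKINS_HOME. A minimal sketch, assuming the AWS CLI is available on the Jenkins host, Jenkins runs as a systemd service, and 42 is the build ID of the backup you want (all three are assumptions to adapt to your setup):
# Download and unpack the chosen backup archive (placeholder bucket and build ID)
aws s3 cp s3://my-jenkins-backups/42.tar.gz /tmp/
tar xzf /tmp/42.tar.gz -C /tmp
# Stop Jenkins, copy the configuration back into the Jenkins home, then restart
sudo systemctl stop jenkins
sudo cp -r /tmp/42/* "$JENKINS_HOME"/
sudo systemctl start jenkins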
