In my previous post I talked about creating a private cookbook repository with Jenkins and Berkshelf. In this post I'll briefly discuss how to use very similar methods to manage open source Chef servers, hosted Chef accounts or private Chef organisations.
At Dyn we use private Chef and store each of our private Chef organisations in version control using GitHub. Each repository is very lightweight and contains only the bare minimum of information required for Jenkins to assemble the required cookbooks, data bags, roles and environments and upload them to the destination organisation after running any necessary tests. The repo structure looks a lot like a basic Berkshelf layout; I've included it below with some of the cruft removed. We use the Berksfile to specify which cookbooks, at which versions, should be uploaded into the target organisation.
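A hypothetical sketch of that sort of layout (the exact file names beyond those mentioned in this post are illustrative; the chef data itself — data_bags, roles and environments — arrives via the pinned chef-data checkout described below):

```text
.
├── Berksfile            # cookbooks and versions for this organisation
├── chef-data.version    # git tag of the chef data to deploy
├── .berkshelf/
│   └── target.json      # berks upload destination
└── .chef/
    └── target.rb        # knife config for the target organisation
```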
This should look fairly familiar to existing Berkshelf users, with a couple of exceptions, so I'll jump straight into explaining what happens to translate this into an up-to-date Chef organisation on a private Chef server, with a couple of initial notes…
Stage One: Check out the chef-data from git into the working directory, using the value stored in chef-data.version as the tag to check out. This ensures that we always get the correct Chef data in each organisation and can run different versions in different places, e.g. test, staging and prod.
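As an illustrative sketch of this stage (the tag value here is made up, and normally chef-data.version would already be committed in the repo rather than written by the job):

```shell
# Read the pinned tag out of chef-data.version before checking anything out.
echo "2.1.0" > chef-data.version        # stand-in for the committed file
TAG=$(cat chef-data.version)
echo "checking out chef data at tag $TAG"
# The job would then run: git checkout "$TAG" inside the chef-data clone.
```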
Stage Two: Run berks install in the working directory to ensure all required cookbooks and dependencies are up to date within the 'berkshelf'. We use the cookbook repo I described in my previous post to source the cookbooks.
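As a hedged illustration of the Berksfile mentioned above (the source URL, cookbook names and version pins are all invented), it might look something like:

```ruby
# Illustrative Berksfile; this organisation pins exact cookbook versions
# sourced from the private cookbook repository.
source "https://berks-api.internal.example.com"

cookbook "base_os", "= 1.4.2"
cookbook "dyn_web", "= 3.0.1"
```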
Stage Three: As the first step of synchronising the Chef data on the server, we delete any roles, environments, data bags and data bag items that have been removed in this version of the chef data and are no longer required.
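A hedged sketch of this pruning logic for roles (not Dyn's exact code — the server-side list is stubbed here so the comparison itself is visible; the real job would get it from knife role list and call knife role delete):

```shell
# Stand-in for the repo's roles/ directory at the pinned tag.
mkdir -p roles && touch roles/web.json roles/db.json
# Stand-in for: knife role list -c .chef/target.rb
server_roles="web db old_app"
stale=""
for role in $server_roles; do
  if [ ! -f "roles/${role}.json" ]; then
    # Real job would run: knife role delete "$role" -y -c .chef/target.rb
    echo "stale role to delete: $role"
    stale="$stale $role"
  fi
done
```

The same shape repeats for environments, data bags and data bag items.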
Stage Four: Upload new and modified data bags, roles and environments to the Chef server. We store the destination config for knife in .chef/target.rb, so we suffix our knife commands with -c .chef/target.rb to ensure we act on the correct Chef server.
Stage Five: Run berks upload using the target information stored within the git repo. We store a target.json in the .berkshelf folder of the checkout. This specifies the destination we are uploading to, so the command run is "berks upload -c .berkshelf/target.json".
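For illustration, a Berkshelf config of this kind names the Chef server and credentials to upload with; all values below are placeholders:

```json
{
  "chef": {
    "chef_server_url": "https://chef.example.com/organizations/example_org",
    "node_name": "jenkins",
    "client_key": "/var/lib/jenkins/.chef/jenkins.pem"
  },
  "ssl": {
    "verify": true
  }
}
```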
Stage Six: Finally, once everything is completed and the Chef server is up to date, we run "thor version:bump auto --default patch" to tag the version in git. This way, if we want to revert for any reason, we know we can go back to the previous tag in the repository and get the state exactly as it was.
Finally, here it is all together as a script for Jenkins to run:
# Stage One: check out the chef data at the tag pinned in chef-data.version
if [ -f chef-data.version ]; then
  TAG=$(cat chef-data.version)
  git clone firstname.lastname@example.org:Organisation/dyn_chef_data.git
  cd dyn_chef_data
  git checkout $TAG
fi

# Stage Two: install the required cookbooks into the berkshelf
bundle exec berks install

# Stage Four: upload data bags, roles and environments
for i in `ls data_bags`; do
  bundle exec knife data bag create $i -c $WORKSPACE/.chef/target.rb
  bundle exec knife data bag from file $i data_bags/$i -c $WORKSPACE/.chef/target.rb
done

for i in `ls roles`; do
  bundle exec knife role from file roles/$i -c $WORKSPACE/.chef/target.rb
done

for i in `ls environments`; do
  bundle exec knife environment from file environments/$i -c $WORKSPACE/.chef/target.rb
done

# Stage Five: upload the cookbooks to the target organisation
bundle exec berks upload -c .berkshelf/target.json

# Stage Six: tag the released state in git
bundle exec thor version:bump auto --default patch