We've talked about a code driven development process ever since Features came out and we talked about it even more when "drush make" hit the streets. Unfortunately, it's been all talk ever since. We've made a few attempts at a code driven development process, but nothing solid and nothing has ever been seen through to the launch of a site.
The "why"s of never following through are many and not worth getting into. I'm sure if you're reading this you're already familiar with some of them. What is worth getting into is what we're doing now about code driven development.
We've got sites all over the place: sites in different states of development, and clients with different expectations of when they can do what to a site still in development. Some want to give us content to input during development and see an early version of their site, while others want to do the content entry themselves, generally while development is still in full swing. Our past attempts at code driven development have always ended the second a client starts to input content, because at that point it just becomes easier to fall back into old habits of pushing and pulling around a database dump.
Our Solutions so far
Casper is our newly minted base make file and install profile. We took a look at all our past Drupal 7 sites and those currently in development and picked out the most common modules among them: modules we know without a doubt we'll use on every site from this point on, modules we consistently use for development, client administration, configuration management, etc., and modules we've used in the past, are comfortable with, and know aren't going anywhere.
The install profile includes configuration changes that we make (or should make) almost every single time we start a new project for a client. Things like setting up client administration roles and permissions, customizing our text formats, configuring WYSIWYG editors and switching the admin theme from Seven to Rubik. We also do a few things to make development easier, like enabling devel and devel_catcher.
This means we can run a drush site-install command and 30 seconds later we've got a consistent Drupal 7 install for a client site and we're hours ahead of the game. Everything from this point on is unique to the client's site we're building.
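For anyone following along, the rebuild boils down to two drush commands; this is a sketch, and the file names, paths and database credentials are all placeholders, not our real setup:

```shell
# Build the codebase from the make file into a fresh docroot
drush make casper.make /var/www/client-site

# Install Drupal using the Casper profile (placeholder db-url and site name)
cd /var/www/client-site
drush site-install casper \
  --db-url=mysql://user:pass@localhost/client_db \
  --site-name="Client Site" -y
```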
We run Casper from another site specific make file:

api = 2
core = 7.x
projects[] = "drupal"
projects[casper][type] = profile
projects[casper][download][type] = "git"
projects[casper][download][url] = "https://bitbucket.org/bitbucket_path/casper.git"
This lets us add site specific contrib modules and the client's theme to our build process, while still letting us constantly update Casper with the latest and greatest and benefit from that during every rebuild. If we're introducing a change to Casper that could cause issues on existing or in-development sites, we branch Casper, adjust the site specific make file and carry on.
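As a sketch of what those site specific additions look like (the branch name, module and theme here are made-up examples, not part of Casper), the stub make file grows lines like these:

```ini
; Pin a Casper branch when a change there could break an in-flight site
projects[casper][download][branch] = "7.x-stable"

; Site specific contrib (example module)
projects[views_slideshow][version] = "3.1"

; The client's theme from its own repo (placeholder URL)
projects[client_theme][type] = theme
projects[client_theme][download][type] = "git"
projects[client_theme][download][url] = "https://bitbucket.org/bitbucket_path/client_theme.git"
```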
We've also created a few skeleton Features loosely modeled after an MVC architecture. These are generally the only Features we'll create for a typical client's site. We decided on the MVC split versus a section or feature (lowercase f) architecture to make exporting these Features easier and more maintainable. You can read about MVC as it applies to more involved software development, but essentially we put content types in the model Feature, module and miscellaneous configuration in the controller Feature and views exports in the view Feature. There's a bit more to it and some things we're assessing as they come up, but that's the gist.
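To make the split concrete, here's a rough sketch of how the three skeleton Features' .info files might divide the exports; the feature names and exported components are illustrative, not our actual files:

```ini
; client_model.info - content types and fields
name = "Client Model"
core = 7.x
package = "Features"
features[node][] = article
features[field][] = node-article-body

; client_controller.info - module and miscellaneous configuration
features[variable][] = site_frontpage

; client_view.info - views exports
features[views_view][] = frontpage_listing
```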
We've chosen to go with Bitbucket as our repo host. For a small team like ours with a lot of client projects, Bitbucket was a good match in both price and ease of use now that they have a teams feature.
We have one repo dedicated to Casper which includes the make file, an install profile and an info file.
Our client project repos include a stub make file, the three MVC skeleton Features and the site's custom theme by default, and that's it.
Content aside, that's all that should ever be in a client's repo (maybe a few custom modules, but we should try our best to use everything Drupal has to offer). Basically, blueprints for building the entire site from scratch, plus the theme to make her look purdy. Our stub make file references our Casper base install, so not only have we eliminated duplication of Drupal core and contrib modules, we've even removed duplicate references to them.
All modules from Casper show up in the /profiles directory in a Casper subdirectory and all contrib modules and custom features defined in the site specific make file show up in the /sites/all folder. Seems pretty nice to me.
Hiccups and Glue
There are still a few things that don't run as smoothly as we'd like. A sub-install profile (using Profiler) was something we looked into for daisy chaining a site specific profile from the Casper install profile. This would have let us run a site specific install profile after Casper's, so any site specific contrib modules would automagically be enabled and any site specific general configuration would happen post-install. We couldn't get this working elegantly (we haven't given up completely), so for now we mark these additional contrib modules as dependencies of our core model Feature and put any configuration into the Features themselves through exportables or, in some cases, hook_install() implementations. Not such a bad compromise: you turn on the Features and all the contrib modules the site needs get enabled.
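The dependency trick is just standard .info syntax; sketching it with made-up module names, the model Feature's .info file picks up lines like:

```ini
; client_model.info - site specific contrib declared as dependencies,
; so enabling this Feature enables the modules too (example names only)
dependencies[] = pathauto
dependencies[] = views_slideshow
```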
We also wrestled with the "rebuild" idea. Does every change by a development team working on one project require a full site rebuild? If it's just some Feature code that's being exported, it should just be a git pull and a Feature revert. But what happens when a new contrib module has been added to the make file? Does that need to be communicated to the team?
The glue we're working on to make this task easier for now is a custom drush command that does a few different things at once.
- a Features diff to make sure you get everything you need saved into code (this is rudimentary and won't detect new config that needs to be added)
- a drush make file diff to make sure you put any new modules you have into the site specific make file
- a git pull to get the newest code from the repo
- a parse of the new drush make file to download any modules added (or delete any removed) by other developers
- a Features revert to use the new Feature code from other developers
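Stripped down to plain drush and git, the steps above look roughly like the script below. This is a simplified sketch, not our actual command: it assumes drush, git and the Features module are available, `client.make` is a placeholder file name, and the two diff steps are reduced to a simple overridden-state check.

```shell
#!/bin/sh
# In-place rebuild sketch; all names are placeholders.
set -e

# 1. Warn if any Feature still needs exporting before we pull
if drush features-list 2>/dev/null | grep -q Overridden; then
  echo "Overridden Features found: export them to code first." >&2
  exit 1
fi

# 2. Get the newest code from the repo
git pull

# 3. Re-run the site make file so new contrib from other developers lands
drush make --no-core client.make .

# 4. Load the new Feature code from other developers and clear caches
drush features-revert-all -y
drush cc all
```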
Basically an in-place rebuild. It's not bullet-proof and developers will still need to be aware of what they've done and what they need to do that this automated process won't do for them, but it definitely saves a lot of time and confusion.
We're not quite ready to send our drush build commands out into the wild. We're also in heavy development on some extra drush commands to facilitate the setup and "pushing" of client sites through the entire dev process. A command suite that ranges from setting up new development users within our dev environment (complete with SSH keys added to Bitbucket) to automagically pushing an entire codebase and database to a production server from our dev environment.
We're hard at work on the various components of this workflow and we'll have some follow up posts that get a little more specific and hopefully some code that you can grab and play with.
Anyone else out there doing something similar or something completely different that's working for you? Let us know in the comments.
Title by Evan Barter