Running Your First Deployer Deploy
Now that the Deployer configuration file is finished, let's run our first deploy.
To do that, here in the console, I'll call
./vendor/bin/dep deploy production, which begins the deployment to production.
You can see that a notice is displayed as each task begins, and that when it's completed, a tick appears next to it. And that's how easy it is to do a deploy.
Now let's have a look at the directory structure. I've been deploying to a local virtual machine, just to keep everything simple. I'll
ssh to it so we can have a look at the created directory structure.
Changing to the directory, I'll run tree on it to give a quick listing. Limiting the depth to two, you can see that it's created the three base directories, as well as the first release inside the releases directory.
We can also see that
current is a symlink to that release.
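To make that layout concrete, here's a sketch of the same structure built by hand with plain coreutils rather than by Deployer. The demo directory name is made up for illustration, but the releases/shared/current names mirror what a real deploy creates.

```shell
# Recreate Deployer's layout by hand (a sketch -- Deployer does this
# for you on a real deploy). "demo" is just a scratch directory.
mkdir -p demo/releases/1 demo/shared

# "current" is a relative symlink into releases/:
ln -s releases/1 demo/current

readlink demo/current   # prints: releases/1
```

Because the web server only ever serves from current, switching releases is just a matter of repointing that one symlink.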
Now let’s finish up by looking at a few of the other commands, specifically, list, current, cleanup, and rollback.
Running Deployer by calling
vendor/bin/dep list, you can see that it lists all of the available options and commands, including deploy, which we've already seen, as well as current, cleanup, and rollback. Let's now have a look at these three.
Perhaps you're curious as to which release is the current one.
You could do what I've done, but that's a bit of unnecessary effort.
A simpler way is to run the current command. Similar to the deploy command, we run it by calling
dep current followed by the server we want to check.
This prints out the release directory that the current symlink points to.
Now let's look at rollback. Let's say that something went wrong with the last deploy. Perhaps the code had a bug, which broke the site. So you need to roll it back in a hurry.
To do that, as with deploy and current, we specify
dep rollback production.
This then changes the symlink to the previous release, assuming there's more than one release available. We can confirm that it's done that by running current again.
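Under the hood, a rollback is essentially a symlink swap. Here's a plain-shell sketch of the idea, not Deployer's actual implementation; the app directory and release numbers are made up for the example.

```shell
# Two releases, with release 2 currently live.
mkdir -p app/releases/1 app/releases/2
ln -s releases/2 app/current

# Release 2 turns out to be broken -- repoint "current" at the
# previous release. -f replaces the existing link, and -n stops ln
# from descending into the directory the old link points at.
ln -sfn releases/1 app/current

readlink app/current   # prints: releases/1
```

Because only the symlink changes, the broken release's files stay on disk, which is what makes rolling forward again just as cheap.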
Now let's look at cleanup. You may remember we set
keep_releases to five.
Let's fast-forward to a point where we have five or more releases and we need to free up space. Let's assume that I've changed
keep_releases to 3, for the sake of a simple example.
Let's remove everything but the three most recent releases by running cleanup, which I'll do now. Running tree again in the virtual machine, you can see that it has removed the older releases, cleaning up the directory in the process. Very handy!
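The cleanup logic itself boils down to "keep the newest N release directories, delete the rest". Here's a rough plain-shell approximation, assuming releases are numerically named directories; Deployer's real implementation is more careful, and the site directory here is made up for the demo.

```shell
# Five numbered releases in a scratch "site" directory.
for r in 1 2 3 4 5; do mkdir -p "site/releases/$r"; done

keep=3

# List releases newest first (highest number), skip the first $keep,
# and delete whatever is left over.
ls site/releases | sort -rn | tail -n +$((keep + 1)) | while read -r r; do
  rm -rf "site/releases/$r"
done

ls site/releases   # only 3, 4 and 5 remain
```
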
Now that we've stepped through all of the practical steps of creating and making a deployment, let's finish up by looking at the available help options.
Deployer offers two main avenues of help: a Gitter chat room and the GitHub issue tracker. Of the two, likely the best place to start is Gitter. If you're not familiar with Gitter, it's a web-based chat room and IM service for people using GitHub repositories. Whilst not as full-featured as some other solutions, such as Slack, it's still a really handy and effective tool.
Anyway, to access it, from the Deployer documentation, click the minimised menu button here in the top left, and then click Discuss, second from the bottom. This will redirect to the Deployer chat room in Gitter.
There, like any other chat room, or IRC channel, you can ask all the questions you need to. Just remember to be respectful and follow the usual rules and etiquette. Alternatively, if you have an issue or find a bug, you can log it, or search for it in the issues tracker in the GitHub repository.
Clicking "Fork me on GitHub", then Issues on the right-hand side, you can see the current issues. If you think you've found a bug, search here first; then, if it hasn't already been reported, add it to the list.
And that brings us to the end of the series. I hope you've seen just how light, yet flexible Deployer is, and seen just how quickly you can get started with it, regardless of project size. I strongly encourage you to get started, today even, and try it out in your projects.
If you have any issues, get into the group chat, or check the GitHub issues to see if someone can help you, or if anyone's hit the same issue you're facing. It's an excellent tool, and I'm very keen to see where it goes over time.