GitHub – Travis CI – Heroku CI/CD


  1. Create a GitHub repository;
  2. Set up a Rails application;
  3. Create an account on Travis CI and link it with your repository;
  4. Create an account on Heroku and link it with the repository;
  5. Start the Heroku console from the terminal;
  6. Set up a Travis yml file;
  7. Push it.


If you just want to know how to connect Travis CI and Heroku, jump to step 3, “Create an account on Travis CI and link it with your repository”.

1. Create a GitHub repository

Create a GitHub repository with your account and clone it using HTTPS or SSH. To keep this as simple as possible, I’ll pick HTTPS:

$ git clone

After cloning, enter the directory using:

$ cd rails-test-app

2. Setup a Rails application

To initiate the Rails application you need Ruby and Rails already installed on your computer; for this, I highly recommend using RVM.

You’ll also need a database to run your app, so install PostgreSQL. Follow the instructions from the official website, picking the option for your OS.

Now you can run the following commands in your terminal:

First install bundler:

$ gem install bundler

Install Rails:

$ gem install rails

Finally, create a Rails app with PostgreSQL using the name of the current folder:

$ rails new . --database=postgresql



Ok, it’s done. After all that output scrolling down your screen, your terminal is available again!

Just to check that everything is working well, type the following command to start the Rails server and check localhost:3000 in your browser.

$ rails server

If it’s the same screen as below, great! Keep going; if it isn’t, recheck the steps you have done so far.

Send the created app to GitHub, commit and push to master:

$ git add .
$ git commit -m "Add initial structure"
$ git push origin master

3. Create an account on Travis CI and link it with your repository

To do that, go to the Travis CI website; to make it easier, create your account using your GitHub login.

You will see the authorization screen asking you to accept and link Travis CI with your GitHub account.

Inside your account, on the Accounts page, click the switch beside the repository name, as in the gif below.

Check the repo again by going to Settings and then Integrations & services. Oh, can you see Travis CI over there?

4. Create an account on Heroku and link with the repository

On the Heroku site, create a new account; you’ll need to confirm your email and all that stuff. When you have finished, click “Create New App”.

Inside your application, go to “Deploy”, search for “Deployment method” and then select the GitHub option.

A wild window appears: “Authorize Heroku”. Accept it, and to complete the connection just search for the name of the repository and connect!

On the Deploy tab, in the “Automatic deploys” section, don’t forget to check the “Wait for CI to pass before deploy” option and enable Automatic Deploys:

Back on the Settings page of your GitHub repository, under Webhooks, guess who is there?

5. Start the Heroku console from the terminal

Now that you have Heroku linked with the repo, install the Heroku CLI and log in with your new account.

To install the Heroku CLI just follow the official documentation; if you are using macOS like me, just use:

$ brew install heroku/brew/heroku

Now, with Heroku installed, log in to your account:

$ heroku login 

Enter your email and password, and you should see this message:

“Logged in as”

Create a remote reference to your repo:

$ heroku git:remote -a rails-test-app-article

Done! Now you can push directly to Heroku and deploy your app from the terminal. However, in the last steps we will learn how to put all these things together!

6. Set up a Travis yml file

Let’s start by creating the file:

$ touch .travis.yml

Now open the created file and paste this code:

language: ruby
cache: bundler
before_script:
  - bundle exec rake db:create
  - bundle exec rake db:migrate
  - bundle exec rake assets:precompile
deploy:
  provider: heroku
  api_key:
    secure: KEY
  app: rails-test-app-article
  repo: felipeluizsoares/rails-test-app

In this yml file I’m defining:

  • The language, so Travis CI knows what to do to run my code;
  • What I want to cache; in this case it is bundler (in a Node.js example it would be node_modules);
  • The scripts to run before the main script; I’m creating the DB, running the migrations and precompiling the assets;
  • A deploy task to run on Heroku; for that we need the API key (we don’t have it yet), the name of the app on Heroku (rails-test-app-article) and the name of the repo on GitHub (felipeluizsoares/rails-test-app).

To get the API key from Heroku you just run a command in the terminal; however, this key needs to stay secret, so we should encrypt it before putting it in the file.

Install the Travis CI gem to be able to use Travis’s encrypt command:

$ gem install travis

Now, run this command, which invokes encrypt and passes it the Heroku API key:

$ travis encrypt $(heroku auth:token)

You will probably see this message: “Detected repository as yourname/reponame, is this correct? |yes|”

Answer with yes and 🎉 🎉 you have your API KEY!

So replace KEY on the secure: line of the yml file with your key.

7. Push it

In this last step, push everything you did to GitHub, commit the yml file and push it to a new branch at the repo.

$ git checkout -b add-travis-yml-file
$ git add .
$ git commit -m "Add travis yml file"
$ git push -u origin add-travis-yml-file

Inside your repository on GitHub, open a new PR from your new branch targeting master, and the CI will run there.

When you merge the PR, go to your Heroku Dashboard and check the latest activity.

Now, every time you open a PR, Travis will run the tests, and when the PR is merged, Heroku will deploy it automatically!


Use this power!

Now you can apply this knowledge to your stack, and every time someone opens a PR on the project you are working on, check whether the tests are passing before merging. You can block merges when the CI check hasn’t passed, and with these additional steps you are protecting your codebase.

Make it easier!

Forget about deploying to a development environment every time just to check if something is working or to show a feature to another developer; let Heroku do that for you! When your stack gets more consistent, you can apply the same automatic deploys to staging and production; check out the Heroku Pipelines documentation for more!

I hope you have learned something from this; let me know if you have any questions in the comments 🙂

Free SSL certs

No more excuses for HTTP traffic websites.

Big corporations (including PayPal, Google and, most recently, WordPress) have announced that they will require hosts to have SSL (or HTTPS) available for certain services, APIs, webhooks and OAuth.
First of all, I assume your site is loading perfectly over plain HTTP and that you are on a private network (meaning you are the only owner of the IP you are using).

Install the certbot client

Go to the certbot website and simply select your operating system and web-server client. Follow the steps to install certbot-auto. In my case, I used the following couple of lines.

chmod a+x certbot-auto

Next you need a very simple config.ini file. I put mine under /etc/letsencrypt/config.ini; it includes the following. Don’t forget to set the email address to your own.

rsa-key-size = 4096
email =

Our certificate client is now ready; this will allow us to install and update the certificate.

Create the SSL certificate

Go to the directory where you installed your certbot-auto client and simply run the following commands. Don’t forget to change the domain names to your own (and, of course, the directory of the files).

certbot-auto certonly --webroot -w /var/www/html/domain1 -d -d -w /var/www/html/domain1/sub -d --config /etc/letsencrypt/config.ini --agree-tos --keep
certbot-auto certonly --webroot -w /var/www/html/domain2 -d -d -w /var/www/html/domain2/sub -d --config /etc/letsencrypt/config.ini --agree-tos --keep

You can run the code above for your other domains/subdomains similarly.

If everything goes smoothly (hopefully it will), it will generate the certificate files under /etc/letsencrypt/live/ for each domain; we will use them in the next step.

Attach them to your domain

Now edit your SSL configuration file at /etc/httpd/conf.d/ssl.conf and add the code below for each domain/subdomain.

<VirtualHost *:443>
    DocumentRoot "/var/www/html/domain1"
    SSLEngine on
    SSLCertificateFile /etc/letsencrypt/live/
    SSLCertificateKeyFile /etc/letsencrypt/live/
    SSLCertificateChainFile /etc/letsencrypt/live/
    SSLProtocol All -SSLv2 -SSLv3
    SSLHonorCipherOrder on
</VirtualHost>

Let’s automate renewal before the 90 days are up

Certificates generated by Let’s Encrypt are valid for only 90 days. You need to renew each certificate before it expires so there is no downtime for your HTTPS traffic. I use crontab for this, with the line below.

0 0 1 * * /var/www/
And my renewal script looks like this…
# Renew Let's Encrypt SSL cert
/opt/letsencrypt/letsencrypt-auto renew --config /etc/letsencrypt/config.ini --agree-tos

if [ $? -ne 0 ]
then
        ERRORLOG=`tail /var/log/letsencrypt/letsencrypt.log`
        echo -e "The Lets Encrypt Cert has not been renewed! \n \n" $ERRORLOG | mail -s "Lets Encrypt Cert Alert" "FIX IT! :)"
fi

service httpd reload
exit 0

Please note that we piped the result to the well-known mail command, so you get a notification if the renewal fails. Feel free to change the script the way you want it. And do not forget to comment below if you find this post useful 🙂

Migrating git repos from different providers

I recently had to migrate over 70 repositories from GitLab to Bitbucket. There are import tools provided by the providers, but they require access to each other in order to work. Because the providers we were using could not see each other over the web without being connected to the same VPN, this was not an option.

A nice little workaround to get all information from a git repository can be found on my public github here:
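The core of such a workaround is git’s mirror mode: clone every ref from the old provider, then push them all to the new one. A minimal sketch, with placeholder URLs rather than the actual repositories:

```shell
# Mirror-clone grabs every branch, tag and ref from the source repo.
git clone --mirror https://gitlab.example.com/team/my-repo.git
cd my-repo.git

# Point the mirror at the new provider and push everything across.
git remote set-url origin https://bitbucket.org/team/my-repo.git
git push --mirror origin
```

Wrapping those three commands in a loop over a list of repository names is enough to move all 70 repos in one go.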

Using scala traits in a java code base

I am currently working on a code base with a mix of both Java and Scala. All of the APIs have used Dropwizard with Scala, but Dropwizard’s Scala support stopped at version 0.7.1. This is inhibiting us from upgrading to the latest version of Dropwizard and also from adopting Java 1.8. We have started rewriting all of the Scala Dropwizard APIs back to Java.

One of the problems we have encountered is that these Dropwizard projects have a dependency on other projects written in Scala. To minimize the amount of code to rewrite and get code released as early as possible, we only want to refactor what we need to get onto the latest version of Dropwizard and compile using Java 8.

Scala uses traits quite heavily, and this has caused us some problems. Traits are very similar to interfaces in Java but can also provide implementations of their methods. If you try to implement a Scala trait in Java code you will get a compile error, because the trait has implemented behaviour and is not a regular Java interface.

Step-by-step guide

A quick and easy way around this is to create a wrapper abstract class in the Scala library which extends the trait. You can then extend this abstract class in your Java code and get the implemented method behaviour through the wrapper class. The example below allows you to call concatStrings from the Scala trait in your Java code, e.g.

// scala library

package mathew

// the original trait

trait MathewTrait {

    def concatStrings(stringOne: String, stringTwo: String) = stringOne + stringTwo
}

// the wrapper class

abstract class MathewTraitWrapper extends MathewTrait

// java code

package mathew;

public class JavaMathew extends MathewTraitWrapper {

    public String javaConcatStrings() {
        return concatStrings("Test message 1", "Test message 2");
    }
}

Service worker and offline content


I have recently been working on a project to share driving licence information with users using an offline method. I created an Apple Wallet offline pass first. I have now started looking at alternatives to Apple Wallet that would work across other platforms.

Introducing service workers

A service worker is a script that stands between your website and the network, giving you, among other things, the ability to intercept network requests and respond to them in different ways. The idea is that we create a simple HTML representation of the Apple Wallet driving licence share pass and make it available offline using service worker technology. We will also use other progressive web application methods, like the manifest.json, to create an app-like user experience, so a user has a shortcut icon on their phone to the driving licence share pass.

Registering a service worker

You make a service worker take effect by registering it. This registration is done from outside the service worker, by another page or script on your website. On my website, a global site.js script is included on every HTML page. I register my service worker from there.

When you register a service worker, you (optionally) also tell it what scope it should apply itself to. You can instruct a service worker only to handle stuff for part of your website (for example, ‘/blog/’) or you can register it for your whole website (‘/’) like I do.
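As a sketch, registration from site.js might look like this (the /sw.js path and the helper’s name are assumptions for illustration, not part of my actual site):

```javascript
// Register a service worker for the whole site ('/' scope).
// The '/sw.js' path and this helper's name are assumed for illustration.
function registerServiceWorker(swUrl) {
  if (typeof navigator === 'undefined' || !('serviceWorker' in navigator)) {
    // No support (old browser or non-browser context): the site still
    // works, just without offline caching.
    return null;
  }
  return navigator.serviceWorker.register(swUrl, { scope: '/' }).then(function (reg) {
    console.log('Service worker registered with scope:', reg.scope);
    return reg;
  });
}

// In site.js, which is included on every HTML page:
registerServiceWorker('/sw.js');
```

The feature check means older browsers simply skip registration and fall back to normal network behaviour.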

Service worker life-cycle

A service worker does the bulk of its work by listening for relevant events and responding to them in useful ways. Different events are triggered at different points in a service worker’s life-cycle.

Once the service worker has been registered and downloaded, it gets installed in the background. Your service worker can listen for the install event and perform tasks appropriate for this stage.

In our case, we want to take advantage of the install state to pre-cache a bunch of assets that we know we will want available offline later.

After the install stage is finished, the service worker is then activated. That means the service worker is now in control of things within its scope and can do its thing. The activate event isn’t too exciting for a new service worker, but we’ll see how it’s useful when updating a service worker with a new version.

Exactly when activation occurs depends on whether this is a brand-new service worker or an updated version of a pre-existing service worker. If the browser does not have a previous version of a given service worker already registered, activation will happen immediately after installation is complete.

Once installation and activation are complete, they won’t occur again until an updated version of the service worker is downloaded and registered.
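When a new version does activate, a common pattern is to delete the caches left behind by the previous version. A sketch, where the cache name is an assumption:

```javascript
// On activate, delete caches left behind by older worker versions.
// 'licence-pass-v1' is an assumed cache name for the current version.
const CURRENT_CACHE = 'licence-pass-v1';

// Remove every named cache except the one the current worker uses.
function deleteOldCaches(cacheStorage, currentName) {
  return cacheStorage.keys().then(function (names) {
    return Promise.all(
      names
        .filter(function (name) { return name !== currentName; })
        .map(function (name) { return cacheStorage.delete(name); })
    );
  });
}

if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('activate', function (event) {
    event.waitUntil(deleteOldCaches(caches, CURRENT_CACHE));
  });
}
```

Bumping the cache name on each release is what makes the old caches identifiable and safe to drop.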

Beyond installation and activation, we’ll be looking primarily at the fetch event today to make our service worker useful. But there are several useful events beyond that: sync events and notification events, for example.

The fetch event intercepts any request made by the user. We can then use this event to return requests from cache and not the network.

I adopted a cache-first strategy, falling back to the network, for all the assets and the page responsible for rendering the HTML version of the Apple Wallet driving licence share pass.
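Put together, the worker script can be sketched like this. The cache name and asset list are assumptions; you would list the pages and assets your offline pass actually needs:

```javascript
// sw.js — a minimal cache-first service worker sketch.
// The cache name and asset paths below are assumed for illustration.
const CACHE_NAME = 'licence-pass-v1';
const PRECACHE_ASSETS = ['/pass.html', '/css/pass.css', '/js/pass.js'];

// Answer from the cache when we can, falling back to the network.
function cacheFirst(request, cacheStorage, fetchFn) {
  return cacheStorage.match(request).then(function (cached) {
    return cached || fetchFn(request);
  });
}

if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('install', function (event) {
    // Pre-cache the assets we want available offline later.
    event.waitUntil(
      caches.open(CACHE_NAME).then(function (cache) {
        return cache.addAll(PRECACHE_ASSETS);
      })
    );
  });

  self.addEventListener('fetch', function (event) {
    event.respondWith(cacheFirst(event.request, caches, fetch));
  });
}
```

With this in place, once the worker has installed, the pass renders from cache even with no network connection at all.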

Exposing localhost to the outside world

I came across something really useful while working on my latest project. If you want to show something you are working on to anyone in the world without deploying your code, you can use localtunnel.

A very useful tool to expose a port on your local machine to the outside world. This means you can show your local development of a website/service to anyone with an internet connection.

Github repo:

  • Install Node.js: sudo apt-get install nodejs-legacy
  • Install npm: sudo apt install npm
  • Install localtunnel: sudo npm install -g localtunnel
  • Run the following with the port you want to expose: sudo lt --port 9000 -s subdomainnamehere

Accessibility fixes

While building the Apply for Design service, the team ensures the service can be used by all citizens of the United Kingdom. No user should be excluded on the basis of disability; to do so would breach the Equality Act 2010. The service must also comply with any other legal requirements. As a starting point, your service should aim to meet Level AA of the Web Content Accessibility Guidelines (WCAG) 2.0.

When we tested the service, we had a number of issues raised, which we have now reviewed and fixed at source. These fixes have been applied to our common library for generating HTML markup, ensuring we maintain a high level of accessibility.

During testing with JAWS we found that, although we were using the correct accessibility tags of a fieldset and legend, we were including too much hint text inside the legend. The additional text from the page was being pulled into the label, making the label extremely long, as it included such additions as how to answer the question. This is confusing for screen reader users, as it is difficult to know which field the user is selecting. A shorter fieldset and legend, for example containing the question and answer only, will avoid confusion.

We chose to change our markup to place hint text outside the legend to fix this issue, e.g.



<fieldset>
  <legend>
    <span class="form-label-bold text">Do you wish to defer publication of your design?</span>
  </legend>

  <span class="form-hint text">Deferring publication limits your protection to the date of filing only.</span>

  <span class="form-hint text">The majority of designs applications are submitted without being deferred.</span>
</fieldset>

A very small change, but without testing with real users this would not have been picked up as an issue for users who use JAWS software in this way.

The video below gives a great demonstration of how this small change gives the users a much better experience:

Backup WordPress site

This article explains how to manually back up your WordPress site. I used this method to transfer this site from FatCow hosting to the Microsoft Azure platform, where I now host this site and others.

Backup WordPress Database using phpMyAdmin

The database is the most valuable part of your website; it contains all the information that changes most often. Luckily, backing up your WordPress database is pretty straightforward and can be done using a handy tool called phpMyAdmin, which is usually available through your cPanel.

Let’s dive into the steps:

Log in to your cPanel and click the phpMyAdmin icon in the Databases section.

In phpMyAdmin you will see a list of database names in the left column of the home page. Simply click on the database that you wish to back up and select the Export tab at the top of the screen.

Make sure the export method is “Quick” and the format is “SQL”.

Click the Go button. This will download a .sql file to your computer.


The download process usually takes from a few seconds to a few minutes, depending on how large your database is. The downloaded SQL file can be imported at any time when you need to restore or migrate your site.

Alternatively, if you’re not comfortable with these steps, or not familiar with phpMyAdmin, you could also back up your database from your WordPress admin panel. To do this,

Head to WordPress dashboard » Tools » Export » All content and click the download export file button. This will download an XML file to your computer. This file contains your posts, pages, comments, custom post types, categories, tags, and users.

That said, phpMyAdmin remains the most efficient tool for backing up your WordPress database.

Backup WordPress Files (wp-content)

To access wp-content you’ll need either an FTP client or the cPanel File Manager. Let’s start with the File Manager tool:

Option 1: Backup your wp-content folder using File Manager

Log in to your cPanel account.

Navigate to the File Manager icon under the “File Management” section.

Click on it and a pop-up will appear. In the pop-up, select Web Root (public_html/www) and click Go.

The File Manager will now load in a new window and show your files. Ensure you are in the public_html folder.

Once there, navigate to the “wp-content” folder, right-click on it and select “Compress”.

Select Zip Archive as the compression type and then click Compress File(s). This will create a file called wp-content.zip and place it within your root folder.

Wait for the archiving to finish. When it’s ready, refresh the file manager and look for the wp-content.zip file. (By downloading only the file you need, the time taken to complete the backup is significantly reduced.)

Simply double click on it to begin download.

File Manager

This might take a long time, maybe an hour or more depending on your connection speed and the size of your website. Once done, don’t forget to delete the zip file in your root folder to save disk space.

Option 2: Backup your wp-content folder using FTP (FileZilla)

This part of the article assumes you already have an FTP account and the FileZilla software installed on your computer. If you don’t have cPanel access on your host, you will have to get yourself an FTP client such as FileZilla. FTP clients let you move your website’s files from your hosting account to your computer, and vice versa.

Open FileZilla and connect to your host with your FTP information.

After you have connected, select the public_html directory from the right pane.

Create a folder on your desktop and download the wp-content folder to it by simply dragging the folder over from the right pane to the left pane.


Congratulations, you’ve successfully backed up your WordPress site.

Set up a WordPress site in the Cloud with Azure

If you run a WordPress site on shared hosting such as GoDaddy, 123-reg, etc., then you really should move off that platform and transfer your site to your own personal VM in the cloud. You can set this up for as little as £10 a month for the A0 setup, but I would recommend the A1 price tier at roughly £20 a month.

If you have an MSDN subscription you can also use its credits for the charges. It is a good time to grab a VM or two and set up your own servers, where you can host your own blog and showcase your awesome open source projects. A VM is your own mini server, where you get full remote desktop access and can do whatever you like. You aren’t limited to a web-based control panel; it’s your own baby server in the cloud, fully scalable and redundant.

Creating your VM

Search for Bitnami images in the Azure dashboard and click the create button. It is that simple to set up a LAMP environment VM with WordPress pre-installed on Apache and MySQL as the database. Bitnami images also have phpMyAdmin pre-installed to allow you to manage your database.

Bitnami VM

Once the VM is created, it will look like this:

VM View

Go to the “Endpoints” tab and see the public port that has been opened for SSH. SSH is the Linux equivalent of Remote Desktop Protocol (RDP).


So, the server’s DNS is the one shown and the SSH public port is 54423.

Connect to the VM

Let’s get PuTTY and configure it to connect to this VM:


Enter the DNS name and the port. Then put a name under “Saved Sessions” and click Save. Then go to “Appearance” and change the font to Consolas, 15. Come back, and click “Save” again.

Now click “Open” and you will be taken to a remote session on the server. You will be asked to accept the server’s key; just click Yes. Use your Azure account details as the login and password.

You will see a screen like below:

SSH Screen

The so-called endpoints are the open ports on your machine. In this case, port 22 (SSH) and 80 (HTTP) are both open.

If your server has a web interface, you should open HTTP and HTTPS endpoints when you are ready to go into production. You can do this by clicking VIRTUAL MACHINES in the vertical menu bar on the left, choosing VIRTUAL MACHINE INSTANCES in the central resource area and then clicking on the name of the virtual machine you want to add an endpoint to.

You can now access your virtual machine via the web by visiting its address, where DNSname is the name shown in the DNS NAME column of the VIRTUAL MACHINE INSTANCES tab.

You should receive something like:

WordPress screen

To see the WordPress site itself, visit the site’s address. To log into your WordPress site for the first time, use user as your username and bitnami as your password. You are advised to change these credentials as soon as possible after deploying your site.



Showcase site for MVC using ASP.NET and onion architecture

I have created a new site hosted in Azure to showcase the latest ways of working using the .NET framework. The site is built using MVC using a strict ONION Architecture to ensure separation of concerns are maintained. Please have a look at