Making Architecture Matter - Martin Fowler Keynote

Creating an encrypted Time Machine disk on ExFAT

hdiutil create -stdinpass -encryption "AES-256" -size 500g -type SPARSEBUNDLE -fs "HFS+J" YourImage.sparsebundle

Where YourImage is the name you want to give your backup image and 500g is the maximum size of your disk image.

open YourImage.sparsebundle
diskutil list

Find your mounted image in the list and get its path; in my case it was /dev/disk3s2.

sudo diskutil enableOwnership /dev/disk3s2
sudo tmutil setdestination /Volumes/YourImage

References:

http://hints.macworld.com/article.php?story=20140415132734925
http://garretthoneycutt.com/index.php/MacOSX#Creating_an_encrypted_sparsebundle
https://developer.apple.com/library/mac/documentation/Darwin/Reference/ManPages/man1/hdiutil.1.html

Running NodeJS in production

It's 2015, we are in a coffee shop with our laptops, we want to write a NodeJS web application, we know JavaScript, but we don't have Node installed yet.

We head over to http://nodejs.org/ and start downloading it. Installing it is pretty straightforward; we are able to complete it before we need to order our next coffee.

As a result of our installation we get two binaries: node and npm. node runs our application; npm installs modules for it.

We start looking at http://nodejs.org/docs/api, see that the documentation is pretty straightforward, and are able to write a simple HTTP server in a matter of minutes just by copy-pasting example code.

We start up our app by just running:

node app.js

Our application, which is barely 10 lines of code, works! We are extremely happy that we went from nothing to a demo application running on our laptop in less than an hour.

One week has passed. Our application does a lot more than just print 'hello world'; however, our app.js file has grown from less than 10 lines to 3000, and we still have a lot of features to add.

Naturally, we decide to split our app.js into multiple files; the strategy works and all is simple again.

One month has passed. We decide to hire a new guy. This is when we realise that hiring NodeJS developers is hard, and that Node developers are also a lot more expensive than average. The new guy is great; however, he has a hard time understanding the way we structured our app, and many days of debate and discussion follow on how to organise files.

One week later we decide that a framework is needed to help structure our app. The only problem is that NodeJS is a pretty new technology, so the available frameworks are still quite immature. We end up choosing the one that seems the most decent. Over the next couple of weeks we spend a lot of time making the framework do what we need it to do: we are sending patches, implementing workarounds and monkey-patching it.

Two months later our app is ready to go and we immediately upload it to the cloud. It works, just as it worked on our laptop; all is good.

24 hours later our app encounters an exception, crashes, and remains offline until we restart it. We realise that this is a problem and write a shell wrapper that automatically restarts it; all is good.
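Such a wrapper can be as naive as a loop (a sketch; app.js stands in for whatever the entry point is):

#!/bin/sh
while true; do
    node app.js
    echo "app crashed, restarting..." >&2
    sleep 1
done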

3 days later our users are experiencing weird bugs, so we decide that we need a way to log things. We choose Winston, because that looks to be the logical choice.

One week later our app becomes popular and we are getting a lot of users. That's great; however, our app seems to be becoming less responsive, which we don't understand, since NodeJS was supposed to be high performance. After days of painful debugging and reading up, we realise that event-loop systems do have an upper limit and the only way to get around this is to run multiple instances of our application. After some modifications, our app is able to run in a cluster; now we need something to manage and load balance this. After a lot of searching and researching we go with PM2 for process management and Nginx for load balancing. We also realise that we need to make sure our logs get merged, because nobody wants to search through 10 log files when something goes wrong; fortunately PM2 can take care of that.
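For reference, the core of that change is small: Node ships with a built-in cluster module that lets a master process fork one worker per CPU. A minimal sketch (the port and the response are placeholders, not our actual app):

var cluster = require('cluster');
var http = require('http');
var numCPUs = require('os').cpus().length;

if (cluster.isMaster) {
    // fork one worker per CPU and replace workers that die
    for (var i = 0; i < numCPUs; i++) {
        cluster.fork();
    }
    cluster.on('exit', function () {
        cluster.fork();
    });
} else {
    // each worker runs its own copy of the HTTP server on a shared port
    http.createServer(function (req, res) {
        res.end('hello from worker ' + process.pid + '\n');
    }).listen(8000);
}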

The moral of this story

Running things on your laptop is different from running them in production. Because NodeJS is a young platform, you need to find DIY solutions for a lot of problems that were solved on most other platforms long ago. A lot of Node advocates forget to mention the challenges of getting the platform into a production-ready state. NodeJS development is expensive. Node is still 0.x.

Some facts:

Node does not have an official process manager, so we need to choose from community-provided ones.

Even with a process manager, one faulty request that makes Node crash will take down all other requests that are in progress; there is no way to fix this.

A single Node process is severely limited

Currently, by default v8 has a memory limit of 512mb on 32-bit systems, and 1gb on 64-bit systems. The limit can be raised by setting --max_old_space_size to a maximum of ~1024 (~1 GiB) (32-bit) and ~1741 (~1.7GiB) (64-bit), but it is recommended that you split your single process into several workers if you are hitting memory limits.
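For example, starting the app with the 64-bit ceiling mentioned above looks like this:

node --max_old_space_size=1741 app.js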

The LAMP test

New flashy web technologies that claim to be better than their old, outdated counterparts appear all the time.

The LAMP test is designed to compare a new technology to something old but widely used, to see whether it holds up to its promise or is just a marketing ploy.

Most marketers of new technology showcase areas where the particular technology excels and ignore all the other areas where it fails compared to its old counterpart.

A lot of new technologies look good on paper and demos, but fail when they are used under production conditions.

Surprisingly, a lot of over-hyped web technologies fall short compared to boring technologies like PHP, MySQL and jQuery.


AngularJS, built by Google, is one of the most hyped JavaScript-based frontend technologies.

I have developed 3 projects in AngularJS and have a year of experience with the technology. During the last two years, I have consistently seen AngularJS advertised as the holy grail. I've seen managers put the Angular logo on their slides when pitching a project, just to make it more appealing.

In reality, however, AngularJS doesn't live up to its promises.

This article sums up the issues best: Why you should not use AngularJS (HN discussion)

The following comment sums up how mind-blowing AngularJS's faults are:

I find the rise of Angular kind of baffling. Angular's scope system is exactly analogous to the scope system of a programming language. This is a solved problem! When you make a scope system, make it lexical, and require explicit declaration before use. If you're not making those choices, then at least acknowledge that these are the standard answers, with very clear advantages over other scoping systems, and explain why you are not using these answers. But with angular, we have a dynamic, implicit declaration scoping system. New scopes are introduced somewhat unpredictably, at the discretion of each directive. I thought that introducing dynamic, implicit-declaration, non-block-scoped variables in 2014 was like introducing a new car with a coal-burning engine, but no one even seems to remark on it.

Then there's the dirty-checking loop. After every event there is a digest; every digest runs every watch. To me, just reading this description makes a voice speak up in my head: "Uh-oh! That sounds like O(n^2)!" Now that angular is being widely used, people are noticing that it's slow as shit. But why did the framework get to this level without anyone remarking, "this dirty-checking algorithm is fundamentally, irremediably not scalable"? Do people not have a sense even for the most coarse performance characteristics of algorithms like this? Or do people simply think that nowadays "performance does not matter"?

Angular's "module" system is the strangest of all. It doesn't do namespacing or dependency tracking. What is even the point of it? What thought process led to this useless module system? It's just strange. Hundreds of years of people's work are spent on something, which the most cursory, CS 101 analysis shows to be seriously flawed. Is analysis simply a lost art in this industry?

Oh well, people are finally realizing Angular has its faults, because they've seen them with their own eyes and now they believe them. It would be nice if we could learn from this, and maybe skip the next boondoggle (web components for instance), but I have no hope for it.

source: https://news.ycombinator.com/item?id=8652566

Problems with Angular: http://www.leanpanda.com/blog/2015/09/20/our-criticisms-of-angularjs/

In summary, if you use AngularJS in a large project:

  • your Angular app won't scale
  • your code will be unmaintainable
  • you will spend a lot of time working around Angular's faults
  • your work will be obsolete when AngularJS 2.0 arrives, which will not be compatible with 1.x


Docker is the AngularJS of infrastructure:

  • extremely overhyped
  • doesn't do anything new
  • has severe issues that advocates happily ignore

In terms of development practices, Docker images are used exactly like VM images; there is no substantial difference there, and you are getting the same features you would get from using VirtualBox or VMware.

Docker can be useful if you have one powerful machine and want to run multiple isolated environments inside it without the overhead of virtualisation. This would have been really useful 10 years ago; however, in the age of cloud platforms, this use case is very rare.

The real problem with using Docker is that you inherit all the issues you would have with using VM images, as well as ones unique to Docker:

  • you cannot version control it
  • you don't know what's really on the machine
  • if you use an upstream image as a base, you have to trust it
  • you cannot rebuild a clean infrastructure; you can only revert to a previously saved state






If you want a manageable infrastructure, it's better to script it with Puppet, Chef, Ansible etc.

Python with MacPorts

sudo port install python27 py27-pip
sudo port select --set python python27
sudo port select --set pip pip27
printf '\nexport PATH=/opt/local/Library/Frameworks/Python.framework/Versions/2.7/bin:$PATH\n' >> ~/.profile

Why you might want to use boring technologies

There are technologies that are over-glorified and overhyped, and there are technologies that get the job done; sometimes these two are not the same thing.

If you need to solve a technically boring problem, you might be better off using a boring technology instead of whatever is on the front page of Hacker News.

Rockstar databases

Episode 1 - Mongo DB Is Web Scale https://www.youtube.com/watch?v=b2F-DItXtZs

Why You Should Never Use MongoDB http://www.sarahmei.com/blog/2013/11/11/why-you-should-never-use-mongodb/


Re: MemSQL the "world's fastest database"? http://www.postgresql.org/message-id/4FE8A2DE.5050806@agliodbs.com


MongoDB stands for "humongous".





Cool Architectures

Episode 2 - All The Cool Kids Use Ruby https://www.youtube.com/watch?v=5GpOfwbFRcs

Node.js Is Bad Ass Rock Star Tech https://www.youtube.com/watch?v=bzkRVzciAZg

PHP and Nginx


sudo apt-get install php5-fpm
sudo mkdir /www
sudo chown ubuntu:ubuntu /www
sudo vim /etc/nginx/sites-available/default

server {
    listen       80;
    index index.php index.html;
    root /www;

    location / {
        # try to serve file directly, fallback to app.php
        try_files $uri /app.php$is_args$args;
    }

    location ~ \.php$ {
        fastcgi_pass unix:/var/run/php5-fpm.sock;
        fastcgi_split_path_info ^(.+\.php)(/.*)$;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_param HTTPS off;
    }
}

Mac OS X (Macports)

sudo port install php55-fpm
sudo cp /opt/local/etc/php55/php-fpm.conf.default /opt/local/etc/php55/php-fpm.conf
sudo port load php55-fpm
sudo vim /opt/local/etc/php55/php-fpm.conf

Comment out the default listen = line, then add:

listen = /opt/local/var/run/php5-fpm.sock
listen.owner = nobody
listen.group = nobody
listen.mode = 0660

sudo cp /opt/local/etc/nginx/fastcgi_params.default /opt/local/etc/nginx/fastcgi_params
sudo port unload php55-fpm
sudo port load php55-fpm

Sample nginx vhost:

server {
    listen       80;
    server_name  watchtower.dev;
    index index.php index.html;
    root /Users/void/Developer/watchtower;

    location / {
        # try to serve file directly, fallback to app.php
        try_files $uri /app.php$is_args$args;
    }

    location ~ \.php$ {
        fastcgi_pass unix:/opt/local/var/run/php5-fpm.sock;
        fastcgi_split_path_info ^(.+\.php)(/.*)$;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_param HTTPS off;
    }
}

Setting up an Ubuntu server for PHP

It's good practice to ensure that the system is up to date.

sudo apt-get update
sudo apt-get dist-upgrade
sudo reboot
sudo apt-get install apache2 mysql-server php5 libapache2-mod-php5 php5-gd php5-mysql php5-curl postfix
sudo a2enmod vhost_alias rewrite
sudo service apache2 restart

(Optional) Add ServerName yourhostname to /etc/apache2/httpd.conf

Difference between coding style and error handling

In one of my previous projects some people were complaining that I was trying to force everyone to code in my style; what was really happening was that I kept telling my colleagues to write proper error handling in their code.

If we are talking about code style, that implies we are talking about pieces of source code that do exactly the same thing and differ only in the style they are written in.


function isEven(number) {
    return !(number % 2);
}

function isEven(number) {
    if (number % 2) {
        return false;
    } else {
        return true;
    }
}

function isEven(number) {
    if (number % 2)
        return false;
    return true;
}

function isEven(number) {
    if (number % 2) {
        return false;
    }

    return true;
}

function isEven(number) {
    var returnValue = true;
    if (number % 2) {
        returnValue = false;
    }

    return returnValue;
}

These all do the same thing but are written in different styles; I could use any of these in an application and I would get the same end result.

However this is not the case with the following examples:

// #1
function isEven(number) {
    if (typeof number !== 'number') {
        return;
    }
    return !(number % 2);
}

// #2
function isEven(number) {
    if (typeof number !== 'number') {
        throw new Error('Input argument is not a number type');
    }
    return !(number % 2);
}

// #3
function isEven(number) {
    return !(number % 2);
}

For these examples we would get different results for invalid inputs; therefore we cannot say that they are merely written in different styles, because they clearly don't do the same thing. All 3 of these examples behave differently for invalid inputs.

The problem with code like #1 is that if you give it something other than a number, it will return undefined, which acts as false in a conditional statement. Your program will never know that it received a bad input; it will just continue executing and producing bugs in other places.
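For example (a hypothetical call, just to illustrate how the failure stays silent):

var result = isEven('banana'); // #1 quietly returns undefined here

if (result) {
    // never runs
} else {
    // runs, even though the input was never a valid number
}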

Correctness and reliability

A function is something that takes inputs and returns a result. Functions should be simple, do one thing, and do that one thing well.

Let's write a function in JavaScript that computes the area of a rectangle.

function getArea(width, height) {
    return width * height;
}

This code will work for us; however, to someone who didn't write it, it is probably not obvious what kind of area it computes. We could add comments explaining what the code does, but that does not help us when we are reading code that uses this function. The best thing to do is give the function a good name that makes its purpose obvious to the reader.

function getRectangleArea(width, height) {
    return width * height;
}

Now we have a simple and readable function that returns the correct results for correct input, which is wonderful. However, what happens when the inputs are not correct?

What happens when the width and/or the height are not numbers? What happens when the width and/or the height are invalid numbers?

Java programmers and other static-typing-language programmers would quickly point out that they don't have to worry about this because the compiler checks the types, so that could never go wrong. This is true to some extent; however, static typing doesn't always ensure that the order of the arguments is correct, and some type systems have problems with null values.

interface Greet {
    void sayHello(String firstName, String lastName);
}

class PrintGreeter implements Greet {
    // The argument order is swapped, yet the compiler will not complain.
    public void sayHello(String lastName, String firstName) {
    }
}

public class Test {

    public static void whatever(Integer foo) {
    }

    public static void main(String[] args) {
        whatever(null); // The compiler will not complain.
    }
}

Static typing also has negative effects, because it can give false confidence to developers: http://programmers.stackexchange.com/questions/221615/why-do-dynamic-languages-make-it-more-difficult-to-maintain-large-codebases/221658#221658

Some developers simply assume that static typing will save them from all bugs and don't bother to write tests to make sure that their code is reliable (more on this later).

Even if we ignore the shortcomings of static type systems listed above, we are still faced with another type of issue: valid type, invalid value.

The above code will happily accept negative values as the rectangle's width/height, which is invalid, and static typing would not catch this error (the invalid values), because they are valid numbers.
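A sketch of what checking both the type and the value could look like (the exact checks and error messages are my own choices, not from the original):

function getRectangleArea(width, height) {
    if (typeof width !== 'number' || typeof height !== 'number') {
        throw new Error('width and height must be numbers');
    }
    if (!isFinite(width) || !isFinite(height) || width < 0 || height < 0) {
        throw new Error('width and height must be finite, non-negative numbers');
    }
    return width * height;
}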

Relevant links:




Macports updating

Update the Macports database.

sudo port selfupdate

Update the installed packages.

sudo port upgrade outdated

Clean up old versions.

sudo port uninstall leaves
sudo port uninstall inactive

PHP Sessions in redis

Storing session data in files can be a bottleneck in PHP, and it can also be tedious to manage. Luckily there is an easy way to store session data in memory with Redis.

The first step is to install Redis and the PHP Redis extension.

sudo apt-get install redis-server php5-redis

We also need to tell PHP to use Redis for sessions.

; Handler used to store/retrieve data.
; http://php.net/session.save-handler
session.save_handler = redis
session.save_path = "tcp://localhost?weight=1"

We might also want to set the session lifetime to whatever we desire.

session.cookie_lifetime = 2764800
session.gc_maxlifetime = 2764800

After everything is configured, an Apache restart is required.

sudo service apache2 restart

Reference: https://github.com/nicolasff/phpredis

Installing NodeJS on Ubuntu

Install tools required for compiling Node.

sudo apt-get update
sudo apt-get install build-essential openssl libssl-dev pkg-config

Download, build, and install Node.

cd ~
wget http://nodejs.org/dist/v0.10.25/node-v0.10.25.tar.gz
tar xvf node-v0.10.25.tar.gz
cd node-v0.10.25
./configure
make
sudo make install

Clean up.

cd ~
rm node-v0.10.25.tar.gz
rm -fr node-v0.10.25

Advanced Git deploys

Deploying with git makes sense for various reasons:

  • it is fast and efficient because you only transfer the deltas
  • you see exactly which version of the code you have
  • it is very easy to revert or even switch between versions
  • you have the ability to automate most of the deployment via hooks

Install git

sudo apt-get install git-core

Create user

sudo adduser \
   --system \
   --shell /bin/bash \
   --gecos 'git version control' \
   --group \
   --disabled-password \
   --home /home/git git

Gitolite setup

su git
cd ~
git clone git://github.com/sitaramc/gitolite
mkdir -p $HOME/bin
echo "PATH=$HOME/bin:$PATH" >> ~/.bashrc
gitolite/install -to $HOME/bin
su git
cd ~
gitolite setup -pk YourName.pub

$REPO_UMASK (octal, default 0077): The default UMASK that gitolite uses makes all the repos and their contents have rwx------ permissions. People who want to run gitweb realise that this will not do. The correct way to deal with this is to give this variable a value like 0027 (note the syntax: the leading 0 is required), and then make the user running the webserver (apache, www-data, whatever) a member of the 'git' group.

If you've already installed gitolite then existing files will have to be fixed up manually (for a umask of 0027, that would be chmod -R g+rX). This is because umask only affects permissions on newly created files, not existing ones.

reference: https://github.com/sitaramc/gitolite

Web root directory

Create a directory that the git user is allowed to write to and apache is allowed to read from.

usermod -a -G git www-data
usermod -a -G www-data git
service apache2 restart
mkdir /www
chown www-data:www-data /www
sudo chmod 775 /www

Afterwards all you need to do is write a post-receive hook that puts the files into the /www directory.
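A minimal sketch of such a hook (the paths, the branch name and the assumption that the deploy repo is bare are mine; adapt them to your setup):

#!/bin/sh
# hooks/post-receive inside the bare repository
# check out the pushed master branch into the web root
GIT_WORK_TREE=/www git checkout -f master

Don't forget to make the hook executable (chmod +x hooks/post-receive).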

Using package.json for dependencies

If you are working on a project that uses NodeJS, you have no reason not to use package.json for listing your dependencies, unless of course you are only using built-in node modules.

  • storing your node modules can add unnecessary bloat to your repository
  • binary modules can be tricky, especially if you are using multiple architectures
  • it makes it easier to see which versions of which modules you use

Getting started

You can use the following command to generate a package.json.

npm init

or create one yourself and start specifying your dependencies in the dependencies or devDependencies property. The modules are the keys and the version numbers are the values.

  "name": "test",
  "version": "0.0.0",
  "author": "You",
  "license": "BSD",
  "dependencies": {
    "awesomemodule1": "1.2"

dependencies vs. devDependencies

Dependencies are usually modules you need to run your app, while devDependencies are modules you need to develop your app.
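For example, a package.json could separate the two like this (the test framework name is made up for illustration):

"dependencies": {
    "awesomemodule1": "1.2"
},
"devDependencies": {
    "sometestframework": "0.5"
}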


Once you set up a package.json, all you need to do is run npm install and it will fetch all the modules that you specified above.

Java weirdness

One does not equal one

System.out.println(new Integer(1) == new Integer(1)); //prints false

Explanation: the equality operator checks whether you are comparing the same object references, not the same values.
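If you want value comparison instead, equals() is the usual way (a quick illustration, not part of the original snippet):

System.out.println(new Integer(1).equals(new Integer(1))); //prints true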

Same strings aren't always the same

System.out.println("a" == "a"); //true
String a = "a";
String b = "a";
System.out.println(a == b); //true
String c = new String("a");
String d = "a";
System.out.println(c == d); //false

Strongly typed language, except when it's not

Integer foo = 5;
int bar = 5;
System.out.println(foo == bar); //true

Interesting list to array conversion API

List<String> stringList = new ArrayList<String>();
String[] stringArray = stringList.toArray(new String[stringList.size()]);  

The callback hell that never was

Let's talk about callbacks. People seem to get terrified by them, because they see something like this:

var mysql = require('mysql'),
    db = mysql.createClient({
        user: 'root',
        password: 'root'
    }),
    id = 1;

db.query('USE mydb', function (err) {
    //some code here
    db.query('SELECT * FROM whatever WHERE id=?', [id], function (err, foo) {
        if (foo) {
            //some other code
            db.query('UPDATE whatever SET bar=?', [foo.bar], function (err) {
                db.query('INSERT INTO someothertable (sometime) VALUES (NOW())', function () {
                    //some code
                    console.log("It's done");
                });
            });
        }
    });
    //more code
});

This has the potential to turn into hard-to-understand spaghetti code. But we don't have to write it like this; we don't have to inline every function, we can simply name them instead.

var mysql = require('mysql'),
    db = mysql.createClient({
        user: 'root',
        password: 'root'
    }),
    id = 1;

function init(fn) {
    db.query('USE mydb', fn);
}

function fetch(id, fn) {
    db.query('SELECT * FROM whatever WHERE id=?', [id], fn);
}

function checker(err, foo) {
    if (foo) {
        //some other code
        update(foo.bar);
    }
}

function update(bar) {
    db.query('UPDATE whatever SET bar=?', [bar], function (err) {
        db.query('INSERT INTO someothertable (sometime) VALUES (NOW())', function () {
            //some code
            console.log("It's done");
        });
    });
}

init(function () {
    fetch(id, checker);
});

We could also put this in a module or wrap a (pseudo) class around it if we wanted to. The main point is that we don't have to nest inline callbacks to get the job done, we can name them.
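For example, the named functions could be moved into their own module (the file name and the exact exports are hypothetical):

// db-tasks.js (hypothetical file name)
var mysql = require('mysql'),
    db = mysql.createClient({
        user: 'root',
        password: 'root'
    });

exports.init = function (fn) {
    db.query('USE mydb', fn);
};

exports.fetch = function (id, fn) {
    db.query('SELECT * FROM whatever WHERE id=?', [id], fn);
};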

How to write a CKEditor image uploader backend

CKEditor is an awesome piece of software; it's essential for good CMSes that require WYSIWYG editing. The problem, however, is that it doesn't have an image uploader by default, and it's not a trivial task to write one because its documentation is not very good on this subject. I spent a lot of time googling for hints on how to write a PHP backend for image uploads; luckily, after some hard work, I managed to gather enough info to create a working image upload backend.

  1. I have an ajax directory relative to the site's root; I will create an upload.php in this folder.

  2. I edit the config.js (inside CKEditor's directory) and append the following line:
    //this needs to be inside: CKEDITOR.editorConfig = function( config )
    config.filebrowserImageUploadUrl = 'ajax/upload.php?type=Images';
  3. This upload.php will have to deal with an HTTP POST request, of course, and the file will be sent from an input named "upload"; in PHP this means that we have to deal with $_FILES['upload']. We also need to respond to CKEditor, which is a little tricky: it involves ad-hoc JavaScript (which was badly documented when I did this the first time). The following template solves the CKEditor-specific issues:
<?php
//process $_FILES['upload']
//store the uploaded image's URL in $uploadedImageURL
?>
<script type="text/javascript">
window.parent.CKEDITOR.tools.callFunction(<?php echo $_GET['CKEditorFuncNum']; ?>, '<?php echo $uploadedImageURL; ?>');
</script>

We normally want to resize the image at this point and tell CKEditor the URL of our resized image. This ad-hoc method is mainly for security.

There is also the option of using CKFinder for file uploads, but I prefer building custom solutions so that I can have maximum control over my file uploads.