About Guido Tapia

Over the last 2 years Guido has been involved in building two predictive analytics libraries, one for the .Net platform and one for Python. These libraries, and Guido's machine learning experience, have placed PicNet at the forefront of predictive analytics services in Australia.

For the last 10 years Guido has been the Software and Data Manager at PicNet, and in that time he has delivered hundreds of successful software and data projects. An experienced architect and all-round 'software guy', Guido has been responsible for giving PicNet its 'quality software provider' reputation.

Prior to PicNet, Guido was in the gaming space working on advanced graphics engines, sound (digital signal processing) engines, AI players and other great technologies.

Interesting Links:

Angular 2 and Internet Explorer 9 (IE9)

Getting Angular 2 and IE9 working together is not very straightforward; since the move to the release candidates, some of the shim libraries have changed and others have been removed from the NG2 packages, making things confusing.

After some research, this is what I have determined to be the minimum set of “workaround” libs required. Note I have also included some other imports that are required since the move to the RCs.


You will need to install the following npm packages:

npm install --save angular2-ie9-shims
npm install --save core-js
npm install --save zone.js
npm install --save systemjs
npm install --save reflect-metadata
npm install --save rxjs

Script tags

The corresponding script imports are:

  <!-- This first should be temporary -->
  <script src="lib/shims_for_IE.prod.js"></script> 
  <script src="lib/shim.min.js"></script><!-- from core-js -->
  <script src="lib/zone.js"></script>
  <script src="lib/Reflect.js"></script>
  <script src="lib/system.js"></script>
  <script src="lib/Rx.min.js"></script>

This should have things working (for now) in IE9 through IE11.

Upgrading to Angular 2 – Reactive Forms (RC3 New Forms)

Upgrading to the new form engine in Angular 2 (RC3) is fairly straightforward and not nearly as daunting as some of the router changes in the past. So here is my quick and dirty how-to, which worked on my projects (no other guarantee).

Firstly – Read the docs
One of the issues with working in this beta/RC arena is that the documentation of these things is quite hard to find. The best doc I found about this is here:
Read that, it basically has all you need towards the bottom.

npm packages.json
When you upgrade to RC3 you will need to run:

npm install --save @angular/forms

This will install the latest forms package.

Update your gulp script and ensure the new forms files are copied to your testing directory.

Update your SystemJS config to include this new forms folder. Since SystemJS can be configured a million ways, I will just show what I added to my config:

packages: {
  '@angular/forms': {defaultExtension: 'js', main: 'index.js'}
},
map: {
  '@angular/forms': 'lib/@angular/forms'
}

In your boot.ts or app.ts add the following:

import {disableDeprecatedForms, provideForms} from '@angular/forms';

bootstrap(AppComponent, [
  disableDeprecatedForms(),
  provideForms()
]);

Search and replace your templates for the following:

ngFormModel -> formGroup
ngSwitchWhen -> ngSwitchCase (not really new forms but RC3 so do it anyway)
ngFormControl -> formControl

That’s all I had to do, but according to “the doc” you should also do the following (note, I did not test these):

formControlName (deprecated: ngControl)
formGroupName (deprecated: ngControlGroup)
formArrayName (deprecated: ngControlGroup)


In your components, remove all the old form-related imports; these would probably be coming from @angular/common. The ones I removed are:

Control, ControlGroup, FormBuilder, Validators

And then replace with the new directives/components from @angular/forms. The new names are:

FormControl() (deprecated: Control)
FormGroup() (deprecated: ControlGroup)
FormArray() (deprecated: ControlArray)
FormBuilder (same)
Validators (same)

All directives are included in the single REACTIVE_FORM_DIRECTIVES array, so just add it to your @Component.directives array.
Then remember to search and replace your component code for the following:

ControlGroup -> FormGroup
ControlArray -> FormArray
Control -> FormControl

Very straight forward once you find “the doc” (the magical doc).

VANguard and Federated Authentication Service

We recently completed an integration project where we connected a complex QlikView visualisation and analytics application to the VANguard federated authentication service.  VANguard is a service provided by the federal government that allows government agencies to provide single sign-on capabilities to their systems for users from other agencies.

The project itself ended up being much more complex than originally planned, and some lessons learnt are listed below in the hope that they will help others.


VANguard uses the SAML specifications to provide federated authentication services.  I must say that SAML appears to be greatly over-engineered and considerably more complex than it should be.  However, several libraries are available that abstract away the underlying SAML.

No metadata

VANguard does not provide metadata for their service.  Metadata is a way for an identity provider to give details of its environment (URLs, certificates, etc).  VANguard advised that they are not an identity provider, an explanation I found academic as providing a decent metadata file would have greatly simplified the process.

If you are trying to integrate a system that comes with SAML support out of the box, it will probably not work with VANguard due to this limitation.


We ended up using the dk.nita.saml20 package to abstract the SAML boilerplate.  We did, however, have to modify the source in several places to hack and patch things together.  For instance, we had to change the language on the service metadata to “en” rather than “da”, and there were several other small things like this we changed.  Customising the code also helped us work around the limitation of no metadata existing for the VANguard service.


Some resources that really helped are:


If we were to integrate with VANguard again we would do things differently.  The “no metadata” issue is a huge limitation and makes VANguard integration very hard.  The existing libraries out there just expect an identity provider to have a metadata file (and yes, I know that in theory VANguard is not an identity provider, but that should be transparent).  Next time we would not use an abstraction library, and would instead do all creation and parsing of SAML files manually.  This seems daunting at first given the over-complexity of SAML payloads, but I think we would have completed the project sooner going down this path.  Live and learn.

Angular 2 Forms Wrapper – Clean forms

Angular 2 forms are very powerful; the custom validation support is extremely flexible and easy to use.  However, this power and flexibility come at a cost, and that cost is boilerplate.  To nicely integrate validations into the user interface you need a lot of ugly boilerplate template code.  This makes the templates hard to maintain and very hard to keep consistent across a large application.  This simple wrapper tries to address that.

Note, this should not be treated as a complete open source project but rather a template you can work into your own application with your own customisations.  The sample here uses HTML5 input elements, but you should insert your own generators for Bootstrap, Material or whatever control/CSS library you prefer.

The Problem

To get a nice interactive forms user interface you need a lot of code.  For instance, a simple email field without good interactivity / validations looks like:

<div class="form-group">
  <label for="email">Email</label>
  <input type="email" [(ngModel)]="email">
</div>

This is nice, clean and simple.   However, it lacks the validations required for a good quality production application.  Let’s add good interactivity and validation to this template.  We will be using Angular 2 forms for this.

<div class="form-group">
  <label for="email" [ngClass]="{'ui-state-error': !form.find('email').valid && (model.ID || !form.find('email').pristine), 'required': f.required}">
    <span class="required">*</span>Email
  </label>
  <input type="email" [(ngModel)]="email"
      [ngClass]="{'ui-state-error': !form.find('email').valid && (model.ID || !form.find('email').pristine)}">
</div>

Personally, I feel this makes the template virtually unmaintainable (just imagine a form with 10 or 20 fields).  You could of course move some of the ngClass objects into the component itself, but then you are just moving the mess around.


So let's add a layer of abstraction.  Angular 2's great component model makes this fairly easy.  Of course, with every abstraction you lose flexibility and introduce a critical piece of code that may turn into a maintenance bottleneck.  However, I chose to do this because I found the forms boilerplate just too messy.  Basically this wrapper makes forms programmatic rather than declarative, so a form definition now looks like this:

this.fields = [
  {id: 'ID', name: 'Entity ID', type: 'number', disabled: true},
  {id: 'EntityName', name: 'Entity Name', type: 'text',
      required: true, autofocus: true},
  {id: 'Email', name: 'Email', type: 'email', required: true}
];

This is not great; we are losing a lot of flexibility here, which is why I say above that this is not a project in its own right, simply a template for you to start with, as you have to ensure it meets your own purposes.

Some things to note:

  • I am using a tabbed form to allow for bigger forms. If you omit the tab: ‘Name’ attribute then tabs are not shown
  • I am using [PrimeNG](http://www.primefaces.org/primeng/) as my component library; it can be replaced with anything.
  • I am using hard coded custom elements; ie: type: ‘custom1’ … type: ‘custom5’.  The reason for this ugly hack is this [Angular 2 limitation](https://github.com/angular/angular/issues/8563)
  • If a form contains a custom element that eventually shows another form (such as a dialog that shows another form) you may run into issues.  Recursive components cause problems in Angular 2 RC1.


The edit-form Component code:

import {Input, Component, OnInit, CORE_DIRECTIVES, 
    InputText, Password, Button, InputTextarea, Calendar, 
    Dropdown, Checkbox, Dialog, MultiSelect, TabView, TabPanel, 
    Autofocus, Helpers} from '../common';
import {Control} from '@angular/common';
import {ControlGroup, FormBuilder, Validators} from '@angular/common';
import {ColorPickerDirective} from '../lib/color-picker/color-picker.directive';

  selector: 'edit-form',
  templateUrl: 'app/misc/edit-form.html',
  styleUrls: ['app/misc/edit-form.css'],
  directives: [CORE_DIRECTIVES, InputText, Password, Button, InputTextarea, 
    Calendar, Dropdown, MultiSelect, Checkbox, Dialog, TabView, TabPanel,
    Autofocus, ColorPickerDirective]
export class EditFormComponent implements OnInit {
  @Input() public fields: IField[];
  @Input() public class: string;
  @Input() public model: any;
  @Input() public formValidator: any;
  public tabs: string[] = [];
  public form: ControlGroup;

  constructor(private fb: FormBuilder) {}

  ngOnInit() {
    if (!this.fields || !this.fields.length) {
      throw new Error('no fields specified for this edit-form');
    }
    const group = {};
    const hastabs = !!Helpers.find(this.fields, (f: IField) => f.tab);
    let lasttab = this.fields[0].tab;
    if (hastabs && !lasttab) {
      throw new Error('If tabs are specified then the first field must have a tab');
    }
    this.fields.forEach((f: IField) => {
      const fieldopts: any[] = [this.defaultval(f) || ''];
      let validators: any[] = [];
      if (hastabs) {
        if (!f.tab) { f.tab = lasttab; }
        lasttab = f.tab;
        if (this.tabs.indexOf(f.tab) < 0) { this.tabs.push(f.tab); }
      } else {
        this.tabs = [''];
      }
      if (f.required) {
        validators.push((c: Control) => this.visible(f) ? Validators.required(c) : null);
      }
      if (f.validators) { validators = validators.concat(f.validators); }
      if (validators.length === 1) { fieldopts.push(validators[0]); }
      if (validators.length > 1) { fieldopts.push(Validators.compose(validators)); }
      group[f.id] = fieldopts;
    });
    this.form = this.fb.group(group, {validator: (g: ControlGroup) =>
      this.formValidator ? this.formValidator(g) : null});
  }

  visiblefields(tab: string): IField[] {
    return this.fields.filter((f: IField) => (!f.tab || f.tab === tab) && this.visible(f));
  }

  classes(f: IField): any {
    return { 'form-heading': f.type === 'heading', 'form-group': f.type !== 'heading' };
  }

  options(f: IField) {
    if (typeof(f.options) === 'function') { return f.options(); }
    if (typeof(f.options.length) === 'number') { return f.options; }
    return Object.keys(f.options).map(k => { return { value: f.options[k], label: f.options[k] }; });
  }

  onchange(f: IField) {
    if (f.onchange) { f.onchange(); }
  }

  defaultval(f: IField) {
    if (typeof(f.default) === 'function') { return f.default(); }
    return f.default;
  }

  visible(f: IField) {
    if (typeof(f.visible) === 'undefined') { return true; }
    if (typeof(f.visible) === 'boolean') { return f.visible; }
    return f.visible();
  }

  geterror(f?: IField): string {
    const errors = f ? this.form.find(f.id).errors : this.form.errors;
    if (!errors) {
      if (!f) {
        const controls = this.form.controls;
        const ids = Object.keys(controls).filter(id => !controls[id].valid);
        let message = '';
        ids.forEach(id => {
          const field = Helpers.find(this.fields, (f2: IField) => f2.id === id);
          message += (field.name || field.id) + ': ' + this.geterror(field) + '\n';
        });
        return message;
      }
      return 'Please correct the form errors.';
    }
    let message = '';
    Object.keys(errors).forEach(e => {
      if (!errors[e]) { return; }
      if (message) { message += '\n'; }
      let msg = errors[e];
      if (msg === true) {
        msg = e === 'required' ? 'Field is required.' : e;
      }
      message += msg;
    });
    return message;
  }
}

export interface IField {  
  // generic
  id: string;
  type: string;

  hideLabel?: boolean;
  tab?: string;
  autofocus?: boolean;
  visible?: any;
  name?: string;
  disabled?: boolean;  
  default?: any;
  required?: boolean;
  validators?: Function[];

  // p-dropdown options  
  filter?: boolean;
  options?: any;
  onchange?: Function;
}

The edit-form template code:

<div *ngIf="fields && model">
  <form [ngFormModel]="form" [class]="class">
    <div *ngIf="!form.valid && (model.ID || !form.pristine)" 
    <div [ngClass]="{'hide-tabs': tabs.length <= 1}" class="ui-grid">
        <p-tabPanel *ngFor="let t of tabs" [header]="t">
          <div *ngFor="let f of visiblefields(t)" [ngClass]="classes(f)"  class="ui-grid-row">      
            <div *ngIf="!f.hideLabel" class="ui-grid-col-3">
              <label *ngIf="f.name || f.id"
                  [ngClass]="{'ui-state-error': !form.find(f.id).valid && (model.ID || !form.find(f.id).pristine), 'required': f.required}">
                <span *ngIf="f.required">*</span>
                {{f.name || f.id}}
            <div [ngSwitch]="f.type" [ngClass]="{'ui-grid-col-9': !f.hideLabel, 'ui-grid-col-12': f.hideLabel}">
              <span *ngSwitchWhen="'custom1'"><ng-content select="custom1"></ng-content></span>
              <span *ngSwitchWhen="'custom2'"><ng-content select="custom2"></ng-content></span>
              <span *ngSwitchWhen="'custom3'"><ng-content select="custom3"></ng-content></span>
              <span *ngSwitchWhen="'custom4'"><ng-content select="custom4"></ng-content></span>
              <span *ngSwitchWhen="'custom5'"><ng-content select="custom5"></ng-content></span>
              <span *ngSwitchWhen="'heading'">            
                <h3 class="edit-form-heading">{{defaultval(f)}}</h3>
              <span *ngSwitchWhen="'multi'">          
                    [ngClass]="{'ui-state-error': !form.find(f.id).valid && (model.ID || !form.find(f.id).pristine)}">
              <span *ngSwitchWhen="'dropdown'">          
                <p-dropdown *ngIf="options(f).length"
                    [ngClass]="{'ui-state-error': !form.find(f.id).valid && (model.ID || !form.find(f.id).pristine)}">
              <span *ngSwitchWhen="'new-password'">
                <input class="form-control" 
                    [ngClass]="{'ui-state-error': !form.find(f.id).valid && (model.ID || !form.find(f.id).pristine)}">
              <span *ngSwitchWhen="'colour'">
                <input pInputText [(colorPicker)]="model.CommercialStatusColour"
                    [ngClass]="{'ui-state-error': !form.find(f.id).valid && (model.ID || !form.find(f.id).pristine)}">
              <span *ngSwitchWhen="'date'">
                    [ngClass]="{'ui-state-error': !form.find(f.id).valid && (model.ID || !form.find(f.id).pristine)}">
              <span *ngSwitchWhen="'textarea'">
                    [ngClass]="{'ui-state-error': !form.find(f.id).valid && (model.ID || !form.find(f.id).pristine)}">
              <span *ngSwitchWhen="'boolean'">
                    [ngClass]="{'ui-state-error': !form.find(f.id).valid && (model.ID || !form.find(f.id).pristine)}">
              <span *ngSwitchDefault>                   
                <input *ngIf="f.autofocus" 
                    [ngClass]="{'ui-state-error': !form.find(f.id).valid && (model.ID || !form.find(f.id).pristine)}">
                <input *ngIf="!f.autofocus" 
                    [ngClass]="{'ui-state-error': !form.find(f.id).valid && (model.ID || !form.find(f.id).pristine)}">
              <div [hidden]="form.find(f.id).valid || form.find(f.id).pristine" class="ui-message-error">
                {{ geterror(f) }}


To use the forms wrapper above, just do the following in your template:

 <edit-form [fields]="fields" [model]="model" #form>
 </edit-form>


import {Component, OnInit, CORE_DIRECTIVES, EditFormComponent} 
  from '../common';

@Component({
  templateUrl: 'app/user-edit.html',
  styleUrls: ['app/user-edit.css'],
  directives: [CORE_DIRECTIVES, EditFormComponent],
  selector: 'user-edit'
})
export class UserEditComponent implements OnInit {
  public fields: IField[];
  public model: any;

  constructor(private data: DataService) {
    this.fields = [
      {id: 'UserName', name: 'Username', type: 'text', required: true, autofocus: true},
      {id: 'Email', type: 'email', required: true},
      {id: 'Company', type: 'text', required: true},
      {id: 'Password', type: 'new-password', required: this.isadd() },
      {id: 'Claims', name: 'Role', type: 'dropdown', required: true, options: this.claims},
      {id: 'IsActive', name: 'Is Active', type: 'boolean'},
      {id: 'NumLogins', name: 'Number of Logins', type: 'number', disabled: true}
    ];
  }

  ngOnInit() {
    this.data.getUser().subscribe((user: any) => this.model = user);
  }
}

Custom Elements

If you ever need a custom element, simply do the following in your template:

<edit-form [fields]="fields" [model]="model" #form>
  <custom1>
    This can be anything, however careful if you are loading 
    another edit-form in one of the children here.
  </custom1>
  <custom2>
    Another custom element, look a button: <button>Wow</button>
  </custom2>
</edit-form>


this.fields = [
   // Will "transclude" custom1 contents
   {id: 'HardToMakeGenericField', name: 'Custom Field', type: 'custom1'},
   // Will "transclude" custom2 contents
   {id: 'HardToMakeGenericField2', name: 'Custom Field 2', type: 'custom2'}
];


I am not a big fan of abstracting frameworks with custom code.  This code usually ends up being the main maintenance bottleneck in complex systems.  However, sometimes complexity in the framework means that an abstraction is called for.  I leave it to you to decide whether this is the case with Angular 2 forms, and if you choose to use a form builder like the one in this post, I hope this helps you achieve that goal.

My experience so far with Angular 2

I know it’s still early in the piece for ng2, but to date we have worked on 3 Angular 2 projects.  These projects range from very small (5-10 pages/routes) to medium (50-70 pages/routes).  We started on beta 1 and we are now at RC1; here is my experience so far:

Bad Parts

CSS Frameworks

CSS frameworks are still catching up.  I ended up using PrimeNG from PrimeFaces, which is great, but it would have been nice to have the option of using Bootstrap or Angular Material.

Router Issues

The ng2 router (now called router-deprecated) was a little buggy.  We ended up having to wrap all calls to router.navigate(…) in a setTimeout to avoid this bug.  This was painful, as tracking down any such core bug takes a while.  I think this is now fixed in RC1 but cannot currently confirm.

I was a bit shocked when I saw in RC1 that the router had been deprecated.  I tried upgrading but this was a total no-go as it is massively breaking.  Especially since I am using a custom outlet to handle authentication.

I’m not looking forward to upgrading the router.  This one upset me a little, there are such good routers out there, angular-ui, ember, durandal. Why re-invent the wheel?  Routing is a hard issue, but it’s an issue that’s been solved.  It’s like when Microsoft re-implemented jQuery (with their silly ajax libraries that lasted about 2 months) or NHibernate (with EF, which took about 5 versions to become usable). No good.

Inheritance Issues

There are lots of issues with Component inheritance.  Here are some that have caused us headaches:

EventEmitters from ancestors sometimes do not fire correctly

Not sure why, but I ended up having to re-declare (override) some of my EventEmitters in descendants to avoid this very hard-to-find bug.

@Component attributes/decorators not inherited

Trying to tidy up @Component definitions by pushing up the hierarchy chain does not work.  This may be a typescript limitation.

Dependency injection does not work on inheritance hierarchies

You actually have to manually pass all constructor params to ancestors, even if you do not change the constructor signature; i.e. you must add an otherwise pointless constructor just to pass parameters to the parent.

Recursive Container Components

If a component can have ng-content, and that content can also contain the container component, it can cause issues; the page sometimes does not load.  This was a nasty issue to identify, and an ugly hack was required: we actually copy/pasted the entire component and renamed it.  Other than the name it was identical, and this “solved” the issue.

Error Messages

Error messages and stack traces are horrible. Clicking into the angular source is a no-go as you usually end up in totally irrelevant parts of the code (usually Zone.js or polyfills code).  So trying to track down issues is a manual and slow process.

Forms are messy

To utilise the great forms functionality you need a lot of boiler plate in both your template and component code.  Fortunately it is not hard to wrap this code in your own abstraction but you lose a lot of functionality doing so.

Still, the templates when using forms (especially validation) are really ugly, with lots of ngFormModel, form.pristine, form.valid, form.find(‘controlid’), form.find(‘controlid’).pristine, etc. tags everywhere.

CSS Encapsulation

CSS encapsulation can lead to headaches.  I found myself adding a non-encapsulated global.scss to get around component boundaries when required.

Minification / Production

We still have not fully solved this; however, we have delayed it on purpose, waiting for the story to improve as Angular 2 progresses towards its final release.


This one is a little scary: I have to deliver an IE9-compatible system, but it does not appear that IE9 currently works.  I need to spend more time on this one; currently there is no error, just a blank page, so I am not sure where I’m going wrong.

Good Parts


When you are not commenting out code trying to identify a new bug after upgrading to beta x, ng2 is actually really, really productive.  Much more so than ng1.  By the time we got to the third system, we were burning along.

The primary reason for this productivity is the great component syntax.  NG1 tried to have a good component model using directives, but failed.  I feel they got it right in ng2, and it makes a huge difference.

Typescript / Decorators

Works great!  We also use TypeScript for our ng1 projects, but decorators bring this to a new level.  Hopefully they clean up the need to define providers, directives and pipes, but other than that it’s much cleaner.


I’m glad we made the decision to write this new batch of projects using Angular 2.  It was perhaps a little early, but we learnt a lot.  Now that things appear to be settling down, I hope we can start being even more productive and stop trawling through ng source code, which is not very nice.

Review of Keras (Deep Learning) Core Layers


This is the first part in a planned series of posts which aims to explore the core layers in the Keras source code.  These posts take a practical, non-theoretical approach, using code samples to demonstrate real usages of the Keras layers being investigated.


All the code in this post requires the following imports and debug functions:

import numpy as np
from keras.layers.core import *
from keras import backend as K

def call_f(inp, method, input_data):
  f = K.function([inp], [method])
  return f([input_data])[0]

def print_out(layer, input_data, train=True):
  if hasattr(layer, 'previous'):
    print call_f(layer.previous.input,
        layer.get_output(train=train), input_data)
  else:
    print call_f(layer.input, layer.get_output(train=train), input_data)


Masking

The masking layer sets output values to 0 when the entire last dimension of the input is equal to the mask_value (default value 0).  This layer expects a 3-dimensional input tensor with the shape: (samples, timesteps, features).

For example let’s call a Masking layer with a 3D tensor with two rows of data:

print_out(Masking(mask_value=1), [[[1, 1, 0], [1, 1, 1]]])
# [[[ 1.  1.  0.], [ 0.  0.  0.]]]

Notice how only the last row gets masked as this was the only row with its entire content matching the mask_value of 1.
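The masking rule above can also be reproduced with plain numpy (a sketch of the arithmetic only, not the Keras API):

```python
import numpy as np

x = np.array([[[1., 1., 0.], [1., 1., 1.]]])  # (samples, timesteps, features)
# A timestep row survives if any feature differs from the mask_value (here 1).
keep = np.any(x != 1, axis=-1, keepdims=True)
masked = x * keep
print(masked)  # the [1, 1, 1] row becomes [0, 0, 0]
```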

Masking is the simplest implementation of MaskedLayer, the abstract base class that masking layers can extend to inherit some boilerplate code.  The Masking layer itself can also be extended to support more advanced masking.  For instance, let’s create a masking layer that masks rows whose values all reach a given threshold.

class CustomMasking(Masking):   
  def get_output_mask(self, train=False):
    X = self.get_input(train)
    return K.any(K.ones_like(X) * (1. -
      K.equal(K.minimum(X, self.mask_value), 
        self.mask_value)), axis=-1)

  def get_output(self, train=False):
    X = self.get_input(train)
    return X * K.any((1. - K.equal(
      K.minimum(X, self.mask_value), 
        self.mask_value)), axis=-1, keepdims=True)

print_out(CustomMasking(mask_value=5),
  [[[3, 4, 5], [5, 6, 7], [5, 5, 5]]])
# [[[ 3.  4.  5.], [ 0.  0.  0.], [ 0.  0.  0.]]]



Dropout

Dropout layers are used to reduce overfitting by randomly turning off inputs.  It is important to note that dropout only occurs during training; during the test phase we do not turn off inputs.  It is also very important to note that the output values propagated forward (i.e. not turned off) must increase in value to compensate for the nodes being turned off, so that the expected output value of the layer is the same with or without dropout.  The following simple example shows this a little more intuitively:

print_out(Dropout(.3), [1, 2, 3, 4, 5])
# [0,0,0,5.71428585,7.14285755]

So with 30% dropout we see that 3 output nodes were turned off (set to 0).  To compensate, all the other values were scaled up accordingly (the dropout is probabilistic, so the number of dropped nodes may not exactly match the rate).
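The inverted-dropout scaling described above can be sketched in plain numpy (an illustration of the mechanics, not Keras's implementation; the mask is random, so which values survive will vary):

```python
import numpy as np

def inverted_dropout(x, rate, rng):
  # Drop roughly `rate` of the inputs and scale the survivors by
  # 1/(1 - rate) so the layer's expected output value is unchanged.
  mask = rng.random(x.shape) >= rate
  return x * mask / (1.0 - rate)

rng = np.random.default_rng(0)
x = np.array([1., 2., 3., 4., 5.])
y = inverted_dropout(x, 0.3, rng)
# Each surviving value equals its input divided by 0.7; the rest are 0.
```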

To tune dropout layers, Hinton suggests training without dropout until good layer settings are found, then slowly increasing dropout until the optimal validation score is found for the layer.


Activations

An activation function produces the layer output values by applying an arbitrary function to the input values of the layer.  This function should have a useful derivative, as this is used during the optimisation (backward) step of training.  There are many standard activation functions used in neural networks; a great visual summary of the common ones can be found at the bottom of the Activation function Wikipedia page.

The activation function specified in this layer is applied to each input element individually (element wise) so input data dimensions can be arbitrary.

print_out(Activation('tanh'), [.5, 1, 2, 3])
# [0.46211714,0.76159418,0.96402758,0.99505478]
print_out(Activation('softplus'), [.5, 1, 2, 3])
# [ 0.97407699  1.31326163  2.12692809  3.04858732]
print_out(Activation('relu'), [-2, -1, 0, 1, 2])
# [ 0.  0.  0.  1.  2.]
print_out(Activation('sigmoid'), [.5, 1, 2, 3])
# [ 0.62245935  0.7310586   0.88079709  0.95257413]
print_out(Activation('hard_sigmoid'), [.5, 1, 2, 3])
# [ 0.60000002  0.69999999  0.89999998  1.        ]
print_out(Activation('linear'), [.5, 1, 2, 3])
# [ 0.5  1.   2.   3. ] – no weights set
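The outputs above can be checked against the standard formulas with plain numpy (my own verification of the arithmetic, not the Keras API):

```python
import numpy as np

x = np.array([.5, 1., 2., 3.])
tanh = np.tanh(x)                 # matches Activation('tanh') above
softplus = np.log(1 + np.exp(x))  # matches Activation('softplus')
sigmoid = 1 / (1 + np.exp(-x))    # matches Activation('sigmoid')
relu = np.maximum(0, np.array([-2., -1., 0., 1., 2.]))  # matches Activation('relu')
```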



Reshape

The reshape layer reshapes the input to a new shape.  The total number of elements, however, must remain the same.

print_out(Reshape(dims=(2,-1)), [[1, 2, 3, 4, 5, 6]])
# [[[ 1.  2.  3.], [ 4.  5.  6.]]]
print_out(Reshape(dims=(3,-1)), [[1, 2, 3, 4, 5, 6]])
# [[[ 1.  2.],[ 3.  4.],[ 5.  6.]]]
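The same reshapes can be expressed in plain numpy, keeping the leading sample dimension (a sketch of the arithmetic; -1 asks numpy to infer the remaining dimension, just as in the dims above):

```python
import numpy as np

x = np.array([[1., 2., 3., 4., 5., 6.]])  # shape (1, 6): one sample, six values
a = x.reshape(1, 2, -1)  # like Reshape(dims=(2, -1)): shape (1, 2, 3)
b = x.reshape(1, 3, -1)  # like Reshape(dims=(3, -1)): shape (1, 3, 2)
```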



Permute

To permute the dimensions of a tensor means rearranging those dimensions.  So, say we wanted to transpose a matrix, we would do something like:

print_out(Permute(dims=(2,1)), [[[1, 2, 3],[4, 5, 6]]])
# [[[ 1.  4.], [ 2.  5.], [ 3.  6.]]]



Flatten

Flattens rows of a 3D matrix:

print_out(Flatten(), [[[1, 2, 3],[4, 5, 6]]])
# [[ 1.  2.  3.  4.  5.  6.]]


RepeatVector

Copies a 2D input matrix into a 3D matrix n times.

print_out(RepeatVector(2), [[1, 2, 3]])
# [[[ 1.  2.  3.], [ 1.  2.  3.]]]



Dense

A dense layer is a standard fully connected NN layer.  Let’s start with some sample source code:

d = Dense(3, init='uniform', activation='linear', input_dim=3)
d.set_weights([np.array([[.1, .2, .5], [.1, .2, .5], [.1, .2, .5]]), 
  np.array([0, 0, 0])])
print_out(d, [[10, 20, 30]])
# [[  6.  12.  30.]]


We see that the input [10, 20, 30] got converted to [6, 12, 30] using a linear activation and the weights [.1, .2, .5] out of each input node.  So taking the last output node, whose incoming weights are all 0.5, we get the output (30) by calculating: 10*.5 + 20*.5 + 30*.5. This can be visualised as follows:

Where orange, blue and green arrows are weights of 10%, 20% and 50% respectively.
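The calculation can be verified with a matrix multiply in plain numpy (my own check of the arithmetic above, not the Keras API):

```python
import numpy as np

W = np.array([[.1, .2, .5],   # weights out of input node 1
              [.1, .2, .5],   # weights out of input node 2
              [.1, .2, .5]])  # weights out of input node 3
b = np.array([0., 0., 0.])
x = np.array([[10., 20., 30.]])
out = x @ W + b  # linear activation: just the weighted sums, approximately [[6. 12. 30.]]
```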


TimeDistributedDense

A very similar layer to the standard Dense layer, with the exception that we are now working with an additional time dimension.  So the input and output are in the shape: (nb_sample, time_dimension, input_dim).  Reproducing the Dense example we get the following:

d = TimeDistributedDense(3, init='uniform', 
  activation='linear', input_dim=3)
d.set_weights([np.array([[.1, .2, .5], [.1, .2, .5], 
  [.1, .2, .5]]), np.array([0, 0, 0])])
print_out(d, [[[10, 20, 30]]])
# [[[  6.  12.  30.]]]



Merge

Merges the output of multiple layers.  This is used when a Graph model needs to recombine branches into a single trunk, or when multiple models need to be combined into one.  The following strategies are supported: sum, mul, concat, ave, dot.

No concise code example could be produced.
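Since no concise Keras example fits here, the merge strategies can at least be illustrated arithmetically with plain numpy (a sketch of the modes only, not the Merge API; `dot` depends on the configured axes so it is omitted):

```python
import numpy as np

a = np.array([[1., 2., 3.]])   # output of branch one
b = np.array([[4., 5., 6.]])   # output of branch two

merged_sum = a + b                               # mode='sum'
merged_mul = a * b                               # mode='mul'
merged_concat = np.concatenate([a, b], axis=-1)  # mode='concat'
merged_ave = (a + b) / 2                         # mode='ave'
print(merged_concat)  # [[1. 2. 3. 4. 5. 6.]]
```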



TimeDistributedMerge

Converts a 3D TimeDistributed layer output into a 2D output, with time steps merged using one of the following strategies: sum, mul, ave.

print_out(TimeDistributedMerge(mode='sum'), [[[1, 2, 3], [1, 2, 3]]])
# [[ 2.  4.  6.]]
print_out(TimeDistributedMerge(mode='mul'), [[[1, 2, 3], [1, 2, 3]]])
# [[ 1.  4.  9.]]
print_out(TimeDistributedMerge(mode='ave'), [[[1, 2, 3], [1, 2, 3]]])
# [[ 1.  2.  3.]]



ActivityRegularization

ActivityRegularization is simply a wrapper around keras.regularizers.ActivityRegularizer, which applies regularisation to a loss function.  We will only briefly explore this here, as regularisation will be the subject of another post in the near future.

r = ActivityRegularizer(l1=.01)
r.layer = Layer()
r.layer.input = np.array([1, 2, 3, 4, 5])
r(0)  # calling the regularizer with a zero loss returns just the penalty
# array(0.029999999329447746)

r = ActivityRegularizer(l2=.01)
r.layer = Layer()
r.layer.input = np.array([1, 2, 3, 4, 5])
r(0)
# array(0.1099999975413084)

r = ActivityRegularizer(l1=.01, l2=.01)
r.layer = Layer()
r.layer.input = np.array([1, 2, 3, 4, 5])
r(0)
# array(0.13999999687075615)
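The numbers above can be reproduced by hand. Assuming the Keras 0.x formulation (mean over the batch axis, then sum), the penalties are:

```python
import numpy as np

x = np.array([1., 2., 3., 4., 5.])

# l1 penalty: 0.01 * sum(mean(|x|, axis=0)) -> 0.01 * 3 = 0.03
l1 = .01 * np.sum(np.mean(np.abs(x), axis=0))
# l2 penalty: 0.01 * sum(mean(x^2, axis=0)) -> 0.01 * 11 = 0.11
l2 = .01 * np.sum(np.mean(np.square(x), axis=0))
# combined: 0.03 + 0.11 = 0.14
```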



An autoencoder is an unsupervised neural net that aims to produce output that is similar to its input data.  This allows the net to learn features about the data, and regularisation parameters, without using labels.  It means the output of the last layer is the same size as the input of the first layer.  Scoring becomes simple, as each input row can be compared to the produced output to measure similarity.

The autoencoder has 2 logical parts: the encoder, which is the layers of the net that create a hidden representation of the input data; and the decoder, which is the layers of the net that take the representation produced by the encoder and create output that should match the input to the encoder.  A benefit of using autoencoders is that if the hidden representation of the data is smaller than the input data then we have basically compressed the data (dimensionality reduction).

No concise and descriptive code sample possible.
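Though a full training loop is beyond a short snippet, the shape contract described above can be sketched with hypothetical linear encoder/decoder weights in NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.normal(size=(4, 5))      # 4 samples, 5 features
W_enc = rng.normal(size=(5, 2))  # encoder: squeeze 5 features down to 2
W_dec = rng.normal(size=(2, 5))  # decoder: expand back to 5

hidden = x @ W_enc               # the compressed (hidden) representation
output = hidden @ W_dec          # the reconstruction, same shape as the input

# training would minimise the reconstruction error against the input itself
reconstruction_error = np.mean((output - x) ** 2)
print(hidden.shape, output.shape)  # (4, 2) (4, 5)
```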


Lambda creates a layer that performs an arbitrary Python function over the layer’s input data:

print_out(Lambda(lambda x: x*x), [1, 2, 3])
# [ 1.  4.  9.]


A Siamese layer is very similar to a Merge layer, with one difference: a Siamese layer can merge output from multiple layers anywhere in a net, not just join branches.

Upgrading from Angular 2 beta 17 to Angular 2 RC1

Upgrading through the Angular 2 betas has been a bit of a pain, with lots of breaking changes, but it seemed manageable.  However, upgrading to RC1 was bad…  So in the hope of helping others, here is my step by step guide.  Please note: this works on my project; if you have other dependencies that I do not use, it may not work for you.



To upgrade package.json, remove "angular2": "^2.0.0-beta.17" and replace it with:

"@angular/common": "2.0.0-rc.1",
"@angular/compiler": "2.0.0-rc.1",
"@angular/core": "2.0.0-rc.1",
"@angular/http": "2.0.0-rc.1",
"@angular/platform-browser": "2.0.0-rc.1",
"@angular/platform-browser-dynamic": "2.0.0-rc.1",
"@angular/router": "2.0.0-rc.1",
"@angular/router-deprecated": "2.0.0-rc.1",
"@angular/upgrade": "2.0.0-rc.1"


Also ensure these dependencies are at compatible versions:

"es6-promise": "^3.0.2",
"es6-shim": "^0.35.0",
"reflect-metadata": "0.1.2",
"rxjs": "^5.0.0-beta.6",
"systemjs": "^0.19.8",
"zone.js": "^0.6.12"


Delete your node_modules directory

Run npm install to set up new dependencies



Update your build tool to copy node_modules/@angular/**/index.js to your application directory.  If you read your libraries straight from node_modules then this step is not required.



I use index.html to set up my system.js so here is how to get going:

Remove old angular2 scripts, mine were:

  <script src="lib/angular2.dev.js"></script> <!-- TODO: angular2.min.js does not work -->
  <script src="lib/router.min.js"></script>
  <script src="lib/http.min.js"></script>


And this is my System.js configuration:


packages: {
  'app'                              : {format: 'register', defaultExtension: 'js'},
  '@angular/core'                    : {defaultExtension: 'js', main: 'index.js'},
  '@angular/common'                  : {defaultExtension: 'js', main: 'index.js'},
  '@angular/compiler'                : {defaultExtension: 'js', main: 'index.js'},
  '@angular/router'                  : {defaultExtension: 'js', main: 'index.js'},
  '@angular/router-deprecated'       : {defaultExtension: 'js', main: 'index.js'},
  '@angular/http'                    : {defaultExtension: 'js', main: 'index.js'},
  '@angular/platform-browser'        : {defaultExtension: 'js', main: 'index.js'},
  '@angular/platform-browser-dynamic': {defaultExtension: 'js', main: 'index.js'}
},
map: {
  '@angular/core'                    : 'lib/@angular/core',
  '@angular/common'                  : 'lib/@angular/common',
  '@angular/compiler'                : 'lib/@angular/compiler',
  '@angular/router'                  : 'lib/@angular/router',
  '@angular/router-deprecated'       : 'lib/@angular/router-deprecated',
  '@angular/http'                    : 'lib/@angular/http',
  '@angular/platform-browser'        : 'lib/@angular/platform-browser',
  '@angular/platform-browser-dynamic': 'lib/@angular/platform-browser-dynamic'
}




Now the fun part, let’s fix all the broken angular imports:

import {bootstrap} from 'angular2/platform/browser' -> import {bootstrap} from '@angular/platform-browser-dynamic'

import … from 'angular2/router' -> import … from '@angular/router-deprecated';

You can safely do a search and replace for "from 'angular2/" -> "from '@angular/"



A note on router

I am using router-deprecated as changing to the new router is evil and beyond the scope or sanity of this document and developer.

For the deprecated router to work you still need to add the following:

Ensure you have this as one of your application providers: provide(APP_BASE_HREF, { useValue: '/' }).  Example:

import {APP_BASE_HREF} from '@angular/common';
import {Component, provide} from '@angular/core';
import {ROUTER_PROVIDERS, ROUTER_DIRECTIVES} from '@angular/router-deprecated';

@Component({
  selector: 'app',
  templateUrl: 'app/layout/app.html',
  directives: [ROUTER_DIRECTIVES],
  providers: [ROUTER_PROVIDERS, provide(APP_BASE_HREF, { useValue: '/' })]
})


Hope the pain in the last few hours at least helps someone out there.  And please do not try to upgrade to the new router yet.  It is nasty!

PredictBench successfully predicts product classifications for one of the world’s largest ecommerce and FMCG companies

As any large FMCG (CPG) is aware, classifying products correctly is critical to having good analytics capabilities.  It is also clear to any global organisation that this is a surprisingly difficult task to achieve.  Most regions use different classifications and combining them on a global scale is a non-trivial task.  It is such a difficult task that many organisations simply ignore it and miss out on potential insights from a global view of product sales.

The Otto Group is a German ecommerce company that sells tremendous amounts of goods and they recently released a dataset to address this exact issue.  Given details of over 200,000 products it was the data scientist’s job to correctly distinguish between Otto’s main product categories.

We used PredictBench to tackle this job and we had amazing accuracy in classification. In fact the PredictBench team was able to come within 0.021 points of the optimal solution which was no mean feat, beating out around 3500 teams from around the world.  Our final position in this challenge was 16th (out of 3514).

On this project we teamed up with American data scientist Walter Reade who brought invaluable experience and knowledge to the PredictBench team.

Working together with Walter we were able to put together an ensemble of hundreds of models, including linear models, neural nets, deep convolutional nets, tree-based ensembles (random forests and gradient boosted trees) and many others.  The huge scale of the final solution shows how incredibly complex this problem was and the skills that were required to achieve such amazing results.

Working with the Otto data and teaming up with Walter was a great boon to the PredictBench team and we hope to replicate this in the near future.

Machine Learning for the FMCG Industry

A detailed observation on the potential benefits of using modern Machine Learning technologies in the FMCG vertical

Executive Summary

The unique characteristics of the FMCG industry make it an ideal candidate for Machine Learning and associated technologies. These characteristics include very large volumes of transactions and data, and a large number of data sources that influence projections. They mean that traditional analytics technologies struggle with the volume and complexity of the data, which is exactly where Machine Learning is best suited. Most areas of the industry are candidates for optimisation, including improving the effectiveness of marketing campaigns, increasing the performance of the sales team, optimising the supply chain and streamlining manufacturing. The FMCG industry has been relatively slow to adopt these cutting edge technologies, which gives an early entrant an opportunity to strongly outperform its competitors.



FMCG Introduction

FMCG (Fast Moving Consumer Goods) refers to organisations that sell products in large quantities. These products are usually inexpensive, the volumes sold are large and the products often have a short shelf life. Profits on individual items are very small and large volumes are required to have a viable business. These characteristics offer many challenges and also many opportunities.

This paper investigates these challenges and opportunities in detail and focuses on the use of Machine Learning technologies to optimise processes to increase profits for FMCG companies.

Machine Learning Introduction

The following list should serve as a refresher when thinking about Machine Learning vs traditional analytics and business intelligence:

1. Unstructured Data

Modern Big Data technologies and advanced machine learning algorithms can analyse data in any format, such as images, videos, text, emails, social media messages, server logs, etc., whereas traditional analytics can only analyse structured data in databases.

2. Combine Data

Modern technologies allow us to quickly merge datasets and form rich data collections that combine internal company data with external public data sets. This allows the data scientist to enrich sales and marketing data with, for instance, government socio-demographic statistics. Traditional analytics is usually performed on data silos, and when data sets are combined this is usually done at huge expense by building data warehouses, which still usually contain only internal company data.

3. Future vs Past

Machine Learning is often called predictive analytics, as one of its major use cases is to predict the future. Advanced machine learning algorithms will ingest all your data and find patterns that can then be used to make accurate inferences about the future. These predictions are qualified with an accuracy metric so management can make intelligent decisions based on them. Traditional analytics rarely tries to infer future events and deals only with explaining and visualising past events.

4. Answers vs Reports

Using the predictive power of machine learning, management can start asking smart questions from their data. Questions such as:

  • What is the optimal marketing campaign to increase market awareness for product X?
  • How many of product Y should we produce to reduce oversupply next winter season?
  • Which sales rep should we use to manage our new customer to maximise potential profit?

This is very different from existing business intelligence suites which usually deliver dry reports or charts which are very often misinterpreted.

5. Speed of delivery

Traditional analytics / business intelligence implementations can take years to complete. They are intrusively integrated into an organisation’s IT and as such move very slowly. Modern machine learning technologies allow management to get answers from their data quickly and efficiently. A simple question can be answered in weeks, not years.

6. Machine analysis vs human interpretation

Machine Learning uses advanced computer algorithms to analyse unlimited quantities of data. This analysis is done impartially, free from the biases that are common in manual analysis. The outputs from these algorithms are also very easy to interpret and leave very little room for misrepresentation, making them objective and quantifiable tools for decision making.

Machine Learning in FMCG

The FMCG (Fast Moving Consumer Goods) industry is an ideal target for Predictive Analytics and Machine Learning. There are several unique attributes of the industry that make this so; these are:

  • The massive volumes involved
  • Access to good quality sales data
  • Short shelf life
  • Current forecasting techniques are relatively inaccurate
  • Current marketing strategies are less than optimal
  • Current manufacturing practices are less than ideal
  • Current supply chain strategies are less than optimal
  • Consumer numbers are very large

We now explore each of these attributes in detail.

1. Large volumes / access to good quality sales data

The number of sales transactions available to modern FMCG organisations is huge. This data can usually be purchased from retailers and is of very high quality. This sales data forms the backbone of any predictive model, as increasing sales should always be the primary objective of any predictive project. Most large FMCG companies also have very good systems in place that record data at every stage of a product’s lifecycle, from manufacturing to delivery to marketing and sales. These systems usually hold very high quality data and require very little data cleansing to be valuable.

Given the enormous volumes of transactions generated by FMCG companies, this data is usually very hard to analyse manually, as it overwhelms most brave analysts. Currently many organisations have not gone beyond basic analysis at a very highly aggregated level, for instance: sales for the week, sales for a store, etc. And where they do drill down deeper into the data, this is usually done by senior analysts with years of experience (and biases) at huge cost.

2. Short shelf life

FMCG products usually have a short shelf life, meaning that the costs of oversupply and over-manufacture can be significant. Given the large volumes of products, any optimisation of the oversupply (or undersupply) problem can result in a very large ROI. The over/under supply problem is again a perfect candidate for machine learning technologies.

3. Sales and marketing

If your goal is to increase sales then accurate sales forecasting is critical. With an accurate forecasting model you can create simulations that allow managers to do quality “what if” analysis. Currently sales forecasting is inaccurate and senior management lack confidence in the numbers. The ability to merge many data sources (sales, marketing, digital, demographics, weather, etc.) greatly improves the quality of sales forecasts compared to traditional predictions, which are usually made on isolated and aggregated sales figures. Once the sales data is merged with the marketing data we can also start making very accurate marketing predictions, answering questions like:

  • Which product should we promote this month?
  • What type of campaign will be most profitable for this product?
  • What consumer segment should we target?
  • How can we get value from our social media data and use current consumer sentiment to create timely marketing campaigns?

4. Manufacturing and supply chain

Most large FMCG have wonderful ERP systems that hold a wealth of hidden value in their data. This data can be used to create models that can answer several critical questions.

  • How can we guarantee on-time delivery?
  • How can we shorten the time to manufacture a product?
  • How can we increase the yield for a product?
  • How can we minimise product returns / complaints?


PredictBench is a product that enables you to get the most value from your data. It is quick and efficient and does not need to involve your IT department. You do not have to understand reporting, statistics or any form of data analysis techniques. You just ask us what questions you want answered and, using the latest Machine Learning technologies, we give you those answers. If you are interested in learning more please feel free to contact me.


Founded in 2002, PicNet has been a leading provider of IT services and solutions to Australian businesses. PicNet helps organisations use technology to increase productivity, reduce costs, minimise risks and grow strategically.

PredictBench set to go global

The official announcement of the Elevate 61 participants was released today.  We are very proud to be included on this list.  Our latest offering, PredictBench, has been recognised as innovative and exciting enough for Advance and KPMG to help us take it to the US!!

This means we will be extremely busy in the coming weeks/months traveling to the US, meeting and presenting PredictBench to companies and potential partners.

Over the next few months PicNet will be showcasing PredictBench in Los Angeles, San Francisco and New York as well as in all major Australian cities.

This is a wonderful opportunity that will help companies around the world take advantage of our PredictBench solution that we have worked very hard to build and are extremely proud of.

What is PredictBench

PredictBench is a solution that helps organisations predict future business events with confidence, based on their own historical data and other influencing factors.  It allows organisations to answer questions such as:

  • What marketing campaign will give me the greatest return on investment?
  • How much of a certain product should we produce to reduce oversupply whilst guaranteeing no undersupply?
  • How can we measure the risk a customer represents?

In the past these technologies have only been available to Silicon Valley research start-ups or corporate giants.  We bring this technology to all corporations and government entities in an affordable and efficient solution that aims to deliver real value for money.

For more information please visit the PredictBench page, watch the short video or download the flyer.