Robert Hook


My personal interests are as varied as my professional ones, and it is these that have brought me to the UK in search of new horizons. I am keenly interested in medieval and renaissance history and culture. This has led me to study historical swordplay, take part in re-enactment, and learn to play early music on a variety of woodwind instruments. I'm a keen woodworker, and have tried my hand at constructing reproduction shoes and clothes. You are more likely to find me in a museum or gallery than a football stadium, and even more likely to find me with my nose buried in a book.

Mixed in with more personal things, you’ll find technical rambles on my somewhat intermittent blog, and you can find out more about me via my personal site.


I am a software engineer with over 29 years' experience across a broad range of industries, and with a correspondingly broad skill set. My passion is creating robust, high-performance software, with a desire to create the right solution first time, every time. I am particularly excited by the problems involved in extracting knowledge from large volumes of data, and in building complex, high-availability, high-performance server-side software.


Technical Skills

Languages
Java, Groovy, Pig, SQL, JavaScript, Unix shell scripting, C, C++, Basic, MUMPS, Pascal, Fortran, Ingres ABF
Frameworks and Platforms
J2EE, JSF, Struts, Spring, Hibernate, EclipseLink, AWS SWF using Flow, Hadoop, AWS EC2 and EMR
Web Technologies
Apache, HTML/XHTML, CSS, JSP, XML, JSON, XSLT, XML Schema, XSL, Servlets, Web Services
Application Servers
JBoss, Glassfish, Jetty, Tomcat, Orion
Operating Systems
HP-UX, Solaris, Linux (RHEL, Ubuntu, CentOS), BSD, MacOS, Windows, MS-DOS, Xenix
Databases
DynamoDB, Cassandra, Oracle, Postgres, Ingres, MySql, AWS RDS, xBase
Networking and Messaging
Java sockets (TCP, UDP), HTTP, JMS, SOAP, REST (Jersey + JSON), JAX-WS, Axis
Tools
Docker, Ansible, Vagrant, Netbeans, Eclipse, IntelliJ, JBuilder, Xcode, Gradle, Spock, Maven, Ant
Methodologies
Agile, Test Driven Development, Kanban, Waterfall, OOD
Version Control
Git, Subversion, Visual SourceSafe


Senior Data Engineer, Think Big Analytics, January 2017 -

Watch this space...

Senior Software Engineer, Camelot Global, October 2015 - November 2016

At Camelot I worked within the Instant Win Game (IWG) team on a set of web services that provide the IWG service. In addition, I designed and built an Event Logging Service that collated and distributed key business events from across the suite of services that comprise the overall platform.

These services were characterised by quite high transaction rates and needed to be extremely reliable. Providing guarantees about the state of customer transactions was as important as providing a very secure system. Part of the strategy around this was to ensure there was a reliable single point of truth (the Cassandra database), addressed by an arbitrary number of stateless, horizontally scalable service instances, married to a solid understanding of the latency between receipt of a request and the point at which a consistent read was available from the data store.
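The consistency concern above can be illustrated with a minimal sketch (hypothetical, not Camelot's actual code): writes are timestamped, and a read is only trusted once an assumed replication settle window has elapsed, mirroring the need to reason about the gap between a write landing and a consistent read becoming available.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch only: a store that records when each write landed,
// so callers can ask whether a read is past the window in which replicas
// may still disagree. The 500ms window is an assumed figure.
class TimestampedStore<K, V> {
    private static final Duration SETTLE_WINDOW = Duration.ofMillis(500);

    private static final class Entry<V> {
        final V value;
        final Instant writtenAt;
        Entry(V value, Instant writtenAt) {
            this.value = value;
            this.writtenAt = writtenAt;
        }
    }

    private final Map<K, Entry<V>> backing = new ConcurrentHashMap<>();

    void put(K key, V value) {
        backing.put(key, new Entry<>(value, Instant.now()));
    }

    // Returns the value only once the settle window has elapsed; before
    // that, callers must treat the state as not yet consistently readable.
    Optional<V> consistentGet(K key, Instant now) {
        Entry<V> e = backing.get(key);
        if (e == null || now.isBefore(e.writtenAt.plus(SETTLE_WINDOW))) {
            return Optional.empty();
        }
        return Optional.of(e.value);
    }
}
```

Because every instance of a stateless service can apply the same rule against the shared store, any instance can safely answer for any customer transaction.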

The code was written in a mixture of Java 8 with Spring 4.x and Groovy, with an emphasis on using Groovy mainly for unit and integration tests run by the Spock framework. All services were deployed into Tomcat containers and, where appropriate, were backed by Cassandra databases. REST interfaces were brokered using Apache CXF, with assistance from the Jackson libraries for serialisation of JSON. The build process used Gradle, providing a consistent and seamless development cycle all the way from unit and integration tests on the desktop through to the construction and deployment of RPMs into QA via the Jenkins CI environment.

In this role I was responsible for the correctness of implementations of designs passed across from the architectural team, and for contributing to the technical correctness and suitability of architectural decisions. In addition I had day-to-day responsibility for ensuring that the code base was clean, solid, and suitable for deployment to a QA environment at any time, and for providing third-line engineering support for production instances of the services.

My latest project was to work with the DevOps team and others to deploy these services into the Camelot UK environment. In keeping with Camelot Global's strategy of focusing on SaaS, and on deploying 'black boxes' into a customer's environment, we bundled the services and a variety of supporting technologies into a distributed Docker Swarm cluster. This involved very rapid learning of a large number of technologies in this area: management and provisioning of virtual machines across various environments and cloud vendors was performed by Ansible, which in turn used Docker Swarm to create a distributed cluster into which we deployed the applications. The cluster included a Cassandra cluster, and we made use of Consul for DNS resolution and for distributing state across the whole cluster. The services were fronted inside the cluster by HAProxy, allowing us to expose a single point of entry while providing very high availability within the cluster. Configuration of the wiring within the cluster was dynamic and automatic, relying on Consul as the point of truth for cluster state.

My role in this project was to rework the service projects to build Docker images automatically using Gradle in our Jenkins CI pipeline, and to coordinate the technical resources needed to keep the project on track to deliver a solution allowing automated, repeatable deployments into desktop, testing and production environments. In addition, this project served as a template for me to evangelise the use of Docker and related technologies across the rest of the projects, assisting other development teams and the QA team to rebuild the development and testing framework around Docker Swarm as the target deployment environment for all services.

Key Learnings

The biggest challenge in this role came in the first few weeks, when I had to rapidly learn Groovy, Gradle, Spock and CXF, having had no previous exposure to these technologies. From a standing start, I can confidently say that I was fully competent within four weeks. I also had to come to grips with the different requirements of logical and physical data design for Cassandra, which has slightly different semantics and emphases from the other NoSQL databases I have used.

That project also required me to rapidly assimilate and integrate a cornucopia of technologies around Docker - the Docker suite itself, Ansible, Consul, Registrator and Vagrant, to name a few. It was an interesting experience, learning these technologies in depth under tight delivery constraints and with little scope for error in deployment.

Technical Design Authority, Lithient, November 2014 - October 2015

Lead Java Developer, Lithient, March 2014 - November 2014

Senior Developer, Somo, February 2012 - March 2014

I came on board with Somo early in the Apptimiser project (later renamed Lithient), as the second technical hire. During my time I assisted in building out the engineering team, while at the same time rapidly building out a sophisticated, highly performant and highly resilient system to support analytics in the Mobile Marketing arena.

As the team expanded, so did my role, culminating in taking on the position of Technical Design Authority. In this role I had responsibility for all technical design, taking business requirements through design and articulation to the point of concrete change requests. The reverse also held: I constantly contributed to the direction and prioritisation of business requests to ensure they were achievable and desirable within the broad development roadmap. I was also responsible for maintaining standards of quality and process, developing new processes where required, and the day-to-day management of the development team's efforts. My team leadership responsibilities included balancing work across the team to deliver Agile sprints effectively, developing the skills of team members, and coordinating the efforts of the development team with other specialist groups within Lithient. Finally, I was responsible for ensuring successful, frequent software releases to the production environments, and for ensuring that the production and other runtime environments were monitored and maintained.

I am proud of the efforts I made to promote an environment dedicated to building out extremely high quality code. To support this I introduced and enforced rigorous coding and design standards in a TDD-focused Agile environment. I placed an emphasis on peer code review using Fisheye and Crucible and backed this with automated static code analysis using tools such as FindBugs, PMD and Checkstyle running within a CI environment managed by Jenkins.

To provide a system that was low maintenance, able to support our high transaction rates, and indefinitely scalable, I built out an architecture largely using J2SE, with J2EE elements used cautiously and abstracted away. This was a deliberate decision to allow us to deploy to very lightweight application containers (Jetty, after evaluating Geronimo and Grizzly). A deliberate side effect of this decision was a lower barrier to entry: junior coders could produce good solutions without having to engage with the broader complexity of J2EE or Spring.

An example of this restrained adoption of key J2EE technologies was moving the persistence layer to EclipseLink to provide ORM via JPA, and moving our messaging to a stand-alone clustered HornetQ installation for use with JMS. In both instances I provided an abstraction layer that removed all the complexity of JMS and JPA, reducing the interface to simple Get/Put Bean semantics.
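A Get/Put abstraction of this shape can be sketched as follows; the names and the in-memory backing here are hypothetical, standing in for the EclipseLink and HornetQ wiring a production implementation would hide:

```java
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of a Get/Put bean store that hides the persistence
// (JPA) or messaging (JMS) machinery behind two simple calls.
interface BeanStore<K, V> {
    void put(K key, V bean);   // persist or enqueue the bean
    Optional<V> get(K key);    // fetch it back, if present
}

// An in-memory reference implementation; a production version would
// delegate to an EclipseLink EntityManager or a HornetQ/JMS session
// without changing the calling code.
class InMemoryBeanStore<K, V> implements BeanStore<K, V> {
    private final Map<K, V> backing = new ConcurrentHashMap<>();

    @Override
    public void put(K key, V bean) {
        backing.put(key, bean);
    }

    @Override
    public Optional<V> get(K key) {
        return Optional.ofNullable(backing.get(key));
    }
}
```

The value of the design is that application code depends only on the two-method interface, so the underlying transport or ORM can be swapped without touching callers.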

In addition to code and system design and implementation, I was instrumental in the design of logical and physical data models, and in implementing controlled development methodologies for maintaining and updating the databases using database migrations. Initially these were simple MySql 5.x databases running on dedicated servers, but as part of a general migration of services into the Amazon AWS cloud they became managed RDS instances. Alongside these traditional relational databases, I implemented the use of Amazon's DynamoDB NoSQL database to hold our transactional data.

During my tenure, I took a proof-of-concept of Apptimiser written by third parties and turned it into a sophisticated, robust, enterprise-quality service. When I began, the system could handle at most around 70 transactions per minute. By the time I left it could handle over 8000 transactions per minute, and could be scaled indefinitely by adding more server instances. The system had multiple redundancies and self-healing strategies built in, and we were close to introducing Continuous Deployment driven from the Continuous Integration environment. Many of our periodic upgrades did not require a service outage, and we maintained over 99.9% uptime.

Technologies Introduced

I introduced the following technologies, services and practices to Lithient and the engineering team, and undertook appropriate training of the team, creation of usage and maintenance documentation, and evaluation of alternatives prior to adoption:

Key Learnings

One great thing about my time with Lithient was the chance to learn the details of Mobile Marketing from the very best practitioners. On the technical side, the main opportunity was to learn many of the services in the Amazon AWS stack, particularly DynamoDB, SWF, S3, EMR and EC2. I also gained some exposure to ELB and RDS, but was not directly responsible for the design or implementation of these.

I had some opportunity to dip my toes into Android coding, and was fortunate enough to undertake a formal iOS training course. This left me able to maintain the Lithient SDKs for both platforms, and to specify and review changes to those SDKs.

Software Engineer, Transaction Network Services, May 2010 - November 2011

In this role I worked in the Continuous Engineering team to enhance, maintain and support a range of cutting-edge and legacy J2EE products for the payment industry. The role required sophisticated and rapid problem solving, prioritisation and resolution skills, and a pragmatic approach to providing solutions that satisfied both the end customer and the enterprise, coupled with exceptional Java development skills. A significant component of this role was oversight and maintenance of software standards for new products, and continuous improvement of the security, reliability and maintainability of legacy systems.

The products in place were all internet-facing and oriented around secure communication of transaction data, backed by strict adherence to and compliance with PCI-DSS. All products were written in Java, leveraging Spring for resource injection and a variety of other modern technologies including JMS, JPA and JAAS. The development environment was Agile, with a very strong emphasis on automated unit, integration and regression testing, coupled with a traditional staged release environment. Kanban was used for managing maintenance activity.

Both Eclipse and IntelliJ were used for development, with builds managed by a mixture of Ant and Maven. All development and maintenance activity was performed against a Subversion repository, and deployed onto system and integration test hosts via a continuous integration environment based around Hudson. In-house documentation was written and published via a Confluence CMS instance, and I was a significant and avid contributor to this documentation.

During my time at TNS I participated in PCI-DSS mandated security training, and kept well abreast of current issues and solutions relating to web-facing systems, in particular the security, confidentiality and auditing of financial systems.

Successes at TNS include:

Software Engineer, Salmat/HPA, 2003 - May 2010

Responsible for the design and development of products oriented around fast, high-availability, complex J2EE Web Services and Web Applications to support the organisation’s business process outsourcing activity. These products were characterised by the need to support very large data sets and high transaction rates.

The bulk of the products were developed as a set of loosely connected J2EE web services running within a full J2EE environment (JBoss, Tomcat and Orion) and communicating via SOAP. Those services backed by a database used a mixture of Hibernate and JDBC for persistence against Oracle, Postgres and MySql databases. A number of the products made extensive use of XML for data interchange, and XSLT for presentation.

Successes at Salmat/HPA include:

My dedication to quality and robustness, and a marked willingness to work whatever hours were necessary to fulfil corporate objectives and requirements, saw me lauded on several occasions through the national Employee Recognition program.

Database Administrator / Programmer, Qld Police Service, 1998 - 2003

Primarily dedicated to the creation of tools to support very large scale data conversion and cleansing activities. In addition, I was responsible for the design and implementation of Database Administration tools and processes, and for database design in support of other development activity. I also designed and managed the corporate-wide rollout of Ingres II 2.0 to replace a mix of older RDBMS.

Tools were created in a mixture of C, C++, Unix shell scripting and Ingres ABF. The nature of the enterprise meant the overall development methodology was a traditional Waterfall; however, by leading small project teams I was able to begin introducing aspects of XP and other Agile methodologies. The tools and processes developed needed to support very large data sets and extremely tight security requirements, and to meet very aggressive performance requirements.

Database Administrator / Programmer, Qld Department of Natural Resources, 1995 - 1998

Responsible for database design and analysis against very large Ingres installations, and for the creation of tools and procedures for performing maintenance, analysis and data conversion/cleansing against those large data sets. Worked on the IVAS, IVASe and LGIP projects, designing and creating a suitable tool set in a mixture of C, C++, Ingres ABF and Unix shell scripting.

Senior Analyst/Programmer, Database Administrator, Pine Rivers Shire Council, 1989 - 1995

Responsible for the design and maintenance of a broad range of local government administration and financial systems, initially using MUMPS, but in later years working in C and Ingres ABF. I was responsible for creating and promoting standards and processes for the use of Ingres within the organisation.

Programmer / Technical Support, Shannon Robertson Systems, 1988 - 1989

Development and support of MS-DOS based small business systems and support systems for the agricultural industry, including debtors/creditors systems, feedlot management products, and stock breeding/stock book programs designed to integrate with the ABRI Breedplan project.

Secondary School Teacher, Mathematics and Science, Qld Department of Education, 1987 - 1987

Having taught secondary Mathematics and Science in a remote outback town, I acquired excellent communication, negotiation and time management skills. I maintain a professional interest in educational and didactic techniques, policies and trends.



References

Available on request, or visit my profile on LinkedIn. Note also that a previous version of this CV provides considerably more detail on some of my past work.
