
Book Review - The Phoenix Project

Here at Skybox Labs, we hold regular lunch and learn sessions where a fellow colleague presents on topics ranging from clean code, continuous integration, and game development to machine learning — really, any area where there is reasonable interest.

One of the recent lunch and learn series I attended focused on DevOps, which got me interested in learning more about the topic. When I asked for book recommendations, the lead presenter highly recommended that I start with 'The Phoenix Project' by Gene Kim.

Inspired by the stellar reviews on Amazon, I decided to get a copy and read it over the weekend.
It is a fantastic book that contains a wealth of information and delivers it in an intelligent and interesting way: as a story. The book successfully captures the events and struggles of most people who work in IT Operations, gives a very good explanation of why these problems exist, and shows how you can solve them. It portrays a very effective way of thinking by applying our understanding of plant floor operations, manufacturing, and logistics to how we manage IT in a business. The way it establishes that connection is fantastic.

Besides thoroughly enjoying the book as a gripping novel, here are my key takeaways:
  • Definitely a must-read for anyone in a leadership position. The vivid narrative presented in this book should convince anyone of the importance of well-run IT to overall business success.
  • Know and master The Three Ways: flow of work from Development to Operations, fast feedback loops at every stage, and a culture of continual experimentation and learning.
  • Work is not done until it is in the hands of your end users/customers. Every team is equally responsible and should work together to make that happen. It is not that the DEV team is done and now it's QA's job.
  • The more we can automate the hand-off between teams, the less chance there is of failure and unplanned work. For example, the crux of the deployment issues the Phoenix team initially faced was that their DEV, QA, and PROD environments were not in sync. After they automated environment creation and made it consistent across DEV, QA, and PROD, that problem resolved itself. As a nice side effect, they could now spin up a new DEV machine in record time. This ultimately points to the importance of continuous integration, continuous deployment, and a tight feedback loop to make the pipeline more robust.
  • Frequent deployment immensely helps the business experiment with market needs, iterate quickly on winning features, and retire the ones that are not successful. It also gives the development team more confidence in their process, as it becomes easier to push bug fixes or roll out new features.
  • It is immensely important to have a clear understanding of the core business goals, each team's role in achieving those goals, and how those goals relate to concrete work items.
  • Understand the Four Types of Work and their differences (factory analogy first, software analogy second):
    • (a) Business projects (deliver customer value): a new solar-powered car — Halo Infinite.
    • (b) Internal IT projects (the production line for delivering (a)): paint machines, crash testers — source control, automated tests, continuous delivery.
    • (c) Changes (improve (a) and (b)): replacing faulty car parts, reducing production time — the usual bug fixes, hotfixes, and performance tuning.
    • (d) Unplanned work (basically paralyzes all other work): the painting machine breaks down, so cars on the assembly line must wait until it is fixed — a broken build.
  • My favorite character was Erik; my least favorite, Sarah. However, both were equally important in creating the necessary tension in the story.
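The environment-drift problem behind the Phoenix team's deployment failures can be illustrated with a small sketch. This is not from the book — the config keys, values, and the `find_drift` helper are all hypothetical — but it shows the idea: automatically flag any setting that differs between DEV, QA, and PROD instead of discovering the mismatch during a deployment.

```python
# Hypothetical sketch: detect configuration drift across environments.
# In a real setup these configs would come from infrastructure-as-code
# definitions rather than hard-coded dicts.

def find_drift(envs):
    """Return {setting: {env_name: value}} for every setting whose
    value is not identical across all environments."""
    all_keys = set().union(*(cfg.keys() for cfg in envs.values()))
    drift = {}
    for key in sorted(all_keys):
        values = {name: cfg.get(key, "<missing>") for name, cfg in envs.items()}
        if len(set(values.values())) > 1:  # more than one distinct value => drift
            drift[key] = values
    return drift

environments = {
    "DEV":  {"db_version": "12.3", "os": "ubuntu-22.04", "runtime": "8.0"},
    "QA":   {"db_version": "12.3", "os": "ubuntu-22.04", "runtime": "7.0"},
    "PROD": {"db_version": "11.9", "os": "ubuntu-20.04", "runtime": "7.0"},
}

for setting, values in find_drift(environments).items():
    print(f"{setting}: {values}")
```

Running a check like this in the pipeline turns "it works on my machine" surprises into planned work: the drift report is visible before anything is deployed, which is exactly the kind of automated hand-off the book advocates.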

The next book on my reading list is 'The DevOps Handbook' by the same author. Stay tuned :)
