Openers vs Closers

I read an article a while back about the two types of developers: openers and closers. I can't remember the exact link, but the distinction has been weighing on my mind lately.

In my mind these are roles, not developer types. A developer might be filling either role, depending on the circumstances.

Openers

Openers are the developers on the front line, taking feature requests from managers and customers and turning them into real applications that people can interact with. Openers are free to use whatever means necessary to get the job done. They aren't concerned with performance or scalability; their job is to make the app work.

Closers

Once the app's features solidify, the closers are the ones who fix the performance and scalability issues. They care deeply about runtime performance, efficient algorithms, profiling tools, memory allocation, and all the deep technical issues. While all developers are typically nerds, the closers are typically the uber-nerds. They understand systems at a deep level, and can figure out systems they didn't take part in building.

I've been developing software for 15 years, and lately I find myself falling into the closer category. I'm digging into technical issues that the openers don't have the time, knowledge, or patience to dig into. And I enjoy it.

Sometimes I wish that the openers would spend a little more time thinking like closers :).

The key point about openers vs closers is that they are completely different mindsets. Openers are thinking in terms of features, user interactions, what should be built, and how to iterate quickly. Closers are thinking about how to make the app work for the whole user base, how it will scale, how to monitor it, and a whole lot of other minutiae that the openers don't want to get bogged down in.

This reminds me of the Lean Startup approach to building software. Openers are testing ideas, iterating as quickly and cheaply as possible. Closers are thinking long term: how to maintain the system and keep the lights on.

