Powering Through

I hurt myself the day before yesterday. After a weightlifting practice, there was a higher intensity portion of the workout that involved a similar movement. I didn’t think to remove weight from the bar, and after 12 minutes of a 15 minute AMRAP, I felt a stabbing twinge between my shoulder blades. After a few hours, it was uncomfortable enough to take some Tylenol and complain a lot. Most of the evening I spent lying directly on my back, or on my chest, spreading my upper back out as much as possible to alleviate the pressure. I took a few more Tylenol and went to bed, and largely slept in the same position.

Yesterday my back still felt tight. Better, but still tight and twinge-y when I twisted quickly. I took a few more Tylenol in the morning. I decided to let my workout slide and take a rest day.

This morning the tightness remains, but little pain. I notice it from time to time, as I wiggle my shoulders, exaggerating them into circles under the guise of a stretch. When I push my chest out wide, I can feel a small indent, as if someone is digging their thumb into the flesh directly under the tip of my right scapula. I plan on taking another rest day, and am considering another tomorrow.

I try not to power through stuff much. Powering through pain is the right way to seriously hurt yourself.

When I was 34, I was 320 pounds(ish), loud, big, terribly funny, and devilishly handsome. Every once in a while, I went into the doctor’s office, and heard the spiel about my blood pressure, and how I should lose weight. In my life, I have almost always been big. When I was a kid, my dad called me the ‘little one’, because I was 1) his youngest, and 2) at 13 years old, 3 inches taller and 15 pounds heavier than he was.

Dad was both stocky and wiry; he seemed electric. Quick and powerful. He was also not terribly funny. He always laughed at his own jokes. He still does. I was big and strong and always had been. When we moved out of the bar in Elk River, I held up one end of the hide-a-bed, he and my older brother, the other.

In November of 2013, I noticed a cough that was sticking around longer than I expected. I still came into the office. I could power through. On Friday, seven or eight of us developers from {redacted} went to a downtown bar that had a large restaurant seating area. Most of us drank a beer, had a burger, and then made the trek back to the office, mostly laughing about Star Wars, meat, or whatever libertarian idea some of the more political friends had proposed.

I could not keep up with my friends up the stairs. Each step, I fell behind. I could not catch my breath fast enough. When I did finally make it up those steps, the group was happily laughing a hundred feet ahead of me. I had seen enough episodes of House to know that this was some kind of walking pneumonia. I just needed to get to a doctor to get some antibiotics.

I would have to go all the way to a doctor to get a prescription. What a damned waste of time.

Walking back to the office, slowly, slightly downhill, I began to feel better. I caught up with my co-workers, they asked about where I went, and I made a joke about noticing an attractive woman and losing my interest in them. It was time to power through the rest of the day.

That night I could not fall asleep. As soon as I would lay on my back, I would start coughing. Whatever was in my lungs just would not go away.

I went in to the doctor in the morning. My normal doctor was not there, but the on-call guy would be just as good. It does not take a lot of experience to prescribe antibiotics.

I went back into the office with a ‘I do not have time for any BS, let’s get this done’ nurse. I liked her; she was quick to laugh. She asked what I was there for, and I told her with a joke. She laughed easily, quickly wrote things on my chart while she went through the basics of a checkup. I was two steps away from a consult with a doctor, then a trip to the pharmacy.

She wanted to get my blood pressure. It had not been taken for a while. Her mouth set hard and her eyes opened very wide as she looked at the gauge on the blood pressure machine. I remember telling her that I knew it was high. She opened her eyes wider, stood up, and headed out of the room while the cuff remained on my arm. The doctor came in. Reran the test. Then asked if he could look into my eyes. “Sure doc. Just don’t fall in love or anything…”

I remember two other points on that day, but the rest is gone.

The X-Ray tech asking me what happened, why he was X-raying my chest, if I got into a fight or something. Getting a few bottles from the pharmacy, and the pretty pharmacist looking up at me, “Are you OK?”

“I’ll power through.”

To be continued…

A Wine Gem : Warr-King’s 2017 Descendant

Last night a storm in the area brought an intense amount of rain drumming down around our home, making for a perfect evening to enjoy one of my favorite red blends, The Descendant, from Warr-King wines. It is a complex blend, with a floral and spicy nose that leads into deep red fruit and full tannins. The wine is mainly cabernet franc, but the magic is in the blend. A tiny hint of merlot gives it that floral nose and the full mid-palate. I would not call it a ‘Big and Bold’ Washington blend so much as an ‘introspective’ blend. A great wine to read a classic novel with.

I live in a house with 3 teenagers. So beyond the yelling (why is there always goddamn yelling with teenagers?) about “where such-and-such is in the kitchen”, and yelling at a Zoom call, and yelling about some video game being epic but also totally unfair and hacked, I could only attempt a moment’s peace with a magazine and a glass. The pictures were nice, and I opened the window so I could point my ears at the sound of driving and pounding rain and avoid the sound of ‘stupid unfair hackers’ and ‘where’s the coconut oil.’

Heather and I have been happy members of the Warr-King wine club since mid-2019, and we got this particular bottle at a club pickup in November of 2019.

A Note About Wine Clubs

Heather and I are members at eight local wineries. All of them have their individual charms, but as a club member, you can usually expect a few perks. Wines are released to you first, and usually at a discount to their retail prices. Release parties at the wineries are common, and are sometimes even offered with food. Visits and tastings at the winery are usually offered gratis (including tastings with guests) for a fun afternoon where you usually end up buying a bottle of something wonderful, and again, usually discounted. Often, there are different commitment levels to boot.

Wine club details from Long Cellars, in Woodinville Washington
Woodinville Wine Clubs are Awesome

Frankly, wine club membership is a heck of a deal. You make a relatively small commitment to purchase wine from an artisan crafts-person whose wine you already know you enjoy. The winery enjoys less risk in production, and you get wine you will enjoy at a lower price.

The Story of The Descendant

If you ask Lisa up at Warr-King, she probably has a wholly different story, but for me, the story of The Descendant started at a local restaurant in downtown Bothell, Revolve Food and Wine. The restaurant offered local wine, and an entirely gluten-free menu. My wife is gluten-free, so having a whole menu my wife could pick from made for a happy evening.

While there I had sampled a few local wines but my favorite on the evening happened to be from a local winery I had not heard much about. The wine was deeply ruby, rich and fruity and a little spicy. Tannin was present, but only just. One of the points I enjoyed the most was the balance. Many Washington wines are so fruit-forward, that they tend to have higher alcohol content. They burn and feel ‘boozy’. This one didn’t at all. It was just wonderfully complex and delicious.

The wine list at Revolve, highlighting the price of the Warr-King Descendant.
It was also selling for $75 a bottle.

A $75 restaurant bottle of wine is a little out of my ‘everyday drinker’ category, so I promised myself I would do a little more research on this as we went home.

Around about that time, I had started following a Washington wine podcast called Decanted. The fates conspired, and I happened to be on the way to work when I clicked into the 5th episode, highlighting Lisa and Warr-King. An hour later, I had put ‘go to Warr-King’ on our calendar.

A Happy Fate

It did not take long. Several visits over several months passed. Heather and I had the Passport to Woodinville Wineries, so we visited first with that, then we came in again to sample a Syrah, and again when the Malbec came out. We were hooked.

I had almost forgotten about the Descendant, until the release party in late 2019, when it was happily secured in our club allocation.

An empty bottle of the 2017 Descendant, which thankfully cost me nowhere near $75.

The 2017 is fuller this year, and just as lovely as it was when I first got it. It was rich and dark ruby again. I got hints of tart red fruit and Bing cherries and a floral nose. It is a wonderful blend you should try today. Or, join her wine club, and get a 15% discount!

How-To: Rename the Master Branch

It is well past time for your project to get rid of noninclusive terminology in your git branches! Code is for EVERYONE, so in the interest of making our language serve everyone equally, here are instructions for how to do it.

Starting A New Repo

The process is as follows:

  1. Create your starting directory and change into it.
  2. Initialize the repository.
  3. Create a new “main” branch. I prefer main, trunk or root.
  4. Create a new file in the repo. Start with the .gitignore file.
  5. Add that file.
  6. Commit that change.
$ mkdir demo.branchinit
$ cd demo.branchinit
$ git init
Initialized empty Git repository in {Your Directory}

$ git checkout -b "main"
Switched to a new branch 'main'

$ touch .gitignore
$ git add .

$ git commit -am "First commit."
[main (root-commit) {hash}] First commit
 1 file changed, 0 insertions(+), 0 deletions(-)
 create mode {something} .gitignore

$ git branch
* main

Voila! The repo is now rooted off of your main branch, and the word master has nothing to do with it.
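As an aside, if your Git is new enough (2.28 or later), you can skip the checkout step and start on main directly. The directory name here is just for illustration:

```shell
# Git 2.28+ lets you pick the initial branch name at init time.
git init -b main demo.branchinit2

# Or set it once, globally, for every future `git init`:
git config --global init.defaultBranch main
```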

Changing an Existing Repo

An existing repo is also easy to change, even if you have a remote source setup.

The process follows:

  1. Go to your existing master branch.
  2. Create a new ‘main’ branch, from that master branch.
  3. Drop the master branch.
  4. Push your new main branch to your remote.
$ git checkout master
Switched to branch 'master'
Your branch is up to date with 'origin/master'

$ git checkout -b "main"
Switched to a new branch 'main'

$ git branch -d master
Deleted branch master (was {a hash})

$ git push origin main
Total 0 (delta 0), reused 0 (delta 0), pack-reused 0
remote: Create a pull request for 'main' on GitHub by visiting:
remote:      https://github.com/myorg/myProject/pull/new/main
To https://github.com/myorg/myProject
 * [new branch]      main -> main
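As an aside, a single rename can stand in for the checkout-and-delete pair above: `git branch -m` moves the branch, history and all. A sketch of the same flow:

```shell
# Rename master to main in one step, then push the renamed branch.
git checkout master
git branch -m master main
git push origin main
```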

Now you just need to reset the default branch in your remote.

  1. Navigate to your remote git source. I use my web browser for this.
  2. Update the default branch to your new ‘main’ branch.
    • In GitHub,
      1. Go to your repo, click Settings
      2. Then Branches on the left.
      3. Select main in the Default Branch box, and click Update.
      4. Then click through the confirmation.
    • In Azure DevOps Services
      1. Go to your repo, then Branches.
      2. Locate the main branch, and then click the 3 dots at the end of the row.
      3. Click ‘Set as default.’

Finally, clean up your local repo’s remote branches.

  1. While still in your main branch, reset the head on your remote.
  2. Delete the remote-tracking master branch! (The local master branch is already gone from the earlier step.)
$ git remote set-head origin -a
origin/HEAD set to main

$ git branch -rd origin/master
Deleted remote-tracking branch origin/master (was {a hash})

Alright, you still have some work to do, but your local repo is nice and clean. You’ll still have to clean up your remote, but you can do so just like you would with any other branch delete.

Thanks for doing your part to make everything more inclusive! #blacklivesmatter

Getting Things Done, A Guide

I was inspired by Sarah Knight‘s Get Your Sh*t Together book here, and optimized her ideas for my needs. I read it once, then reread it right after, because it was so engaging. Buy it and read it yourself.

The Way

I start with four lists. First is my ‘To-do’ list, which is my generic catch all list for ideas. The next is my ‘Must-do’ list. This is the important list that requires daily attention. I take the highest priority items in the ‘To-do’ list, and pull over the items that must be done on the day into the ‘must-do’ list. The third list is the ‘Doing’ list. I pull one item from ‘Must-do’, and pull it into ‘Doing’ until it is complete. The last list is the ‘Done’ list, that starts out empty every morning, and is satisfyingly full at the end of every day.
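To make the mechanics concrete, here is a toy Python sketch of the flow between the four lists. The task names are illustrative, and the eight-item cap comes from how I run my own Must-do list:

```python
# Toy sketch of the four-list flow. Task names are illustrative.
todo = ["daily budget check", "daily workout", "get bikes tuned"]  # catch-all, priority-ordered
must_do: list[str] = []  # the short daily list
doing: list[str] = []    # holds at most one item
done: list[str] = []     # starts empty every morning

# Pull top-priority items from To-do into Must-do, keeping it under eight.
while todo and len(must_do) < 8:
    must_do.append(todo.pop(0))

# Work one item at a time: Must-do -> Doing -> Done.
while must_do:
    doing = [must_do.pop(0)]   # exactly one thing in Doing
    done.append(doing.pop())   # complete it and move it to Done

print(done)  # satisfyingly full at the end of the day
```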

The To-do List

The To-do List is meant to be an open catch all for tasks and items that creep into my head over the course of a day. “Get to the grocery store…”, gets added to the list. “Get the bikes tuned”, check. “Workout for {date}”, added to the list.

Sometimes I get a random thought, and I just add it to the To-do list, if only just to allow myself the freedom to say “this is worth spending some time thinking about, but it’s not the priority now.” The value here is that those things that are ‘wishes’ become things I allow myself to treat as eventually ‘doable’, just not as important right now. This works well for my ‘I need to rewrite this in F#’ feelings.

Prioritizing the To-do list

One of the major functions of the to-do list is to show me a pile of all the things I’ve been thinking about, and to let me stack them in a simple list from most important to least important. Simple and small items tend to get to the top of the list, if only because I know how to do them quickly, and the process of getting things done feels good.

Pro tip: Hack the to-do list by making tasks small and manageable. “Spend half-an-hour on Udemy course” is MUCH simpler to schedule and accomplish than “Get Better with Machine Learning and AI.”

Prioritization is a function of my values. One of my values is making sure my personal finances are in shape, as they were not, for an embarrassingly long time. My to-do list consistently has budgeting and finance items right at the top of the list. Self-care with physical fitness is also a high priority for me right now, so daily physical activity takes the top spots as well.

The Must-do List

The ‘Must-do’ list is where I put top priority items from the to-do list. My ‘To-do’ list will contain forty or fifty things. The ‘Must-do’ list I keep to fewer than eight items, and I only add items to it as others are completed. That keeps the list doable over the course of a day, and creates a simple boundary around what is actually possible.

Items that are regular ‘Must-dos’:

  • Daily workout.
  • Daily budget/finances check.
  • Write for 45 minutes.
  • Log food for the day.
  • Correspondence.

Putting these items down as distinct tasks allows for some distinct optimizations. A daily workout is a scheduled item that ends up on my calendar. In COVID times, that has been a Zoom session, and with the update to phase 1.5 in King County, it’s a small class session at my old gym.

Correspondence tasks can tend to elongate over the course of a day (who hasn’t spent multiple hours on a Slack channel?), but when I treat it like a task with a defined end, things take on a different mode. Personal email is managed in the morning. Work email is managed immediately afterward, and then once again in the afternoon. Slack communications can follow the same cadence, and once folks are aware that is how you are working, the immediacy of the medium is not as demanding as it seems.

Logging food is usually a simple task, which can take <5 minutes after any meal, but can be done at the end of the day. The point is to remember the intent of the task. I log food to remind myself to measure portions and be intentional about my consumption, not to be 100% accurate down to the calorie. I accept the risk of inaccuracy (did I eat 100 grams of blueberries or raspberries in my yogurt this morning?) for time management.

The Doing List

The Doing list is the loneliest column because I allow one thing in it at a time.

I cannot multitask. At all. In order to do anything competently, I need to focus on one thing at a time.

One valuable feature of the single ‘Doing’ list is that, when my brain has wandered, I can quickly glance at the doing list and ask myself “Am I really doing what I said I’m doing?”

If I have Twitter open, and my ‘Doing’ task doesn’t read ‘read Twitter feed and get pissed off’, there’s a good chance I need to re-engage myself on what I want to spend my time on.

The Done List

The Done list is the most fun list, obviously. I start with a clean list and as I complete tasks from the Must-do list, I put them on the Done list. Initially, an empty Done list is underwhelming, but as the day progresses, it can get satisfying and full. What makes this particularly satisfying is the list ends up usually in the 15-20 items completed over the course of a day, and all of them are the priority items according to what I value the most.

I prioritized ‘daily workout’ and got it done.
I prioritized ‘developer coaching session’ and got it done.

Getting done what I chose and prioritized is empowering.

Further Optimizations

One optimization I have made to this structure since starting it has been creating two other lists to assist some work items, and a few more ‘values’ based optimizations.

First, I created a ‘work week done’ list that simply contains the tasks I have done specifically for work. This helps me write full and accurate weekly status reports for work. In work-from-home COVID times, being able to communicate what I have accomplished over the course of the week seems invaluable.

Second, I created a ‘Do Every Day’ list that I use a simple copy feature to move to the ‘Must Do’ list. This saves me the few minutes spent putting the everyday tasks in the To-Do list.

Estimates: Time-Based vs Point-Based, and when to use them.

There are two general methods for software estimates, time-based and point-based. Here are some tips on when to choose either one.

Time-Based Estimates

Time-based estimates can be tricky. They are the simplest style of estimate, and that simplicity leads engineers to treat them flippantly. Just about every engineer has said “That’s easy, it’ll take 15 minutes” about something trivial, and then spent a full workday working on it. Time-based estimates get a bad rap from engineers, because they feel punished for missing them. Project managers like them because planning work is simpler.

Time-based estimates work best for near immediate-priority bug fixes / feature requests, and costing and budgeting breakdowns.

Point-Based Estimates

Point-based estimates are a common model in Agile shops, and they are largely the same whether they are requested as t-shirt sizes or Fibonacci series values; the difference is flavor. They are meant to be quick estimates that give a sizing gauge for the feature without the negativity associated with a missed estimate. Point-based estimates get a bad rap from project managers, because the statistical approach to ‘backing into’ the time something takes is not accurate enough to get an ‘on-time’ delivery of anything. Engineers like them because they don’t feel as beholden to them.

Point-based estimates work best for high-level estimates of long-term work, or as a quick way to gauge consensus between engineers.
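That ‘backing into’ time is usually done with historical velocity. A rough sketch of the arithmetic, with made-up numbers:

```python
# Backing into a timeline from point estimates via historical velocity.
# All numbers here are made up for illustration.
completed_points_per_sprint = [21, 18, 24]  # points finished in recent sprints
backlog_points = 55                         # points remaining on the work

velocity = sum(completed_points_per_sprint) / len(completed_points_per_sprint)
sprints_needed = backlog_points / velocity

print(f"average velocity: {velocity:.1f} points/sprint")
print(f"estimated sprints remaining: {sprints_needed:.1f}")
```

It gives a usable planning number, but note the spread in the inputs: that variance is exactly why project managers distrust the result as an ‘on-time’ commitment.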

When To Use Them

A project team will generally follow a particular methodology (Agile, SAFe, Waterfall), so you may be held to the preferences and norms of that group, but in general, follow these guidelines for successful estimating.

  • When to use a Points-Based Estimate
    1. The amount of time dedicated to estimating is low.
    2. The priority on the work is variable; your product owner will use your estimate to assist in determining its priority.
    3. Engineers have substantively different approaches to the work, or the work is largely exploratory.
    4. Your dev-to-release cycle is fast.
  • When to use a Time-Based Estimate
    1. The work is your next immediate priority item.
    2. You will use the estimate to directly assess cost.
    3. There is an understood and correct way to do the work, and will not drastically change if done by one engineer or another.
    4. Your release cycle is slow, or singular.

Mocking without Dependencies in F#

I am not a fan of mocking frameworks. Never have been. However, F# allows for some very simple mocking behavior which pretty much throws away the need for mocking frameworks across the board.

Imagine you have the following:

using System;

namespace TestNamespace
{
    public struct Item
    {
        public string Name { get; set; }
        public double Price { get; set; }
    }

    public interface IDoStuff
    {
        int DoStuffForRealsies(int a, int b, Item y);
    }

    public class ImportantClassToTest
    {
        public ImportantClassToTest(IDoStuff constructorInjectedDependency)
        {
            // etc.
        }
    }
}

Depending on your needs, you might choose to make a mock of that IDoStuff interface for testing purposes. Maybe your standard IDoStuff has some database stuff you don’t want running in simple unit tests, or maybe it involves another installed dependency. If you are working in C#, you might be stuck with Moq, or NSubstitute, but in F#, you get a much nicer model.

module WhyFSharpIsBetterV2123

open Xunit
open SampleTestableClasses

// An object expression gives a one-off implementation of the interface.
// The stub just returns a fixed value.
let stubDoStuff =
    { new IDoStuff with
        member this.DoStuffForRealsies(a, b, y) = 2 }

[<Fact>]
let ``To Heck with Mocking``() =
    let rd = new ImportantClassToTest(stubDoStuff)
    Assert.Equal(2, rd.MethodThatRequiresDi(5))

A nice quick implementation of the interface, without any extra mocking dependencies or extra libraries to maintain. Just use your F# compiler, the unit test library of your choice, and go.

The Wine We’re Drinking

I live three miles away from Woodinville, Washington, a town with over two hundred wineries in it, including two major labels and a smattering of medium-sized ones. Living here and not enjoying wine is like living in Colorado and not skiing. The grapes are typically grown in eastern Washington, but the wines are produced here, or are simply sold here in a tasting room. As it is a very short drive away, a frequent weekend activity is to stop into a local tasting room. With the COVID19 pandemic, however, the tasting rooms had been closed since mid-March, until this weekend, when we were FINALLY able to come back. These are the wines we were able to try this past weekend.

1. Lord Lion

We started at Lord Lion, as we had a club release pickup waiting for us, and we didn’t know precisely how tasting would really work in phase one point five, but the tasting was as wonderful as always.

We went in, sanitized our hands, and were directed to a table a good distance from the other patrons. We were handed small glasses, and proceeded through a flight of six newly released wines. Aside from the distance the poor folks had to wander about to pour wine, there really was not much different between pre-COVID times and post-COVID tasting.

One thing we especially love about Lord Lion is that Graham releases wines later than other winemakers in the area. This recent release included a 2014 Petite Sirah, a 2015 Malbec, a 2015 Cabernet Sauvignon, and a 2016 oaked Chardonnay. Other wineries we visited were in the middle of their 2017 and 2018 releases. If you’re noticing that ‘everyone has the same stuff’, Lord Lion has a wonderfully atypical selection.

There are many things to sample there. Graham always does a fantastic Viognier, and the 2019 was lovely, if a bit sweeter than the year before. His 2019 rosé of Sangiovese has been lovely for the past two years. The star in this release was the 2014 Petite Sirah. Full, inky dark, and lovely paired with a ribeye, or even something like a beef short-rib.

2. Adrice

Adrice was our second stop. Frankly, we stop there fairly often. With phase one point five, they were able to really open up the tasting room with large tables, a bar, and food served! Heather and I stopped in after calling ahead to make sure they could fit us in.

Heather waiting on pour number three at Adrice.

I do not have enough data to say for absolute certain (still working on collecting that), but Pam from Adrice may be one of the top 3 winemakers in the state. She simply does NOT make a bad bottle of wine. Her cheap stuff is great, and her expensive stuff is absolutely worth it! She is one of the few local producers that I will happily spend $75 on a good bottle for, although as I am budget conscious, I do enjoy my club discount for that particular bottle.

The tasting included a flight of six wines, and Heather and I also grabbed two charcuterie plates to keep up our strength. The takeaway favorites were: a damn near perfect 2019 Sauvignon Blanc from Yakima Valley; an award-winning 2017 red blend called ‘Lift Off’ (which is a crazy steal at $25 a bottle); and a beautiful 2017 Malbec. Pam also poured us a pre-sneak-quel of a Cab / Barbera blend she’s got coming out in July, which will be some lovely stuff.

3. Long Cellars

Our final stop of the day was to an old favorite, Long Cellars. Jason is a mad scientist back there, but when he makes contact, he hits nothing but home-runs. Never one to stick with the same-ole ideas, the trick to tasting Long Cellars is to taste not only what the wine is now, but what it will be in 5-8 years.

The tasting room welcomed us warmly again, with a giant Frankenstein statue right up front. The room is small, but we were able to sneak into a table in the back, where we had been to two Long Cellars-hosted burlesque shows. Barrels everywhere, Heather was tempted to hunt around for a barrel thief, and eventually found one hidden away.

Be careful about putting Heather near the bottles, Jason!

We tasted two whites and two reds before our daughter called and requested a ‘social distance pickup’, so unfortunately our tasting was cut short. The steal of the show was a 2018 Cabernet Sauvignon that tasted like fresh strawberry jam, a unique and intriguing profile for a Long Cellars Cab. You could pair it with a light salad, and it wouldn’t be out of place. It was inexpensive, fresh and fruity, with that classic peppery pull at the end that lets you know it is a Cab. His 2018 Reserve Malbec proved absolutely wonderful and deep to finish the tasting, but as I said at the beginning, the best part will be waiting on it.

Coaching Engineers – A Review

One of my regular responsibilities at my new job at the Credit Union is coaching developers, engineers, SDETs, and QA folks. Today, I got to be involved in three different coaching sessions that all had unique subjects and discussion points.

Session 1: How to get to Senior – Developing Expertise

This is a fairly common situation. A developer wants to go from Software Developer to Senior Software Developer.

The process of making Senior Software Developer generally comes down to adding more responsibility and influence to your day-to-day job. To get to a senior role, you can do one of the following:

  1. Take on a team lead role. In this case, you are the point person and responsible for more of the project work itself. You T-shape your skill set, but become the primary point person for the whole project.
  2. Take on a manager role. In this case, you’re trying to mentor and grow the skills of the folks around you. You may not be directly responsible to all the functions in the project, but you help and mentor those folks around you.
  3. Take on an expert role. In this case, you target getting deeply technical and specialized. Your plan is to become a known leader and expert on a particular technology.

The developer in question was interested in learning more about this third pattern of developing her expertise, and what it would take to continue that progression. She expressed interest in web user interfaces with Angular, and spent the session showing me what she had learned and worked on, and where she was going next.

To coach, sometimes you just need to be the accountability buddy.

Session 2 : Whose Design is Right?

In this session, a team of software developers had some questions about the nature of their solutions. They did not agree about the approach to a problem, and this particular session was with one side of that argument.

Side note: I love these sorts of discussions. Folks getting passionate about the way they choose to solve a problem is WONDERFUL.

The best part is that there was not a clear winner in the design of the application itself. They were different designs, to be sure, but they each had technical merits that could very easily be seen.

At the core, this one came down to coaching back to the engineering. The crux of the problem was that there was no data proving one solution better than the other. The quantitative features of the respective solutions had not yet been tested, and that was the end state I coached towards here.

If your design is better, prove it with data. Otherwise, GTFO of the way.

Session 3 : SDETs in the Credit Union

Initially, this one was set up to be a discussion about how to write code to use a Windows application automation tool (Selenium with WinAppDriver), but after the first session, it was apparent many of the SDETs present already had a lot of experience with those libraries. There were four SDETs and one analyst in this session, so it became a larger discussion about the nature of testing. We started collaborating on ideas about the best ways we could automate some of the harder tests to deal with.

Finally, it came down to discussions about the AAA pattern of testing, the kind of test code we wanted across the org, and even some of the difficulties in teams where SDEs and SDETs have a combative relationship.

Coaching engineers is exhausting and inspiring all at once. It was a great day, and I feel blessed to be able to do it!


This morning, like most of the rest of the US population, I saw protests against police brutality in our cities. Most protests have been nothing but peaceful displays of solidarity. Some, less so, with police responding to property destruction and graffiti with violence, including pepper spraying an eight year old.

I am a pacifist; I do not believe in the use of force, in any case.

Recent news has shown, in no uncertain terms, that being a white male means I am not a target. People of color do not enjoy that privilege. It is easy to be a pacifist when systemic racism does not target me.

Statistics back me up here. I am unlikely to be arrested, injured or killed by a police officer. I don’t need to send messages to my friends and family when I have been pulled over by a police officer, as that police officer is unlikely to believe me to be ‘aggressive.’ **

** See the book So You Want to Talk About Race, by Ijeoma Oluo, for details here. Also, it’s just a fascinating book, you should buy it.

In general, I trust police officers to keep us safe. I fully accept that my privileged position supports that trust. That said, I want all people; people of color, LGBTQ people, differently-abled, and any marginalized group I’m (as of yet) unaware of to feel the same.

The police should make people feel safe.

Everyone’s life should matter. However, saying #alllivesmatter is fundamentally ignoring systemic racism. Use of force by police statistically impacts people of color drastically differently than it impacts white people.

So, being unable to protest myself in the time of COVID19 (I am high risk, heart condition), I will say emphatically here: Black Lives Matter.

Two Rules to Estimating Software Features

OK, you’re a software developer, and someone’s asked you to take a quick look at a feature and give them an estimate of how long it will take to develop. For this example, I will refer to that feature as ‘SuperFeature’, and we will have two estimating developer examples: Gina, who models good estimates, and Lisa, who models less good estimates. Priya is our product owner, and since Priya owns the product, good estimates give her the information required to make intelligent decisions. Well-formed estimates enable her to prioritize work well, and manage the expectations of customers and stakeholders.

Here are two rules for estimating features successfully.

Rule 1: Your estimate should represent doing the ‘work done in a vacuum.’

It is counter-intuitive to estimate work in this way, but creating an estimate as if the work is being done in a vacuum is the best way for your product owner to assign the feature a priority.


Gina, who estimates in a vacuum – “SuperFeature will take about 2 weeks to do.”

Lisa, who estimates based off of her current workload – “I will need 6-8 weeks to do this.”

In the first case, Priya knows how long the feature will take to develop. She knows that Gina and Lisa are both working on very high priority items, so she gets Liam to work on that feature.

In the second case, Priya knows how long Lisa will take to get it done, but has very little awareness of what priority Lisa is putting on the work. At next week’s stand-up, imagine Priya’s surprise to learn that Lisa hasn’t even started work on it yet!

The lesson: Your product owner owns priority of the features. An estimate should give your product owner the information required to set that priority.

Rule 2: The larger an estimate, the more detail it needs.

If your feature is large, your product owner needs to know and understand why. To make the work understandable, it should be broken down into tasks.


Lisa, who doesn’t break the work down – “SuperFeature will take about 4 months to work on.”

Gina, who realizes the work is complicated, and Priya needs to understand the details. – “In all, SuperFeature will take about 4 months. We’ll need 3 days to start building the catalytic converter, and then a week to refit and install the Whizzbang…” etc, etc.

In the first case, it’s hard to really even start the work, or even know how to break it up. Is it 4 months altogether? Can you break the work apart? Can you create multiple work streams?

In the second, Gina gives Priya all the details she needs to break up the work accordingly. She also does so succinctly, so that ordering tasks and dependencies are clear.
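Gina’s answer can be thought of as a task list with durations attached. A trivial sketch, using only the made-up tasks from the example above:

```python
# Rule 2 in miniature: a large estimate arrives pre-broken-down into tasks.
# Task names and durations are the made-up ones from the example above.
estimate_days = {
    "start building the catalytic converter": 3,
    "refit and install the Whizzbang": 7,  # "a week"
    # ...the remaining tasks would be listed the same way...
}

total_days = sum(estimate_days.values())
print(f"{len(estimate_days)} tasks detailed, {total_days} days accounted for")
```

The structure itself is the point: each entry is small enough to order, assign, and parallelize, which is exactly what Priya needs.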

Following the two rules above will help make your estimates more valuable and your relationship with your product owner more beneficial.