Back of my head during a challenge code review!

I recently participated in another coding challenge as part of a government procurement. It was my third, and this one had some interesting differences in its setup. It got me thinking about what I liked and didn’t like across the three.

GSA Agile Delivery Services 

https://18f.gsa.gov/2015/06/15/agile-bpa-is-here/  

Liked:

  • Challenge itself - I liked that the government actually did it. This was the first major coding challenge procurement and it was a bold change in an effort to shift government IT to more modern practices.
  • Detailed evaluation criteria - 18F published a nice spreadsheet with detailed, technical evaluation criteria that made it easy for vendors to know what to accomplish.
  • Open-ended problem - The challenge was to do something cool and interesting with FDA's recently published (and very well designed) APIs and data sets on drugs, devices, and food. It allowed a lot of creative freedom, which played well to ICF's strengths in design, UX, and analytics.
  • Tech stack options - Vendors had a lot of freedom in tech stack choices, as long as they were open-source. We used Angular, Node, and Elasticsearch which provided lots of flexibility on quick changes to the front-end or supporting data.
  • Transparency - 18F and GSA were very transparent the entire time with lots of blog posts and an industry day.  Vendors were well prepared for the challenge. It's a nice, easy way for the government to communicate about procurements without all of the legalese of official procurement communications.

Didn't Like:

  • Initial ideas on challenge duration - Before it was released, 18F mentioned a 48 hour response. That would have been difficult, especially since participants in the challenge were still working on projects serving government clients. It's also a burden for work-life balance. I think that five business days is the optimal duration for a challenge. It's a fair amount of time for vendors to show their stuff and short enough for quick procurements. The duration ended up being a couple of weeks.
  • Questions - The biggest issue with the procurement for both vendors and the government was including the question period within the challenge timeframe. Government procurements have formal question and answer procedures to help make procurement fair and defensible. The government received and answered almost 200 questions during the challenge. This is a huge number of questions for a procurement, and it took GSA a long time to answer them. This caused them to extend the submission deadline several times.
  • Deadline extensions - While extensions are usually welcomed on written proposals, they weren't in this case. The coding work is intense and you feel pressure to keep working to add new features to keep up with the competition. The extensions frustrated vendors and the government.
  • Submission complexity - The submission required several files and forms to be filled out beyond just sending the link to the repo. I think the forms were a function of the contract vehicle and GSA's hope to get new vendors included. The forms helped vendors double check that they met all of the submission requirements.

EPA Environmental Digital Services

https://18f.gsa.gov/2015/12/03/epa-environmental-digital-services-marketplace/

Liked:

  • Another agency - It was nice to see another agency try the prove-it style competition. But, I think this was driven mostly by Greg Godbout's move from GSA to EPA.

Didn't Like:

  • Too early in procurement process - The challenge was issued as part of a Request for Information (not Proposal). RFIs are usually used by the government to quickly survey industry about the viability of vendors and approaches. These challenges should be reserved for actual procurements given the necessary investments of time and people.
  • Less transparency - There was not much published information about the procurement, compared to 18F. EPA could have used their blog to provide updates and preview the procurement with industry.

VA Coding Challenge

Liked:

  • Functionality specificity - This challenge gave several specific user stories that were to be implemented. The stories were clear and included Given/When/Then specifications that made them even easier to understand. Being more of an engineer than an artist, I preferred this specificity over the open-ended nature of the 18F challenge. 
  • Neutral problem domain - I think most of us assumed that the challenge would be about veterans, but the VA created a neutral problem domain that leveled the playing field.
  • Questions - Questions were due two hours after the challenge started and were answered a few hours later. The quick turnaround was nice. But, this was a much different circumstance than the 18F question period. This challenge was on a smaller contract vehicle, with fewer vendors, and with vendors experienced in government procurements.

Didn't Like:

  • Required tech stack - The tech stack was specified in this challenge. There was still freedom in architecture and front-end tools. But, the backend was fixed to the planned environment for the actual project. 
  • 72 hour elapsed duration - The duration was measured as 72 hours elapsed from a 12:00PM planned release. At one point, the VA planned to delay the challenge for a day until it was pointed out that the challenge would then be due on a weekend. I think the best approach is to make a challenge due at 5PM on the 5th business day following the release. This avoids awkward end times and gives participants a good amount of time to respond.

I have really enjoyed working on the challenges and appreciate the moves by the federal government to improve IT.