At ByteCubed, I designed a proposal evaluation workflow in collaboration with another UX designer and the Product, Development, & Data Science teams to standardize source selection across the Department of Defense.

 

My Role

User Research, Interface & Motion Graphics Design, High-Fidelity Prototyping

The Client

The United States Department of Defense (DOD) is the country's largest government agency and its largest employer. The DOD is divided into more than 15 command offices, each a separate stakeholder with unique needs.

 

RESEARCH

We were asked to examine the federal acquisition program through which the different DOD components publish RFPs, evaluate proposals, and award contracts.

Almost every component has its own process for topic development & source selection, which makes measuring the program's success difficult and tedious.

To improve the procurement process, we are developing a unified web platform where all federal agencies can release topics, evaluate proposals, & generate reports in one place, customizing the workflow to address individual needs.

Site Mapping

From early meetings with program managers across the DOD, it was clear that there were three basic needs: (1) to write and publish topics, (2) to receive & review proposals, (3) to collect & make available accurate, real-time data.

To capture the entire process, we split the project into distinct, use-based work-streams – Topic Development, Source Selection, & Reporting – each with its own Product & Development teams. As part of the UX team, I split my time across the three modules.

A simplified version of a site map I drafted showing the work-stream divisions.  This case study focuses on the Proposal Evaluation function within the Source Selection work-stream.

User Journey Mapping

Focusing on the proposal evaluation piece, my first goal was to figure out exactly who is involved in source selection & what they do.


Meeting with DOD employees across the source selection process, I facilitated User Journey Mapping exercises to define the user flow and tease out specific goals & pain points.  

At the beginning of these sessions, we identified the distinct steps in the user flow (the green index cards in the picture below).  Then, we discussed the actions, questions, successes, pain points, and opportunities (the yellow post-it notes) that are associated with each step.

The fruits of one of my journey mapping sessions

Functional Roles

The biggest takeaway from the user journey mapping sessions was that although source selection goals are the same across the DOD, the precise personas involved vary greatly.  

We had initially defined user personas by position title, such as Component Program Manager & Proposal Evaluator. However, the more we met with source selection staff across the DOD, the more we realized that while these titles may be the same from one component to the next, the exact duties may differ depending on component size & structure.

To address this, I split our user personas into 9 functional roles defined by the most basic task a user has to perform at any step of the process, allowing us to accommodate differences in user personas from component to component.

Source Selection functional roles mapped out against common personas
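
As a rough illustration of the idea (not the platform's actual data model), the functional roles can be thought of as a shared vocabulary of tasks that each component maps its own position titles onto. The role names below are drawn from the personas mentioned in this case study; everything else is an assumption.

```typescript
// A rough sketch, not the platform's data model. Functional roles are defined
// by the task performed, independent of position title; the example mappings
// below are illustrative assumptions.
enum FunctionalRole {
  ProposalEvaluator = 'PROPOSAL_EVALUATOR',
  CriteriaMaster = 'CRITERIA_MASTER',
  ProposalSelector = 'PROPOSAL_SELECTOR',
  Debriefer = 'DEBRIEFER',
  // ...plus the remaining roles covering topic development & reporting tasks
}

// Each component maps its own position titles onto the shared roles,
// so one workflow can accommodate differently structured offices.
type RoleAssignments = Record<string, FunctionalRole[]>;

const largeComponent: RoleAssignments = {
  'Component Program Manager': [FunctionalRole.CriteriaMaster, FunctionalRole.Debriefer],
  'Proposal Evaluator': [FunctionalRole.ProposalEvaluator],
};

const smallComponent: RoleAssignments = {
  // In a smaller office, one title can cover several functional roles.
  'Program Manager': [
    FunctionalRole.CriteriaMaster,
    FunctionalRole.ProposalEvaluator,
    FunctionalRole.ProposalSelector,
  ],
};
```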


Pain Points

Some of the pain points that were expressed:

 
Reviewers can be looking at anywhere from 8 to 86 proposals per topic and only have 30 days from when they are submitted to do it. We need to be able to evaluate them quickly.
— Proposal Evaluator
 
 
Technical reviewers don’t need to read through the entire proposal package. They should base their evaluation on the technical merit and innovation of the proposed approach alone.
— Criteria Master
 
 
Clear and concise evaluations are best.  Some reviewers get too technical.
— Proposal Selector
 
 
We need reviewers to write valuable feedback on a proposal’s strengths & weaknesses.  Writing ‘none’ is not helpful.
— Debriefer
 

SKETCHING

I devised a two-paned view to allow proposal reviewers to read through proposals and evaluate them simultaneously. 

I split the evaluation workflow into 5 sections – Proposal Content, Technical Merit, Key Personnel, Commercialization, & Selection Recommendation – to avoid overloading the user & help guide them through the evaluation.


After consulting with the data science & development teams, we realized that the PDF data was not reliably parsable and agreed that an in-app PDF viewer would be a huge technical lift, out of scope for the MVP.

In my next iteration, I pivoted to designing a tool that could be resized and viewed next to any PDF viewer application.


USABILITY TESTING  

Throughout the research and design process, I continually pushed for more frequent user testing.

Getting even low-fidelity designs in front of users enabled us to validate incremental changes in direction & identify potential problems early on.

However, although we had regularly scheduled client engagement meetings, we weren't always able to carve out enough time for primary users to test every iteration.

 

Even though we couldn't always find "real" users to participate in user testing, I wanted to continue testing as much as possible.  

Recruiting test users from ByteCubed coworkers on other projects, I was able to test small chunks of the interface for functionality & discoverability.

 
 

PROTOTYPING

Drawing from Material Design & Angular Material components, I built out my sketches in Adobe XD and animated interactions with Principle.

Form Design

Focusing the primary action in a narrow section of the viewport, I wanted to prioritize and conserve vertical working space. 

 
 

I swapped horizontal steppers for vertical, bringing the section headings in-line with the evaluation form and allowing the user to breeze through the evaluation workflow within one page.

I included the information that we could accurately parse in an expanding card, so it only occupies the viewport when the user needs it.
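
This stepper-and-card layout maps closely onto stock Angular Material components. Below is a minimal sketch of how the template might be structured; the selector and markup are my own illustration rather than the shipped code, and MatStepperModule & MatExpansionModule would need to be imported in the app module.

```typescript
// A simplified sketch of the one-page evaluation form: an expanding card for
// the proposal data we could reliably parse, above a vertical stepper for the
// five evaluation sections. Selector and markup are illustrative only.
import { Component } from '@angular/core';

@Component({
  selector: 'app-proposal-evaluation',
  template: `
    <!-- Parsed proposal details only occupy the viewport when expanded -->
    <mat-expansion-panel>
      <mat-expansion-panel-header>
        <mat-panel-title>Proposal Details</mat-panel-title>
      </mat-expansion-panel-header>
      <!-- firm name, topic number, and other reliably parsed fields -->
    </mat-expansion-panel>

    <!-- Vertical stepper keeps the section headings in-line with the form -->
    <mat-vertical-stepper>
      <mat-step label="Proposal Content"><!-- evaluation questions --></mat-step>
      <mat-step label="Technical Merit"></mat-step>
      <mat-step label="Key Personnel"></mat-step>
      <mat-step label="Commercialization"></mat-step>
      <mat-step label="Selection Recommendation"></mat-step>
    </mat-vertical-stepper>
  `,
})
export class ProposalEvaluationComponent {}
```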

After testing the vertical steppers, users noted that they liked "how all the pieces are on one screen" and that "it's easy to switch between sections."

 

Responsive Design

Speaking with primary users, I learned that proposal evaluators have access to a variety of computer setups, from laptops to multi-monitor desktops.

I designed the interface to be responsive, adapting easily to a range of window sizes, from a narrow pane beside a PDF viewer to a full-screen view.
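
One way this could be wired up is sketched below with the Angular CDK's BreakpointObserver; the breakpoint value, selector, and class names are illustrative assumptions rather than the production implementation.

```typescript
// A minimal sketch of collapsing the evaluation view from two columns to one
// when the window is narrowed next to a PDF viewer. The breakpoint value,
// selector, and class names are assumptions, not the production code.
import { Component } from '@angular/core';
import { BreakpointObserver } from '@angular/cdk/layout';
import { Observable } from 'rxjs';
import { map } from 'rxjs/operators';

@Component({
  selector: 'app-evaluation-layout',
  template: `
    <div class="evaluation" [class.single-column]="isNarrow$ | async">
      <!-- side-by-side panels stack vertically on narrow windows -->
    </div>
  `,
  styles: [`
    .evaluation { display: flex; }
    .evaluation.single-column { flex-direction: column; }
  `],
})
export class EvaluationLayoutComponent {
  // Emits true whenever the window drops below a laptop-sized width.
  isNarrow$: Observable<boolean>;

  constructor(breakpointObserver: BreakpointObserver) {
    this.isNarrow$ = breakpointObserver
      .observe('(max-width: 959px)')
      .pipe(map(state => state.matches));
  }
}
```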

 
 

Tooltips

Throughout testing, users noted that "the labels are not clear on all icons," indicating that "adding descriptions to buttons would make things much easier."

To design a straightforward & discoverable UI for users of all engagement levels, I added tooltips to unlabeled buttons and modal windows to allow for more descriptive instructions.
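
Angular Material covers both pieces: the matTooltip directive for short labels and the MatDialog service for longer modal instructions. A hedged sketch of the pattern follows; the selectors, copy, and HelpDialogComponent are hypothetical, and MatTooltipModule, MatDialogModule, MatButtonModule, & MatIconModule would need to be imported.

```typescript
// A hedged sketch of the tooltip + helper-modal pattern. The selectors, copy,
// and HelpDialogComponent are hypothetical, not the shipped code.
import { Component } from '@angular/core';
import { MatDialog } from '@angular/material/dialog';

@Component({
  selector: 'app-help-dialog',
  template: `<p>Step-by-step instructions for completing an evaluation…</p>`,
})
export class HelpDialogComponent {}

@Component({
  selector: 'app-evaluation-toolbar',
  template: `
    <!-- Unlabeled icon buttons get a short tooltip on hover -->
    <button mat-icon-button matTooltip="Save evaluation draft">
      <mat-icon>save</mat-icon>
    </button>

    <!-- A modal window carries the longer, more descriptive instructions -->
    <button mat-icon-button matTooltip="Evaluation instructions" (click)="openHelp()">
      <mat-icon>help_outline</mat-icon>
    </button>
  `,
})
export class EvaluationToolbarComponent {
  constructor(private dialog: MatDialog) {}

  openHelp(): void {
    this.dialog.open(HelpDialogComponent, { width: '480px' });
  }
}
```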

 

Onboarding

83% of the source selection staff I met with were between 46 and 60 years old.

While older generations are becoming more and more comfortable navigating through digital space, the platform would need to cater to digital natives & non-natives alike.

I created a motion graphic to help guide users of all computer literacy levels to take full advantage of the proposal evaluation feature.

 
 

WHAT'S NEXT?

We are currently on track to release the platform MVP in late August 2018.  With the first release, I plan to conduct more formalized user testing sessions to continue improving the features we have already built out.

Looking ahead to the next release, we still have a number of features that will enable different components to customize the proposal evaluation workflow, including custom evaluation questions and criteria weighting.