Success measures
When TTSC succeeds, people have better experiences with the government. Four outcomes define TTSC’s success:
Partner satisfaction
Are our partners happy with our work? Would they work with us again? Would they recommend us to others?
How we measure partner satisfaction:
- TTSC Customer Listening Tour. Beginning in FY24, the TTSC director, along with supporting staff, has interviewed the primary points of contact for most of our active engagements. Questions focus on perceived value for the cost of our services and likelihood to use us again or recommend us. Respondents should be asked to quantify their satisfaction with TTSC so we can track year-over-year improvement via Net Promoter Score, Customer Satisfaction Score, or an equivalent score as defined by the TTS Director’s annual goals (see the sketch after this list).
- 18F Partner Satisfaction Surveys. For several years, 18F account managers have conducted partner satisfaction surveys after an engagement ends. These surveys are structured interviews. In FY24, we began collecting them throughout the delivery cycle: 90 days after the engagement/project starts and, for longer engagements, every six months thereafter. The feedback from these interviews is analyzed and shared out on a quarterly cadence.
  - FY24 benchmarks were:
    - 50% or more of partner satisfaction survey respondents report being Likely or Very Likely to recommend 18F to others
    - 50% or more of partner satisfaction survey respondents report the value they received was at least equal to the budget spent
  - See: Partner Satisfaction in Airtable and the form to add new records
- FAS Customer Loyalty Survey. A standard survey sent annually to customers of each FAS business unit. This survey is required for all FAS units, but the response rate among TTSC customers has been very low in the past few years, so the data has not been a reliable indicator of partner satisfaction.
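As a reference point, here is a minimal sketch of how a Net Promoter Score could be tallied from 0–10 “likelihood to recommend” responses, using the standard definition (promoters rate 9–10, detractors rate 0–6). The function and thresholds below are illustrative and are not TTSC’s official scoring methodology.

```python
# Illustrative only: standard Net Promoter Score calculation from 0-10
# "likelihood to recommend" ratings. Not TTSC's official methodology.

def net_promoter_score(ratings: list[int]) -> float:
    """Return NPS on the usual -100 to 100 scale."""
    if not ratings:
        raise ValueError("no ratings provided")
    promoters = sum(1 for r in ratings if r >= 9)   # ratings of 9-10
    detractors = sum(1 for r in ratings if r <= 6)  # ratings of 0-6
    return 100 * (promoters - detractors) / len(ratings)

# Example: 6 promoters, 2 passives, 2 detractors -> NPS of 40
print(net_promoter_score([10, 9, 9, 10, 9, 9, 8, 7, 5, 3]))
```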
Delivery quality
Do our deliverables represent our standard operating procedures? Does our work have a positive impact on the public or government employees?
How we measure delivery quality:
- Deliverables are consistent with the standard operating procedures in this handbook. We’ve been helping modernize government for over ten years and know what approaches are likely to be successful. If your engagement needs to deviate from the SOPs in the handbook, you must be prepared to explain why.
- Engagement impact assessment. At every phase of an engagement, we should be assessing its potential for impact on the public or federal employees. Annually, TTSC also reviews overall impact and highlights.
  - See: past impact reports:
Employee engagement
Are our staff stimulated by their work and content with our working conditions?
How we measure employee engagement:
- TTSC pulse check surveys have been conducted separately within 18F and CoE. These help us identify areas for improving employee experience. Surveys are conducted three or four times per year. Results are compiled and synthesized by a working group.
- Federal Employee Viewpoint Survey (FEVS) is an organizational climate survey administered annually by the Office of Personnel Management. All TTSC employees are encouraged to participate. Results can be broken down to the levels of TTSC, 18F, and CoE. Data from FEVS is considered in SES performance plans and TTS strategic planning.
  - See: OPM on FEVS and past results:
    - FY24 FEVS results for TTSC TK
Financial stability
Are we covering the direct costs of our office?
How we measure financial stability:
- The organization aims to cover its direct costs, which include staff labor. The metric we use to gauge progress towards cost recovery is utilization rate (see the sketch below).
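Here is a minimal sketch of a utilization-rate calculation, assuming the common definition of billable hours divided by total available hours; the exact hour categories TTSC counts as billable or available may differ.

```python
# Illustrative only: utilization rate as billable hours over available hours.
# The hour categories TTSC treats as "billable" or "available" may differ.

def utilization_rate(billable_hours: float, available_hours: float) -> float:
    """Return utilization as a percentage of available hours."""
    if available_hours <= 0:
        raise ValueError("available_hours must be positive")
    return 100 * billable_hours / available_hours

# Example: 1,300 billable hours out of 1,600 available -> 81.25% utilization
print(utilization_rate(1300, 1600))
```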