Case Study: Designing a Public Financial Dashboard for Early Childhood Programs
Overview
Role: UX Designer / Product Strategist
Project Type: Public Data Dashboard
Client: Early Childhood Education and Care Department (ECECD), State of New Mexico
Duration: 12 weeks
Team: 1 UX Designer (me), 1 Data Analyst, 1 PM, 2 Engineers, 1 Policy Advisor
Problem Statement
The State of New Mexico developed a public-facing digital tool to increase transparency around early childhood education programs. Initially designed for open access by anyone—government officials, employees, organizations, and the public—the tool was limited by its lack of focus. It provided information, but not insight. As a result, it fell short of helping key stakeholders understand the outcomes and effectiveness of their efforts and decisions.
Goal
Transform the tool into a purpose-driven, data-informed dashboard that:
Helps ECECD employees quickly assess how and where early childhood funds are being allocated
Enables elected officials and decision-makers to visualize program reach and community-level impact
Educates all users—regardless of technical background—on how to use the data to inform actions and investments
Research
Methods:
Stakeholder interviews across state departments (ECECD staff, program leads, elected officials)
Task analysis and workflow mapping for budgeting and reporting use cases
Data source audit to identify available metrics and system gaps
Usability walkthroughs of early prototypes with non-technical users
Key Findings:
Users needed not just raw data, but interpretation and context
Employees wanted a high-level financial view with drill-down capability
Officials prioritized outcome-focused data: number of children served, services provided, and regional equity
Public users required a clean, simplified view with minimal jargon
Design Process
1. Needs Alignment
Grouped users by roles and needs: internal reporting, funding oversight, public transparency. Designed dashboard views tailored to each context while maintaining a single, cohesive interface.
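To make the role-based tailoring concrete, here is a minimal TypeScript sketch of how each audience could map to a default view and set of surfaced metrics within one interface; the role names, view identifiers, and metric keys are hypothetical placeholders, not the production schema.

```ts
// Hypothetical sketch: one interface, tailored per audience.
// Identifiers are illustrative only.
type Audience = "ececd_staff" | "elected_official" | "public";

interface ViewConfig {
  defaultView: string;     // landing view for this audience
  metrics: string[];       // metrics surfaced first
  allowDrillDown: boolean; // whether granular tables are exposed
}

const viewsByAudience: Record<Audience, ViewConfig> = {
  ececd_staff: {
    defaultView: "fund-allocation",
    metrics: ["total_funds_spent", "funds_by_program", "providers_reached"],
    allowDrillDown: true,
  },
  elected_official: {
    defaultView: "impact-overview",
    metrics: ["children_served", "services_provided", "regional_equity"],
    allowDrillDown: true,
  },
  public: {
    defaultView: "impact-overview",
    metrics: ["total_funds_spent", "children_served"],
    allowDrillDown: false, // simplified, low-jargon public view
  },
};
```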
2. Narrative-Driven Data Design
Shifted from static tables to story-based visualization. Structured the dashboard around key questions:
How much funding has been distributed?
Where has it gone?
What impact has it made?
3. Wireframing and Data Modeling
Created wireframes focused on progressive disclosure—starting with high-level metrics (e.g., total funds spent, providers reached), followed by regional and demographic breakdowns. Collaborated with data analysts to ensure consistency and accuracy.
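The progressive-disclosure structure can be summarized as a data shape: a headline metric shown first, with regional and demographic breakdowns revealed only on drill-down. The TypeScript sketch below illustrates that shape; field names and figures are placeholders, not the actual ECECD data model.

```ts
// Hypothetical sketch of a headline metric with drill-down breakdowns.
// All names and values are placeholders for illustration only.
interface Breakdown {
  dimension: "county" | "funding_stream" | "age_group";
  rows: { label: string; value: number }[];
}

interface HeadlineMetric {
  id: string;              // e.g. "total_funds_spent"
  label: string;           // plain-language label shown first
  value: number;
  unit: "USD" | "children" | "providers";
  breakdowns: Breakdown[]; // revealed only when the user drills down
}

const exampleMetric: HeadlineMetric = {
  id: "total_funds_spent",
  label: "Total funds distributed",
  value: 1_000_000, // placeholder figure
  unit: "USD",
  breakdowns: [
    {
      dimension: "county",
      rows: [
        { label: "County A", value: 400_000 }, // placeholder rows
        { label: "County B", value: 250_000 },
      ],
    },
  ],
};
```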
4. Onboarding and Education Strategy
Recognized that even the best-designed dashboard required user guidance. Developed a built-in onboarding flow and supporting documentation to teach users how to interpret and use the data responsibly.
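As a sketch of how that guidance was structured, the onboarding flow can be modeled as ordered steps, each anchored to a dashboard element with plain-language copy. The selectors and copy below are illustrative, not the shipped content.

```ts
// Hypothetical sketch of the in-app onboarding flow as ordered steps.
// Anchors and copy are illustrative only.
interface OnboardingStep {
  anchor: string; // element the tooltip attaches to
  title: string;
  body: string;   // plain-language explanation, minimal jargon
}

const onboardingSteps: OnboardingStep[] = [
  {
    anchor: "#impact-snapshot",
    title: "Start with the big picture",
    body: "These totals summarize funds distributed, children served, and providers engaged.",
  },
  {
    anchor: "#county-map",
    title: "Explore by county",
    body: "Use the map and funding-stream filters to see where dollars are going.",
  },
  {
    anchor: "#drill-down-table",
    title: "Dig into the details",
    body: "Open per-program and per-region views to support reporting and analysis.",
  },
];
```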
Key Metrics (Before vs. After Dashboard Launch)
| Metric | Before | After Launch | Change |
|---|---|---|---|
| Clarity of Fund Allocation (internal) | Low (manual reports) | High (real-time) | Significant increase |
| Dashboard Engagement (monthly visits) | N/A | +8,000/month | New public use baseline |
| Regional Equity Analysis Use | Manual, infrequent | Integrated, daily | Operationalized |
| Legislative Briefing Support | Fragmented | Unified dashboard | Streamlined reporting |
| Stakeholder Confidence (survey) | 2.8 / 5 | 4.6 / 5 | Strong improvement |
Design Highlights
Impact Snapshot: Summarized total funds distributed, children served, and providers engaged
Geographic Breakdown: Visual map interface with county-level insights and filters by funding stream
Drill-down Analytics: Users could move from overview to granular views (e.g., per-program, per-region)
Integrated Guidance: In-app tooltips, onboarding flow, and public help documentation enhanced usability
Iteration Example
Initial Design:
Dashboard contained static charts with minimal context. Users were unsure how to interpret what they were seeing.
User Feedback:
“I see the numbers, but I don’t know what to do with them.”
Revised Design:
Integrated narrative framing and contextual annotations (e.g., “This represents 42% of all program funding in rural counties”). Added callouts to explain year-over-year comparisons and changes in funding focus.
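These contextual annotations could be generated from the same aggregates that drive the charts. The TypeScript sketch below shows one way to compute a rural funding share and phrase it as an annotation; the field names and figures are hypothetical.

```ts
// Hypothetical sketch: derive a contextual annotation from funding aggregates.
// Field names and example figures are placeholders.
interface FundingSlice {
  region: "rural" | "urban";
  amount: number; // USD
}

function ruralShareAnnotation(slices: FundingSlice[]): string {
  const total = slices.reduce((sum, s) => sum + s.amount, 0);
  const rural = slices
    .filter((s) => s.region === "rural")
    .reduce((sum, s) => sum + s.amount, 0);
  const pct = total === 0 ? 0 : Math.round((rural / total) * 100);
  return `This represents ${pct}% of all program funding in rural counties.`;
}

// Example with placeholder figures:
// ruralShareAnnotation([
//   { region: "rural", amount: 42 },
//   { region: "urban", amount: 58 },
// ]); // -> "This represents 42% of all program funding in rural counties."
```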
Outcome
Dashboard became the centerpiece for funding presentations, both internally and to elected officials
Significantly reduced the need for ad hoc data preparation across departments
Helped New Mexico secure momentum and justification for the next phase of ECIDS (Early Childhood Integrated Data System)
Recognized as a key transparency initiative supporting early childhood investment strategy
Tools Used
Figma (wireframes, design system)
Tableau & Power BI (data modeling and visualization)
Notion (content strategy and documentation)
Zoom / Mural (remote collaboration and workshops)
Reflection
This project highlighted the dual challenge of designing for impact and comprehension. A well-built dashboard isn’t enough—users need to understand and trust the story it tells. If we had approached the project as just a reporting tool, we would have missed the larger opportunity: to equip decision-makers with clarity, and in doing so, shape policy and investment in meaningful ways.
The biggest lesson: designing public tools requires intentional education and empathy as much as technical accuracy.
Next Steps
Expand filtering for demographic insights and equity benchmarks
Build multilingual support to expand access for broader public users
Integrate dashboards into ECECD policy planning workflows for long-term strategic use