2025
Desktop App for Evala
Evala is a scrutiny dashboard for local government and civil service projects, policies, and strategies, designed to improve transparency and accountability. The system organises projects into a visual workflow with tasks, documents, and comments, while maintaining a complete audit trail of all actions. Users can track progress, monitor interdependencies, and identify risks in real time.
I designed the visual workflow interface for tracking tasks, documents, and comments, ensuring all actions are recorded in a comprehensive audit trail. I also proposed a reporting feature that would enable standardised project summaries for public release or internal review.
UX Researcher
UX/UI Designer
User Research
UX/UI Design
Accessibility
Figma
FigJam
Microsoft Teams
January 2025 to October 2025
Current project scrutiny in local government is limited and often relies on simple checklist-style reporting, which provides little insight into performance, risks, or interdependencies. Officers and elected members need a tool that goes beyond tick-box oversight to deliver meaningful, data-driven insights that support informed decision-making and effective accountability.
In local government, I supported councillors and MPs, gaining insight into decision-making. As a policy and strategy officer, I worked on corporate projects and developed a broader view of service delivery. These experiences inspired the idea of a scrutiny dashboard to consolidate data and improve transparency. I tested the idea through a SWOT analysis and research into other councils’ digital oversight tools, combining lived experience with structured analysis to identify opportunities and challenges around transparency, efficiency, and adoption.
The first step was to run a survey using Microsoft Forms with a mix of councillors, local government officers, and civil service policy analysts. This audience represented the end users of a potential dashboard tool and offered a broad, diverse set of perspectives. The survey gave me a quantitative baseline: a clear picture of common pain points and feature priorities. This ensured my design direction was grounded in evidence, not assumption, and helped avoid bias toward loud individual voices.
Three weeks later, I ran an online co-design workshop on Microsoft Teams with a smaller cross-section of users. By this stage, early brainstorming and concept sketches were ready, so participants could react to concrete examples rather than abstract ideas. By combining a survey with a co-design workshop, I ensured the research was both evidence-based and user-centred. The survey told me what mattered most; the workshop revealed why it mattered and how users wanted those features delivered.
Using the raise-hand feature for live voting created a familiar, democratic way to gauge consensus, and discussions highlighted the value of a traffic-light system with automatic task assignment to boost accountability. This stage humanised the survey data, adding nuance and context by capturing participants’ frustrations in their own words.
"My dream feature would be an automatic briefing tool. Instead of digging through a 60-page report, I’d get a clear one-page summary highlighting the essentials, key outcomes, and any red flags. It would save time, cut through noise, and make it easier to stay on top of multiple projects at once."
"I’d want a system that flags projects with a simple traffic-light rating for climate, equality, or financial impact. It should highlight what needs deeper scrutiny and allow users to assign the right team members to take action, so nothing important slips through the cracks."
Using insights from the survey and co-design workshop, I ran an affinity mapping exercise to organise qualitative input from interviews and open-text responses. Rather than pre-defining categories, I captured observations, pain points, and ideas on FigJam sticky notes. Participants then grouped the notes collaboratively by shared meaning, revealing common themes and how they framed their needs, frustrations, and priorities.
After the initial wireframes, I turned to how the final prototypes should look, which meant developing a visual identity. Evala was envisioned as a web application, so the brand needed to feel digital-first, professional, and politically neutral, something that could earn trust in civic and analytical contexts while remaining accessible and modern.
The colour palette centres on a teal blue-green, chosen for its sense of calm authority and modern professionalism. The result is a trustworthy, inclusive, and visually balanced identity.
Typography played a key role in conveying approachability and clarity. I tested several sans-serif typefaces before selecting Inter as the primary typeface and Roboto as the secondary. Together, they provide a modern, legible, and balanced pairing that supports both digital readability and formal reporting.
I wanted a symbol that could stand confidently on its own while adapting seamlessly across screen sizes. I tested four concepts (a shield, a plain E, a stylised E, and an eye with a checkmark), and after gathering user feedback, the stylised E was chosen. This logo pairs with the “Evala” wordmark for wider layouts, while the square, standalone version ensures flexibility across digital platforms.
How to leverage real-world experience to inspire change and develop practical design solutions that address user and organisational needs.
Strengthened my UX research, interaction and visual design skills in Figma, and facilitation skills through remote workshops and prioritisation exercises.
Organise user testing for the dashboard, and involve tech and security experts from local government early to address potential barriers to adoption and ensure feasibility.