

I led the product design process for a project in collaboration with the American Red Cross, focusing on digitizing their disaster simulation training game. The primary goal of this project is to study decision-making processes in post-disaster recovery scenarios. The game serves as a data collection tool for this research, providing insights into how players navigate crisis management and resource allocation. The project is currently in the development phase, with ongoing work on the AI-Agent architecture design.
Role
Product Designer
Duration
October 2024 to Present
Team
2 Product Designers
2 Software Engineers
Methods
UX/UI Design
Prototyping
Testing
Tools
Figma
Sketch
Challenges
The ARC Simulation Game was originally a 3-day tabletop exercise conducted internally to train workforce members with no prior experience in post-disaster recovery operations. The game provided a hands-on approach to understanding crisis response strategies in an instructor-led, controlled environment.
During the pandemic, the American Red Cross recognized the need to digitize the simulation to improve training efficiency, eliminate location constraints, and facilitate data collection for performance analysis.
Later, in collaboration with CMU’s AI Institute for Societal Decision Making Lab, a project team was formed to tackle two major challenges:

1
How might we design an online version without altering the original gameplay mechanics?
2
How might we design AI to provide adaptive decision-making support during the training process?
Solutions

01 Grid-Based Map System
The grid-based map system, combined with an event generation model, creates realistic disaster response scenarios by dynamically integrating spatial constraints and situational variables, ensuring an immersive and adaptive training experience.
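To make this concrete, the sketch below shows one way a grid-based map with dynamic event generation could be modeled. The class names, cell attributes, and flood-risk trigger are illustrative assumptions, not the project's actual implementation.

```python
import random
from dataclasses import dataclass, field

@dataclass
class Cell:
    """One tile of the disaster-area map (attributes are placeholders)."""
    terrain: str                      # e.g. "residential", "road", "floodplain"
    flood_risk: float = 0.0           # situational variable in [0, 1]
    facility: str | None = None       # building placed on this cell, if any

@dataclass
class GridMap:
    width: int
    height: int
    cells: dict = field(default_factory=dict)   # (x, y) -> Cell

    def cell(self, x: int, y: int) -> Cell:
        return self.cells.setdefault((x, y), Cell(terrain="residential"))

    def generate_events(self, day: int) -> list[dict]:
        """Roll dynamic events from each cell's spatial attributes."""
        events = []
        for (x, y), c in self.cells.items():
            if random.random() < c.flood_risk:
                events.append({"type": "flooding", "cell": (x, y), "day": day})
        return events
```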

02 AI Agent for Decision-Making Support
AI agents, powered by a Reinforcement Learning (RL) model, adapt to user decisions in real time, providing multi-dimensional decision support tools that simulate the complexity of collective decision-making in disaster response scenarios and help users make more informed and effective choices during training.
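As an illustration only, the snippet below sketches this kind of adaptive recommendation loop with simple tabular Q-learning; the actual RL model, state representation, and action set are still being defined by the team.

```python
import random
from collections import defaultdict

class DecisionSupportAgent:
    """Toy Q-learning adviser: learns which action tends to work in each
    observed game state and surfaces it as a recommendation to the player."""

    def __init__(self, actions, alpha=0.1, gamma=0.9, epsilon=0.1):
        self.q = defaultdict(float)          # (state, action) -> estimated value
        self.actions = actions
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def recommend(self, state):
        """Suggest the currently highest-valued action (with light exploration)."""
        if random.random() < self.epsilon:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.q[(state, a)])

    def observe(self, state, action, reward, next_state):
        """Update the value estimate after seeing the outcome of a player's move."""
        best_next = max(self.q[(next_state, a)] for a in self.actions)
        target = reward + self.gamma * best_next
        self.q[(state, action)] += self.alpha * (target - self.q[(state, action)])
```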

03 Task-Based Resource Allocation
Through resource allocation and task assignment, players must strategically distribute limited supplies and delegate responsibilities, replicating real-world constraints and requiring them to balance competing priorities in disaster response scenarios.
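A simplified sketch of the constrained allocation players face is shown below; the urgency-based greedy rule and the request format are placeholders for illustration, not the in-game logic.

```python
def allocate_supplies(requests, stock):
    """Greedy allocation: serve the most urgent requests first until
    supplies run out (a simplified stand-in for the in-game mechanic)."""
    plan = {}
    for req in sorted(requests, key=lambda r: r["urgency"], reverse=True):
        grant = min(req["amount"], stock)
        plan[req["site"]] = grant
        stock -= grant
    return plan

# Example: 50 units cannot satisfy both sites, forcing a trade-off.
print(allocate_supplies(
    [{"site": "shelter", "amount": 40, "urgency": 3},
     {"site": "kitchen", "amount": 30, "urgency": 5}],
    stock=50))
```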

Design Framework

Why This Framework?
Unlike the Double Diamond, the Tandem Framework evolves game mechanics and goals together, keeping gameplay and learning objectives aligned.
How Did We Apply It?
We followed two consecutive design loops, each refining different aspects of the game.
1st Phase
Game System & Goal Design
Established the core mechanics & gameplay structure.
2nd Phase
AI Integration & Task Design
Introduced AI-driven enhancements for better training outcomes.
1st Phase: Game System & Goal Design
Game Mechanics Analysis
By analyzing the 100+ page process book from the original game, we identified three key design strategies to effectively translate the gameplay into a digital format.
Shifted from a multiplayer game to a solo experience, where players take on the role of an Administrator.
Based on these strategies, we developed the initial game flow to structure the game phases and task execution process. The game spans 8 days, beginning with a pre-game phase where players learn the basics. On Day 1, players focus on construction and staffing. From Day 2 onwards, they complete both daily tasks and emergency tasks.

Building on the core gameplay flow, we refined a detailed Game User Flow, integrating AI agents to assist players in decision-making throughout various game stages.

Game Goal Design
With the overall game structure defined, we refined the player role and game objectives while ensuring the game effectively supports decision-making training. At this stage, we developed the Game Instruction Document to outline core mechanics, task flows, and player interactions.

Play Testing
To test playability, we created a paper prototype and conducted 10 user play-tests. Through verbal interactions, we simulated AI agents assisting players in the decision-making process.


User’s Voice

I understand how to play and the general rules.

I wouldn’t have realized I should build the kitchen first without a prompt.

There’s a lot to calculate, I need to figure it out.
Game Map Interface Design
After synthesizing the user testing insights, game flow, and game goals, we began designing the main game interface by first defining its key functional areas: the Information Panel, the AI Agent Panel, and the Task Interaction Panel. After structuring the layout, we created an initial wireframe using open-source components, then conducted a play-testing session with the client to gather feedback. Based on those insights, we iterated further to develop the current refined version.
Layout Planning

Wireframe

Current Version

2nd Phase: AI and Technology Integration
In Phase 2, which we are currently working through, our primary goal is to research and define how AI components will be integrated into the game. This includes designing how users interact with AI agents and ensuring a seamless experience. Our key areas of focus are:

How does the AI Decision-Support System operate?

How should tasks be structured within the Grid-Based System?

How should different task types be mapped to appropriate gameplay interactions?
AI-Driven Training: Integrating Agents & LLMs
Unlike traditional scripted game agents, our AI Agents adapt to real-time scenarios, enhancing disaster training by providing dynamic decision-making support. Trainees interact with AI-driven agents for context-aware guidance, improving problem-solving in high-pressure situations. To further optimize decision support, we integrate LLMs to generate personalized recommendations based on real-time data.
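The snippet below sketches one way the real-time game state could be turned into a prompt for personalized recommendations. The prompt wording and the `call_llm` hook are hypothetical stand-ins for whichever model client the team ultimately adopts.

```python
import json

def build_prompt(game_state: dict, player_action: str) -> str:
    """Turn the live game state into a prompt asking for a context-aware
    recommendation (prompt wording is illustrative)."""
    return (
        "You are a disaster-response training coach.\n"
        f"Current game state: {json.dumps(game_state)}\n"
        f"The trainee just chose: {player_action}\n"
        "Suggest one next action and briefly explain the trade-offs."
    )

def recommend(game_state: dict, player_action: str, call_llm) -> str:
    # `call_llm` is a placeholder for the actual LLM client function.
    return call_llm(build_prompt(game_state, player_action))

# Example with a stubbed model call:
print(recommend(
    {"day": 2, "open_shelters": 1, "meals_in_stock": 120},
    "delay kitchen construction",
    call_llm=lambda prompt: "(model output here)",
))
```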

Structuring Tasks in the Grid-Based System
To effectively integrate tasks into the Grid-Based System, we focused on mapping game mechanics to real-world disaster response scenarios. This required defining environmental triggers, event layers, and emergency decision points, ensuring a dynamic and context-aware task structure.
Defined Task Hierarchy
Each day, players handle fixed Daily Basic Tasks, and starting from the second day, emergency events are dynamically triggered by environmental factors, grid attributes, and resource demands, simulating real-world crisis unpredictability.
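A small sketch of this task hierarchy is shown below, with fixed Daily Basic Tasks plus emergencies triggered from Day 2 onward; the specific task names, trigger rules, and thresholds are illustrative placeholders rather than the project's final design.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    kind: str   # "daily" or "emergency"

# Placeholder daily tasks; the real set comes from the game design document.
DAILY_BASIC_TASKS = [Task("file feeding report", "daily"),
                     Task("update shelter headcount", "daily")]

def triggered_emergencies(day: int, grid_events: list, resource_levels: dict) -> list:
    """From Day 2 on, turn environment- and resource-driven conditions into
    emergency tasks (trigger rules are illustrative)."""
    tasks = []
    if day < 2:
        return tasks
    for event in grid_events:                      # e.g. output of GridMap.generate_events
        tasks.append(Task(f"respond to {event['type']} at {event['cell']}", "emergency"))
    if resource_levels.get("meals", 0) < 100:      # resource-demand trigger
        tasks.append(Task("request additional meal supplies", "emergency"))
    return tasks

def tasks_for_day(day: int, grid_events: list, resource_levels: dict) -> list:
    return DAILY_BASIC_TASKS + triggered_emergencies(day, grid_events, resource_levels)
```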

Task Distribution & Game Interaction Design
With the task hierarchy established, the next step is refining task distribution and defining the interaction flow for different task types. This phase is currently in progress, focusing on designing seamless game interactions and ensuring AI provides adaptive decision-making support.
Next Step

Self-reflection
Driving Design Through Technical Exploration
Understanding the underlying technology allows designers to make informed decisions that enhance both usability and system functionality. By delving into AI-driven decision support, task distribution mechanisms, and adaptive interactions, I was able to align design choices with technical capabilities, ensuring a seamless and efficient user experience.
