Que.
The Associative External Brain.
"What if your digital memories could find you, instead of you fighting to find them?"
The "Save-and-Forget" Epidemic
"A few months ago, I saved a link to an amazing vintage store. Fast forward to last week, sitting in my car, I pulled out my phone, but my mind went completely blank. I knew I saved it, but I couldn't remember the title. I was digging through hundreds of bookmarks, physically so close, yet digitally lost in my own clutter."
Research shows that the moment we click "save", our brains offload the memory: we stop processing the information internally.
Navigating the "App Islands".
Through in-depth interviews with seven heavy information users, I mapped out the "Moment of Failure." I found that users don't forget that they saved something; they forget where they saved it and what it was called.
We are scattered across "App Islands": Instagram for aesthetics, WeChat for articles, LinkedIn for professional growth. This fragmentation creates a massive cognitive load. I identified seven "Memory Anchors" (cues like color, time, and the mood of the moment) that stay in our heads long after the title of a webpage is forgotten. These anchors became the foundation of my search logic.
1. The "Black Hole"
Users save obsessively but refuse to organize. The list becomes a graveyard.
"I have 25 pinned tabs right now because if I put them in bookmarks, I'll forget they exist. My saves are a total mess."
2. Context Over Keywords
Memory isn't text-based. People recall the vibe, the color, or the time of day.
"If I forget the title, I can usually only remember the main color of the image, or 'what was that thing I looked at yesterday evening?'"
3. The iMessage Workaround
To avoid losing links, users forward them to chat apps, turning conversations into search engines.
"I forward the ones I actually want to make to my roommate on iMessage so we can find them in our chat history."
The Que Workflow.
Based on the research, I mapped out a unified data flow. Que eliminates manual sorting by intercepting content at the point of "Adding", filtering it into the Library, and providing multiple intuitive output nodes.
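As a rough illustration of that flow (the type and function names below are hypothetical, not Que's shipped data model), the capture-to-retrieval path can be sketched in a few lines of Swift:

```swift
import Foundation

// Hypothetical sketch of Que's capture → Library → retrieval flow.
// Type and function names are illustrative, not the shipped data model.
struct SavedItem {
    let url: URL
    var summary = ""        // filled in later by the background pipeline
    var category = ""       // auto-assigned, never typed by the user
    let savedAt = Date()
}

// "Adding": content is intercepted at the Share Sheet, never entered manually.
func capture(_ url: URL) -> SavedItem {
    SavedItem(url: url)
}

// Filtering: the item is filed into the Library with no manual sorting.
func file(_ item: SavedItem, into library: inout [SavedItem]) {
    library.append(item)
}

// Output nodes: browse by category, by time, or describe the item in your own words.
func retrieve(_ query: String, from library: [SavedItem]) -> [SavedItem] {
    library.filter {
        $0.summary.localizedCaseInsensitiveContains(query) ||
        $0.category.localizedCaseInsensitiveContains(query)
    }
}
```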
The Mid-Fi Reality Check.
I built a Mid-Fi prototype focusing on traditional folder categorization and rigid keyword search. The usability test revealed massive friction.
Critical Feedback:
- Flat Information Hierarchy: Users felt overwhelmed when forced to manually assign categories during the "save" action.
- High Interaction Friction: Leaving Instagram/Safari to save a link broke their natural browsing state.
The Pivot: I realized I needed a zero-friction capture system (Native Share Sheet) and an AI-driven contextual search.
Bridging the gap effortlessly.
Que integrates directly into your phone's native sharing ecosystem. While browsing Instagram, just hit the standard 'Share' button: you capture the inspiration the exact second it hits you, without leaving the app.
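A simplified sketch of that Share Sheet hand-off, written as an iOS share-extension entry point. This is illustrative only; `QueCaptureQueue` is a hypothetical stand-in for the real hand-off into the app:

```swift
import UIKit
import UniformTypeIdentifiers

/// Hypothetical hand-off point; in the real app this would write into a shared
/// App Group store that the main app's background pipeline picks up.
enum QueCaptureQueue {
    static func enqueue(_ url: URL) {
        print("Captured:", url.absoluteString)
    }
}

// Simplified share-extension entry point: grab the URL the user just shared
// from Instagram or Safari, hand it off, and dismiss immediately so the user
// never leaves their browsing flow.
final class ShareViewController: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        guard
            let item = extensionContext?.inputItems.first as? NSExtensionItem,
            let provider = item.attachments?.first(where: {
                $0.hasItemConformingToTypeIdentifier(UTType.url.identifier)
            })
        else {
            extensionContext?.completeRequest(returningItems: nil, completionHandler: nil)
            return
        }

        provider.loadItem(forTypeIdentifier: UTType.url.identifier, options: nil) { [weak self] data, _ in
            if let url = data as? URL {
                QueCaptureQueue.enqueue(url)
            }
            DispatchQueue.main.async {
                self?.extensionContext?.completeRequest(returningItems: nil, completionHandler: nil)
            }
        }
    }
}
```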
Que quietly summarizes the content, categorizes it, and tags the time and location. Zero cognitive load: a perfectly organized external memory, without ever feeling like a librarian.
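The shape of that background step, sketched below; the `enrich` function and `EnrichedMemory` record are illustrative, and the keyword heuristic is a naive stand-in for the AI classifier, not the real one:

```swift
import Foundation

// Stand-in for Que's background enrichment: the real summarizer and classifier
// are model-driven, but the shape of the step is the same. Names are illustrative.
struct EnrichedMemory {
    let url: URL
    let summary: String
    let category: String
    let savedAt: Date
    let place: String?      // coarse location name, if the user granted access
}

func enrich(url: URL, scrapedText: String, place: String?) -> EnrichedMemory {
    // Naive keyword heuristic standing in for the AI categorizer.
    let category: String
    if scrapedText.localizedCaseInsensitiveContains("recipe") {
        category = "Food"
    } else if scrapedText.localizedCaseInsensitiveContains("jacket") {
        category = "Fashion"
    } else {
        category = "Articles"
    }

    // First sentence as a placeholder summary; the real step would ask the model.
    let summary = scrapedText.split(separator: ".").first.map { String($0) } ?? scrapedText

    return EnrichedMemory(url: url, summary: summary, category: category,
                          savedAt: Date(), place: place)
}
```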
Designed to understand how you speak: just type 'that aesthetic coffee shop article' or 'the red jacket from last winter.' You describe what's in your head, and the idea is instantly back.
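One way to approximate this kind of contextual matching on-device is sentence embeddings. The sketch below uses Apple's NaturalLanguage framework with invented library entries; Que's production ranking model may differ:

```swift
import NaturalLanguage

// Approximate "describe it the way you remember it" search with on-device
// sentence embeddings. Illustrative sketch; Que's production ranking may differ.
struct Memory {
    let title: String
    let enrichedDescription: String   // summary + color + time-of-day anchors
}

func search(_ query: String, in memories: [Memory]) -> [Memory] {
    guard let embedding = NLEmbedding.sentenceEmbedding(for: .english) else { return memories }

    // Lower cosine distance means closer meaning; rank the library by proximity.
    return memories
        .map { ($0, embedding.distance(between: query, and: $0.enrichedDescription, distanceType: .cosine)) }
        .sorted { $0.1 < $1.1 }
        .map { $0.0 }
}

// Illustrative library entries and a vague, human-style query.
let library = [
    Memory(title: "Third-wave cafés worth a detour",
           enrichedDescription: "article about an aesthetic coffee shop, beige tones, saved on a Sunday morning"),
    Memory(title: "Outerwear sale roundup",
           enrichedDescription: "red puffer jacket product page, saved last winter")
]
print(search("that aesthetic coffee shop article", in: library).first?.title ?? "no match")
```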
Vibe Coding a Living Prototype.
Bridging Design & Code
Designing a feature is one thing; proving it works in reality is another. I bypassed traditional static mockups and built a fully functional Alpha prototype in SwiftUI, "vibe coding" it with Cursor.
Invisible Context Scraping
To make retrieval truly effortless, the engine runs silently in the background when a user hits "Share." It extracts high-level metadata, visual anchors, and text body, transforming raw URLs into structured, searchable memories without exposing the complex scraping logic to the user.
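A hedged sketch of that scraping step using Apple's LinkPresentation framework; the `ScrapedContext` record and its fields are illustrative rather than Que's actual schema, and body-text extraction would need a separate parser:

```swift
import Foundation
import LinkPresentation

// Sketch of the invisible scraping step: a raw shared URL goes in, a structured,
// searchable record comes out. `ScrapedContext` and its fields are illustrative.
struct ScrapedContext {
    let title: String?
    let canonicalURL: URL?
    let hasPreviewImage: Bool    // the preview image feeds visual anchors like dominant color
    let capturedAt: Date
}

func scrape(_ url: URL, completion: @escaping (ScrapedContext) -> Void) {
    let provider = LPMetadataProvider()
    provider.startFetchingMetadata(for: url) { metadata, _ in
        completion(ScrapedContext(
            title: metadata?.title,
            canonicalURL: metadata?.url,
            hasPreviewImage: metadata?.imageProvider != nil,
            capturedAt: Date()
        ))
    }
}
```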
Flipping the Metric & Pushing Boundaries.
I tested the functional prototype in a real-world scenario: users saved 10 items in the morning and, 12 hours later, retrieved them using only vague descriptions. The new AI-driven architecture delivered substantial improvements in our core UX metrics. I then challenged Que with two polar-opposite scenarios to test its semantic boundaries.
"How can I see through thick smoke?"
I fed it a 20-page research paper on infrared drone physics. The system proved its ability to perform Concept Mapping, bridging the gap between a layman's query and professional scientific data ("LWIR sensors").
"Find the laptop for 8K editing on a beach."
Using the technical specs of the MacBook Neo, I tested Que's Contextual Reasoning. It successfully identified that "editing on a beach" requires specific hardware strengths like peak brightness and thermal efficiency.
Meet Que.
A Silent Copilot.
"Building Que taught me that technology should not replace memory, but act as a silent extension of it. The real complexity lies in designing it to be effortless."