Ghosted
Designed and built an AI-assisted workflow product to help technical job seekers manage applications, documents, interviews, and follow-ups through a single local-first data model.
- Go
- Bubble Tea
- Lip Gloss
- JSON
Highlights
- Designed and built a working prototype for developers managing fragmented job-search workflows
- Created a keyboard-first TUI plus scriptable CLI so humans and AI agents could work from the same local data model
- Modeled an 8-stage application pipeline with document links, interview details, contacts, and follow-up actions
- Built JSON-based agent handoffs for parsing postings, generating documents, reviewing quality, and updating records
- Used the project to study how product designers can reason more clearly about engineering constraints
Context
Ghosted began as a product-design exercise disguised as a terminal app. I wanted to understand the kind of tooling developers reach for when a workflow is repetitive, private, and spread across too many surfaces: job boards, notes, emails, resumes, cover letters, interviews, and follow-up reminders.
Rather than sketching the concept only in Figma, I built a working prototype in Go. That forced the design decisions to collide with implementation details: state transitions, file formats, keyboard navigation, command naming, error tolerance, and the boundary between what an AI agent should automate and what a person should approve.
Problem
The modern job search is a high-volume workflow with low tolerance for losing context. A developer may save a posting in one browser tab, tailor a resume in another tool, track a recruiter in email, and remember follow-ups in a notes app. The more applications they manage, the clearer it becomes that the real product problem is coordination.
I framed the problem around three jobs:
- Keep every application in one durable pipeline.
- Let the user move quickly without leaving the keyboard.
- Use AI to reduce repetitive document and parsing work without removing human control.
My Role
I designed and built Ghosted myself. That matters to the story: the goal was not only to ship an app, but to become a better bridge between product design and engineering by feeling the constraints directly.
I owned the workflow model, terminal interaction design, command interface, data structure, document pipeline, and AI-assisted development process. Claude Code acted as a coding collaborator, but I made the product decisions: what state needed to exist, what should stay local, where automation should stop, and how much control the user needed before generated materials became part of the tracker.
Design Constraints
- Local-first data: Job-search data can be private and messy, so the first version uses plain JSON files instead of a hosted account system.
- Keyboard-first UX: The primary user is comfortable in a terminal, so navigation and status changes needed to be fast without feeling cryptic.
- Scriptable automation: The CLI accepts and returns JSON so AI agents and shell scripts can work against the same model as the TUI.
- Human review: AI can parse, draft, and score documents, but the product needed dry-run and approval paths before automation updates the tracker.
- Portable documents: Resume and cover-letter outputs needed predictable naming so files could be reviewed, opened, and attached later.
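The portable-documents constraint comes down to a deterministic naming function. A minimal sketch, assuming a hypothetical slug-based convention (the real naming scheme may differ):

```go
package main

import (
	"fmt"
	"strings"
)

// docName builds a predictable output filename from company, role, and
// document kind. The exact convention here is an assumption; the point is
// that every generated file sorts, greps, and attaches the same way.
func docName(company, role, kind string) string {
	slug := func(s string) string {
		s = strings.ToLower(strings.TrimSpace(s))
		return strings.ReplaceAll(s, " ", "-")
	}
	return fmt.Sprintf("%s_%s_%s.pdf", slug(company), slug(role), kind)
}

func main() {
	// → acme-corp_staff-engineer_resume.pdf
	fmt.Println(docName("Acme Corp", "Staff Engineer", "resume"))
}
```

Because the name is a pure function of the record, the TUI, the CLI, and any agent can all locate the same file without extra bookkeeping.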
Pipeline Stages
- 01 Saved
A role is captured for later review before the user commits to applying.
- 02 Applied
The submitted role becomes part of the active pipeline with document links.
- 03 Screening
The application has moved into initial recruiter or company review.
- 04 Interview
The detail view tracks interview type, date, contacts, and notes.
- 05 Offer
The opportunity is separated from ordinary active applications.
- 06 Accepted / Closed
The pipeline preserves final outcomes, including rejected and withdrawn states.
The state model made the product feel more like a recruiting workflow than a to-do list.
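The stage model above can be sketched as a small enum plus a record struct. The six numbered stages plus the rejected and withdrawn outcomes give the eight states; the identifiers and JSON field names here are my assumptions, not Ghosted's actual source:

```go
package main

import "fmt"

// Stage models the application pipeline as an ordered enum, so number keys
// and reports can address stages directly.
type Stage int

const (
	Saved Stage = iota + 1
	Applied
	Screening
	Interview
	Offer
	Accepted
	Rejected
	Withdrawn
)

func (s Stage) String() string {
	names := [...]string{"Saved", "Applied", "Screening", "Interview",
		"Offer", "Accepted", "Rejected", "Withdrawn"}
	if s < Saved || s > Withdrawn {
		return "Unknown"
	}
	return names[s-1]
}

// Application sketches one tracked record in the JSON store.
type Application struct {
	Company   string   `json:"company"`
	Role      string   `json:"role"`
	Stage     Stage    `json:"stage"`
	Documents []string `json:"documents,omitempty"`
	Notes     string   `json:"notes,omitempty"`
}

func main() {
	app := Application{Company: "Acme", Role: "Backend Engineer", Stage: Screening}
	fmt.Printf("%s @ %s: %s\n", app.Role, app.Company, app.Stage)
}
```

Keeping the stage a small closed enum, rather than free-form labels, is what makes single-keypress updates and pipeline reporting cheap.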
Key Product Decisions
TUI plus CLI, not one interface. The TUI is for review, browsing, and daily updates. The CLI is for automation, scripts, and AI agents. Both share the same JSON store, which kept the prototype simple while still supporting two very different interaction modes.
JSON as the product contract. I chose a plain data format because it made the tool inspectable, version-controllable, and agent-ready. It also helped me reason about the boundary between interface design and underlying product state.
Eight pipeline stages. The model needed enough fidelity to match a real hiring process without becoming a project-management system. The stages are explicit enough for reporting and quick updates, but simple enough to move through with number keys.
Review before automation. The agent pipeline supports dry runs and approval flags because document generation is high leverage but easy to get wrong. I wanted AI to accelerate the repetitive parts, not silently alter the user’s source of truth.
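The review-before-automation decision can be reduced to one rule: a dry run must never mutate the record. A sketch of that gate, with the update shape and wording as my assumptions:

```go
package main

import "fmt"

// Application is a minimal record for this sketch.
type Application struct {
	Company string
	Stage   string
	Resume  string
}

// Update describes a change an agent proposes, e.g. attaching a generated
// resume and advancing the stage. The shape is illustrative.
type Update struct {
	Stage  string
	Resume string
}

// Apply either previews or commits an agent-proposed change. With dryRun set,
// the original record is never mutated; the caller only sees what would
// happen, mirroring the review-before-automation constraint.
func Apply(app *Application, u Update, dryRun bool) string {
	preview := fmt.Sprintf("%s: stage %s -> %s, resume %q",
		app.Company, app.Stage, u.Stage, u.Resume)
	if dryRun {
		return "DRY RUN " + preview
	}
	app.Stage = u.Stage
	app.Resume = u.Resume
	return preview
}

func main() {
	app := Application{Company: "Acme", Stage: "saved"}
	fmt.Println(Apply(&app, Update{Stage: "applied", Resume: "acme_resume.pdf"}, true))
	fmt.Println("stage still:", app.Stage) // unchanged after the dry run
}
```

Because both paths produce the same preview string, the user approves exactly what will be written, not a paraphrase of it.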
Interaction Details
The interaction model is intentionally small:
- Arrow keys and vim-style movement support quick scanning.
- Number keys update an application’s stage without opening a form.
- The u key starts from a job posting URL and moves into the fetch/apply workflow.
- The o key opens the document folder for the selected application.
- Detail views expose salary, contacts, interviews, document links, notes, and follow-ups without overwhelming the list view.
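The dispatch behind those bindings is small. The real app routes keys through Bubble Tea's Update function; this stripped-down sketch (state shape and clamping behavior are my assumptions) shows the same idea without the framework:

```go
package main

import "fmt"

// model is a stripped-down version of the list state the TUI would hold.
type model struct {
	cursor int
	stages []int // stage number per application, 1-8
}

// handleKey maps a single keypress to a state change. Vim-style j/k move the
// cursor (clamped to the list), and the digits 1-8 set the selected
// application's stage directly, so common updates never open a form.
func (m *model) handleKey(key string) {
	switch key {
	case "j", "down":
		if m.cursor < len(m.stages)-1 {
			m.cursor++
		}
	case "k", "up":
		if m.cursor > 0 {
			m.cursor--
		}
	case "1", "2", "3", "4", "5", "6", "7", "8":
		m.stages[m.cursor] = int(key[0] - '0')
	}
}

func main() {
	m := model{stages: []int{1, 1, 3}}
	m.handleKey("j") // move to the second application
	m.handleKey("4") // mark it as stage 4 (Interview)
	fmt.Println(m.cursor, m.stages)
}
```

The whole interaction surface stays a pure function of (key, state), which is what makes a keyboard-first TUI feel predictable.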
AI Workflow
- 01 Fetch posting
The user provides a URL or saved posting, then Ghosted stores a local markdown reference.
- 02 Parse role
An agent extracts company, role, location, compensation, and requirements into structured data.
- 03 Generate drafts
Resume and cover-letter agents create tailored Typst documents for the selected role.
- 04 Review quality
A review agent scores outputs and flags weak documents before they are accepted.
- 05 Approve update
The user can dry-run, revise, or auto-approve before the tracker is changed.
- 06 Compile files
Typst outputs become predictably named PDFs linked back to the application.
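The review step (04) gates the rest of the flow. A sketch of that gate, assuming a hypothetical 0-100 score and flag list; the actual scoring scheme may differ:

```go
package main

import "fmt"

// Review is a sketch of what the review agent hands back before approval.
type Review struct {
	Score int      // assumed 0-100 quality score
	Flags []string // issues that block auto-approval
}

// Approved reports whether a generated document can enter the tracker without
// human revision: a high enough score and no outstanding flags. Anything else
// falls back to the dry-run/revise path.
func (r Review) Approved(threshold int) bool {
	return r.Score >= threshold && len(r.Flags) == 0
}

func main() {
	weak := Review{Score: 55, Flags: []string{"generic opening paragraph"}}
	strong := Review{Score: 88}
	fmt.Println(weak.Approved(75), strong.Approved(75))
}
```

Making the gate an explicit predicate, rather than burying it in the generation step, is what keeps the human checkpoint visible.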
The key design question was not "Can AI do this?" It was "Where should the human stay in control?"
The agent pipeline includes four roles:
- Posting Parser extracts structured information from job descriptions.
- Resume Agent drafts a tailored resume based on the role.
- Cover Letter Agent drafts a targeted cover letter.
- Review Agent scores the documents and provides feedback before approval.
Building With Claude Code
Claude Code helped accelerate the implementation, but the collaboration worked best when I gave it product constraints instead of vague build requests. The most useful loops were around the data model, Bubble Tea view architecture, CLI command structure, documentation, and agent handoffs.
That process changed how I think about AI-assisted design work. The value was not simply speed. It was being able to explore implementation constraints while keeping enough product judgment to reject, simplify, or redirect the generated code.
Outcome And Learning
Ghosted shipped as an open-source working prototype. More importantly, it gave me a concrete model for designing AI-assisted tools:
- Shared data contracts matter more than flashy automation.
- Keyboard UX is only good when the underlying state model is clear.
- AI workflows need visible review points, failure tolerance, and user control.
- Building the prototype myself made the design tradeoffs more honest.
The next validation step would be testing the workflow with other technical job seekers: where they hesitate, what they trust, what feels too automated, and which parts of the pipeline are worth turning into a more polished product surface.
Roadmap
- Add a proactive assistant for stale applications, follow-up timing, and outreach drafts.
- Improve document attachment and audit trails inside the TUI.
- Add stronger failure states for missing job-board data, weak generated documents, and ambiguous company metadata.
- Explore a visual companion surface for users who want the workflow model without living entirely in the terminal.