Nicholas Clooney

Building ProjectDawn with Claude and Codex: An AI-Assisted iOS Devlog Deep Dive

I've been building a habit-logging iOS app called ProjectDawn. Not because the App Store needs another habit tracker, but because I wanted a personal project that was genuinely mine and open source, and one that could answer a question honestly: what does it feel like to build a real, modular, native iOS app with AI as a primary collaborator?

This post is part personal log, part technical retrospective. It covers the tools I used, what surprised me, where the AI fell flat, and the biggest shifts in how I think about building things now.

What Is ProjectDawn?

TODO: pics here pls

ProjectDawn is a daily habit logging app with a simple, opinionated premise: if it's on the timeline, it happened. No reminders, no streaks, no gamification. Just a vertical timeline for your day and a tray of habits you can drag onto it. Logging a habit is a physical gesture: drag, drop, done.

The timeline snaps to 15-minute slots. Each placed habit becomes an instance you can resize by dragging its bottom edge. Swipe left or right to navigate between days. The habit tray collapses into a persistent strip at the bottom of the screen and expands into a full library when you need it.
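The snapping itself boils down to tiny index arithmetic. A minimal sketch (the `slotIndex`/`slotStart` names are illustrative, not ProjectDawn's actual API):

```swift
import Foundation

// 24 hours × 4 slots per hour = 96 fifteen-minute slots per day.
let slotsPerDay = 96
let slotDuration: TimeInterval = 15 * 60

/// Maps an offset from midnight (in seconds) to a slot index in 0..<96,
/// clamping out-of-range values to the first or last slot.
func slotIndex(forSecondsSinceMidnight seconds: TimeInterval) -> Int {
    min(max(Int(seconds / slotDuration), 0), slotsPerDay - 1)
}

/// Maps a slot index back to the offset (in seconds) at which that slot starts.
func slotStart(_ slot: Int) -> TimeInterval {
    TimeInterval(slot) * slotDuration
}
```

With that, a finger lifted at 09:20 resolves to slot 37, which starts at 09:15, so the logged instance always lands on a clean 15-minute boundary.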

It's a small app with a focused scope, but the interactions between the tray and the timeline are surprisingly nuanced, which made it a good test subject for AI-assisted development.

The Tech Stack

Claude + Codex: A Division of Labor

I used two different AI tools throughout this project, and the division emerged naturally from how each one felt to work with.

Claude is my planner. It's slower (sometimes noticeably so), but it thinks carefully, considers trade-offs, asks clarifying questions, and produces design decisions I can actually reason about. When I need a PRD, an architecture plan, or a bug analysis, Claude is what I reach for. It chews through tokens quickly on my Pro plan, but the quality of the output justifies it.

Codex (ChatGPT) is my driver. It's much faster, great at taking a clear spec and turning it into working code, and excellent at the kind of mechanical implementation work that would otherwise just be tedious. When Claude finishes a phase plan, Codex implements it.

The mental model I settled on: Claude is the senior engineer who sketches the architecture on a whiteboard; Codex is the dev who opens the IDE and makes it real. Neither could replace the other in this workflow, and the combination is genuinely more capable than either alone.

Mise + Tuist: Modular by Design

The project uses Tuist to split the app into individual modules, with mise managing the toolchain version. Every feature lives in its own module under Modules/:

Modules/
  Data/          <- SwiftData models (Habit, HabitInstance)
  DayView/       <- the main day scaffold and navigation
  Timeline/      <- the scrollable time grid
  HabitTray/     <- the expandable bottom sheet
  Interaction/   <- shared drag coordinator and helpers

The app target itself is a thin shell: it wires up the entry point, injects the SwiftData container, and delegates all UI to the feature modules.
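In sketch form, that shell looks something like the following. This is a hedged reconstruction, not the actual `ProjectDawnApp.swift`: the `DayView()` call and the exact modifier list are assumptions based on the module layout above.

```swift
import SwiftUI
import SwiftData
import Data     // Habit, HabitInstance models (assumed module API)
import DayView  // the root feature view (assumed module API)

@main
struct ProjectDawnApp: App {
    var body: some Scene {
        WindowGroup {
            // All UI is delegated to the DayView feature module.
            DayView()
        }
        // Inject the SwiftData container so feature modules can use
        // @Query and @Environment(\.modelContext) without extra wiring.
        .modelContainer(for: [Habit.self, HabitInstance.self])
    }
}
```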

The payoff of this structure: I can rebuild, test, and iterate on a single module without touching anything else. When Claude generates a phase plan, it maps cleanly to module boundaries. And when something breaks, the blast radius is contained.

The Docs Live in the Repo

One workflow decision that paid off: all design documents live inside the repo under Docs/.

Docs/
  PRD.md
  plan.md
  Implementation Plans/
    Phase3-Timeline.md
    Phase4-HabitTray.md
    Phase5-DragAndDrop.md
    Phase6-HabitInstancesOnTimeline.md
  bug-analysis-*.md
  bug-report-*.md
  future-ideas.md

The practical reason is that both Claude and Codex can read and write to these files directly. My usual setup is a tmux session with split panes: Claude in one, Codex in another. When Claude finishes a phase plan, it writes it to Docs/Implementation Plans/. Codex reads from there. When a bug surfaces, Claude writes the analysis to Docs/bug-analysis-<slug>.md and Codex can reference it without me re-explaining the problem.

This also means the design history is traceable and version-controlled alongside the code. Six months from now, I can read Phase4-HabitTray.md and understand exactly why the tray architecture looks the way it does, what alternatives were considered, and what was explicitly deferred. That's not something you get from commit messages alone.

The broader principle: keeping docs in the repo enables agents to collaborate across sessions and tools, creates a shared ground truth that survives context window resets, and makes it practical to split work across multiple agents (or agent types) without losing continuity. If you're working with AI on anything non-trivial, this is worth setting up early.

My Workflow, Step by Step

My personal preference throughout this project: always review the AI's work, especially during the design phase. The AI drafts; I approve. Here's how that plays out end to end.

1. Spitball and clarify with both Claude and Codex. Before any code is written, I use both to think out loud. What is this app actually for? Who is the user? What does the core interaction feel like? Bouncing ideas between the two surfaces different angles quickly, and the back-and-forth helps compress a fuzzy idea into something I can actually define. The output of this phase is a rough set of user stories and behaviors for the MVP.

2. UI mockups and component decisions. Both Claude and Codex can produce rough UI mockups directly in their chat interfaces, which is useful for validating layout ideas fast. More importantly, this is where I lock in the native component decisions: which SwiftUI primitives to reach for and which to avoid. For ProjectDawn, this is where the choice to use a persistent .sheet for the habit tray was made (and, as it turned out, where a future footgun was quietly loaded).

3. PRD. Once the concept is validated, I ask Claude to write a formal Product Requirements Document. I read it carefully, push back where something is wrong or missing, and iterate until it accurately reflects what I want to build. This document becomes the north star for everything that follows.

4. Master plan. The PRD feeds into a phased master plan. Claude produces it; I review it phase by phase, checking that the sequencing makes sense and that dependencies between features are accounted for. This lives in Docs/plan.md.

5. Per-phase implementation plans. Before each phase of implementation starts, I ask Claude to write a detailed implementation plan: module design, file layout, key decisions, alternative approaches considered and rejected, and often starter code snippets that serve as guardrails for Codex. I review each one and drop it into Docs/Implementation Plans/. These are the documents Codex actually works from.

6. Codex implements. With a clear implementation plan in hand, Codex does the heavy lifting. The plan is specific enough that it rarely goes sideways. When it does, having the plan as a reference makes it easy to diagnose where Codex drifted from the intent.

7. Review. Code review happens one of a few ways depending on what I'm looking at: reading the diff myself, running the project and feeling the interaction, or asking Claude to review the output against the implementation plan. For complex or risky phases, I do all three.

Structure Guides Quality

One thing that took me a while to appreciate: the structure you set up around AI-generated code has a big effect on the quality of what comes out.

For this personal project, the bar is deliberately lower. But for production-grade projects, I set up linters, formatters, and automation before writing a line of code: SwiftFormat and SwiftLint for style and idioms, CI/CD pipelines, and Danger to enforce test coverage and flag undocumented changes. When those guardrails are in place, the AI's output has to pass them too. It produces more consistent code not because you asked nicely, but because the tools enforce it automatically.

The lesson: if you want AI-generated code that meets a certain standard, make that standard enforceable by tooling, not just by eye.
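To give that a concrete flavor, here's a minimal `Dangerfile.swift` in the danger-swift style. The thresholds and messages are illustrative, not the rules from my production setup:

```swift
// Dangerfile.swift — run in CI on every PR by danger-swift.
import Danger

let danger = Danger()

// Flag oversized PRs that are hard to review carefully.
let touchedFiles = danger.git.modifiedFiles + danger.git.createdFiles
if touchedFiles.count > 30 {
    warn("This PR touches \(touchedFiles.count) files — consider splitting it.")
}

// Require a test change whenever non-test Swift sources change.
let changedSources = touchedFiles.filter { $0.hasSuffix(".swift") && !$0.contains("Tests") }
let changedTests = touchedFiles.filter { $0.contains("Tests") }
if !changedSources.isEmpty && changedTests.isEmpty {
    fail("Source files changed but no tests were added or updated.")
}
```

The point isn't these particular rules; it's that the AI's output has to clear the same automated bar as any human contributor's.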

The Part That Actually Works (and Impressed Me)

The interaction that surprised me most was the drag coordination between the habit tray and the timeline, and the fact that it works cleanly across two separate modules.

When a user drags a habit pill from the tray, the gesture originates inside HabitTray. But the drop target (the time slot grid) lives inside Timeline. These are different modules, compiled as separate static frameworks, with no direct dependency between them. The app needs to somehow pass "a habit is being dragged, and it's currently hovering over slot 34" from one side to the other in real time.

The solution is a shared HabitDragCoordinator in the Interaction module, an @Observable class that both HabitTray and Timeline can read from, injected into the environment by the app root:

```swift
import CoreGraphics
import Data
import Observation

@Observable
public final class HabitDragCoordinator {
    public var draggedHabit: Habit?
    public var dragLocation: CGPoint = .zero
    public var pendingDrop = false

    public var isActive: Bool {
        draggedHabit != nil
    }

    public init() {}

    public func begin(habit: Habit, at location: CGPoint) {
        draggedHabit = habit
        dragLocation = location
        pendingDrop = false
    }

    public func move(to location: CGPoint) {
        dragLocation = location
    }

    public func drop() {
        pendingDrop = true
    }

    public func end() {
        draggedHabit = nil
        dragLocation = .zero
        pendingDrop = false
    }
}
```

HabitTrayView calls coordinator.begin(habit:at:) when a long-press gesture starts and coordinator.move(to:) as the finger moves. DayView (sitting above both in the hierarchy) observes dragCoordinator.dragLocation and translates the screen-space point into a timeline slot:

```swift
.onChange(of: dragCoordinator.dragLocation) { _, location in
    guard dragCoordinator.isActive else { return }
    hoveredSlot = slot(for: location)
}
.onChange(of: dragCoordinator.pendingDrop) { _, pendingDrop in
    guard pendingDrop else { return }
    handleDrop()
}
.onChange(of: hoveredSlot) { _, newSlot in
    guard dragCoordinator.isActive, let slot = newSlot, slot != lastHapticSlot else { return }
    hapticSnap()
    lastHapticSlot = slot
}
.sheet(isPresented: .constant(true)) {
    HabitTrayView()
}
```

The result: dragging a pill from the tray causes time slots in the timeline to highlight in real time, with a haptic snap on each slot transition. When the finger lifts, DayView reads the hovered slot, computes the exact timestamp, and inserts a HabitInstance into SwiftData. The tray doesn't know about the timeline. The timeline doesn't know about the tray. DayView acts as the coordinator of coordinators.
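The `slot(for:)` translation is plain geometry. A sketch under assumed layout constants (the real DayView also accounts for scroll offset and safe areas, which I'm ignoring here):

```swift
import Foundation

// Assumed layout constants, for illustration only.
let slotHeight: Double = 24       // points per 15-minute slot
let timelineTopInset: Double = 0  // y-coordinate of slot 0 in the shared space
let slotCount = 96

/// Translates a drag location's y-coordinate into a slot index,
/// or nil when the finger is outside the timeline.
func slot(forY y: Double) -> Int? {
    let offset = y - timelineTopInset
    guard offset >= 0 else { return nil }
    let index = Int(offset / slotHeight)
    return index < slotCount ? index : nil
}
```

Returning nil for out-of-bounds points is what lets DayView cancel the drop cleanly when the pill is released back over the tray.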

I did not write a single line of that wiring. Claude designed the architecture; Codex implemented it. The fact that it works the first time you run it, with the haptics and the highlight and the drop all feeling right, was genuinely one of those moments where you look at the screen and think, we actually built something.

What I Learned Along the Way

Claude Is a Great Planner, but It's Slow

When I say Claude is a planner, I mean it produces real design documents. Here's a sample from the Phase 4 plan it wrote for the Habit Tray, covering module boundaries, the presentationDetents decision, layout constants, and the rationale for each choice:

# Phase 4 — Habit Tray: Implementation Plan

## Architectural Decision: New `HabitTray` Module

Create `Modules/HabitTray` (do not expand `DayView`). The reasoning mirrors the Phase 3 decision to split Timeline:

1. **Responsibility**: The tray owns substantial independent state — expanded/collapsed toggle, `@Query` over all `Habit` objects, sort order by frequency, add-habit form presentation. Placing this in `DayView` would make that file responsible for layout orchestration, timeline paging, date navigation, and habit management all at once.
2. **Phase 5 drag coordination**: When drag & drop arrives, `DayView` will thread a `hoveredSlot: Binding<Int?>` between `HabitTrayView` and `DayTimelineView`. This binding already exists on `DayTimelineView` (defaulting to `.constant(nil)`). The boundary is clean: `DayView` owns the shared drag state; `HabitTray` and `Timeline` are consumers. This is the same pattern already planned in Phase 3.
3. **Mac reuse (Phase 9)**: The menu bar popover uses the same `DayView` root. If the tray is a separate module, Phase 9 can tune `HabitTrayView` for the compact 320pt popover width without touching `DayView` or `Timeline`.
4. **Testability**: Frequency-sort logic and hex-to-Color parsing can be unit-tested in `HabitTrayTests` in isolation.

**Dependency chain after Phase 4:**

```
ProjectDawn (app) → DayView → HabitTray → Data
                            → Timeline  → Data
```

---

## Naming Notes

- The module is `HabitTray`; the public entry point view is `HabitTrayView`. No collision risk with SwiftUI system types.
- The add-habit form is `HabitFormView` (internal to `HabitTray`). Phase 7 may promote it if edit needs to be triggered from elsewhere.
- `HabitPillView` is a shared internal component used by both the collapsed and expanded states.
- The color helper is `Color+Hex.swift` — a `Color` extension, internal to the module.

> **Implementation note:** `HabitLibrarySheet` was removed during implementation. The collapsed strip and expanded grid were unified into a single always-on sheet in `HabitTrayView` using `presentationDetents(selection:)`. `DayView` presents it via `.sheet(isPresented: .constant(true))`. `presentationBackgroundInteraction(.enabled(upThrough: collapsedDetent))` allows timeline interaction while the tray is collapsed.

---

## Geometry / Layout Constants

```swift
// HabitTrayLayout.swift (internal enum)
enum HabitTrayLayout {
    static let collapsedHeight: CGFloat = 80  // tray strip height (matches Phase 3 placeholder)
    static let pillHeight: CGFloat = 44       // capsule height in tray row
    static let pillHPadding: CGFloat = 12     // horizontal internal padding in pill
    static let pillHSpacing: CGFloat = 8      // spacing between pills in scroll row
    static let sheetFraction: CGFloat = 0.7   // presentationDetents(.fraction(0.7))
    static let gridColumns: Int = 3           // columns in library grid
    static let gridSpacing: CGFloat = 12
}
```

---

## Step-by-Step Plan

### Step 1 — Scaffold `HabitTray` module manifest

**Commit:** `chore(tuist): scaffold HabitTray module manifest`

- Create `Modules/HabitTray/Project.swift`:

```swift
import ProjectDescription
import ProjectDescriptionHelpers

let project = Project.module(
    name: "HabitTray",
    dependencies: [
        .project(target: "Data", path: "../Data"),
    ]
)
```

- Create the required empty directories: `Modules/HabitTray/Sources/`, `Modules/HabitTray/Resources/`, `Modules/HabitTray/Tests/` (Tuist globs these; they must exist).

---

### Step 2 — Wire into workspace and `DayView`

**Commit:** `chore(tuist): wire HabitTray into workspace and DayView`

- `Workspace.swift` — add `"Modules/HabitTray"` to the `projects` array.
- `Modules/DayView/Project.swift` — add `.project(target: "HabitTray", path: "../HabitTray")` to the `dependencies` array alongside the existing `Timeline` entry.
- Run `tuist generate` to verify the graph resolves cleanly.

---

### Step 3 — `Color+Hex.swift` (internal helper)

**Commit:** (bundled with Step 6)

Every `Habit` stores its color as a hex string (`colorHex: String`). The tray needs `hex → Color` for rendering and `Color → hex` when saving a custom-picked color.

```swift
// Modules/HabitTray/Sources/Color+Hex.swift
import SwiftUI

extension Color {
    /// Parses a 6-digit hex string (with or without leading `#`) into a Color.
    /// Returns `.accentColor` as a safe fallback for malformed input.
    init(hex: String) {
        let hex = hex.trimmingCharacters(in: CharacterSet.alphanumerics.inverted)
        var int: UInt64 = 0
        guard hex.count == 6, Scanner(string: hex).scanHexInt64(&int) else {
            self = .accentColor
            return
        }
        let r = Double((int >> 16) & 0xFF) / 255
        let g = Double((int >> 8) & 0xFF) / 255
        let b = Double(int & 0xFF) / 255
        self = Color(red: r, green: g, blue: b)
    }

    /// Converts this Color to a 6-character uppercase hex string.
    /// Returns nil if the color cannot be resolved to RGB components.
    func toHex() -> String? {
        #if canImport(UIKit)
        guard let components = UIColor(self).cgColor.components, components.count >= 3 else { return nil }
        #elseif canImport(AppKit)
        guard let components = NSColor(self).cgColor.components, components.count >= 3 else { return nil }
        #else
        return nil
        #endif
        let r = Int(components[0] * 255)
        let g = Int(components[1] * 255)
        let b = Int(components[2] * 255)
        return String(format: "%02X%02X%02X", r, g, b)
    }
}
```

The `#if canImport` guards keep `toHex()` working for both iOS and macOS (Phase 9 menu bar).

---

### Step 4 — `HabitTrayLayout.swift` + `HabitColor.swift` + `HabitPillView.swift` (internal)

**Commit:** `feat(HabitTray): HabitPillView — emoji + name capsule`

```swift
// Modules/HabitTray/Sources/HabitColor.swift
import Foundation

struct HabitColor {
    let name: String
    let hex: String

    static let presets: [HabitColor] = [
        HabitColor(name: "Coral", hex: "FF6B6B"),
        HabitColor(name: "Peach", hex: "FF9F43"),
        HabitColor(name: "Sun", hex: "FECA57"),
        HabitColor(name: "Mint", hex: "48DBAB"),
        HabitColor(name: "Sky", hex: "54A0FF"),
        HabitColor(name: "Lavender", hex: "A29BFE"),
        HabitColor(name: "Rose", hex: "FD79A8"),
        HabitColor(name: "Stone", hex: "B2BEC3"),
    ]
}
```

```swift
// Modules/HabitTray/Sources/HabitPillView.swift
import SwiftUI
import Data

struct HabitPillView: View {
    let habit: Habit

    private var backgroundColor: Color { Color(hex: habit.colorHex) }

    // Luminance-based contrast: white on dark colors, near-black on light.
    private var foregroundColor: Color {
        let hex = habit.colorHex.trimmingCharacters(in: .init(charactersIn: "#"))
        var int: UInt64 = 0
        Scanner(string: hex).scanHexInt64(&int)
        let r = Double((int >> 16) & 0xFF) / 255
        let g = Double((int >> 8) & 0xFF) / 255
        let b = Double(int & 0xFF) / 255
        let luminance = 0.2126 * r + 0.7152 * g + 0.0722 * b
        return luminance > 0.55 ? Color(white: 0.1) : .white
    }

    var body: some View {
        HStack(spacing: 4) {
            Text(habit.emoji)
            Text(habit.name)
                .lineLimit(1)
        }
        .font(.subheadline.weight(.medium))
        .foregroundStyle(foregroundColor)
        .padding(.horizontal, HabitTrayLayout.pillHPadding)
        .frame(height: HabitTrayLayout.pillHeight)
        .background(backgroundColor, in: Capsule())
    }
}
```

The luminance threshold of `0.55` correctly renders dark text on Sun (`#FECA57`) and Stone (`#B2BEC3`), and white text on Coral, Mint, Sky, Lavender, and Rose. It also handles arbitrary `ColorPicker` output.

---

### Step 5 — `ColorSwatchRowView.swift` (internal)

**Commit:** (bundled with Step 6)

```swift
// Modules/HabitTray/Sources/ColorSwatchRowView.swift
import SwiftUI

struct ColorSwatchRowView: View {
    @Binding var selectedHex: String
    @Binding var customColor: Color
    @Binding var useCustomColor: Bool

    var body: some View {
        ScrollView(.horizontal, showsIndicators: false) {
            HStack(spacing: 10) {
                ForEach(HabitColor.presets, id: \.hex) { preset in
                    Circle()
                        .fill(Color(hex: preset.hex))
                        .frame(width: 32, height: 32)
                        .overlay(
                            Circle().strokeBorder(
                                (!useCustomColor && selectedHex == preset.hex)
                                    ? Color.primary : Color.clear,
                                lineWidth: 2.5
                            )
                            .padding(-3)
                        )
                        .onTapGesture {
                            selectedHex = preset.hex
                            useCustomColor = false
                        }
                }

                // System ColorPicker rendered as a circle swatch
                ColorPicker("Custom", selection: $customColor, supportsOpacity: false)
                    .labelsHidden()
                    .frame(width: 32, height: 32)
                    .clipShape(Circle())
                    .overlay(
                        Circle().strokeBorder(
                            useCustomColor ? Color.primary : Color.clear,
                            lineWidth: 2.5
                        )
                        .padding(-3)
                    )
                    .onChange(of: customColor) { _, _ in useCustomColor = true }
            }
            .padding(.vertical, 4)
        }
    }
}
```

The `ColorPicker` is clipped to a circle so it matches the preset swatch row visually. Tapping it opens the system full-spectrum picker. `onChange` automatically switches `useCustomColor = true`.

---

### Step 6 — `HabitFormView.swift` (internal)

**Commit:** `feat(HabitTray): HabitColor + ColorSwatchRowView + HabitFormView`

```swift
// Modules/HabitTray/Sources/HabitFormView.swift
import SwiftUI
import SwiftData
import Data

struct HabitFormView: View {
    @Environment(\.modelContext) private var context
    @Binding var isPresented: Bool

    @State private var name: String = ""
    @State private var emoji: String = "⭐️"
    @State private var selectedColorHex: String = HabitColor.presets[0].hex
    @State private var customColor: Color = .accentColor
    @State private var useCustomColor: Bool = false
    @State private var defaultDuration: Int = 30

    private var isSaveDisabled: Bool { name.trimmingCharacters(in: .whitespaces).isEmpty }

    var body: some View {
        NavigationStack {
            Form {
                Section("Name") {
                    TextField("e.g. Morning Run", text: $name)
                }
                Section("Emoji") {
                    TextField("Emoji", text: $emoji)
                        .onChange(of: emoji) { _, new in
                            // Clamp to first grapheme cluster
                            if let first = new.unicodeScalars.first {
                                emoji = String(first.value > 0xFF ? String(new.prefix(2)) : String(new.prefix(1)))
                            }
                        }
                }
                Section("Color") {
                    ColorSwatchRowView(
                        selectedHex: $selectedColorHex,
                        customColor: $customColor,
                        useCustomColor: $useCustomColor
                    )
                }
                Section("Default Duration") {
                    Picker("Duration", selection: $defaultDuration) {
                        ForEach([15, 30, 45, 60, 90, 120], id: \.self) { mins in
                            Text("\(mins) min").tag(mins)
                        }
                    }
                    .pickerStyle(.wheel)
                    .frame(height: 120)
                }
            }
            .navigationTitle("New Habit")
            .navigationBarTitleDisplayMode(.inline)
            .toolbar {
                ToolbarItem(placement: .cancellationAction) {
                    Button("Cancel") { isPresented = false }
                }
                ToolbarItem(placement: .confirmationAction) {
                    Button("Save") { save() }
                        .disabled(isSaveDisabled)
                }
            }
        }
    }

    private func save() {
        let resolvedHex = useCustomColor
            ? (customColor.toHex() ?? selectedColorHex)
            : selectedColorHex
        let habit = Habit(
            name: name.trimmingCharacters(in: .whitespaces),
            emoji: emoji,
            colorHex: resolvedHex,
            defaultDuration: defaultDuration
        )
        context.insert(habit)
        isPresented = false
    }
}
```

**Emoji field note**: `TextField` allows multi-character input. The `onChange` handler trims to the first grapheme cluster. Most emoji are single scalars > U+00FF; skin-tone variants use sequences, so `prefix(2)` handles those. A dedicated emoji picker grid (Phase 7) will replace this.

---

### Step 7 — `HabitLibrarySheet.swift` (internal)

**Commit:** `feat(HabitTray): HabitLibrarySheet — expanded grid of all habits`

```swift
// Modules/HabitTray/Sources/HabitLibrarySheet.swift
import SwiftUI
import SwiftData
import Data

struct HabitLibrarySheet: View {
    @Query private var habits: [Habit]
    @Binding var isPresented: Bool
    let onAddHabit: () -> Void

    private var sortedHabits: [Habit] {
        habits.sorted { $0.instances.count > $1.instances.count }
    }

    private let columns = Array(
        repeating: GridItem(.flexible(), spacing: HabitTrayLayout.gridSpacing),
        count: HabitTrayLayout.gridColumns
    )

    var body: some View {
        NavigationStack {
            ScrollView {
                LazyVGrid(columns: columns, spacing: HabitTrayLayout.gridSpacing) {
                    ForEach(sortedHabits) { habit in
                        HabitPillView(habit: habit)
                            .frame(maxWidth: .infinity)
                    }
                    Button(action: onAddHabit) {
                        Label("Add Habit", systemImage: "plus")
                            .font(.subheadline.weight(.medium))
                            .frame(maxWidth: .infinity)
                            .frame(height: HabitTrayLayout.pillHeight)
                            .background(.secondary.opacity(0.15), in: Capsule())
                    }
                    .buttonStyle(.plain)
                }
                .padding()
            }
            .navigationTitle("Habit Library")
            .navigationBarTitleDisplayMode(.inline)
            .toolbar {
                ToolbarItem(placement: .cancellationAction) {
                    Button("Done") { isPresented = false }
                }
            }
        }
        .presentationDetents([.fraction(HabitTrayLayout.sheetFraction), .large])
        .presentationDragIndicator(.visible)
    }
}
```

Notes:

- `.presentationDetents([.fraction(0.7), .large])` lets the user pull the sheet fully open if they have many habits.
- `.presentationDragIndicator(.visible)` provides a system drag handle — no need to draw a custom one.
- Frequency sort (`instances.count`) is computed in-memory since `@Query` cannot sort on relationship aggregates. Hundreds of habits is the realistic scale — in-memory is negligible.
- The `onAddHabit` closure bubbles up from `HabitTrayView` so form presentation is controlled from one call site.

---

### Step 8 — `HabitTrayView.swift` (public entry point)

**Commit:** `feat(HabitTray): HabitTrayView — collapsed tray, drag-up expansion`

```swift
// Modules/HabitTray/Sources/HabitTrayView.swift
import SwiftUI
import SwiftData
import Data

public struct HabitTrayView: View {
    @Query private var habits: [Habit]
    @State private var showLibrary: Bool = false
    @State private var showAddHabit: Bool = false

    public init() {}

    // Most recently added first — a reasonable proxy until Phase 5 adds usage tracking.
    private var trayHabits: [Habit] {
        habits.sorted { $0.createdAt > $1.createdAt }
    }

    public var body: some View {
        VStack(spacing: 0) {
            // Drag handle
            Capsule()
                .fill(Color.secondary.opacity(0.35))
                .frame(width: 36, height: 4)
                .padding(.top, 8)
                .onTapGesture { showLibrary = true }

            // Horizontal pill scroll row
            ScrollView(.horizontal, showsIndicators: false) {
                HStack(spacing: HabitTrayLayout.pillHSpacing) {
                    ForEach(trayHabits) { habit in
                        HabitPillView(habit: habit)
                    }
                    Button {
                        showAddHabit = true
                    } label: {
                        Label("Add Habit", systemImage: "plus")
                            .font(.subheadline.weight(.medium))
                            .padding(.horizontal, HabitTrayLayout.pillHPadding)
                            .frame(height: HabitTrayLayout.pillHeight)
                            .background(.secondary.opacity(0.15), in: Capsule())
                    }
                    .buttonStyle(.plain)
                }
                .padding(.horizontal)
                .padding(.vertical, 10)
            }
        }
        .frame(height: HabitTrayLayout.collapsedHeight)
        .background(.bar)
        .gesture(
            DragGesture(minimumDistance: 10)
                .onEnded { value in
                    if value.translation.height < -20 {
                        withAnimation(.spring(duration: 0.4)) {
                            showLibrary = true
                        }
                    }
                }
        )
        .sheet(isPresented: $showLibrary) {
            HabitLibrarySheet(isPresented: $showLibrary) {
                showLibrary = false
                showAddHabit = true
            }
        }
        .sheet(isPresented: $showAddHabit) {
            HabitFormView(isPresented: $showAddHabit)
        }
    }
}
```

Key design choices:

- **Drag gesture**: `DragGesture(minimumDistance: 10)` with a negative-Y threshold (`< -20`) triggers the sheet. The horizontal `ScrollView` consumes horizontal gesture vectors first, so the tray's drag gesture only fires on predominantly upward swipes — no conflict.
- **Spring animation**: `.spring(duration: 0.4)` on `showLibrary = true` matches PRD §6.4.
- **Two independent sheets** (`showLibrary`, `showAddHabit`): When the user taps "+ Add Habit" inside the library, `showLibrary = false` runs first, then `showAddHabit = true`. SwiftUI processes these sequentially, avoiding the "sheet presenting over sheet" runtime warning on iOS 17.
- **Separate `@Query` in tray and library**: Both query the same table. SwiftData deduplicates at the persistent store layer. Avoids prop-drilling a potentially large array.

---

### Step 9 — Wire `HabitTrayView` into `DayView`

**Commit:** `feat(DayView): replace tray placeholder with HabitTrayView`

In `Modules/DayView/Sources/DayView.swift`, replace the placeholder:

```swift
// Before:
// Phase 4: HabitTray
Color.secondary.opacity(0.1)
    .frame(height: 80)

// After:
HabitTrayView()
```

Add `import HabitTray` at the top.

In `Modules/DayView/Project.swift`, add the new dependency:

```swift
let project = Project.module(
    name: "DayView",
    dependencies: [
        .project(target: "Data", path: "../Data"),
        .project(target: "Timeline", path: "../Timeline"),
        .project(target: "HabitTray", path: "../HabitTray"),
    ]
)
```

The `.modelContainer` is injected at the app entry point (`ProjectDawnApp.swift`), so `HabitTrayView`'s `@Query` and `HabitFormView`'s `@Environment(\.modelContext)` resolve automatically — no changes to `ProjectDawnApp.swift`.

---

## File Summary

| Commit | Files |
|---|---|
| `chore(tuist): scaffold HabitTray module manifest` | `Modules/HabitTray/Project.swift` (new) |
| `chore(tuist): wire HabitTray into workspace and DayView` | `Workspace.swift`, `Modules/DayView/Project.swift` |
| `feat(HabitTray): HabitPillView — emoji + name capsule with luminance contrast` | `Sources/HabitPillView.swift` (new), `Sources/HabitTrayLayout.swift` (new) |
| `feat(HabitTray): HabitColor presets + Color+Hex + ColorSwatchRowView + HabitFormView` | `Sources/HabitColor.swift` (new), `Sources/Color+Hex.swift` (new), `Sources/ColorSwatchRowView.swift` (new), `Sources/HabitFormView.swift` (new) |
| `feat(HabitTray): HabitTrayView — public entry point` | `Sources/HabitTrayView.swift` (new) |
| `feat(DayView): replace tray placeholder with HabitTrayView` | `Modules/DayView/Sources/DayView.swift`, `Modules/DayView/Project.swift` |
| `refactor(HabitTray): merge library sheet into HabitTrayView as a single always-on sheet` | `Sources/HabitTrayView.swift`, `Sources/HabitLibrarySheet.swift` (deleted), `Modules/DayView/Sources/DayView.swift` |

---
  447. ## Key Trade-offs
  448. | Decision | Choice | Reason |
  449. |---|---|---|
  450. | Module split | New `HabitTray` module | Single responsibility; isolated unit tests for sort/color logic; Phase 9 Mac reuse |
  451. | Tray sort order | `createdAt` descending | Simple default; Phase 5 replaces with usage-frequency when logging is implemented |
  452. | Frequency sort in library | In-memory sort on `instances.count` | `@Query` cannot sort on relationship aggregates; 100s of habits is the realistic scale — in-memory is negligible |
  453. | Emoji picker | `TextField` clamped to first grapheme | Zero dependencies, good enough for v1; Phase 7 replaces with grid picker during edit-habit work |
  454. | Sheet sequencing (library → add) | Dismiss library first, then present add | Avoids iOS "sheet over sheet" runtime warning; clean UX transition |
  455. | Drag gesture vs handle tap for expansion | Both (drag on strip + tappable handle) | Handle tap is more discoverable; drag is more natural once users know the tray |
  456. | `ColorPicker` integration | System `ColorPicker` as rainbow swatch | Zero custom code for full-spectrum picker; renders as a circle swatch matching the preset row |
  457. | `toHex()` implementation | `UIColor`/`NSColor` bridge with `#if canImport` guard | Required for Mac compatibility; `Color` has no public RGB accessors in SwiftUI |
  458. | Two `@Query` instances (tray + library) | Kept separate | Avoids prop-drilling; SwiftData deduplicates at the persistent store layer |

That document shaped how Codex implemented the feature. Having it written down also meant that when something diverged from the plan, I had a reference to come back to. And because it's in the repo, a future Claude session can read it cold and immediately understand the reasoning without me re-explaining anything.

The slowness is real, though. There were stretches where I was waiting on Claude to finish a planning pass and couldn't move forward. It's not a dealbreaker (thinking carefully takes time), but it's worth knowing this isn't a "vibe-code at 60 fps" kind of workflow.

Gotcha #1: The GCD Trap

One of the first places I had to step in was around concurrency. Claude had generated some timing logic using DispatchQueue.main.async, the old Grand Central Dispatch pattern that most modern Swift code has moved away from. It worked, but it was out of place in a codebase that was otherwise using async/await and Task.sleep.

It wasn't a wrong choice exactly (GCD isn't broken), but it was an inconsistent one. This is the kind of thing a human reviewer catches immediately because it jumps out as stylistically wrong. The AI didn't have that instinct. I caught it, flagged it, and had Codex rewrite the section using Task.sleep. Two minutes of work, but only because I was paying attention.
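For illustration, the shape of that rewrite looked roughly like this. This is a hedged sketch with hypothetical names (`expandTray` is not the actual ProjectDawn code), just to show the before/after style gap:

```swift
// Before: GCD-era delayed work — functional, but stylistically out of place
DispatchQueue.main.asyncAfter(deadline: .now() + 0.3) {
    withAnimation { expandTray() }
}

// After: structured concurrency, consistent with the rest of the codebase
Task { @MainActor in
    try? await Task.sleep(for: .seconds(0.3))
    withAnimation { expandTray() }
}
```

Same observable behavior, but the second version participates in structured concurrency: it can be cancelled along with its enclosing task, and it reads naturally next to the codebase's other async/await code.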

It illustrated something I kept coming back to: the AI will choose the first plausible solution, not necessarily the idiomatic one. You need a human in the loop who knows what "right" looks like in context.

Gotcha #2: The Sheet That Eats Your Alert

This one was more dramatic.

The app presents the habit tray as a persistent sheet via .sheet(isPresented: .constant(true)). The timeline sits underneath that sheet. At some point, I added the ability to delete a habit instance from the timeline: long press, confirm, done.

What actually happened: long-pressing a pill on the timeline caused the entire habit tray to disappear, and the confirmation dialog auto-dismissed itself on the first attempt.

The bug analysis Claude wrote tells the story clearly:

  1. # Bug Analysis: Timeline Instance Delete Dismisses Habit Tray
## Root Cause: Presentation Layer Mismatch
The iOS UIKit rule at the heart of this: **a view controller cannot present a new modal if it already has a presented view controller.** If it tries, the existing presented VC is typically dismissed first.
### The presentation stack
```
ProjectDawnApp
└── DayView ← the presenting VC
    └── .sheet(.constant(true)) → HabitTrayView ← the presented VC (tray)
        └── .sheet($showAddHabit) → HabitFormView
```
### Why it breaks
`InstancePillView` lives in `DayTimelineView`, which is part of `DayView`'s content tree — underneath the tray sheet in the UIKit hierarchy. When `.contextMenu` or `.confirmationDialog` fires from `InstancePillView`, UIKit must present the resulting `UIAlertController` (or context menu overlay) **from `DayView`'s VC**.
But `DayView`'s VC is already presenting `HabitTrayView`. UIKit sees a VC trying to present something new while it already has a presented VC, and resolves the conflict by dismissing the tray — which is exactly the symptom.
The `.contextMenu` modifier makes it worse: on iOS, long-pressing content beneath a sheet can cause UIKit to actively dismiss the covering sheet in order to surface the context menu's "peek" of the underlying content. That explains why the tray vanishes *before* the confirmation even appears.
The bug report's conclusion is correct: **the delete confirmation is being launched from the wrong presentation layer.** The confirmation needs to originate from inside the tray sheet (the topmost presented VC), not from the timeline content underneath it.
---
## What Is NOT the Issue
- Not a SwiftData problem — the delete itself would work fine if the UI could present correctly.
- Not specific to `confirmationDialog` vs `alert` — any modal presentation from the timeline subtree hits the same wall, which is why the alternate `alert` approach also failed.
- Not a race condition or timing issue.
---
## Fix Options
### Option 1 — Inline SwiftUI confirmation within the pill
Replace the `.confirmationDialog` with a small "confirm delete?" overlay rendered directly inside `InstancePillView` using `ZStack`/`.overlay`. No UIKit presentation machinery is triggered, so the tray is never disturbed. This is the least invasive fix.
**Tradeoff:** Non-standard UX pattern — users expect a system-level confirmation for destructive actions. Requires custom styling to communicate clearly.
### Option 2 — Present confirmation from inside `HabitTrayView`
`HabitTrayView` sits at the top of the presentation stack, so it can safely present. Communicate the "instance to delete" from `InstancePillView` up to `HabitTrayView` via a shared environment value or coordinator, then let `HabitTrayView` own and present the confirmation dialog.
**Tradeoff:** More wiring — `InstancePillView` needs a way to signal delete intent upward through the view tree without owning the presentation itself.
### Option 3 — Bubble delete intent to `DayView`, present from the tray sheet
Store a `pendingDeleteInstance: HabitInstance?` in a shared coordinator (e.g., `HabitDragCoordinator` or a new `TimelineActionCoordinator`). `InstancePillView` sets it; `HabitTrayView` observes it and presents the confirmation from inside the sheet.
This is structurally similar to Option 2 but uses the existing coordinator pattern from Phase 5 rather than a new binding chain.
**Tradeoff:** Adds responsibility to the coordinator that is conceptually unrelated to drag & drop.
---
## Recommended Fix
**Option 1 (inline confirmation)** is the path of least resistance for Phase 6. The delete action is low-frequency and the pill already has an expanded state (`isExpanded`) — a secondary "confirm?" state within the pill is consistent with that existing pattern and avoids touching the presentation hierarchy at all.
If a system-level confirmation is strongly preferred, **Option 2** is cleaner than Option 3 because it keeps delete logic close to where instances are displayed.

The short version: UIKit has a rule that a view controller can't present a new modal if it already has a presented view controller. When the confirmation dialog tried to present from the timeline layer (which sits underneath the sheet), UIKit resolved the conflict by dismissing the sheet. The .contextMenu modifier made it worse by actively pulling back the sheet to "peek" at the content underneath.
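Stripped of the app's details, the conflicting setup looks something like this. This is a hypothetical reduction (`TimelineContent` and the state names are invented for illustration), not the actual ProjectDawn source:

```swift
struct DayView: View {
    @State private var showDeleteConfirm = false

    var body: some View {
        TimelineContent()
            // This dialog must be presented from DayView's view controller...
            .confirmationDialog("Delete instance?", isPresented: $showDeleteConfirm) {
                Button("Delete", role: .destructive) { /* delete */ }
            }
            // ...but that VC is already presenting the always-on tray sheet,
            // so UIKit dismisses the tray to make room for the dialog.
            .sheet(isPresented: .constant(true)) {
                HabitTrayView()
            }
    }
}
```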

Claude did not anticipate this when planning the delete feature, and I wasn't paying enough attention to catch it during the planning review either. Claude designed the interaction in isolation; it had no reason to think about how a presentation originating from the timeline would interact with a sheet presented from the same parent view controller. That's a subtle UIKit behavior that requires real iOS experience to know about.

The fix involved restructuring which layer owns the confirmation. This lesson cost considerably more than the two minutes the GCD cleanup took.
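The inline-confirmation approach Claude recommended (Option 1, which never touches UIKit's presentation machinery) can be sketched roughly like this — a minimal illustration with hypothetical names, not the shipped code:

```swift
struct InstancePillView: View {
    let title: String
    var onDelete: () -> Void
    @State private var isConfirmingDelete = false

    var body: some View {
        if isConfirmingDelete {
            // Rendered in-place inside the pill — no modal presentation is
            // triggered, so the tray sheet above is never disturbed.
            HStack {
                Text("Delete?")
                Button("Yes", role: .destructive, action: onDelete)
                Button("No") { isConfirmingDelete = false }
            }
        } else {
            Text(title)
                .onLongPressGesture { isConfirmingDelete = true }
        }
    }
}
```

The trade-off noted in the analysis applies: this is a non-standard pattern for a destructive action, so the confirm state needs clear visual styling to read as "are you sure?"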

Side note: Codex took a few tries and didn't land a proper fix. Claude gave an excellent analysis on the first try, and its recommendations fixed the issue. This is actually the kind of bug where Claude's slower, more systematic reasoning has a real edge.

The Deeper Lesson: AI Doesn't Think About Interactions Between Components

The sheet/alert bug is an example of a broader pattern I noticed throughout this project: the AI plans features in isolation, without simulating how they interact with each other.

Claude wrote excellent plans for Phase 4 (tray), Phase 5 (drag and drop), and Phase 6 (instances on the timeline). Each plan was internally coherent. But none of them modeled how presenting a dialog from Phase 6 code would interact with the sheet architecture from Phase 4.

This isn't surprising. The AI can only reason about what's in its context window. It doesn't have a running mental simulation of a UIKit presentation stack that it updates as new features accumulate.

Two takeaways:

One: We're still early. The AI is already impressive at what it can do. It designed a modular multi-target Tuist workspace, wrote a functional drag coordinator across module boundaries, and produced architecture documents I'd be happy to share in a real code review. But it lacks the accumulated intuitions that experienced engineers carry from failure modes they've personally hit before.

Two: This will get better. The more that UIKit presentation conflicts and similar gotchas are represented in training data (bug reports, Stack Overflow answers, engineering blogs like this one), the better future models will get at anticipating them. I genuinely believe that in a few years, this kind of cross-feature interaction issue will be something the AI flags proactively during planning.

For now, a human who has shipped iOS apps needs to be in the loop during planning reviews.

What Kind of AI Code Is Actually Acceptable?

Here's a question I kept bumping into: how much do I trust AI-generated code, and does the answer change depending on what the code does?

I've started thinking about this as a rough tier system:

Tier 1: Pure UI. Layouts, color tokens, spacing, animations. I trust AI output here almost completely. If a button is 2 points too wide, or an animation curve is slightly off, I'll catch it visually and fix it in five seconds. The failure mode is cosmetic.

Tier 2: UI interactions and gestures. Drag behavior, sheet presentation, haptic feedback, state transitions. More review needed. The sheet/alert bug lived here. The failure mode isn't cosmetic, it's behavioral, and behavioral bugs are often only visible at runtime, in specific sequences, in ways that are hard to reason about from a static plan.

Tier 3: Business logic. Data model decisions, persistence, sync, state management. I want to understand everything in this tier. The AI can draft it, but I read it carefully and think about the edge cases myself.

Tier 4: Security, auth, payments, privacy. This is where I'd be most cautious. Not because the AI is incapable, but because the failure modes here are severe and non-obvious, and you need domain expertise to even know what questions to ask.

The tiered framing isn't about trust in the AI's ability to write syntactically correct code. It's about how much domain expertise is required to evaluate the output, and how bad it is if the output is subtly wrong.

There's also a more philosophical point underneath this: AI right now is the latest tool in the toolkit. It accelerates what humans do. But the judgment about what to build, how it should feel, whether an architecture decision is defensible three years from now, whether a particular bug is cosmetic or catastrophic, that judgment is still human. To make something that reflects your vision and your standards, you still need a human behind the wheel. The AI makes the car faster, not the driver obsolete.

The Biggest Takeaway: POC First, Engineering Later

The shift in mindset that came out of this project is something I keep coming back to.

In the past, when I started a new project with ambitions of building it "properly," I'd immediately reach for module boundaries, protocols, dependency injection, SOLID principles, all the markers of good engineering. Then I'd spend a long time setting up structure before I'd proven that the thing I was building was actually something I wanted to build.

With AI assistance, there's a faster path: build a dirty, disposable proof of concept first. Prove the interaction model works. Prove that dragging a habit onto a timeline and watching it snap feels good. Get the full thing running in a single file if you have to, iterate at speed, and treat it as a throwaway. Then, if the concept earns its existence, distill it into a properly engineered project.

The AI is extremely good at quickly generating that first pass. It doesn't care if everything lives in one view file. It can spin up an interactive prototype in an afternoon. You get to experience the thing, watch real data flow, feel the gestures, encounter the failure modes, before committing to any architecture.

I'm applying this to everything I build going forward. Figure out the UX with some quick POC-grade code. If it proves itself, bring in the modules, the protocols, the test targets. Not before.

Closing Thoughts

ProjectDawn is still early, v0.1+ with the core drag-and-log flow working and a few rough edges. But the process of building it has already changed how I think about AI-assisted development.

The pairing of Claude (planning, critical thinking) and Codex (implementation, speed) is more useful than either alone. Tuist's modular structure made the AI's output easier to review. Keeping docs in the repo made it possible for agents to actually collaborate across sessions and tools. And the failures (the GCD inconsistency, the sheet/alert conflict) were more instructive than the successes, because they revealed exactly where human oversight still matters.

If you're thinking about building something native and wondering whether to bring AI into the loop: yes, do it. Just keep your hand on the wheel.