SwiftData is Apple's Swift-native persistence framework. SnapAction uses it to store analyzed screenshot resources locally on iPhone.
Introduced in iOS 17 as a modern persistence layer for Swift apps, SwiftData lets developers define models with the @Model macro while the framework handles local storage, querying, relationships, and UI updates.
For a productivity app, SwiftData means saved items load instantly, with no waiting on a network request. The app can filter, sort, and update resource cards directly from local storage.
SwiftData can be used with CloudKit in Apple apps, but this page only describes what is verified in SnapAction's current implementation: a local resource library backed by SwiftData.
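As a minimal sketch of how that local storage layer works (illustrative only, not SnapAction's actual code; the `SavedItem` model and all names here are made up for the example), a @Model type is registered with a ModelContainer and read and written through a ModelContext:

```swift
import Foundation
import SwiftData

// Illustrative model, not SnapAction's real schema.
@Model
final class SavedItem {
    var title: String
    var createdAt: Date

    init(title: String, createdAt: Date = .now) {
        self.title = title
        self.createdAt = createdAt
    }
}

// An in-memory store keeps this example self-contained;
// a real app would use the default on-disk store.
let container = try ModelContainer(
    for: SavedItem.self,
    configurations: ModelConfiguration(isStoredInMemoryOnly: true)
)
let context = ModelContext(container)

context.insert(SavedItem(title: "Example screenshot resource"))
try context.save()

// Newest items first, straight from local storage: no network round trip.
let newestFirst = FetchDescriptor<SavedItem>(
    sortBy: [SortDescriptor(\.createdAt, order: .reverse)]
)
let items = try context.fetch(newestFirst)
```

Because the fetch runs against the on-device store, this is the path that lets saved cards appear instantly.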
A resource can include:

```swift
import Foundation
import SwiftData

@Model
final class Resource {
    var title: String
    var url: String?
    var type: ResourceType
    var resourceDescription: String?
    var tags: [String] = []
    var metadata: [String: String] = [:]
    var screenshotAssetIds: [String] = []
    var isRead: Bool = false
    var isFavorite: Bool = false
    var createdAt: Date = .now

    init(title: String, type: ResourceType, url: String? = nil) {
        self.title = title
        self.type = type
        self.url = url
    }
}
```

Result: The app can show a card, connect it to the original screenshots, and power actions based on type-specific metadata.
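The model above can be queried directly from local storage. A hedged sketch, assuming a simple ResourceType enum (the actual cases are not documented here) and an existing ModelContext; the `unreadFavorites` helper is hypothetical:

```swift
import Foundation
import SwiftData

// Hypothetical enum: the real ResourceType cases are not shown in this doc.
enum ResourceType: String, Codable {
    case article, video, product, event
}

// Unread favorites, newest first, filtered entirely on device.
func unreadFavorites(in context: ModelContext) throws -> [Resource] {
    let descriptor = FetchDescriptor<Resource>(
        predicate: #Predicate<Resource> { $0.isFavorite && !$0.isRead },
        sortBy: [SortDescriptor(\.createdAt, order: .reverse)]
    )
    return try context.fetch(descriptor)
}
```

The same FetchDescriptor pattern covers the other card views (by tag, by type, by read state), all without a network request.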
SnapAction uses SwiftData as the local library that stores what the AI analysis returns after a screenshot scan. SwiftData is the local memory layer; the screenshot analysis itself is handled by the Convex-backed AI pipeline.
Install the beta to turn scanned screenshots into a fast local resource library.
Try 30 free scans