[Releases](https://github.com/aluhed/SwiftUI-Gesture-Library/releases)
Welcome to the SwiftUI Gesture Library. This is an advanced gesture recognition toolkit for iOS apps that blends native SwiftUI gestures with a library of custom gestures. It focuses on precise recognition, flexible composition, tactile feedback, and inclusive UX design. The library helps product teams implement expressive gesture interactions without sacrificing accessibility or performance.
Key ideas:
- 30+ custom gestures and counting
- Gesture chaining and composition
- Rich haptic feedback patterns
- Accessibility-focused gesture options for UX designers
- Seamless Swift Package Manager integration for iOS developers
This README provides a complete guide to installing, using, extending, testing, and contributing to the library. It also outlines best practices for building gesture-based UX that is responsive, accessible, and delightful.
Downloads and latest assets

The latest assets are hosted on the Releases page: https://github.com/aluhed/SwiftUI-Gesture-Library/releases. Pick the asset that fits your platform and workflow, download it, and run or integrate it into your project. The badge at the top of this document links to the same page.
Table of contents
- Overview
- What you get
- Getting started
- How to use
- Gesture catalog
- Architecture and design
- Accessibility and inclusive design
- Haptic patterns
- Customization and configuration
- Testing and quality
- Examples and recipes
- Performance and optimization
- Contributing
- Roadmap and future work
- Licensing and credits
- Releases and downloads
Overview

The library is designed for iOS developers, UX designers, and interaction advocates. It brings together traditional SwiftUI gestures with a set of custom gestures that respond to complex input patterns. It supports gesture chaining, so you can combine multiple gestures to trigger specific states or actions. It also ships with a library of haptic patterns that can be paired with gesture events to reinforce interactions through touch.
This project aims to be practical and approachable. It favors clear APIs, fast feedback, and predictable behavior. It is built to scale—from a simple tap-and-respond interaction to a multi-step gesture sequence that powers advanced UI flows. The library emphasizes accessibility, so designers can include gestures that work well with VoiceOver, larger hit areas, and adaptive interaction patterns.
What you get
- A toolkit of 30+ custom gestures, with room to grow
- Easy composition via gesture chaining and sequencing
- A flexible haptic engine with ready-to-use patterns
- Accessibility-friendly gestures and modifiers
- Solid Swift Package Manager integration for easy adoption in iOS apps
- Clear examples and recipes to accelerate development
- Extensible architecture that supports new gestures and patterns
Getting started

Prerequisites
- Xcode 14 or newer
- iOS 13.0+ as a minimum deployment target
- Swift 5.x
- Swift Package Manager (SPM) as the preferred integration method
Installation

The recommended way to include this library in your project is through Swift Package Manager. Add the package to your Xcode project using the URL below, picking the latest release tag or a version range that fits your app's needs. A Package.swift sketch follows the steps below.
Swift Package Manager steps (illustrative)
- Open your Xcode project
- Choose File > Swift Packages > Add Package Dependency
- Enter the package URL: https://github.com/aluhed/SwiftUI-Gesture-Library.git
- Select the version you want (e.g., from 1.0.0)
- Add the library to your target
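If you declare dependencies in a `Package.swift` manifest instead, the entry looks like the sketch below. The product name and version range here are assumptions; verify both against the package's own manifest and release tags.

```swift
// swift-tools-version:5.5
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v13)], // matches the library's minimum deployment target
    dependencies: [
        // Version range is illustrative; pin to the release you adopt.
        .package(url: "https://github.com/aluhed/SwiftUI-Gesture-Library.git", from: "1.0.0")
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [
                // Product name is an assumption; check the package manifest.
                .product(name: "SwiftUIGestureLibrary", package: "SwiftUI-Gesture-Library")
            ]
        )
    ]
)
```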
Usage patterns

SwiftUI Gesture Library provides both its own gesture types and adapters that work with native SwiftUI gestures. You can mix and match to create robust interactions that feel natural on iOS devices.
Minimal example

The following example demonstrates a simple tap gesture that updates a label and triggers a haptic cue.
Code example (illustrative):

```swift
import SwiftUI
import SwiftUIGestureLibrary

struct ContentView: View {
    @State private var message: String = "Awaiting interaction"
    @State private var count: Int = 0

    var body: some View {
        VStack(spacing: 20) {
            Circle()
                .fill(Color.blue)
                .frame(width: 120, height: 120)
                .gesture(LibraryTapGesture(count: 1) { // custom library gesture
                    count += 1
                    message = "Tapped \(count) time(s)"
                })
            Text(message)
                .font(.title2)
                .padding()
        }
        .padding()
    }
}
```
Notes
- This example uses a hypothetical LibraryTapGesture to illustrate how the API could look. The actual API may differ, but the pattern remains intuitive: attach a library gesture to a SwiftUI view and handle events in closures.
Composite gestures and chaining

One of the core strengths of the library is composing gestures. You can chain multiple gestures to create nuanced interactions. For example, you can require a drag followed by a short hold to trigger a mode switch, or a double tap with a specific velocity to unlock a feature.
Code example (illustrative):

```swift
import SwiftUI
import SwiftUIGestureLibrary

struct ComplexInteractionView: View {
    @State private var mode: String = "Idle"

    var body: some View {
        Rectangle()
            .fill(mode == "Active" ? Color.green : Color.red)
            .frame(width: 180, height: 180)
            .libraryGestures([
                .sequence([
                    .doubleTap { mode = "Active" },
                    .drag { translation in
                        // Update UI based on drag translation
                    },
                    .hold(minimumDuration: 0.4) { /* finalize action */ }
                ])
            ])
    }
}
```
Accessibility and UX design

The library includes a dedicated set of accessibility gestures and modifiers. These are designed to be discoverable by assistive technologies and to work well with dynamic type, high-contrast modes, and VoiceOver. You can map common accessibility actions to gesture sequences or provide alternative patterns for users who rely on gestures with larger hit targets.
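Native SwiftUI already provides the hooks an accessibility layer like this can build on. As a point of reference, here is a minimal sketch using only standard SwiftUI modifiers (no library types) to pair a double-tap gesture with an equivalent VoiceOver action:

```swift
import SwiftUI

struct AccessibleTapCard: View {
    @State private var isExpanded = false

    var body: some View {
        RoundedRectangle(cornerRadius: 12)
            .fill(isExpanded ? Color.green : Color.gray)
            .frame(width: 160, height: isExpanded ? 240 : 120)
            .onTapGesture(count: 2) { isExpanded.toggle() }
            // Expose an equivalent named action so VoiceOver users
            // can trigger it without performing the double tap.
            .accessibilityAction(named: Text("Toggle expansion")) { isExpanded.toggle() }
            .accessibilityLabel("Detail card")
            .accessibilityHint("Double tap to expand or collapse")
    }
}
```

The key design point: the gesture and the named accessibility action call the same handler, so both input paths stay in sync.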
Haptic patterns

Haptic feedback reinforces gesture events. The library ships with a collection of patterns suitable for various contexts: subtle confirmations, corrective cues, success alerts, and progress indicators. Each pattern can be tuned for intensity and duration, and you can attach a pattern to a gesture event or trigger it independently when needed.
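On iOS, patterns like these typically sit on top of the system feedback generators. A minimal sketch using only UIKit's standard feedback APIs; the `DemoHaptics` helper is illustrative, not the library's actual `Haptics` type:

```swift
import SwiftUI
import UIKit

// A tiny helper in the spirit of the library's haptics module
// (names here are illustrative, not the library's actual API).
enum DemoHaptics {
    static func confirm() {
        UINotificationFeedbackGenerator().notificationOccurred(.success)
    }
    static func subtle() {
        UIImpactFeedbackGenerator(style: .light).impactOccurred()
    }
}

struct HapticTapView: View {
    var body: some View {
        Circle()
            .fill(Color.purple)
            .frame(width: 100, height: 100)
            .onTapGesture {
                DemoHaptics.confirm() // pair the gesture event with a tactile cue
            }
    }
}
```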
Architecture and design

Core concepts
- Gesture definitions: A set of gesture types that describe how input is recognized.
- Gesture composition: A builder that allows combining gestures into sequences, parallel patterns, or conditional flows.
- Event handlers: Closures that run when a gesture is recognized or when a gesture state changes.
- Haptic engine: A module that selects and triggers tactile feedback based on gesture events.
- Accessibility layer: Abstractions to map gestures to accessible actions and hints.
Code organization
- Core: The gesture engine and the primitives used to define and recognize gestures.
- Adapters: Bridges between native SwiftUI gestures and the library’s custom gestures.
- Haptics: A standalone module for managing haptic feedback patterns.
- Accessibility: Helpers to expose gestures to assistive technologies.
- Examples: A gallery of practical usage patterns and recipes.
- Tests: A suite of unit and UI tests to validate gesture behavior.
Extensibility

The library is designed to be extended. You can add new gesture definitions by:
- Defining recognition rules
- Providing a corresponding event handler
- Registering the gesture in a catalog so it can be discovered and used in code samples
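To make those three steps concrete, here is a minimal sketch of what a gesture definition and its catalog registration could look like. The `GestureDefinition` protocol, `TripleSwipeUp` recognizer, and `GestureCatalog` are hypothetical names, not the library's real extension points:

```swift
import CoreGraphics

// Hypothetical extension point: a recognition rule plus an event handler.
protocol GestureDefinition {
    var identifier: String { get }
    /// Inspect buffered touch samples and decide whether the gesture matched.
    func recognize(samples: [CGPoint]) -> Bool
}

struct TripleSwipeUp: GestureDefinition {
    let identifier = "triple-swipe-up"
    var onRecognized: () -> Void // the corresponding event handler

    func recognize(samples: [CGPoint]) -> Bool {
        // Recognition rule: at least three samples, each higher on
        // screen than the last (y decreases as the finger moves up).
        guard samples.count >= 3 else { return false }
        return zip(samples, samples.dropFirst())
            .allSatisfy { earlier, later in earlier.y > later.y }
    }
}

// Registering the gesture in a catalog so it can be discovered.
final class GestureCatalog {
    static let shared = GestureCatalog()
    private(set) var definitions: [String: GestureDefinition] = [:]

    func register(_ definition: GestureDefinition) {
        definitions[definition.identifier] = definition
    }
}
```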
Best practices for building your own gestures
- Start with a clear interaction goal. Every gesture should map to a meaningful user action.
- Consider accessibility from day one. Provide alternative patterns, larger hit targets, and simple discovery cues.
- Favor deterministic timing. Avoid long or unpredictable delays that frustrate users.
- Use consistent feedback. Haptic cues should align with the gesture’s meaning.
- Document edge cases. Gesture recognition can be sensitive to motion and device variation.
Accessibility and inclusive design
- Hit targets: Gestures should be usable on devices with varied screen sizes. Default hit areas should be large enough for comfortable interaction.
- VoiceOver: Provide descriptive hints for gestures so VoiceOver users can learn the pattern and purpose.
- Customizable thresholds: Let designers adjust recognition thresholds to suit accessibility needs.
- Visual cues: Offer optional visual feedback for gesture states to aid users who benefit from non-audio cues.
- Alternate patterns: Provide simpler or alternative gestures that achieve the same outcome.
Haptic patterns in detail
- Confirm: A short, crisp cue for a successful action.
- Subtle: A light vibration for minor updates.
- Alert: A sharper cue for errors or warnings.
- Rhythm: A sequence of taps that communicates a progress state.
- Customizable: Developers can tune intensity, duration, and repeat count to fit the app’s tone.
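Custom patterns like these would typically be built on Apple's CoreHaptics framework. Below is a sketch of a "Rhythm"-style cue with tunable intensity and spacing; the `RhythmHaptic` class is illustrative, not a library type, and playback requires a device with a haptic engine:

```swift
import CoreHaptics

// A "Rhythm"-style progress cue built on the system CoreHaptics framework.
final class RhythmHaptic {
    private var engine: CHHapticEngine?

    func play(beats: Int = 3, intensity: Float = 0.6) throws {
        engine = try CHHapticEngine()
        try engine?.start()

        // One transient tap per beat, spaced 150 ms apart.
        let events = (0..<beats).map { beat in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: intensity),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.4)
                ],
                relativeTime: Double(beat) * 0.15
            )
        }
        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    }
}
```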
Customization and configuration
- Thresholds: Adjust the sensitivity of recognition to reduce false positives.
- Timing: Set minimum and maximum durations for gesture activation.
- Haptics: Choose from predefined patterns or craft custom ones.
- Visual feedback: Enable or disable overlay hints and micro-animations.
- Localization: Gesture hints can be localized for different languages and regions.
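One way such options might surface is a single configuration value passed into gesture builders. The struct below is a hypothetical sketch of that surface, not the library's actual API:

```swift
import Foundation
import CoreGraphics

// Hypothetical configuration surface; field names are illustrative.
struct GestureConfiguration {
    /// Minimum travel (in points) before a drag is recognized;
    /// raise it to reduce false positives.
    var dragThreshold: CGFloat = 10

    /// Activation window for hold-style gestures.
    var minimumHoldDuration: TimeInterval = 0.3
    var maximumHoldDuration: TimeInterval = 2.0

    /// Whether recognized gestures draw optional overlay hints.
    var showsVisualHints: Bool = true

    /// Haptic pattern to fire on recognition; nil disables haptics.
    var hapticPattern: String? = "confirm"
}
```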
Testing and quality
- Unit tests: Validate the recognition logic, sequencing, and edge cases.
- UI tests: Simulate gestures and verify UI state changes.
- Performance: Measure gesture recognition latency and ensure it stays within acceptable bounds on target devices.
- Accessibility checks: Verify that gestures are discoverable and usable with assistive tech.
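As an illustration, here is a minimal XCTest suite for the hypothetical `TripleSwipeUp` recognizer sketched earlier, including a `measure` block for the latency check:

```swift
import XCTest
import CoreGraphics
// @testable import YourGestureModule — module name depends on your setup.

final class TripleSwipeUpTests: XCTestCase {
    func testRecognizesUpwardMotion() {
        let gesture = TripleSwipeUp(onRecognized: {})
        let samples = [CGPoint(x: 0, y: 300), CGPoint(x: 0, y: 200), CGPoint(x: 0, y: 100)]
        XCTAssertTrue(gesture.recognize(samples: samples))
    }

    func testRejectsDownwardMotion() {
        let gesture = TripleSwipeUp(onRecognized: {})
        let samples = [CGPoint(x: 0, y: 100), CGPoint(x: 0, y: 200), CGPoint(x: 0, y: 300)]
        XCTAssertFalse(gesture.recognize(samples: samples))
    }

    // Keep recognition latency within bounds on target devices.
    func testRecognitionLatency() {
        let gesture = TripleSwipeUp(onRecognized: {})
        let samples = (0..<1_000).map { CGPoint(x: 0, y: Double(1_000 - $0)) }
        measure {
            _ = gesture.recognize(samples: samples)
        }
    }
}
```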
Examples and recipes
- Simple tap branch
- Double-tap with acceleration
- Drag then release to trigger an action
- Long-press with a hold-to-activate pattern
- Complex sequence with conditional branches
- Accessibility-friendly alternatives
- Haptic-driven feedback loop
Code samples (illustrative)

These snippets illustrate the kinds of usage you might implement with the library. The exact API may differ slightly depending on the version you adopt. Use the library's official docs as the canonical reference.
Example 1: Simple tap and haptic feedback

```swift
import SwiftUI
import SwiftUIGestureLibrary

struct TapFeedbackView: View {
    @State private var taps = 0

    var body: some View {
        Rectangle()
            .fill(Color.accentColor)
            .frame(width: 200, height: 200)
            .libraryGesture(.tap {
                taps += 1
                Haptics.feedback(.success)
            })
            .overlay(Text("Tap me: \(taps)").foregroundColor(.white))
    }
}
```
Example 2: Drag sequence with visual state

```swift
import SwiftUI
import SwiftUIGestureLibrary

struct DragSequenceView: View {
    @State private var offset: CGSize = .zero
    @State private var completed = false

    var body: some View {
        Circle()
            .fill(completed ? Color.green : Color.orange)
            .frame(width: 100, height: 100)
            .offset(offset)
            .libraryGesture(.sequence([
                .drag { delta in offset = delta },
                .dragEnd { _ in
                    completed = true
                    Haptics.feedback(.success)
                }
            ]))
    }
}
```
Example 3: Accessibility-first gesture

```swift
import SwiftUI
import SwiftUIGestureLibrary

struct AccessibilityGestureView: View {
    @State private var status = "Waiting"

    var body: some View {
        VStack(spacing: 16) {
            Image(systemName: "hand.tap.fill")
                .resizable()
                .frame(width: 120, height: 120)
                .foregroundColor(.primary)
            Text(status)
                .font(.title2)
        }
        .libraryGesture(.accessiblePattern { action in
            status = action.description
        })
        .accessibilityLabel("Gesture preview")
        .accessibilityHint("Perform the accessibility gesture to trigger an action")
    }
}
```
Note: The code samples above are illustrative. The actual API names and usage patterns may vary. Please refer to the library’s documentation for exact details.
Releases and downloads

The library is distributed through releases. The assets on the Releases page include example projects, sample apps, and integration helpers that you can download and run: https://github.com/aluhed/SwiftUI-Gesture-Library/releases. The badge at the top of this page links to the same resource, which consolidates the latest builds, documentation, and ready-to-run samples.
Project structure
- CoreEngine: The heart of gesture recognition. It handles input events, timing, and sequencing.
- GestureCatalog: A registry of built-in gestures and their metadata.
- Composition: Tools to combine gestures into sequences, parallel groups, and conditional flows.
- HapticsEngine: Manages all haptic feedback, including timing and intensity envelopes.
- AccessibilityLayer: Enables gestures to be discovered and used by assistive tech.
- ExamplesUI: A gallery of usage patterns and real-world scenarios.
- Tests: Coverage for recognition accuracy, performance, and accessibility behavior.
Performance and optimization
- Lightweight recognition: The library processes input locally in memory, avoiding heavy allocations during gesture evaluation.
- Debounced feedback: Haptic and UI updates are debounced to prevent jitter and ensure a smooth user experience.
- Thread safety: Gesture evaluation is designed to be thread-safe, with clear async boundaries where needed.
- Profiling tips: Use the standard Xcode profiler to identify hotspots in gesture handling or haptic dispatch.
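Following the profiling tip above, one concrete approach is to wrap gesture evaluation in signpost intervals so the work shows up as named regions in Instruments. A sketch using the system `os` framework (iOS 15+); the subsystem string and evaluation body are placeholders:

```swift
import os
import CoreGraphics

// Mark gesture evaluation with signpost intervals so it appears
// as named regions in Instruments' Points of Interest track.
let signposter = OSSignposter(subsystem: "com.example.gestures", category: "Recognition")

func evaluateGesture(samples: [CGPoint]) -> Bool {
    let state = signposter.beginInterval("GestureRecognition")
    defer { signposter.endInterval("GestureRecognition", state) }

    // Placeholder recognition work; substitute real evaluation here.
    return samples.count > 2
}
```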
Design decisions
- Predictability: Gestures are designed to be repeatable across devices with consistent timing.
- Robustness: Recognizers tolerate minor stray movements to avoid false negatives.
- Extensibility: New gestures can be added by implementing a simple protocol and registering it in the catalog.
- Accessibility-first: Every gesture type includes an accessibility mode or an alternate pattern.
Case studies and scenarios
- Mobile app onboarding: Use simple gestures to reveal tips, with haptic confirmation for completion.
- Game UI: Complex gesture sequences drive game states and provide tactile feedback for success.
- Music or audio apps: Gestures drive tempo changes, effects, or track navigation, with subtle haptics to reinforce actions.
- Design tooling: Prototyping UI uses accessible gesture patterns to explore interactions quickly.
Testing and quality (expanded)
- Gesture isolation tests: Validate that each gesture responds correctly in isolation.
- Interaction stress tests: Push the system with rapid, back-to-back gestures to ensure stability.
- Cross-device checks: Confirm behavior on devices with different screen sizes and input tolerances.
- Accessibility audits: Verify that gestures map to accessible actions and that hints are descriptive and clear.
Contributing

If you want to contribute to the library, follow these guidelines:
- Start with the issue backlog and pick a small, well-scoped task.
- Create a feature branch with a descriptive name.
- Write unit tests for new functionality.
- Document any new API surfaces with examples.
- Run the full test suite before submitting a pull request.
- Be respectful and precise in your code reviews.
- Keep changes focused and small to maintain code quality.
Roadmap and future work
- More gestures: Expand beyond 30+ with community-driven patterns.
- Cross-platform support: Extend to macOS, watchOS, and other Apple platforms where applicable.
- Better accessibility hooks: Introduce richer VoiceOver cues and educational hints for new gestures.
- Visual debugging: Add a live gesture inspector to visualize recognition in real time.
- Localization: Provide multilingual hints for gesture descriptions and prompts.
Licensing and credits

This project is released under the MIT License, and contributions are welcome under the same license. Acknowledgments go to the designers, developers, and testers who helped refine the gesture experiences, haptic rhythms, and accessibility considerations that make these patterns practical in real apps.
Images and visuals
- Gesture visualization: An illustrative diagram showing how gesture recognition flows from input to state changes, with an overlay of haptic events.
- Accessibility flow: A schematic showing how gestures map to accessible actions and hints.
- Theme and branding: A simple, clean visual kit that aligns with common iOS design language and SwiftUI aesthetics.
A note about usage and integration
- Plan your gesture strategy early in the UX design process. Map core interactions to a small, stable set of gestures that are easy to discover.
- Consider device variability. Gestures can behave differently on different devices; design for tolerance and provide sensible defaults.
- Use haptics thoughtfully. Align tactile feedback with the action’s meaning and avoid overuse that can feel noisy or intrusive.
- Keep the app responsive. Gestures should trigger UI updates quickly and smoothly to maintain a good feel.
Further learning resources
- Apple’s Human Interface Guidelines on touch and gestures
- SwiftUI best practices for gesture handling
- Accessibility guidelines for gesture-based interactions
- Haptics design considerations for iOS apps
Appendix: glossary
- Gesture: A defined pattern of user input that triggers an action.
- Gesture sequence: An ordered combination of gestures that must occur in a prescribed order.
- Haptic pattern: A predefined tactile feedback profile that accompanies a gesture event.
- Accessibility gestures: Patterns designed to be usable by individuals who rely on assistive tech.
- Gesture catalog: A registry of available gestures with metadata and examples.
Disclaimer

The content above is designed to serve as a comprehensive guide for the SwiftUI Gesture Library. The details, examples, APIs, and file names shown here are illustrative and intended to convey usage concepts. Refer to the official repository and the Releases page for exact API references, asset names, and version-specific instructions.