axiom-mapkit
npx skills add https://github.com/charleswiltgen/axiom --skill axiom-mapkit
MapKit Patterns
MapKit patterns and anti-patterns for iOS apps. Prevents common mistakes: using MKMapView when SwiftUI Map suffices, annotations in view bodies, setRegion loops, and performance issues with large annotation counts.
When to Use
- Adding a map to your iOS app
- Displaying annotations, markers, or custom pins
- Implementing search (address, POI, autocomplete)
- Adding directions/routing to a map
- Debugging map display issues (annotations not showing, region jumping)
- Optimizing map performance with many annotations
- Deciding between SwiftUI Map and MKMapView
Related Skills
- axiom-mapkit-ref → Complete API reference
- axiom-mapkit-diag → Symptom-based troubleshooting
- axiom-core-location → Location authorization and monitoring
Part 1: Anti-Patterns (with Time Costs)
| Anti-Pattern | Time Cost | Fix |
|---|---|---|
| Using MKMapView when SwiftUI Map suffices | 2-4 hours UIViewRepresentable boilerplate | Use SwiftUI Map {} for standard map features (iOS 17+) |
| Creating annotations in SwiftUI view body | UI freeze with 100+ items, view recreation on every update | Move annotations to model, use @State or @Observable |
| No annotation view reuse (MKMapView) | Memory spikes, scroll lag with 500+ annotations | dequeueReusableAnnotationView(withIdentifier:for:) |
| setRegion in updateUIView without guard | Infinite loop: region change triggers update, update sets region | Guard with mapView.region != region or use flag |
| Ignoring MapCameraPosition (SwiftUI) | Can't programmatically control camera, broken "center on user" | Bind position parameter to @State var cameraPosition |
| Synchronous geocoding on main thread | UI freeze for 1-3 seconds per geocode | Use CLGeocoder().geocodeAddressString with async/await |
| Not filtering annotations to visible region | Loading all 10K annotations at once | Use mapView.annotations(in:) or fetch by visible region |
| Ignoring resultTypes in MKLocalSearch | Irrelevant results, slow search | Set .resultTypes = [.pointOfInterest] or .address to filter |
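The setRegion guard from the table deserves a concrete sketch. MKCoordinateRegion is not Equatable, so compare components with a tolerance. This is a sketch, not a definitive implementation: LegacyMapView and the 0.0001 threshold are illustrative assumptions.

```swift
import SwiftUI
import MapKit

// Illustrative wrapper showing the guard that breaks the setRegion loop.
struct LegacyMapView: UIViewRepresentable {
    @Binding var region: MKCoordinateRegion

    func makeUIView(context: Context) -> MKMapView {
        MKMapView()
    }

    func updateUIView(_ mapView: MKMapView, context: Context) {
        // Only push the region when it actually differs. Without this check,
        // the user's pan updates the binding, which re-enters updateUIView,
        // which calls setRegion again, which triggers another region change.
        let centerMoved =
            abs(mapView.region.center.latitude - region.center.latitude) > 0.0001 ||
            abs(mapView.region.center.longitude - region.center.longitude) > 0.0001
        if centerMoved {
            mapView.setRegion(region, animated: true)
        }
    }
}
```

The loop arises when a Coordinator's mapView(_:regionDidChangeAnimated:) writes the new region back to the binding, re-invoking updateUIView; the tolerance check breaks the cycle.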
Part 2: Decision Trees
Decision Tree 1: SwiftUI Map vs MKMapView
digraph {
"Need map in app?" [shape=diamond];
"iOS 17+ target?" [shape=diamond];
"Need custom tile overlay?" [shape=diamond];
"Need fine-grained delegate control?" [shape=diamond];
"Use SwiftUI Map" [shape=box];
"Use MKMapView\nvia UIViewRepresentable" [shape=box];
"Need map in app?" -> "iOS 17+ target?" [label="yes"];
"iOS 17+ target?" -> "Need custom tile overlay?" [label="yes"];
"iOS 17+ target?" -> "Use MKMapView\nvia UIViewRepresentable" [label="no"];
"Need custom tile overlay?" -> "Use MKMapView\nvia UIViewRepresentable" [label="yes"];
"Need custom tile overlay?" -> "Need fine-grained delegate control?" [label="no"];
"Need fine-grained delegate control?" -> "Use MKMapView\nvia UIViewRepresentable" [label="yes"];
"Need fine-grained delegate control?" -> "Use SwiftUI Map" [label="no"];
}
When SwiftUI Map is Right (most apps)
- Standard map with markers and annotations
- Programmatic camera control
- Built-in user location display
- Shape overlays (circle, polygon, polyline)
- Map style selection (standard, imagery, hybrid)
- Selection handling
- Clustering
When MKMapView is Required
- Custom tile overlays (e.g., OpenStreetMap, custom imagery)
- Fine-grained delegate control (willBeginLoadingMap, didFinishLoadingMap)
- Custom annotation view animations beyond SwiftUI
- Pre-iOS 17 deployment target
- Advanced overlay rendering with custom MKOverlayRenderer subclasses
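For the tile-overlay case, the bridge is a UIViewRepresentable plus an overlay renderer. A minimal sketch under stated assumptions: TileMapView and the tiles.example.com URL template are illustrative, not a real tile server.

```swift
import SwiftUI
import MapKit

// Illustrative wrapper showing a custom tile overlay, one of the
// features SwiftUI Map cannot provide.
struct TileMapView: UIViewRepresentable {
    func makeUIView(context: Context) -> MKMapView {
        let mapView = MKMapView()
        mapView.delegate = context.coordinator
        // Hypothetical tile server URL template.
        let overlay = MKTileOverlay(urlTemplate: "https://tiles.example.com/{z}/{x}/{y}.png")
        overlay.canReplaceMapContent = true
        mapView.addOverlay(overlay, level: .aboveLabels)
        return mapView
    }

    func updateUIView(_ mapView: MKMapView, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator() }

    class Coordinator: NSObject, MKMapViewDelegate {
        func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
            if let tileOverlay = overlay as? MKTileOverlay {
                return MKTileOverlayRenderer(tileOverlay: tileOverlay)
            }
            return MKOverlayRenderer(overlay: overlay)
        }
    }
}
```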
Decision Tree 2: Annotation Strategy by Count
Annotation count?
├─ < 100 → Use Marker/Annotation directly in Map {} content builder
│          Simple, declarative, no performance concern
│
├─ 100-1000 → Enable clustering
│          Set .clusteringIdentifier on annotation views
│          SwiftUI: Marker("", coordinate:).tag(id)
│          MKMapView: view.clusteringIdentifier = "poi"
│
└─ 1000+ → Server-side clustering or visible-region filtering
           Fetch only annotations within mapView.region
           Or pre-cluster on server, send cluster centroids
           MKMapView with view reuse is preferred for very large datasets
Visible-Region Filtering (SwiftUI)
Only load annotations within the visible map region. Prevents loading all 10K+ annotations at once:
struct MapView: View {
@State private var cameraPosition: MapCameraPosition = .automatic
@State private var visibleAnnotations: [Location] = []
let allLocations: [Location] // Full dataset
var body: some View {
Map(position: $cameraPosition) {
ForEach(visibleAnnotations) { location in
Marker(location.name, coordinate: location.coordinate)
}
}
.onMapCameraChange(frequency: .onEnd) { context in
visibleAnnotations = allLocations.filter { location in
context.region.contains(location.coordinate)
}
}
}
}
extension MKCoordinateRegion {
func contains(_ coordinate: CLLocationCoordinate2D) -> Bool {
let latRange = (center.latitude - span.latitudeDelta / 2)...(center.latitude + span.latitudeDelta / 2)
let lngRange = (center.longitude - span.longitudeDelta / 2)...(center.longitude + span.longitudeDelta / 2)
return latRange.contains(coordinate.latitude) && lngRange.contains(coordinate.longitude)
}
}
Why Clustering Matters
Without clustering at 500 annotations:
- Map is unreadable (pins overlap completely)
- Scroll/zoom lag increases with every annotation
- Memory grows linearly with annotation count
With clustering:
- User sees meaningful groups with counts
- Only visible cluster markers rendered
- Tap to expand reveals individual annotations
Decision Tree 3: Search and Directions
Search implementation:
├─ User types search query
│  ├─ MKLocalSearchCompleter (real-time autocomplete)
│  │    Configure: resultTypes, region bias
│  └─ User selects result
│       └─ MKLocalSearch (full result with MKMapItem)
│            Build MKLocalSearch.Request(completion:) from the selected completion
│
└─ Programmatic search (e.g., "nearest gas station")
     └─ MKLocalSearch with naturalLanguageQuery
          Configure: resultTypes, region, pointOfInterestFilter
Directions implementation:
├─ MKDirections.Request
│    Set source (MKMapItem.forCurrentLocation()) and destination
│    Set transportType (.automobile, .walking, .transit)
│
└─ MKDirections.calculate()
     └─ MKRoute
          ├─ .polyline → Display as MapPolyline or MKPolylineRenderer
          ├─ .expectedTravelTime → Show ETA
          ├─ .distance → Show distance
          └─ .steps → Turn-by-turn instructions
Part 3: Pressure Scenarios
Scenario 1: “Just Wrap MKMapView in UIViewRepresentable”
Setup: Adding a map to a SwiftUI app. Developer is familiar with MKMapView from UIKit projects.
Pressure: “I know MKMapView well. SwiftUI Map is new and might be limited.”
Expected with skill: Check the decision tree. If the app needs standard markers, annotations, camera control, user location, and shape overlays, SwiftUI Map handles all of that. Use it.
Anti-pattern without skill: 200+ lines of UIViewRepresentable + Coordinator wrapping MKMapView, manually bridging state, implementing delegate methods for annotation views, fighting updateUIView infinite loops, when 20 lines of Map {} with a content builder would have worked.
Time cost: 2-4 hours of unnecessary boilerplate + ongoing maintenance burden.
The test: Can you list a specific feature the app needs that SwiftUI Map cannot provide? If not, use SwiftUI Map.
Scenario 2: “Add All 10,000 Pins to the Map”
Setup: App has a database of 10,000 location data points. Product manager wants users to see all locations on the map.
Pressure: “Users need to see ALL locations. Just add them all.”
Expected with skill: Use clustering + visible region filtering. 10K annotations without clustering is unusable: pins overlap, scrolling lags, memory spikes. Clustering shows meaningful groups. Visible region filtering loads only what's on screen.
Anti-pattern without skill: Adding all 10,000 annotations at once. Map becomes an unreadable blob of overlapping pins. Scroll lag makes the app feel broken. Memory usage spikes 200-400MB.
Implementation path:
- Enable clustering (.clusteringIdentifier)
- Fetch annotations only within visible region (.onMapCameraChange + query)
- Server-side pre-clustering for datasets > 5K if possible
Scenario 3: “Search Isn’t Finding Results”
Setup: MKLocalSearch returns irrelevant or empty results. Developer considers adding Google Maps SDK.
Pressure: “MapKit search is broken. Let me add a third-party SDK.”
Expected with skill: Check configuration first. MapKit search needs:
- resultTypes → filter to .pointOfInterest or .address (default returns everything)
- region → bias results to the visible map region
- Query format → natural language like "coffee shops" works; structured queries don't
Anti-pattern without skill: Adding Google Maps SDK (50+ MB binary, API key management, billing setup) when MapKit search works correctly with proper configuration.
Time cost: 4-8 hours adding third-party SDK vs 5 minutes configuring MapKit search.
Part 4: Core Location Integration
MapKit and Core Location interact in ways that surprise developers.
Implicit Authorization
When you set showsUserLocation = true on MKMapView or add UserAnnotation() in SwiftUI Map, MapKit implicitly requests location authorization if it hasn’t been requested yet.
This means:
- The authorization prompt appears at map display time, not when the developer expects
- The user sees a prompt with no context about why location is needed
- If denied, the blue dot silently doesn’t appear
Recommended Pattern
Request authorization explicitly BEFORE showing the map:
// 1. Request authorization with context
let session = CLServiceSession(authorization: .whenInUse)
// 2. Then show map with user location
Map {
UserAnnotation()
}
CLServiceSession (iOS 18+)
For continuous location display on a map, create a CLServiceSession:
@Observable
class MapModel {
var cameraPosition: MapCameraPosition = .automatic
private var locationSession: CLServiceSession?
func startShowingUserLocation() {
locationSession = CLServiceSession(authorization: .whenInUse)
}
func stopShowingUserLocation() {
locationSession = nil
}
}
Cross-Reference
For full authorization decision trees, monitoring patterns, and background location:
- axiom-core-location → Authorization strategy, monitoring approach
- axiom-core-location-diag → "Location not working" troubleshooting
- axiom-energy → Location as battery subsystem
Part 5: SwiftUI Map Quick Start
The most common pattern â a map with markers and user location:
struct ContentView: View {
@State private var cameraPosition: MapCameraPosition = .automatic
@State private var selectedItem: MKMapItem?
let locations: [Location] // Your model
var body: some View {
Map(position: $cameraPosition, selection: $selectedItem) {
UserAnnotation()
ForEach(locations) { location in
Marker(location.name, coordinate: location.coordinate)
.tint(location.category.color)
}
}
.mapStyle(.standard(elevation: .realistic))
.mapControls {
MapUserLocationButton()
MapCompass()
MapScaleView()
}
.onChange(of: selectedItem) { _, item in
if let item {
handleSelection(item)
}
}
}
}
Key Points
- @State var cameraPosition → bind for programmatic camera control
- selection: $selectedItem → handle tap on markers
- MapCameraPosition.automatic → system manages initial view
- .mapControls {} → built-in UI for location button, compass, scale
- ForEach in content builder → dynamic annotations from data
Part 6: Search Implementation Pattern
Complete search with autocomplete:
@Observable
class SearchModel {
var searchText = ""
var completions: [MKLocalSearchCompletion] = []
var searchResults: [MKMapItem] = []
private let completer = MKLocalSearchCompleter()
private var completerDelegate: CompleterDelegate?
init() {
completerDelegate = CompleterDelegate { [weak self] results in
self?.completions = results
}
completer.delegate = completerDelegate
completer.resultTypes = [.pointOfInterest, .address]
}
func updateSearch(_ text: String) {
searchText = text
completer.queryFragment = text
}
func search(for completion: MKLocalSearchCompletion) async throws {
let request = MKLocalSearch.Request(completion: completion)
request.resultTypes = [.pointOfInterest, .address]
let search = MKLocalSearch(request: request)
let response = try await search.start()
searchResults = response.mapItems
}
func search(query: String, in region: MKCoordinateRegion) async throws {
let request = MKLocalSearch.Request()
request.naturalLanguageQuery = query
request.region = region
request.resultTypes = .pointOfInterest
let search = MKLocalSearch(request: request)
let response = try await search.start()
searchResults = response.mapItems
}
}
MKLocalSearchCompleter Delegate (Required)
class CompleterDelegate: NSObject, MKLocalSearchCompleterDelegate {
let onUpdate: ([MKLocalSearchCompletion]) -> Void
init(onUpdate: @escaping ([MKLocalSearchCompletion]) -> Void) {
self.onUpdate = onUpdate
}
func completerDidUpdateResults(_ completer: MKLocalSearchCompleter) {
onUpdate(completer.results)
}
func completer(_ completer: MKLocalSearchCompleter, didFailWithError error: Error) {
// Handle error: network issues, rate limiting
}
}
Rate Limiting
Apple rate-limits MapKit search. For autocomplete:
- MKLocalSearchCompleter handles its own throttling internally
- Don't create a new completer per keystroke; reuse one instance
- Set queryFragment on each keystroke; the completer debounces
For MKLocalSearch:
- Don't fire a search on every keystroke; use the completer for autocomplete
- Fire MKLocalSearch only when the user selects a completion or submits
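As a usage sketch, the SearchModel above can drive a searchable completion list. SearchBarView is illustrative and assumes iOS 17's @Observable bindings; it reuses the single completer by only updating queryFragment per keystroke.

```swift
import SwiftUI
import MapKit

// Illustrative view wiring the SearchModel from this section to a search field.
struct SearchBarView: View {
    @State private var model = SearchModel()

    var body: some View {
        List(model.completions, id: \.self) { completion in
            Button {
                // Full search fires only on selection, not per keystroke.
                Task { try? await model.search(for: completion) }
            } label: {
                VStack(alignment: .leading) {
                    Text(completion.title)
                    Text(completion.subtitle).font(.caption)
                }
            }
        }
        .searchable(text: $model.searchText)
        .onChange(of: model.searchText) { _, newValue in
            // One completer instance; just update the query fragment.
            model.updateSearch(newValue)
        }
    }
}
```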
Part 7: Directions Implementation Pattern
func calculateDirections(
from source: CLLocationCoordinate2D,
to destination: MKMapItem,
transportType: MKDirectionsTransportType = .automobile
) async throws -> MKRoute {
let request = MKDirections.Request()
request.source = MKMapItem(placemark: MKPlacemark(coordinate: source))
request.destination = destination
request.transportType = transportType
let directions = MKDirections(request: request)
let response = try await directions.calculate()
guard let route = response.routes.first else {
throw MapError.noRouteFound
}
return route
}
Displaying the Route (SwiftUI)
Map(position: $cameraPosition) {
if let route {
MapPolyline(route.polyline)
.stroke(.blue, lineWidth: 5)
}
Marker("Start", coordinate: startCoord)
Marker("End", coordinate: endCoord)
}
Displaying the Route (MKMapView)
// Add overlay
mapView.addOverlay(route.polyline, level: .aboveRoads)
// Implement renderer delegate
func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
if let polyline = overlay as? MKPolyline {
let renderer = MKPolylineRenderer(polyline: polyline)
renderer.strokeColor = .systemBlue
renderer.lineWidth = 5
return renderer
}
return MKOverlayRenderer(overlay: overlay)
}
Route Information
let route: MKRoute = ...
let travelTime = route.expectedTravelTime // TimeInterval in seconds
let distance = route.distance // CLLocationDistance in meters
let steps = route.steps // [MKRoute.Step]
for step in steps {
print("\(step.instructions) → \(step.distance)m")
// "Turn right on Main St → 450m"
}
Part 8: Clustering Pattern
SwiftUI (iOS 17+)
Map(position: $cameraPosition) {
ForEach(locations) { location in
Marker(location.name, coordinate: location.coordinate)
.tag(location.id)
}
.mapItemClusteringIdentifier("locations")
}
MKMapView
func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
if let cluster = annotation as? MKClusterAnnotation {
let view = mapView.dequeueReusableAnnotationView(
withIdentifier: "cluster",
for: annotation
) as! MKMarkerAnnotationView
view.markerTintColor = .systemBlue
view.glyphText = "\(cluster.memberAnnotations.count)"
return view
}
let view = mapView.dequeueReusableAnnotationView(
withIdentifier: "pin",
for: annotation
) as! MKMarkerAnnotationView
view.clusteringIdentifier = "locations"
view.markerTintColor = .systemRed
return view
}
Clustering Requirements
- All annotation views that should cluster MUST share the same clusteringIdentifier
- Register annotation view classes: mapView.register(MKMarkerAnnotationView.self, forAnnotationViewWithReuseIdentifier: "pin")
- Clustering only activates when annotations physically overlap at the current zoom level
- System manages cluster/uncluster animation automatically
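A minimal sketch of the registration step implied above (MapViewController is illustrative): without registration, dequeueReusableAnnotationView(withIdentifier:for:) raises at runtime for the "pin" and "cluster" identifiers the delegate dequeues.

```swift
import UIKit
import MapKit

// Illustrative controller registering every reuse identifier the
// viewFor-annotation delegate will dequeue. Delegate wiring omitted.
class MapViewController: UIViewController {
    let mapView = MKMapView()

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.frame = view.bounds
        view.addSubview(mapView)
        // One registration per reuse identifier used in the delegate.
        mapView.register(MKMarkerAnnotationView.self,
                         forAnnotationViewWithReuseIdentifier: "pin")
        mapView.register(MKMarkerAnnotationView.self,
                         forAnnotationViewWithReuseIdentifier: "cluster")
    }
}
```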
Part 9: Pre-Release Checklist
- Map loads and displays correctly
- Annotations appear at correct coordinates (lat/lng not swapped)
- Clustering works with 100+ annotations
- Search returns relevant results (resultTypes configured)
- Camera position controllable programmatically
- Memory stable when scrolling/zooming with many annotations
- User location shows correctly (authorization handled before display)
- Directions render as polyline overlay
- Map works in Dark Mode (map styles adapt automatically)
- Accessibility: VoiceOver announces map elements
- No setRegion/updateUIView infinite loops (if using MKMapView)
- MKLocalSearchCompleter reused (not recreated per keystroke)
- Annotation views reused via dequeueReusableAnnotationView (MKMapView)
- Look Around availability checked before displaying (MKLookAroundSceneRequest)
Resources
WWDC: 2023-10043, 2024-10094
Docs: /mapkit, /mapkit/map
Skills: mapkit-ref, mapkit-diag, core-location