Support
iPhone — Mac — Apple Vision Pro

Version 2.0 was designed for Dark Mode. If you are using Light Mode, some text and UI elements may be difficult to read. To fix this, go to Settings > Display & Brightness and select Dark. This will be addressed in the next release.
Developer: Glen Speckert
Email: speckert@specktech.com
For bug reports, feature requests, or any questions about Spatial Agents, please email the developer directly. You will typically receive a response within 48 hours.
Spatial Agents is an intelligence briefing app that uses Apple Intelligence to analyze news from multiple sources and generate structured dashboards on any topic you choose. All AI processing happens entirely on your device.
Spatial Agents uses Apple's on-device foundation model (via the FoundationModels framework) to analyze articles, generate dashboard summaries, extract key actors, build timelines, and create forecasts. This requires an Apple Intelligence-capable device: iPhone 15 Pro or later.
Go to Settings > Apple Intelligence & Siri and enable Apple Intelligence. The on-device model may need to download before first use. Spatial Agents will show a status message if the model is still being prepared.
Spatial Agents pulls from 15+ sources including Reuters world news, USGS earthquakes, NASA EONET natural events, NWS weather alerts, GDELT global events, and more. You can enable or disable individual sources in the app's settings.
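The enable/disable behavior can be sketched as a simple merge over whichever sources are switched on. The entries below are faked so the example stays self-contained; none of this is the app's actual code, and the real sources are live RSS/API feeds.

```python
from datetime import datetime, timezone

# Illustrative stand-ins: each source maps to a fetch function returning
# (title, published) tuples. Real sources (Reuters, USGS, NASA EONET, ...)
# are network feeds; here the data is hard-coded.
SOURCES = {
    "Reuters": lambda: [("Markets rally", datetime(2025, 1, 2, tzinfo=timezone.utc))],
    "USGS": lambda: [("M5.1 quake near Tokyo", datetime(2025, 1, 3, tzinfo=timezone.utc))],
    "NWS": lambda: [("Winter storm warning", datetime(2025, 1, 1, tzinfo=timezone.utc))],
}

def merged_feed(enabled):
    """Fetch every enabled source and merge entries, newest first."""
    entries = []
    for name, fetch in SOURCES.items():
        if enabled.get(name, True):  # sources default to enabled
            entries += [(title, ts, name) for title, ts in fetch()]
    return sorted(entries, key=lambda e: e[1], reverse=True)

# Disabling a source simply drops its entries from the merged feed.
feed = merged_feed({"NWS": False})
```

Disabling a source here only filters the merge; it does not affect the other feeds.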
Go to the Agents tab, enter a topic (or choose a suggested topic), and tap Generate. The on-device AI will analyze relevant articles and produce a structured briefing with status, timeline, metrics, actor networks, and forecasts.
Spatial Agents offers three levels of intelligence products, each progressively deeper:
Dashboard generates a single-pass briefing from news sources. It produces a 6-section report (status, timeline, metrics, actors, forecasts, reasoning) in one AI generation pass.
Progressive Refinement uses a two-pass approach: a quick headlines sweep followed by panel-by-panel deep refinement with focused prompts. This takes longer than a Dashboard because the AI makes multiple passes over the material, sharpening each section individually.
Deep Analysis is the most thorough option. It decomposes your topic into a hierarchical tree of sub-topics, researches each leaf independently, then integrates findings upward. This involves many AI calls and takes the longest. On devices with early Apple Intelligence support (such as iPhone 15 Pro), please be patient — each AI call runs on-device and processing times will be longer than on newer hardware.
When starting a Deep Analysis, you choose a depth level:
2L (2 Levels) — Your topic is split into a few major sub-topics, each analyzed individually. This is the fastest option and works well for focused questions.
3L (3 Levels) — Sub-topics are further decomposed into more granular themes. This provides broader coverage and more detailed findings. Good balance of depth and speed.
4L (4 Levels) — The deepest decomposition. Each branch is split three times, producing many leaf nodes that are each researched independently. This gives the most comprehensive analysis but takes significantly longer, especially on earlier Apple Intelligence devices.
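Leaf counts grow exponentially with depth, which is why 4L takes so much longer. A sketch, assuming purely for illustration that every node splits into three sub-topics (the app does not document its branching factor):

```python
def leaf_count(levels, branching=3):
    """Independently researched leaves for a decomposition with the given
    number of levels (the root topic counts as level 1) and an assumed
    branching factor."""
    return branching ** (levels - 1)

for levels in (2, 3, 4):
    print(f"{levels}L -> {leaf_count(levels)} leaf topics")
```

Under this assumption, 2L researches 3 leaves, 3L researches 9, and 4L researches 27, each leaf costing its own round of on-device model calls.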
After a Deep Analysis completes, you can navigate the results interactively:
The root tile at the top shows the overall conclusion and key finding for your topic. Below it are the Level 1 branches — tap any branch to see its sub-topics and findings.
Each tile shows a colored status dot (green = complete, orange = in progress, red = error). Tiles with articles show the number of source references used.
The Reasoning section at the bottom provides four cards you can tap through: Sources (how many references were used), Decomposition (how the topic was split), Confidence (evidence strength), and Conclusion (the AI's integrated finding).
Walk through the tree from root to leaves to understand how the AI broke down a complex topic and what evidence it found at each level.
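The root-to-leaf structure can be modeled as a small recursive walk. The node fields, status values, and sample topics below are invented for illustration; only the green/orange/red convention comes from the app.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    topic: str
    finding: str
    status: str = "complete"  # complete | in_progress | error
    children: list = field(default_factory=list)

def walk(node, level=0):
    """Depth-first walk from root to leaves, yielding one line per tile."""
    dot = {"complete": "green", "in_progress": "orange", "error": "red"}[node.status]
    yield f"{'  ' * level}[{dot}] {node.topic}: {node.finding}"
    for child in node.children:
        yield from walk(child, level + 1)

# A toy 3-level tree standing in for a real Deep Analysis result.
tree = Node("Arctic shipping", "traffic rising", children=[
    Node("Routes", "NSR transits up", children=[Node("Icebreakers", "fleet aging")]),
    Node("Policy", "new IMO rules pending", status="in_progress"),
])
print("\n".join(walk(tree)))
```

Reading the output top to bottom mirrors tapping down through branches in the app: each indent level is one level deeper in the decomposition.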
Spatial Agents runs multiple specialized agents that continuously monitor your news feeds: trend detection, severity escalation, geographic clustering, dynamic pattern analysis, and AI-powered insights. Each agent surfaces findings independently on the Agents tab.
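The "independent agents over one feed" pattern can be sketched as functions that each scan the same article list and report their own findings. The detection rules below are toy stand-ins, not the app's real heuristics.

```python
from collections import Counter

def trend_agent(articles):
    # Toy trend detection: flag any word appearing in two or more articles.
    words = Counter(w for a in articles for w in set(a.lower().split()))
    return [f"trend: {w}" for w, n in words.items() if n >= 2]

def severity_agent(articles):
    # Toy severity check: flag escalation language.
    return [f"severity: {a}" for a in articles if "warning" in a.lower()]

AGENTS = [trend_agent, severity_agent]

def run_agents(articles):
    """Each agent scans the feed independently and surfaces its own findings."""
    return {agent.__name__: agent(articles) for agent in AGENTS}

findings = run_agents([
    "Flood warning in Ohio",
    "Ohio rivers rising",
    "Flood levels peak",
])
```

Because the agents share nothing but their input, one agent failing or finding nothing never blocks the others, which matches how findings surface independently on the Agents tab.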
The app requires iPhone 15 Pro or later for dashboard generation (Apple Intelligence). Feed browsing and agent monitoring work on any supported device, but the AI-powered features need Apple Intelligence hardware.
No. All AI analysis is performed entirely on your device using Apple's on-device foundation model. The only network requests are to fetch public feeds from sources like Reuters, USGS, and NASA EONET. No personal data, topics, or dashboard content is transmitted anywhere.
Yes. Spatial Agents is also available on Mac and Apple Vision Pro with native interfaces tailored to each platform.
Spatial Agents processes all intelligence analysis on-device using Apple Intelligence. No user data, search queries, topics, or generated dashboards are collected, stored, or transmitted to any external server. The app only makes network requests to fetch publicly available RSS news feeds.