Open engine
DuckDB, parquet, and a shared loc_id schema. The runtime is on
GitHub. Bring your own datasets into the same data model and run them
locally.
An open geographic query engine with a map-first interface. Ask a location-based question, get an answer grounded in maintained, source-documented packs.
Source-documented data bundles with provenance, coverage, and QA built in. The pack is the unit of installation, citation, and trust.
Explore for map browsing, Research for bounded analytical workspaces with saved corpora, Ops for event monitoring, and MCP and HTTP endpoints for agent access.
Public geographic data is fragmented across portals, formats, geographies, and update cadences. DaedalMap makes that terrain navigable.
Built by
DaedalMap is built by Bryan Hellard. He has a background in robotics and systems engineering, and focuses on disaster impacts, response, and making geographic data accessible for decision-making.
He is vice chair of STAR-TIDES, the global resilience and disaster-response network coordinated through George Mason University. Sibling project Disaster Clippy runs on the same distribution architecture, adding a conversational search layer over disaster-prep guides alongside DaedalMap's data layer.
Substack · LinkedIn · GitHub · X / Twitter · contact@daedalmap.com
Disasters, demographics, economics, climate, and risk stay queryable in one system. Same engine for map exploration, Research workspaces, operational watch, and agent calls.
Ask a question, load a corpus, inspect the source, continue into analysis.
Public datasets DaedalMap hosts after their publishers stopped distributing them. Each mirror carries provenance and drift notes per source, covering the CIA World Factbook, FEMA Future NRI, and CEJST.
The full source map: which data families are in play, what they cover, and which packs ship on the live MCP catalog.
Local install
Local DaedalMap runs entirely on your machine. Install the packs you need, work offline, keep your data private, and run 2x to 6x faster than the hosted service. Local AI models plug in directly. Nothing has to leave the machine.