etter /ˈɛtɐ/ n. (Swiss German) — the boundary or enclosure marking the edge of a village or commune; a natural demarcation between settled and unsettled land.
Natural language geographic query parsing using LLMs.
etter transforms natural language location queries into structured geographic filters that can be used by search engines and spatial databases. It uses Large Language Models (LLMs) to understand multilingual queries and extract spatial relationships.
Key Principle: etter's sole purpose is to extract the geographic filter from user queries. It does NOT handle feature/activity identification or search execution.
Tip: Documentation is available at https://geoblocks.github.io/etter/
The development of this library is sponsored by Camptocamp.
- Geographic Filters Only: Extracts spatial relationships from queries, ignoring non-geographic content
- Multilingual Support: Parses queries in English, German, French, Italian, and more
- Rich Spatial Relations: Supports containment, buffer, and directional queries
- Structured Output: Pydantic models with full type safety
- Streaming Support: Real-time feedback with reasoning transparency for responsive UIs
- Flexible Configuration: Customizable spatial relations and confidence thresholds
- LLM Provider Agnostic: Works with OpenAI, Anthropic, or local models
✅ etter extracts:
- Spatial relations: "north of", "in", "near", etc.
- Reference locations: "Lausanne", "Lake Geneva", etc.
- Distance parameters: "within 5km", "around 2 miles", etc.
❌ etter does NOT handle:
- Feature/activity identification: "hiking", "restaurants", "hotels"
- Attribute filtering: "with children", "vegetarian", "4-star"
- Search execution or database queries
Integration Pattern: Parent application handles feature/activity filtering and combines it with etter's geographic filter for complete search functionality.
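This split can be sketched as follows. The `GeoFilter` dataclass and `build_search` helper below are illustrative stand-ins, not part of etter's API: the parent application extracts the non-geographic facets itself and merges them with the geographic filter etter returns.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical stand-in for the structure etter returns (illustrative only)
@dataclass
class GeoFilter:
    spatial_relation: str      # e.g. "near"
    reference_location: str    # e.g. "Lausanne"
    distance_m: Optional[int] = None

def build_search(query: str, geo: GeoFilter, features: list) -> dict:
    """Combine application-level feature filters with a geographic filter."""
    return {
        "original_query": query,
        "feature_filters": features,   # handled by the parent application
        "geo_filter": {                # handled by etter
            "relation": geo.spatial_relation,
            "reference": geo.reference_location,
            "distance_m": geo.distance_m,
        },
    }

search = build_search(
    "vegetarian restaurants near Lausanne",
    geo=GeoFilter("near", "Lausanne", distance_m=5000),
    features=["restaurant", "vegetarian"],
)
```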
```bash
pip install etter
```

With PostGIS datasource support:

```bash
pip install etter[postgis]
```

This project uses uv for dependency management.
```bash
git clone https://github.com/geoblocks/etter.git
cd etter
uv sync --extra dev
```

For all the demos, set a valid model name and API key in a `.env` file:
```bash
cat <<EOF > .env
LLM_API_KEY="sk-..."
LLM_MODEL="gpt-4o"
EOF
```

An interactive REPL is available for testing queries:
```bash
uv run python repl.py
```

A FastAPI demo server is available that combines query parsing with geographic resolution using SwissNames3D data:
```bash
uv run uvicorn demo.main:app --port 8000 --reload
```

The API will be available at http://localhost:8000.
```python
import os

from langchain.chat_models import init_chat_model

from etter import GeoFilterParser

# Initialize the LLM
llm = init_chat_model(
    model=os.getenv("LLM_MODEL", "gpt-4o"),
    temperature=0,
    api_key=os.getenv("LLM_API_KEY"),
)

# Initialize the parser
parser = GeoFilterParser(
    llm=llm,
    confidence_threshold=0.6,
    strict_mode=False,
)

# Strict mode raises an error on low confidence
parser = GeoFilterParser(
    llm=llm,
    confidence_threshold=0.8,
    strict_mode=True,
)
```

```python
from etter import GeoFilterParser, RelationConfig, SpatialRelationConfig

config = SpatialRelationConfig()
config.register_relation(RelationConfig(
    name="close_to",
    category="buffer",
    description="Very close proximity",
    default_distance_m=1000,
    buffer_from="center",
))

parser = GeoFilterParser(spatial_config=config)
```

GeoFilterParser is the main class for parsing queries.
Methods:
- `parse(query: str) -> GeoQuery`: Parse a single query
- `aparse(query: str) -> GeoQuery`: Async version of `parse` (awaits `ainvoke` on the LLM)
- `parse_stream(query: str) -> AsyncGenerator[dict]`: Parse with streaming events
- `parse_batch(queries: List[str]) -> List[GeoQuery]`: Parse multiple queries
- `get_available_relations(category: Optional[str]) -> List[str]`: List available relations
- `describe_relation(name: str) -> str`: Get a relation's description
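Streaming lets a UI surface reasoning while the final filter is still being produced. The sketch below consumes a stub generator rather than a live parser, and the event shape (`{"type": ..., "data": ...}`) is an assumption for illustration, not etter's documented schema:

```python
import asyncio
from typing import AsyncGenerator

# Stub standing in for parser.parse_stream(query); the event dict shape
# used here is an assumption for illustration.
async def fake_parse_stream(query: str) -> AsyncGenerator[dict, None]:
    yield {"type": "reasoning", "data": "Detected relation 'near'"}
    yield {"type": "result", "data": {"spatial_relation": "near",
                                      "reference_location": "Lausanne"}}

async def consume(query: str) -> dict:
    result = {}
    async for event in fake_parse_stream(query):
        if event["type"] == "reasoning":
            print(event["data"])     # surface reasoning to the UI as it arrives
        elif event["type"] == "result":
            result = event["data"]   # keep the final structured filter
    return result

final = asyncio.run(consume("near Lausanne"))
```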
Constructor options (selected):
- `confidence_threshold`: Minimum confidence to accept (0–1, default `0.6`)
- `strict_mode`: Raise `LowConfidenceError` instead of warning (default `False`)
- `include_examples`: Include few-shot examples in the prompt (default `True`)
- `datasource`: `GeoDataSource` instance; informs the LLM of available concrete types
- `additional_instructions`: Free-form text injected as a system message after the main prompt and before the few-shot examples. Use for region-specific endonyms, domain aliases, or organization-specific place names.
GeoQuery is the structured output model representing the parsed geographic filter.
Attributes:
- `query_type`: Type of query (`simple`, `compound`, `split`, `boolean`)
- `spatial_relation`: Spatial relationship (e.g., `"north_of"`, `"in"`, `"near"`)
- `reference_location`: Reference location (e.g., `"Lausanne"`)
- `buffer_config`: Buffer parameters (optional)
- `confidence_breakdown`: Confidence scores
- `original_query`: Original input text
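A typical downstream consumer reads these attributes and gates on the confidence scores. The dataclass below only mirrors the documented attribute names for illustration (the real model is a Pydantic class), and the per-score threshold check is an assumed pattern, not etter's internal logic:

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative mirror of the documented GeoQuery attributes; the real
# model is a Pydantic class, this stand-in just makes the example runnable.
@dataclass
class GeoQueryExample:
    query_type: str
    spatial_relation: str
    reference_location: str
    buffer_config: Optional[dict]
    confidence_breakdown: dict
    original_query: str

result = GeoQueryExample(
    query_type="simple",
    spatial_relation="north_of",
    reference_location="Lausanne",
    buffer_config={"distance_m": 5000},
    confidence_breakdown={"relation": 0.95, "location": 0.9},
    original_query="5km north of Lausanne",
)

# Accept the parse only if every confidence score clears the threshold
accepted = all(score >= 0.6 for score in result.confidence_breakdown.values())
```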
Note: etter is fully implemented with three integrated layers: parsing, geographic resolution via datasources, and spatial operations. The demo API shows a complete end-to-end workflow that resolves locations and computes search areas.
- `in`: Exact boundary matching
- `near`: Proximity with context-aware distance (default 5km; the LLM infers distance based on activity, feature scale, and intent)
- `on_shores_of`: 1km ring buffer (excludes the water body)
- `along`: 500m buffer for linear features
- `left_bank`, `right_bank`: Buffer on one side of a linear feature (river, road) relative to its flow direction
- `in_the_heart_of`: Erosion for central areas (default -500m; the LLM infers based on area size)
- Cardinal: `north_of`, `south_of`, `east_of`, `west_of`: 10km sector (90° each)
- Diagonal: `northeast_of`, `southeast_of`, `southwest_of`, `northwest_of`: 10km sector (90° each)
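A 90° sector amounts to a bearing-and-distance test. The sketch below is an illustrative reading of that geometry, not etter's implementation: a candidate point matches a directional relation if it lies within the radius and within 45° of the sector's center bearing.

```python
# Illustrative sector center bearings in degrees clockwise from north
# (assumed for this example; not etter internals).
SECTOR_CENTER = {"north_of": 0, "east_of": 90, "south_of": 180, "west_of": 270,
                 "northeast_of": 45, "southeast_of": 135,
                 "southwest_of": 225, "northwest_of": 315}

def in_sector(relation: str, bearing_deg: float, distance_m: float,
              radius_m: float = 10_000, width_deg: float = 90) -> bool:
    """True if a candidate at (bearing, distance) from the reference point
    falls in the relation's sector: within radius_m, and within
    width_deg/2 of the sector's center bearing."""
    center = SECTOR_CENTER[relation]
    # Smallest angular difference between the two bearings (0..180)
    delta = abs((bearing_deg - center + 180) % 360 - 180)
    return distance_m <= radius_m and delta <= width_deg / 2

# A point 5km away at bearing 30° counts as "north_of" (315°..45° sector)...
print(in_sector("north_of", 30, 5_000))   # True
# ...but a point at bearing 60° falls in the "northeast_of" sector instead.
print(in_sector("north_of", 60, 5_000))   # False
```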
```python
from etter import ParsingError, UnknownRelationError, LowConfidenceError

try:
    result = parser.parse("some query")
except ParsingError as e:
    print(f"Failed to parse: {e}")
    print(f"Raw LLM response: {e.raw_response}")
except UnknownRelationError as e:
    print(f"Unknown relation: {e.relation_name}")
except LowConfidenceError as e:
    print(f"Low confidence: {e.confidence}")
    print(f"Reasoning: {e.reasoning}")
```

Here are some good example queries to try with the demo application:
- walk in the Gros-de-Vaud
- on the shores of the lac Morat
- near Lausanne
- south west of Lausanne
- 5km north of Lausanne
- walking distance from Zurich main railway station
- 15 min biking from Zurich main railway station
- along l'Orbe
- 2km right bank of the Rhône
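Queries like "15 min biking" imply a travel-time buffer rather than an explicit distance. A naive conversion multiplies an assumed mode speed by the duration; the speeds below are illustrative assumptions, not etter's configuration.

```python
# Illustrative mode speeds in km/h (assumptions for this sketch,
# not values used by etter).
MODE_SPEED_KMH = {"walking": 5, "biking": 15, "driving": 60}

def travel_time_to_distance_m(mode: str, minutes: float) -> float:
    """Convert a travel-time expression to a straight-line buffer distance."""
    return MODE_SPEED_KMH[mode] * 1000 / 60 * minutes

# "15 min biking" -> 15 km/h for a quarter hour = 3750 m
print(travel_time_to_distance_m("biking", 15))   # 3750.0
```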
See ARCHITECTURE.md for detailed system design.
```bash
# Install dev dependencies
uv sync --extra dev

# Run tests
uv run pytest

# Format code
uv run ruff format

# Linting
uv run ruff check

# Type checking
uv run ty check
```

Releases are automated via Release Please. The version and changelog are determined by PR titles using the Conventional Commits format:
| PR title prefix | Version bump |
|---|---|
| `feat: ...` | minor (0.x.0) |
| `fix: ...`, `perf: ...` | patch (0.0.x) |
| `feat!: ...` or `BREAKING CHANGE` | major (x.0.0) |
All other prefixes (`chore:`, `docs:`, `refactor:`, `test:`, `ci:`) do not trigger a release.
When a releasable commit lands on main, Release Please opens a release PR. Merging it tags the commit and triggers the PyPI publish workflow.

