diff --git a/AGENTS.md b/AGENTS.md
new file mode 100644
index 0000000..55f438d
--- /dev/null
+++ b/AGENTS.md
@@ -0,0 +1,146 @@
+# AGENTS.md
+
+## Purpose
+
+This repository contains the Mintlify source for [docs.tilebox.com](https://docs.tilebox.com). Use this file to keep documentation updates consistent with the current information architecture, writing style, and tooling.
+
+## Information Architecture
+
+Navigation is defined in `docs.json` and currently follows this top-level structure:
+
+1. `Documentation`: product docs, concepts, and capability docs for Datasets, Storage, and Workflows.
+2. `User Guides`: task-focused walkthroughs (mostly step-by-step procedures).
+3. `Languages & SDKs`: language-specific install and usage docs for Python and Go.
+4. `API Reference`: method-level reference pages for Python and Go clients.
+5. `Changelog`: product updates.
+
+When adding or moving pages:
+
+1. Keep `docs.json` in sync with actual files.
+2. Keep pages grouped by product area (`datasets/`, `workflows/`, `storage/`, `sdks/`, `guides/`, `api-reference/`).
+3. Preserve the current pattern where high-level landing pages link to deeper pages via `Card`/`HeroCard` blocks.
+
+## Diátaxis Mapping
+
+We follow [Diátaxis](https://diataxis.fr/). Pick one primary intent per page and keep it focused.
+
+1. Tutorial: onboarding learning journeys (for example `quickstart.mdx`).
+2. How-to guide: concrete goals via ordered steps (primarily under `guides/**`).
+3. Explanation: conceptual understanding and mental models (for example `datasets/concepts/**`, `workflows/concepts/**`, introductions).
+4. Reference: factual lookup docs (primarily `api-reference/**`, parameter tables, changelog entries).
+
+Practical rule: if a page starts drifting into multiple intents, split it into separate pages and cross-link them.
+
+## Writing Style And Tone
+
+Follow the existing house style:
+
+1. Audience: developers and technical users integrating Tilebox.
+2. Voice: direct, clear, and pragmatic; prefer active voice and short paragraphs.
+3. Person: usually second person (`you`) for guidance.
+4. Tense: present tense for behavior and capabilities.
+5. Scope: explain what the reader needs now; avoid broad marketing language.
+
+Common page flow patterns in this repo:
+
+1. Short context paragraph after frontmatter.
+2. `Related documentation` card section when useful.
+3. Step-by-step sections for procedures.
+4. `Next steps` links/cards at the end.
+
+## Terminology, Capitalization, And Naming
+
+Use consistent product language:
+
+1. `Tilebox` (capital T) everywhere.
+2. Product/module names: `Tilebox Console`, `Tilebox Datasets`, `Tilebox Workflows`.
+3. Generic references are lowercase: `datasets`, `workflows`, `client`, `collection`, `job`, `task`.
+4. Dataset kind names used in docs: `Timeseries` and `Spatio-temporal`.
+5. Keep acronyms uppercase (`API`, `SDK`, `MCP`, `UUID`, `UTC`).
+
+Heading style in the existing docs is mostly concise sentence-style phrases (with selective title casing). Match nearby pages rather than enforcing a new global style.
+
+## MDX And Mintlify Conventions
+
+For new non-reference pages, include frontmatter fields matching existing docs:
+
+1. `title`
+2. `description`
+3. `icon`
+4. Optional short `sidebarTitle` if the title is too long.
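+
+A minimal frontmatter sketch (field values are illustrative, not copied from a real page):
+
+```mdx
+---
+title: Querying data
+sidebarTitle: Query
+description: How to query datapoints from a Tilebox dataset.
+icon: database
+---
+```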
+
+Prefer existing Mintlify components and patterns:
+
+1. `CodeGroup` for multi-language examples (usually Python first, Go second when both exist).
+2. `Steps`/`Step` for procedural flows.
+3. `Tip`/`Note`/`Info`/`Warning` for callouts.
+4. `Columns` + `Card` for internal navigation.
+5. `Frame` for screenshots.
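+
+A minimal sketch of the `Steps` and `Tip` patterns (step content is illustrative):
+
+```mdx
+<Steps>
+  <Step title="Create a client">
+    Instantiate the Tilebox client with your API key.
+  </Step>
+  <Step title="Query data">
+    Query a dataset with a temporal extent.
+  </Step>
+</Steps>
+
+<Tip>
+  Use `CodeGroup` for multi-language examples, Python first and Go second.
+</Tip>
+```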
+
+Images should follow the existing light/dark pattern when both variants exist:
+
+```mdx
+{/* placeholder paths for illustration, match the actual asset names in this repo */}
+<img className="block dark:hidden" src="/assets/example-light.png" alt="Example" />
+<img className="hidden dark:block" src="/assets/example-dark.png" alt="Example" />
+```
+
+## Dev Tooling
+
+Core tooling used in this repo:
+
+1. Mintlify CLI for local preview and link checks.
+2. Vale for prose linting.
+3. `pre-commit` hook running Vale.
+4. GitHub Actions CI for Vale + broken link checks.
+
+Local setup and validation commands:
+
+```bash
+# install tooling
+npm i -g mintlify
+vale sync
+pre-commit install
+
+# run docs locally
+mintlify dev
+
+# lint prose
+vale .
+
+# check broken links
+mintlify broken-links
+
+# optional: run all hooks
+pre-commit run --all-files
+```
+
+CI notes:
+
+1. CI installs Node `24`.
+2. CI installs `mdx2vast` before running Vale.
+3. CI runs `vale sync && vale .` and `mintlify broken-links`.
+
+## Diagrams And Assets
+
+Workflow diagrams under `assets/workflows/diagrams/` are generated from `.d2` files via `generate.py`.
+
+When updating workflow diagrams:
+
+1. Edit the `.d2` source.
+2. Regenerate SVG assets with `python generate.py` from `assets/workflows/diagrams/`.
+3. Commit both source and generated SVG outputs.
+
+## Update Checklist For Agents
+
+For any substantial docs update, verify all of the following:
+
+1. Page intent matches a single Diátaxis category.
+2. Page is linked in `docs.json` in the correct section.
+3. Internal links resolve and `mintlify broken-links` passes.
+4. Vale warnings/errors are addressed or intentionally accepted.
+5. New screenshots/assets are optimized and use correct paths.
+6. Language and terminology match surrounding Tilebox docs.
+
+## Notes On API Reference Pages
+
+`api-reference/**` pages are reference-first and should remain concise and factual. Keep examples minimal and parameter descriptions explicit. Avoid turning reference pages into long tutorials; link to guides instead.
diff --git a/api-reference/go/datasets/As.mdx b/api-reference/go/datasets/As.mdx
index 27e3a52..f88f6b4 100644
--- a/api-reference/go/datasets/As.mdx
+++ b/api-reference/go/datasets/As.mdx
@@ -35,7 +35,12 @@ endDate := time.Date(2021, 2, 24, 0, 0, 0, 0, time.UTC)
queryInterval := query.NewTimeInterval(startDate, endDate)
seq := datasets.As[*v1.Sentinel1Sar](
- client.Datapoints.Query(ctx, collectionID, datasets.WithTemporalExtent(queryInterval)),
+ client.Datapoints.Query(
+ ctx,
+ datasetID,
+ datasets.WithCollectionIDs(collectionID),
+ datasets.WithTemporalExtent(queryInterval),
+ ),
)
```
diff --git a/api-reference/go/datasets/Collect.mdx b/api-reference/go/datasets/Collect.mdx
index daf2ae9..705f47e 100644
--- a/api-reference/go/datasets/Collect.mdx
+++ b/api-reference/go/datasets/Collect.mdx
@@ -35,7 +35,12 @@ endDate := time.Date(2021, 2, 24, 0, 0, 0, 0, time.UTC)
queryInterval := query.NewTimeInterval(startDate, endDate)
datapoints, err := datasets.Collect(datasets.As[*v1.Sentinel1Sar](
- client.Datapoints.Query(ctx, collectionID, datasets.WithTemporalExtent(queryInterval)),
+ client.Datapoints.Query(
+ ctx,
+ datasetID,
+ datasets.WithCollectionIDs(collectionID),
+ datasets.WithTemporalExtent(queryInterval),
+ ),
))
```
diff --git a/api-reference/go/datasets/CollectAs.mdx b/api-reference/go/datasets/CollectAs.mdx
index 76c2c04..a20d781 100644
--- a/api-reference/go/datasets/CollectAs.mdx
+++ b/api-reference/go/datasets/CollectAs.mdx
@@ -37,7 +37,12 @@ endDate := time.Date(2021, 2, 24, 0, 0, 0, 0, time.UTC)
queryInterval := query.NewTimeInterval(startDate, endDate)
datapoints, err := datasets.CollectAs[*v1.Sentinel1Sar](
- client.Datapoints.Query(ctx, collectionID, datasets.WithTemporalExtent(queryInterval)),
+ client.Datapoints.Query(
+ ctx,
+ datasetID,
+ datasets.WithCollectionIDs(collectionID),
+ datasets.WithTemporalExtent(queryInterval),
+ ),
)
```
diff --git a/api-reference/go/datasets/Datapoints.GetInto.mdx b/api-reference/go/datasets/Datapoints.GetInto.mdx
index 0c50ad5..be5f947 100644
--- a/api-reference/go/datasets/Datapoints.GetInto.mdx
+++ b/api-reference/go/datasets/Datapoints.GetInto.mdx
@@ -7,21 +7,21 @@ icon: layer-group
```go
func (datapointClient) GetInto(
ctx context.Context,
- collectionIDs []uuid.UUID,
+ datasetID uuid.UUID,
datapointID uuid.UUID,
datapoint proto.Message,
options ...QueryOption,
) error
```
-Get a data point by its id from one of the specified collections.
+Get a datapoint by its ID from one or more collections of the same dataset.
The data point is stored in the `datapoint` parameter.
## Parameters
-
- The ids of the collections to query
+
+ The ID of the dataset to query.
The id of the datapoint to query
@@ -35,6 +35,12 @@ The data point is stored in the `datapoint` parameter.
## Options
+
+ Restrict the lookup to specific dataset collections by collection object.
+
+
+ Restrict the lookup to specific dataset collections by collection ID.
+
Skip the data when querying datapoint.
If set, only the required and auto-generated fields will be returned.
@@ -48,7 +54,10 @@ An error if data point could not be queried.
```go Go
var datapoint v1.Sentinel1Sar
err = client.Datapoints.GetInto(ctx,
- []uuid.UUID{collection.ID}, datapointID, &datapoint,
+ dataset.ID,
+ datapointID,
+ &datapoint,
+ datasets.WithCollectionIDs(collection.ID),
)
```
diff --git a/api-reference/go/datasets/Datapoints.Query.mdx b/api-reference/go/datasets/Datapoints.Query.mdx
index 53c4443..8387836 100644
--- a/api-reference/go/datasets/Datapoints.Query.mdx
+++ b/api-reference/go/datasets/Datapoints.Query.mdx
@@ -7,23 +7,23 @@ icon: layer-group
```go
func (datapointClient) Query(
ctx context.Context,
- collectionIDs []uuid.UUID,
+ datasetID uuid.UUID,
options ...datasets.QueryOption,
) iter.Seq2[[]byte, error]
```
-Query a range of data points in the specified collections in a specified interval.
+Query datapoints from one or more collections of the same dataset.
The datapoints are lazily queried and returned as a sequence of bytes.
The output sequence can be transformed into a typed `proto.Message` using [CollectAs](/api-reference/go/datasets/CollectAs) or [As](/api-reference/go/datasets/As) functions.
## Parameters
-
- The ids of the collections to query
+
+ The ID of the dataset to query.
- Options for querying data points
+ Options for querying datapoints.
## Options
@@ -36,6 +36,15 @@ The output sequence can be transformed into a typed `proto.Message` using [Colle
Specify the geographical extent in which to query data.
Optional, if not specified the query will return all results found globally.
+
+ Specify a geographical extent with an explicit spatial filter mode and coordinate system.
+
+
+ Restrict the query to specific dataset collections by collection object.
+
+
+ Restrict the query to specific dataset collections by collection ID.
+
Skip the data when querying datapoints.
If set, only the required and auto-generated fields will be returned.
@@ -58,7 +67,12 @@ endDate := time.Date(2021, 2, 24, 0, 0, 0, 0, time.UTC)
queryInterval := query.NewTimeInterval(startDate, endDate)
datapoints, err := datasets.CollectAs[*v1.Sentinel1Sar](
- client.Datapoints.Query(ctx, []uuid.UUID{collection.ID}, datasets.WithTemporalExtent(queryInterval)),
+ client.Datapoints.Query(
+ ctx,
+ dataset.ID,
+ datasets.WithCollectionIDs(collection.ID),
+ datasets.WithTemporalExtent(queryInterval),
+ ),
)
```
diff --git a/api-reference/go/datasets/Datapoints.QueryInto.mdx b/api-reference/go/datasets/Datapoints.QueryInto.mdx
index 85e565d..f3bc41b 100644
--- a/api-reference/go/datasets/Datapoints.QueryInto.mdx
+++ b/api-reference/go/datasets/Datapoints.QueryInto.mdx
@@ -7,20 +7,20 @@ icon: layer-group
```go
func (datapointClient) QueryInto(
ctx context.Context,
- collectionIDs []uuid.UUID,
+ datasetID uuid.UUID,
datapoints any,
options ...datasets.QueryOption,
) error
```
-Query a range of data points in the specified collections in a specified interval.
+Query datapoints from one or more collections of the same dataset into a slice.
QueryInto is a convenience function for [Query](/api-reference/go/datasets/Datapoints.Query), when no manual pagination or custom iteration is required.
## Parameters
-
- The ids of the collections to query
+
+ The ID of the dataset to query.
The datapoints to query into
@@ -39,6 +39,15 @@ QueryInto is a convenience function for [Query](/api-reference/go/datasets/Datap
Specify the geographical extent in which to query data.
Optional, if not specified the query will return all results found globally.
+
+ Specify a geographical extent with an explicit spatial filter mode and coordinate system.
+
+
+ Restrict the query to specific dataset collections by collection object.
+
+
+ Restrict the query to specific dataset collections by collection ID.
+
Skip the data when querying datapoints.
If set, only the required and auto-generated fields will be returned.
@@ -62,8 +71,9 @@ queryInterval := query.NewTimeInterval(startDate, endDate)
var datapoints []*v1.Sentinel1Sar
err := client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID},
+ dataset.ID,
&datapoints,
+ datasets.WithCollectionIDs(collection.ID),
datasets.WithTemporalExtent(queryInterval),
)
```
diff --git a/api-reference/python/tilebox.datasets/Dataset.find.mdx b/api-reference/python/tilebox.datasets/Dataset.find.mdx
new file mode 100644
index 0000000..c693e22
--- /dev/null
+++ b/api-reference/python/tilebox.datasets/Dataset.find.mdx
@@ -0,0 +1,82 @@
+---
+title: Dataset.find
+icon: database
+---
+
+```python
+def Dataset.find(
+ datapoint_id: str | UUID,
+ collections: list[str] | list[UUID] | list[Collection] | list[CollectionInfo] | list[CollectionClient] | None = None,
+ skip_data: bool = False,
+) -> xarray.Dataset
+```
+
+Find a specific datapoint by ID across one or more collections in this dataset.
+
+If `collections` is not provided, all collections in the dataset are searched.
+
+## Parameters
+
+
+ The ID of the datapoint to find.
+
+
+
+ Optional collection scope for the lookup.
+
+ Supported values include collection names, collection IDs, and collection objects. If omitted or set to `None`, all collections in the dataset are searched.
+
+
+
+ If `True`, only required datapoint fields are returned (`time`, `id`, `ingestion_time`). Defaults to `False`.
+
+
+## Returns
+
+An [`xarray.Dataset`](/sdks/python/xarray) containing the requested datapoint.
+
+Since it returns only a single datapoint, the output dataset does not include a `time` dimension.
+
+## Errors
+
+
+ Raised when `datapoint_id` is not a valid UUID.
+
+
+
+ Raised when one or more collection names do not exist in the dataset.
+
+
+
+ Raised when one or more provided collection IDs/objects are not part of the dataset.
+
+
+
+ Raised when the datapoint cannot be found in the selected collections.
+
+
+
+```python Python
+# search all collections in the dataset
+datapoint = dataset.find("0186d6b6-66cc-fcfd-91df-bbbff72499c3")
+
+# search specific collections only
+datapoint = dataset.find(
+ "0186d6b6-66cc-fcfd-91df-bbbff72499c3",
+ collections=["S2A_S2MSI2A", "S2B_S2MSI2A"],
+)
+
+# check if a datapoint exists
+from tilebox.datasets.sync.dataset import NotFoundError
+
+try:
+ dataset.find(
+ "0186d6b6-66cc-fcfd-91df-bbbff72499c3",
+ collections=["S2A_S2MSI2A"],
+ skip_data=True,
+ )
+ exists = True
+except NotFoundError:
+ exists = False
+```
+
diff --git a/api-reference/python/tilebox.datasets/Dataset.query.mdx b/api-reference/python/tilebox.datasets/Dataset.query.mdx
new file mode 100644
index 0000000..47ab957
--- /dev/null
+++ b/api-reference/python/tilebox.datasets/Dataset.query.mdx
@@ -0,0 +1,92 @@
+---
+title: Dataset.query
+icon: database
+---
+
+```python
+def Dataset.query(
+ *,
+ collections: list[str] | list[UUID] | list[Collection] | list[CollectionInfo] | list[CollectionClient] | dict[str, CollectionClient] | None = None,
+ temporal_extent: TimeIntervalLike,
+ spatial_extent: SpatialFilterLike | None = None,
+ skip_data: bool = False,
+ show_progress: bool | Callable[[float], None] = False,
+) -> xarray.Dataset
+```
+
+Query data points across one or more collections in this dataset.
+
+If `collections` is not provided, all collections in the dataset are queried.
+If no data matches the filters, an empty `xarray.Dataset` is returned.
+
+## Parameters
+
+
+ Optional collection scope for the query.
+
+ Supported values include:
+ - A list of collection names (`list[str]`)
+ - A list of collection IDs (`list[UUID]`)
+ - A list of collection objects (`list[Collection]`, `list[CollectionInfo]`, `list[CollectionClient]`)
+ - The dictionary returned by `dataset.collections()`
+
+ If omitted or set to `None`, all collections in the dataset are queried.
+
+
+
+ The time or time interval to query. This can be a single time scalar, a tuple of two time scalars, or a `TimeInterval` object.
+
+
+
+ Optional spatial filter. Use this for spatial queries in spatio-temporal datasets.
+
+
+
+ If `True`, only required datapoint fields are returned (`time`, `id`, `ingestion_time`). Defaults to `False`.
+
+
+
+ If `True`, display a progress bar when pagination is required. You can also pass a callback to receive progress values between `0` and `1`. Defaults to `False`.
+
+
+## Returns
+
+An [`xarray.Dataset`](/sdks/python/xarray) containing matching datapoints.
+
+## Errors
+
+
+ Raised when `temporal_extent` is not provided.
+
+
+
+ Raised when one or more collection names do not exist in the dataset.
+
+
+
+ Raised when one or more provided collection IDs/objects are not part of the dataset.
+
+
+
+```python Python
+# query all collections in the dataset
+data = dataset.query(
+ temporal_extent=("2025-04-01", "2025-05-01"),
+)
+
+# query selected collections by name
+data = dataset.query(
+ collections=["S2A_S2MSI2A", "S2B_S2MSI2A"],
+ temporal_extent=("2025-04-01", "2025-05-01"),
+ show_progress=True,
+)
+
+# query selected collections by object
+collections = dataset.collections()
+data = dataset.query(
+ collections=[collections["S2A_S2MSI2A"], collections["S2B_S2MSI2A"]],
+ temporal_extent=("2025-04-01", "2025-05-01"),
+ skip_data=True,
+)
+```
+
diff --git a/console.mdx b/console.mdx
index d914981..5648dbb 100644
--- a/console.mdx
+++ b/console.mdx
@@ -58,7 +58,9 @@ timeInterval := query.NewTimeInterval(startDate, endDate)
var datapoints []*v1.Sentinel2Msi
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID}, &datapoints,
+ dataset.ID,
+ &datapoints,
+ datasets.WithCollectionIDs(collection.ID),
datasets.WithTemporalExtent(timeInterval),
)
if err != nil {
diff --git a/datasets/delete.mdx b/datasets/delete.mdx
index 8fe2f58..fc2f1d2 100644
--- a/datasets/delete.mdx
+++ b/datasets/delete.mdx
@@ -101,6 +101,7 @@ n_deleted = collection.delete(to_delete)
print(f"Deleted {n_deleted} data points.")
```
```go Go
+datasetID := uuid.MustParse("25a6f262-f6eb-4de5-be4f-b021f4f7dd13")
collectionID := uuid.MustParse("c5145c99-1843-4816-9221-970f9ce3ac93")
startDate := time.Date(2023, time.May, 1, 0, 0, 0, 0, time.UTC)
endDate := time.Date(2023, time.June, 1, 0, 0, 0, 0, time.UTC)
@@ -108,7 +109,9 @@ mai2023 := query.NewTimeInterval(startDate, endDate)
var toDelete []*v1.Sentinel2Msi
err := client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collectionID}, &toDelete,
+ datasetID,
+ &toDelete,
+ datasets.WithCollectionIDs(collectionID),
datasets.WithTemporalExtent(mai2023),
datasets.WithSkipData(),
)
diff --git a/datasets/ingest.mdx b/datasets/ingest.mdx
index e14bd7d..7407f26 100644
--- a/datasets/ingest.mdx
+++ b/datasets/ingest.mdx
@@ -284,7 +284,9 @@ endDate := time.Date(2025, time.March, 29, 0, 0, 0, 0, time.UTC)
var dataToCopy []*v1.MyCustomDataset
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{srcCollection.ID}, &dataToCopy,
+ dataset.ID,
+ &dataToCopy,
+ datasets.WithCollectionIDs(srcCollection.ID),
datasets.WithTemporalExtent(query.NewTimeInterval(startDate, endDate)),
)
if err != nil {
@@ -402,4 +404,3 @@ Check out the [Ingestion from common file formats](/guides/datasets/ingest-forma
Ingesting Geometries can traditionally be a bit tricky, especially when working with geometries that cross the antimeridian or cover a pole.
Tilebox is designed to take away most of the friction involved in this, but it's still recommended to follow the [best practices for handling geometries](/datasets/geometries).
-
diff --git a/datasets/query/filter-by-id.mdx b/datasets/query/filter-by-id.mdx
index cb3b872..7f823f5 100644
--- a/datasets/query/filter-by-id.mdx
+++ b/datasets/query/filter-by-id.mdx
@@ -1,30 +1,76 @@
---
title: Querying individual datapoints by ID
sidebarTitle: Filter by ID
-description: Look up specific datapoints from a dataset collection by providing their unique identifiers, without needing to construct and execute a broader query.
+description: Look up specific datapoints by ID across one or more dataset collections, without needing to construct and execute a broader query.
icon: fingerprint
---
-If you already know the ID of the data point you want to query, you can query it directly, using
-[Collection.find](/api-reference/python/tilebox.datasets/Collection.find) in Python or [Datapoints.GetInto](/api-reference/go/datasets/Datapoints.GetInto) in Go.
-
-
- Check out [selecting a collection](/datasets/query/querying-data#selecting-a-collection) to learn how to get a collection object
- to query from.
-
+If you already know the ID of the datapoint you want to query, you can fetch it directly without needing to construct and execute a broader query.
+You can look up a datapoint ID in a single collection of a dataset, in a selected set of collections, or across all collections of a dataset at once.
```python Python
datapoint_id = "0197a491-1520-102f-48f4-f087d6ef8603"
-datapoint = collection.find(datapoint_id)
+
+# query in all collections of a dataset
+datapoint = dataset.find(datapoint_id)
+
+# query in selected collections of a dataset
+datapoint = dataset.find(
+ datapoint_id,
+ collections=["S2A_S2MSI2A", "S2B_S2MSI2A"],
+)
+
+# query in a single collection
+datapoint = dataset.collection("S2A_S2MSI2A").find(
+ datapoint_id
+)
+
print(datapoint)
```
```go Go
datapointID := uuid.MustParse("0197a491-1520-102f-48f4-f087d6ef8603")
+// query in all collections of a dataset
var datapoint v1.Sentinel2Msi
err = client.Datapoints.GetInto(ctx,
- []uuid.UUID{collection.ID}, datapointID, &datapoint,
+ dataset.ID,
+ datapointID,
+ &datapoint,
+)
+if err != nil {
+ log.Fatalf("Failed to query datapoint: %v", err)
+}
+
+collectionA, err := client.Collections.Get(ctx, dataset.ID, "S2A_S2MSI2A")
+if err != nil {
+ log.Fatalf("Failed to get collection: %v", err)
+}
+
+collectionB, err := client.Collections.Get(ctx, dataset.ID, "S2B_S2MSI2A")
+if err != nil {
+ log.Fatalf("Failed to get collection: %v", err)
+}
+
+// query in selected collections of a dataset
+var datapointFromSelectedCollections v1.Sentinel2Msi
+err = client.Datapoints.GetInto(ctx,
+ dataset.ID,
+ datapointID,
+ &datapointFromSelectedCollections,
+ datasets.WithCollectionIDs(collectionA.ID, collectionB.ID),
+)
+if err != nil {
+ log.Fatalf("Failed to query datapoint: %v", err)
+}
+
+// query in a single collection
+var datapointFromSingleCollection v1.Sentinel2Msi
+err = client.Datapoints.GetInto(ctx,
+ dataset.ID,
+ datapointID,
+ &datapointFromSingleCollection,
+ datasets.WithCollectionIDs(collectionA.ID),
)
if err != nil {
log.Fatalf("Failed to query datapoint: %v", err)
@@ -110,7 +156,7 @@ from tilebox.datasets.sync.dataset import NotFoundError
datapoint_id = "0197a47f-a830-1160-6df5-61ac723dae17" # doesn't exist
try:
- collection.find(datapoint_id)
+ dataset.find(datapoint_id)
exists = True
except NotFoundError:
exists = False
@@ -118,10 +164,12 @@ except NotFoundError:
```go Go
datapointID := uuid.MustParse("0197a47f-a830-1160-6df5-61ac723dae17")
-exists := True
+exists := true
var datapoint v1.Sentinel2Msi
err = client.Datapoints.GetInto(ctx,
- []uuid.UUID{collection.ID}, datapointID, &datapoint,
+ dataset.ID,
+ datapointID,
+ &datapoint,
)
if err != nil {
if connect.CodeOf(err) == connect.CodeNotFound {
diff --git a/datasets/query/filter-by-location.mdx b/datasets/query/filter-by-location.mdx
index 5eb1885..ce68cca 100644
--- a/datasets/query/filter-by-location.mdx
+++ b/datasets/query/filter-by-location.mdx
@@ -15,7 +15,7 @@ When querying, you can specify arbitrary geometries as an area of interest. Tile
To filter by an area of interest, use a `Polygon` or `MultiPolygon` geometry as the spatial extent parameter.
-Here is how to query Sentinel-2 `S2A_S2MSI2A` data over Colorado for a specific day in April 2025.
+Here is how to query Sentinel-2 `L2A` data over Colorado for a specific day in April 2025.
```python Python
@@ -28,15 +28,15 @@ area = Polygon( # area roughly covering the state of Colorado
client = Client()
sentinel2_msi = client.dataset("open_data.copernicus.sentinel2_msi")
-collection = sentinel2_msi.collection("S2A_S2MSI2A")
-data = collection.query(
+data = sentinel2_msi.query(
+ collections=["S2A_S2MSI2A", "S2B_S2MSI2A", "S2C_S2MSI2A"],
temporal_extent=("2025-04-02", "2025-04-03"),
spatial_extent=area,
)
```
```go Go
startDate := time.Date(2025, 4, 2, 0, 0, 0, 0, time.UTC)
-endDate := time.Date(2025, 4, 3, 0, 0, 0, 0, time.UTC)
+endDate := time.Date(2025, 4, 3, 0, 0, 0, 0, time.UTC)
area := orb.Polygon{ // area roughly covering the state of Colorado
{{-109.05, 41.00}, {-109.045, 37.0}, {-102.05, 37.0}, {-102.05, 41.00}, {-109.05, 41.00}},
}
@@ -54,10 +54,21 @@ if err != nil {
log.Fatalf("Failed to get collection: %v", err)
}
+collectionB, err := client.Collections.Get(ctx, dataset.ID, "S2B_S2MSI2A")
+if err != nil {
+ log.Fatalf("Failed to get collection: %v", err)
+}
+
+collectionC, err := client.Collections.Get(ctx, dataset.ID, "S2C_S2MSI2A")
+if err != nil {
+ log.Fatalf("Failed to get collection: %v", err)
+}
+
var datapoints []*v1.Sentinel2Msi
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID},
+ dataset.ID,
&datapoints,
+ datasets.WithCollectionIDs(collection.ID, collectionB.ID, collectionC.ID),
datasets.WithTemporalExtent(query.NewTimeInterval(startDate, endDate)),
datasets.WithSpatialExtent(area),
)
@@ -92,7 +103,7 @@ area = Polygon( # area roughly covering the state of Colorado
((-109.05, 41.00), (-109.045, 37.0), (-102.05, 37.0), (-102.05, 41.00), (-109.05, 41.00)),
)
-data = collection.query(
+data = dataset.query(
temporal_extent=("2025-04-02", "2025-04-03"),
# intersects is the default, so can also be omitted entirely
spatial_extent={"geometry": area, "mode": "intersects"},
@@ -101,14 +112,14 @@ print(f"There are {data.sizes['time']} Sentinel-2A granules intersecting the are
```
```go Go
startDate := time.Date(2025, 4, 2, 0, 0, 0, 0, time.UTC)
-endDate := time.Date(2025, 4, 3, 0, 0, 0, 0, time.UTC)
+endDate := time.Date(2025, 4, 3, 0, 0, 0, 0, time.UTC)
area := orb.Polygon{ // area roughly covering the state of Colorado
{{-109.05, 41.00}, {-109.045, 37.0}, {-102.05, 37.0}, {-102.05, 41.00}, {-109.05, 41.00}},
}
var datapoints []*examplesv1.Sentinel2Msi
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID},
+ dataset.ID,
&datapoints,
datasets.WithTemporalExtent(query.NewTimeInterval(startDate, endDate)),
datasets.WithSpatialExtentFilter(&query.SpatialFilter{
@@ -145,15 +156,16 @@ print(f"There are {data.sizes['time']} Sentinel-2A granules fully contained with
```
```go Go
startDate := time.Date(2025, 4, 2, 0, 0, 0, 0, time.UTC)
-endDate := time.Date(2025, 4, 3, 0, 0, 0, 0, time.UTC)
+endDate := time.Date(2025, 5, 2, 0, 0, 0, 0, time.UTC)
area := orb.Polygon{ // area roughly covering the state of Colorado
{{-109.05, 41.00}, {-109.045, 37.0}, {-102.05, 37.0}, {-102.05, 41.00}, {-109.05, 41.00}},
}
var datapoints []*examplesv1.Sentinel2Msi
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID},
+ dataset.ID,
&datapoints,
+ datasets.WithCollectionIDs(collection.ID),
datasets.WithTemporalExtent(query.NewTimeInterval(startDate, endDate)),
datasets.WithSpatialExtentFilter(&query.SpatialFilter{
Geometry: area,
@@ -220,7 +232,7 @@ area = Polygon( # area roughly covering the state of Colorado
((-109.05, 41.00), (-109.045, 37.0), (-102.05, 37.0), (-102.05, 41.00), (-109.05, 41.00)),
)
-data = collection.query(
+data = dataset.query(
temporal_extent=("2025-04-01", "2025-05-02"),
# spherical is the default, so can also be omitted entirely
spatial_extent={"geometry": area, "coordinate_system": "spherical"},
@@ -228,14 +240,14 @@ data = collection.query(
```
```go Go
startDate := time.Date(2025, 4, 2, 0, 0, 0, 0, time.UTC)
-endDate := time.Date(2025, 4, 3, 0, 0, 0, 0, time.UTC)
+endDate := time.Date(2025, 5, 2, 0, 0, 0, 0, time.UTC)
area := orb.Polygon{ // area roughly covering the state of Colorado
{{-109.05, 41.00}, {-109.045, 37.0}, {-102.05, 37.0}, {-102.05, 41.00}, {-109.05, 41.00}},
}
var datapoints []*examplesv1.Sentinel2Msi
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID},
+ dataset.ID,
&datapoints,
datasets.WithTemporalExtent(query.NewTimeInterval(startDate, endDate)),
datasets.WithSpatialExtentFilter(&query.SpatialFilter{
@@ -261,21 +273,21 @@ area = Polygon( # area roughly covering the state of Colorado
((-109.05, 41.00), (-109.045, 37.0), (-102.05, 37.0), (-102.05, 41.00), (-109.05, 41.00)),
)
-data = collection.query(
+data = dataset.query(
temporal_extent=("2025-04-01", "2025-05-02"),
spatial_extent={"geometry": area, "coordinate_system": "cartesian"},
)
```
```go Go
startDate := time.Date(2025, 4, 2, 0, 0, 0, 0, time.UTC)
-endDate := time.Date(2025, 4, 3, 0, 0, 0, 0, time.UTC)
+endDate := time.Date(2025, 5, 2, 0, 0, 0, 0, time.UTC)
area := orb.Polygon{ // area roughly covering the state of Colorado
{{-109.05, 41.00}, {-109.045, 37.0}, {-102.05, 37.0}, {-102.05, 41.00}, {-109.05, 41.00}},
}
var datapoints []*examplesv1.Sentinel2Msi
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID},
+ dataset.ID,
&datapoints,
datasets.WithTemporalExtent(query.NewTimeInterval(startDate, endDate)),
datasets.WithSpatialExtentFilter(&query.SpatialFilter{
diff --git a/datasets/query/filter-by-time.mdx b/datasets/query/filter-by-time.mdx
index fb8c7b4..23538c5 100644
--- a/datasets/query/filter-by-time.mdx
+++ b/datasets/query/filter-by-time.mdx
@@ -7,8 +7,6 @@ icon: timeline
Both [Timeseries](/datasets/types/timeseries) and [Spatio-temporal](/datasets/types/spatiotemporal) datasets support efficient time-based queries.
-To query data from a collection, use the [query](/api-reference/python/tilebox.datasets/Collection.query) method. It requires a temporal extent parameter to specify the time or time interval for querying. The behavior of the `query` method depends on the exact temporal extent parameter you provide.
-
## Time interval queries
To query data for a specific time interval, use a `tuple` in the form `(start, end)` as the `temporal_extent` parameter. Both `start` and `end` must be [TimeScalars](#time-scalar-queries), which can be `datetime` objects or strings in ISO 8601 format.
@@ -19,10 +17,13 @@ from tilebox.datasets import Client
client = Client()
sentinel2_msi = client.dataset("open_data.copernicus.sentinel2_msi")
-collection = sentinel2_msi.collection("S2A_S2MSI2A")
+data = sentinel2_msi.query(
+ collections=["S2A_S2MSI2A", "S2B_S2MSI2A", "S2C_S2MSI2A"],
+ temporal_extent=("2025-05-01", "2025-06-01"),
+ show_progress=True,
+)
-interval = ("2025-05-01", "2025-06-01")
-data = collection.query(temporal_extent=interval, show_progress=True)
+print(f"Queried {data.sizes['time']} data points.")
```
```go Go
startDate := time.Date(2025, time.May, 1, 0, 0, 0, 0, time.UTC)
@@ -42,10 +43,21 @@ if err != nil {
log.Fatalf("Failed to get collection: %v", err)
}
+collectionB, err := client.Collections.Get(ctx, dataset.ID, "S2B_S2MSI2A")
+if err != nil {
+ log.Fatalf("Failed to get collection: %v", err)
+}
+
+collectionC, err := client.Collections.Get(ctx, dataset.ID, "S2C_S2MSI2A")
+if err != nil {
+ log.Fatalf("Failed to get collection: %v", err)
+}
+
var datapoints []*v1.Sentinel2Msi
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID},
+ dataset.ID,
&datapoints,
+ datasets.WithCollectionIDs(collection.ID, collectionB.ID, collectionC.ID),
datasets.WithTemporalExtent(interval),
)
if err != nil {
@@ -56,38 +68,16 @@ log.Printf("Queried %d datapoints", len(datapoints))
```
-Output
-
-
-```plaintext Python
- Size: 39MB
-Dimensions: (time: 87121)
-Coordinates:
- * time (time) datetime64[ns] 697kB 2025-05-01T00:00:51.02...
-Data variables: (12/23)
- id (time)
The `show_progress` parameter is optional and can be used to display a [tqdm](https://tqdm.github.io/) progress bar while loading data.
+### Endpoint inclusivity
+
A time interval specified as a tuple is interpreted as a half-closed interval. This means the start time is inclusive, and the end time is exclusive.
For instance, using an end time of `2025-06-01` includes data points up to `2025-05-31 23:59:59.999`, but excludes those from `2025-06-01 00:00:00.000`.
This behavior mimics the Python `range` function and is useful for chaining time intervals.
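Why half-open intervals chain cleanly can be sketched with plain string comparisons, no Tilebox client needed (the `covered` helper below is purely illustrative, not part of the SDK; ISO 8601 strings of the same format compare correctly as plain strings):

```python
# Half-open monthly intervals in the same (start, end) tuple form the
# query API accepts: each end equals the next start, so chaining the
# chunks covers every timestamp exactly once, with no gaps or overlaps.
month_starts = ["2025-04-01", "2025-05-01", "2025-06-01", "2025-07-01"]
intervals = list(zip(month_starts, month_starts[1:]))

def covered(ts: str) -> int:
    """Count how many half-open intervals [start, end) contain ts."""
    return sum(start <= ts < end for start, end in intervals)

# A boundary timestamp belongs to exactly one chunk: it is excluded by
# the interval it ends and included by the interval it starts.
assert covered("2025-05-01") == 1
assert covered("2025-04-15") == 1
```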
@@ -114,8 +104,9 @@ for month := 4; month <= 6; month++ {
var partialDatapoints []*v1.Sentinel2Msi
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID},
+ dataset.ID,
&partialDatapoints,
+ datasets.WithCollectionIDs(collection.ID),
datasets.WithTemporalExtent(query.NewTimeInterval(startDate, endDate)),
)
if err != nil {
@@ -131,7 +122,7 @@ for month := 4; month <= 6; month++ {
The example above demonstrates how to split a large time interval into smaller chunks and load the data in separate requests. This is typically not necessary, since the datasets client automatically paginates large intervals.
-### Endpoint inclusivity
+### Manual endpoint inclusivity
For greater control over inclusivity of start and end times, you can explicitly specify a `TimeInterval`. This way you can specify both the `start` and `end` times, as well as their inclusivity. Here's an example of creating equivalent `TimeInterval` objects in two different ways.
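The equivalence behind converting a closed interval to its half-open form (as `ToHalfOpen` does in the Go example) can be sketched with plain `datetime` arithmetic; the two predicate functions below are illustrative helpers, not SDK methods:

```python
from datetime import datetime, timedelta

# At millisecond precision, the closed interval [start, end] contains
# exactly the same timestamps as the half-open interval [start, end + 1ms).
start = datetime(2025, 5, 1)
end = datetime(2025, 5, 31, 23, 59, 59, 999000)  # last millisecond of May

def in_closed(ts: datetime) -> bool:
    return start <= ts <= end

def in_half_open(ts: datetime) -> bool:
    return start <= ts < end + timedelta(milliseconds=1)

# Both predicates agree on every millisecond-precision timestamp,
# including both boundaries and points just outside the interval.
probes = [start, end, datetime(2025, 5, 15, 12), datetime(2025, 6, 1)]
assert all(in_closed(p) == in_half_open(p) for p in probes)
```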
@@ -182,8 +173,9 @@ log.Println(interval2.ToHalfOpen().String())
// Query data for a time interval
var datapoints []*v1.Sentinel2Msi
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID},
+ dataset.ID,
&datapoints,
+ datasets.WithCollectionIDs(collection.ID),
datasets.WithTemporalExtent(interval1),
)
```
@@ -205,19 +197,22 @@ You can query all datapoints linked to a specific timestamp by specifying a `Tim
Tilebox uses millisecond precision for time indexing datapoints. Querying a specific time scalar is therefore equivalent to a time interval query of length 1 millisecond.
-Here's how to query a data point at a specific millisecond from a [collection](/datasets/concepts/collections).
+Here's how to query a data point at a specific millisecond from a [dataset](/datasets/concepts/datasets) or [collection](/datasets/concepts/collections).
```python Python
-data = collection.query(temporal_extent="2025-06-15T02:31:41.024")
-print(data)
+data = sentinel2_msi.query(temporal_extent="2025-06-15T02:31:41.024")
+print(f"Queried {data.sizes['time']} data points.")
+first_timestamp = data.time[0].dt.strftime("%Y-%m-%dT%H:%M:%S.%f").item()
+print("First datapoint time:", first_timestamp)
```
```go Go
temporalExtent := query.NewPointInTime(time.Date(2025, time.June, 15, 2, 31, 41, 024000000, time.UTC))
var datapoints []*v1.Sentinel2Msi
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID}, &datapoints,
+ dataset.ID,
+ &datapoints,
datasets.WithTemporalExtent(temporalExtent),
)
@@ -226,34 +221,10 @@ log.Printf("First datapoint time: %s", datapoints[0].GetTime().AsTime())
```
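The "time scalar equals a one-millisecond interval" behavior can be sketched with stdlib `datetime` arithmetic alone (the `matches` helper is illustrative, not part of the client):

```python
from datetime import datetime, timedelta

# A time-scalar query at millisecond precision behaves like a half-open
# interval of exactly one millisecond starting at that timestamp.
scalar = datetime(2025, 6, 15, 2, 31, 41, 24000)  # 2025-06-15T02:31:41.024
start, end = scalar, scalar + timedelta(milliseconds=1)

def matches(ts: datetime) -> bool:
    return start <= ts < end

print(matches(datetime(2025, 6, 15, 2, 31, 41, 24999)))  # same millisecond -> True
print(matches(datetime(2025, 6, 15, 2, 31, 41, 25000)))  # next millisecond -> False
```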
-Output
-
-
-```plaintext Python
- Size: 158kB
-Dimensions: (time: 357)
-Coordinates:
- * time (time) datetime64[ns] 3kB 2025-06-15T02:31:41.0240...
-Data variables: (12/23)
- id (time)
A collection may contain multiple data points for the same millisecond, so such a query may return more than one data point. To fetch only a single data point, [query the collection by id](#loading-a-data-point-by-id) instead.
@@ -289,7 +260,9 @@ log.Println(tokyoTime)
var datapoints []*v1.Sentinel2Msi
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID}, &datapoints,
+ dataset.ID,
+ &datapoints,
+ datasets.WithCollectionIDs(collection.ID),
datasets.WithTemporalExtent(tokyoTime),
)
if err != nil {
@@ -323,4 +296,3 @@ Queried 1 datapoints
First datapoint time: 2021-01-01 02:45:25.679 +0000 UTC
```
-
diff --git a/datasets/query/querying-data.mdx b/datasets/query/querying-data.mdx
index 470e2e3..424dcd0 100644
--- a/datasets/query/querying-data.mdx
+++ b/datasets/query/querying-data.mdx
@@ -1,61 +1,17 @@
---
title: Querying data
sidebarTitle: Querying data
-description: Access and filter data stored in your dataset collections using time-based and spatial queries, with built-in support for pagination and progress tracking.
+description: Access and filter data stored in your datasets using time-based and spatial queries, with built-in support for pagination and progress tracking.
icon: magnifying-glass
---
Tilebox offers a powerful and flexible querying API to access and filter data from your datasets. When querying, you can
[filter by time](/datasets/query/filter-by-time) and for [Spatio-temporal datasets](/datasets/types/spatiotemporal) optionally also [filter by a location in the form of a geometry](/datasets/query/filter-by-location).
-## Selecting a collection
-
-Querying is always done on a [collection](/datasets/concepts/collections) level, so to get started first select a collection to query.
-
-
-```python Python
-from tilebox.datasets import Client
-
-client = Client()
-sentinel2_msi = client.dataset("open_data.copernicus.sentinel2_msi")
-collection = sentinel2_msi.collection("S2A_S2MSI2A")
-```
-```go Go
-package main
-
-import (
- "context"
- "log"
-
- "github.com/tilebox/tilebox-go/datasets/v1"
-)
-
-func main() {
- ctx := context.Background()
- client := datasets.NewClient()
-
- dataset, err := client.Datasets.Get(ctx, "open_data.copernicus.sentinel2_msi")
- if err != nil {
- log.Fatalf("Failed to get dataset: %v", err)
- }
-
- collection, err := client.Collections.Get(ctx, dataset.ID, "S2A_S2MSI2A")
- if err != nil {
- log.Fatalf("Failed to get collection: %v", err)
- }
-}
-```
-
-
-
- Querying multiple dataset collections at once is a feature already on our roadmap. If you need this functionality, please [get in touch](mailto:support@tilebox.com) so we can let you know as soon as it is available.
-
-
## Running a query
-To query data points from a dataset collection, use the `query` method which is available for both [python](/api-reference/python/tilebox.datasets/Collection.query) and [go](/api-reference/go/datasets/Datapoints.Query).
-
-Below is a simple example of querying all Sentinel-2 `S2A_S2MSI2A` data for April 2025 over the state of Colorado.
+You can query data from a single collection, from a selected set of collections, or from all collections of a dataset at once.
+Below is a simple example showcasing these options by querying Sentinel-2 data for April 2025 over the state of Colorado.
```python Python
@@ -66,30 +22,87 @@ area = Polygon( # area roughly covering the state of Colorado
((-109.05, 41.00), (-109.045, 37.0), (-102.05, 37.0), (-102.05, 41.00), (-109.05, 41.00)),
)
+client = Client()
+sentinel2_msi = client.dataset("open_data.copernicus.sentinel2_msi")
+
+# query data from a specific collection
collection = sentinel2_msi.collection("S2A_S2MSI2A")
data = collection.query(
temporal_extent=("2025-04-01", "2025-05-01"),
spatial_extent=area,
show_progress=True,
)
+
+# query data from a selected set of collections
+data = sentinel2_msi.query(
+ collections=["S2A_S2MSI2A", "S2B_S2MSI2A", "S2C_S2MSI2A"],
+ temporal_extent=("2025-04-01", "2025-05-01"),
+ spatial_extent=area,
+ show_progress=True,
+)
+
+# query data from all collections in the dataset
+data = sentinel2_msi.query(
+ # omit the collections argument to query all collections
+ temporal_extent=("2025-04-01", "2025-05-01"),
+ spatial_extent=area,
+ show_progress=True,
+)
```
```go Go
startDate := time.Date(2025, 4, 1, 0, 0, 0, 0, time.UTC)
-endDate := time.Date(2025, 5, 2, 0, 0, 0, 0, time.UTC)
+endDate := time.Date(2025, 5, 1, 0, 0, 0, 0, time.UTC)
timeInterval := query.NewTimeInterval(startDate, endDate)
area := orb.Polygon{ // area roughly covering the state of Colorado
{{-109.05, 41.00}, {-109.045, 37.0}, {-102.05, 37.0}, {-102.05, 41.00}, {-109.05, 41.00}},
}
-collection, err := client.Collections.Get(ctx, dataset.ID, "S2A_S2MSI2A")
+collectionA, err := client.Collections.Get(ctx, dataset.ID, "S2A_S2MSI2A")
if err != nil {
log.Fatalf("Failed to get collection: %v", err)
}
+collectionB, err := client.Collections.Get(ctx, dataset.ID, "S2B_S2MSI2A")
+if err != nil {
+ log.Fatalf("Failed to get collection: %v", err)
+}
+
+collectionC, err := client.Collections.Get(ctx, dataset.ID, "S2C_S2MSI2A")
+if err != nil {
+ log.Fatalf("Failed to get collection: %v", err)
+}
+
+// query data from a specific collection
+var fromCollection []*v1.Sentinel2Msi
+err = client.Datapoints.QueryInto(ctx,
+ dataset.ID,
+ &fromCollection,
+ datasets.WithCollectionIDs(collectionA.ID),
+ datasets.WithTemporalExtent(timeInterval),
+ datasets.WithSpatialExtent(area),
+)
+if err != nil {
+ log.Fatalf("Failed to query datapoints: %v", err)
+}
+
+// query data from a selected set of collections
var datapoints []*v1.Sentinel2Msi
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID},
+ dataset.ID,
&datapoints,
+ datasets.WithCollectionIDs(collectionA.ID, collectionB.ID, collectionC.ID),
+ datasets.WithTemporalExtent(timeInterval),
+ datasets.WithSpatialExtent(area),
+)
+if err != nil {
+ log.Fatalf("Failed to query datapoints: %v", err)
+}
+
+// query data from all collections in the dataset
+var datapointsFromAllCollections []*v1.Sentinel2Msi
+err = client.Datapoints.QueryInto(ctx,
+ dataset.ID,
+ &datapointsFromAllCollections,
datasets.WithTemporalExtent(timeInterval),
datasets.WithSpatialExtent(area),
)
@@ -146,6 +159,7 @@ Sometimes, only the ID or timestamp associated with a datapoint is required. In
querying by skipping the download of all dataset fields except `time`, `id` and `ingestion_time`, by setting the `skip_data` parameter to `True`.
For example, when checking how many datapoints exist in a given time interval, you can use `skip_data=True` to avoid loading the data fields.
+This works the same way on both collection-level and dataset-level queries.
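Conceptually, `skip_data` keeps only the index fields and drops everything else. The sketch below illustrates that filtering with a hypothetical helper and made-up field values; it is not the actual client internals or wire format:

```python
# Only the index fields of a datapoint survive a skip_data query;
# all other dataset fields are dropped.
INDEX_FIELDS = {"time", "id", "ingestion_time"}

def skip_data(datapoint: dict) -> dict:
    """Illustrative filter: keep only the index fields of a datapoint."""
    return {key: value for key, value in datapoint.items() if key in INDEX_FIELDS}

full = {  # hypothetical datapoint with example values
    "id": "example-id",
    "time": "2025-04-01T00:00:00Z",
    "ingestion_time": "2025-04-02T08:00:00Z",
    "granule_name": "example-granule",  # dropped
    "cloud_cover": 12.5,                # dropped
}
print(sorted(skip_data(full)))  # ['id', 'ingestion_time', 'time']
```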
```python Python
@@ -160,9 +174,10 @@ interval := query.NewTimeInterval(startDate, endDate)
var datapoints []*v1.Sentinel1Sar
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID},
+ dataset.ID,
&datapoints,
- datasets.WithTemporalExtent(temporalExtent),
+ datasets.WithCollectionIDs(collection.ID),
+ datasets.WithTemporalExtent(interval),
datasets.WithSkipData(),
)
if err != nil {
@@ -212,8 +227,9 @@ timeWithNoDatapoints := query.NewPointInTime(time.Date(1997, time.February, 6, 1
var datapoints []*v1.Sentinel1Sar
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID},
+ dataset.ID,
&datapoints,
+ datasets.WithCollectionIDs(collection.ID),
datasets.WithTemporalExtent(timeWithNoDatapoints),
)
if err != nil {
@@ -236,4 +252,3 @@ No data points found
No data points found
```
-
diff --git a/datasets/types/spatiotemporal.mdx b/datasets/types/spatiotemporal.mdx
index cdda36d..5236509 100644
--- a/datasets/types/spatiotemporal.mdx
+++ b/datasets/types/spatiotemporal.mdx
@@ -44,7 +44,7 @@ already outlined will be automatically added to the dataset schema.
## Spatio-temporal queries
Spatio-temporal datasets support efficient time-based and spatially filtered queries. To query a specific location in a given time interval,
-specify a time range and a geometry when [querying data points](/datasets/query/filter-by-location) from a collection.
+specify a time range and a geometry when [querying data points](/datasets/query/filter-by-location) from a dataset or a collection.
## Geometries
diff --git a/datasets/types/timeseries.mdx b/datasets/types/timeseries.mdx
index f0a9fd1..aa51861 100644
--- a/datasets/types/timeseries.mdx
+++ b/datasets/types/timeseries.mdx
@@ -39,4 +39,4 @@ already outlined will be automatically added to the dataset schema.
## Time-based queries
-Timeseries datasets support time-based queries. To query a specific time interval, specify a time range when [querying data](/datasets/query/filter-by-time) from a collection.
+Timeseries datasets support time-based queries. To query a specific time interval, specify a time range when [querying data](/datasets/query/filter-by-time) from a dataset or a collection.
diff --git a/docs.json b/docs.json
index caa62b9..63d56ca 100644
--- a/docs.json
+++ b/docs.json
@@ -172,6 +172,8 @@
"api-reference/python/tilebox.datasets/Dataset.create_collection",
"api-reference/python/tilebox.datasets/Dataset.delete_collection",
"api-reference/python/tilebox.datasets/Dataset.get_or_create_collection",
+ "api-reference/python/tilebox.datasets/Dataset.find",
+ "api-reference/python/tilebox.datasets/Dataset.query",
"api-reference/python/tilebox.datasets/Collection.delete",
"api-reference/python/tilebox.datasets/Collection.find",
"api-reference/python/tilebox.datasets/Collection.info",
diff --git a/quickstart.mdx b/quickstart.mdx
index 1010771..6711a63 100644
--- a/quickstart.mdx
+++ b/quickstart.mdx
@@ -167,7 +167,6 @@ If you prefer to work locally, follow these steps to get started.
"log/slog"
"time"
- "github.com/google/uuid"
"github.com/paulmach/orb"
"github.com/paulmach/orb/encoding/wkt"
"github.com/tilebox/tilebox-go/datasets/v1"
@@ -201,10 +200,12 @@ If you prefer to work locally, follow these steps to get started.
// You have to use tilebox-generate to generate the dataset type
var datapointsOverColorado []*v1.Sentinel2Msi
err = client.Datapoints.QueryInto(ctx,
- []uuid.UUID{collection.ID}, &datapointsOverColorado,
- datasets.WithTemporalExtent(march2025),
- datasets.WithSpatialExtent(colorado),
- )
+ dataset.ID,
+ &datapointsOverColorado,
+ datasets.WithCollectionIDs(collection.ID),
+ datasets.WithTemporalExtent(march2025),
+ datasets.WithSpatialExtent(colorado),
+ )
if err != nil {
log.Fatalf("Failed to query datapoints: %v", err)
}