pyhazards package

Module contents

class pyhazards.CNNPatchEncoder(in_channels=3, hidden_dim=64)[source]

Bases: Module

Lightweight CNN encoder for raster patches.

forward(x)[source]
class pyhazards.ClassificationHead(in_dim, num_classes)[source]

Bases: Module

Simple classification head.

forward(x)[source]
class pyhazards.ClassificationMetrics[source]

Bases: MetricBase

compute()[source]
Return type:

Dict[str, float]

reset()[source]
Return type:

None

update(preds, targets)[source]
Return type:

None

class pyhazards.DataBundle(splits, feature_spec, label_spec, metadata=<factory>)[source]

Bases: object

Bundle of train/val/test splits plus metadata. Keeps feature/label specs to make model construction easy.

feature_spec: FeatureSpec
get_split(name)[source]
Return type:

DataSplit

label_spec: LabelSpec
metadata: Dict[str, Any]
splits: Dict[str, DataSplit]
class pyhazards.DataSplit(inputs, targets, metadata=<factory>)[source]

Bases: object

Container for a single split.

inputs: Any
metadata: Dict[str, Any]
targets: Any
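The two containers above are plain dataclasses. The following is a minimal pure-Python sketch of their shape, not the actual pyhazards definitions; the `feature_spec`/`label_spec` values are stubbed out here for brevity:

```python
from dataclasses import dataclass, field
from typing import Any, Dict

# Toy stand-ins mirroring the documented fields of pyhazards.DataSplit
# and pyhazards.DataBundle.
@dataclass
class DataSplit:
    inputs: Any
    targets: Any
    metadata: Dict[str, Any] = field(default_factory=dict)

@dataclass
class DataBundle:
    splits: Dict[str, DataSplit]
    feature_spec: Any
    label_spec: Any
    metadata: Dict[str, Any] = field(default_factory=dict)

    def get_split(self, name: str) -> DataSplit:
        # Mirrors DataBundle.get_split: look up a split by name.
        try:
            return self.splits[name]
        except KeyError:
            raise KeyError(f"unknown split {name!r}; have {sorted(self.splits)}")

bundle = DataBundle(
    splits={"train": DataSplit(inputs=[[0.1, 0.2]], targets=[1])},
    feature_spec=None,
    label_spec=None,
)
train = bundle.get_split("train")
```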
class pyhazards.Dataset(cache_dir=None)[source]

Bases: object

Base class for hazard datasets. Subclasses should load data and return a DataBundle with splits ready for training.

_load()[source]
Return type:

DataBundle

load(split=None, transforms=None)[source]

Return a DataBundle, optionally restricted to the named split when split is provided.

Return type:

DataBundle

name: str = 'base'
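The intended extension point is _load(): subclasses produce the data, and the public load() wraps it. A toy sketch of that contract, with the bundle modeled as a plain dict of splits instead of a DataBundle (and the transforms argument omitted):

```python
from typing import Any, Dict, Optional

# Sketch of the Dataset base-class pattern: subclasses implement _load(),
# and load() optionally narrows the result to one split.
class Dataset:
    name = "base"

    def __init__(self, cache_dir: Optional[str] = None):
        self.cache_dir = cache_dir

    def _load(self) -> Dict[str, Any]:
        raise NotImplementedError

    def load(self, split: Optional[str] = None) -> Dict[str, Any]:
        bundle = self._load()
        if split is not None:
            # Restrict to the requested split.
            return {split: bundle[split]}
        return bundle

class ToyHazardDataset(Dataset):
    name = "toy_hazard"

    def _load(self):
        return {"train": [1, 2, 3], "test": [4]}

ds = ToyHazardDataset()
full = ds.load()
test_only = ds.load(split="test")
```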
class pyhazards.FeatureSpec(input_dim=None, channels=None, description=None, extra=<factory>)[source]

Bases: object

Describes input features (shapes, dtypes, normalization).

channels: Optional[int] = None
description: Optional[str] = None
extra: Dict[str, Any]
input_dim: Optional[int] = None
class pyhazards.LabelSpec(num_targets=None, task_type='regression', description=None, extra=<factory>)[source]

Bases: object

Describes labels/targets for downstream tasks.

description: Optional[str] = None
extra: Dict[str, Any]
num_targets: Optional[int] = None
task_type: str = 'regression'
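These two specs exist so that model construction can be driven by the data (the DataBundle docstring: "Keeps feature/label specs to make model construction easy"). A hypothetical sketch of that dispatch, using local dataclass copies of the specs; plan_model and its rules (input_dim set implies tabular, channels set implies raster) are illustrative assumptions, not pyhazards API:

```python
from dataclasses import dataclass, field
from typing import Any, Dict, Optional

@dataclass
class FeatureSpec:
    input_dim: Optional[int] = None
    channels: Optional[int] = None
    description: Optional[str] = None
    extra: Dict[str, Any] = field(default_factory=dict)

@dataclass
class LabelSpec:
    num_targets: Optional[int] = None
    task_type: str = "regression"
    description: Optional[str] = None
    extra: Dict[str, Any] = field(default_factory=dict)

def plan_model(features: FeatureSpec, labels: LabelSpec) -> str:
    # Hypothetical dispatch: tabular features (input_dim set) -> MLPBackbone,
    # raster features (channels set) -> CNNPatchEncoder; head follows task_type.
    backbone = "MLPBackbone" if features.input_dim is not None else "CNNPatchEncoder"
    head = {
        "classification": "ClassificationHead",
        "regression": "RegressionHead",
        "segmentation": "SegmentationHead",
    }[labels.task_type]
    return f"{backbone} -> {head}"

plan = plan_model(FeatureSpec(input_dim=12), LabelSpec(num_targets=1))
```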
class pyhazards.MLPBackbone(input_dim, hidden_dim=256, depth=2)[source]

Bases: Module

Simple MLP for tabular features.

forward(x)[source]
class pyhazards.MetricBase[source]

Bases: ABC

Abstract base class for metrics: accumulate batches with update(), read results with compute(), and clear state with reset().
abstract compute()[source]
Return type:

Dict[str, float]

abstract reset()[source]
Return type:

None

abstract update(preds, targets)[source]
Return type:

None
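A custom metric only needs to implement the three abstract methods. A toy streaming accuracy over plain Python lists, written against a local stand-in for the MetricBase contract (the real base lives in pyhazards):

```python
from abc import ABC, abstractmethod
from typing import Dict, List

# Stand-in with the same abstract contract as pyhazards.MetricBase.
class MetricBase(ABC):
    @abstractmethod
    def update(self, preds, targets) -> None: ...
    @abstractmethod
    def compute(self) -> Dict[str, float]: ...
    @abstractmethod
    def reset(self) -> None: ...

class Accuracy(MetricBase):
    """Toy streaming accuracy: counts matches across update() calls."""

    def __init__(self):
        self.correct = 0
        self.total = 0

    def update(self, preds: List[int], targets: List[int]) -> None:
        self.correct += sum(p == t for p, t in zip(preds, targets))
        self.total += len(targets)

    def compute(self) -> Dict[str, float]:
        return {"accuracy": self.correct / max(self.total, 1)}

    def reset(self) -> None:
        self.correct = 0
        self.total = 0

m = Accuracy()
m.update([1, 0, 1], [1, 1, 1])
result = m.compute()
```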

class pyhazards.RegressionHead(in_dim, out_dim=1)[source]

Bases: Module

Regression head for scalar or multi-target outputs.

forward(x)[source]
class pyhazards.RegressionMetrics[source]

Bases: MetricBase

compute()[source]
Return type:

Dict[str, float]

reset()[source]
Return type:

None

update(preds, targets)[source]
Return type:

None

class pyhazards.SegmentationHead(in_channels, num_classes)[source]

Bases: Module

Segmentation head for raster masks.

forward(x)[source]
class pyhazards.SegmentationMetrics(num_classes=None)[source]

Bases: MetricBase

compute()[source]
Return type:

Dict[str, float]

reset()[source]
Return type:

None

update(preds, targets)[source]
Return type:

None

class pyhazards.TemporalEncoder(input_dim, hidden_dim=128, num_layers=1)[source]

Bases: Module

GRU-based encoder for time-series signals.

forward(x)[source]
class pyhazards.Trainer(model, device=None, metrics=None, strategy='auto', mixed_precision=False)[source]

Bases: object

Lightweight training abstraction with a familiar API: fit -> evaluate -> predict.

_make_loader(inputs, targets, batch_size, num_workers, collate_fn, shuffle=True)[source]
Return type:

Iterable

_to_device(obj)[source]
Return type:

Any

evaluate(data, split='test', batch_size=64, num_workers=0, collate_fn=None)[source]
Return type:

Dict[str, float]

fit(data, train_split='train', val_split=None, max_epochs=1, optimizer=None, loss_fn=None, batch_size=32, num_workers=0, collate_fn=None)[source]

Minimal fit loop that works for tensor-based splits. Extend/replace with custom DataLoaders for complex data.

Return type:

None

predict(data, split='test', batch_size=64, num_workers=0, collate_fn=None)[source]
Return type:

List[Tensor]

save_checkpoint(path)[source]
Return type:

None
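The fit loop consumes an iterable of (inputs, targets) batches of the kind _make_loader yields. A pure-Python sketch of that batching and loop shape, with plain lists instead of tensors and a mismatch count standing in for the forward/loss/backward step (shuffling and workers omitted for determinism):

```python
from typing import Any, Iterable, List, Tuple

def make_loader(inputs: List[Any], targets: List[Any],
                batch_size: int) -> Iterable[Tuple[List[Any], List[Any]]]:
    # Yield (inputs, targets) pairs chunked into fixed-size batches,
    # the shape of iterable a fit-style loop consumes.
    for start in range(0, len(inputs), batch_size):
        yield inputs[start:start + batch_size], targets[start:start + batch_size]

# A fit-style loop walks the loader batch by batch.
losses = []
for xb, yb in make_loader([1, 2, 3, 4, 5], [0, 1, 0, 1, 0], batch_size=2):
    # Stand-in for forward/loss/backward: count label mismatches on parity.
    losses.append(sum(abs(x % 2 - y) for x, y in zip(xb, yb)))
```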

pyhazards.available_datasets()[source]
pyhazards.available_models()[source]
pyhazards.build_model(name, task, **kwargs)[source]

Build a model by name and task. Construction is delegated to the registered builder and its stored defaults, keeping a consistent interface across models.

Return type:

Module

pyhazards.load_dataset(name, **kwargs)[source]
Return type:

Dataset

pyhazards.register_dataset(name, builder)[source]
Return type:

None

pyhazards.register_model(name, builder, defaults=None)[source]
Return type:

None
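The register/build functions above follow a common registry pattern: register stores a builder plus default kwargs; build merges caller kwargs over the defaults and invokes the builder. A minimal sketch of that pattern (local names only; the real build_model also takes a task argument, which this sketch drops):

```python
from typing import Any, Callable, Dict, List, Optional

# Module-level registry mapping model names to builders and defaults.
_MODELS: Dict[str, Dict[str, Any]] = {}

def register_model(name: str, builder: Callable[..., Any],
                   defaults: Optional[Dict[str, Any]] = None) -> None:
    _MODELS[name] = {"builder": builder, "defaults": dict(defaults or {})}

def available_models() -> List[str]:
    return sorted(_MODELS)

def build_model(name: str, **kwargs: Any) -> Any:
    entry = _MODELS[name]
    # Registry defaults are applied first, then overridden by the caller.
    merged = {**entry["defaults"], **kwargs}
    return entry["builder"](**merged)

register_model(
    "mlp",
    lambda hidden_dim=256, depth=2: ("mlp", hidden_dim, depth),
    defaults={"hidden_dim": 128},
)
model = build_model("mlp", depth=3)
```

The same pattern applies to register_dataset/load_dataset, with the builder returning a Dataset instead of a Module.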