ride
Package Contents

Classes

Main: Complete main programme for the lifecycle of a machine learning project
Configs: Configs module for holding project configurations.
RideClassificationDataset: Base-class for Ride classification datasets.
RideDataset: Base-class for Ride datasets.
RideModule: Base-class for modules using the Ride ecosystem.
Finetunable: Adds finetune capabilities to the model
Hparamsearch: Hyperparameter search for a RideModule
Lifecycle: Adds train, val, and test lifecycle methods with cross_entropy loss
FlopsMetric: Computes Floating Point Operations (FLOPs) for the model and adds them as a metric
FlopsWeightedAccuracyMetric: Computes acc * (flops / target_gflops) ** (-0.07)
MeanAveragePrecisionMetric: Mean Average Precision (mAP) metric
AdamWOneCycleOptimizer: Optimizer mixin providing AdamW with a one-cycle learning-rate schedule
AdamWOptimizer: Optimizer mixin providing the AdamW optimizer
SgdOneCycleOptimizer: Optimizer mixin providing SGD with a one-cycle learning-rate schedule
SgdOptimizer: Optimizer mixin providing the SGD optimizer

Functions

getLogger(name[, log_once])
MetricSelector(mapping, default_config, **kwargs) → MetricMixin
TopKAccuracyMetric(*Ks) → MetricMixin

class ride.Main(Module: Type[ride.core.RideModule])[source]

Complete main programme for the lifecycle of a machine learning project

Usage:

Main(YourRideModule).argparse()

argparse(args: List[str] = None, run=True)
main(args: pytorch_lightning.utilities.parsing.AttributeDict)
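
For instance, a minimal command-line entry point might look like this (module and file names are placeholders):

# main.py
from ride import Main
from my_module import MyRideModule  # your RideModule subclass

if __name__ == "__main__":
    Main(MyRideModule).argparse()  # parse sys.argv and run the requested lifecycle

Command-line flags such as --train or --test then select which parts of the lifecycle to run.
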
class ride.Configs[source]

Bases: corider.Configs

Configs module for holding project configurations.

This is a wrapper of the Configs found as a stand-alone package in https://github.com/LukasHedegaard/co-rider

static collect(cls: RideModule) → Configs

Collect the configs from all class bases

Returns: Aggregated configurations
Return type: Configs

default_values()
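
In practice, project options are typically exposed through a static configs() method on a module. A sketch follows (the option name and values are illustrative; the add() call follows the co-rider API):

from ride import Configs, RideModule

class MyRideModule(RideModule):
    @staticmethod
    def configs() -> Configs:
        c = Configs()
        c.add(
            name="hidden_dim",
            type=int,
            default=128,
            strategy="choice",        # sampling strategy used during hyperparameter search
            choices=[128, 256, 512],
            description="Number of hidden units.",
        )
        return c
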
class ride.RideClassificationDataset(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]

Bases: RideDataset

Base-class for Ride classification datasets.

If no dataset is specified otherwise, this mixin is automatically added as a base of RideModule children.

User-specified datasets must inherit from this class and specify the following:

- self.input_shape: Union[int, Sequence[int], Sequence[Sequence[int]]]
- self.output_shape: Union[int, Sequence[int], Sequence[Sequence[int]]]
- self.classes: List[str]

and either the functions:

- train_dataloader: Callable[[Any], DataLoader]
- val_dataloader: Callable[[Any], DataLoader]
- test_dataloader: Callable[[Any], DataLoader]

or:

- self.datamodule, which has train_dataloader, val_dataloader, and test_dataloader attributes.

property num_classes: int
classes: List[str]
static configs() → Configs
validate_attributes()
metrics_epoch(preds: torch.Tensor, targets: torch.Tensor, prefix: str = None, *args, **kwargs)
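
A sketch of a user-specified classification dataset (the name, shapes, and loader bodies are placeholders):

from torch.utils.data import DataLoader
from ride import RideClassificationDataset

class MnistDataset(RideClassificationDataset):
    def __init__(self, hparams, *args, **kwargs):
        super().__init__(hparams, *args, **kwargs)
        self.input_shape = (1, 28, 28)              # C, H, W
        self.output_shape = 10
        self.classes = [str(i) for i in range(10)]

    def train_dataloader(self, *args, **kwargs) -> DataLoader:
        return DataLoader(...)                      # placeholder

    def val_dataloader(self, *args, **kwargs) -> DataLoader:
        return DataLoader(...)                      # placeholder

    def test_dataloader(self, *args, **kwargs) -> DataLoader:
        return DataLoader(...)                      # placeholder
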
class ride.RideDataset(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]

Bases: RideMixin

Base-class for Ride datasets.

If no dataset is specified otherwise, this mixin is automatically added as a base of RideModule children.

User-specified datasets must inherit from this class and specify the following:

- self.input_shape: Union[int, Sequence[int], Sequence[Sequence[int]]]
- self.output_shape: Union[int, Sequence[int], Sequence[Sequence[int]]]

and either the functions:

- train_dataloader: Callable[[Any], DataLoader]
- val_dataloader: Callable[[Any], DataLoader]
- test_dataloader: Callable[[Any], DataLoader]

or:

- self.datamodule, which has train_dataloader, val_dataloader, and test_dataloader attributes.

input_shape: DataShape
output_shape: DataShape
validate_attributes()
static configs() → Configs
train_dataloader(*args: Any, **kwargs: Any) → torch.utils.data.DataLoader

The train dataloader

val_dataloader(*args: Any, **kwargs: Any) → Union[torch.utils.data.DataLoader, List[torch.utils.data.DataLoader]]

The val dataloader

test_dataloader(*args: Any, **kwargs: Any) → Union[torch.utils.data.DataLoader, List[torch.utils.data.DataLoader]]

The test dataloader
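
Alternatively to defining the three dataloader methods, a datamodule can be assigned (a sketch; MyDataModule is a hypothetical pytorch_lightning.LightningDataModule):

from ride import RideDataset

class MyDataset(RideDataset):
    def __init__(self, hparams, *args, **kwargs):
        super().__init__(hparams, *args, **kwargs)
        self.input_shape = (3, 32, 32)
        self.output_shape = 10
        self.datamodule = MyDataModule()  # hypothetical; supplies train/val/test_dataloader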

class ride.RideModule[source]

Base-class for modules using the Ride ecosystem.

This module should be inherited as the highest-priority parent (first in sequence).

Example:

class MyModule(ride.RideModule, ride.SgdOneCycleOptimizer):
    def __init__(self, hparams):
        ...

It handles proper initialisation of RideMixin parents and adds automatic attribute validation.

If pytorch_lightning.LightningModule is omitted as lowest-priority parent, RideModule will automatically add it.

If training_step, validation_step, and test_step methods are not found, the ride.Lifecycle will be automatically mixed in by this module.

property hparams: pytorch_lightning.utilities.parsing.AttributeDict
classmethod __init_subclass__()
classmethod with_dataset(ds: RideDataset)
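
As an illustration, a minimal module might look like the following (layer sizes are placeholders). Because no training_step is defined, ride.Lifecycle is mixed in automatically, and self.input_shape / self.output_shape are supplied by the dataset mixin:

import torch
import ride

class MyRideModule(ride.RideModule):
    def __init__(self, hparams):
        hidden_dim = 128  # placeholder size
        in_features = int(torch.prod(torch.tensor(self.input_shape)))
        self.lin1 = torch.nn.Linear(in_features, hidden_dim)
        self.lin2 = torch.nn.Linear(hidden_dim, self.output_shape)

    def forward(self, x):
        x = x.view(x.size(0), -1)       # flatten
        x = torch.relu(self.lin1(x))
        return self.lin2(x)

A specific dataset can then be bound with MyRideModule.with_dataset(SomeRideDataset), where SomeRideDataset stands in for any RideDataset subclass.
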
ride.getLogger(name, log_once=False)
class ride.Finetunable(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]

Bases: ride.unfreeze.Unfreezable

Adds finetune capabilities to the model

This module is automatically added when RideModule is inherited

hparams: Ellipsis
static configs() → ride.core.Configs
validate_attributes()
map_loaded_weights(file, loaded_state_dict)
on_init_end(hparams, *args, **kwargs)
class ride.Hparamsearch(Module: Type[ride.core.RideModule])[source]
configs() → ride.core.Configs
__call__(args: pytorch_lightning.utilities.parsing.AttributeDict)
run(args: pytorch_lightning.utilities.parsing.AttributeDict)

Run hyperparameter search using the tune.schedulers.ASHAScheduler

Parameters: args (AttributeDict) – Arguments

Side-effects: Saves logs to TUNE_LOGS_PATH / args.id

static dump(hparams: dict, identifier: str, extention='yaml') → str

Dumps hparams to TUNE_LOGS_PATH / identifier / "best_hparams.json"

static load(path: Union[pathlib.Path, str], old_args=AttributeDict(), Cls: Type[ride.core.RideModule] = None, auto_scale_lr=False) → pytorch_lightning.utilities.parsing.AttributeDict

Loads hparams from path

Parameters:
  • path (Union[Path, str]) – Path to a json file containing hparams

  • old_args (Optional[AttributeDict]) – The AttributeDict to be updated with the new hparams

  • Cls (Optional[RideModule]) – A class whose hyperparameters can be used to select the relevant hparams to take

Returns: AttributeDict with updated hyperparameters
Return type: AttributeDict
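
A typical flow might look like this (schematic; args would normally come from Main's argument parser, and path_to_best_hparams is a placeholder for the dump location described above):

from ride import Hparamsearch

hparamsearch = Hparamsearch(MyRideModule)  # MyRideModule: your RideModule subclass
hparamsearch(args)  # __call__ dispatches to run(args); logs land in TUNE_LOGS_PATH / args.id

# Later, reload the best hyperparameters found by the search:
best = Hparamsearch.load(
    path_to_best_hparams,  # e.g. TUNE_LOGS_PATH / args.id / "best_hparams.json"
    old_args=args,
    Cls=MyRideModule,
)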

class ride.Lifecycle(hparams=None, *args, **kwargs)[source]

Bases: ride.metrics.MetricMixin

Adds train, val, and test lifecycle methods with cross_entropy loss

During its training_epoch_end(epoch) lifecycle method, it will call on_training_epoch_end for all superclasses of its child class.

hparams: Ellipsis
forward: Callable[[torch.Tensor], torch.Tensor]
_epoch: int
classmethod _metrics()
validate_attributes()
static configs() → ride.core.Configs
metrics_step(preds: torch.Tensor, targets: torch.Tensor, **kwargs) → ride.metrics.MetricDict
common_step(pred, target, prefix='train/', log=False)
common_epoch_end(step_outputs, prefix='train/', exclude_keys={'pred', 'target'})
preprocess_batch(batch)
training_step(batch, batch_idx=None)
training_epoch_end(step_outputs)
validation_step(batch, batch_idx=None)
validation_epoch_end(step_outputs)
test_step(batch, batch_idx=None)
test_epoch_end(step_outputs)
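
Schematically, each step method preprocesses the batch, forwards it, and delegates to common_step (a simplified sketch, not the exact implementation):

def training_step(self, batch, batch_idx=None):
    batch = self.preprocess_batch(batch)
    x, target = batch[0], batch[1]
    pred = self.forward(x)               # loss defaults to cross_entropy
    return self.common_step(pred, target, prefix="train/", log=True)
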
class ride.FlopsMetric(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]

Bases: MetricMixin

Computes Floating Point Operations (FLOPs) for the model and adds them as a metric

classmethod _metrics()
on_init_end(*args, **kwargs)
metrics_step(preds: torch.Tensor, targets: torch.Tensor, **kwargs) → MetricDict
class ride.FlopsWeightedAccuracyMetric(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]

Bases: FlopsMetric

Computes acc * (flops / target_gflops) ** (-0.07)

classmethod _metrics()
validate_attributes()
static configs() → ride.core.Configs
metrics_step(preds: torch.Tensor, targets: torch.Tensor, **kwargs) → MetricDict
class ride.MeanAveragePrecisionMetric(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]

Bases: MetricMixin

Mean Average Precision (mAP) metric

validate_attributes()
_compute_mean_average_precision(preds, targets)
classmethod _metrics()
metrics_step(preds: torch.Tensor, targets: torch.Tensor, *args, **kwargs) → MetricDict
metrics_epoch(preds: torch.Tensor, targets: torch.Tensor, *args, **kwargs) → MetricDict
ride.MetricSelector(mapping: Dict[str, Union[MetricMixin, Iterable[MetricMixin]]] = None, default_config: str = '', **kwargs: Union[MetricMixin, Iterable[MetricMixin]]) → MetricMixin[source]
ride.TopKAccuracyMetric(*Ks) → MetricMixin[source]
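
Metric mixins are composed directly into a module's base-class list. A sketch (the config keys and K values are illustrative):

from ride import (
    RideModule,
    TopKAccuracyMetric,
    MetricSelector,
    MeanAveragePrecisionMetric,
)

# Fixed choice: always report top-1 and top-3 accuracy
class MyModule(RideModule, TopKAccuracyMetric(1, 3)):
    ...

# Config-selectable: the metric set is chosen at runtime via a config value
class MyFlexibleModule(
    RideModule,
    MetricSelector(
        classification=TopKAccuracyMetric(1, 3),
        detection=MeanAveragePrecisionMetric,
        default_config="classification",
    ),
):
    ...
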
class ride.AdamWOneCycleOptimizer(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]

Bases: ride.core.OptimizerMixin

Optimizer mixin providing AdamW with a one-cycle learning-rate schedule

hparams: Ellipsis
parameters: Callable
train_dataloader: Callable
validate_attributes()
static configs() → ride.core.Configs
configure_optimizers()
class ride.AdamWOptimizer(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]

Bases: ride.core.OptimizerMixin

Optimizer mixin providing the AdamW optimizer

hparams: Ellipsis
parameters: Callable
validate_attributes()
static configs() → ride.core.Configs
configure_optimizers()
class ride.SgdOneCycleOptimizer(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]

Bases: ride.core.OptimizerMixin

Optimizer mixin providing SGD with a one-cycle learning-rate schedule

hparams: Ellipsis
parameters: Callable
train_dataloader: Callable
validate_attributes()
static configs() → ride.core.Configs
configure_optimizers()
class ride.SgdOptimizer(hparams: pytorch_lightning.utilities.parsing.AttributeDict, *args, **kwargs)[source]

Bases: ride.core.OptimizerMixin

Optimizer mixin providing the SGD optimizer

hparams: Ellipsis
parameters: Callable
validate_attributes()
static configs() → ride.core.Configs
configure_optimizers()
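
Selecting an optimizer is then a matter of inheritance, mirroring the RideModule example above (a minimal sketch):

from ride import RideModule, SgdOneCycleOptimizer

class MyModule(RideModule, SgdOneCycleOptimizer):
    def __init__(self, hparams):
        ...  # model definition; configure_optimizers() is supplied by the mixin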