Commit e7a03a1
ENH: Validate PET data objects' attributes at instantiation
Validate PET data objects' attributes at instantiation: ensure that the attributes are present and match the expected dimensionalities.

**PET class attributes**

Refactor the PET attributes so that the required (`frame_time` and `uptake`) and optional (`frame_duration`, `midframe`, `total_duration`) parameters are accepted by the constructor. Although the optional parameters can be computed from the required ones, excluding them from `__init__` (using the `init=False` `attrs` option) would mean that, when dumping a PET instance to an HDF5 file, further processing would be required to exclude those elements so the file can be read back, and they would need to be recomputed at every instantiation. They may also take user-provided values, so the constructor needs to accept them.

Although `uptake` can also be computed from the PET frame data, the rationale for requiring it mirrors that of the DWI class `bzero`: users can compute the `uptake` using their preferred strategy and provide it to the constructor. In the `from_nii` function, if a callable is provided, it is used to compute the value; otherwise, a default strategy is applied.

Validate and format attributes so that the computation of the relevant temporal and uptake attributes happens in a single place, i.e. when instantiating the `PET` object. This avoids potential inconsistencies. Shift the `frame_time` values to the time origin when formatting them.

Make `_compute_uptake_statistic` public so that users can call it.

**`from_nii` function**

Refactor the `from_nii` function to accept filenames instead of a mix of filenames (e.g. the PET image sequence and brain mask) and temporal and uptake attribute arrays. This honors the name of the function, increases consistency with the dMRI counterpart, and makes it possible to offer a uniform API. It allows the required and optional parameters to be read from the provided files so that they are available when instantiating the PET object.

Use the `get_data` utility function in `from_nii` to automatically handle the data type when loading the PET data.

**`PET.load` class method**

Remove the `PET.load` class method and rely on the `data.__init__.load` function:
- If an HDF5 filename is provided, it is assumed to host all necessary information, and the data module `load` function takes care of loading all data.
- If the provided arguments are NIfTI files plus other data files, the function calls `pet.PET.from_nii`.

Change the `kwargs` arguments so that the relevant keyword arguments now present in `from_nii` can be identified. Change accordingly the `PET.load(pet_file, json_file)` call in the PET notebook and the `test_pet_load` test function.

**Tests**

Refactor the PET data creation fixture in `conftest.py` to accept the required/optional arguments and to return the necessary data. Refactor the tests accordingly and increase consistency with the `dmri` data module testing helper functions. This reduces cognitive load and maintenance burden.

Add further object instantiation equality checks: verify that objects instantiated by reading NIfTI files equal objects instantiated directly. Check the PET dataset attributes systematically in round-trip tests by collecting all named attributes that need to be tested. Modify the PET model and integration tests accordingly. Modify test parameterization values so that they make sense (i.e. are consistent with the way they are computed from the `frame_time` attribute).

Take advantage of the patch set to make other opinionated choices:
- Prefer the global `setup_random_pet_data` fixture over the local `random_dataset` fixture: it allows controlling the parameters of the generated data and increases consistency with the practice adopted across the dMRI dataset tests. Remove the `random_dataset` fixture.
- Prefer `assert np.allclose` over `np.testing.assert_array_equal` for the sake of consistency.

**Dependencies**

Require `attrs>=24.1.0` so that `attrs.Converter` can be used. Documentation: https://www.attrs.org/en/25.4.0/api.html#converters
1 parent 0ed2364 commit e7a03a1

File tree: 11 files changed, +1403 −234 lines changed


docs/notebooks/pet_motion_estimation.ipynb

Lines changed: 2 additions & 2 deletions
```diff
@@ -10,7 +10,7 @@
 "from os import getenv\n",
 "from pathlib import Path\n",
 "\n",
-"from nifreeze.data.pet import PET\n",
+"from nifreeze.data.pet import from_nii\n",
 "\n",
 "# Install test data from gin.g-node.org:\n",
 "# $ datalad install -g https://gin.g-node.org/nipreps-data/tests-nifreeze.git\n",
@@ -29,7 +29,7 @@
 " DATA_PATH / \"pet_data\" / \"sub-02\" / \"ses-baseline\" / \"pet\" / \"sub-02_ses-baseline_pet.json\"\n",
 ")\n",
 "\n",
-"pet_dataset = PET.load(pet_file, json_file)"
+"pet_dataset = from_nii(pet_file, temporal_file=json_file)"
 ]
 },
 {
```

pyproject.toml

Lines changed: 1 addition & 1 deletion
```diff
@@ -20,7 +20,7 @@ classifiers = [
 license = "Apache-2.0"
 requires-python = ">=3.10"
 dependencies = [
-    "attrs>=20.1.0",
+    "attrs>=24.1.0",
     "dipy>=1.5.0",
     "joblib",
     "nipype>=1.5.1,<2.0",
```

src/nifreeze/data/__init__.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -76,7 +76,7 @@ def load(
         from nifreeze.data.dmri import from_nii as dmri_from_nii

         return dmri_from_nii(filename, brainmask_file=brainmask_file, **kwargs)
-    elif {"frame_time", "frame_duration"} & set(kwargs):
+    elif {"temporal_file"} & set(kwargs):
         from nifreeze.data.pet import from_nii as pet_from_nii

         return pet_from_nii(filename, brainmask_file=brainmask_file, **kwargs)
```
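A self-contained sketch of the dispatch this hunk implements: the `load` entry point routes NIfTI inputs to the PET reader when a `temporal_file` keyword argument is present. The reader functions below are stand-ins for the real `nifreeze` ones, and the dMRI trigger keyword (`gradients_file`) is a hypothetical placeholder, not taken from the diff:

```python
def dmri_from_nii(filename, brainmask_file=None, **kwargs):
    # Stand-in for nifreeze.data.dmri.from_nii
    return ("dmri", filename)


def pet_from_nii(filename, brainmask_file=None, **kwargs):
    # Stand-in for nifreeze.data.pet.from_nii
    return ("pet", filename)


def load(filename, brainmask_file=None, **kwargs):
    """Dispatch on keyword arguments to pick the modality-specific reader."""
    if {"gradients_file"} & set(kwargs):  # hypothetical dMRI trigger keyword
        return dmri_from_nii(filename, brainmask_file=brainmask_file, **kwargs)
    elif {"temporal_file"} & set(kwargs):
        return pet_from_nii(filename, brainmask_file=brainmask_file, **kwargs)
    raise ValueError("Cannot infer the data modality from the given arguments")


print(load("sub-02_pet.nii.gz", temporal_file="sub-02_pet.json"))
# ('pet', 'sub-02_pet.nii.gz')
```

Keying the dispatch on `temporal_file` (a filename) rather than on `frame_time`/`frame_duration` (arrays) is what lets `load` forward a uniform, file-based `kwargs` set to `from_nii`.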
