New epochs class #14
base: main
Conversation
Refactored the Epochs class to use a dictionary of epochs indexed by epoch index, added properties for empty epochs, and improved overlap checking and warnings. Updated the to_numpy method to support flexible sampling rates and interpolation, and improved baseline correction error handling. Modified plot_epochs and its helpers to work with the new Epochs API, and updated the pupil_size_and_epoching tutorial to use the new sample data and API.
Expanded and clarified docstrings for epoching functions and the Epochs class, including detailed parameter and return value descriptions. Refactored Dataset to automatically handle native data without requiring a 'custom' flag, and improved section construction for native recordings. Updated tutorials to use consistent event naming and native data examples. Minor bugfixes and doc improvements in export and utility modules.
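The "flexible sampling rates and interpolation" behavior of `to_numpy` described above could be sketched as follows. A minimal illustration with hypothetical names and a simplified per-epoch layout (timestamp/value arrays rather than the actual `Stream` objects); the real method signature may differ:

```python
import numpy as np


def epochs_to_numpy(epochs, t_before, t_after, sampling_rate=100.0):
    """Resample every epoch onto a shared time grid via linear interpolation.

    ``epochs`` maps epoch index -> (timestamps, values), with timestamps in
    seconds relative to the epoch's t_ref. Returns the grid and an array of
    shape (n_epochs, n_times).
    """
    # Common grid so epochs with different native rates become comparable
    times = np.arange(-t_before, t_after, 1.0 / sampling_rate)
    out = np.stack(
        [np.interp(times, ts, vals) for _, (ts, vals) in sorted(epochs.items())]
    )
    return times, out
```

The key idea is that interpolation onto one grid lets epochs recorded at different (or irregular) sampling rates be stacked into a single rectangular array.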
@JGHartel Currently the baseline correction method is not updated yet. What do you think would be the best API/code for it? For interoperability with other methods such as
JGHartel left a comment:
Overall, I welcome the suggested changes. In particular, the rename to epochs_info and the unified access to epochs data as epochs.epochs, rather than the previous mixed access, is appreciated. Still, I would like to invite some discussion of the assumed datatypes, e.g. epochs.epochs and epochs.annotate being dictionaries. This would especially future-proof development if dataset-level operations are introduced.
    ``t_after``: Time after the reference time to end the epoch.
    ``description``: Description or label associated with the epoch.
    """
I will defer to you on this, but I personally think it is a bit over the top to use decorators for the docstring. Besides generating the API doc, sometimes one just wants to look at the functions/methods themselves, and I would not want to have to dig through a dictionary of docstring defaults for that.
That being said, the current ones seem to be simple enough to be redundant in code.
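For context, the docstring-decorator pattern under discussion usually looks something like the following. This is an illustrative sketch only (the names `docdict`, `fill_doc`, and `create_epochs` are hypothetical, not PyNeon's actual implementation):

```python
# Shared parameter descriptions, keyed by placeholder name (illustrative).
docdict = {
    "t_ref": "t_ref : Number\n        Reference time of the epoch.",
    "t_before": "t_before : Number\n        Time before the reference time to start the epoch.",
}


def fill_doc(func):
    """Expand ``%(key)s`` placeholders in a docstring from ``docdict``."""
    if func.__doc__:
        func.__doc__ = func.__doc__ % docdict
    return func


@fill_doc
def create_epochs(t_ref, t_before):
    """Create epochs around reference times.

    Parameters
    ----------
    %(t_ref)s
    %(t_before)s
    """
```

The trade-off raised above is visible here: the rendered API doc is complete, but reading the source requires jumping to `docdict` to see what the parameters mean.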
pyneon/epochs.py
Outdated
        # Create epochs
        self.epochs, self.data = _create_epochs(source, times_df)
    @cached_property
    def epochs(self) -> dict[int, Stream | Events | None]:
Flagging this in case we ever want to support multi-recording functionality. It would be good to have a unique ID for each event, so that one can easily concat these dicts between recordings. Alternatively, we could then construct an implicit multi-index (recording, epoch_number), which would require a custom concat function.
This seems a bit of an advanced use case and in principle should also apply to other classes (e.g. streams and events). My evaluation is that it would require another PR if we see value in supporting multi-recording concatenation.
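The implicit (recording, epoch_number) multi-index suggested above could be sketched with pandas directly, without a custom concat function, as long as the per-recording tables share columns. A minimal sketch (the function name `concat_epochs_info` and the level names are assumptions for illustration):

```python
import pandas as pd


def concat_epochs_info(info_by_recording: dict[str, pd.DataFrame]) -> pd.DataFrame:
    """Combine per-recording epoch tables under a (recording, epoch_index) MultiIndex.

    ``info_by_recording`` maps a recording ID to that recording's epochs_info
    DataFrame; passing the dict to ``pd.concat`` makes the keys the outer
    index level, so epoch numbers stay unique across recordings.
    """
    return pd.concat(info_by_recording, names=["recording", "epoch_index"])
```

Usage would then be e.g. `combined.loc[("rec_b", 3)]` to address epoch 3 of recording `rec_b`, which keeps dataset-level operations within ordinary pandas indexing.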
pyneon/epochs.py
Outdated
    ts = source.ts if isinstance(source, Stream) else source.start_ts
    source_index = source.data.index
    annot = {i: [] for i in source_index}  # Initialize empty lists for each index

    # Iterate over each event time to create epochs
    for i, row in times_df.iterrows():
        t_ref_i, t_before_i, t_after_i, description_i = row[
            ["t_ref", "t_before", "t_after", "description"]
        ].to_list()
    empty_epochs = []
    for i, row in epochs_info.iterrows():
        t_ref_i, t_before_i, t_after_i = row[["t_ref", "t_before", "t_after"]].to_list()

        start_time = t_ref_i - t_before_i
        end_time = t_ref_i + t_after_i
        mask = np.logical_and(ts >= start_time, ts <= end_time)

        if not mask.any():
            warnings.warn(f"No data found for epoch {i}.", RuntimeWarning)
            epochs.at[i, "epoch data"] = pd.DataFrame()
            continue
            empty_epochs.append(i)

        data.loc[mask, "epoch index"] = i
        data.loc[mask, "epoch description"] = str(description_i)
        data.loc[mask, "epoch time"] = (
            data.loc[mask].index.to_numpy() - t_ref_i
        ).astype("int64")
        # Annotate the data with the epoch index
        for idx in source_index[mask]:
            annot[idx].append(i)

        local_data = data.loc[mask].copy()
        local_data.drop(columns=["epoch index", "epoch description"], inplace=True)
        epochs.at[i, "data"] = local_data
    if empty_epochs:
        warn(f"No data found for epoch(s): {empty_epochs}.", RuntimeWarning)

    return epochs, data
    return annot
Why do you want to return a dict here? Would it not be better to have a df?
You are right that a df in this case seems intuitive (with an "epoch_index" column of object type)
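The df alternative agreed on above could look roughly like this. A minimal sketch (the helper name `annot_to_frame` and the dict layout are assumptions based on the diff, where `annot` maps each source timestamp to a list of epoch indices):

```python
import pandas as pd


def annot_to_frame(annot: dict) -> pd.DataFrame:
    """Turn the timestamp -> [epoch indices] dict into a DataFrame.

    The "epoch_index" column is object-dtype because one sample can
    belong to zero, one, or several (overlapping) epochs.
    """
    series = pd.Series(list(annot.values()), index=list(annot.keys()), dtype=object)
    return pd.DataFrame({"epoch_index": series})
```

This keeps the source timestamps as the index, so the annotation frame can be joined back onto the stream data with ordinary pandas alignment.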
    def detect_events_from_derivative(
        self,
        column: str,
        threshold: Number,
        min_duration: Optional[Number] = None,
    ) -> "Events":
        """
        Detect events based on velocity thresholding of a specified data column.

        Parameters
        ----------
        column : str
            The data column on which to compute velocity for event detection.
        threshold : Number
            Threshold above which an event is detected.
            The units depend on the units of the specified column
            and the timestamps (e.g., px/s for gaze position in pixels).
        min_duration : float
            Minimum duration (in seconds) for an event to be valid.

        Returns
        -------
        Events
            An Events object containing the detected events with
            ``start timestamp [ns]`` and ``end timestamp [ns]`` columns.
        """
placeholder?
Yeah, this is an idea I have: to detect events like saccades or blinks from the derivative of stream data (gaze, pupil diameter).
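A body for the placeholder method could be sketched as follows. This is a hypothetical implementation assuming nanosecond timestamps and a numeric value column, not the eventual PyNeon code (the function name and event layout are assumptions):

```python
import numpy as np


def detect_from_derivative(ts_ns, values, threshold, min_duration=None):
    """Detect events where |d(values)/dt| exceeds ``threshold``.

    Returns (start, end) timestamp pairs in nanoseconds; events shorter
    than ``min_duration`` seconds are discarded.
    """
    t = ts_ns * 1e-9                      # nanoseconds -> seconds
    vel = np.gradient(values, t)          # per-sample first derivative
    above = np.abs(vel) > threshold

    # Pad with False so every run of True samples has a rising and a
    # falling edge; edges come in (start, end) pairs.
    padded = np.concatenate([[False], above, [False]])
    edges = np.flatnonzero(np.diff(padded.astype(int)))
    starts, ends = edges[::2], edges[1::2] - 1

    events = []
    for s, e in zip(starts, ends):
        if min_duration is None or (t[e] - t[s]) >= min_duration:
            events.append((ts_ns[s], ts_ns[e]))
    return events
```

In the class itself this would presumably operate on `self.data[column]` and wrap the pairs into an `Events` object with the `start timestamp [ns]` / `end timestamp [ns]` columns named in the docstring.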
Developers and users have found the old `Epochs` class quite unintuitive and operationally hard to use (e.g., nested DataFrame). This PR aims to reorganize the class.

Current features:
- `event_times` is renamed to `epochs_info` for easier comprehension
- An `Epochs` instance will not compute the epochs or annotated data automatically.
- `epochs` is now a cached property in the form of a dictionary. Keys are indices of `epochs_info` and values are `Stream`/`Events` instances.
- `annotate_epochs`.
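Based on the features listed above, typical usage of the reworked class might look like this. A hedged sketch only: the `epochs_info` column names follow the diff in this PR, while the constructor call and attribute access are assumptions about the final API:

```python
import pandas as pd

# Epoch definitions: one row per epoch, matching the columns used in the
# _create_epochs diff above (timestamps assumed to be in nanoseconds).
epochs_info = pd.DataFrame(
    {
        "t_ref": [1_000_000_000, 5_000_000_000],   # reference timestamps
        "t_before": [500_000_000, 500_000_000],    # window before t_ref
        "t_after": [1_000_000_000, 1_000_000_000], # window after t_ref
        "description": ["stim_on", "stim_off"],
    }
)

# Hypothetical usage (nothing is computed eagerly at construction):
# epochs = Epochs(stream, epochs_info)
# epochs.epochs[0]     # cached dict: epoch index -> Stream/Events instance
# stream.annotate_epochs(epochs_info)
```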