diff --git a/docs/source/_static/css/custom.css b/docs/source/_static/css/custom.css new file mode 100644 index 00000000..680bd1af --- /dev/null +++ b/docs/source/_static/css/custom.css @@ -0,0 +1,74 @@ +/* arg formatting by line, taken from https://github.com/sphinx-doc/sphinx/issues/1514#issuecomment-742703082 */ + +/* For general themes */ +div.body, .wy-nav-content { + max-width: 1000px; /* Set the content width */ + margin: 0; /* Remove auto-centering */ + padding-left: 30px; /* Optional: Adjust padding */ +} + +/* For Read the Docs theme specifically */ +.wy-nav-content { + margin: 0; /* Remove centering (auto) */ + padding-left: 30px; /* Align content to the left */ +} + + +/*Newlines (\a) and spaces (\20) before each parameter*/ +dl.class em:not([class])::before { + content: "\a\20\20\20\20\20\20\20\20\20\20\20\20\20\20\20\20"; + white-space: pre; +} + +/*Newline after the last parameter (so the closing bracket is on a new line)*/ +dl.class em:not([class]):last-of-type::after { + content: "\a"; + white-space: pre; +} + +/*To have blue background of width of the block (instead of width of content)*/ +dl.class > dt:first-of-type { + display: block !important; +} + +.rst-content code.literal, .rst-content tt.literal { + color: #2b417e; /* dark blue for inline code literals */ +} +.rst-content div[class^=highlight], .rst-content pre.literal-block { + margin: 1px 0 14px; +} + +.rst-content .section ol li>*, .rst-content .section ul li>*, .rst-content .toctree-wrapper ol li>*, .rst-content .toctree-wrapper ul li>*, .rst-content section ol li>*, .rst-content section ul li>* { + margin-top: 0px; +} + +/* Ensure there is 10px spacing between nested list items at different levels*/ +.rst-content li > dl > dt { + margin-bottom: 10px; +} +.rst-content dd > ul > li { + margin-bottom: 10px; +} +.rst-content .section ol.simple li>*, .rst-content .section ol.simple li ol, .rst-content .section ol.simple li ul, .rst-content .section ul.simple li>*, .rst-content .section ul.simple 
li ol, .rst-content .section ul.simple li ul, .rst-content .toctree-wrapper ol.simple li>*, .rst-content .toctree-wrapper ol.simple li ol, .rst-content .toctree-wrapper ol.simple li ul, .rst-content .toctree-wrapper ul.simple li>*, .rst-content .toctree-wrapper ul.simple li ol, .rst-content .toctree-wrapper ul.simple li ul, .rst-content section ol.simple li>*, .rst-content section ol.simple li ol, .rst-content section ol.simple li ul, .rst-content section ul.simple li>*, .rst-content section ul.simple li ol, .rst-content section ul.simple li ul{ + margin-bottom: 10px; +} + +/* Improve padding and margins for function docstring section titles */ +.rst-content dd > dl > dt { + padding-left: 5px; + margin-top: 20px; + margin-bottom: 10px; +} +html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt, html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{ + margin-top: 28px; + margin-bottom: 10px; +} + +button.copybtn { + height:25px; + width:25px; + opacity: 0.5; + padding: 0; + border: none; + background: none; +} diff --git a/docs/source/_static/html/tutorials/UnitTimes.png b/docs/source/_static/html/tutorials/UnitTimes.png new file mode 100644 index 00000000..d31f8b55 Binary files /dev/null and b/docs/source/_static/html/tutorials/UnitTimes.png differ diff --git a/docs/source/_static/html/tutorials/basicUsage.html b/docs/source/_static/html/tutorials/basicUsage.html new file mode 100644 index 00000000..c9cab5d2 --- /dev/null +++ b/docs/source/_static/html/tutorials/basicUsage.html @@ -0,0 +1,431 @@ + +Using NWB Data

Using NWB Data

last updated: February 9, 2021
In this tutorial, we demonstrate the reading and usage of the NWB file produced in the File Conversion Tutorial. The output is a near-reproduction of Figure 1e from the Li et al. publication, with raster and peristimulus time histogram (PSTH) plots for neural recordings from anterior lateral motor cortex (ALM). This figure illustrates the main finding of the publication: the robustness of motor planning behavior and neural dynamics following short unilateral network silencing via optogenetic inhibition.

Reading NWB Files

NWB files can be read in using the nwbRead() function. This function returns an NwbFile object, the in-memory representation of the NWB file structure.
nwb = nwbRead('out\ANM255201_20141124.nwb');

Constrained Sets

Analyzed data in NWB is placed under the analysis property, which is a Constrained Set. A Constrained Set holds an arbitrary number of key-value pairs, similar to a containers.Map in MATLAB or a dictionary in Python. However, Constrained Sets can also validate their contents, much as a typed object does.
You can get and set values in a Constrained Set using its .get() and .set() methods, and retrieve all keys with the keys() method, just like a containers.Map.
unit_names = keys(nwb.analysis);
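As a sketch (the key names and stored types depend on the particular file), an entry can then be retrieved by key, and set() works symmetrically:

```matlab
% Retrieve one entry from the analysis Constrained Set by key.
first_key = unit_names{1};
first_entry = nwb.analysis.get(first_key);
% set() validates the value against the set's constraints before storing it:
% nwb.analysis.set(first_key, first_entry);
```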

Dynamic Tables

nwb.intervals_trials returns a special type of table called a Dynamic Table. Dynamic Tables inherit from the NWB type types.hdmf_common.DynamicTable and provide a table-like interface in NWB. For example, the trials table exposes the special column start_time. Dynamic Tables also allow adding your own columns using the vectordata property, which is a Constrained Set. Each column is represented by either a types.hdmf_common.VectorData or a types.hdmf_common.VectorIndex type.
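As a brief sketch (column contents depend on the file; 'my_column' below is a hypothetical name), required columns such as start_time are typed properties, while user-defined columns live in the vectordata set:

```matlab
trials = nwb.intervals_trials;
% start_time is a required TimeIntervals column, so it is a typed property:
trial_start_times = trials.start_time.data.load();
% user-defined columns are accessed through the vectordata Constrained Set:
% my_column = trials.vectordata.get('my_column'); % hypothetical column name
```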

Data Stubs

The data property of the column id in nwb.units is a types.untyped.DataStub. This object is a representation of a dataset that is not loaded in memory, and is what allows MatNWB to lazily load its file data. To load the data into memory, use the .load() method which extracts all data from the NWB file. Alternatively, you can index into the DataStub directly using conventional MATLAB syntax.
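For illustration (a sketch using the id column of this file's units table), both access styles look like this:

```matlab
id_stub = nwb.units.id.data;   % types.untyped.DataStub; nothing loaded yet
all_ids = id_stub.load();      % read the full dataset into memory
first_ids = id_stub(1:5);      % or slice the DataStub with MATLAB indexing
```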

Jagged Arrays in Dynamic Tables

With the addition of addRow and getRow to Dynamic Tables, you rarely need to handle jagged arrays directly; a full understanding is only required for specific data format concerns or low-level NWB tool development. The paragraph below is retained in its entirety from the original tutorial as purely informational.
All data in a Dynamic Table must be aligned by row and column, but not all data fits into this paradigm neatly. To represent a variable amount of data per row, NWB uses a concept called Jagged Arrays. These arrays consist of two column types: the familiar types.hdmf_common.VectorData, and the new types.hdmf_common.VectorIndex. A Vector Index holds no data itself; instead, it references a Vector Data column and stores a vector of indices aligned to the Dynamic Table's rows. Each index marks the last element in the Vector Data column belonging to that row. For example, an index of three in the first row of the Vector Index column points to the first three values in the referenced Vector Data column. If the next index were five, it would indicate the fourth and fifth elements.
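The index arithmetic described above can be sketched directly (assuming the units table has spike_times and spike_times_index columns, as in this data set):

```matlab
spike_times = nwb.units.spike_times.data.load();
spike_index = nwb.units.spike_times_index.data.load();
i_row = 1; % first unit in the table
if i_row == 1
    start_idx = 1;
else
    start_idx = spike_index(i_row - 1) + 1; % one past the previous row's end
end
stop_idx = spike_index(i_row);              % this row's end boundary
unit_spikes = spike_times(start_idx:stop_idx); % spikes for this unit only
```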
Here, the jagged arrays represent the trials and spike times associated with each unit. A convenient way to work with these in MATLAB is to use Map containers, where each unit's data is indexed directly by its unit id. Below, we use getRow to build such Maps.
unit_ids = nwb.units.id.data.load(); % array of unit ids represented within this
% Initialize trials & times Map containers indexed by unit_ids
unit_trials = containers.Map('KeyType',class(unit_ids),'ValueType','any');
unit_times = containers.Map('KeyType',class(unit_ids),'ValueType','any');
for i = 1:length(unit_ids)
unit_id = unit_ids(i);
row = nwb.units.getRow(unit_id, 'useId', true, 'columns', {'spike_times', 'trials'});
unit_trials(unit_id) = row.trials{1};
unit_times(unit_id) = row.spike_times{1};
end

Process Units

We now do the following for each Unit:
sorted_ids = sort(unit_ids);
Photostim = struct(...
'ind', true,... % mask into xs and ys for this photostim
'period', 'none',...
'duration', 0,... % in seconds
'ramp_offset', 0); % in seconds
% Initialize Map container of plotting data for each unit, stored as structure
Unit = containers.Map('KeyType',class(unit_ids),'ValueType','any');
unit_struct = struct(...
'id', [],...
'xs', [],...
'ys', [],...
'xlim', [-Inf Inf],...
'sample', 0,...
'delay', 0,...
'response', 0,...
'left_scatter', false,...
'right_scatter', false,...
'photostim', Photostim); % can have multiple photostim
for unit_id = unit_ids'
We first extract trial IDs from the Unit IDs.
unit_trial_id = unit_trials(unit_id);
Then filter out outliers from the Sample, Delay, and Response time points with which we derive a "good enough" estimate.
trial = nwb.intervals_trials.getRow(unit_trial_id, 'useId', true,...
'columns', {'PoleInTime', 'PoleOutTime', 'CueTime', 'GoodTrials'});
unit_sample = trial.PoleInTime;
unit_delay = trial.PoleOutTime;
unit_response = trial.CueTime;
unit_good_trials = trial.GoodTrials;
% Subjective parameters
delay_threshold = 0.064;
response_threshold = 0.43;
expected_delay_offset = 1.3; % determined from figure 1a
expected_response_offset = 1.3;
expected_delay = unit_sample + expected_delay_offset;
expected_response = unit_delay + expected_response_offset;
good_delay = (unit_delay > expected_delay - delay_threshold) &...
(unit_delay < expected_delay + delay_threshold);
good_response = (unit_response > expected_response - response_threshold) &...
(unit_response < expected_response + response_threshold);
avg_sample = mean(unit_sample(good_delay & good_response));
avg_delay = mean(unit_delay(good_delay & good_response));
avg_response = mean(unit_response(good_delay & good_response));
Filter the rest of the data by "good" trials.
unit_good_trials = unit_good_trials & good_delay & good_response;
unit_trial_id = unit_trial_id(unit_good_trials);
unit_spike_time = unit_times(unit_id);
unit_spike_time = unit_spike_time(unit_good_trials);
Retrieve good trial data and organize by stimulation type.
trial = nwb.intervals_trials.getRow(unit_trial_id, 'useId', true,...
'columns', {'start_time', 'HitR', 'HitL', 'StimTrials', 'PhotostimulationType'});
unit_is_photostim = logical(trial.StimTrials);
unit_stim_type = trial.PhotostimulationType;
unit_no_stim = ~unit_is_photostim & 0 == unit_stim_type;
unit_sample_stim = unit_is_photostim & 1 == unit_stim_type;
unit_early_stim = unit_is_photostim & 2 == unit_stim_type;
unit_middle_stim = unit_is_photostim & 3 == unit_stim_type;
Compose Scatter Plots and the Peristimulus Time Histogram zeroed on the Response time.
xs = unit_spike_time - trial.start_time - avg_response;
ys = unit_trial_id;
curr_unit = unit_struct;
curr_unit.xs = xs;
curr_unit.ys = ys;
curr_unit.left_scatter = logical(trial.HitL);
curr_unit.right_scatter = logical(trial.HitR);
curr_unit.sample = avg_sample - avg_response;
curr_unit.delay = avg_delay - avg_response;
curr_unit.response = 0;
% Photostim periods
curr_unit.photostim.ind = unit_no_stim;
% Sample
if any(unit_sample_stim)
SampleStim = Photostim;
SampleStim.ind = unit_sample_stim;
SampleStim.period = 'Sample';
SampleStim.duration = 0.5;
SampleStim.ramp_offset = 0.1;
curr_unit.photostim(end+1) = SampleStim;
end
% Early Delay
if any(unit_early_stim)
early_stim_types = unique(unit_stim_type(unit_early_stim));
for i_early_types=1:length(early_stim_types)
early_type = early_stim_types(i_early_types);
EarlyStim = Photostim;
EarlyStim.period = 'Early Delay';
EarlyStim.ind = early_type == unit_stim_type & unit_early_stim;
if early_type == 2
EarlyStim.duration = 0.5;
EarlyStim.ramp_offset = 0.1;
else
EarlyStim.duration = 0.8;
EarlyStim.ramp_offset = 0.2;
end
curr_unit.photostim(end+1) = EarlyStim;
end
end
% Middle Delay
if any(unit_middle_stim)
MiddleStim = Photostim;
MiddleStim.ind = unit_middle_stim;
MiddleStim.period = 'Middle Delay';
MiddleStim.duration = 0.5;
MiddleStim.ramp_offset = 0.1;
curr_unit.photostim(end+1) = MiddleStim;
end
Unit(unit_id) = curr_unit;
end

Plot Example Neurons

neuron_labels = [2, 3]; % neuron labels from Figure 1e
neuron_ids = [11, 2]; % neuron unit IDs corresponding to the Fig 1e labels
num_conditions = 4; % photostim conditions: nostim, sample, early, middle if applicable
num_neurons = length(neuron_ids);
% Initialize data structures for each summary plot of categorized neural spike data at specified stimulus condition
RasterPlot = struct(...
'xs', 0,...
'ys', 0);
ConditionPlot = struct(...
'label', '',...
'xlim', 0,...
'sample', 0,...
'delay', 0,...
'response', 0,...
'right_scatter', RasterPlot,...
'left_scatter', RasterPlot,...
'psth_bin_window', 0,...
'stim_type', '');
fig = figure;
% Plot neural spike data for each neuron and stimulus condition in a subplot array: num_neurons (rows) x num_conditions (columns)
for nn=1:num_neurons
Neuron = Unit(neuron_ids(nn));
% Initialize structure with neural + stimulus condition data
CurrPlot = ConditionPlot;
CurrPlot.xlim = [min(Neuron.xs) max(Neuron.xs)];
CurrPlot.sample = Neuron.sample;
CurrPlot.delay = Neuron.delay;
CurrPlot.response = Neuron.response;
% Plot each neuron/condition
plot_row = (nn - 1) * num_conditions;
for cc=1:num_conditions
ax = subplot(num_neurons, num_conditions, plot_row + cc, 'Parent', fig);
Stim = Neuron.photostim(cc);
CurrPlot.stim_type = Stim.period;
if strcmp(Stim.period, 'none')
CurrPlot.label = sprintf('Neuron %d', neuron_labels(nn));
CurrPlot.psth_bin_window = 9;
else
CurrPlot.label = Stim.period;
CurrPlot.psth_bin_window = 2;
end
stim_left_scatter_ind = Stim.ind & Neuron.left_scatter;
stim_left_scatter_trials = Neuron.ys(stim_left_scatter_ind);
CurrPlot.left_scatter.xs = Neuron.xs(stim_left_scatter_ind);
[~,CurrPlot.left_scatter.ys] = ismember(stim_left_scatter_trials,unique(stim_left_scatter_trials));
stim_right_scatter_ind = Stim.ind & Neuron.right_scatter;
stim_right_scatter_trials = Neuron.ys(stim_right_scatter_ind);
CurrPlot.right_scatter.xs = Neuron.xs(stim_right_scatter_ind);
[~,CurrPlot.right_scatter.ys] = ismember(stim_right_scatter_trials,unique(stim_right_scatter_trials));
plot_condition(ax, CurrPlot);
end
end
+ + +

Helper Functions

PSTH helper function
function [psth_xs, psth_ys] = calculate_psth(xs, bin_window, bin_width)
[bin_counts, edges] = histcounts(xs, 'BinWidth', bin_width);
psth_xs = edges(1:end-1) + (bin_width / 2);
moving_avg_b = (1/bin_window) * ones(1,bin_window);
psth_ys = filter(moving_avg_b, 1, bin_counts);
end
Plotter function for each stimulus condition
function plot_condition(ax, ConditionPlot)
left_cdata = [1 0 0]; % red
right_cdata = [0 0 1]; % blue
hist_margin = 50;
scatter_margin = 10;
% Calculate PSTH values
% moving average over 200 ms as per figure 1e
hist_bin_width = 0.2 / ConditionPlot.psth_bin_window;
[left_psth_xs, left_psth_ys] =...
calculate_psth(ConditionPlot.left_scatter.xs, ConditionPlot.psth_bin_window, hist_bin_width);
[right_psth_xs, right_psth_ys] =...
calculate_psth(ConditionPlot.right_scatter.xs, ConditionPlot.psth_bin_window, hist_bin_width);
right_scatter_offset = min(ConditionPlot.right_scatter.ys);
right_scatter_height = max(ConditionPlot.right_scatter.ys) - right_scatter_offset;
left_scatter_offset = min(ConditionPlot.left_scatter.ys);
left_scatter_height = max(ConditionPlot.left_scatter.ys) - left_scatter_offset;
psth_height = max([left_psth_ys right_psth_ys]);
left_y_offset = hist_margin...
+ psth_height...
- left_scatter_offset;
right_y_offset = scatter_margin...
+ left_y_offset...
+ left_scatter_offset...
+ left_scatter_height...
- right_scatter_offset;
subplot_height = right_y_offset...
+ right_scatter_offset...
+ right_scatter_height;
hold(ax, 'on');
% PSTH
plot(ax, left_psth_xs, left_psth_ys, 'Color', left_cdata);
plot(ax, right_psth_xs, right_psth_ys, 'Color', right_cdata);
% Scatter Plot
scatter(ax,...
ConditionPlot.left_scatter.xs,...
left_y_offset + ConditionPlot.left_scatter.ys,...
'Marker', '.',...
'CData', left_cdata,...
'SizeData', 1);
scatter(ax,...
ConditionPlot.right_scatter.xs,...
right_y_offset + ConditionPlot.right_scatter.ys,...
'Marker', '.',...
'CData', right_cdata,...
'SizeData', 1);
% sample, delay, response lines
line(ax, repmat(ConditionPlot.sample, 1, 2), [0 subplot_height],...
'Color', 'k', 'LineStyle', '--');
line(ax, repmat(ConditionPlot.delay, 1, 2), [0 subplot_height],...
'Color', 'k', 'LineStyle', '--');
line(ax, repmat(ConditionPlot.response, 1, 2), [0 subplot_height],...
'Color', 'k', 'LineStyle', '--');
% blue bar for photoinhibition period
if ~strcmp(ConditionPlot.stim_type, 'none')
stim_height = subplot_height;
stim_width = 0.5; % seconds
% end time relative to 'go' cue as described in the paper.
switch ConditionPlot.stim_type
case 'Sample'
end_offset = 1.6;
case 'Early Delay'
end_offset = 0.8;
case 'Middle Delay'
end_offset = 0.3;
otherwise
error('Invalid photostim period `%s`', ConditionPlot.stim_type);
end
stim_offset = ConditionPlot.response - stim_width - end_offset;
patch_vertices = [...
stim_offset, 0;...
stim_offset, stim_height;...
stim_offset+stim_width, stim_height;...
stim_offset+stim_width, 0];
patch(ax,...
'Faces', 1:4,...
'Vertices', patch_vertices,...
'FaceColor', '#B3D3EC',... % light blue shading
'EdgeColor', 'none',...
'FaceAlpha', 0.8);
end
title(ax, ConditionPlot.label);
xlabel(ax, 'Time (Seconds)');
ylabel(ax, 'Spikes s^{-1}')
xticks(ax, [-2 0 2]);
yticks(ax, [0 max(10, round(psth_height, -1))]);
% legend(ax, [scatter_left_plot, scatter_right_plot], {'Left Lick', 'Right Lick'},...
% 'location', 'northwestoutside');
ax.TickDir = 'out';
ax.XLim = ConditionPlot.xlim;
ax.YLim = [0 subplot_height];
hold(ax, 'off');
end
+
+ +
\ No newline at end of file diff --git a/docs/source/_static/html/tutorials/behavior.html b/docs/source/_static/html/tutorials/behavior.html new file mode 100644 index 00000000..4f70f592 --- /dev/null +++ b/docs/source/_static/html/tutorials/behavior.html @@ -0,0 +1,369 @@ + +Behavior Data

Behavior Data

This tutorial will guide you in writing behavioral data to NWB.

Creating an NWB File

Create an NWBFile object with the required fields (session_description, identifier, and session_start_time) and additional metadata.
nwb = NwbFile( ...
'session_description', 'mouse in open exploration',...
'identifier', 'Mouse5_Day3', ...
'session_start_time', datetime(2018, 4, 25, 2, 30, 3, 'TimeZone', 'local'), ...
'general_experimenter', 'My Name', ... % optional
'general_session_id', 'session_1234', ... % optional
'general_institution', 'University of My Institution', ... % optional
'general_related_publications', 'DOI:10.1016/j.neuron.2016.12.011'); % optional
nwb
nwb =
NwbFile with properties: + + nwb_version: '2.7.0' + file_create_date: [] + identifier: 'Mouse5_Day3' + session_description: 'mouse in open exploration' + session_start_time: {[2018-04-25T02:30:03.000000+02:00]} + timestamps_reference_time: [] + acquisition: [0×1 types.untyped.Set] + analysis: [0×1 types.untyped.Set] + general: [0×1 types.untyped.Set] + general_data_collection: '' + general_devices: [0×1 types.untyped.Set] + general_experiment_description: '' + general_experimenter: 'My Name' + general_extracellular_ephys: [0×1 types.untyped.Set] + general_extracellular_ephys_electrodes: [] + general_institution: 'University of My Institution' + general_intracellular_ephys: [0×1 types.untyped.Set] + general_intracellular_ephys_experimental_conditions: [] + general_intracellular_ephys_filtering: '' + general_intracellular_ephys_intracellular_recordings: [] + general_intracellular_ephys_repetitions: [] + general_intracellular_ephys_sequential_recordings: [] + general_intracellular_ephys_simultaneous_recordings: [] + general_intracellular_ephys_sweep_table: [] + general_keywords: '' + general_lab: '' + general_notes: '' + general_optogenetics: [0×1 types.untyped.Set] + general_optophysiology: [0×1 types.untyped.Set] + general_pharmacology: '' + general_protocol: '' + general_related_publications: 'DOI:10.1016/j.neuron.2016.12.011' + general_session_id: 'session_1234' + general_slices: '' + general_source_script: '' + general_source_script_file_name: '' + general_stimulus: '' + general_subject: [] + general_surgery: '' + general_virus: '' + intervals: [0×1 types.untyped.Set] + intervals_epochs: [] + intervals_invalid_times: [] + intervals_trials: [] + processing: [0×1 types.untyped.Set] + scratch: [0×1 types.untyped.Set] + stimulus_presentation: [0×1 types.untyped.Set] + stimulus_templates: [0×1 types.untyped.Set] + units: [] + +Warning: The following required properties are missing for instance for type "NwbFile": + timestamps_reference_time

SpatialSeries: Storing continuous spatial data

SpatialSeries is a subclass of TimeSeries that represents data in space, such as the direction of gaze or travel, or the position of an animal over time.
Create data that corresponds to x, y position over time.
position_data = [linspace(0, 10, 50); linspace(0, 8, 50)]; % 2 x nT array
In SpatialSeries data, the first dimension is always time (in seconds) and the second dimension represents the x, y position. However, as described in the dimensionMapNoDataPipes tutorial, when a MATLAB array is exported to HDF5 the array is transposed. Therefore, to export the data correctly, the last dimension of the MATLAB array should be time. SpatialSeries data should be stored as one continuous stream, as it is acquired, not segmented by trial as is often done for analysis. Data can be trial-aligned on the fly using the trials table; see the trials tutorial for further information.
For position data reference_frame indicates the zero-position, e.g. the 0,0 point might be the bottom-left corner of an enclosure, as viewed from the tracking camera.
timestamps = linspace(0, 50, 50)/ 200;
position_spatial_series = types.core.SpatialSeries( ...
'description', 'Position (x, y) in an open field.', ...
'data', position_data, ...
'timestamps', timestamps, ...
'reference_frame', '(0,0) is the bottom left corner.' ...
)
position_spatial_series =
SpatialSeries with properties: + + reference_frame: '(0,0) is the bottom left corner.' + starting_time_unit: 'seconds' + timestamps_interval: 1 + timestamps_unit: 'seconds' + data: [2×50 double] + comments: 'no comments' + control: [] + control_description: '' + data_continuity: '' + data_conversion: 1 + data_offset: 0 + data_resolution: -1 + data_unit: 'meters' + description: 'Position (x, y) in an open field.' + starting_time: [] + starting_time_rate: [] + timestamps: [0 0.0051 0.0102 0.0153 0.0204 0.0255 0.0306 0.0357 0.0408 0.0459 0.0510 0.0561 0.0612 0.0663 0.0714 0.0765 0.0816 0.0867 0.0918 0.0969 0.1020 0.1071 0.1122 0.1173 0.1224 0.1276 0.1327 0.1378 0.1429 0.1480 0.1531 … ] (1×50 double) +

Position: Storing position measured over time

To help data analysis and visualization tools know that this SpatialSeries object represents the position of the subject, store the SpatialSeries object inside a Position object, which can hold one or more SpatialSeries objects.
position = types.core.Position();
position.spatialseries.set('SpatialSeries', position_spatial_series);

Create a Behavior Processing Module

Create a processing module called "behavior" for storing behavioral data in the NWBFile, then add the Position object to the processing module.
behavior_processing_module = types.core.ProcessingModule('description', 'stores behavioral data.');
behavior_processing_module.nwbdatainterface.set("Position", position);
nwb.processing.set("behavior", behavior_processing_module);

CompassDirection: Storing view angle measured over time

Analogous to how position can be stored, we can create a SpatialSeries object for representing the view angle of the subject.
For direction data reference_frame indicates the zero direction, for instance in this case "straight ahead" is 0 radians.
view_angle_data = linspace(0, 4, 50);
direction_spatial_series = types.core.SpatialSeries( ...
'description', 'View angle of the subject measured in radians.', ...
'data', view_angle_data, ...
'timestamps', timestamps, ...
'reference_frame', 'straight ahead', ...
'data_unit', 'radians' ...
);
direction = types.core.CompassDirection();
direction.spatialseries.set('spatial_series', direction_spatial_series);
We can add a CompassDirection object to the behavior processing module the same way we have added the position data.
%behavior_processing_module = types.core.ProcessingModule('description', 'stores behavioral data.'); % if you have not already created it
behavior_processing_module.nwbdatainterface.set('CompassDirection', direction);
%nwb.processing.set('behavior', behavior_processing_module); % if you have not already added it

BehavioralTimeSeries: Storing continuous behavior data

BehavioralTimeSeries is an interface for storing continuous behavior data, such as the speed of a subject.
speed_data = linspace(0, 0.4, 50);
 
speed_time_series = types.core.TimeSeries( ...
'data', speed_data, ...
'starting_time', 1.0, ... % NB: Important to set starting_time when using starting_time_rate
'starting_time_rate', 10.0, ... % Hz
'description', 'The speed of the subject measured over time.', ...
'data_unit', 'm/s' ...
);
 
behavioral_time_series = types.core.BehavioralTimeSeries();
behavioral_time_series.timeseries.set('speed', speed_time_series);
 
%behavior_processing_module = types.core.ProcessingModule('description', 'stores behavioral data.'); % if you have not already created it
behavior_processing_module.nwbdatainterface.set('BehavioralTimeSeries', behavioral_time_series);
%nwb.processing.set('behavior', behavior_processing_module); % if you have not already added it

BehavioralEvents: Storing behavioral events

BehavioralEvents is an interface for storing behavioral events. We can use it for storing the timing and amount of rewards (e.g. water amount) or lever press times.
reward_amount = [1.0, 1.5, 1.0, 1.5];
event_timestamps = [1.0, 2.0, 5.0, 6.0];
 
time_series = types.core.TimeSeries( ...
'data', reward_amount, ...
'timestamps', event_timestamps, ...
'description', 'The water amount the subject received as a reward.', ...
'data_unit', 'ml' ...
);
 
behavioral_events = types.core.BehavioralEvents();
behavioral_events.timeseries.set('lever_presses', time_series);
 
%behavior_processing_module = types.core.ProcessingModule('description', 'stores behavioral data.'); % if you have not already created it
behavior_processing_module.nwbdatainterface.set('BehavioralEvents', behavioral_events);
%nwb.processing.set('behavior', behavior_processing_module); % if you have not already added it
Storing only the timestamps of events is possible with the ndx-events NWB extension, which also lets you attach labels to the events. See the ndx-events documentation for installation instructions and example usage.

BehavioralEpochs: Storing intervals of behavior data

BehavioralEpochs is for storing intervals of behavior data. BehavioralEpochs uses IntervalSeries to represent the time intervals. Create an IntervalSeries object that represents the time intervals when the animal was running. IntervalSeries uses 1 to indicate the beginning of an interval and -1 to indicate the end.
run_intervals = types.core.IntervalSeries( ...
'description', 'Intervals when the animal was running.', ...
'data', [1, -1, 1, -1, 1, -1], ...
'timestamps', [0.5, 1.5, 3.5, 4.0, 7.0, 7.3] ...
);
 
behavioral_epochs = types.core.BehavioralEpochs();
behavioral_epochs.intervalseries.set('running', run_intervals);
You can add more than one IntervalSeries to a BehavioralEpochs object.
sleep_intervals = types.core.IntervalSeries( ...
'description', 'Intervals when the animal was sleeping', ...
'data', [1, -1, 1, -1], ...
'timestamps', [15.0, 30.0, 60.0, 95.0] ...
);
behavioral_epochs.intervalseries.set('sleeping', sleep_intervals);
 
% behavior_processing_module = types.core.ProcessingModule('description', 'stores behavioral data.');
% behavior_processing_module.nwbdatainterface.set('BehavioralEpochs', behavioral_epochs);
% nwb.processing.set('behavior', behavior_processing_module);

Another approach: TimeIntervals

Using TimeIntervals to represent time intervals is often preferred over BehavioralEpochs and IntervalSeries. TimeIntervals is a subclass of DynamicTable, which offers flexibility for tabular data by allowing the addition of optional columns which are not defined in the standard DynamicTable class.
sleep_intervals = types.core.TimeIntervals( ...
'description', 'Intervals when the animal was sleeping.', ...
'colnames', {'start_time', 'stop_time', 'stage'} ...
);
 
sleep_intervals.addRow('start_time', 0.3, 'stop_time', 0.35, 'stage', 1);
sleep_intervals.addRow('start_time', 0.7, 'stop_time', 0.9, 'stage', 2);
sleep_intervals.addRow('start_time', 1.3, 'stop_time', 3.0, 'stage', 3);
 
nwb.intervals.set('sleep_intervals', sleep_intervals);

EyeTracking: Storing continuous eye-tracking data of gaze direction

EyeTracking is for storing eye-tracking data which represents direction of gaze as measured by an eye tracking algorithm. An EyeTracking object holds one or more SpatialSeries objects that represent the gaze direction over time extracted from a video.
eye_position_data = [linspace(-20, 30, 50); linspace(30, -20, 50)];
 
right_eye_position = types.core.SpatialSeries( ...
'description', 'The position of the right eye measured in degrees.', ...
'data', eye_position_data, ...
'starting_time', 1.0, ... % NB: Important to set starting_time when using starting_time_rate
'starting_time_rate', 50.0, ... % Hz
'reference_frame', '(0,0) is middle', ...
'data_unit', 'degrees' ...
);
 
left_eye_position = types.core.SpatialSeries( ...
'description', 'The position of the left eye measured in degrees.', ...
'data', eye_position_data, ...
'starting_time', 1.0, ... % NB: Important to set starting_time when using starting_time_rate
'starting_time_rate', 50.0, ... % Hz
'reference_frame', '(0,0) is middle', ...
'data_unit', 'degrees' ...
);
 
eye_tracking = types.core.EyeTracking();
eye_tracking.spatialseries.set('right_eye_position', right_eye_position);
eye_tracking.spatialseries.set('left_eye_position', left_eye_position);
 
% behavior_processing_module = types.core.ProcessingModule('description', 'stores behavioral data.');
behavior_processing_module.nwbdatainterface.set('EyeTracking', eye_tracking);
% nwb.processing.set('behavior', behavior_processing_module);

PupilTracking: Storing continuous eye-tracking data of pupil size

PupilTracking is for storing eye-tracking data which represents pupil size. PupilTracking holds one or more TimeSeries objects that can represent different features such as the dilation of the pupil measured over time by a pupil tracking algorithm.
pupil_diameter = types.core.TimeSeries( ...
'description', 'Pupil diameter extracted from the video of the right eye.', ...
'data', linspace(0.001, 0.002, 50), ...
'starting_time', 1.0, ... % NB: Important to set starting_time when using starting_time_rate
'starting_time_rate', 20.0, ... % Hz
'data_unit', 'meters' ...
);
 
pupil_tracking = types.core.PupilTracking();
pupil_tracking.timeseries.set('pupil_diameter', pupil_diameter);
 
% behavior_processing_module = types.core.ProcessingModule('description', 'stores behavioral data.');
behavior_processing_module.nwbdatainterface.set('PupilTracking', pupil_tracking);
% nwb.processing.set('behavior', behavior_processing_module);

Writing the behavior data to an NWB file

All of the above commands build an NWBFile object in-memory. To write this file, use nwbExport.
% Save to tutorials/tutorial_nwb_files folder
nwbFilePath = misc.getTutorialNwbFilePath('behavior_tutorial.nwb');
nwbExport(nwb, nwbFilePath);
fprintf('Exported NWB file to "%s"\n', 'behavior_tutorial.nwb')
Exported NWB file to "behavior_tutorial.nwb"
+
+ +
\ No newline at end of file diff --git a/docs/source/_static/html/tutorials/convertTrials.html b/docs/source/_static/html/tutorials/convertTrials.html new file mode 100644 index 00000000..1df73a43 --- /dev/null +++ b/docs/source/_static/html/tutorials/convertTrials.html @@ -0,0 +1,1119 @@ + + + + + +NWB File Conversion Tutorial + + + + + + + +
+

NWB File Conversion Tutorial

+ +

How to convert trial-based experimental data to the Neurodata Without Borders file format using MatNWB. This example uses the CRCNS ALM-3 data set. Information on how to download the data can be found on the CRCNS Download Page. You should first familiarize yourself with the file format, which is described on the ALM-3 About Page under the Documentation files.

+
author: Lawrence Niu
+contact: lawrence@vidriotech.com
+last updated: Sep 14, 2024
+ +

Contents

+
+ +
+

Script Configuration

+

The following section describes configuration parameters specific to the publishing script, and can be skipped when implementing your own conversion. The parameters can be changed to fit any of the available sessions.

+
animal = 'ANM255201';
+session = '20141124';
+
+identifier = [animal '_' session];
+
+% Specify the local path for the downloaded data:
+data_root_path = 'data';
+
+metadata_loc = fullfile(data_root_path, 'metadata', ['meta_data_' identifier '.mat']);
+datastructure_loc = fullfile(data_root_path, 'data_structure_files',...
+    ['data_structure_' identifier '.mat']);
+rawdata_loc = fullfile(data_root_path, 'RawVoltageTraces', [identifier '.tar']);
+
+

The animal and session specifiers can be changed via the animal and session variables, respectively. metadata_loc, datastructure_loc, and rawdata_loc should refer to the metadata .mat file, the data structure .mat file, and the raw .tar file.

+
output_directory = 'out';
+
+if ~isfolder(output_directory)
+    mkdir(output_directory);
+end
+
+source_file = [mfilename() '.m'];
+[~, source_script, ~] = fileparts(source_file);
+
+

The NWB file will be saved in the output directory indicated by output_directory.

+

General Information

+
nwb = NwbFile();
+nwb.identifier = identifier;
+nwb.general_source_script = source_script;
+nwb.general_source_script_file_name = source_file;
+nwb.general_lab = 'Svoboda';
+nwb.general_keywords = {'Network models', 'Premotor cortex', 'Short-term memory'};
+nwb.general_institution = ['Janelia Research Campus,'...
+    ' Howard Hughes Medical Institute, Ashburn, Virginia 20147, USA'];
+nwb.general_related_publications = ...
+    ['Li N, Daie K, Svoboda K, Druckmann S (2016).',...
+    ' Robust neuronal dynamics in premotor cortex during motor planning.',...
+    ' Nature. 7600:459-64. doi: 10.1038/nature17643'];
+nwb.general_stimulus = 'photostim';
+nwb.general_protocol = 'IACUC';
+nwb.general_surgery = ['Mice were prepared for photoinhibition and ',...
+    'electrophysiology with a clear-skull cap and a headpost. ',...
+    'The scalp and periosteum over the dorsal surface of the skull were removed. ',...
+    'A layer of cyanoacrylate adhesive (Krazy glue, Elmer''s Products Inc.) ',...
+    'was directly applied to the intact skull. A custom made headpost ',...
+    'was placed on the skull with its anterior edge aligned with the suture lambda ',...
+    '(approximately over cerebellum) and cemented in place ',...
+    'with clear dental acrylic (Lang Dental Jet Repair Acrylic; 1223-clear). ',...
+    'A thin layer of clear dental acrylic was applied over the cyanoacrylate adhesive ',...
+    'covering the entire exposed skull, ',...
+    'followed by a thin layer of clear nail polish (Electron Microscopy Sciences, 72180).'];
+nwb.session_description = sprintf('Animal `%s` on Session `%s`', animal, session);
+
+

All properties with the prefix general contain context for the entire experiment, such as lab, institution, and experimenters. For session-delimited data from the same experiment, these fields will all be the same. Note that most of this information was pulled from the published paper and not from any of the downloadable data.

+

The only required property is the identifier, which distinguishes one session from another within an experiment. In our case, the ALM-3 data uses a combination of session date and animal ID.

+

The ALM-3 File Structure

+

Each ALM-3 session has three files: a metadata .mat file describing the experiment, a data structures .mat file containing analyzed data, and a raw .tar archive containing raw electrophysiology data for multiple trials as separate .mat files. All files will be merged into a single NWB file.

+

Metadata

+

ALM-3 metadata contains information about the reference times, experimental context, and methodology, as well as details of the electrophysiology, optophysiology, and behavioral portions of the experiment. The vast majority of these details are placed in general-prefixed properties in NWB.

+
fprintf('Processing Meta Data from `%s`\n', metadata_loc);
+loaded = load(metadata_loc, 'meta_data');
+meta = loaded.meta_data;
+
+% Experiment-specific treatment for animals with the ReaChR gene modification
+isreachr = any(cell2mat(strfind(meta.animalGeneModification, 'ReaChR')));
+
+% Sessions are separated by date of experiment.
+nwb.general_session_id = meta.dateOfExperiment;
+
+% ALM-3 data start time is equivalent to the reference time.
+nwb.session_start_time = datetime([meta.dateOfExperiment meta.timeOfExperiment],...
+    'InputFormat', 'yyyyMMddHHmmss', 'TimeZone', 'America/New_York'); % Eastern Daylight Time
+nwb.timestamps_reference_time = nwb.session_start_time;
+
+nwb.general_experimenter = strjoin(meta.experimenters, ', ');
+
+
Processing Meta Data from `data/metadata/meta_data_ANM255201_20141124.mat`
+
+
nwb.general_subject = types.core.Subject(...
+    'species', meta.species{1}, ...
+    'subject_id', meta.animalID{1}(1,:), ... %weird case with duplicate Animal ID
+    'sex', meta.sex, ...
+    'age', meta.dateOfBirth, ...
+    'description', [...
+        'Whisker Config: ' strjoin(meta.whiskerConfig, ', ') newline...
+        'Animal Source: ' strjoin(meta.animalSource, ', ')]);
+
+

Ideally, if a raw data field does not correspond directly to an NWB field, you would create your own using a custom NWB extension class. However, since these fields are mostly experimental annotations, we instead pack the extra values into the description field as a string.

+
+% The formatStruct function simply prints the field and values given the struct.
+% An optional cell array of field names specifies whitelist of fields to print.
+% This function is provided with this script in the tutorials directory.
+nwb.general_subject.genotype = formatStruct(...
+    meta, ...
+    {'animalStrain'; 'animalGeneModification'; 'animalGeneCopy';...
+    'animalGeneticBackground'});
+
+weight = {};
+if ~isempty(meta.weightBefore)
+    weight{end+1} = 'weightBefore';
+end
+if ~isempty(meta.weightAfter)
+    weight{end+1} = 'weightAfter';
+end
+weight = weight(~cellfun('isempty', weight));
+if ~isempty(weight)
+    nwb.general_subject.weight = formatStruct(meta, weight);
+end
+
+% general/experiment_description
+nwb.general_experiment_description = [...
+    formatStruct(meta, {'experimentType'; 'referenceAtlas'}), ...
+    newline, ...
+    formatStruct(meta.behavior, {'task_keyword'})];
+
+% Miscellaneous collection information from ALM-3 that didn't quite fit any NWB
+% properties are stored in general/data_collection.
+nwb.general_data_collection = formatStruct(meta.extracellular,...
+    {'extracellularDataType';'cellType';'identificationMethod';'amplifierRolloff';...
+    'spikeSorting';'ADunit'});
+
+% Device objects are essentially just a list of device names.  We store the probe
+% and laser hardware names here.
+probetype = meta.extracellular.probeType{1};
+probeSource = meta.extracellular.probeSource{1};
+deviceName = [probetype ' (' probeSource ')'];
+nwb.general_devices.set(deviceName, types.core.Device());
+
+if isreachr
+    laserName = 'laser-594nm (Cobolt Inc., Cobolt Mambo 100)';
+else
+    laserName = 'laser-473nm (Laser Quantum, Gem 473)';
+end
+nwb.general_devices.set(laserName, types.core.Device());
+
+
structDesc = {'recordingCoordinates';'recordingMarker';'recordingType';'penetrationN';...
+    'groundCoordinates'};
+if ~isempty(meta.extracellular.referenceCoordinates)
+    structDesc{end+1} = 'referenceCoordinates';
+end
+recordingLocation = meta.extracellular.recordingLocation{1};
+egroup = types.core.ElectrodeGroup(...
+    'description', formatStruct(meta.extracellular, structDesc),...
+    'location', recordingLocation,...
+    'device', types.untyped.SoftLink(['/general/devices/' deviceName]));
+nwb.general_extracellular_ephys.set(deviceName, egroup);
+
+

The NWB ElectrodeGroup object stores experimental information regarding a group of probes. Doing so requires a SoftLink to the probe specified under general_devices. SoftLink objects are direct maps to HDF5 Soft Links on export and thus require a true HDF5 path.

+
+% You can specify column names and values as key-value arguments in the DynamicTable
+% constructor.
+dtColNames = {'x', 'y', 'z', 'imp', 'location', 'filtering','group', 'group_name'};
+dynTable = types.hdmf_common.DynamicTable(...
+    'colnames', dtColNames,...
+    'description', 'Electrodes',...
+    'x', types.hdmf_common.VectorData('description', 'x coordinate of the channel location in the brain (+x is posterior).'),...
+    'y', types.hdmf_common.VectorData('description', 'y coordinate of the channel location in the brain (+y is inferior).'),...
+    'z', types.hdmf_common.VectorData('description', 'z coordinate of the channel location in the brain (+z is right).'),...
+    'imp', types.hdmf_common.VectorData('description', 'Impedance of the channel.'),...
+    'location', types.hdmf_common.VectorData('description', ['Location of the electrode (channel). '...
+    'Specify the area, layer, comments on estimation of area/layer, stereotaxic coordinates if '...
+    'in vivo, etc. Use standard atlas names for anatomical regions when possible.']),...
+    'filtering', types.hdmf_common.VectorData('description', 'Description of hardware filtering.'),...
+    'group', types.hdmf_common.VectorData('description', 'Reference to the ElectrodeGroup this electrode is a part of.'),...
+    'group_name', types.hdmf_common.VectorData('description', 'Name of the ElectrodeGroup this electrode is a part of.'));
+
+% Raw HDF5 path to the above electrode group. Referenced by
+% the general/extracellular_ephys Dynamic Table
+egroupPath = ['/general/extracellular_ephys/' deviceName];
+eGroupReference = types.untyped.ObjectView(egroupPath);
+for i = 1:length(meta.extracellular.siteLocations)
+    location = meta.extracellular.siteLocations{i};
+    % Add each row in the dynamic table. The `id` column is populated
+    % dynamically.
+    dynTable.addRow(...
+        'x', location(1), 'y', location(2), 'z', location(3),...
+        'imp', 0,...
+        'location', recordingLocation,...
+        'filtering', '',...
+        'group', eGroupReference,...
+        'group_name', probetype);
+end
+
+

The group column in the Dynamic Table contains an ObjectView to the previously created ElectrodeGroup. An ObjectView can best be thought of as a direct pointer to another typed object. It also directly maps to an HDF5 Object Reference, hence the HDF5 path requirement. ObjectViews are slightly different from SoftLinks in that they can be stored in datasets (data columns, tables, and data fields in NWBData objects).
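As a side-by-side sketch of the two reference types (the paths here are hypothetical, not from this script), note that both take a full HDF5 path, but only ObjectView values can be stored inside datasets such as DynamicTable columns:

```matlab
% Hypothetical paths for illustration only.
deviceLink = types.untyped.SoftLink('/general/devices/probe0');               % group-level link
groupRef   = types.untyped.ObjectView('/general/extracellular_ephys/probe0'); % storable in a dataset
```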

+
nwb.general_extracellular_ephys_electrodes = dynTable;
+
+

The electrodes property in extracellular_ephys is a special keyword in NWB that must be paired with a Dynamic Table. These are tables which can have an unbounded number of columns and rows, each as their own dataset. With the exception of the id column, all other columns must be VectorData or VectorIndex objects. The id column, meanwhile, must be an ElementIdentifiers object. The names of all used columns are specified in the colnames property as a cell array of strings.
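A minimal sketch of these requirements (names and values are illustrative, not from the ALM-3 data) looks like this:

```matlab
% One custom column plus the required id column.
tbl = types.hdmf_common.DynamicTable( ...
    'description', 'example table', ...
    'colnames', {'col1'}, ...
    'col1', types.hdmf_common.VectorData( ...
        'description', 'an example column', ...
        'data', [1; 2]), ...
    'id', types.hdmf_common.ElementIdentifiers('data', int64([0; 1])));
```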

+
+% general/optogenetics/photostim
+nwb.general_optogenetics.set('photostim', ...
+    types.core.OptogeneticStimulusSite(...
+    'excitation_lambda', meta.photostim.photostimWavelength{1}, ...
+    'location', meta.photostim.photostimLocation{1}, ...
+    'device', types.untyped.SoftLink(['/general/devices/' laserName]), ...
+    'description', formatStruct(meta.photostim, {...
+    'stimulationMethod';'photostimCoordinates';'identificationMethod'})));
+
+

Analysis Data Structure

+

The ALM-3 data structures .mat file contains analyzed spike data, trial-specific parameters, and behavioral analysis data.

+

Hashes

+

ALM-3 stores its data structures in the form of hashes, which are essentially the same as Python's dictionaries or MATLAB's maps, except that the keys and values are stored under separate struct fields. Getting a hashed value from a key involves retrieving the array index that the key is in and applying it to the parallel array in the values field.
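A short sketch of such a lookup, assuming the parallel keyNames/value field layout described above (the key name here is hypothetical):

```matlab
hash = data.trialPropertiesHash;            % struct with parallel keyNames and value fields
keyMask = strcmp(hash.keyNames, 'someKey'); % locate the index of the key
value = hash.value{keyMask};                % apply the index to the parallel values array
```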

+

You can find more information about hashes and how they're used on the ALM-3 about page.

+
fprintf('Processing Data Structure `%s`\n', datastructure_loc);
+loaded = load(datastructure_loc, 'obj');
+data = loaded.obj;
+
+% trial_timeseries is a cell array wherein each cell is one trial. We must
+% populate this way because trials may not be in trial order.
+% Trial timeseries will be a compound type under intervals/trials.
+trial_timeseries = cell(size(data.trialIds));
+
+
Processing Data Structure `data/data_structure_files/data_structure_ANM255201_20141124.mat`
+
+

NWB comes with default support for trial-based data. These must be TimeIntervals that are placed in the intervals property. Note that trials is a special keyword that is required for PyNWB compatibility.

+
ephus = data.timeSeriesArrayHash.value{1};
+ephusUnit = data.timeUnitNames{data.timeUnitIds(ephus.timeUnit)};
+
+% Lick direction and timestamps trace
+tsIdx = strcmp(ephus.idStr, 'lick_trace');
+bts = types.core.BehavioralTimeSeries();
+
+bts.timeseries.set('lick_trace_ts', ...
+    types.core.TimeSeries(...
+    'data', ephus.valueMatrix(:,tsIdx),...
+    'data_unit', ephusUnit,...
+    'description', ephus.idStrDetailed{tsIdx}, ...
+    'timestamps', ephus.time, ...
+    'timestamps_unit', ephusUnit));
+nwb.acquisition.set('lick_trace', bts);
+bts_ref = types.untyped.ObjectView('/acquisition/lick_trace/lick_trace_ts');
+
+% Acousto-optic modulator input trace
+tsIdx = strcmp(ephus.idStr, 'aom_input_trace');
+ts = types.core.TimeSeries(...
+    'data', ephus.valueMatrix(:,tsIdx), ...
+    'data_unit', 'Volts', ...
+    'description', ephus.idStrDetailed{tsIdx}, ...
+    'timestamps', ephus.time, ...
+    'timestamps_unit', ephusUnit);
+nwb.stimulus_presentation.set('aom_input_trace', ts);
+ts_ref = types.untyped.ObjectView('/stimulus/presentation/aom_input_trace');
+
+% Laser power
+tsIdx = strcmp(ephus.idStr, 'laser_power');
+ots = types.core.OptogeneticSeries(...
+    'data', ephus.valueMatrix(:, tsIdx), ...
+    'data_conversion', 1e-3, ... % data is stored in mW, data unit for OptogeneticSeries is watts
+    'description', ephus.idStrDetailed{tsIdx}, ...
+    'timestamps', ephus.time, ...
+    'timestamps_unit', ephusUnit, ...
+    'site', types.untyped.SoftLink('/general/optogenetics/photostim'));
+nwb.stimulus_presentation.set('laser_power', ots);
+ots_ref = types.untyped.ObjectView('/stimulus/presentation/laser_power');
+
+% Append trials timeseries references in order
+[ephus_trials, ~, trials_to_data] = unique(ephus.trial);
+for i=1:length(ephus_trials)
+    i_loc = i == trials_to_data;
+    t_start = find(i_loc, 1);
+    t_count = sum(i_loc);
+    trial = ephus_trials(i);
+
+    trial_timeseries{trial}(end+(1:3), :) = {...
+        bts_ref int64(t_start) int64(t_count);...
+        ts_ref  int64(t_start) int64(t_count);...
+        ots_ref int64(t_start) int64(t_count)};
+end
+
+

The timeseries property of the TimeIntervals object is an example of a compound data type. These types are essentially tables of data in HDF5 and can be represented by a MATLAB table, an array of structs, or a struct of arrays. Beware: validation of column lengths here is not guaranteed by the type checker until export.
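As a sketch, the same compound rows can be expressed as a MATLAB table, with one row per timeseries reference (the paths and numbers here are illustrative):

```matlab
% Each row pairs a timeseries reference with a start index and element count.
refs = [types.untyped.ObjectView('/acquisition/a'); ...
    types.untyped.ObjectView('/acquisition/b')];
compound = table(int64([1; 11]), int64([10; 5]), refs, ...
    'VariableNames', {'idx_start', 'count', 'timeseries'});
```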

+

+VectorIndex objects index into a larger VectorData column. The object that is being referenced is indicated by the target property, which uses an ObjectView. Each element in the VectorIndex marks the last element in the corresponding vector data object for the VectorIndex row. Thus, the starting index for this row would be the previous index + 1. Note that these indices must be 0-indexed for compatibility with pynwb. You can see this in effect with the timeseries property which is indexed by the timeseries_index property.
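A small worked example of this indexing scheme, assuming three rows referencing 2, 3, and 1 elements of the target VectorData:

```matlab
% The index stores cumulative end markers for each row.
ts_len = [2 3 1];
index_data = cumsum(ts_len);  % = [2 5 6]
% Row 2 then spans elements (index_data(1)+1):index_data(2), i.e. 3:5,
% in MATLAB's 1-based terms; the stored markers are 0-indexed on disk.
```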

+

Though TimeIntervals is a subclass of the DynamicTable type, we opt for populating the Dynamic Table data by column instead of using `addRow` here because of how the data is formatted. DynamicTable is flexible enough to accommodate both styles of data conversion.

+
trials_epoch = types.core.TimeIntervals(...
+    'colnames', {'start_time'}, ...
+    'description', 'trial data and properties', ...
+    'start_time', types.hdmf_common.VectorData(...
+        'data', data.trialStartTimes', ...
+        'description', 'Start time of epoch, in seconds.'), ...
+    'id', types.hdmf_common.ElementIdentifiers(...
+        'data', data.trialIds' ) );
+
+% Add columns for the trial types
+for i=1:length(data.trialTypeStr)
+    columnName = data.trialTypeStr{i};
+    columnData = types.hdmf_common.VectorData(...
+         'data', data.trialTypeMat(i,:)', ... % transpose for column vector
+         'description', data.trialTypeStr{i});
+    trials_epoch.addColumn( columnName, columnData )
+end
+
+% Add columns for the trial properties
+for i=1:length(data.trialPropertiesHash.keyNames)
+    columnName = data.trialPropertiesHash.keyNames{i};
+    descr = data.trialPropertiesHash.descr{i};
+    if iscellstr(descr)
+        descr = strjoin(descr, newline);
+    end
+    columnData = types.hdmf_common.VectorData(...
+         'data', data.trialPropertiesHash.value{i},...
+         'description', descr);
+    trials_epoch.addColumn( columnName, columnData )
+end
+
+nwb.intervals_trials = trials_epoch;
+
+

Ephus spike data is separated into units, which directly maps to the NWB property of the same name. Each such unit contains a group of analyzed waveforms and spike times, all linked to a different subset of trial IDs.

+

The waveforms are placed in the analysis Set and are paired with their unit name ('unitx' where 'x' is some unit ID).

+

Trial IDs, wherever they are used, are placed in a relevant control property in the data object and indicate which data is associated with which trial, as defined in the trials table's id column.

+
nwb.units = types.core.Units('colnames',...
+    {'spike_times', 'trials', 'waveforms'},...
+    'description', 'Analysed Spike Events');
+esHash = data.eventSeriesHash;
+ids = regexp(esHash.keyNames, '^unit(\d+)$', 'once', 'tokens');
+ids = str2double([ids{:}]);
+nwb.units.spike_times = types.hdmf_common.VectorData(...
+    'description', 'timestamps of spikes');
+
+for i=1:length(ids)
+    esData = esHash.value{i};
+    % Add trials ID reference
+
+    good_trials_mask = ismember(esData.eventTrials, nwb.intervals_trials.id.data);
+    eventTrials = esData.eventTrials(good_trials_mask);
+    eventTimes = esData.eventTimes(good_trials_mask);
+    waveforms = esData.waveforms(good_trials_mask,:);
+    channel = esData.channel(good_trials_mask);
+
+    % Add waveform data to "unitx" and associate with "waveform" column as ObjectView.
+    ses = types.core.SpikeEventSeries(...
+        'control', ids(i),...
+        'control_description', 'Units Table ID',...
+        'data', waveforms.', ...
+        'description', esHash.descr{i}, ...
+        'timestamps', eventTimes, ...
+        'timestamps_unit', data.timeUnitNames{data.timeUnitIds(esData.timeUnit)},...
+        'electrodes', types.hdmf_common.DynamicTableRegion(...
+            'description', 'Electrodes involved with these spike events',...
+            'table', types.untyped.ObjectView('/general/extracellular_ephys/electrodes'),...
+            'data', channel - 1));
+    ses_name = esHash.keyNames{i};
+    ses_ref = types.untyped.ObjectView(['/analysis/', ses_name]);
+    if ~isempty(esData.cellType)
+        ses.comments = ['cellType: ' esData.cellType{1}];
+    end
+    nwb.analysis.set(ses_name, ses);
+    nwb.units.addRow(...
+        'id', ids(i), 'trials', eventTrials, 'spike_times', eventTimes, 'waveforms', ses_ref);
+
+    % Add this timeseries into the trials table as well.
+    [s_trials, ~, trials_to_data] = unique(eventTrials);
+    for j=1:length(s_trials)
+        trial = s_trials(j);
+        j_loc = j == trials_to_data;
+        t_start = find(j_loc, 1);
+        t_count = sum(j_loc);
+
+        trial_timeseries{trial}(end+1, :) = {ses_ref int64(t_start) int64(t_count)};
+    end
+end
+
+

To better understand how spike_times_index and spike_times map to each other, refer to this diagram from the Extracellular Electrophysiology Tutorial.

+

Raw Acquisition Data

+

Each ALM-3 session is associated with a large amount of raw voltage data, grouped by trial ID. To map this data to NWB, each trial is created as its own ElectricalSeries object under the name 'trial n', where 'n' is the trial ID. The trials are then linked to the trials dynamic table for easy referencing.

+
fprintf('Processing Raw Acquisition Data from `%s` (will take a while)\n', rawdata_loc);
+untarLoc = strrep(rawdata_loc, '.tar', '');
+if ~isfolder(untarLoc)
+    untar(rawdata_loc, fileparts(rawdata_loc));
+end
+
+rawfiles = dir(untarLoc);
+rawfiles = fullfile(untarLoc, {rawfiles(~[rawfiles.isdir]).name});
+
+nrows = length(nwb.general_extracellular_ephys_electrodes.id.data);
+tablereg = types.hdmf_common.DynamicTableRegion(...
+    'description','Relevant Electrodes for this Electrical Series',...
+    'table',types.untyped.ObjectView('/general/extracellular_ephys/electrodes'),...
+    'data',(1:nrows) - 1);
+objrefs = cell(size(rawfiles));
+
+endTimestamps = trials_epoch.start_time.data;
+for i=1:length(rawfiles)
+    tnumstr = regexp(rawfiles{i}, '_trial_(\d+)\.mat$', 'tokens', 'once');
+    tnumstr = tnumstr{1};
+    rawdata = load(rawfiles{i}, 'ch_MUA', 'TimeStamps');
+    tnum = str2double(tnumstr);
+
+    if tnum > length(endTimestamps)
+        continue; % sometimes there are extra trials without an associated start time.
+    end
+
+    es = types.core.ElectricalSeries(...
+        'data', rawdata.ch_MUA,...
+        'description', ['Raw Voltage Acquisition for trial ' tnumstr],...
+        'electrodes', tablereg,...
+        'timestamps', rawdata.TimeStamps);
+    tname = ['trial ' tnumstr];
+    nwb.acquisition.set(tname, es);
+
+    endTimestamps(tnum) = endTimestamps(tnum) + rawdata.TimeStamps(end);
+    objrefs{tnum} = types.untyped.ObjectView(['/acquisition/' tname]);
+end
+
+% Link to the raw data by adding the acquisition column with ObjectViews
+% to the data
+emptyrefs = cellfun('isempty', objrefs);
+objrefs(emptyrefs) = {types.untyped.ObjectView('')};
+
+trials_epoch.addColumn('acquisition', types.hdmf_common.VectorData(...
+    'description', 'soft link to acquisition data for this trial',...
+    'data', [objrefs{:}]'));
+
+trials_epoch.stop_time = types.hdmf_common.VectorData(...
+     'data', endTimestamps',...
+     'description', 'the end time of each trial');
+trials_epoch.colnames{end+1} = 'stop_time';
+
+
Processing Raw Acquisition Data from `data/RawVoltageTraces/ANM255201_20141124.tar` (will take a while)
+
+

Add timeseries to trials_epoch

+

First, we'll format and store trial_timeseries into intervals_trials. Note that the timeseries_index data is 0-indexed.

+
ts_len = cellfun('size', trial_timeseries, 1);
+nwb.intervals_trials.timeseries_index = types.hdmf_common.VectorIndex(...
+    'description', 'Index into Timeseries VectorData', ...
+    'data', cumsum(ts_len)', ...
+    'target', types.untyped.ObjectView('/intervals/trials/timeseries') );
+
+% Intervals/trials/timeseries is a compound type so we use cell2table to
+% convert this 2-d cell array into a compatible table.
+is_len_nonzero = ts_len > 0;
+trial_timeseries_table = cell2table(vertcat(trial_timeseries{is_len_nonzero}),...
+    'VariableNames', {'timeseries', 'idx_start', 'count'});
+trial_timeseries_table = movevars(trial_timeseries_table, 'timeseries', 'After', 'count');
+
+interval_trials_timeseries = types.core.TimeSeriesReferenceVectorData(...
+    'description', 'Index into TimeSeries data', ...
+    'data', trial_timeseries_table);
+nwb.intervals_trials.timeseries = interval_trials_timeseries;
+nwb.intervals_trials.colnames{end+1} = 'timeseries';
+
+

Export

+
nwbFilePath = fullfile(output_directory, [identifier '.nwb']);
+if isfile(nwbFilePath)
+    delete(nwbFilePath);
+end
+nwbExport(nwb, nwbFilePath);
+
diff --git a/docs/source/_static/html/tutorials/dataPipe.html b/docs/source/_static/html/tutorials/dataPipe.html
new file mode 100644
index 00000000..1d404b4f
+

Neurodata Without Borders (NWB) advanced write using DataPipe

+ +

How to utilize HDF5 compression using DataPipe

+
authors: Ivan Smalianchuk and Ben Dichter
+contact: smalianchuk.ivan@gmail.com, ben.dichter@catalystneuro.com
+last edited: Jan 04, 2021
+ +

Contents

+
+ +
+

Neurophysiology data can be quite large, often in the 10s of GB per session and sometimes much larger. Here, we demonstrate methods in MatNWB that allow you to deal with large datasets. These methods are compression and iterative write. Both of these techniques use the types.untyped.DataPipe object, which sends specific instructions to the HDF5 backend about how to store data.

+

Compression - basic implementation

+

To compress experimental data (in this case a 3D matrix with dimensions [250 250 70]) one must assign it as a DataPipe type:

+
DataToCompress = randi(100, 250, 250, 70);
+DataPipe = types.untyped.DataPipe('data', DataToCompress);
+
+

This is the most basic way to achieve compression, and all of the optimization decisions are automatically determined by MatNWB.

+

Background

+

HDF5 has a built-in ability to compress and decompress individual datasets. If applied intelligently, this can dramatically reduce the amount of space used on the hard drive to represent the data. The end user does not need to worry about the compression status of the dataset: HDF5 will automatically decompress the dataset on read.

+

The above example uses default chunk size and compression level (3). To optimize compression, compressionLevel and chunkSize must be considered. compressionLevel ranges from 0 - 9 where 9 is the highest level of compression and 0 is the lowest. chunkSize is less intuitive to adjust; to implement compression, chunk size must be less than data size.
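A sketch with both settings made explicit, reusing DataToCompress from the example above (the chosen values are illustrative, not tuned recommendations):

```matlab
% Moderate compression with chunks that evenly tile the [250 250 70] data.
pipe = types.untyped.DataPipe( ...
    'data', DataToCompress, ...
    'compressionLevel', 5, ...   % 0 (none) through 9 (maximum)
    'chunkSize', [25 25 7]);     % each chunk dimension <= the data dimension
```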

+

+DataPipe Arguments

+

+ + + + + + + +
maxSize: Sets the maximum size of the HDF5 Dataset. Unless using iterative writing, this should match the size of Data. To append data later, use the maxSize for the full dataset. You can use Inf for a dimension if you do not know its final size.
data: The data to compress. Must be numerical data.
axis: Sets which axis to increment when appending more data.
dataType: Sets the type of the experimental data. This must be a numeric data type. Useful to include when using iterative write to append data, as the appended data must be of the same data type. If data is provided and dataType is not, the dataType is inferred from the provided data.
chunkSize: Sets the chunk size for the compression. Must be less than maxSize.
compressionLevel: Level of compression, ranging from 0-9, where 9 is the highest level of compression. The default is level 3.
offset: Axis offset of the dataset to append. May be used to overwrite data.
+

+

Chunking

+

HDF5 Datasets can be stored in either continuous or chunked mode. Continuous means that all of the data is written to one continuous block on the hard drive; chunked means that the dataset is automatically split into chunks that are distributed across the hard drive. The user does not need to know which mode is used: HDF5 handles the gathering of chunks automatically. However, it is worth understanding chunks because they can have a big impact on the space used and on read and write speed. When using compression, the dataset MUST be chunked; HDF5 cannot apply compression to continuous datasets.

+

If chunkSize is not explicitly specified, DataPipe will determine an appropriate chunk size. However, you can optimize the performance of the compression by manually specifying the chunk size using the chunkSize argument.

+

We can demonstrate the benefit of chunking by exploring the following scenario. The following code utilizes DataPipe's default chunk size:

+
fData = randi(250, 100, 1000); % Create fake data
+
+% create an nwb structure with required fields
+nwb = NwbFile( ...
+    'session_start_time', datetime('2020-01-01 00:00:00', 'TimeZone', 'local'), ...
+    'identifier', 'ident1', ...
+    'session_description', 'DataPipeTutorial');
+
+fData_compressed = types.untyped.DataPipe('data', fData);
+
+fdataNWB=types.core.TimeSeries( ...
+    'data', fData_compressed, ...
+    'data_unit', 'mV', ...
+    'starting_time', 0.0, ...
+    'starting_time_rate', 30.0);
+
+nwb.acquisition.set('data', fdataNWB);
+
+nwbExport(nwb, 'DefaultChunks.nwb');
+
+

This results in a file size of 47MB (too large), and the process takes 11 seconds (far too long). Setting the chunk size manually as in the example code below resolves these issues:

+
fData_compressed = types.untyped.DataPipe( ...
+    'data', fData, ...
+    'chunkSize', [1, 1000], ...
+    'axis', 1);
+
+

This change results in the operation completing in 0.7 seconds and resulting file size of 1.1MB. The chunk size was chosen such that it spans each individual row of the matrix.

+

Use the combination of arguments that fits your needs. When dealing with large datasets, you may want to use iterative write to ensure that you stay within the bounds of your system memory, and use chunking and compression to optimize storage, read, and write of the data.

+

Iterative Writing

+

If experimental data is close to, or exceeds, the available system memory, performance issues may arise. To cope with such large data, DataPipe can utilize iterative writing, where only a portion of the data is compressed and saved first, and additional portions are then appended.

+

To demonstrate, we can create an NWB file with compressed time series data:

+
dataPart1 = randi(250, 1, 10000); % "load" 1/4 of the entire dataset
+fullDataSize = [1 40000]; % this is the size of the TOTAL dataset
+
+% create an nwb structure with required fields
+nwb=NwbFile( ...
+    'session_start_time', datetime('2020-01-01 00:00:00', 'TimeZone', 'local'), ...
+    'identifier', 'ident1', ...
+    'session_description', 'DataPipeTutorial');
+
+% compress the data
+fData_use = types.untyped.DataPipe( ...
+    'data', dataPart1, ...
+    'maxSize', fullDataSize, ...
+    'axis', 2);
+
+%Set the compressed data as a time series
+fdataNWB = types.core.TimeSeries( ...
+    'data', fData_use, ...
+    'data_unit', 'mV', ...
+    'starting_time', 0.0, ...
+    'starting_time_rate', 30.0);
+
+nwb.acquisition.set('time_series', fdataNWB);
+
+nwbExport(nwb, 'DataPipeTutorial_iterate.nwb');
+
+

To append the rest of the data, simply load the NWB file and use the append method:

+
nwb = nwbRead('DataPipeTutorial_iterate.nwb', 'ignorecache'); %load the nwb file with partial data
+
+% "load" each of the remaining 1/4ths of the large dataset
+for i = 2:4 % iterating through parts of data
+    dataPart_i=randi(250, 1, 10000); % faked data chunk as if it was loaded
+    nwb.acquisition.get('time_series').data.append(dataPart_i); % append the loaded data
+end
+
+

The axis argument defines the dimension along which additional data will be appended. In the above example, axis is 2 and maxSize is [1 40000], so each append grows the second dimension and the resulting dataset will be 1x40000. If we instead set axis to 1 (and changed maxSize appropriately), the data would grow along the first dimension instead.

+

TimeSeries Example

+

Following is an example of how to compress and add a timeseries to an NWB file:

+
fData=randi(250, 1, 10000); % create fake data;
+
+% create an NwbFile object with required fields
+nwb=NwbFile(...
+    'session_start_time', datetime(2020, 1, 1, 0, 0, 0, 'TimeZone', 'local'), ...
+    'identifier','ident1', ...
+    'session_description', 'DataPipeTutorial');
+
+ephys_module = types.core.ProcessingModule( ...
+    'description', 'holds processed ephys data');
+
+nwb.processing.set('ephys', ephys_module);
+
+% compress the data
+fData_compressed=types.untyped.DataPipe( ...
+    'data', fData, ...
+    'compressionLevel', 3, ...
+    'chunkSize', [100 1], ...
+    'axis', 1);
+
+% Assign the data to appropriate module and write the NWB file
+fdataNWB=types.core.TimeSeries( ...
+    'data', fData_compressed, ...
+    'data_unit', 'mV', ...
+    'starting_time', 0.0, ...
+    'starting_time_rate', 30.0);
+
+ephys_module.nwbdatainterface.set('data', fdataNWB);
+nwb.processing.set('ephys', ephys_module);
+
+% write the file
+nwbExport(nwb, 'Compressed.nwb');
+
+ +
+ + + diff --git a/docs/source/_static/html/tutorials/dimensionMapNoDataPipes.html b/docs/source/_static/html/tutorials/dimensionMapNoDataPipes.html new file mode 100644 index 00000000..34e8d692 --- /dev/null +++ b/docs/source/_static/html/tutorials/dimensionMapNoDataPipes.html @@ -0,0 +1,92 @@ + +MatNWB <-> HDF5 Dimension Mapping

MatNWB <-> HDF5 Dimension Mapping

This tutorial demonstrates how the dimensions of a MATLAB array map onto a dataset in HDF5. There are two main differences between the way MATLAB and HDF5 represent dimensions:
  1. C-ordering vs F-ordering: HDF5 is C-ordered (row-major), storing the last dimension of an array consecutively, whereas MATLAB is F-ordered (column-major), storing the first dimension consecutively. The result is that a dataset in HDF5 is effectively the transpose of the corresponding array in MATLAB.
  2. 1D data (i.e., vectors): HDF5 can store truly 1-D arrays, but in MATLAB the lowest dimensionality of an array is 2-D.
Because of these differences, the dimensions of datasets are flipped when writing to/from file in MatNWB. Additionally, MATLAB represents 1D vectors in a 2D format, either as row vectors or column vectors, whereas HDF5 treats vectors as truly 1D. Consequently, when a 1D dataset from HDF5 is loaded into MATLAB, it is always represented as a column vector. To avoid unintentional changes in data dimensions, it is therefore recommended to avoid writing row vectors into an NWB file for 1D datasets.
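The round-trip behavior can be sketched as follows (variable names are illustrative):

```matlab
% MATLAB represents vectors as 2D arrays; HDF5 stores them as truly 1D.
rowVec = 1:10;        % size [1 10] in MATLAB
colVec = (1:10)';     % size [10 1] in MATLAB
% Both map onto an HDF5 dataset of size (10,). On read, MatNWB returns
% a column vector, so a row vector written to file comes back transposed.
% Writing the column form avoids that surprise:
assert(isequal(size(colVec), [10 1]))
```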
Contrast this tutorial with the dimensionMapWithDataPipes tutorial that illustrates how vectors are represented differently when using DataPipe objects within VectorData objects.

Create Table

First, create a TimeIntervals table of height 10.
% Define VectorData objects for each column
% 1D column
start_col = types.hdmf_common.VectorData( ...
'description', 'start_times column', ...
'data', (1:10)' ... # maps onto HDF5 dataset of size (10,)
);
% 1D column
stop_col = types.hdmf_common.VectorData( ...
'description', 'stop_times column', ...
'data', (2:11)' ... # maps onto HDF5 dataset of size (10,)
);
% 4D column
randomval_col = types.hdmf_common.VectorData( ...
'description', 'randomvalues column', ...
'data', rand(5,2,3,10) ... # maps onto HDF5 dataset of size (10, 3, 2, 5)
);
 
% 1D column
id_col = types.hdmf_common.ElementIdentifiers('data', int64(0:9)'); % maps onto HDF5 dataset of size (10,)
 
% Create table
trials_table = types.core.TimeIntervals(...
'description', 'test dynamic table column',...
'colnames', {'start_time','stop_time','randomvalues'}, ...
'start_time', start_col, ...
'stop_time', stop_col, ...
'randomvalues', randomval_col, ...
'id', id_col ...
);

Export Table

Create NWB file with TimeIntervals table and export.
% Create NwbFile object with required arguments
file = NwbFile( ...
'session_start_time', datetime('2022-01-01 00:00:00', 'TimeZone', 'local'), ...
'identifier', 'ident1', ...
'session_description', 'test file' ...
);
% Assign to intervals_trials
file.intervals_trials = trials_table;
% Export
nwbExport(file, 'testFileNoDataPipes.nwb');
You can examine the dimensions of the datasets on file using HDFView. Screenshots for this file are below.
Screen Shot 2022-01-07 at 11.07.25 AM.png
Screen Shot 2022-01-07 at 11.07.19 AM.png
+
+ +
\ No newline at end of file diff --git a/docs/source/_static/html/tutorials/dimensionMapWithDataPipes.html b/docs/source/_static/html/tutorials/dimensionMapWithDataPipes.html new file mode 100644 index 00000000..fb17536d --- /dev/null +++ b/docs/source/_static/html/tutorials/dimensionMapWithDataPipes.html @@ -0,0 +1,110 @@ + +MatNWB <-> HDF5 Dimension Mapping

MatNWB <-> HDF5 Dimension Mapping

This tutorial is easier to follow if you have already looked at the dimensionMapNoDataPipes tutorial or if you compare these side by side.
The key difference when using DataPipe instead of VectorData is that 1D data can be represented in HDF5 as 2D, thus allowing you to write either row or column vectors. This is made possible because of the maxSize property of the DataPipe class, which lets you specify a max size for each dimension. By setting the maxSize to [1, N] or [N, 1], vectors in HDF5 are represented as 2D arrays, just like in MATLAB. The flipping of the dimension order still applies, so a row vector in MATLAB becomes a column vector in HDF5 and vice versa.
Please note: The following tutorial mixes row and column vectors and does not produce a valid dynamic table. The tutorial is only meant to showcase how data maps onto HDF5 datasets when using DataPipe objects.

Create Table

First, create an expandable TimeIntervals table of height 10.
% 1D column
start_col = types.hdmf_common.VectorData( ...
'description', 'start times column', ...
'data', types.untyped.DataPipe( ...
'data', 1:10, ... # maps onto HDF5 dataset of size (10, )
'maxSize', Inf ...
) ...
);
% 1D column
stop_col = types.hdmf_common.VectorData( ...
'description', 'stop times column', ...
'data', types.untyped.DataPipe( ...
'data', 1:10, ... # maps onto HDF5 dataset of size (10, 1)
'maxSize', [1 Inf], ...
'axis', 2 ...
) ...
);
% 1D column
cond_col = types.hdmf_common.VectorData( ...
'description', 'condition column', ...
'data', types.untyped.DataPipe( ...
'data', randi(2,10,1), ... # maps onto HDF5 dataset of size (1, 10)
'maxSize', [Inf, 1], ...
'axis', 1 ...
) ...
);
% 4D column
randomval_col = types.hdmf_common.VectorData( ...
'description', 'randomvalues column', ...
'data', types.untyped.DataPipe( ...
'data', rand(5,2,3,10), ... # maps onto HDF5 dataset of size (10, 3, 2, 5)
'maxSize', [5, 2, 3, Inf], ...
'axis', 4 ...
) ...
);
% 1D column
ids_col = types.hdmf_common.ElementIdentifiers( ...
'data', types.untyped.DataPipe( ...
'data', int64(0:9), ... # maps onto HDF5 dataset of size (10, )
'maxSize', Inf ...
) ...
);
% Create table
trials_table = types.core.TimeIntervals(...
'description', 'test dynamic table column',...
'colnames', {'start_time', 'stop_time', 'randomvalues', 'conditions'}, ...
'start_time', start_col, ...
'stop_time', stop_col, ...
'conditions', cond_col, ...
'randomvalues', randomval_col, ...
'id', ids_col ...
);

Export Table

Create NWB file with expandable TimeIntervals table and export.
% Create NwbFile object with required arguments
file = NwbFile( ...
'session_start_time', datetime('2022-01-01 00:00:00', 'TimeZone', 'local'), ...
'identifier', 'ident1', ...
'session_description', 'test file' ...
);
% Assign to intervals_trials
file.intervals_trials = trials_table;
% Export
nwbExport(file, 'testFileWithDataPipes.nwb');
You can examine the dimensions of the datasets on file using HDFView. Screenshots for this file are below.
Screen Shot 2022-01-12 at 1.12.42 PM.png
Screen Shot 2022-01-12 at 1.12.47 PM.png
Screen Shot 2022-01-07 at 4.26.21 PM.png
Screen Shot 2022-01-07 at 4.26.12 PM.png
+
+ +
\ No newline at end of file diff --git a/docs/source/_static/html/tutorials/dynamic_tables.html b/docs/source/_static/html/tutorials/dynamic_tables.html new file mode 100644 index 00000000..af3f6c88 --- /dev/null +++ b/docs/source/_static/html/tutorials/dynamic_tables.html @@ -0,0 +1,577 @@ + +DynamicTables Tutorial

DynamicTables Tutorial

This is a user guide to interacting with DynamicTable objects in MatNWB.
Table of Contents

MatNWB Setup

Start by setting up your MATLAB workspace. The code below adds the directory containing the MatNWB package to the MATLAB search path. MatNWB works by automatically creating API classes based on a defined schema.
%{
path_to_matnwb = '~/Repositories/matnwb'; % change to your own path location
addpath(genpath(path_to_matnwb));
%}

Constructing a table with initialized columns

The DynamicTable class represents a column-based table to which you can add custom columns. It consists of a description, a list of columns, and a list of row IDs. You can create a DynamicTable by first defining the VectorData objects that will make up the columns of the table. Each VectorData object must contain the same number of rows. A list of row IDs may be passed to the DynamicTable using the id argument. Row IDs are a useful way to access row information independent of row location index. The list of row IDs must be cast as an ElementIdentifiers object before being passed to the DynamicTable object. If no value is passed to id, an ElementIdentifiers object with 0-indexed row IDs will be created for you automatically.
MATLAB Syntax Note: Using column vectors is crucial to properly build vectors and tables. When defining individual values, make sure to use a semicolon (;) instead of a comma (,) when defining the data fields of these objects.
col1 = types.hdmf_common.VectorData( ...
'description', 'column #1', ...
'data', [1;2] ...
);
 
col2 = types.hdmf_common.VectorData( ...
'description', 'column #2', ...
'data', {'a';'b'} ...
);
 
my_table = types.hdmf_common.DynamicTable( ...
'description', 'an example table', ...
'colnames', {'col1', 'col2'}, ...
'col1', col1, ...
'col2', col2, ...
'id', types.hdmf_common.ElementIdentifiers('data', [0;1]) ... % 0-indexed, for compatibility with Python
);
my_table
my_table =
DynamicTable with properties:

             id: [1×1 types.hdmf_common.ElementIdentifiers]
       colnames: {'col1'  'col2'}
    description: 'an example table'
     vectordata: [2×1 types.untyped.Set]

Adding rows

You can add rows to an existing DynamicTable using the object's addRow method. One way of using this method is to pass in the names of columns as parameter names followed by the elements to append. The class of the elements of the column must match the elements to append.
my_table.addRow('col1', 3, 'col2', {'c'}, 'id', 2);

Adding columns

You can add new columns to an existing DynamicTable object using the addColumn method. One way of using this method is to pass in the names of each new column followed by the corresponding values for each new column. The height of the new columns must match the height of the table.
col3 = types.hdmf_common.VectorData('description', 'column #3', ...
'data', [100; 200; 300]);
col4 = types.hdmf_common.VectorData('description', 'column #4', ...
'data', {'a1'; 'b2'; 'c3'});
 
my_table.addColumn('col3', col3,'col4', col4);

Create MATLAB table and convert to dynamic table

As an alternative to building a dynamic table using the DynamicTable and VectorData data types, it is also possible to create a MATLAB table and convert it to a dynamic table. Let's create the same table as before, but using MATLAB's table class:
% Create a table with two variables (columns):
T = table([1;2], {'a';'b'}, 'VariableNames', {'col1', 'col2'});
T.Properties.VariableDescriptions = {'column #1', 'column #2'};

Adding rows

T(end+1, :) = {3, 'c'};

Adding variables (columns)

T = addvars(T, [100;200;300], 'NewVariableNames',{'col3'});
T.Properties.VariableDescriptions{3} = 'column #3';
 
% Alternatively, a new variable can be added directly using dot syntax.
T.col4 = {'a1'; 'b2'; 'c3'};
T.Properties.VariableDescriptions{4} = 'column #4';
T
T = 3×4 table
         col1    col2    col3    col4
    1    1       'a'     100     'a1'
    2    2       'b'     200     'b2'
    3    3       'c'     300     'c3'

Convert to dynamic table

dynamic_table = util.table2nwb(T, 'A MATLAB table that was converted to a dynamic table')
dynamic_table =
DynamicTable with properties:

             id: [1×1 types.hdmf_common.ElementIdentifiers]
       colnames: {'col1'  'col2'  'col3'  'col4'}
    description: 'A MATLAB table that was converted to a dynamic table'
     vectordata: [4×1 types.untyped.Set]

Enumerated (categorical) data

EnumData is a special type of column for storing an enumerated data type. This way each unique value is stored once, and the data references those values by index. Using this method is more efficient than storing a single value many times, and has the advantage of communicating to downstream tools that the data is categorical in nature.

Warning Regarding EnumData

EnumData is currently an experimental feature and as such should not be used in a production environment.
CellTypeElements = types.hdmf_common.VectorData(...
'description', 'fixed set of elements referenced by cell_type' ...
, 'data', {'aa', 'bb', 'cc'} ... % the enumerated elements
);
CellType = types.hdmf_experimental.EnumData( ...
'description', 'this column holds categorical variables' ... % properties derived from VectorData
, 'data', [0, 1, 2, 1, 0] ... % zero-indexed offset to elements.
, 'elements', types.untyped.ObjectView(CellTypeElements) ...
);
 
MyTable = types.hdmf_common.DynamicTable('description', 'an example table');
MyTable.vectordata.set('cell_type_elements', CellTypeElements); % the *_elements format is required for compatibility with pynwb
MyTable.addColumn('cell_type', CellType);

Ragged array columns

A table column with a different number of elements for each row is called a "ragged array column." To define a table with a ragged array column, pass both the VectorData and the corresponding VectorIndex as columns of the DynamicTable object. The VectorData column contains the data values, while the VectorIndex column indicates how to arrange the data across rows. By convention, the VectorIndex object corresponding to a particular column must have the same name as that column with the addition of the '_index' suffix.
Below, the VectorIndex values indicate to place the 1st to 3rd (inclusive) elements of the VectorData into the first row and 4th element into the second row. The resulting table will have the cell {'1a'; '1b'; '1c'} in the first row and the cell {'2a'} in the second row.
 
col1 = types.hdmf_common.VectorData( ...
'description', 'column #1', ...
'data', {'1a'; '1b'; '1c'; '2a'} ...
);
 
col1_index = types.hdmf_common.VectorIndex( ...
'description', 'column #1 index', ...
'target',types.untyped.ObjectView(col1), ... % object view of target column
'data', [3; 4] ...
);
 
table_ragged_col = types.hdmf_common.DynamicTable( ...
'description', 'an example table', ...
'colnames', {'col1'}, ...
'col1', col1, ...
'col1_index', col1_index, ...
'id', types.hdmf_common.ElementIdentifiers('data', [0; 1]) ... % 0-indexed, for compatibility with Python
);

Adding ragged array rows

You can add a new row to the ragged array column. Under the hood, the addRow method will add the appropriate value to the VectorIndex column to maintain proper formatting.
table_ragged_col.addRow('col1', {'3a'; '3b'; '3c'}, 'id', 2);

Accessing row elements

You can access data from entire rows of a DynamicTable object by calling the getRow method for the corresponding object. You can supply either an individual row number or a list of row numbers.
my_table.getRow(1)
ans = 1×4 table
         col1    col2    col3    col4
    1    1       'a'     100     'a1'
If you want to access values for just a subset of columns, you can pass in the 'columns' argument along with a cell array of the desired column names:
my_table.getRow(1:3, 'columns', {'col1'})
ans = 3×1 table
         col1
    1    1
    2    2
    3    3
You can also access specific rows by their corresponding row IDs, if they have been defined, by setting the 'useId' parameter to true:
my_table.getRow(1, 'useId', true)
ans = 1×4 table
         col1    col2    col3    col4
    1    2       'b'     200     'b2'
For ragged array columns, the getRow method will return a cell array with a different number of elements for each row:
table_ragged_col.getRow(1:2)
ans = 2×1 table
         col1
    1    [{'1a'};{'1b'};{'1c'}]
    2    1×1 cell

Accessing column elements

To access all rows from a particular column, use the .get method on the vectordata field of the DynamicTable object:
 
my_table.vectordata.get('col2').data
ans = 3×1 cell
'a'
'b'
'c'

Referencing rows of other tables

You can create a column that references rows of other tables by adding a DynamicTableRegion object as a column of a DynamicTable. This is analogous to a foreign key in a relational database. The DynamicTableRegion class takes in an ObjectView object as argument. ObjectView objects create links from one object type referencing another.
dtr_col = types.hdmf_common.DynamicTableRegion( ...
'description', 'references multiple rows of earlier table', ...
'data', [0; 1; 1; 0], ... # 0-indexed
'table',types.untyped.ObjectView(my_table) ... % object view of target table
);
 
data_col = types.hdmf_common.VectorData( ...
'description', 'data column', ...
'data', {'a'; 'b'; 'c'; 'd'} ...
);
 
dtr_table = types.hdmf_common.DynamicTable( ...
'description', 'test table with DynamicTableRegion', ...
'colnames', {'data_col', 'dtr_col'}, ...
'dtr_col', dtr_col, ...
'data_col',data_col, ...
'id',types.hdmf_common.ElementIdentifiers('data', [0; 1; 2; 3]) ...
);

Converting a DynamicTable to a MATLAB table

You can convert a DynamicTable object to a MATLAB table by making use of the object's toTable method. This is a useful way to view the whole table in a human-readable format.
my_table.toTable()
ans = 3×5 table
         id    col1    col2    col3    col4
    1    0     1       'a'     100     'a1'
    2    1     2       'b'     200     'b2'
    3    2     3       'c'     300     'c3'
When the DynamicTable object contains a column that references other tables, you can pass in a Boolean to indicate whether to include just the row indices of the referenced table. Passing in false will result in inclusion of the referenced rows as nested tables.
dtr_table.toTable(false)
ans = 4×3 table
         id    data_col    dtr_col
    1    0     'a'         1×4 table
    2    1     'b'         1×4 table
    3    2     'c'         1×4 table
    4    3     'd'         1×4 table

Creating an expandable table

When using the default HDF5 backend, each column of these tables is an HDF5 Dataset, which by default has a fixed size. This means that once a file is written, it is not possible to add a new row. If you want to be able to save the file, load it, and add more rows to the table, you will need to set this up when you create the VectorData and ElementIdentifiers columns of a DynamicTable. Specifically, you must wrap the column data with a DataPipe object. The DataPipe class takes maxSize and axis as arguments to indicate the maximum desired size of each axis and the axis along which to append, respectively. For example, creating a DataPipe object with a maxSize value of [Inf, 1] indicates that the number of rows may increase indefinitely. In contrast, setting maxSize to [8, 1] would allow the column to grow to a maximum height of 8.
% create NwbFile object with required fields
file= NwbFile( ...
'session_start_time', datetime('2021-01-01 00:00:00', 'TimeZone', 'local'), ...
'identifier', 'ident1', ...
'session_description', 'ExpandableTableTutorial' ...
);
 
% create VectorData objects with DataPipe objects
start_time_exp = types.hdmf_common.VectorData( ...
'description', 'start times column', ...
'data', types.untyped.DataPipe( ...
'data', [1, 2], ... # data must be numerical
'maxSize', Inf ...
) ...
);
 
stop_time_exp = types.hdmf_common.VectorData( ...
'description', 'stop times column', ...
'data', types.untyped.DataPipe( ...
'data', [2, 3], ... #data must be numerical
'maxSize', Inf ...
) ...
);
 
random_exp = types.hdmf_common.VectorData( ...
'description', 'random data column', ...
'data', types.untyped.DataPipe( ...
'data', rand(5, 2), ... #data must be numerical
'maxSize', [5, Inf], ...
'axis', 2 ...
) ...
);
 
ids_exp = types.hdmf_common.ElementIdentifiers( ...
'data', types.untyped.DataPipe( ...
'data', int32([0; 1]), ... # data must be numerical
'maxSize', Inf ...
) ...
);
% create expandable table
colnames = {'start_time', 'stop_time', 'randomvalues'};
file.intervals_trials = types.core.TimeIntervals( ...
'description', 'test expandable dynamic table', ...
'colnames', colnames, ...
'start_time', start_time_exp, ...
'stop_time', stop_time_exp, ...
'randomvalues', random_exp, ...
'id', ids_exp ...
);
% export file
nwbExport(file, 'expandableTableTestFile.nwb');
Now you can read in the file, add more rows, and save it again:
readFile = nwbRead('expandableTableTestFile.nwb', 'ignorecache');
readFile.intervals_trials.addRow( ...
'start_time', 3, ...
'stop_time', 4, ...
'randomvalues', rand(5,1), ...
'id', 2 ...
)
nwbExport(readFile, 'expandableTableTestFile.nwb');
Note: DataPipe objects change how the dimension of the datasets for each column map onto the shape of HDF5 datasets. See README for more details.

Multidimensional Columns

The order of dimensions of multidimensional columns in MatNWB is reversed relative to the Python HDMF package (see README for detailed explanation). Therefore, the height of a multidimensional column belonging to a DynamicTable object is defined by the shape of its last dimension. A valid DynamicTable must have matched height across columns.

Constructing multidimensional columns

% Define 1D column
simple_col = types.hdmf_common.VectorData( ...
'description', '1D column',...
'data', rand(10,1) ...
);
% Define ND column
multi_col = types.hdmf_common.VectorData( ...
'description', 'multidimensional column',...
'data', rand(3,2,10) ...
);
% construct table
multi_dim_table = types.hdmf_common.DynamicTable( ...
'description','test table', ...
'colnames', {'simple','multi'}, ...
'simple', simple_col, ...
'multi', multi_col, ...
'id', types.hdmf_common.ElementIdentifiers('data', (0:9)') ... % 0-indexed, for compatibility with Python
);
 

Multidimensional ragged array columns

DynamicTable objects with multidimensional ragged array columns can be constructed by passing in the corresponding VectorIndex column
% Define column with data
multi_ragged_col = types.hdmf_common.VectorData( ...
'description', 'multidimensional ragged array column',...
'data', rand(2,3,5) ...
);
% Define column with VectorIndex
multi_ragged_index = types.hdmf_common.VectorIndex( ...
'description', 'index to multi_ragged_col', ...
'target', types.untyped.ObjectView(multi_ragged_col),'data', [2; 3; 5] ...
);
 
multi_ragged_table = types.hdmf_common.DynamicTable( ...
'description','test table', ...
'colnames', {'multi_ragged'}, ...
'multi_ragged', multi_ragged_col, ...
'multi_ragged_index', multi_ragged_index, ...
'id', types.hdmf_common.ElementIdentifiers('data', [0; 1; 2]) ... % 0-indexed, for compatibility with Python
);

Adding rows to multidimensional array columns

DynamicTable objects with multidimensional array columns can also be constructed by adding a single row at a time. This approach makes use of DataPipe objects because MATLAB does not support singleton dimensions for arrays with more than 2 dimensions. The code block below demonstrates how to build a DynamicTable object with a multidimensional ragged array column in this manner.
% Create file
file = NwbFile( ...
'session_start_time', datetime('2021-01-01 00:00:00', 'TimeZone', 'local'), ...
'identifier', 'ident1', ...
'session_description', 'test_file' ...
);
 
% Define Vector Data Objects with first row of table
start_time_exp = types.hdmf_common.VectorData( ...
'description', 'start times column', ...
'data', types.untyped.DataPipe( ...
'data', 1, ...
'maxSize', Inf ...
) ...
);
stop_time_exp = types.hdmf_common.VectorData( ...
'description', 'stop times column', ...
'data', types.untyped.DataPipe( ...
'data', 10, ...
'maxSize', Inf ...
) ...
);
random_exp = types.hdmf_common.VectorData( ...
'description', 'random data column', ...
'data', types.untyped.DataPipe( ...
'data', rand(3,2,5), ... #random data
'maxSize', [3, 2, Inf], ...
'axis', 3 ...
) ...
);
random_exp_index = types.hdmf_common.VectorIndex( ...
'description', 'index to random data column', ...
'target',types.untyped.ObjectView(random_exp), ...
'data', types.untyped.DataPipe( ...
'data', uint64(5), ...
'maxSize', Inf ...
) ...
);
ids_exp = types.hdmf_common.ElementIdentifiers( ...
'data', types.untyped.DataPipe( ...
'data', int64(0), ... # data must be numerical
'maxSize', Inf ...
) ...
);
% Create expandable table
colnames = {'start_time', 'stop_time', 'randomvalues'};
file.intervals_trials = types.core.TimeIntervals( ...
'description', 'test expandable dynamic table', ...
'colnames', colnames, ...
'start_time', start_time_exp, ...
'stop_time', stop_time_exp, ...
'randomvalues', random_exp, ...
'randomvalues_index', random_exp_index, ...
'id', ids_exp ...
);
% Export file
nwbExport(file, 'multiRaggedExpandableTableTest.nwb');
% Read in file
read_file = nwbRead('multiRaggedExpandableTableTest.nwb', 'ignorecache');
% add individual rows
read_file.intervals_trials.addRow( ...
'start_time', 2, ...
'stop_time', 20, ...
'randomvalues', rand(3,2,6), ...
'id', 1 ...
);
read_file.intervals_trials.addRow( ...
'start_time', 3, ...
'stop_time', 30, ...
'randomvalues', rand(3,2,3), ...
'id', 2 ...
);
read_file.intervals_trials.addRow( ...
'start_time', 4, ...
'stop_time', 40, ...
'randomvalues', rand(3,2,8), ...
'id', 3 ...
);
 

Learn More!

Python Tutorial

DynamicTable Tutorial
+
+ +
\ No newline at end of file diff --git a/docs/source/_static/html/tutorials/dynamically_loaded_filters.html b/docs/source/_static/html/tutorials/dynamically_loaded_filters.html new file mode 100644 index 00000000..7250e315 --- /dev/null +++ b/docs/source/_static/html/tutorials/dynamically_loaded_filters.html @@ -0,0 +1,141 @@ + +Using Dynamically Loaded Filters in MatMWB

Using Dynamically Loaded Filters in MatNWB

Installing Dynamically Loaded Filters

HDF5 can use various filters to compress data when writing datasets. GZIP is the default filter and can be read by any HDF5 installation without setup, but many users find that other filters, such as Zstd, offer better performance. If you want to read an HDF5 dataset in MATLAB that was compressed using another filter, such as Zstd, you will need to configure MATLAB to use dynamically loaded filters.
The easiest way we have found to set up dynamically loaded filters is to use the Python package hdf5plugin. This package has a sophisticated installation process that compiles several of the most popular dynamically loaded filters and works across popular operating systems. Installing it lets us offload the tricky parts of setting up dynamically loaded filters for MATLAB.

Linux or Mac

1. In your Terminal window, install hdf5plugin:
pip install hdf5plugin
2. In that same Terminal window, set the environment variable HDF5_PLUGIN_PATH:
export HDF5_PLUGIN_PATH=$(python -c "import hdf5plugin; print(hdf5plugin.PLUGINS_PATH)");
3. From that same Terminal window, launch MATLAB:
/Applications/MATLAB_R2021b.app/bin/matlab
The path above is an example of a common location on macOS. The exact path of MATLAB may vary on your computer.

Windows

1. Install hdf5plugin in the Command Prompt:
pip install hdf5plugin
2. Determine the path of the plugin installation. In the Command Prompt, run:
python -c "import hdf5plugin; print(hdf5plugin.PLUGINS_PATH)"
3. Set the environment variable HDF5_PLUGIN_PATH to point to the local installation of the plugins (the value printed above) through System Properties > Advanced > Environment Variables:
path-screenshot.png
4. Restart MATLAB.
That's it! Now you can read datasets compressed with any of the filters bundled by hdf5plugin.
The beauty of HDF5 is that it handles the rest under the hood. When you read a dataset that uses any of these filters, HDF5 will identify the correct decompression algorithm and decompress the data on-the-fly as you access it from the dataset.
For more information about installing filter plugins, see the MATLAB documentation.
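With the plugin path configured, reading works the same as for any other NWB file. A minimal sketch ('test.nwb' here stands in for any NWB file compressed with a dynamically loaded filter, such as the one written later in this tutorial):

```matlab
% Reading a dataset written with a dynamically loaded filter (e.g. Zstd).
% HDF5 locates the decompressor via HDF5_PLUGIN_PATH automatically.
nwb = nwbRead('test.nwb', 'ignorecache');
data = nwb.acquisition.get('ts').data.load();  % decompressed on the fly
```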

Writing with Dynamically Loaded Filters

To write with dynamically loaded filters, first follow the installation steps above. This feature requires MATLAB R2022a or later.
DataPipe objects can be used to write using dynamically loaded filters. This tutorial uses the Zstd filter as an example.
The DynamicFilter property takes in an enumerated type, Filter, which is a hard-coded list of the filter plugins registered with HDF5.
import types.untyped.datapipe.properties.DynamicFilter
import types.untyped.datapipe.dynamic.Filter
import types.untyped.datapipe.properties.Shuffle
 
zstdProperty = DynamicFilter(Filter.ZStandard);

Parameters

Some filter plugins allow for setting special configuration parameters to modify the filter's behavior. The DynamicFilter property type contains a modifiable parameters field which can be used to set your parameters. This is equivalent to setting the cd_values argument in HDF5. In the case of the Zstandard HDF5 plugin, the first (and only) array argument value indicates the compression level.
zstdProperty.parameters = 4; % compression level.

Multiple Filters

You can use multiple dynamic filters by concatenating multiple DynamicFilter properties together. They will be applied in the order in which they appear in the array.
ShuffleProperty = Shuffle();
dynamicProperties = [ShuffleProperty zstdProperty];

Writing

The DataPipe class takes in a keyword argument called filters which is an array of DynamicFilter objects. Supplying a 'filters' argument will deactivate the default GZIP compression.
% We're already compressing using zstd so we should disable
% compressionLevel (gzip).
dataPipe = types.untyped.DataPipe('data', rand(1, 10000), 'filters', dynamicProperties);
 
timeseries = types.core.TimeSeries(...
'data', dataPipe, ...
'data_unit', 'data-unit', ...
'starting_time_rate', 1.0, ...
'starting_time', 0.0);
 
nwbFile = NwbFile(...
'identifier', 'dynamically_loaded_filters_tutorial', ...
'session_description', 'test_datapipe_filters', ...
'session_start_time', datetime("now", 'TimeZone', 'local') );
nwbFile.acquisition.set('ts', timeseries);
 
nwbExport(nwbFile, 'test.nwb');
 
The data is now shuffled and compressed using Zstandard compression with a compression level of 4.
+
+ +
\ No newline at end of file diff --git a/docs/source/_static/html/tutorials/ecephys.html b/docs/source/_static/html/tutorials/ecephys.html new file mode 100644 index 00000000..8d82796e --- /dev/null +++ b/docs/source/_static/html/tutorials/ecephys.html @@ -0,0 +1,1356 @@ + +Neurodata Without Borders Extracellular Electrophysiology Tutorial

Neurodata Without Borders Extracellular Electrophysiology Tutorial

Table of Contents

This tutorial

Create fake data for a hypothetical extracellular electrophysiology experiment. The types of data we will convert are:
It is recommended to first work through the Introduction to MatNWB tutorial, which demonstrates installing MatNWB and creating an NWB file with subject information, animal position, and trials, as well as writing and reading NWB files in MATLAB.

Setting up the NWB File

An NWB file represents a single session of an experiment. Each file must have a session_description, identifier, and session_start_time. Create a new NWBFile object with those and additional metadata. For all MatNWB functions, we use the MATLAB convention of keyword argument pairs, where each argument name is followed by its value.
nwb = NwbFile( ...
'session_description', 'mouse in open exploration',...
'identifier', 'Mouse5_Day3', ...
'session_start_time', datetime(2018, 4, 25, 2, 30, 3, 'TimeZone', 'local'), ...
'timestamps_reference_time', datetime(2018, 4, 25, 3, 0, 45, 'TimeZone', 'local'), ...
'general_experimenter', 'Last Name, First Name', ... % optional
'general_session_id', 'session_1234', ... % optional
'general_institution', 'University of My Institution', ... % optional
'general_related_publications', {'DOI:10.1016/j.neuron.2016.12.011'}); % optional
nwb
nwb =
  NwbFile with properties:

                      nwb_version: '2.7.0'
                 file_create_date: []
                       identifier: 'Mouse5_Day3'
              session_description: 'mouse in open exploration'
               session_start_time: {[2018-04-25T02:30:03.000000+02:00]}
        timestamps_reference_time: {[2018-04-25T03:00:45.000000+02:00]}
                      acquisition: [0×1 types.untyped.Set]
                         analysis: [0×1 types.untyped.Set]
                          general: [0×1 types.untyped.Set]
          general_data_collection: ''
                  general_devices: [0×1 types.untyped.Set]
   general_experiment_description: ''
             general_experimenter: 'Last Name, First Name'
      general_extracellular_ephys: [0×1 types.untyped.Set]
    general_extracellular_ephys_electrodes: []
              general_institution: 'University of My Institution'
      general_intracellular_ephys: [0×1 types.untyped.Set]
    general_intracellular_ephys_experimental_conditions: []
    general_intracellular_ephys_filtering: ''
    general_intracellular_ephys_intracellular_recordings: []
    general_intracellular_ephys_repetitions: []
    general_intracellular_ephys_sequential_recordings: []
    general_intracellular_ephys_simultaneous_recordings: []
    general_intracellular_ephys_sweep_table: []
                 general_keywords: ''
                      general_lab: ''
                    general_notes: ''
             general_optogenetics: [0×1 types.untyped.Set]
           general_optophysiology: [0×1 types.untyped.Set]
             general_pharmacology: ''
                 general_protocol: ''
     general_related_publications: {'DOI:10.1016/j.neuron.2016.12.011'}
               general_session_id: 'session_1234'
                   general_slices: ''
            general_source_script: ''
  general_source_script_file_name: ''
                 general_stimulus: ''
                  general_subject: []
                  general_surgery: ''
                    general_virus: ''
                        intervals: [0×1 types.untyped.Set]
                 intervals_epochs: []
          intervals_invalid_times: []
                 intervals_trials: []
                       processing: [0×1 types.untyped.Set]
                          scratch: [0×1 types.untyped.Set]
            stimulus_presentation: [0×1 types.untyped.Set]
               stimulus_templates: [0×1 types.untyped.Set]
                            units: []

Extracellular Electrophysiology

To store extracellular electrophysiology data, you must first create an electrodes table describing the electrodes that generated the data. Extracellular electrodes are stored in an electrodes table, which is a DynamicTable. The electrodes table has several predefined columns: x, y, z, impedance, location, filtering, and electrode_group.

Electrodes Table

Since this is a DynamicTable, we can add additional metadata fields. We will be adding a "label" column to the table.
numShanks = 4;
numChannelsPerShank = 3;
 
ElectrodesDynamicTable = types.hdmf_common.DynamicTable(...
'colnames', {'location', 'group', 'group_name', 'label'}, ...
'description', 'all electrodes');
 
Device = types.core.Device(...
'description', 'the best array', ...
'manufacturer', 'Probe Company 9000' ...
);
nwb.general_devices.set('array', Device);
for iShank = 1:numShanks
shankGroupName = sprintf('shank%d', iShank);
EGroup = types.core.ElectrodeGroup( ...
'description', sprintf('electrode group for %s', shankGroupName), ...
'location', 'brain area', ...
'device', types.untyped.SoftLink(Device) ...
);
nwb.general_extracellular_ephys.set(shankGroupName, EGroup);
for iElectrode = 1:numChannelsPerShank
ElectrodesDynamicTable.addRow( ...
'location', 'unknown', ...
'group', types.untyped.ObjectView(EGroup), ...
'group_name', shankGroupName, ...
'label', sprintf('%s-electrode%d', shankGroupName, iElectrode));
end
end
ElectrodesDynamicTable.toTable() % Display the table
ans = 12×5 table
         id    location     group             group_name    label
     1    0    'unknown'    1×1 ObjectView    'shank1'      'shank1-electrode1'
     2    1    'unknown'    1×1 ObjectView    'shank1'      'shank1-electrode2'
     3    2    'unknown'    1×1 ObjectView    'shank1'      'shank1-electrode3'
     4    3    'unknown'    1×1 ObjectView    'shank2'      'shank2-electrode1'
     5    4    'unknown'    1×1 ObjectView    'shank2'      'shank2-electrode2'
     6    5    'unknown'    1×1 ObjectView    'shank2'      'shank2-electrode3'
     7    6    'unknown'    1×1 ObjectView    'shank3'      'shank3-electrode1'
     8    7    'unknown'    1×1 ObjectView    'shank3'      'shank3-electrode2'
     9    8    'unknown'    1×1 ObjectView    'shank3'      'shank3-electrode3'
    10    9    'unknown'    1×1 ObjectView    'shank4'      'shank4-electrode1'
    11   10    'unknown'    1×1 ObjectView    'shank4'      'shank4-electrode2'
    12   11    'unknown'    1×1 ObjectView    'shank4'      'shank4-electrode3'
 
nwb.general_extracellular_ephys_electrodes = ElectrodesDynamicTable;

Links

In the loop above, we create ElectrodeGroup objects. The electrodes table then uses an ObjectView in each row to link to the corresponding ElectrodeGroup object. An ObjectView is an object that allows you to create a link from one neurodata type to another.

ElectricalSeries

Voltage data are stored in ElectricalSeries objects. ElectricalSeries is a subclass of TimeSeries specialized for voltage data. In order to create our ElectricalSeries object, we will need to reference a set of rows in the electrodes table to indicate which electrodes were recorded. We will do this by creating a DynamicTableRegion, which is a type of link that allows you to reference specific rows of a DynamicTable, such as the electrodes table, by row indices.
Create a DynamicTableRegion that references all rows of the electrodes table.
electrode_table_region = types.hdmf_common.DynamicTableRegion( ...
'table', types.untyped.ObjectView(ElectrodesDynamicTable), ...
'description', 'all electrodes', ...
'data', (0:length(ElectrodesDynamicTable.id.data)-1)');
Now create an ElectricalSeries object to hold acquisition data collected during the experiment.
electrical_series = types.core.ElectricalSeries( ...
'starting_time', 0.0, ... % seconds
'starting_time_rate', 30000., ... % Hz
'data', randn(12, 3000), ...
'electrodes', electrode_table_region, ...
'data_unit', 'volts');
This is the voltage data recorded directly from our electrodes, so it goes in the acquisition group.
nwb.acquisition.set('ElectricalSeries', electrical_series);

LFP

Local field potential (LFP) refers in this case to data that has been downsampled and/or filtered from the original acquisition data and is used to analyze signals in the lower frequency range. Filtered and downsampled LFP data would also be stored in an ElectricalSeries. To help data analysis and visualization tools know that this ElectricalSeries object represents LFP data, store it inside an LFP object, then place the LFP object in a ProcessingModule named 'ecephys'. This is analogous to how we stored the SpatialSeries object inside of a Position object and stored the Position object in a ProcessingModule named 'behavior' earlier.
electrical_series = types.core.ElectricalSeries( ...
'starting_time', 0.0, ... % seconds
'starting_time_rate', 1000., ... % Hz
'data', randn(12, 100), ...
'electrodes', electrode_table_region, ...
'data_unit', 'volts');
 
lfp = types.core.LFP('ElectricalSeries', electrical_series);
 
ecephys_module = types.core.ProcessingModule(...
'description', 'extracellular electrophysiology');
 
ecephys_module.nwbdatainterface.set('LFP', lfp);
nwb.processing.set('ecephys', ecephys_module);

Sorted Spike Times

Ragged Arrays

Spike times are stored in another DynamicTable of subtype Units. The default Units table is at /units in the HDF5 file. You can add columns to the Units table just like you did for electrodes and trials. Here, we generate some random spike data and populate the table.
num_cells = 10;
firing_rate = 20;
spikes = cell(1, num_cells);
for iUnit = 1:num_cells
spikes{iUnit} = rand(1, randi([16, 28]));
end
spikes
spikes = 1×10 cell
    {1×21 double}  {1×24 double}  {1×18 double}  {1×28 double}  {1×25 double}  {1×18 double}  {1×21 double}  {1×28 double}  {1×16 double}  {1×19 double}
Spike times are an example of a ragged array: like a matrix, except each row can have a different number of elements. We can represent this type of data as an indexed column of the units DynamicTable. An indexed column has two components: a vector data object that holds the values, and a vector index object that holds the indices marking where each row ends. You can use the convenience function util.create_indexed_column to create these objects.
[spike_times_vector, spike_times_index] = util.create_indexed_column(spikes);
 
nwb.units = types.core.Units( ...
'colnames', {'spike_times'}, ...
'description', 'units table', ...
'spike_times', spike_times_vector, ...
'spike_times_index', spike_times_index ...
);
 
nwb.units.toTable
ans = 10×2 table
         id    spike_times
     1    1    21×1 double
     2    2    24×1 double
     3    3    18×1 double
     4    4    28×1 double
     5    5    25×1 double
     6    6    18×1 double
     7    7    21×1 double
     8    8    28×1 double
     9    9    16×1 double
    10   10    19×1 double

Unsorted Spike Times

While the Units table stores spike times and waveform data for spike-sorted, single-unit activity, you may also want to store spike times and waveform snippets of unsorted spiking activity. This is useful for recording multi-unit activity detected via threshold crossings during data acquisition. Such information can be stored using SpikeEventSeries objects.
% In the SpikeEventSeries the dimensions should be ordered as
% [num_events, num_channels, num_samples].
% Define spike snippets: 20 events, 3 channels, 40 samples per event.
spike_snippets = rand(20, 3, 40);
% Permute spike snippets (See dimensionMapNoDataPipes tutorial)
spike_snippets = permute(spike_snippets, [3,2,1])
spike_snippets =
  40×3×20 double array of random waveform snippets (lengthy numeric display omitted)
 
% Create electrode table region referencing electrodes 0, 1, and 2
shank0_table_region = types.hdmf_common.DynamicTableRegion( ...
'table', types.untyped.ObjectView(ElectrodesDynamicTable), ...
'description', 'shank0', ...
'data', (0:2)');
 
% Define spike event series for unsorted spike times
spike_events = types.core.SpikeEventSeries( ...
'data', spike_snippets, ...
'timestamps', (0:19)', ... % Timestamps for each event
'description', 'events detected with 100uV threshold', ...
'electrodes', shank0_table_region ...
);
 
% Add spike event series to NWB file acquisition
nwb.acquisition.set('SpikeEvents_Shank0', spike_events);

Designating Electrophysiology Data

As mentioned above, ElectricalSeries objects are meant for storing specific types of extracellular recordings. In addition to this TimeSeries subclass, NWB provides several data interface classes for designating the type of data you are storing. We will briefly discuss them here, and refer the reader to the API documentation and the Intro to NWB tutorial for more details on using these objects.
For storing unsorted spiking data, there are two options. Which one you choose depends on what data you have available. If you need to store complete and/or continuous raw voltage traces, you should store the traces with ElectricalSeries objects as acquisition data, and use the EventDetection class to identify the spike events in your raw traces. If you do not want to store the raw voltage traces and only want the waveform 'snippets' surrounding spike events, you should use SpikeEventSeries objects.
The results of spike sorting (or clustering) should be stored in the top-level Units table. The Units table can hold just the spike times of sorted units or, optionally, include additional waveform information. You can use the optional predefined columns waveform_mean, waveform_sd, and waveforms in the Units table to store individual and mean waveform data.
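As an illustrative sketch, a mean waveform per unit can be stored in the predefined waveform_mean column, reusing spike_times_vector and spike_times_index from above. The waveform values here are fake, and wrapping them in VectorData is an assumption about the expected input type; check the MatNWB Units documentation for your version.

```matlab
% Fake mean waveforms: one 30-sample mean waveform per unit (10 units).
waveform_means = rand(10, 30);

nwb.units = types.core.Units( ...
    'colnames', {'spike_times', 'waveform_mean'}, ...
    'description', 'units table with mean waveforms', ...
    'spike_times', spike_times_vector, ...
    'spike_times_index', spike_times_index, ...
    'waveform_mean', types.hdmf_common.VectorData( ...
        'data', waveform_means, ...
        'description', 'mean waveform for each unit') ...
);
```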
For local field potential data, there are two options. Again, which one you choose depends on what data you have available. With both options, you should store your traces with ElectricalSeries objects. If you are storing unfiltered local field potential data, you should store the ElectricalSeries objects in LFP data interface object(s). If you have filtered LFP data, you should store the ElectricalSeries objects in FilteredEphys data interface object(s).
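A hedged sketch of the FilteredEphys option, reusing electrode_table_region and ecephys_module from earlier in this tutorial. The data and the filtering description are fake, and the 'filtering' property on ElectricalSeries is an assumption that may not exist in older schema versions.

```matlab
% Band-pass filtered traces go in a FilteredEphys container instead of LFP.
filtered_series = types.core.ElectricalSeries( ...
    'starting_time', 0.0, ... % seconds
    'starting_time_rate', 1000., ... % Hz
    'data', randn(12, 100), ... % fake filtered data
    'electrodes', electrode_table_region, ...
    'filtering', 'Band-pass filtered between 1 and 300 Hz', ...
    'data_unit', 'volts');

filtered_ephys = types.core.FilteredEphys();
filtered_ephys.electricalseries.set('FilteredElectricalSeries', filtered_series);
ecephys_module.nwbdatainterface.set('FilteredEphys', filtered_ephys);
```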

Writing the NWB File

nwbExport(nwb, 'ecephys_tutorial.nwb')

Reading NWB Data

Data arrays are read lazily from the file. Accessing TimeSeries.data does not read the data values, but returns a DataStub object that can be indexed to read data on demand. This allows you to conveniently work with datasets that are too large to fit in RAM all at once. Calling load with no input arguments reads the entire dataset:
nwb2 = nwbRead('ecephys_tutorial.nwb', 'ignorecache');
nwb2.processing.get('ecephys'). ...
nwbdatainterface.get('LFP'). ...
electricalseries.get('ElectricalSeries'). ...
data.load;

Accessing Data Regions

If all you need is a data region, you can index a DataStub object like you would any normal array in MATLAB, as shown below. When indexing the dataset this way, only the selected region is read from disk into RAM. This allows you to handle very large datasets that would not fit entirely into RAM.
% read section of LFP
nwb2.processing.get('ecephys'). ...
nwbdatainterface.get('LFP'). ...
electricalseries.get('ElectricalSeries'). ...
data(1:5, 1:10)
ans = 5×10
   -1.0039   -0.5621   -1.1019    0.1768    0.2032    0.1612   -0.5518    0.8552   -1.3040   -0.5646
   -0.3458   -1.2921   -0.1967    1.7260   -1.5245    1.3653   -0.6380    0.8438   -0.7094    0.7466
    0.2758   -0.3401    0.3549    0.4890   -0.2288    0.1290    2.1648    0.1316   -0.2172    0.3036
   -0.8548   -1.5282   -0.0919   -0.1388    1.7996   -0.2845   -1.1904    0.5773   -0.3059   -0.9745
    0.1536   -0.2051    2.4873    1.0999    0.6398   -0.1086   -0.2511   -0.0993    1.3019    0.0095
 
% You can use the getRow method of the table to load spike times of a specific unit.
% To get the values, unpack from the returned table.
nwb.units.getRow(1).spike_times{1}
ans = 21×1
    0.8383
    0.6321
    0.2418
    0.2965
    0.9865
    0.2779
    0.9945
    0.5980
    0.1216
    0.1694

Learn more!

See the API documentation to learn what data types are available.

MATLAB tutorials

Python tutorials

See our tutorials for more details about your data type:
Check out other tutorials that teach advanced NWB topics:


Intracellular electrophysiology

The following tutorial describes storage of intracellular electrophysiology data in NWB. NWB supports storage of the time series describing the stimulus and response, information about the electrode and device used, as well as metadata about the organization of the experiment.
Illustration of the hierarchy of metadata tables used to describe the organization of intracellular electrophysiology experiments.

Creating an NWBFile

When creating an NWB file, the first step is to create the NWBFile object, which you can do using the NwbFile command.
session_start_time = datetime(2018, 3, 1, 12, 0, 0, 'TimeZone', 'local');
 
 
nwbfile = NwbFile( ...
'session_description', 'my first synthetic recording', ...
'identifier', 'EXAMPLE_ID', ...
'session_start_time', session_start_time, ...
'general_experimenter', 'Dr. Bilbo Baggins', ...
'general_lab', 'Bag End Laboratory', ...
'general_institution', 'University of Middle Earth at the Shire', ...
'general_experiment_description', 'I went on an adventure with thirteen dwarves to reclaim vast treasures.', ...
'general_session_id', 'LONELYMTN' ...
);
 

Device metadata

Device metadata is represented by Device objects.
 
device = types.core.Device();
nwbfile.general_devices.set('Heka ITC-1600', device);

Electrode metadata

Intracellular electrode metadata is represented by IntracellularElectrode objects. Create an electrode object, which requires a link to the device of the previous step. Then add it to the NWB file.
electrode = types.core.IntracellularElectrode( ...
'description', 'a mock intracellular electrode', ...
'device', types.untyped.SoftLink(device), ...
'cell_id', 'a very interesting cell' ...
);
nwbfile.general_intracellular_ephys.set('elec0', electrode);

Stimulus and response data

Intracellular stimulus and response data are represented with subclasses of PatchClampSeries. A stimulus is described by a time series representing voltage or current stimulation with a particular set of parameters. There are two classes for representing stimulus data: VoltageClampStimulusSeries and CurrentClampStimulusSeries.
The response is then described by a time series representing voltage or current recorded from a single cell using a single intracellular electrode, via one of the VoltageClampSeries, CurrentClampSeries, or IZeroClampSeries classes.
Below we create a simple example stimulus/response recording data pair.
ccss = types.core.VoltageClampStimulusSeries( ...
'data', [1, 2, 3, 4, 5], ...
'starting_time', 123.6, ...
'starting_time_rate', 10e3, ...
'electrode', types.untyped.SoftLink(electrode), ...
'gain', 0.02, ...
'sweep_number', uint64(15), ...
'stimulus_description', 'N/A' ...
);
 
nwbfile.stimulus_presentation.set('ccss', ccss);
 
vcs = types.core.VoltageClampSeries( ...
'data', [0.1, 0.2, 0.3, 0.4, 0.5], ...
'data_conversion', 1e-12, ...
'data_resolution', NaN, ...
'starting_time', 123.6, ...
'starting_time_rate', 20e3, ...
'electrode', types.untyped.SoftLink(electrode), ...
'gain', 0.02, ...
'capacitance_slow', 100e-12, ...
'resistance_comp_correction', 70.0, ...
'stimulus_description', 'N/A', ...
'sweep_number', uint64(15) ...
);
nwbfile.acquisition.set('vcs', vcs);

Adding an intracellular recording

The IntracellularRecordingsTable relates electrode, stimulus and response pairs and describes metadata specific to individual recordings.
Illustration of the structure of the IntracellularRecordingsTable
We can add an IntracellularRecordingsTable and add the IntracellularElectrodesTable, IntracellularStimuliTable, and IntracellularResponsesTable to it, then add them all to the NWBFile object.
ic_rec_table = types.core.IntracellularRecordingsTable( ...
'categories', {'electrodes', 'stimuli', 'responses'}, ...
'colnames', {'recordings_tag'}, ...
'description', [ ...
'A table to group together a stimulus and response from a single ', ...
'electrode and a single simultaneous recording and for storing ', ...
'metadata about the intracellular recording.'], ...
'id', types.hdmf_common.ElementIdentifiers('data', int64([0, 1, 2])), ...
'recordings_tag', types.hdmf_common.VectorData( ...
'data', repmat({'Tag'}, 3, 1), ...
'description', 'Column for storing a custom recordings tag' ...
) ...
);
 
ic_rec_table.electrodes = types.core.IntracellularElectrodesTable( ...
'description', 'Table for storing intracellular electrode related metadata.', ...
'colnames', {'electrode'}, ...
'id', types.hdmf_common.ElementIdentifiers( ...
'data', int64([0, 1, 2]) ...
), ...
'electrode', types.hdmf_common.VectorData( ...
'data', repmat(types.untyped.ObjectView(electrode), 3, 1), ...
'description', 'Column for storing the reference to the intracellular electrode' ...
) ...
);
 
ic_rec_table.stimuli = types.core.IntracellularStimuliTable( ...
'description', 'Table for storing intracellular stimulus related metadata.', ...
'colnames', {'stimulus'}, ...
'id', types.hdmf_common.ElementIdentifiers( ...
'data', int64([0, 1, 2]) ...
), ...
'stimulus', types.core.TimeSeriesReferenceVectorData( ...
'description', 'Column storing the reference to the recorded stimulus for the recording (rows)', ...
'data', struct( ...
'idx_start', [0, 1, -1], ...
'count', [5, 3, -1], ...
'timeseries', [ ...
types.untyped.ObjectView(ccss), ...
types.untyped.ObjectView(ccss), ...
types.untyped.ObjectView(vcs) ...
] ...
)...
)...
);
 
ic_rec_table.responses = types.core.IntracellularResponsesTable( ...
'description', 'Table for storing intracellular response related metadata.', ...
'colnames', {'response'}, ...
'id', types.hdmf_common.ElementIdentifiers( ...
'data', int64([0, 1, 2]) ...
), ...
'response', types.core.TimeSeriesReferenceVectorData( ...
'description', 'Column storing the reference to the recorded response for the recording (rows)', ...
'data', struct( ...
'idx_start', [0, 2, 0], ...
'count', [5, 3, 5], ...
'timeseries', [ ...
types.untyped.ObjectView(vcs), ...
types.untyped.ObjectView(vcs), ...
types.untyped.ObjectView(vcs) ...
] ...
)...
)...
);
 
The IntracellularRecordingsTable is not just a DynamicTable but an AlignedDynamicTable. An AlignedDynamicTable is itself a DynamicTable that may contain an arbitrary number of additional DynamicTable objects, each of which defines a "category." This is similar to a table with "sub-headings". In the case of the IntracellularRecordingsTable, there are three predefined categories: electrodes, stimuli, and responses. We can also dynamically add new categories to the table. Because each category corresponds to a DynamicTable, this means we have to create a new DynamicTable and add it to our table.
% add category
ic_rec_table.categories = [ic_rec_table.categories, {'recording_lab_data'}];
ic_rec_table.dynamictable.set( ...
'recording_lab_data', types.hdmf_common.DynamicTable( ...
'description', 'category table for lab-specific recording metadata', ...
'colnames', {'location'}, ...
'id', types.hdmf_common.ElementIdentifiers( ...
'data', int64([0, 1, 2]) ...
), ...
'location', types.hdmf_common.VectorData( ...
'data', {'Mordor', 'Gondor', 'Rohan'}, ...
'description', 'Recording location in Middle Earth' ...
) ...
) ...
);
In an AlignedDynamicTable all category tables must align with the main table, i.e., all tables must have the same number of rows and rows are expected to correspond to each other by index.
We can also add custom columns to any of the subcategory tables, i.e., the electrodes, stimuli, and responses tables, and any custom subcategory tables. All we need to do is indicate the name of the category we want to add the column to.
% Add voltage threshold as column of electrodes table
ic_rec_table.electrodes.colnames = [ic_rec_table.electrodes.colnames {'voltage_threshold'}];
ic_rec_table.electrodes.vectordata.set('voltage_threshold', types.hdmf_common.VectorData( ...
'data', [0.1, 0.12, 0.13], ...
'description', 'Just an example column on the electrodes category table' ...
) ...
);
 
nwbfile.general_intracellular_ephys_intracellular_recordings = ic_rec_table;

Hierarchical organization of recordings

To describe the organization of intracellular experiments, the metadata is organized hierarchically in a sequence of tables. All of these tables are so-called DynamicTables, enabling users to add columns for custom metadata. Storing data in hierarchical tables has the advantage that it avoids duplication of metadata. For example, for a single experiment we only need to describe the metadata that is constant across an experimental condition as a single row in the SimultaneousRecordingsTable, without having to replicate the same information across all repetitions and sequential, simultaneous, and individual intracellular recordings. For analysis, this means that we can easily focus on individual aspects of an experiment while still being able to access information from related tables. All of these tables are optional, but to use one you must also use all of the lower-level tables, even if you only need a single row.

Add a simultaneous recording

The SimultaneousRecordingsTable groups together intracellular recordings from the IntracellularRecordingsTable that were recorded simultaneously from different electrodes and/or cells, and describes metadata that is constant across the simultaneous recordings. In practice, a simultaneous recording is often also referred to as a sweep. This example adds a custom column, "simultaneous_recording_tag."
% create simultaneous recordings table with custom column
% 'simultaneous_recording_tag'
 
[recordings_vector_data, recordings_vector_index] = util.create_indexed_column( ...
{[0, 1, 2],}, ...
'Column with references to one or more rows in the IntracellularRecordingsTable table', ...
ic_rec_table);
 
ic_sim_recs_table = types.core.SimultaneousRecordingsTable( ...
'description', [ ...
'A table for grouping different intracellular recordings from ', ...
'the IntracellularRecordingsTable table together that were recorded ', ...
'simultaneously from different electrodes.'...
], ...
'colnames', {'recordings', 'simultaneous_recording_tag'}, ...
'id', types.hdmf_common.ElementIdentifiers( ...
'data', int64(12) ...
), ...
'recordings', recordings_vector_data, ...
'recordings_index', recordings_vector_index, ...
'simultaneous_recording_tag', types.hdmf_common.VectorData( ...
'description', 'A custom tag for simultaneous_recordings', ...
'data', {'LabTag1'} ...
) ...
);
 
Depending on the lab workflow, it may be useful to add complete columns to a table after we have already populated the table with rows. That would be done like so:
ic_sim_recs_table.colnames = [ic_sim_recs_table.colnames, {'simultaneous_recording_type'}];
ic_sim_recs_table.vectordata.set( ...
'simultaneous_recording_type', types.hdmf_common.VectorData(...
'description', 'Description of the type of simultaneous_recording', ...
'data', {'SimultaneousRecordingType1'} ...
) ...
);
 
nwbfile.general_intracellular_ephys_simultaneous_recordings = ic_sim_recs_table;

Add a sequential recording

The SequentialRecordingsTable groups together simultaneous recordings from the SimultaneousRecordingsTable and describes metadata that is constant across the grouped recordings. In practice, a sequential recording is often also referred to as a sweep sequence. A common use of sequential recordings is to group together simultaneous recordings where stimuli of the same type with varying parameters have been presented in sequence (e.g., a sequence of square waveforms with varying amplitude).
[simultaneous_recordings_vector_data, simultaneous_recordings_vector_index] = util.create_indexed_column( ...
{0,}, ...
'Column with references to one or more rows in the SimultaneousRecordingsTable table', ...
ic_sim_recs_table);
 
sequential_recordings = types.core.SequentialRecordingsTable( ...
'description', [ ...
'A table for grouping different intracellular recording ', ...
'simultaneous_recordings from the SimultaneousRecordingsTable ', ...
'table together. This is typically used to group together ', ...
'simultaneous_recordings where a sequence of stimuli of ', ...
'the same type with varying parameters have been presented in ', ...
'a sequence.' ...
], ...
'colnames', {'simultaneous_recordings', 'stimulus_type'}, ...
'id', types.hdmf_common.ElementIdentifiers( ...
'data', int64(15) ...
), ...
'simultaneous_recordings', simultaneous_recordings_vector_data, ...
'simultaneous_recordings_index', simultaneous_recordings_vector_index, ...
'stimulus_type', types.hdmf_common.VectorData( ...
'description', 'Column storing the type of stimulus used for the sequential recording', ...
'data', {'square'} ...
) ...
);
 
nwbfile.general_intracellular_ephys_sequential_recordings = sequential_recordings;

Add repetitions table

The RepetitionsTable groups sequential recordings from the SequentialRecordingsTable. In practice, a repetition is often also referred to as a run. A typical use of the RepetitionsTable is to group sets of different stimuli that are applied in sequence and that may be repeated.
[sequential_recordings_vector_data, sequential_recordings_vector_index] = util.create_indexed_column( ...
{0,}, ...
'Column with references to one or more rows in the SequentialRecordingsTable table', ...
sequential_recordings);
 
 
nwbfile.general_intracellular_ephys_repetitions = types.core.RepetitionsTable( ...
'description', [ ...
'A table for grouping different intracellular recording sequential ', ...
'recordings together. With each SimultaneousRecording typically ', ...
'representing a particular type of stimulus, the RepetitionsTable ', ...
'table is typically used to group sets of stimuli applied in sequence.' ...
], ...
'colnames', {'sequential_recordings'}, ...
'id', types.hdmf_common.ElementIdentifiers( ...
'data', int64(17) ...
), ...
'sequential_recordings', sequential_recordings_vector_data, ...
'sequential_recordings_index', sequential_recordings_vector_index ...
);

Add experimental condition table

The ExperimentalConditionsTable groups together repetitions of intracellular recordings from the RepetitionsTable that belong to the same experimental conditions.
[repetitions_vector_data, repetitions_vector_index] = util.create_indexed_column( ...
{0, 0}, ...
'Column with references to one or more rows in the RepetitionsTable table', ...
nwbfile.general_intracellular_ephys_repetitions);
 
nwbfile.general_intracellular_ephys_experimental_conditions = types.core.ExperimentalConditionsTable( ...
'description', [ ...
'A table for grouping different intracellular recording ', ...
'repetitions together that belong to the same experimental ', ...
'conditions.' ...
], ...
'colnames', {'repetitions', 'tag'}, ...
'id', types.hdmf_common.ElementIdentifiers( ...
'data', int64([19, 21]) ...
), ...
'repetitions', repetitions_vector_data, ...
'repetitions_index', repetitions_vector_index, ...
'tag', types.hdmf_common.VectorData( ...
'description', 'integer tag for an experimental condition', ...
'data', [1,3] ...
) ...
);

Write the NWB file

nwbExport(nwbfile, 'test_new_icephys.nwb');

Read the NWB file

nwbfile2 = nwbRead('test_new_icephys.nwb', 'ignorecache')
nwbfile2 =
NwbFile with properties:

  nwb_version: '2.7.0'
  file_create_date: [1×1 types.untyped.DataStub]
  identifier: 'EXAMPLE_ID'
  session_description: 'my first synthetic recording'
  session_start_time: [1×1 types.untyped.DataStub]
  timestamps_reference_time: [1×1 types.untyped.DataStub]
  acquisition: [1×1 types.untyped.Set]
  analysis: [0×1 types.untyped.Set]
  general: [0×1 types.untyped.Set]
  general_data_collection: ''
  general_devices: [1×1 types.untyped.Set]
  general_experiment_description: 'I went on an adventure with thirteen dwarves to reclaim vast treasures.'
  general_experimenter: [1×1 types.untyped.DataStub]
  general_extracellular_ephys: [0×1 types.untyped.Set]
  general_extracellular_ephys_electrodes: []
  general_institution: 'University of Middle Earth at the Shire'
  general_intracellular_ephys: [1×1 types.untyped.Set]
  general_intracellular_ephys_experimental_conditions: [1×1 types.core.ExperimentalConditionsTable]
  general_intracellular_ephys_filtering: ''
  general_intracellular_ephys_intracellular_recordings: [1×1 types.core.IntracellularRecordingsTable]
  general_intracellular_ephys_repetitions: [1×1 types.core.RepetitionsTable]
  general_intracellular_ephys_sequential_recordings: [1×1 types.core.SequentialRecordingsTable]
  general_intracellular_ephys_simultaneous_recordings: [1×1 types.core.SimultaneousRecordingsTable]
  general_intracellular_ephys_sweep_table: []
  general_keywords: ''
  general_lab: 'Bag End Laboratory'
  general_notes: ''
  general_optogenetics: [0×1 types.untyped.Set]
  general_optophysiology: [0×1 types.untyped.Set]
  general_pharmacology: ''
  general_protocol: ''
  general_related_publications: ''
  general_session_id: 'LONELYMTN'
  general_slices: ''
  general_source_script: ''
  general_source_script_file_name: ''
  general_stimulus: ''
  general_subject: []
  general_surgery: ''
  general_virus: ''
  intervals: [0×1 types.untyped.Set]
  intervals_epochs: []
  intervals_invalid_times: []
  intervals_trials: []
  processing: [0×1 types.untyped.Set]
  scratch: [0×1 types.untyped.Set]
  stimulus_presentation: [1×1 types.untyped.Set]
  stimulus_templates: [0×1 types.untyped.Set]
  units: []

Storing Image Data in NWB

Image data can be a collection of individual images or movie segments (as a movie is simply a series of images) about the subject, the environment, the presented stimuli, or other parts of the experiment. This tutorial focuses in particular on the usage of OpticalSeries, AbstractFeatureSeries, ImageSeries, and the static image types (RGBAImage, RGBImage, and GrayscaleImage).

Create an NWB File

nwb = NwbFile( ...
'session_description', 'mouse in open exploration',...
'identifier', 'Mouse5_Day3', ...
'session_start_time', datetime(2018, 4, 25, 2, 30, 3, 'TimeZone', 'local'), ...
'timestamps_reference_time', datetime(2018, 4, 25, 3, 0, 45, 'TimeZone', 'local'), ...
'general_experimenter', 'LastName, FirstName', ... % optional
'general_session_id', 'session_1234', ... % optional
'general_institution', 'University of My Institution', ... % optional
'general_related_publications', 'DOI:10.1016/j.neuron.2016.12.011' ... % optional
);
nwb
nwb =
NwbFile with properties:

  nwb_version: '2.7.0'
  file_create_date: []
  identifier: 'Mouse5_Day3'
  session_description: 'mouse in open exploration'
  session_start_time: {[2018-04-25T02:30:03.000000+02:00]}
  timestamps_reference_time: {[2018-04-25T03:00:45.000000+02:00]}
  acquisition: [0×1 types.untyped.Set]
  analysis: [0×1 types.untyped.Set]
  general: [0×1 types.untyped.Set]
  general_data_collection: ''
  general_devices: [0×1 types.untyped.Set]
  general_experiment_description: ''
  general_experimenter: 'LastName, FirstName'
  general_extracellular_ephys: [0×1 types.untyped.Set]
  general_extracellular_ephys_electrodes: []
  general_institution: 'University of My Institution'
  general_intracellular_ephys: [0×1 types.untyped.Set]
  general_intracellular_ephys_experimental_conditions: []
  general_intracellular_ephys_filtering: ''
  general_intracellular_ephys_intracellular_recordings: []
  general_intracellular_ephys_repetitions: []
  general_intracellular_ephys_sequential_recordings: []
  general_intracellular_ephys_simultaneous_recordings: []
  general_intracellular_ephys_sweep_table: []
  general_keywords: ''
  general_lab: ''
  general_notes: ''
  general_optogenetics: [0×1 types.untyped.Set]
  general_optophysiology: [0×1 types.untyped.Set]
  general_pharmacology: ''
  general_protocol: ''
  general_related_publications: 'DOI:10.1016/j.neuron.2016.12.011'
  general_session_id: 'session_1234'
  general_slices: ''
  general_source_script: ''
  general_source_script_file_name: ''
  general_stimulus: ''
  general_subject: []
  general_surgery: ''
  general_virus: ''
  intervals: [0×1 types.untyped.Set]
  intervals_epochs: []
  intervals_invalid_times: []
  intervals_trials: []
  processing: [0×1 types.untyped.Set]
  scratch: [0×1 types.untyped.Set]
  stimulus_presentation: [0×1 types.untyped.Set]
  stimulus_templates: [0×1 types.untyped.Set]
  units: []

OpticalSeries: Storing series of images as stimuli

OpticalSeries is for time series of images that were presented to the subject as stimuli. We will create an OpticalSeries object with the name "StimulusPresentation" representing what images were shown to the subject and at what times.
Image data can be stored either in the HDF5 file or as an external image file. For this tutorial, we will use fake image data with shape of ('time', 'x', 'y', 'RGB') = (200, 50, 50, 3). As in all TimeSeries, the first dimension is time. The second and third dimensions represent x and y. The fourth dimension represents the RGB value (length of 3) for color images. Please note: As described in the dimensionMapNoDataPipes tutorial, when a MATLAB array is exported to HDF5, the array is transposed. Therefore, in order to correctly export the data, we will need to create a transposed array, where the dimensions are in reverse order compared to the type specification.
NWB differentiates between acquired data and data that was presented as stimulus. We can add it to the NWBFile object as stimulus data.
If the sampling rate is constant, use rate and starting_time to specify time. For irregularly sampled recordings, use timestamps to specify time for each sample image.
image_data = randi(255, [3, 50, 50, 200]); % NB: Array is transposed
optical_series = types.core.OpticalSeries( ...
'distance', 0.7, ... % required
'field_of_view', [0.2, 0.3, 0.7], ... % required
'orientation', 'lower left', ... % required
'data', image_data, ...
'data_unit', 'n.a.', ...
'starting_time_rate', 1.0, ...
'starting_time', 0.0, ...
'description', 'The images presented to the subject as stimuli' ...
);
 
nwb.stimulus_presentation.set('StimulusPresentation', optical_series);

AbstractFeatureSeries: Storing features of visual stimuli

While it is usually recommended to store the entire image data as an OpticalSeries, sometimes it is useful to store features of the visual stimuli instead of or in addition to the raw image data. For example, you may want to store the mean luminance of the image, the contrast, or the spatial frequency. This can be done using an instance of AbstractFeatureSeries. This class is a general container for storing time series of features that are derived from the raw image data.
% Create some fake feature data
feature_data = rand(3, 200); % 200 time points, 3 features
 
% Create an AbstractFeatureSeries object
abstract_feature_series = types.core.AbstractFeatureSeries( ...
'data', feature_data, ...
'timestamps', linspace(0, 1, 200), ...
'description', 'Features of the visual stimuli', ...
'features', {'luminance', 'contrast', 'spatial frequency'}, ...
'feature_units', {'n.a.', 'n.a.', 'cycles/degree'} ...
);
% Add the AbstractFeatureSeries to the NWBFile
nwb.stimulus_presentation.set('StimulusFeatures', abstract_feature_series);

ImageSeries: Storing series of images as acquisition

ImageSeries is a general container for time series of images acquired during the experiment. Image data can be stored either in the HDF5 file or as an external image file. When color images are stored in the HDF5 file the color channel order is expected to be RGB.
image_data = randi(255, [3, 50, 50, 200]);
behavior_images = types.core.ImageSeries( ...
'data', image_data, ...
'description', 'Image data of an animal in environment', ...
'data_unit', 'n.a.', ...
'starting_time_rate', 1.0, ...
'starting_time', 0.0 ...
);
 
nwb.acquisition.set('ImageSeries', behavior_images);

External Files

External files (e.g., video files of the behaving animal) can be added to the NWBFile by creating an ImageSeries object whose external_file attribute specifies the path to the external file(s) on disk. The file path(s) must be relative to the path of the NWB file. Either external_file or data must be specified, but not both. external_file can be a cell array of multiple video files.
The starting_frame attribute serves as an index to indicate the starting frame of each external file, allowing you to skip the beginning of videos.
external_files = {'video1.mp4', 'video2.mp4'};
 
timestamps = [0.0, 0.04, 0.07, 0.1, 0.14, 0.16, 0.21];
behavior_external_file = types.core.ImageSeries( ...
'description', 'Behavior video of animal moving in environment', ...
'data_unit', 'n.a.', ...
'external_file', external_files, ...
'format', 'external', ...
'external_file_starting_frame', [0, 2, 4], ...
'timestamps', timestamps ...
);
 
nwb.acquisition.set('ExternalVideos', behavior_external_file);

Static Images

Static images can be stored in an NWBFile object by creating an RGBAImage, RGBImage or GrayscaleImage object with the image data. All of these image types provide an optional description parameter to include text description about the image and the resolution parameter to specify the pixels/cm resolution of the image.

RGBAImage: for color images with transparency

RGBAImage is for storing color image data with transparency. data must be 3D, where the first and second dimensions represent x and y. The third dimension has length 4 and represents the RGBA values.
image_data = randi(255, [4, 200, 200]);
 
rgba_image = types.core.RGBAImage( ...
'data', image_data, ... % required
'resolution', 70.0, ...
'description', 'RGBA image' ...
);

RGBImage: for color images

RGBImage is for storing RGB color image data. data must be 3D, where the first and second dimensions represent x and y. The third dimension has length 3 and represents the RGB values.
image_data = randi(255, [3, 200, 200]);
 
rgb_image = types.core.RGBImage( ...
'data', image_data, ... % required
'resolution', 70.0, ...
'description', 'RGB image' ...
);

GrayscaleImage: for grayscale images

GrayscaleImage is for storing grayscale image data. data must be 2D where the first and second dimensions represent x and y.
image_data = randi(255, [200, 200]);
 
grayscale_image = types.core.GrayscaleImage( ...
'data', image_data, ... % required
'resolution', 70.0, ...
'description', 'Grayscale image' ...
);

Images: a container for images

Add the images to an Images container that accepts any of these image types.
image_collection = types.core.Images( ...
'description', 'A collection of logo images presented to the subject.'...
);
 
image_collection.image.set('rgba_image', rgba_image);
image_collection.image.set('rgb_image', rgb_image);
image_collection.image.set('grayscale_image', grayscale_image);
 
nwb.acquisition.set('image_collection', image_collection);

Index Series for Repeated Images

You may want to set up a time series of images where some images are repeated many times. You could create an ImageSeries that repeats the data each time the image is shown, but that would be inefficient, because it would store the same data multiple times. A better solution would be to store the unique images once and reference those images. This is how IndexSeries works. First, create an Images container with the order of images defined using an ImageReferences. Then create an IndexSeries that indexes into the Images.
rgbImage = imread('street2.jpg');
grayImage = uint8(sum(double(rgbImage), 3) ./ double(max(max(max(rgbImage)))));
GsStreet = types.core.GrayscaleImage(...
'data', grayImage, ...
'description', 'grayscale image of a street.', ...
'resolution', 28 ...
);
 
RgbStreet = types.core.RGBImage( ...
'data', rgbImage, ...
'resolution', 28, ...
'description', 'RGB Street' ...
);
 
ImageOrder = types.core.ImageReferences(...
'data', [types.untyped.ObjectView(RgbStreet), types.untyped.ObjectView(GsStreet)] ...
);
Images = types.core.Images( ...
'gs_face', GsStreet, ...
'rgb_face', RgbStreet, ...
'description', 'A collection of streets.', ...
'order_of_images', ImageOrder ...
);
 
types.core.IndexSeries(...
'data', [0, 1, 0, 1], ... % NOTE: 0-indexed
'indexed_images', Images, ...
'timestamps', [0.1, 0.2, 0.3, 0.4] ...
)
ans =
IndexSeries with properties:

  indexed_images: [1×1 types.core.Images]
  indexed_timeseries: []
  starting_time_unit: 'seconds'
  timestamps_interval: 1
  timestamps_unit: 'seconds'
  data: [0 1 0 1]
  comments: 'no comments'
  control: []
  control_description: ''
  data_continuity: ''
  data_conversion: []
  data_offset: []
  data_resolution: []
  data_unit: 'N/A'
  description: 'no description'
  starting_time: []
  starting_time_rate: []
  timestamps: [0.1000 0.2000 0.3000 0.4000]
Here, data contains the (0-indexed) indices of the displayed images as they are ordered in the ImageReferences.

Writing the images to an NWB File

Now use nwbExport to write the file.
nwbExport(nwb, "images_test.nwb");

Introduction to MatNWB


Installing MatNWB

Use the code below to install MatNWB from source; the commands are wrapped in a block comment (%{ ... %}), so remove the comment markers to run them. MatNWB works by automatically creating API classes based on the schema.
%{
!git clone https://github.com/NeurodataWithoutBorders/matnwb.git
addpath(genpath(pwd));
%}

Set up the NWB File

An NWB file represents a single session of an experiment. Each file must have a session_description, identifier, and session start time. Create a new NWBFile object with those and additional metadata using the NwbFile command. For all MatNWB classes and functions, arguments are entered as name-value pairs, where each argument name is followed by its value. The ellipses (...) are MATLAB line-continuation characters, used here for readability.
nwb = NwbFile( ...
'session_description', 'mouse in open exploration',...
'identifier', 'Mouse5_Day3', ...
'session_start_time', datetime(2018, 4, 25, 2, 30, 3, 'TimeZone', 'local'), ...
'general_experimenter', 'Last, First', ... % optional
'general_session_id', 'session_1234', ... % optional
'general_institution', 'University of My Institution', ... % optional
'general_related_publications', {'DOI:10.1016/j.neuron.2016.12.011'}); % optional
nwb

Subject Information

You can also provide information about your subject in the NWB file. Create a Subject object to store information such as age, species, genotype, sex, and a freeform description. Then set nwb.general_subject to the Subject object.
Each of these fields is free-form, so any value is valid, but we recommend the ISO 8601 duration format for age (e.g. 'P90D' for 90 days), the Latin binomial for species (e.g. 'Mus musculus'), and one of "M", "F", "U", or "O" for sex:
subject = types.core.Subject( ...
'subject_id', '001', ...
'age', 'P90D', ...
'description', 'mouse 5', ...
'species', 'Mus musculus', ...
'sex', 'M' ...
);
nwb.general_subject = subject;
 
subject
Note: the DANDI archive requires all NWB files to have a subject object with subject_id specified, and strongly encourages specifying the other fields.
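A quick sanity check before uploading to DANDI might look like this (a sketch, not part of the MatNWB API):

```matlab
% A sketch: check DANDI's minimum subject metadata before exporting
assert(~isempty(nwb.general_subject), 'DANDI requires a Subject object')
assert(~isempty(nwb.general_subject.subject_id), 'DANDI requires subject_id')
```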

Time Series Data

TimeSeries is a common base class for measurements sampled over time, and provides fields for data and timing information (regularly or irregularly sampled). You will also need to supply the name and the unit of measurement (an SI unit where possible).
For instance, we can store TimeSeries data where the recording started 0.0 seconds after start_time and was sampled once per second (1 Hz):
time_series_with_rate = types.core.TimeSeries( ...
'description', 'an example time series', ...
'data', linspace(0, 100, 10), ...
'data_unit', 'm', ...
'starting_time', 0.0, ...
'starting_time_rate', 1.0);
For irregularly sampled recordings, we need to provide the timestamps for the data:
time_series_with_timestamps = types.core.TimeSeries( ...
'description', 'an example time series', ...
'data', linspace(0, 100, 10), ...
'data_unit', 'm', ...
'timestamps', linspace(0, 1, 10));
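For a regularly sampled series like time_series_with_rate above, the implied timestamps follow directly from starting_time and starting_time_rate:

```matlab
% A sketch: timestamps implied by starting_time and starting_time_rate
t0   = 0.0;   % starting_time, in seconds
rate = 1.0;   % starting_time_rate, in Hz
n    = 10;    % number of samples
implied_timestamps = t0 + (0:n-1) / rate;   % 0, 1, 2, ..., 9 seconds
```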
The TimeSeries class serves as the foundation for all other time series types in the NWB format. Several specialized subclasses extend the functionality of TimeSeries, each tailored to handle specific kinds of data. In the next section, we’ll explore one of these specialized types. For a full overview, please check out the type hierarchy in the NWB schema documentation.

Other Types of Time Series

As mentioned previously, there are many subtypes of TimeSeries in MatNWB that store different kinds of data. One example is AnnotationSeries, a subclass of TimeSeries that stores text-based records about the experiment. Below, we create an AnnotationSeries object with annotations for airpuff stimuli and add it to the stimulus_presentation group of the NWBFile.
% Create an AnnotationSeries object with annotations for airpuff stimuli
annotations = types.core.AnnotationSeries( ...
'description', 'Airpuff events delivered to the animal', ...
'data', {'Left Airpuff', 'Right Airpuff', 'Right Airpuff'}, ...
'timestamps', [1.0, 3.0, 8.0] ...
);
 
% Add the AnnotationSeries to the NWBFile's stimulus group
nwb.stimulus_presentation.set('Airpuffs', annotations)

Behavior

SpatialSeries and Position

Many types of data have special data types in NWB. To store the spatial position of a subject, we will use the SpatialSeries and Position classes.
Note: These diagrams follow a standard convention called a "UML class diagram" to express the object-oriented relationships between NWB classes. For our purposes, all you need to know is that an open triangle means "extends" and an open diamond means "is contained within." Learn more about class diagrams on the Wikipedia page.
SpatialSeries is a subclass of TimeSeries, a common base class for measurements sampled over time, and provides fields for data and time (regularly or irregularly sampled). Here, we put a SpatialSeries object called 'SpatialSeries' in a Position object. If the data is sampled at a regular interval, it is recommended to specify the starting_time and the sampling rate (starting_time_rate), although it is still possible to specify timestamps as in the time_series_with_timestamps example above.
% create SpatialSeries object
spatial_series_ts = types.core.SpatialSeries( ...
'data', [linspace(0,10,100); linspace(0,8,100)], ...
'reference_frame', '(0,0) is bottom left corner', ...
'starting_time', 0, ...
'starting_time_rate', 200 ...
);
 
% create Position object and add SpatialSeries
Position = types.core.Position('SpatialSeries', spatial_series_ts);
 
NWB differentiates between raw, acquired data, which should never change, and processed data, which are the results of preprocessing algorithms and could change. Let's assume that the animal's position was computed from a video tracking algorithm, so it would be classified as processed data. Since processed data can be very diverse, NWB allows us to create processing modules, which are like folders, to store related processed data or data that comes from a single algorithm.
Create a processing module called "behavior" for storing behavioral data in the NWBFile and add the Position object to the module.
% create processing module
behavior_mod = types.core.ProcessingModule('description', 'contains behavioral data');
 
% add the Position object (that holds the SpatialSeries object) to the
% module and name the Position object "Position"
behavior_mod.nwbdatainterface.set('Position', Position);
 
% add the processing module to the NWBFile object, and name the processing module "behavior"
nwb.processing.set('behavior', behavior_mod);
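To confirm the objects are wired up as expected, you can retrieve them back through the same hierarchy (a sketch):

```matlab
% A sketch: retrieve the objects back through the same hierarchy
pos = nwb.processing.get('behavior').nwbdatainterface.get('Position');
ss  = pos.spatialseries.get('SpatialSeries');
```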

Trials

Trials are stored in a TimeIntervals object, a subclass of DynamicTable. DynamicTable objects are used to store tabular metadata throughout NWB, including trials, electrodes, and sorted units. They offer flexibility by supporting required columns, optional columns, and custom columns. Here, we add a custom column, 'correct', which will be a logical array.
trials = types.core.TimeIntervals( ...
'colnames', {'start_time', 'stop_time', 'correct'}, ...
'description', 'trial data and properties');
 
trials.addRow('start_time', 0.1, 'stop_time', 1.0, 'correct', false)
trials.addRow('start_time', 1.5, 'stop_time', 2.0, 'correct', true)
trials.addRow('start_time', 2.5, 'stop_time', 3.0, 'correct', false)
 
trials.toTable() % visualize the table
nwb.intervals_trials = trials;
 
% If you have multiple trials tables, you will need to use custom names for
% each one:
nwb.intervals.set('custom_intervals_table_name', trials);

Write

Now, to write the NWB file that we have built so far:
nwbExport(nwb, 'intro_tutorial.nwb')
We can use the HDFView application to inspect the resulting NWB file.

Read

We can then read the file back in using MatNWB and inspect its contents.
read_nwbfile = nwbRead('intro_tutorial.nwb', 'ignorecache')
We can print the SpatialSeries data traversing the hierarchy of objects. The processing module called 'behavior' contains our Position object named 'Position'. The Position object contains our SpatialSeries object named 'SpatialSeries'.
read_spatial_series = read_nwbfile.processing.get('behavior'). ...
nwbdatainterface.get('Position').spatialseries.get('SpatialSeries')

Reading Data

In contrast to the typical MATLAB workflow, data arrays are read lazily from the file. Calling read_spatial_series.data does not read the data values; it returns a DataStub object that can be indexed to read data.
read_spatial_series.data
This allows you to conveniently work with datasets that are too large to fit in RAM all at once. Access all the data in the matrix using the load method with no arguments.
read_spatial_series.data.load
If you only need a section of the data, you can read only that section by indexing the DataStub object like a normal array in MATLAB. This will just read the selected region from disk into RAM. This technique is particularly useful if you are dealing with a large dataset that is too big to fit entirely into your available RAM.
read_spatial_series.data(:, 1:10)
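Because the DataStub supports normal MATLAB indexing, strided and single-row reads also touch only the selected region on disk (a sketch, using the 2×100 spatial data above):

```matlab
% A sketch: read the x-coordinate only, every other sample
x_every_other = read_spatial_series.data(1, 1:2:99);
```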

Next Steps

This concludes the introductory tutorial. Please proceed to one of the specialized tutorials, which are designed to follow this one.
See the API documentation to learn what data types are available.
\ No newline at end of file diff --git a/docs/source/_static/html/tutorials/ogen.html b/docs/source/_static/html/tutorials/ogen.html new file mode 100644 index 00000000..4e19478b --- /dev/null +++ b/docs/source/_static/html/tutorials/ogen.html @@ -0,0 +1,201 @@ + +Optogenetics

Optogenetics

This tutorial will demonstrate how to write optogenetics data.

Creating an NWBFile object

When creating an NWB file, the first step is to create the NWBFile object using NwbFile.
nwb = NwbFile( ...
'session_description', 'mouse in open exploration',...
'identifier', char(java.util.UUID.randomUUID), ...
'session_start_time', datetime(2018, 4, 25, 2, 30, 3, 'TimeZone', 'local'), ...
'general_experimenter', 'Last, First M.', ... % optional
'general_session_id', 'session_1234', ... % optional
'general_institution', 'University of My Institution', ... % optional
'general_related_publications', 'DOI:10.1016/j.neuron.2016.12.011'); % optional
nwb
nwb =
NwbFile with properties:

    nwb_version: '2.6.0'
    file_create_date: []
    identifier: 'b843652e-3404-48c7-8686-4904e786ea4c'
    session_description: 'mouse in open exploration'
    session_start_time: {[2018-04-25T02:30:03.000000+02:00]}
    timestamps_reference_time: []
    acquisition: [0×1 types.untyped.Set]
    analysis: [0×1 types.untyped.Set]
    general: [0×1 types.untyped.Set]
    general_data_collection: ''
    general_devices: [0×1 types.untyped.Set]
    general_experiment_description: ''
    general_experimenter: 'Last, First M.'
    general_extracellular_ephys: [0×1 types.untyped.Set]
    general_extracellular_ephys_electrodes: []
    general_institution: 'University of My Institution'
    general_intracellular_ephys: [0×1 types.untyped.Set]
    general_intracellular_ephys_experimental_conditions: []
    general_intracellular_ephys_filtering: ''
    general_intracellular_ephys_intracellular_recordings: []
    general_intracellular_ephys_repetitions: []
    general_intracellular_ephys_sequential_recordings: []
    general_intracellular_ephys_simultaneous_recordings: []
    general_intracellular_ephys_sweep_table: []
    general_keywords: ''
    general_lab: ''
    general_notes: ''
    general_optogenetics: [0×1 types.untyped.Set]
    general_optophysiology: [0×1 types.untyped.Set]
    general_pharmacology: ''
    general_protocol: ''
    general_related_publications: 'DOI:10.1016/j.neuron.2016.12.011'
    general_session_id: 'session_1234'
    general_slices: ''
    general_source_script: ''
    general_source_script_file_name: ''
    general_stimulus: ''
    general_subject: []
    general_surgery: ''
    general_virus: ''
    intervals: [0×1 types.untyped.Set]
    intervals_epochs: []
    intervals_invalid_times: []
    intervals_trials: []
    processing: [0×1 types.untyped.Set]
    scratch: [0×1 types.untyped.Set]
    stimulus_presentation: [0×1 types.untyped.Set]
    stimulus_templates: [0×1 types.untyped.Set]
    units: []

Adding optogenetic data

The ogen module contains two data types that you will need to write optogenetics data, OptogeneticStimulusSite, which contains metadata about the stimulus site, and OptogeneticSeries, which contains the values of the time series.
First, you need to create a Device object linked to the NWBFile:
device = types.core.Device();
nwb.general_devices.set('Device', device);
Now, you can create and add an OptogeneticStimulusSite.
ogen_stim_site = types.core.OptogeneticStimulusSite( ...
'device', types.untyped.SoftLink(device), ...
'description', 'This is an example optogenetic site.', ...
'excitation_lambda', 600.0, ...
'location', 'VISrl');
 
nwb.general_optogenetics.set('OptogeneticStimulusSite', ogen_stim_site);
With the OptogeneticStimulusSite added, you can now create and add an OptogeneticSeries. Here, we generate some random data and specify the timing using starting_time_rate. If you have samples at irregular intervals, use timestamps instead.
ogen_series = types.core.OptogeneticSeries( ...
'data', randn(20, 1), ...
'site', types.untyped.SoftLink(ogen_stim_site), ...
'starting_time', 0.0, ...
'starting_time_rate', 30.0); % Hz
nwb.stimulus_presentation.set('OptogeneticSeries', ogen_series);
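If your stimulus samples are irregularly spaced, a timestamps-based variant might look like this (a sketch; the data and timestamp values are made up):

```matlab
% A sketch: an irregularly sampled OptogeneticSeries using timestamps
ogen_series_irregular = types.core.OptogeneticSeries( ...
    'data', randn(5, 1), ...
    'site', types.untyped.SoftLink(ogen_stim_site), ...
    'timestamps', [0.0, 0.5, 1.2, 3.3, 6.1]);
```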
 
nwb
nwb =
NwbFile with properties:

    nwb_version: '2.6.0'
    file_create_date: []
    identifier: 'b843652e-3404-48c7-8686-4904e786ea4c'
    session_description: 'mouse in open exploration'
    session_start_time: {[2018-04-25T02:30:03.000000+02:00]}
    timestamps_reference_time: []
    acquisition: [0×1 types.untyped.Set]
    analysis: [0×1 types.untyped.Set]
    general: [0×1 types.untyped.Set]
    general_data_collection: ''
    general_devices: [1×1 types.untyped.Set]
    general_experiment_description: ''
    general_experimenter: 'Last, First M.'
    general_extracellular_ephys: [0×1 types.untyped.Set]
    general_extracellular_ephys_electrodes: []
    general_institution: 'University of My Institution'
    general_intracellular_ephys: [0×1 types.untyped.Set]
    general_intracellular_ephys_experimental_conditions: []
    general_intracellular_ephys_filtering: ''
    general_intracellular_ephys_intracellular_recordings: []
    general_intracellular_ephys_repetitions: []
    general_intracellular_ephys_sequential_recordings: []
    general_intracellular_ephys_simultaneous_recordings: []
    general_intracellular_ephys_sweep_table: []
    general_keywords: ''
    general_lab: ''
    general_notes: ''
    general_optogenetics: [1×1 types.untyped.Set]
    general_optophysiology: [0×1 types.untyped.Set]
    general_pharmacology: ''
    general_protocol: ''
    general_related_publications: 'DOI:10.1016/j.neuron.2016.12.011'
    general_session_id: 'session_1234'
    general_slices: ''
    general_source_script: ''
    general_source_script_file_name: ''
    general_stimulus: ''
    general_subject: []
    general_surgery: ''
    general_virus: ''
    intervals: [0×1 types.untyped.Set]
    intervals_epochs: []
    intervals_invalid_times: []
    intervals_trials: []
    processing: [0×1 types.untyped.Set]
    scratch: [0×1 types.untyped.Set]
    stimulus_presentation: [1×1 types.untyped.Set]
    stimulus_templates: [0×1 types.untyped.Set]
    units: []
Now you can write the NWB file.
nwbExport(nwb, 'ogen_tutorial.nwb');
\ No newline at end of file diff --git a/docs/source/_static/html/tutorials/ophys.html b/docs/source/_static/html/tutorials/ophys.html new file mode 100644 index 00000000..c9cab142 --- /dev/null +++ b/docs/source/_static/html/tutorials/ophys.html @@ -0,0 +1,479 @@ + +MatNWB Optical Physiology Tutorial

MatNWB Optical Physiology Tutorial


Introduction

In this tutorial, we will create fake data for a hypothetical optical physiology experiment with a freely moving animal. The types of data we will convert include acquired two-photon images, image segmentation (regions of interest), and fluorescence and dF/F responses.
It is recommended to first work through the Introduction to MatNWB tutorial, which demonstrates installing MatNWB and creating an NWB file with subject information, animal position, and trials, as well as writing and reading NWB files in MATLAB.

Set up the NWB file

An NWB file represents a single session of an experiment. Each file must have a session_description, identifier, and session start time. Create a new NWBFile object with those and additional metadata. For all MatNWB functions, we use the MATLAB convention of entering keyword argument pairs, where each argument name is followed by its value.
nwb = NwbFile( ...
'session_description', 'mouse in open exploration',...
'identifier', 'Mouse5_Day3', ...
'session_start_time', datetime(2018, 4, 25, 2, 30, 3, 'TimeZone', 'local'), ...
'timestamps_reference_time', datetime(2018, 4, 25, 3, 0, 45, 'TimeZone', 'local'), ...
'general_experimenter', 'LastName, FirstName', ... % optional
'general_session_id', 'session_1234', ... % optional
'general_institution', 'University of My Institution', ... % optional
'general_related_publications', {'DOI:10.1016/j.neuron.2016.12.011'}); % optional
nwb
nwb =
NwbFile with properties:

    nwb_version: '2.6.0'
    file_create_date: []
    identifier: 'Mouse5_Day3'
    session_description: 'mouse in open exploration'
    session_start_time: {[2018-04-25T02:30:03.000000+02:00]}
    timestamps_reference_time: {[2018-04-25T03:00:45.000000+02:00]}
    acquisition: [0×1 types.untyped.Set]
    analysis: [0×1 types.untyped.Set]
    general: [0×1 types.untyped.Set]
    general_data_collection: ''
    general_devices: [0×1 types.untyped.Set]
    general_experiment_description: ''
    general_experimenter: 'LastName, FirstName'
    general_extracellular_ephys: [0×1 types.untyped.Set]
    general_extracellular_ephys_electrodes: []
    general_institution: 'University of My Institution'
    general_intracellular_ephys: [0×1 types.untyped.Set]
    general_intracellular_ephys_experimental_conditions: []
    general_intracellular_ephys_filtering: ''
    general_intracellular_ephys_intracellular_recordings: []
    general_intracellular_ephys_repetitions: []
    general_intracellular_ephys_sequential_recordings: []
    general_intracellular_ephys_simultaneous_recordings: []
    general_intracellular_ephys_sweep_table: []
    general_keywords: ''
    general_lab: ''
    general_notes: ''
    general_optogenetics: [0×1 types.untyped.Set]
    general_optophysiology: [0×1 types.untyped.Set]
    general_pharmacology: ''
    general_protocol: ''
    general_related_publications: {'DOI:10.1016/j.neuron.2016.12.011'}
    general_session_id: 'session_1234'
    general_slices: ''
    general_source_script: ''
    general_source_script_file_name: ''
    general_stimulus: ''
    general_subject: []
    general_surgery: ''
    general_virus: ''
    intervals: [0×1 types.untyped.Set]
    intervals_epochs: []
    intervals_invalid_times: []
    intervals_trials: []
    processing: [0×1 types.untyped.Set]
    scratch: [0×1 types.untyped.Set]
    stimulus_presentation: [0×1 types.untyped.Set]
    stimulus_templates: [0×1 types.untyped.Set]
    units: []

Optical Physiology

Optical physiology results are written in four steps:
  1. Create imaging plane
  2. Acquired two-photon images
  3. Image segmentation
  4. Fluorescence and dF/F responses

Imaging Plane

First, you must create an ImagingPlane object, which will hold information about the area and method used to collect the optical imaging data. This requires creation of a Device object for the microscope and an OpticalChannel object. Then you can create an ImagingPlane.
optical_channel = types.core.OpticalChannel( ...
'description', 'description', ...
'emission_lambda', 500.);
 
device = types.core.Device();
nwb.general_devices.set('Device', device);
 
imaging_plane_name = 'imaging_plane';
imaging_plane = types.core.ImagingPlane( ...
'optical_channel', optical_channel, ...
'description', 'a very interesting part of the brain', ...
'device', types.untyped.SoftLink(device), ...
'excitation_lambda', 600., ...
'imaging_rate', 5., ...
'indicator', 'GFP', ...
'location', 'my favorite brain location');
 
nwb.general_optophysiology.set(imaging_plane_name, imaging_plane);

Storing Two-Photon Data

You can create a TwoPhotonSeries object to represent two-photon imaging data. TwoPhotonSeries, like SpatialSeries, inherits from TimeSeries and is similar in behavior to OnePhotonSeries.
InternalTwoPhoton = types.core.TwoPhotonSeries( ...
'imaging_plane', types.untyped.SoftLink(imaging_plane), ...
'starting_time', 0.0, ...
'starting_time_rate', 3.0, ...
'data', ones(200, 100, 1000), ...
'data_unit', 'lumens');
 
nwb.acquisition.set('2pInternal', InternalTwoPhoton);

Storing One-Photon Data

Now that we have our ImagingPlane, we can create a OnePhotonSeries object to store raw one-photon imaging data.
% using internal data. this data will be stored inside the NWB file
InternalOnePhoton = types.core.OnePhotonSeries( ...
'data', ones(100, 100, 1000), ...
'imaging_plane', types.untyped.SoftLink(imaging_plane), ...
'starting_time', 0., ...
'starting_time_rate', 1.0, ...
'data_unit', 'normalized amplitude' ...
);
nwb.acquisition.set('1pInternal', InternalOnePhoton);

Plane Segmentation

Image segmentation stores the detected regions of interest from the TwoPhotonSeries data. The ImageSegmentation type allows you to store multiple segmentations by creating additional PlaneSegmentation objects.

Regions of interest (ROIs)

ROIs can be added to a PlaneSegmentation either as an image_mask or as a pixel_mask. An image mask is an array that is the same size as a single frame of the TwoPhotonSeries and indicates where a single region of interest is. The image mask may be boolean or continuous between 0 and 1. A pixel_mask, on the other hand, is a list of indices (i.e., coordinates) and weights for the pixels of each ROI, represented as a compound data type using a ragged array. Below is an example demonstrating how to create either an image_mask or a pixel_mask; changing the selection variable updates the PlaneSegmentation object accordingly.
selection = "Create Image Mask"; % "Create Image Mask" or "Create Pixel Mask"
 
% generate fake image_mask data
imaging_shape = [100, 100];
x = imaging_shape(1);
y = imaging_shape(2);
 
n_rois = 20;
image_mask = zeros(y, x, n_rois);
center = randi(90,2,n_rois);
for i = 1:n_rois
image_mask(center(1,i):center(1,i)+10, center(2,i):center(2,i)+10, i) = 1;
end
 
if selection == "Create Pixel Mask"
ind = find(image_mask);
[y_ind, x_ind, roi_ind] = ind2sub(size(image_mask), ind);
 
pixel_mask_struct = struct();
pixel_mask_struct.x = uint32(x_ind); % Add x coordinates to struct field x
pixel_mask_struct.y = uint32(y_ind); % Add y coordinates to struct field y
pixel_mask_struct.weight = single(ones(size(x_ind)));
% Create pixel mask vector data
pixel_mask = types.hdmf_common.VectorData(...
'data', struct2table(pixel_mask_struct), ...
'description', 'pixel masks');
 
% When creating a pixel mask, it is also necessary to specify a
% pixel_mask_index vector. See the documentation for ragged arrays linked
% above to learn more.
num_pixels_per_roi = zeros(n_rois, 1); % Column vector
for i_roi = 1:n_rois
num_pixels_per_roi(i_roi) = sum(roi_ind == i_roi);
end
 
pixel_mask_index = uint16(cumsum(num_pixels_per_roi)); % Note: Use an integer
% type that can accommodate the maximum value of the cumulative sum
 
% Create pixel_mask_index vector
pixel_mask_index = types.hdmf_common.VectorIndex(...
'description', 'Index into pixel_mask VectorData', ...
'data', pixel_mask_index, ...
'target', types.untyped.ObjectView(pixel_mask) );
 
plane_segmentation = types.core.PlaneSegmentation( ...
'colnames', {'pixel_mask'}, ...
'description', 'roi pixel position (x,y) and pixel weight', ...
'imaging_plane', types.untyped.SoftLink(imaging_plane), ...
'pixel_mask_index', pixel_mask_index, ...
'pixel_mask', pixel_mask ...
);
 
else % selection == "Create Image Mask"
plane_segmentation = types.core.PlaneSegmentation( ...
'colnames', {'image_mask'}, ...
'description', 'output from segmenting my favorite imaging plane', ...
'imaging_plane', types.untyped.SoftLink(imaging_plane), ...
'image_mask', types.hdmf_common.VectorData(...
'data', image_mask, ...
'description', 'image masks') ...
);
end

Adding ROIs to NWB file

Now create an ImageSegmentation object and put the plane_segmentation object inside of it, naming it PlaneSegmentation.
img_seg = types.core.ImageSegmentation();
img_seg.planesegmentation.set('PlaneSegmentation', plane_segmentation);
Now create a ProcessingModule called "ophys" and put our img_seg object in it, calling it "ImageSegmentation", and add the ProcessingModule to nwb.
ophys_module = types.core.ProcessingModule( ...
'description', 'contains optical physiology data')
ophys_module =
ProcessingModule with properties:

    description: 'contains optical physiology data'
    dynamictable: [0×1 types.untyped.Set]
    nwbdatainterface: [0×1 types.untyped.Set]
ophys_module.nwbdatainterface.set('ImageSegmentation', img_seg);
nwb.processing.set('ophys', ophys_module);

Storing fluorescence of ROIs over time

Now that the ROIs are stored, you can store fluorescence (or dF/F) data for these regions of interest. This type of data is stored using the RoiResponseSeries class.
First, create a DynamicTableRegion that references the rows of the PlaneSegmentation table, then create a RoiResponseSeries that points to it:
roi_table_region = types.hdmf_common.DynamicTableRegion( ...
'table', types.untyped.ObjectView(plane_segmentation), ...
'description', 'all_rois', ...
'data', (0:n_rois-1)');
 
roi_response_series = types.core.RoiResponseSeries( ...
'rois', roi_table_region, ...
'data', NaN(n_rois, 100), ...
'data_unit', 'lumens', ...
'starting_time_rate', 3.0, ...
'starting_time', 0.0);
 
fluorescence = types.core.Fluorescence();
fluorescence.roiresponseseries.set('RoiResponseSeries', roi_response_series);
 
ophys_module.nwbdatainterface.set('Fluorescence', fluorescence);
Finally, the ophys ProcessingModule is added to the NwbFile.
nwb.processing.set('ophys', ophys_module);

Writing the NWB file

nwb_file_name = 'ophys_tutorial.nwb';
if isfile(nwb_file_name); delete(nwb_file_name); end
nwbExport(nwb, nwb_file_name);

Reading the NWB file

read_nwb = nwbRead(nwb_file_name, 'ignorecache');
Data arrays are read lazily from the file. Calling TimeSeries.data does not read the data values; it returns a DataStub object that can be indexed to read data.
read_nwb.processing.get('ophys').nwbdatainterface.get('Fluorescence')...
.roiresponseseries.get('RoiResponseSeries').data
ans =
DataStub with properties:

    filename: 'ophys_tutorial.nwb'
    path: '/processing/ophys/Fluorescence/RoiResponseSeries/data'
    dims: [20 100]
    ndims: 2
    dataType: 'double'
This allows you to conveniently work with datasets that are too large to fit in RAM all at once. Access the data in the matrix using the load method.
load with no input arguments reads the entire dataset:
read_nwb.processing.get('ophys').nwbdatainterface.get('Fluorescence'). ...
roiresponseseries.get('RoiResponseSeries').data.load
ans = 20×100
   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   ...
   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   ...
   (output truncated: all 20×100 entries are NaN)
If all you need is a section of the data, you can read only that section by indexing the DataStub object like a normal array in MATLAB. This will just read the selected region from disk into RAM. This technique is particularly useful if you are dealing with a large dataset that is too big to fit entirely into your available RAM.
read_nwb.processing.get('ophys'). ...
nwbdatainterface.get('Fluorescence'). ...
roiresponseseries.get('RoiResponseSeries'). ...
data(1:5, 1:10)
ans = 5×10
   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN
   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN
   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN
   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN
   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN   NaN
% read back the image/pixel masks and display the first roi
plane_segmentation = read_nwb.processing.get('ophys'). ...
nwbdatainterface.get('ImageSegmentation'). ...
planesegmentation.get('PlaneSegmentation');
 
if ~isempty(plane_segmentation.image_mask)
roi_mask = plane_segmentation.image_mask.data(:,:,1);
elseif ~isempty(plane_segmentation.pixel_mask)
row = plane_segmentation.getRow(1, 'columns', {'pixel_mask'});
pixel_mask = row.pixel_mask{1};
roi_mask = zeros(imaging_shape);
ind = sub2ind(imaging_shape, pixel_mask.y, pixel_mask.x);
roi_mask(ind) = pixel_mask.weight;
end
imshow(roi_mask)

Learn more!

See the API documentation to learn what data types are available.

Other MatNWB tutorials

Python tutorials

See our tutorials for more details about your data type:
Check out other tutorials that teach advanced NWB topics:

\ No newline at end of file diff --git a/docs/source/_static/html/tutorials/ophys_tutorial_schematic.png b/docs/source/_static/html/tutorials/ophys_tutorial_schematic.png new file mode 100644 index 00000000..7e8a94e4 Binary files /dev/null and b/docs/source/_static/html/tutorials/ophys_tutorial_schematic.png differ diff --git a/docs/source/_static/html/tutorials/read_demo.html b/docs/source/_static/html/tutorials/read_demo.html new file mode 100644 index 00000000..46c9dafb --- /dev/null +++ b/docs/source/_static/html/tutorials/read_demo.html @@ -0,0 +1,356 @@ + +Reading NWB Data in MATLAB

Reading NWB Data in MATLAB

Authors: Ryan Ly, with modification by Lawrence Niu
Last Updated: 2023-09-05

Introduction

In this tutorial, we will read single-neuron spiking data that is in the NWB standard format and do a basic visualization of the data. More thorough documentation regarding reading files, as well as the NwbFile class, can be found in the NWB Overview Documentation.

Download the data

First, let's download an NWB data file from the DANDI neurophysiology data archive.
An NWB file represents a single session of an experiment. It contains all of the data from that session and the metadata required to understand the data.
We will use data from one session of an experiment by Chandravadia et al. (2020), where the authors recorded single neuron electrophysiological activity from the medial temporal lobes of human subjects while they performed a visual recognition memory task.
  1. Go to the DANDI page for this dataset: https://dandiarchive.org/dandiset/000004/draft
  2. Toward the top middle of the page, click the "Files" button.
demo_dandi_view_files_in_dataset.png
3. Click on the folder "sub-P11MHM" (click the folder name, not the checkbox).
demo_dandi_select_folder.png
4. Then click on the download symbol to the right of the filename "sub-P11HMH_ses-20061101_ecephys+image.nwb" to download the data file (69 MB) to your computer.
demo_dandi_download_data.png

Installing MatNWB

Use the code below to install MatNWB from source using git. Ensure git is on your path before running this line.
!git clone https://github.com/NeurodataWithoutBorders/matnwb.git
Cloning into 'matnwb'...
Updating files: 100% (542/542), done.
MatNWB works by automatically creating API classes based on the schema. For most NWB files, the classes are generated automatically by calling nwbRead further down. This particular NWB file was created before this feature was supported, so we must ensure that the classes for the correct schema versions are properly generated before attempting to read the file.
% add the path to matnwb and generate the core classes
addpath('matnwb');
 
% Reminder: YOU DO NOT NORMALLY NEED TO CALL THIS FUNCTION. Only attempt this method if you
% encounter read errors.
generateCore(util.getSchemaVersion('sub-P11HMH_ses-20061101_ecephys+image.nwb'));

Read the NWB file

You can read any NWB file using nwbRead. The printout of the returned object shows a summary of the data within.
% The 'ignorecache' flag tells `nwbRead` not to generate class files from the schemas embedded
% in the file. Since we have already generated them above, we can skip this automated step when
% reading. If you have not generated the classes beforehand, omit this flag.
nwb = nwbRead('sub-P11HMH_ses-20061101_ecephys+image.nwb', 'ignorecache')
nwb =
NwbFile with properties:

  nwb_version: '2.1.0'
  file_create_date: [1×1 types.untyped.DataStub]
  general_devices: [1×1 types.untyped.Set]
  identifier: 'H11_9'
  session_description: 'New/Old recognition task for ID: 9. '
  session_start_time: 2006-11-01
  timestamps_reference_time: 2006-11-01
  acquisition: [2×1 types.untyped.Set]
  analysis: [0×1 types.untyped.Set]
  general: [0×1 types.untyped.Set]
  general_data_collection: 'learning: 80, recognition: 81'
  general_experiment_description: 'The data contained within this file describes a new/old recogntion task performed in patients with intractable epilepsy implanted with depth electrodes and Behnke-Fried microwires in the human Medical Temporal Lobe (MTL).'
  general_experimenter: ''
  general_extracellular_ephys: [9×1 types.untyped.Set]
  general_extracellular_ephys_electrodes: [1×1 types.core.DynamicTable]
  general_institution: 'Hunigton Memorial Hospital'
  general_intracellular_ephys: [0×1 types.untyped.Set]
  general_intracellular_ephys_filtering: ''
  general_intracellular_ephys_sweep_table: []
  general_keywords: [1×1 types.untyped.DataStub]
  general_lab: 'Rutishauser'
  general_notes: ''
  general_optogenetics: [0×1 types.untyped.Set]
  general_optophysiology: [0×1 types.untyped.Set]
  general_pharmacology: ''
  general_protocol: ''
  general_related_publications: [1×1 types.untyped.DataStub]
  general_session_id: ''
  general_slices: ''
  general_source_script: ''
  general_source_script_file_name: ''
  general_stimulus: ''
  general_subject: [1×1 types.core.Subject]
  general_surgery: ''
  general_virus: ''
  intervals: [0×1 types.untyped.Set]
  intervals_epochs: []
  intervals_invalid_times: []
  intervals_trials: [1×1 types.core.TimeIntervals]
  processing: [0×1 types.untyped.Set]
  scratch: [0×1 types.untyped.Set]
  stimulus_presentation: [1×1 types.untyped.Set]
  stimulus_templates: [0×1 types.untyped.Set]
  units: [1×1 types.core.Units]
You can also use util.nwbTree to actively explore the NWB file.
util.nwbTree(nwb);

Stimulus

Now let's take a look at the visual stimuli presented to the subject. They are stored in nwb.stimulus_presentation.
nwb.stimulus_presentation
ans =
Set with properties:

  StimulusPresentation: [types.core.OpticalSeries]
This result shows us that nwb.stimulus_presentation is a Set object containing a single data object called StimulusPresentation, which is of the OpticalSeries neurodata type. Set objects store a collection of other NWB objects; use the get method to retrieve this OpticalSeries.
nwb.stimulus_presentation.get('StimulusPresentation')
ans =
OpticalSeries with properties:

  distance: 0.7000
  field_of_view: [1×1 types.untyped.DataStub]
  orientation: 'lower left'
  dimension: [1×1 types.untyped.DataStub]
  external_file: ''
  external_file_starting_frame: []
  format: 'raw'
  starting_time_unit: 'seconds'
  timestamps_interval: 1
  timestamps_unit: 'seconds'
  data: [1×1 types.untyped.DataStub]
  comments: 'no comments'
  control: []
  control_description: ''
  data_conversion: 1
  data_resolution: -1
  data_unit: 'meters'
  description: 'no description'
  starting_time: []
  starting_time_rate: []
  timestamps: [1×1 types.untyped.DataStub]
OpticalSeries is a neurodata type that stores information about visual stimuli presented to subjects. This printout shows all of the attributes of the OpticalSeries object named StimulusPresentation. The images are stored in StimulusPresentation.data.
StimulusImageData = nwb.stimulus_presentation.get('StimulusPresentation').data
StimulusImageData =
DataStub with properties:

  filename: 'sub-P11HMH_ses-20061101_ecephys+image.nwb'
  path: '/stimulus/presentation/StimulusPresentation/data'
  dims: [3 300 400 200]
  ndims: 4
  dataType: 'uint8'
When accessing a dataset directly, the data is not read; instead, a DataStub is returned, because data is read "lazily" in MatNWB. Rather than reading the entire dataset into memory, the DataStub provides a "window" into the data stored on disk that allows you to read only a section of it. In this case, the last dimension indexes over images. You can index into a DataStub as you would any MATLAB matrix.
% get the image and display it
% the dimension order is provided as follows:
% [rgb, y, x, image index]
img = StimulusImageData(1:3, 1:300, 1:400, 32);
A bit of manipulation allows us to display the image using MATLAB's imshow.
img = permute(img,[3, 2, 1]); % fix orientation
img = flip(img, 3); % reverse color order
F = figure();
imshow(img, 'InitialMagnification', 'fit');
daspect([3, 5, 5]);
To read an entire dataset, use the DataStub.load method without any input arguments. We will use this approach to read all of the image display timestamps into memory.
stimulus_times = nwb.stimulus_presentation.get('StimulusPresentation').timestamps.load();

Quick PSTH and raster

Here, we will pull out the spike times of a particular unit, align them to the image display times, and finally display the results.
First, let us show the first row of the NWB Units table representing the first unit.
nwb.units.getRow(1)
ans = 1×8 table
    | origClusterID | waveform_mean_encoding | waveform_mean_recognition | IsolationDist | SNR    | waveform_mean_sampling_rate | spike_times  | electrodes
  1 | 1102          | 256×1 double           | 256×1 double              | 11.2917       | 1.4407 | 98400                       | 373×1 double | 0
Let us specify some parameters for creating a cell array of spike times aligned to each stimulus time.
%% Align spikes by stimulus presentations
 
unit_ind = 8;
before = 1;
after = 3;
getRow provides a convenient method for reading this data out.
unit_spikes = nwb.units.getRow(unit_ind, 'columns', {'spike_times'}).spike_times{1}
unit_spikes = 2116×1
10³ ×

   5.9338
   5.9343
   5.9346
   5.9358
   5.9364
   5.9375
   6.0772
   6.0776
   6.0797
   6.0798
Spike times from this unit are aligned to each stimulus time and compiled into a cell array.
results = cell(1, length(stimulus_times));
for itime = 1:length(stimulus_times)
    stimulus_time = stimulus_times(itime);
    spikes = unit_spikes - stimulus_time;
    spikes = spikes(spikes > -before);
    spikes = spikes(spikes < after);
    results{itime} = spikes;
end
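The explicit loop above can also be written more compactly with arrayfun. This is an equivalent sketch (not part of the original tutorial) that produces the same per-trial spike times:

```matlab
% Equivalent alignment using arrayfun (a sketch, not from the original tutorial).
% 'UniformOutput', false returns a cell array; note its orientation follows the
% shape of stimulus_times, so it may be a column instead of a row.
results = arrayfun( ...
    @(t) unit_spikes(unit_spikes > t - before & unit_spikes < t + after) - t, ...
    stimulus_times, 'UniformOutput', false);
```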

Plot results

Finally, here is a (slightly sloppy) raster plot and peri-stimulus time histogram.
figure();
hold on
for i = 1:length(results)
    spikes = results{i};
    yy = ones(length(spikes), 1) * i;
    plot(spikes, yy, 'k.');
end
hold off
ylabel('trial');
xlabel('time (s)');
axis('tight')
figure();
all_spikes = cat(1, results{:});
histogram(all_spikes, 30);
ylabel('count')
xlabel('time (s)');
axis('tight')
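To express the histogram as a firing rate rather than raw counts, divide each bin count by the bin width times the number of trials. A minimal sketch building on the variables defined above:

```matlab
% Convert the pooled histogram into a per-trial firing rate (a sketch, not part
% of the original tutorial).
numBins = 30;
edges = linspace(-before, after, numBins + 1);
counts = histcounts(all_spikes, edges);       % pooled spike counts per bin
binWidth = edges(2) - edges(1);
rate = counts / (binWidth * length(results)); % spikes per second, per trial

figure();
bar(edges(1:end-1) + binWidth/2, rate, 1);
ylabel('firing rate (Hz)');
xlabel('time (s)');
```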

Conclusion

This is an example of how to get started with understanding and analyzing public NWB datasets. This particular dataset was published with an extensive open analysis conducted in both MATLAB and Python, which you can find here. For more datasets, or to publish your own NWB data for free, check out the DANDI archive here. Also, make sure to check out the DANDI breakout session later in this event.

Remote read of NWB files

It is possible to read an NWB file (or any HDF5 file) in MATLAB directly from several kinds of remote locations, including AWS, Azure Blob Storage, and HDFS. This tutorial will walk you through loading an NWB file from AWS S3, which is the storage used by the DANDI archive. See the MATLAB documentation for more general information.
To read an NWB file from an S3 store, first you need to find the S3 path of that resource. The easiest way to do this is to use the DANDI web client.
s3 = 's3://dandiarchive/blobs/7ee/415/7ee41580-9b0b-44ca-8675-6959ddd8dc33';
nwbfile = nwbRead(s3);
That's it! MATLAB will automatically detect that this is an S3 path rather than a local file path and will set up a remote read interface for that NWB file. This approach works on any computer with a fairly recent version of MATLAB and an internet connection. It works particularly well on the DANDI Hub, which has a very fast connection to the DANDI S3 store and which provides a MATLAB environment for free, provided you have a license.
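Lazy reading works over S3 just as it does locally, so you can transfer only a small slice of a large dataset instead of downloading the whole file. A hedged sketch (the 'StimulusPresentation' object is illustrative; substitute a data object that actually exists in the file you opened):

```matlab
% Sketch: index into a DataStub of the remotely opened file to transfer only
% the requested elements over the network. The object name below is an
% assumption for illustration, not guaranteed to exist in this dandiset.
Stimulus = nwbfile.stimulus_presentation.get('StimulusPresentation');
firstTimestamps = Stimulus.timestamps(1:10); % reads just ten values remotely
```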

Note: MATLAB vs. Python remote read

Python also allows you to read a file remotely, and it has several advantages over MATLAB here. Reading in Python is faster: on DANDI Hub, reading this file takes about 51 seconds in MATLAB, while the analogous operation takes less than a second in Python. Python also allows you to create a local cache so you are not repeatedly requesting the same data, which can further speed up data access. Overall, we recommend remote reading using Python instead of MATLAB.

Scratch Data

This tutorial will focus on the basics of working with an NWBFile for storing non-standardizable data. For example, you may want to store results from one-off analyses that are only of temporary utility. NWB provides an in-file scratch space as a dedicated location where such miscellaneous, non-standard data may be written.
Table of Contents

Setup

Let us first set up an environment with some "acquired data".
ContextFile = NwbFile(...
'session_description', 'demonstrate NWBFile scratch', ... % required
'identifier', 'SCRATCH-0', ... % required
'session_start_time', datetime(2019, 4, 3, 11, 0, 0, 'TimeZone', 'local'), ... % required
'file_create_date', datetime(2019, 4, 15, 12, 0, 0, 'TimeZone', 'local'), ... % optional
'general_experimenter', 'Niu, Lawrence', ...
'general_institution', 'NWB' ...
);
% simulate some data
timestamps = 0:100:1024;
data = sin(0.333 .* timestamps) ...
+ cos(0.1 .* timestamps) ...
+ randn(1, length(timestamps));
RawTs = types.core.TimeSeries(...
'data', data, ...
'data_unit', 'm', ...
'starting_time', 0., ...
'starting_time_rate', 100, ...
'description', 'simulated acquired data' ...
);
ContextFile.acquisition.set('raw_timeseries', RawTs);
 
% "analyze" the simulated data
% we provide a re-implementation of scipy.signal.correlate(..., mode='same')
% Ideally, you should use MATLAB-native code though using its equivalent function (xcorr) requires
% the Signal Processing Toolbox
correlatedData = sameCorr(RawTs.data, ones(128, 1)) ./ 128;
% If you are unsure of how HDF5 paths map to MatNWB property structures, we suggest using HDFView to
% verify. In most cases, MatNWB properties map directly to HDF5 paths.
FilteredTs = types.core.TimeSeries( ...
'data', correlatedData, ...
'data_unit', 'm', ...
'starting_time', 0, ...
'starting_time_rate', 100, ...
'description', 'cross-correlated data' ...
)
FilteredTs =
TimeSeries with properties:

  starting_time_unit: 'seconds'
  timestamps_interval: 1
  timestamps_unit: 'seconds'
  data: [0.0461 0.0461 0.0461 0.0461 0.0461 0.0461 0.0461 0.0461 0.0461 0.0461 0.0461]
  comments: 'no comments'
  control: []
  control_description: ''
  data_continuity: ''
  data_conversion: 1
  data_offset: 0
  data_resolution: -1
  data_unit: 'm'
  description: 'cross-correlated data'
  starting_time: 0
  starting_time_rate: 100
  timestamps: []
ProcModule = types.core.ProcessingModule( ...
'description', 'a module to store filtering results', ...
'filtered_timeseries', FilteredTs ...
);
ContextFile.processing.set('core', ProcModule);
nwbExport(ContextFile, 'context_file.nwb');

Warning Regarding the Usage of Scratch Space

Data written into the scratch space is not intended for reuse or sharing. Standard NWB types, along with any extensions, should always be used for any data intended to be shared. Published data should not include scratch data, and any reuse should not require scratch data for data processing.

Writing Data to Scratch Space

Let us first copy what we need from the processed data file.
ScratchFile = NwbFile('identifier', 'SCRATCH-1');
ContextFile = nwbRead('./context_file.nwb', 'ignorecache');
% again, copy the required metadata from the processed file.
ScratchFile.session_description = ContextFile.session_description;
ScratchFile.session_start_time = ContextFile.session_start_time;
We can now perform an analysis that lacks a specification but whose results we still wish to store.
% ProcessingModule stores its TimeSeries objects inside the "nwbdatainterface" property, which is
% a Set of NWBDataInterface objects. This property is not directly mapped in the NWB file; it is
% used to distinguish these objects from DynamicTable objects, which are stored under the
% "dynamictable" property.
FilteredTs = ContextFile.processing.get('core').nwbdatainterface.get('filtered_timeseries');
% note: MatNWB does not currently support complex numbers. If you wish to store the data, consider
% storing each number as a struct which will write the data to HDF5 using compound types.
dataFft = real(fft(FilteredTs.data.load()));
ScratchData = types.core.ScratchData( ...
'data', dataFft, ...
'notes', 'discrete Fourier transform from filtered data' ...
)
ScratchData =
ScratchData with properties:

  notes: 'discrete Fourier transform from filtered data'
  data: [11×1 double]
ScratchFile.scratch.set('dft_filtered', ScratchData);
nwbExport(ScratchFile, 'scratch_analysis.nwb');
The scratch_analysis.nwb file will now have scratch data stored in it:
scratch_filtered.png
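To confirm the round trip, you can read the scratch data back out of the exported file; this sketch mirrors the access pattern used for processing modules above:

```matlab
% Read the scratch data back from the exported file (a sketch, not part of the
% original tutorial). Scratch entries live in the "scratch" Set of the file.
ScratchRead = nwbRead('scratch_analysis.nwb', 'ignorecache');
dftData = ScratchRead.scratch.get('dft_filtered').data.load();
disp(ScratchRead.scratch.get('dft_filtered').notes);
```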
function C = sameCorr(A, B)
% SAMECORR scipy.signal.correlate(..., mode="same") equivalent
for iDim = 1:ndims(B)
B = flip(B, iDim);
end
C = conv(A, conj(B), 'same');
end
\ No newline at end of file diff --git a/docs/source/conf.py b/docs/source/conf.py new file mode 100644 index 00000000..b14ade56 --- /dev/null +++ b/docs/source/conf.py @@ -0,0 +1,69 @@ +# Configuration file for the Sphinx documentation builder. +# +# For the full list of built-in configuration values, see the documentation: +# https://www.sphinx-doc.org/en/master/usage/configuration.html + +# -- Project information ----------------------------------------------------- +# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information + +import os +import sys + +sys.path.append('sphinx_extensions') +from docstring_processors import process_matlab_docstring + +def setup(app): + app.connect("autodoc-process-docstring", process_matlab_docstring) + +project = 'MatNWB' +copyright = '2024, Neurodata Without Borders' # Todo: compute year +author = 'Neurodata Without Borders' + +release = '2.7.0' # Todo: read from Contents.m + +# -- General configuration --------------------------------------------------- +# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration + +extensions = [ + "sphinx.ext.mathjax", # or other extensions you may need + 'sphinxcontrib.matlab', # generate docs for matlab functions + 'sphinx.ext.autodoc', # autogenerate docs + 'sphinx.ext.napoleon', # for parsing e.g google style parameter docstring + 'sphinx.ext.viewcode', + 'sphinx_copybutton', +] + +# -- Options that are MATLAB specific ---------------------------------------- + +highlight_language = 'matlab' + +primary_domain = "mat" + +# Get the absolute path of the script's directory +script_dir = os.path.dirname(os.path.abspath(__file__)) + +# Compute the absolute path two levels up from the script's directory +matlab_src_dir = os.path.abspath(os.path.join(script_dir, '..', '..')) + +matlab_class_signature = True +matlab_auto_link = "all" +matlab_show_property_default_value = True + +# -- Options for HTML output 
------------------------------------------------- +# https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output + +html_theme = "sphinx_rtd_theme" + +html_static_path = ['_static'] +html_logo = os.path.join(matlab_src_dir, 'logo', 'logo_matnwb_small.png') +html_theme_options = { + # "style_nav_header_background": "#AFD2E8" + "style_nav_header_background": "#000000" + } + # 'navigation_depth': 1, # Adjust the depth as needed + +templates_path = ['_templates'] +exclude_patterns = [] +html_css_files = [ + 'css/custom.css', +] diff --git a/docs/source/index.rst b/docs/source/index.rst new file mode 100644 index 00000000..327d56ad --- /dev/null +++ b/docs/source/index.rst @@ -0,0 +1,28 @@ +MatNWB documentation +==================== + +Add your content using ``reStructuredText`` syntax. See the +`reStructuredText `_ +documentation for details. + +.. toctree:: + :maxdepth: 2 + :caption: Getting Started + + pages/install_users + pages/tutorials/index + pages/overview_citing + +.. toctree:: + :maxdepth: 2 + :caption: MatNWB Documentation + + pages/functions/index + pages/neurodata_types/core/index + +.. toctree:: + :maxdepth: 2 + :caption: For Developers + + pages/developers + pages/developer/documentation/formatting_docstrings diff --git a/docs/source/pages/developer/documentation/formatting_docstrings.rst b/docs/source/pages/developer/documentation/formatting_docstrings.rst new file mode 100644 index 00000000..fe1998a9 --- /dev/null +++ b/docs/source/pages/developer/documentation/formatting_docstrings.rst @@ -0,0 +1,176 @@ +Writing a Properly Documented MATLAB Docstring +============================================== + +A well-documented MATLAB function should be structured to provide all necessary details about its purpose, usage, inputs, outputs, and examples in a clear and consistent format. This guide outlines the key sections and formatting rules to follow when documenting MATLAB functions, using ``NWBREAD`` as an example. + +1. 
Function Summary +------------------- + +Provide a one-line summary of the function's purpose at the very beginning of the docstring. Use uppercase function names followed by a concise description. + +**Example**:: + + % NWBREAD - Read an NWB file. + +2. Syntax Section +------------------ + +Document the different ways the function can be called (function signatures). Include all variations and briefly describe their purpose. Each syntax line should start with a code literal and describe what it does. + +**Example**:: + + % Syntax: + % nwb = NWBREAD(filename) Reads the nwb file at filename and returns an + % NWBFile object representing its contents. + % + % nwb = NWBREAD(filename, flags) Reads the nwb file using optional + % flags controlling the mode for how to read the file. See input + % arguments for a list of available flags. + % + % nwb = NWBREAD(filename, Name, Value) Reads the nwb file using optional + % name-value pairs controlling options for how to read the file. + +3. Input Arguments +------------------- + +Provide a detailed description of all input arguments. Use the following format for each input: +- Start with a ``-`` followed by the argument name. +- Add the argument type in parentheses (e.g., ``(string)``). +- Write a concise description on the same line or in an indented paragraph below. +- For optional or additional parameters, list their sub-arguments as indented items. + +**Example**:: + + % Input Arguments: + % - filename (string) - + % Filepath pointing to an NWB file. + % + % - flags (string) - + % Flag for setting the mode for the NWBREAD operation. Available options are: + % 'ignorecache'. If the 'ignorecache' flag is used, classes for NWB data types + % are not re-generated based on the embedded schemas in the file. + % + % - options (name-value pairs) - + % Optional name-value pairs. Available options: + % + % - savedir (string) - + % A folder to save generated classes for NWB types. + +4. 
Output Arguments +-------------------- + +Document all outputs of the function. Use a similar format as the input arguments: +- Start with a ``-`` followed by the output name. +- Add the output type in parentheses. +- Provide a brief description. + +**Example**:: + + % Output Arguments: + % - nwb (NwbFile) - Nwb file object + +5. Usage Examples +------------------ + +Provide practical examples of how to use the function. Each example should: +- Start with "Example X - Description" and be followed by a colon (``::``). +- Include MATLAB code blocks, indented with spaces. +- Add comments in the code to explain each step if necessary. + +**Example**:: + + % Usage: + % Example 1 - Read an NWB file:: + % + % nwb = nwbRead('data.nwb'); + % + % Example 2 - Read an NWB file without re-generating classes for NWB types:: + % + % nwb = nwbRead('data.nwb', 'ignorecache'); + % + % Note: This is a good option to use if you are reading several files + % which are created of the same version of the NWB schemas. + % + % Example 3 - Read an NWB file and generate classes for NWB types in the current working directory:: + % + % nwb = nwbRead('data.nwb', 'savedir', '.'); + +6. See Also +----------- + +Use the ``See also:`` section to reference related functions or objects. List each item separated by commas and include cross-references if applicable. + +**Example**:: + + % See also: + % generateCore, generateExtension, NwbFile, nwbExport + +7. Formatting Tips +------------------- + +- **Consistent Indentation**: + - Indent descriptions or additional information using two spaces. + +- **Bold Text**: + - Use ``**`` around key elements like argument names in the rendered documentation. + +- **Code Literals**: + - Use double backticks (``) for MATLAB code snippets in descriptions. + +- **Directives**: + - Use Sphinx-compatible directives for linking (``:class:``, ``:func:``, etc.) when writing in RST. + +8. 
Final Example +----------------- + +**Complete Example**:: + + % NWBREAD - Read an NWB file. + % + % Syntax: + % nwb = NWBREAD(filename) Reads the nwb file at filename and returns an + % NWBFile object representing its contents. + % + % nwb = NWBREAD(filename, flags) Reads the nwb file using optional + % flags controlling the mode for how to read the file. See input + % arguments for a list of available flags. + % + % nwb = NWBREAD(filename, Name, Value) Reads the nwb file using optional + % name-value pairs controlling options for how to read the file. + % + % Input Arguments: + % - filename (string) - + % Filepath pointing to an NWB file. + % + % - flags (string) - + % Flag for setting the mode for the NWBREAD operation. Available options are: + % 'ignorecache'. If the 'ignorecache' flag is used, classes for NWB data types + % are not re-generated based on the embedded schemas in the file. + % + % - options (name-value pairs) - + % Optional name-value pairs. Available options: + % + % - savedir (string) - + % A folder to save generated classes for NWB types. + % + % Output Arguments: + % - nwb (NwbFile) - Nwb file object + % + % Usage: + % Example 1 - Read an NWB file:: + % + % nwb = nwbRead('data.nwb'); + % + % Example 2 - Read an NWB file without re-generating classes for NWB types:: + % + % nwb = nwbRead('data.nwb', 'ignorecache'); + % + % Note: This is a good option to use if you are reading several files + % which are created of the same version of the NWB schemas. 
+ % + % Example 3 - Read an NWB file and generate classes for NWB types in the current working directory:: + % + % nwb = nwbRead('data.nwb', 'savedir', '.'); + % + % See also: + % generateCore, generateExtension, NwbFile, nwbExport diff --git a/docs/source/pages/developers.rst b/docs/source/pages/developers.rst new file mode 100644 index 00000000..9515ff98 --- /dev/null +++ b/docs/source/pages/developers.rst @@ -0,0 +1,4 @@ +Developers +============= + +hello \ No newline at end of file diff --git a/docs/source/pages/functions/NwbFile.rst b/docs/source/pages/functions/NwbFile.rst new file mode 100644 index 00000000..2327759d --- /dev/null +++ b/docs/source/pages/functions/NwbFile.rst @@ -0,0 +1,7 @@ +NwbFile +======= + +.. mat:module:: . +.. autoclass:: NwbFile + :members: + :show-inheritance: diff --git a/docs/source/pages/functions/generateCore.rst b/docs/source/pages/functions/generateCore.rst new file mode 100644 index 00000000..33eb05b5 --- /dev/null +++ b/docs/source/pages/functions/generateCore.rst @@ -0,0 +1,5 @@ +generateCore +============ + +.. mat:module:: . +.. autofunction:: generateCore diff --git a/docs/source/pages/functions/generateExtension.rst b/docs/source/pages/functions/generateExtension.rst new file mode 100644 index 00000000..42a93291 --- /dev/null +++ b/docs/source/pages/functions/generateExtension.rst @@ -0,0 +1,5 @@ +generateExtension +================= + +.. mat:module:: . +.. autofunction:: generateExtension diff --git a/docs/source/pages/functions/index.rst b/docs/source/pages/functions/index.rst new file mode 100644 index 00000000..fae4ae4d --- /dev/null +++ b/docs/source/pages/functions/index.rst @@ -0,0 +1,15 @@ +MatNWB Functions +================ + +These are the main functions of the MatNWB API + +.. 
toctree:: + :maxdepth: 2 + :caption: Functions + + nwbRead + NwbFile + nwbExport + generateCore + generateExtension + nwbClearGenerated \ No newline at end of file diff --git a/docs/source/pages/functions/nwbClearGenerated.rst b/docs/source/pages/functions/nwbClearGenerated.rst new file mode 100644 index 00000000..166e00fb --- /dev/null +++ b/docs/source/pages/functions/nwbClearGenerated.rst @@ -0,0 +1,5 @@ +nwbClearGenerated +================= + +.. mat:module:: . +.. autofunction:: nwbClearGenerated diff --git a/docs/source/pages/functions/nwbExport.rst b/docs/source/pages/functions/nwbExport.rst new file mode 100644 index 00000000..ee4e5c0e --- /dev/null +++ b/docs/source/pages/functions/nwbExport.rst @@ -0,0 +1,5 @@ +nwbExport +========= + +.. mat:module:: . +.. autofunction:: nwbExport diff --git a/docs/source/pages/functions/nwbRead.rst b/docs/source/pages/functions/nwbRead.rst new file mode 100644 index 00000000..94d455e5 --- /dev/null +++ b/docs/source/pages/functions/nwbRead.rst @@ -0,0 +1,5 @@ +nwbRead +======= + +.. mat:module:: . +.. autofunction:: nwbRead diff --git a/docs/source/pages/install_users.rst b/docs/source/pages/install_users.rst new file mode 100644 index 00000000..20156dbf --- /dev/null +++ b/docs/source/pages/install_users.rst @@ -0,0 +1,4 @@ +Install MatNWB +============== + +Todo: installation \ No newline at end of file diff --git a/docs/source/pages/neurodata_types/core/AbstractFeatureSeries.rst b/docs/source/pages/neurodata_types/core/AbstractFeatureSeries.rst new file mode 100644 index 00000000..c4eebc7b --- /dev/null +++ b/docs/source/pages/neurodata_types/core/AbstractFeatureSeries.rst @@ -0,0 +1,7 @@ +AbstractFeatureSeries +===================== + +.. mat:module:: types.core +.. 
autoclass:: types.core.AbstractFeatureSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/AnnotationSeries.rst b/docs/source/pages/neurodata_types/core/AnnotationSeries.rst new file mode 100644 index 00000000..7bd34497 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/AnnotationSeries.rst @@ -0,0 +1,7 @@ +AnnotationSeries +================ + +.. mat:module:: types.core +.. autoclass:: types.core.AnnotationSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/BehavioralEpochs.rst b/docs/source/pages/neurodata_types/core/BehavioralEpochs.rst new file mode 100644 index 00000000..a30da28d --- /dev/null +++ b/docs/source/pages/neurodata_types/core/BehavioralEpochs.rst @@ -0,0 +1,7 @@ +BehavioralEpochs +================ + +.. mat:module:: types.core +.. autoclass:: types.core.BehavioralEpochs + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/BehavioralEvents.rst b/docs/source/pages/neurodata_types/core/BehavioralEvents.rst new file mode 100644 index 00000000..73b76ec7 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/BehavioralEvents.rst @@ -0,0 +1,7 @@ +BehavioralEvents +================ + +.. mat:module:: types.core +.. autoclass:: types.core.BehavioralEvents + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/BehavioralTimeSeries.rst b/docs/source/pages/neurodata_types/core/BehavioralTimeSeries.rst new file mode 100644 index 00000000..28729a31 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/BehavioralTimeSeries.rst @@ -0,0 +1,7 @@ +BehavioralTimeSeries +==================== + +.. mat:module:: types.core +.. 
autoclass:: types.core.BehavioralTimeSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/ClusterWaveforms.rst b/docs/source/pages/neurodata_types/core/ClusterWaveforms.rst new file mode 100644 index 00000000..326078d5 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/ClusterWaveforms.rst @@ -0,0 +1,7 @@ +ClusterWaveforms +================ + +.. mat:module:: types.core +.. autoclass:: types.core.ClusterWaveforms + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/Clustering.rst b/docs/source/pages/neurodata_types/core/Clustering.rst new file mode 100644 index 00000000..f4d71e48 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/Clustering.rst @@ -0,0 +1,7 @@ +Clustering +========== + +.. mat:module:: types.core +.. autoclass:: types.core.Clustering + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/CompassDirection.rst b/docs/source/pages/neurodata_types/core/CompassDirection.rst new file mode 100644 index 00000000..741510f6 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/CompassDirection.rst @@ -0,0 +1,7 @@ +CompassDirection +================ + +.. mat:module:: types.core +.. autoclass:: types.core.CompassDirection + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/CorrectedImageStack.rst b/docs/source/pages/neurodata_types/core/CorrectedImageStack.rst new file mode 100644 index 00000000..5aaa228a --- /dev/null +++ b/docs/source/pages/neurodata_types/core/CorrectedImageStack.rst @@ -0,0 +1,7 @@ +CorrectedImageStack +=================== + +.. mat:module:: types.core +.. 
autoclass:: types.core.CorrectedImageStack + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/CurrentClampSeries.rst b/docs/source/pages/neurodata_types/core/CurrentClampSeries.rst new file mode 100644 index 00000000..499c5716 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/CurrentClampSeries.rst @@ -0,0 +1,7 @@ +CurrentClampSeries +================== + +.. mat:module:: types.core +.. autoclass:: types.core.CurrentClampSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/CurrentClampStimulusSeries.rst b/docs/source/pages/neurodata_types/core/CurrentClampStimulusSeries.rst new file mode 100644 index 00000000..5d8c5f1b --- /dev/null +++ b/docs/source/pages/neurodata_types/core/CurrentClampStimulusSeries.rst @@ -0,0 +1,7 @@ +CurrentClampStimulusSeries +========================== + +.. mat:module:: types.core +.. autoclass:: types.core.CurrentClampStimulusSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/DecompositionSeries.rst b/docs/source/pages/neurodata_types/core/DecompositionSeries.rst new file mode 100644 index 00000000..d187f416 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/DecompositionSeries.rst @@ -0,0 +1,7 @@ +DecompositionSeries +=================== + +.. mat:module:: types.core +.. autoclass:: types.core.DecompositionSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/Device.rst b/docs/source/pages/neurodata_types/core/Device.rst new file mode 100644 index 00000000..ce022bed --- /dev/null +++ b/docs/source/pages/neurodata_types/core/Device.rst @@ -0,0 +1,7 @@ +Device +====== + +.. mat:module:: types.core +.. 
autoclass:: types.core.Device + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/DfOverF.rst b/docs/source/pages/neurodata_types/core/DfOverF.rst new file mode 100644 index 00000000..35c40613 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/DfOverF.rst @@ -0,0 +1,7 @@ +DfOverF +======= + +.. mat:module:: types.core +.. autoclass:: types.core.DfOverF + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/ElectricalSeries.rst b/docs/source/pages/neurodata_types/core/ElectricalSeries.rst new file mode 100644 index 00000000..15972a49 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/ElectricalSeries.rst @@ -0,0 +1,7 @@ +ElectricalSeries +================ + +.. mat:module:: types.core +.. autoclass:: types.core.ElectricalSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/ElectrodeGroup.rst b/docs/source/pages/neurodata_types/core/ElectrodeGroup.rst new file mode 100644 index 00000000..4ba91041 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/ElectrodeGroup.rst @@ -0,0 +1,7 @@ +ElectrodeGroup +============== + +.. mat:module:: types.core +.. autoclass:: types.core.ElectrodeGroup + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/EventDetection.rst b/docs/source/pages/neurodata_types/core/EventDetection.rst new file mode 100644 index 00000000..5633b916 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/EventDetection.rst @@ -0,0 +1,7 @@ +EventDetection +============== + +.. mat:module:: types.core +.. autoclass:: types.core.EventDetection + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/EventWaveform.rst b/docs/source/pages/neurodata_types/core/EventWaveform.rst new file mode 100644 index 00000000..3550c82a --- /dev/null +++ b/docs/source/pages/neurodata_types/core/EventWaveform.rst @@ -0,0 +1,7 @@ +EventWaveform +============= + +.. mat:module:: types.core +.. 
autoclass:: types.core.EventWaveform + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/ExperimentalConditionsTable.rst b/docs/source/pages/neurodata_types/core/ExperimentalConditionsTable.rst new file mode 100644 index 00000000..ad31451b --- /dev/null +++ b/docs/source/pages/neurodata_types/core/ExperimentalConditionsTable.rst @@ -0,0 +1,7 @@ +ExperimentalConditionsTable +=========================== + +.. mat:module:: types.core +.. autoclass:: types.core.ExperimentalConditionsTable + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/EyeTracking.rst b/docs/source/pages/neurodata_types/core/EyeTracking.rst new file mode 100644 index 00000000..124d9ebc --- /dev/null +++ b/docs/source/pages/neurodata_types/core/EyeTracking.rst @@ -0,0 +1,7 @@ +EyeTracking +=========== + +.. mat:module:: types.core +.. autoclass:: types.core.EyeTracking + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/FeatureExtraction.rst b/docs/source/pages/neurodata_types/core/FeatureExtraction.rst new file mode 100644 index 00000000..31c673cd --- /dev/null +++ b/docs/source/pages/neurodata_types/core/FeatureExtraction.rst @@ -0,0 +1,7 @@ +FeatureExtraction +================= + +.. mat:module:: types.core +.. autoclass:: types.core.FeatureExtraction + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/FilteredEphys.rst b/docs/source/pages/neurodata_types/core/FilteredEphys.rst new file mode 100644 index 00000000..380d26d2 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/FilteredEphys.rst @@ -0,0 +1,7 @@ +FilteredEphys +============= + +.. mat:module:: types.core +.. 
autoclass:: types.core.FilteredEphys + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/Fluorescence.rst b/docs/source/pages/neurodata_types/core/Fluorescence.rst new file mode 100644 index 00000000..98568955 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/Fluorescence.rst @@ -0,0 +1,7 @@ +Fluorescence +============ + +.. mat:module:: types.core +.. autoclass:: types.core.Fluorescence + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/GrayscaleImage.rst b/docs/source/pages/neurodata_types/core/GrayscaleImage.rst new file mode 100644 index 00000000..19ea6a07 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/GrayscaleImage.rst @@ -0,0 +1,7 @@ +GrayscaleImage +============== + +.. mat:module:: types.core +.. autoclass:: types.core.GrayscaleImage + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/IZeroClampSeries.rst b/docs/source/pages/neurodata_types/core/IZeroClampSeries.rst new file mode 100644 index 00000000..933c36bd --- /dev/null +++ b/docs/source/pages/neurodata_types/core/IZeroClampSeries.rst @@ -0,0 +1,7 @@ +IZeroClampSeries +================ + +.. mat:module:: types.core +.. autoclass:: types.core.IZeroClampSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/Image.rst b/docs/source/pages/neurodata_types/core/Image.rst new file mode 100644 index 00000000..f68e19c8 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/Image.rst @@ -0,0 +1,7 @@ +Image +===== + +.. mat:module:: types.core +.. autoclass:: types.core.Image + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/ImageMaskSeries.rst b/docs/source/pages/neurodata_types/core/ImageMaskSeries.rst new file mode 100644 index 00000000..a64fce4c --- /dev/null +++ b/docs/source/pages/neurodata_types/core/ImageMaskSeries.rst @@ -0,0 +1,7 @@ +ImageMaskSeries +=============== + +.. mat:module:: types.core +.. 
autoclass:: types.core.ImageMaskSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/ImageReferences.rst b/docs/source/pages/neurodata_types/core/ImageReferences.rst new file mode 100644 index 00000000..f25c1dde --- /dev/null +++ b/docs/source/pages/neurodata_types/core/ImageReferences.rst @@ -0,0 +1,7 @@ +ImageReferences +=============== + +.. mat:module:: types.core +.. autoclass:: types.core.ImageReferences + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/ImageSegmentation.rst b/docs/source/pages/neurodata_types/core/ImageSegmentation.rst new file mode 100644 index 00000000..dbd7bf12 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/ImageSegmentation.rst @@ -0,0 +1,7 @@ +ImageSegmentation +================= + +.. mat:module:: types.core +.. autoclass:: types.core.ImageSegmentation + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/ImageSeries.rst b/docs/source/pages/neurodata_types/core/ImageSeries.rst new file mode 100644 index 00000000..2ca8ddaa --- /dev/null +++ b/docs/source/pages/neurodata_types/core/ImageSeries.rst @@ -0,0 +1,7 @@ +ImageSeries +=========== + +.. mat:module:: types.core +.. autoclass:: types.core.ImageSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/Images.rst b/docs/source/pages/neurodata_types/core/Images.rst new file mode 100644 index 00000000..59aa8822 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/Images.rst @@ -0,0 +1,7 @@ +Images +====== + +.. mat:module:: types.core +.. autoclass:: types.core.Images + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/ImagingPlane.rst b/docs/source/pages/neurodata_types/core/ImagingPlane.rst new file mode 100644 index 00000000..513c21db --- /dev/null +++ b/docs/source/pages/neurodata_types/core/ImagingPlane.rst @@ -0,0 +1,7 @@ +ImagingPlane +============ + +.. mat:module:: types.core +.. 
autoclass:: types.core.ImagingPlane + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/ImagingRetinotopy.rst b/docs/source/pages/neurodata_types/core/ImagingRetinotopy.rst new file mode 100644 index 00000000..47edb9b3 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/ImagingRetinotopy.rst @@ -0,0 +1,7 @@ +ImagingRetinotopy +================= + +.. mat:module:: types.core +.. autoclass:: types.core.ImagingRetinotopy + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/IndexSeries.rst b/docs/source/pages/neurodata_types/core/IndexSeries.rst new file mode 100644 index 00000000..5b2a7f40 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/IndexSeries.rst @@ -0,0 +1,7 @@ +IndexSeries +=========== + +.. mat:module:: types.core +.. autoclass:: types.core.IndexSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/IntervalSeries.rst b/docs/source/pages/neurodata_types/core/IntervalSeries.rst new file mode 100644 index 00000000..d316f0fe --- /dev/null +++ b/docs/source/pages/neurodata_types/core/IntervalSeries.rst @@ -0,0 +1,7 @@ +IntervalSeries +============== + +.. mat:module:: types.core +.. autoclass:: types.core.IntervalSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/IntracellularElectrode.rst b/docs/source/pages/neurodata_types/core/IntracellularElectrode.rst new file mode 100644 index 00000000..ab0f254b --- /dev/null +++ b/docs/source/pages/neurodata_types/core/IntracellularElectrode.rst @@ -0,0 +1,7 @@ +IntracellularElectrode +====================== + +.. mat:module:: types.core +.. 
autoclass:: types.core.IntracellularElectrode + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/IntracellularElectrodesTable.rst b/docs/source/pages/neurodata_types/core/IntracellularElectrodesTable.rst new file mode 100644 index 00000000..d1903fcc --- /dev/null +++ b/docs/source/pages/neurodata_types/core/IntracellularElectrodesTable.rst @@ -0,0 +1,7 @@ +IntracellularElectrodesTable +============================ + +.. mat:module:: types.core +.. autoclass:: types.core.IntracellularElectrodesTable + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/IntracellularRecordingsTable.rst b/docs/source/pages/neurodata_types/core/IntracellularRecordingsTable.rst new file mode 100644 index 00000000..83af79be --- /dev/null +++ b/docs/source/pages/neurodata_types/core/IntracellularRecordingsTable.rst @@ -0,0 +1,7 @@ +IntracellularRecordingsTable +============================ + +.. mat:module:: types.core +.. autoclass:: types.core.IntracellularRecordingsTable + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/IntracellularResponsesTable.rst b/docs/source/pages/neurodata_types/core/IntracellularResponsesTable.rst new file mode 100644 index 00000000..e1be1bc1 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/IntracellularResponsesTable.rst @@ -0,0 +1,7 @@ +IntracellularResponsesTable +=========================== + +.. mat:module:: types.core +.. autoclass:: types.core.IntracellularResponsesTable + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/IntracellularStimuliTable.rst b/docs/source/pages/neurodata_types/core/IntracellularStimuliTable.rst new file mode 100644 index 00000000..b01d0116 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/IntracellularStimuliTable.rst @@ -0,0 +1,7 @@ +IntracellularStimuliTable +========================= + +.. mat:module:: types.core +.. 
autoclass:: types.core.IntracellularStimuliTable + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/LFP.rst b/docs/source/pages/neurodata_types/core/LFP.rst new file mode 100644 index 00000000..62c80788 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/LFP.rst @@ -0,0 +1,7 @@ +LFP +=== + +.. mat:module:: types.core +.. autoclass:: types.core.LFP + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/LabMetaData.rst b/docs/source/pages/neurodata_types/core/LabMetaData.rst new file mode 100644 index 00000000..7dd71b67 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/LabMetaData.rst @@ -0,0 +1,7 @@ +LabMetaData +=========== + +.. mat:module:: types.core +.. autoclass:: types.core.LabMetaData + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/MotionCorrection.rst b/docs/source/pages/neurodata_types/core/MotionCorrection.rst new file mode 100644 index 00000000..4aa78836 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/MotionCorrection.rst @@ -0,0 +1,7 @@ +MotionCorrection +================ + +.. mat:module:: types.core +.. autoclass:: types.core.MotionCorrection + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/NWBContainer.rst b/docs/source/pages/neurodata_types/core/NWBContainer.rst new file mode 100644 index 00000000..4761b9c2 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/NWBContainer.rst @@ -0,0 +1,7 @@ +NWBContainer +============ + +.. mat:module:: types.core +.. autoclass:: types.core.NWBContainer + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/NWBData.rst b/docs/source/pages/neurodata_types/core/NWBData.rst new file mode 100644 index 00000000..c376686e --- /dev/null +++ b/docs/source/pages/neurodata_types/core/NWBData.rst @@ -0,0 +1,7 @@ +NWBData +======= + +.. mat:module:: types.core +.. 
autoclass:: types.core.NWBData + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/NWBDataInterface.rst b/docs/source/pages/neurodata_types/core/NWBDataInterface.rst new file mode 100644 index 00000000..dd29dca1 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/NWBDataInterface.rst @@ -0,0 +1,7 @@ +NWBDataInterface +================ + +.. mat:module:: types.core +.. autoclass:: types.core.NWBDataInterface + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/NWBFile.rst b/docs/source/pages/neurodata_types/core/NWBFile.rst new file mode 100644 index 00000000..9f390e5e --- /dev/null +++ b/docs/source/pages/neurodata_types/core/NWBFile.rst @@ -0,0 +1,7 @@ +NWBFile +======= + +.. mat:module:: types.core +.. autoclass:: types.core.NWBFile + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/OnePhotonSeries.rst b/docs/source/pages/neurodata_types/core/OnePhotonSeries.rst new file mode 100644 index 00000000..93695fcd --- /dev/null +++ b/docs/source/pages/neurodata_types/core/OnePhotonSeries.rst @@ -0,0 +1,7 @@ +OnePhotonSeries +=============== + +.. mat:module:: types.core +.. autoclass:: types.core.OnePhotonSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/OpticalChannel.rst b/docs/source/pages/neurodata_types/core/OpticalChannel.rst new file mode 100644 index 00000000..7f3e711a --- /dev/null +++ b/docs/source/pages/neurodata_types/core/OpticalChannel.rst @@ -0,0 +1,7 @@ +OpticalChannel +============== + +.. mat:module:: types.core +.. autoclass:: types.core.OpticalChannel + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/OpticalSeries.rst b/docs/source/pages/neurodata_types/core/OpticalSeries.rst new file mode 100644 index 00000000..a9fc655f --- /dev/null +++ b/docs/source/pages/neurodata_types/core/OpticalSeries.rst @@ -0,0 +1,7 @@ +OpticalSeries +============= + +.. 
mat:module:: types.core +.. autoclass:: types.core.OpticalSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/OptogeneticSeries.rst b/docs/source/pages/neurodata_types/core/OptogeneticSeries.rst new file mode 100644 index 00000000..33071d10 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/OptogeneticSeries.rst @@ -0,0 +1,7 @@ +OptogeneticSeries +================= + +.. mat:module:: types.core +.. autoclass:: types.core.OptogeneticSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/OptogeneticStimulusSite.rst b/docs/source/pages/neurodata_types/core/OptogeneticStimulusSite.rst new file mode 100644 index 00000000..bb8b9ace --- /dev/null +++ b/docs/source/pages/neurodata_types/core/OptogeneticStimulusSite.rst @@ -0,0 +1,7 @@ +OptogeneticStimulusSite +======================= + +.. mat:module:: types.core +.. autoclass:: types.core.OptogeneticStimulusSite + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/PatchClampSeries.rst b/docs/source/pages/neurodata_types/core/PatchClampSeries.rst new file mode 100644 index 00000000..67d1bbc9 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/PatchClampSeries.rst @@ -0,0 +1,7 @@ +PatchClampSeries +================ + +.. mat:module:: types.core +.. autoclass:: types.core.PatchClampSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/PlaneSegmentation.rst b/docs/source/pages/neurodata_types/core/PlaneSegmentation.rst new file mode 100644 index 00000000..fd984fc6 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/PlaneSegmentation.rst @@ -0,0 +1,7 @@ +PlaneSegmentation +================= + +.. mat:module:: types.core +.. 
autoclass:: types.core.PlaneSegmentation + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/Position.rst b/docs/source/pages/neurodata_types/core/Position.rst new file mode 100644 index 00000000..0b49fec6 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/Position.rst @@ -0,0 +1,7 @@ +Position +======== + +.. mat:module:: types.core +.. autoclass:: types.core.Position + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/ProcessingModule.rst b/docs/source/pages/neurodata_types/core/ProcessingModule.rst new file mode 100644 index 00000000..55bbc315 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/ProcessingModule.rst @@ -0,0 +1,7 @@ +ProcessingModule +================ + +.. mat:module:: types.core +.. autoclass:: types.core.ProcessingModule + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/PupilTracking.rst b/docs/source/pages/neurodata_types/core/PupilTracking.rst new file mode 100644 index 00000000..e45a9981 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/PupilTracking.rst @@ -0,0 +1,7 @@ +PupilTracking +============= + +.. mat:module:: types.core +.. autoclass:: types.core.PupilTracking + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/RGBAImage.rst b/docs/source/pages/neurodata_types/core/RGBAImage.rst new file mode 100644 index 00000000..420307c2 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/RGBAImage.rst @@ -0,0 +1,7 @@ +RGBAImage +========= + +.. mat:module:: types.core +.. autoclass:: types.core.RGBAImage + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/RGBImage.rst b/docs/source/pages/neurodata_types/core/RGBImage.rst new file mode 100644 index 00000000..3dfc9870 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/RGBImage.rst @@ -0,0 +1,7 @@ +RGBImage +======== + +.. mat:module:: types.core +.. 
autoclass:: types.core.RGBImage + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/RepetitionsTable.rst b/docs/source/pages/neurodata_types/core/RepetitionsTable.rst new file mode 100644 index 00000000..71c33b5a --- /dev/null +++ b/docs/source/pages/neurodata_types/core/RepetitionsTable.rst @@ -0,0 +1,7 @@ +RepetitionsTable +================ + +.. mat:module:: types.core +.. autoclass:: types.core.RepetitionsTable + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/RoiResponseSeries.rst b/docs/source/pages/neurodata_types/core/RoiResponseSeries.rst new file mode 100644 index 00000000..c48db5d9 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/RoiResponseSeries.rst @@ -0,0 +1,7 @@ +RoiResponseSeries +================= + +.. mat:module:: types.core +.. autoclass:: types.core.RoiResponseSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/ScratchData.rst b/docs/source/pages/neurodata_types/core/ScratchData.rst new file mode 100644 index 00000000..d121abdd --- /dev/null +++ b/docs/source/pages/neurodata_types/core/ScratchData.rst @@ -0,0 +1,7 @@ +ScratchData +=========== + +.. mat:module:: types.core +.. autoclass:: types.core.ScratchData + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/SequentialRecordingsTable.rst b/docs/source/pages/neurodata_types/core/SequentialRecordingsTable.rst new file mode 100644 index 00000000..8a343944 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/SequentialRecordingsTable.rst @@ -0,0 +1,7 @@ +SequentialRecordingsTable +========================= + +.. mat:module:: types.core +.. 
autoclass:: types.core.SequentialRecordingsTable + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/SimultaneousRecordingsTable.rst b/docs/source/pages/neurodata_types/core/SimultaneousRecordingsTable.rst new file mode 100644 index 00000000..0598bcb6 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/SimultaneousRecordingsTable.rst @@ -0,0 +1,7 @@ +SimultaneousRecordingsTable +=========================== + +.. mat:module:: types.core +.. autoclass:: types.core.SimultaneousRecordingsTable + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/SpatialSeries.rst b/docs/source/pages/neurodata_types/core/SpatialSeries.rst new file mode 100644 index 00000000..662c9c46 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/SpatialSeries.rst @@ -0,0 +1,7 @@ +SpatialSeries +============= + +.. mat:module:: types.core +.. autoclass:: types.core.SpatialSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/SpikeEventSeries.rst b/docs/source/pages/neurodata_types/core/SpikeEventSeries.rst new file mode 100644 index 00000000..4cb93f2d --- /dev/null +++ b/docs/source/pages/neurodata_types/core/SpikeEventSeries.rst @@ -0,0 +1,7 @@ +SpikeEventSeries +================ + +.. mat:module:: types.core +.. autoclass:: types.core.SpikeEventSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/Subject.rst b/docs/source/pages/neurodata_types/core/Subject.rst new file mode 100644 index 00000000..a227a712 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/Subject.rst @@ -0,0 +1,7 @@ +Subject +======= + +.. mat:module:: types.core +.. 
autoclass:: types.core.Subject + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/SweepTable.rst b/docs/source/pages/neurodata_types/core/SweepTable.rst new file mode 100644 index 00000000..0088de4f --- /dev/null +++ b/docs/source/pages/neurodata_types/core/SweepTable.rst @@ -0,0 +1,7 @@ +SweepTable +========== + +.. mat:module:: types.core +.. autoclass:: types.core.SweepTable + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/TimeIntervals.rst b/docs/source/pages/neurodata_types/core/TimeIntervals.rst new file mode 100644 index 00000000..7385bf7f --- /dev/null +++ b/docs/source/pages/neurodata_types/core/TimeIntervals.rst @@ -0,0 +1,7 @@ +TimeIntervals +============= + +.. mat:module:: types.core +.. autoclass:: types.core.TimeIntervals + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/TimeSeries.rst b/docs/source/pages/neurodata_types/core/TimeSeries.rst new file mode 100644 index 00000000..fbc71723 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/TimeSeries.rst @@ -0,0 +1,7 @@ +TimeSeries +========== + +.. mat:module:: types.core +.. autoclass:: types.core.TimeSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/TimeSeriesReferenceVectorData.rst b/docs/source/pages/neurodata_types/core/TimeSeriesReferenceVectorData.rst new file mode 100644 index 00000000..93594259 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/TimeSeriesReferenceVectorData.rst @@ -0,0 +1,7 @@ +TimeSeriesReferenceVectorData +============================= + +.. mat:module:: types.core +.. 
autoclass:: types.core.TimeSeriesReferenceVectorData + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/TwoPhotonSeries.rst b/docs/source/pages/neurodata_types/core/TwoPhotonSeries.rst new file mode 100644 index 00000000..e71db63d --- /dev/null +++ b/docs/source/pages/neurodata_types/core/TwoPhotonSeries.rst @@ -0,0 +1,7 @@ +TwoPhotonSeries +=============== + +.. mat:module:: types.core +.. autoclass:: types.core.TwoPhotonSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/Units.rst b/docs/source/pages/neurodata_types/core/Units.rst new file mode 100644 index 00000000..54a4a71f --- /dev/null +++ b/docs/source/pages/neurodata_types/core/Units.rst @@ -0,0 +1,7 @@ +Units +===== + +.. mat:module:: types.core +.. autoclass:: types.core.Units + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/VoltageClampSeries.rst b/docs/source/pages/neurodata_types/core/VoltageClampSeries.rst new file mode 100644 index 00000000..b4ede53f --- /dev/null +++ b/docs/source/pages/neurodata_types/core/VoltageClampSeries.rst @@ -0,0 +1,7 @@ +VoltageClampSeries +================== + +.. mat:module:: types.core +.. autoclass:: types.core.VoltageClampSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/VoltageClampStimulusSeries.rst b/docs/source/pages/neurodata_types/core/VoltageClampStimulusSeries.rst new file mode 100644 index 00000000..d1cc01a3 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/VoltageClampStimulusSeries.rst @@ -0,0 +1,7 @@ +VoltageClampStimulusSeries +========================== + +.. mat:module:: types.core +.. 
autoclass:: types.core.VoltageClampStimulusSeries + :members: + :show-inheritance: diff --git a/docs/source/pages/neurodata_types/core/index.rst b/docs/source/pages/neurodata_types/core/index.rst new file mode 100644 index 00000000..34437913 --- /dev/null +++ b/docs/source/pages/neurodata_types/core/index.rst @@ -0,0 +1,84 @@ +Neurodata Types +=============== + +These are the MatNWB neurodata types from the core schema specification. + +.. toctree:: + :maxdepth: 2 + :caption: Functions + + AbstractFeatureSeries + AnnotationSeries + BehavioralEpochs + BehavioralEvents + BehavioralTimeSeries + ClusterWaveforms + Clustering + CompassDirection + CorrectedImageStack + CurrentClampSeries + CurrentClampStimulusSeries + DecompositionSeries + Device + DfOverF + ElectricalSeries + ElectrodeGroup + EventDetection + EventWaveform + ExperimentalConditionsTable + EyeTracking + FeatureExtraction + FilteredEphys + Fluorescence + GrayscaleImage + IZeroClampSeries + Image + ImageMaskSeries + ImageReferences + ImageSegmentation + ImageSeries + Images + ImagingPlane + ImagingRetinotopy + IndexSeries + IntervalSeries + IntracellularElectrode + IntracellularElectrodesTable + IntracellularRecordingsTable + IntracellularResponsesTable + IntracellularStimuliTable + LFP + LabMetaData + MotionCorrection + NWBContainer + NWBData + NWBDataInterface + NWBFile + OnePhotonSeries + OpticalChannel + OpticalSeries + OptogeneticSeries + OptogeneticStimulusSite + PatchClampSeries + PlaneSegmentation + Position + ProcessingModule + PupilTracking + RGBAImage + RGBImage + RepetitionsTable + RoiResponseSeries + ScratchData + SequentialRecordingsTable + SimultaneousRecordingsTable + SpatialSeries + SpikeEventSeries + Subject + SweepTable + TimeIntervals + TimeSeries + TimeSeriesReferenceVectorData + TwoPhotonSeries + Units + VoltageClampSeries + VoltageClampStimulusSeries \ No newline at end of file diff --git a/docs/source/pages/overview_citing.rst b/docs/source/pages/overview_citing.rst new file mode 
100644 index 00000000..7c67f175 --- /dev/null +++ b/docs/source/pages/overview_citing.rst @@ -0,0 +1,30 @@ +Citing MatNWB +============= + +BibTeX entry +------------ + +If you use MatNWB in your research, please use the following citation: + +.. code-block:: bibtex + + @article {10.7554/eLife.78362, + article_type = {journal}, + title = {{The Neurodata Without Borders ecosystem for neurophysiological data science}}, + author = {R\"ubel, Oliver and Tritt, Andrew and Ly, Ryan and Dichter, Benjamin K. and + Ghosh, Satrajit and Niu, Lawrence and Baker, Pamela and Soltesz, Ivan and + Ng, Lydia and Svoboda, Karel and Frank, Loren and Bouchard, Kristofer E.}, + editor = {Colgin, Laura L and Jadhav, Shantanu P}, + volume = {11}, + year = {2022}, + month = {oct}, + pub_date = {2022-10-04}, + pages = {e78362}, + citation = {eLife 2022;11:e78362}, + doi = {10.7554/eLife.78362}, + url = {https://doi.org/10.7554/eLife.78362}, + keywords = {Neurophysiology, data ecosystem, data language, data standard, FAIR data, archive}, + journal = {eLife}, + issn = {2050-084X}, + publisher = {eLife Sciences Publications, Ltd}, + } diff --git a/docs/source/pages/tutorials/basicUsage.rst b/docs/source/pages/tutorials/basicUsage.rst new file mode 100644 index 00000000..5e1c8f10 --- /dev/null +++ b/docs/source/pages/tutorials/basicUsage.rst @@ -0,0 +1,6 @@ +basicUsage +=============================== + +.. raw:: html + + diff --git a/docs/source/pages/tutorials/behavior.rst b/docs/source/pages/tutorials/behavior.rst new file mode 100644 index 00000000..c15fe726 --- /dev/null +++ b/docs/source/pages/tutorials/behavior.rst @@ -0,0 +1,6 @@ +behavior +=============================== + +.. raw:: html + + diff --git a/docs/source/pages/tutorials/convertTrials.rst b/docs/source/pages/tutorials/convertTrials.rst new file mode 100644 index 00000000..313e15ca --- /dev/null +++ b/docs/source/pages/tutorials/convertTrials.rst @@ -0,0 +1,6 @@ +convertTrials +=============================== + +.. 
raw:: html
+
+
diff --git a/docs/source/pages/tutorials/dataPipe.rst b/docs/source/pages/tutorials/dataPipe.rst
new file mode 100644
index 00000000..8f7436a3
--- /dev/null
+++ b/docs/source/pages/tutorials/dataPipe.rst
@@ -0,0 +1,6 @@
+dataPipe
+===============================
+
+.. raw:: html
+
+
diff --git a/docs/source/pages/tutorials/dimensionMapNoDataPipes.rst b/docs/source/pages/tutorials/dimensionMapNoDataPipes.rst
new file mode 100644
index 00000000..8fb67911
--- /dev/null
+++ b/docs/source/pages/tutorials/dimensionMapNoDataPipes.rst
@@ -0,0 +1,6 @@
+dimensionMapNoDataPipes
+===============================
+
+.. raw:: html
+
+
diff --git a/docs/source/pages/tutorials/dimensionMapWithDataPipes.rst b/docs/source/pages/tutorials/dimensionMapWithDataPipes.rst
new file mode 100644
index 00000000..27f0e4f6
--- /dev/null
+++ b/docs/source/pages/tutorials/dimensionMapWithDataPipes.rst
@@ -0,0 +1,6 @@
+dimensionMapWithDataPipes
+===============================
+
+.. raw:: html
+
+
diff --git a/docs/source/pages/tutorials/dynamic_tables.rst b/docs/source/pages/tutorials/dynamic_tables.rst
new file mode 100644
index 00000000..a74dbe21
--- /dev/null
+++ b/docs/source/pages/tutorials/dynamic_tables.rst
@@ -0,0 +1,6 @@
+dynamic_tables
+===============================
+
+.. raw:: html
+
+
diff --git a/docs/source/pages/tutorials/dynamically_loaded_filters.rst b/docs/source/pages/tutorials/dynamically_loaded_filters.rst
new file mode 100644
index 00000000..e7600fea
--- /dev/null
+++ b/docs/source/pages/tutorials/dynamically_loaded_filters.rst
@@ -0,0 +1,6 @@
+dynamically_loaded_filters
+===============================
+
+.. raw:: html
+
+
diff --git a/docs/source/pages/tutorials/ecephys.rst b/docs/source/pages/tutorials/ecephys.rst
new file mode 100644
index 00000000..4e476bc2
--- /dev/null
+++ b/docs/source/pages/tutorials/ecephys.rst
@@ -0,0 +1,6 @@
+ecephys
+===============================
+
+.. raw:: html
+
+
diff --git a/docs/source/pages/tutorials/icephys.rst b/docs/source/pages/tutorials/icephys.rst
new file mode 100644
index 00000000..a1f3eb78
--- /dev/null
+++ b/docs/source/pages/tutorials/icephys.rst
@@ -0,0 +1,6 @@
+icephys
+===============================
+
+.. raw:: html
+
+
diff --git a/docs/source/pages/tutorials/images.rst b/docs/source/pages/tutorials/images.rst
new file mode 100644
index 00000000..bc088579
--- /dev/null
+++ b/docs/source/pages/tutorials/images.rst
@@ -0,0 +1,6 @@
+images
+===============================
+
+.. raw:: html
+
+
diff --git a/docs/source/pages/tutorials/index.rst b/docs/source/pages/tutorials/index.rst
new file mode 100644
index 00000000..1b349ecf
--- /dev/null
+++ b/docs/source/pages/tutorials/index.rst
@@ -0,0 +1,24 @@
+Tutorials
+=========
+
+.. toctree::
+   :maxdepth: 1
+   :caption: Tutorials
+
+   basicUsage
+   behavior
+   convertTrials
+   dataPipe
+   dimensionMapNoDataPipes
+   dimensionMapWithDataPipes
+   dynamic_tables
+   dynamically_loaded_filters
+   ecephys
+   icephys
+   images
+   intro
+   ogen
+   ophys
+   read_demo
+   remote_read
+   scratch
\ No newline at end of file
diff --git a/docs/source/pages/tutorials/intro.rst b/docs/source/pages/tutorials/intro.rst
new file mode 100644
index 00000000..24aadfa2
--- /dev/null
+++ b/docs/source/pages/tutorials/intro.rst
@@ -0,0 +1,6 @@
+intro
+===============================
+
+.. raw:: html
+
+
diff --git a/docs/source/pages/tutorials/ogen.rst b/docs/source/pages/tutorials/ogen.rst
new file mode 100644
index 00000000..ede496c7
--- /dev/null
+++ b/docs/source/pages/tutorials/ogen.rst
@@ -0,0 +1,6 @@
+ogen
+===============================
+
+.. raw:: html
+
+
diff --git a/docs/source/pages/tutorials/ophys.rst b/docs/source/pages/tutorials/ophys.rst
new file mode 100644
index 00000000..511360ae
--- /dev/null
+++ b/docs/source/pages/tutorials/ophys.rst
@@ -0,0 +1,6 @@
+ophys
+===============================
+
+.. raw:: html
+
+
diff --git a/docs/source/pages/tutorials/read_demo.rst b/docs/source/pages/tutorials/read_demo.rst
new file mode 100644
index 00000000..7cb5392c
--- /dev/null
+++ b/docs/source/pages/tutorials/read_demo.rst
@@ -0,0 +1,6 @@
+read_demo
+===============================
+
+.. raw:: html
+
+
diff --git a/docs/source/pages/tutorials/remote_read.rst b/docs/source/pages/tutorials/remote_read.rst
new file mode 100644
index 00000000..5b920117
--- /dev/null
+++ b/docs/source/pages/tutorials/remote_read.rst
@@ -0,0 +1,6 @@
+remote_read
+===============================
+
+.. raw:: html
+
+
diff --git a/docs/source/pages/tutorials/scratch.rst b/docs/source/pages/tutorials/scratch.rst
new file mode 100644
index 00000000..bcd17732
--- /dev/null
+++ b/docs/source/pages/tutorials/scratch.rst
@@ -0,0 +1,6 @@
+scratch
+===============================
+
+.. raw:: html
+
+
diff --git a/docs/source/sphinx_extensions/__pycache__/docstring_processors.cpython-311.pyc b/docs/source/sphinx_extensions/__pycache__/docstring_processors.cpython-311.pyc
new file mode 100644
index 00000000..2fc5ab6f
Binary files /dev/null and b/docs/source/sphinx_extensions/__pycache__/docstring_processors.cpython-311.pyc differ
diff --git a/docs/source/sphinx_extensions/docstring_processors.py b/docs/source/sphinx_extensions/docstring_processors.py
new file mode 100644
index 00000000..2b3c9d51
--- /dev/null
+++ b/docs/source/sphinx_extensions/docstring_processors.py
@@ -0,0 +1,149 @@
+import re
+
+
+def process_matlab_docstring(app, what, name, obj, options, lines):
+    _format_matlab_type_as_code_literal(lines)
+    _make_syntax_examples_code_literals(lines)
+    _format_input_arguments(lines)
+    _split_and_format_example_lines(lines)
+
+
+def _format_matlab_type_as_code_literal(lines):
+    # Full list of MATLAB base types
+    matlab_types = {
+        "double", "single", "int8", "uint8", "int16", "uint16",
+        "int32", "uint32", "int64", "uint64", "logical", "char",
+        "cell", "struct", "table", "categorical", "datetime",
+        "duration", "calendarDuration", "function_handle",
+        "string", "complex"
+    }
+
+    # Regex pattern to match MATLAB types as whole words, optionally wrapped in parentheses
+    type_pattern = re.compile(
+        rf"(?<!\w)(?P<before>\(?)"
+        rf"(?P<type>{'|'.join(re.escape(t) for t in matlab_types)})"
+        rf"(?P<after>\)?)(?!\w)"
+    )
+
+    for i, line in enumerate(lines):
+        # Replace matches with inline code formatting, preserving parentheses
+        lines[i] = type_pattern.sub(
+            lambda match: (
+                f"{match.group('before') or ''}"
+                f"``{match.group('type')}``"
+                f"{match.group('after') or ''}"
+            ),
+            line
+        )
+
+
+def _make_syntax_examples_code_literals(lines):
+    """
+    Process a MATLAB docstring to wrap expressions in the Syntax section with double backticks.
+
+    Args:
+        lines (list of str): The original MATLAB docstring lines.
+    """
+    in_syntax_section = False
+
+    # Regex to match MATLAB expressions
+    matlab_expr_pattern = re.compile(
+        r"^\s*((?:\[[\w,\s]*\]\s*=\s*|[\w]+\s*=\s*)?[A-Za-z]\w*\([^)]*\))"
+    )
+
+    for i, line in enumerate(lines):
+        # Check if the current line starts the Syntax section
+        if line.strip().lower().startswith("syntax:"):
+            in_syntax_section = True
+            continue
+
+        # Check if the current line is another section header
+        if in_syntax_section and _is_section_header(line) and not line.strip().lower().startswith("syntax:"):
+            in_syntax_section = False
+
+        if in_syntax_section:
+            # Wrap MATLAB expressions in double backticks
+            match = matlab_expr_pattern.search(line)
+            if match:
+                # Use group 1: group 0 also contains the leading whitespace
+                line = matlab_expr_pattern.sub(lambda m: f"``{m.group(1)}``", line)
+                # Prepend a leading space to keep the indentation rST expects
+                lines[i] = " " + line
+
+
+def _format_input_arguments(lines):
+    """
+    Format the 'Input Arguments' section to add double ** around item names
+    and `` around types in parentheses.
+
+    Args:
+        lines (list of str): List of lines in the Input Arguments section.
+
+    Returns:
+        list of str: Formatted lines.
+    """
+    # Regex pattern for list item names with optional types in parentheses
+    input_arg_pattern = re.compile(
+        r"(?P<indent>^\s*)-\s*(?P<name>\w+)"  # Match the name of the argument
+        r"(?:\s*\((?P<type>.*?)\))?"  # Optionally match the type in parentheses
+    )
+
+    for i, line in enumerate(lines):
+        # Apply formatting to each matching line
+        lines[i] = input_arg_pattern.sub(
+            lambda match: (
+                f"{match.group('indent')}- **{match.group('name').strip()}**"  # Name
+                + ((  # Optional type
+                    f" ({match.group('type').strip()})"  # Preserve existing formatting
+                    if match.group('type').strip().startswith("``")  # Already backtick-formatted
+                    or match.group('type').strip().startswith(":")  # Sphinx role
+                    else f" (``{match.group('type').strip()}``)"  # Add backticks if unformatted
+                ) if match.group('type') else "")  # No type provided; keep the formatted name
+            ),
+            line
+        )
+
+    return lines
+
+
+def _split_and_format_example_lines(lines):
+    """
+    Split and format example lines like:
+        'Example 1 - Export an NWB file::'
+    into two lines:
+        '**Example 1 -**'
+        '**Export an NWB file**::'
+
+    Modifies the `lines` list in place.
+
+    Args:
+        lines (list of str): List of lines in the Usage section.
+    """
+    # Regex pattern to match example lines with descriptions
+    example_pattern = re.compile(
+        r"^\s*(Example\s+\d+)\s*-\s*(.*)::\s*$"  # Matches 'Example X - Description::'
+    )
+
+    i = 0
+    while i < len(lines):
+        # Check if the current line matches the "Example X - Description::" format
+        match = example_pattern.match(lines[i])
+        if match:
+            example, description = match.groups()
+            # Replace the original line with two formatted lines
+            lines[i] = f" **{example} -**"  # NB: one leading space for proper rST indent
+            lines.insert(i + 1, f" **{description}**::")  # NB: one leading space for proper rST indent
+            i += 2  # Skip over the newly added line
+        else:
+            i += 1  # Move to the next line if no match
+
+
+def _is_section_header(line):
+    # Regex to identify section headers
+    section_header_pattern = re.compile(r"^\s*%?\s*[A-Za-z ]+:")
+
+    return section_header_pattern.match(line)