storage Package

datastore Module

This module implements the data storage functionality.

class mozaik.storage.datastore.DataStoreView(parameters, full_datastore, replace=False)

Bases: mozaik.core.ParametrizedObject

This class represents a subset of a DataStore and defines the query interface and the structure in which the data are stored in the memory of any datastore. The main role of this class is to allow the creation of subsets of the data stored in the DataStore, so that other parts of Mozaik can be restricted to work only over these subsets. This is done by means of queries (see mozaik.storage.queries), which produce DataStoreView objects and can be chained to work like filters.
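
For illustration, a minimal sketch of chaining queries (the datastore object, the sheet name 'V1_Exc', and the stimulus name are hypothetical):

>>> from mozaik.storage.queries import param_filter_query
>>> dsv = param_filter_query(datastore, sheet_name='V1_Exc')                        # restrict to one sheet
>>> dsv = param_filter_query(dsv, st_name='FullfieldDriftingSinusoidalGrating')     # further restrict by stimulus name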

Note that the actual datastores inherit from this object and define how the data are actually stored on other media (e.g. an hdf5 file or a simple pickle file).

A data store should aggregate all data and analysis results collected across the experiments performed on a given model.

Experiment results storage:

These are stored as a simple list in the self.block.segments variable. Each segment corresponds to the recordings from a single sheet in response to a single stimulus.

Analysis results storage:

A list containing the mozaik.analysis.data_structures.AnalysisDataStructure objects.

The analysis results are addressed by the AnalysisDataStructure identifier. A call with this specification returns the set of AnalysisDataStructures that correspond to the above addressing. If more specific 'addressing' is required, it has to be done by the visualization or analysis code that asked for the AnalysisDataStructures, based on its knowledge of their content. Alternatively, specific query filters can be written that understand the specific type of AnalysisDataStructure and can filter them based on their internal data. For more details on addressing experimental results or analysis data structures please refer to the queries or mozaik.tools.mozaik_parametrized modules.

DataStoreView also keeps a reference to the full DataStore object from which it was originally created (possibly via a chain of DSVs). This is in order to allow operations that work over a DSV to insert their results into the original full datastore, as this is (almost?) always the desired behaviour (note that a DSV does not itself have functions to add new recordings or analysis results).

By default, the datastore will refuse to add a new AnalysisDataStructure if the new ADS has the same values of all its parameters as some other ADS already present in the datastore. This ensures that each ADS stored in the datastore is uniquely identifiable based on its parameters. If the datastore is created (loaded) with the replace flag set to True, then in the case of such a conflict the datastore will replace the ADS already in the datastore with the new one.

get_segments()

Returns list of all recordings (as neo segments) stored in the datastore.

sheets()

Returns the list of all sheets that are present in at least one of the segments in the given DataStoreView.

get_neuron_postions()

Returns the positions of all neurons in the model within their respective sheets. A dictionary is returned whose keys are sheet names and whose values are 2D ndarrays of shape (2, number of neurons), holding the x and y positions of all neurons in the two rows.

Use get_sheet_indexes() to link the indexes in the returned arrays to neuron ids.
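
A short sketch linking positions to ids (the datastore, the sheet name 'V1_Exc', and the list of neuron ids named ids are hypothetical):

>>> positions = datastore.get_neuron_postions()['V1_Exc']   # 2 x N array of x and y coordinates
>>> idxs = datastore.get_sheet_indexes('V1_Exc', ids)       # indexes into the positions array
>>> x, y = positions[0, idxs], positions[1, idxs]           # coordinates of the selected neurons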

get_sheet_indexes(sheet_name, neuron_id)

Returns the indexes of neurons in the given sheet corresponding to the given ids (this is primarily intended for use with annotation data such as positions).

get_sheet_ids(sheet_name, indexes=None)

Returns the ids of neurons in the given sheet corresponding to the given indexes (this is primarily intended for use with annotation data such as positions).

get_neuron_annotations()

Returns neuron annotations.

get_stimuli()

Returns a list of stimuli (as strings). The order of the stimuli corresponds to the order of segments returned by the get_segments() call.
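
Since the two lists are aligned, recordings can be paired with their stimuli directly; a minimal sketch (the datastore object is hypothetical):

>>> for stim, seg in zip(datastore.get_stimuli(), datastore.get_segments()):
...     print(stim, len(seg.spiketrains))   # stimulus string and the number of recorded spike trains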

get_analysis_result(**kwargs)

Returns a list of ADSs that match the parameter values specified in kwargs.

Examples

>>> datastore.get_analysis_result(identifier=['PerNeuronValue','SingleValue'],sheet_name=sheet,value_name='orientation preference')

This command should return all ADSs whose identifier is PerNeuronValue or SingleValue, that are associated with the sheet given by the sheet variable, and whose value name is 'orientation preference'.

get_sensory_stimulus(stimuli=None)

Returns the raw sensory stimuli that were presented to the model for the stimuli specified in the stimuli argument. If stimuli == None, all sensory stimuli are returned.

sensory_stimulus_copy()

Utility function that makes a shallow copy of the dictionary holding sensory stimuli.

analysis_result_copy()

Utility function that makes a shallow copy of the list holding analysis data structures.

recordings_copy()

Utility function that makes a shallow copy of the list holding recordings.

fromDataStoreView()

Returns an empty DSV that is linked to the same DataStore as this DSV.

print_content(full_recordings=False, full_ADS=False)

Prints the content of the data store (specifically the list of recordings and ADSs in the DSV).

Parameters :

full_recordings : bool (optional)

If True each contained recording will be printed. Otherwise only the overview of the recordings based on stimulus type will be shown.

full_ADS : bool (optional)

If True, each contained ADS will be printed (for each, the set of its mozaik parameters together with their values will be shown). Otherwise only an overview of the ADSs based on their identifier will be shown.

class mozaik.storage.datastore.DataStore(load, parameters, **params)

Bases: mozaik.storage.datastore.DataStoreView

Abstract DataStore class that declares the mozaik data store interface.

The role of the mozaik data store is to store the recordings from the simulation (generally this means the spike trains and the various analog signals such as conductances or membrane potential), the analysis results, and the various metadata that are generated during the model setup and its subsequent simulation.

The recordings are sent to the DataStore in the neo format and are expected to be returned in the neo format as well.

mozaik generates one neo segment per model sheet (see mozaik.sheets.Sheet) for each presented stimulus, which is stored in the DataStore.

Parameters :

load : bool

If False, a new datastore will be created in the directory given by the root_directory parameter. If True, the datastore will be loaded from that directory.

parameters : ParameterSet

The required parameter set.

required_parameters = {'root_directory': <type 'str'>}
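
For illustration, a minimal sketch of creating (or loading) a concrete datastore backend; the root directory path is hypothetical:

>>> from parameters import ParameterSet
>>> from mozaik.storage.datastore import PickledDataStore
>>> data_store = PickledDataStore(load=False, parameters=ParameterSet({'root_directory': '/tmp/my_model_results'}))
>>> data_store.save()   # write the (still empty) datastore to disk
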
set_neuron_positions(neuron_positions)
set_neuron_annotations(neuron_annotations)
set_neuron_ids(neuron_ids)
identify_unpresented_stimuli(stimuli)

This method filters out from a list of stimuli all those which have already been presented.

load()

The DataStore interface function to be implemented by a given backend. It should load the datastore.

save()

The DataStore interface function to be implemented by a given backend. It should store the datastore.

add_recording(segment, stimulus)

The DataStore interface function to be implemented by a given backend. It should add a recording into the datastore.

add_analysis_result(result)

The DataStore interface function to be implemented by a given backend. It should add an ADS to the datastore.

add_stimulus(data, stimulus)

The DataStore interface function to be implemented by a given backend. It should add a stimulus into the datastore.

class mozaik.storage.datastore.Hdf5DataStore(load, parameters, **params)

Bases: mozaik.storage.datastore.DataStore

A DataStore that saves all its data in an hdf5 file and an associated analysis results file, which is simply the pickled self.analysis_results dictionary.

load()
save()
add_recording(segments, stimulus)
add_stimulus(data, stimulus)
add_analysis_result(result)
class mozaik.storage.datastore.PickledDataStore(load, parameters, **params)

Bases: mozaik.storage.datastore.Hdf5DataStore

A DataStore that saves all its data as simple pickled files.

load()
save()
add_recording(segments, stimulus)

queries Module

This module contains the query manipulation system that is used to filter information stored in the data store (DataStore).

The basic principle is that each query takes an existing DataStore (or DataStoreView) as input and returns a new DataStoreView that is a subset of the input DSV.

class mozaik.storage.queries.Query(parameters)

Bases: mozaik.core.ParametrizedObject

A Query accepts a DataStoreView and returns a DataStoreView (or a set of DSVs) with a potentially reduced set of recorded data or analysis results.

We recommend writing queries so that they can be invoked both via the ParameterSet mechanism using a class, and directly as a function (potentially with default parameter values).

See ParamFilterQuery for a simple example.
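
A sketch of the two invocation styles, using ParamFilterQuery as the example (the datastore object and the sheet name are hypothetical):

>>> from parameters import ParameterSet
>>> from mozaik.storage.queries import ParamFilterQuery, param_filter_query
>>> # as a parametrized class, configured via a ParameterSet
>>> q = ParamFilterQuery(ParameterSet({'params': ParameterSet({'sheet_name': 'V1_Exc'}), 'ads_unique': False, 'rec_unique': False}))
>>> dsv = q.query(datastore)
>>> # or directly as a function
>>> dsv = param_filter_query(datastore, sheet_name='V1_Exc')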

required_parameters = {}
query(dsv)

Abstract function to be implemented by each query.

This is the function that executes the query. It receives a DSV as input and has to return a DSV (or set of DSVs).

mozaik.storage.queries.param_filter_query(dsv, ads_unique=False, rec_unique=False, **kwargs)

Returns a DSV containing only those recordings and ADSs whose mozaik parameter values match the parameter value combinations provided in kwargs.

To restrict on mozaik parameters of the stimuli associated with the ADSs or recordings, prepend 'st_' to the parameter name.

For recordings, the parameter sheet_name refers to the sheet for which the recording was done.

Parameters :

dsv : DataStoreView

The input DSV.

ads_unique : bool, optional

If True the query will raise an exception if the query does not identify a unique ADS.

rec_unique : bool, optional

If True the query will raise an exception if the query does not identify a unique recording.

**kwargs : dict

The remaining keyword arguments will be interpreted as mozaik parameter names and the associated values that all ADSs or recordings have to match. The value of each parameter should be either the value to match directly, or a list of values, in which case a recording or ADS matches if its parameter value equals any of the listed values (thus effectively there is an AND operation between the different parameters and an OR operation between the values specified for a given mozaik parameter).

Examples

>>> datastore.param_filter_query(datastore,identifier=['PerNeuronValue','SingleValue'],sheet_name=sheet,value_name='orientation preference')

This command should return a DSV containing all recordings and ADSs whose identifier is PerNeuronValue or SingleValue, that are associated with the sheet given by the sheet variable, and whose value name is 'orientation preference'. Note that since recordings do not have these parameters, this query would return a DSV containing only ADSs.

>>> datastore.param_filter_query(datastore,st_orientation=0.5)

This command should return DSV containing all recordings and ADSs that are associated with stimuli whose mozaik parameter orientation has value 0.5.

class mozaik.storage.queries.ParamFilterQuery(parameters)

Bases: mozaik.storage.queries.Query

See param_filter_query() for description.

Other Parameters:
 

params : ParameterSet

The set of mozaik parameters and their associated values to which to restrict the DSV (see **kwargs of param_filter_query()).

ads_unique : bool, optional

If True the query will raise an exception if the query does not identify a unique ADS.

rec_unique : bool, optional

If True the query will raise an exception if the query does not identify a unique recording.

required_parameters = {'rec_unique': <type 'bool'>, 'params': <class 'parameters.ParameterSet'>, 'ads_unique': <type 'bool'>}
query(dsv)
mozaik.storage.queries.tag_based_query(dsv, tags)

This query returns a DSV containing only the AnalysisDataStructures that carry the given tags.

Parameters :

tags : list(str)

The list of tags that each ADS has to contain.

class mozaik.storage.queries.TagBasedQuery(parameters)

Bases: mozaik.storage.queries.Query

See tag_based_query().

Parameters :

tags : list(str)

The list of tags that each ADS has to contain.

required_parameters = {'tags': <type 'list'>}
query(dsv)
mozaik.storage.queries.partition_by_stimulus_paramter_query(dsv, parameter_list)

This query will take all recordings and return a list of DataStoreViews, each holding recordings measured in response to the same stimulus, except for the parameters referenced in parameter_list (which may vary within each DSV).

Note that in most cases one wants to apply this only to a datastore holding a single stimulus type! In that case the datastore is partitioned into subsets, each holding recordings to stimuli with the same parameter values, with the exception of the parameters in parameter_list.

Parameters :

dsv : DataStoreView

The input DSV.

parameter_list : list(string)

The list of parameters of the associated stimuli that will vary in the returned DSVs, all other stimulus parameters will have the same value within each of the returned DSVs.
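
For example, to obtain one DSV per unique stimulus condition while collapsing across trials (the dsv object and the 'trial' stimulus parameter are assumptions here):

>>> from mozaik.storage.queries import partition_by_stimulus_paramter_query
>>> dsvs = partition_by_stimulus_paramter_query(dsv, ['trial'])
>>> len(dsvs)   # number of distinct stimulus conditions, each DSV holding all trials of one condition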

class mozaik.storage.queries.PartitionByStimulusParamterQuery(parameters)

Bases: mozaik.storage.queries.Query

See partition_by_stimulus_paramter_query().

Other Parameters:
 

parameter_list : list(string)

The list of parameters that will vary in the returned DSVs, all other parameters will have the same value within each of the returned DSVs.

required_parameters = {'parameter_list': <type 'list'>}
query(dsv)
mozaik.storage.queries.partition_analysis_results_by_parameters_query(dsv, parameter_list=None, excpt=False)

This query will take all analysis results and return a list of DataStoreViews, each holding analysis results that have the same values of the parameters in parameter_list.

Note that in most cases one wants to apply this only to a datastore holding a single type of analysis result! In that case the datastore is partitioned into subsets, each holding analysis results with the same parameter values, with the exception of the parameters in parameter_list.

Parameters :

dsv : DataStoreView

The input DSV.

parameter_list : list(string)

The list of parameters that will vary in the returned DSVs, all other parameters will have the same value within each of the returned DSVs.

excpt : bool

If excpt is True the query is allowed only on DSVs holding the same AnalysisDataStructures.

class mozaik.storage.queries.PartitionAnalysisResultsByParameterNameQuery(parameters)

Bases: mozaik.storage.queries.Query

See partition_analysis_results_by_parameters_query().

Other Parameters:
 

parameter_list : list(string)

The list of parameters that will vary in the returned DSVs, all other parameters will have the same value within each of the returned DSVs.

excpt : bool

If excpt is True the query is allowed only on DSVs holding the same AnalysisDataStructures.

required_parameters = {'parameter_list': <type 'list'>, 'excpt': <type 'bool'>}
query(dsv)
mozaik.storage.queries.partition_analysis_results_by_stimulus_parameters_query(dsv, parameter_list=None, excpt=False)

This query will take all analysis results and return a list of DataStoreViews, each holding analysis results that have the same values of the stimulus parameters in parameter_list.

Note that in most cases one wants to apply this only to a datastore holding analysis results measured in response to the same stimulus type! In that case the datastore is partitioned into subsets, each holding analysis results to stimuli with the same parameter values, with the exception of the parameters in parameter_list.

Parameters :

dsv : DataStoreView

The input DSV.

parameter_list : list(string)

The list of stimulus parameters that will vary between the ADSs in the returned DSVs; all other parameters will have the same value within each of the returned DSVs.

excpt : bool

If excpt is True the query is allowed only on DSVs holding the same AnalysisDataStructures.

class mozaik.storage.queries.PartitionAnalysisResultsByStimulusParameterNameQuery(parameters)

Bases: mozaik.storage.queries.Query

See partition_analysis_results_by_stimulus_parameters_query().

Other Parameters:
 

parameter_list : list(string)

The list of parameters that will vary in the returned DSVs, all other parameters will have the same value within each of the returned DSVs.

excpt : bool

If excpt is True the query is allowed only on DSVs holding the same AnalysisDataStructures.

required_parameters = {'parameter_list': <type 'list'>, 'excpt': <type 'bool'>}
query(dsv)
mozaik.storage.queries.equal_stimulus_type(dsv)

This function returns True if the DSV contains only recordings associated with the same stimulus type, and False otherwise.

mozaik.storage.queries.equal_ads_except(dsv, except_params)

This function returns True if the DSV contains only ADSs of the same kind and parametrization, with the exception of the parameters listed in except_params, and False otherwise.

mozaik.storage.queries.equal_ads_type(dsv)

Returns True if the DSV contains only ADSs of the same type, and False otherwise.

mozaik.storage.queries.ads_with_equal_stimulus_type(dsv, allow_None=False)

This function tests whether the DSV contains only ADSs associated with the same stimulus type.

Parameters :

allow_None : bool

If False, ADSs that are not associated with any stimulus are not allowed.
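
These predicates are typically used as guards before operations that assume a homogeneous DSV; a minimal sketch (the dsv object is hypothetical):

>>> from mozaik.storage.queries import equal_stimulus_type, equal_ads_except
>>> assert equal_stimulus_type(dsv), 'DSV mixes several stimulus types'
>>> assert equal_ads_except(dsv, ['sheet_name']), 'ADSs differ in more than sheet_name'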

neo_neurotools_wrapper Module

This module contains a wrapper for the neo Segment that adds extra functionality to the class. Within mozaik, data are stored and passed around in this format.

Most of the included functionality should in the future be provided directly by neo. When that happens, most of this code will become irrelevant and the rest should be merged into the datastore module.

class mozaik.storage.neo_neurotools_wrapper.MozaikSegment(segment, identifier)

Bases: neo.core.segment.Segment

This class extends Neo segment with several convenience functions.

The most important function is that it allows lazy loading of the data.

It should be moved to datastore.py once the NeoNeurotoolsWrapper is obsolete and this file should be discarded.

get_spiketrains()

Returns the list of SpikeTrain objects stored in this segment.

set_spiketrains(s)
spiketrains

Returns the list of SpikeTrain objects stored in this segment.

get_spiketrain(neuron_id)

Returns a spike train or a list of spike trains corresponding to the id(s) listed in the neuron_id argument.

Parameters :

neuron_id : int or list(int)

An int or a list of ints containing the ids for which to return the spiketrains.

Returns :

A SpikeTrain object if neuron_id is an int, or a list of SpikeTrain objects if neuron_id is a list; the order corresponds to the order of the neuron_id argument.
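
A short usage sketch (the datastore object is hypothetical):

>>> seg = datastore.get_segments()[0]
>>> ids = seg.get_stored_spike_train_ids()
>>> trains = seg.get_spiketrain(list(ids)[:5])   # SpikeTrain objects for the first five stored ids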

get_vm(neuron_id)

Returns the recorded membrane potential corresponding to neurons with id(s) listed in the neuron_id argument.

Parameters :

neuron_id : int or list(int)

An int or a list of ints containing the ids for which to return the AnalogSignal objects.

Returns :

An AnalogSignal object if neuron_id is an int, or a list of AnalogSignal objects if neuron_id is a list; the order corresponds to the order of the neuron_id argument.

get_esyn(neuron_id)

Returns the recorded excitatory conductance corresponding to neurons with id(s) listed in the neuron_id argument.

Parameters :

neuron_id : int or list(int)

An int or a list of ints containing the ids for which to return the AnalogSignal objects.

Returns :

An AnalogSignal object if neuron_id is an int, or a list of AnalogSignal objects if neuron_id is a list; the order corresponds to the order of the neuron_id argument.

get_isyn(neuron_id)

Returns the recorded inhibitory conductance corresponding to neurons with id(s) listed in the neuron_id argument.

Parameters :

neuron_id : int or list(int)

An int or a list of ints containing the ids for which to return the AnalogSignal objects.

Returns :

An AnalogSignal object if neuron_id is an int, or a list of AnalogSignal objects if neuron_id is a list; the order corresponds to the order of the neuron_id argument.

load_full()
neuron_num()

Returns the number of neurons stored in this Segment.

get_stored_isyn_ids()

Returns ids of neurons for which inhibitory conductance is stored in this segment.

get_stored_esyn_ids()

Returns ids of neurons for which excitatory conductance is stored in this segment.

get_stored_vm_ids()

Returns ids of neurons for which membrane potential is stored in this segment.

get_stored_spike_train_ids()

Returns ids of neurons for which spikes are stored in this segment.
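
These listing functions pair naturally with the corresponding getters; a sketch continuing the earlier seg example (whether a given signal was recorded for a given neuron is an assumption):

>>> vm_ids = seg.get_stored_vm_ids()
>>> vms = seg.get_vm(list(vm_ids))         # one AnalogSignal per neuron with recorded membrane potential
>>> esyn = seg.get_esyn(list(vm_ids)[0])   # excitatory conductance of the first such neuron, if it was recorded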

mean_rates()

Returns the mean rates of the spiketrains in spikes/s.

isi()

Returns an array containing arrays (one per neuron) with the inter-spike intervals of the SpikeTrain objects.

cv_isi()

Returns an array with the coefficient of variation of the ISIs, one per neuron.

cv_isi is the ratio between the standard deviation and the mean of the ISIs. The irregularity of individual spike trains is measured by the squared coefficient of variation of the corresponding inter-spike interval (ISI) distribution. In point processes, low values reflect more regular spiking; a clock-like pattern yields CV^2 = 0, while CV^2 = 1 indicates Poisson-type behaviour. As a measure of irregularity in the network, one can use the average irregularity across all neurons.

http://en.wikipedia.org/wiki/Coefficient_of_variation
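
As a standalone illustration of the quantity involved (plain numpy, not the mozaik implementation):

>>> import numpy
>>> spike_times = numpy.array([10.0, 30.0, 45.0, 80.0])   # spike times in ms
>>> isi = numpy.diff(spike_times)                          # inter-spike intervals: [20., 15., 35.]
>>> cv = isi.std() / isi.mean()                            # coefficient of variation; its square is the CV^2 above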

class mozaik.storage.neo_neurotools_wrapper.PickledDataStoreNeoWrapper(segment, identifier, datastore_path)

Bases: mozaik.storage.neo_neurotools_wrapper.MozaikSegment

This is a mozaik wrapper of the neo Segment that enables pickling and lazy loading.

load_full()
release()
