Neo IO

Preamble

The Neo io module aims to provide an exhaustive way of loading and saving several widely used data formats in electrophysiology. The more these heterogeneous formats are supported, the easier it becomes to manipulate them as Neo objects in a uniform way. To this end, the IO set of classes proposes a simple and flexible IO API that fits many format specifications. It is not only file-oriented; it can also read/write objects from a database.

At the moment, there are 3 families of IO modules:
  1. for reading closed manufacturers’ formats (Spike2, Plexon, AlphaOmega, BlackRock, Axon, ...)
  2. for reading(/writing) formats from open source tools (KlustaKwik, Elan, WinEdr, WinWcp, PyNN, ...)
  3. for reading/writing the Neo structure in neutral formats (HDF5, .mat, ...), i.e. generic containers with the Neo structure inside (NeoHDF5, NeoMatlab, ...)

A typical use case combines family 1 for reading and family 3 for writing: converting your datasets to a more standard format when you want to share or collaborate.

Introduction

There is an intrinsic structure in the different Neo objects: they form a hierarchy with cross-links (see Neo core). The highest-level object is the Block, a container able to encapsulate all the others.

A Block therefore holds a list of Segment objects. A Segment embeds several objects, such as SpikeTrain, AnalogSignal, AnalogSignalArray, EpochArray and EventArray (basically, all the different Neo objects), and the same hierarchical organisation applies within it. Depending on the file format, i.e. whether it is streamable or not, the whole Block may need to be loaded at once, or particular Segment objects may be accessible individually.

Depending on the file format, these objects can sometimes be loaded separately, without the need to load the whole file. Where possible, an IO therefore provides distinct methods for loading only particular objects present in the file. The basic idea of each IO is to have, as far as possible, read/write methods for the individual encapsulated objects, and otherwise to provide a read/write method that returns the object at the highest level of the hierarchy (by default, a Block or a Segment).

The neo.io API is a balance between full flexibility for the user (all read_XXX() methods are enabled) and simple, clean and understandable code for the developer (few read_XXX() methods are enabled). This means that not all IOs offer the full flexibility for partial reading of data files.
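To illustrate this contract, the sketch below uses a hypothetical MockFormatIO class (a stand-in, not actual neo code) to show how the readable_objects attribute maps onto the read_XXX() naming convention:

```python
# "MockFormatIO" is a hypothetical stand-in, not a real neo IO class.
# Real IO classes expose the same readable_objects attribute and follow
# the same read_<object>() naming convention.
class MockFormatIO:
    readable_objects = ["Segment"]                      # read_segment() exists
    supported_objects = ["Segment", "AnalogSignal", "SpikeTrain"]

def readable_methods(io_class):
    """List the read_XXX() method names implied by readable_objects."""
    return ["read_" + name.lower() for name in io_class.readable_objects]

print(readable_methods(MockFormatIO))  # ['read_segment']
```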

One format = one class

The basic syntax is as follows. If you want to load a file format that is implemented in a generic MyFormatIO class:

>>> from neo.io import MyFormatIO
>>> reader = MyFormatIO(filename="myfile.dat")

You can replace MyFormatIO with any implemented class; see the List of implemented formats below.

Modes

IO can be based on a single file, a directory containing files, or a database. This is described in the mode attribute of the IO class.

>>> from neo.io import MyFormatIO
>>> MyFormatIO.mode
'file'

For file mode the filename keyword argument is necessary. For directory mode the dirname keyword argument is necessary.

Examples:
>>> reader = io.PlexonIO(filename='File_plexon_1.plx')
>>> reader = io.TdtIO(dirname='aep_05')

Supported objects/readable objects

To know what types of object are supported by a given IO interface:

>>> MyFormatIO.supported_objects
[Segment, AnalogSignal, SpikeTrain, Event, Spike]

A supported object is not necessarily directly readable. For instance, many formats support AnalogSignal but do not allow it to be loaded directly; to access the AnalogSignal objects, you must read a Segment:

>>> seg = reader.read_segment()
>>> print(seg.analogsignals)
>>> print(seg.analogsignals[0])

To get a list of directly readable objects:

>>> MyFormatIO.readable_objects
[Segment]

The first element of the previous list is the highest level for reading the file. This means that the IO has a read_segment() method:

>>> seg = reader.read_segment()
>>> type(seg)
neo.core.Segment

All IOs have a read() method that returns a list of Block objects (representing the whole content of the file):

>>> bl = reader.read()
>>> type(bl[0].segments[0])
neo.core.Segment

Lazy and cascade options

In some cases you may not want to load everything in memory because it could be too big. For this scenario, two options are available:

  • lazy=True/False. With lazy=True, all arrays will have a size of zero, but all the metadata will be loaded. A lazy_shape attribute is added to all objects that inherit from Quantities or numpy.ndarray (AnalogSignal, AnalogSignalArray, SpikeTrain) and to objects that have array-like attributes (EpochArray, EventArray). In these cases, lazy_shape is a tuple identical to the shape the data would have with lazy=False.
  • cascade=True/False. With cascade=False, only a single object is read (its one_to_many and many_to_many relationships are not followed).

By default (if they are not specified), lazy=False and cascade=True, i.e. all data is loaded.

Example cascade:

>>> seg = reader.read_segment(cascade=True)
>>> print(len(seg.analogsignals))  # this is N
>>> seg = reader.read_segment(cascade=False)
>>> print(len(seg.analogsignals))  # this is zero

Example lazy:

>>> seg = reader.read_segment(lazy=False)
>>> print(seg.analogsignals[0].shape)  # this is N
>>> seg = reader.read_segment(lazy=True)
>>> print(seg.analogsignals[0].shape)  # this is zero, the AnalogSignal is empty
>>> print(seg.analogsignals[0].lazy_shape)  # this is N

Some IOs support advanced forms of lazy loading, cascading or both (these features are currently limited to the HDF5 IO, which supports both forms).

  • For lazy loading, these IOs have a load_lazy_object() method that takes a single parameter: a data object previously loaded by the same IO in lazy mode. It returns the fully loaded object, without links to container objects (Segment etc.). Continuing the lazy example above:

    >>> lazy_sig = seg.analogsignals[0]  # Empty signal
    >>> full_sig = reader.load_lazy_object(lazy_sig)
    >>> print(lazy_sig.lazy_shape, full_sig.shape)  # Identical
    >>> print(lazy_sig.segment)  # Has the link to the object "seg"
    >>> print(full_sig.segment)  # Does not have the link: None
    
  • For lazy cascading, IOs have a load_lazy_cascade() method. This method is not called directly when interacting with the IO, but its presence can be used to check if an IO supports lazy cascading. To use lazy cascading, the cascade parameter is set to 'lazy':

    >>> block = reader.read(cascade='lazy')
    

    You do not have to do anything else; lazy cascading is now active for the object you just loaded. You can interact with the object in the same way as if it had been loaded with cascade=True. However, objects are only loaded at the moment they are actually accessed:

    >>> print(block.channel_indexes[0].name)  # The first ChannelIndex is loaded
    >>> print(block.segments[0].analogsignals[1])  # The first Segment and its second AnalogSignal are loaded
    

    Once an object has been loaded with lazy cascading, it stays in memory:

    >>> print(block.segments[0].analogsignals[0])  # The first Segment is already in memory, its first AnalogSignal is loaded
    

Details of API

The neo.io API is designed to be simple and intuitive:
  • each file format has an IO class (for example for Spike2 files you have a Spike2IO class).
  • each IO class inherits from the BaseIO class.
  • each IO class can read or write directly one or several Neo objects (for example Segment, Block, ...): see the readable_objects and writable_objects attributes of the IO class.
  • each IO class supports part of the neo.core hierarchy, though not necessarily all of it (see supported_objects).
  • each IO class has a read() method that returns a list of Block objects. If the IO only supports Segment reading, the list will contain one block with all segments from the file.
  • each IO class that supports writing has a write() method that takes as a parameter a list of blocks, a single block or a single segment, depending on the IO’s writable_objects.
  • each IO is able to do a lazy load: all metadata (e.g. sampling_rate) are read, but not the actual numerical data. A lazy_shape attribute is added to provide information on the real size.
  • each IO is able to do a cascade load: if True (default) all child objects are loaded, otherwise only the top level object is loaded.
  • each IO is able to save and load all required attributes (metadata) of the objects it supports.
  • each IO can freely add user-defined or manufacturer-defined metadata to the annotations attribute of an object.
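The read() contract in the list above can be sketched with minimal stand-in classes (illustrative mocks only; the real logic lives in BaseIO and the individual IO classes):

```python
# Illustrative mocks only: real neo IOs inherit from neo.io.BaseIO and the
# class/attribute names below are simplified stand-ins, not the neo API.
class Block:
    def __init__(self):
        self.segments = []

class Segment:
    pass

class MockSegmentOnlyIO:
    """An IO that can only read Segments directly."""
    readable_objects = [Segment]

    def read_segment(self):
        # A real IO would parse the file here.
        return Segment()

    def read(self):
        # Per the contract above: read() always returns a list of Blocks.
        # An IO that only supports Segment reading wraps all segments from
        # the file (here just one) in a single Block.
        block = Block()
        block.segments.append(self.read_segment())
        return [block]

blocks = MockSegmentOnlyIO().read()
print(len(blocks), len(blocks[0].segments))  # 1 1
```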

If you want to develop your own IO

See IO developers’ guide for information on how to implement a new IO.

List of implemented formats

neo.io provides classes for reading and/or writing electrophysiological data files.

Note that if the package dependency is not satisfied for an IO, a warning is issued rather than an error.

neo.io.iolist provides a list of successfully imported IO classes.
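The warn-instead-of-fail behaviour can be mimicked with plain importlib. This is a sketch of the pattern, not neo's actual implementation; load_io_class is a hypothetical helper:

```python
import importlib
import warnings

def load_io_class(module_name, class_name):
    """Hypothetical helper: return the IO class if its module imports,
    otherwise warn and return None. This mimics the behaviour described
    above; neo's actual implementation may differ in detail."""
    try:
        module = importlib.import_module(module_name)
        return getattr(module, class_name)
    except ImportError as err:
        warnings.warn("{} is not available: {}".format(class_name, err))
        return None

# A module that is present imports cleanly; a missing one only warns.
assert load_io_class("json", "JSONDecoder") is not None
assert load_io_class("no_such_dependency", "SomeIO") is None
```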

Classes:

class neo.io.AlphaOmegaIO(filename=None)

Class for reading data from Alpha Omega .map files (experimental)

This class is an experimental reader with important limitations. See the source code for details of the limitations. The code of this reader is of alpha quality and has received only limited testing.

Usage:
>>> from neo import io
>>> r = io.AlphaOmegaIO(filename='File_AlphaOmega_1.map')
>>> blck = r.read_block(lazy=False, cascade=True)
>>> print(blck.segments[0].analogsignals)
class neo.io.AsciiSignalIO(filename=None)

Class for reading signals in generic ASCII format. Columns represent signals, all sharing the same sampling rate. The sampling rate must be known externally, or the first column can hold the time vector.

Usage:
>>> from neo import io
>>> r = io.AsciiSignalIO(filename='File_asciisignal_2.txt')
>>> seg = r.read_segment(lazy=False, cascade=True)
>>> print(seg.analogsignals)
[<AnalogSignal(array([ 39.0625    ,   0.        ,   0.        , ..., -26.85546875 ...
class neo.io.AsciiSpikeTrainIO(filename=None)

Class for reading/writing SpikeTrains in a text file. Each SpikeTrain occupies one line.

Usage:
>>> from neo import io
>>> r = io.AsciiSpikeTrainIO(filename='File_ascii_spiketrain_1.txt')
>>> seg = r.read_segment(lazy=False, cascade=True)
>>> print(seg.spiketrains)
[<SpikeTrain(array([ 3.89981604,  4.73258781,  0.608428  ,  4.60246277,  1.23805797,
...
class neo.io.AxonIO(filename=None)

Class for reading data from pCLAMP and AxoScope files (.abf version 1 and 2), developed by Molecular Devices (formerly Axon Instruments).

Usage:
>>> from neo import io
>>> r = io.AxonIO(filename='File_axon_1.abf')
>>> bl = r.read_block(lazy=False, cascade=True)
>>> print(bl.segments)
[<neo.core.segment.Segment object at 0x105516fd0>]
>>> print(bl.segments[0].analogsignals)
[<AnalogSignal(array([ 2.18811035,  2.19726562,  2.21252441, ...,
    1.33056641,  1.3458252 ,  1.3671875 ], dtype=float32) * pA,
    [0.0 s, 191.2832 s], sampling rate: 10000.0 Hz)>]
>>> print(bl.segments[0].events)
[]
class neo.io.BlackrockIO(filename, nsx_override=None, nev_override=None, sif_override=None, ccf_override=None, verbose=False)

Class for reading data from a file set recorded by the Blackrock (Cerebus) recording system.

Upon initialization, the class is linked to the available set of Blackrock files. Data can be read as a neo Block or neo Segment object using the read_block or read_segment function, respectively.

Note: This routine will handle files according to specification 2.1, 2.2, and 2.3. Recording pauses that may occur in file specifications 2.2 and 2.3 are automatically extracted and the data set is split into different segments.

Inherits from:
neo.io.BaseIO

The Blackrock data format consists not of a single file, but a set of different files. This constructor associates itself with a set of files that constitute a common data set. By default, all files belonging to the file set have the same base name, but different extensions. However, by using the override parameters, individual filenames can be set.

Args:
filename (string):
File name (without extension) of the set of Blackrock files to associate with. Any .nsX or .nev, .sif, or .ccf extensions are ignored when parsing this parameter.
nsx_override (string):
File name of the .nsX files (without extension). If None, _filenames is used. Default: None.
nev_override (string):
File name of the .nev file (without extension). If None, _filenames is used. Default: None.
sif_override (string):
File name of the .sif file (without extension). If None, _filenames is used. Default: None.
ccf_override (string):
File name of the .ccf file (without extension). If None, _filenames is used. Default: None.
verbose (boolean):
If True, the class will output additional diagnostic information on stdout. Default: False
Returns:
Examples:
>>> a = BlackrockIO('myfile')
Loads a set of files consisting of files myfile.ns1, ..., myfile.ns6, and myfile.nev
>>> b = BlackrockIO('myfile', nev_override='sorted')
Loads the analog data from the set of files myfile.ns1, ..., myfile.ns6, but reads spike/event data from sorted.nev
class neo.io.BrainVisionIO(filename=None)

Class for reading/writing data from BrainVision products (brainAmp, brain analyser...)

Usage:
>>> from neo import io
>>> r = io.BrainVisionIO(filename='File_brainvision_1.eeg')
>>> seg = r.read_segment(lazy=False, cascade=True)
class neo.io.BrainwareDamIO(filename=None)

Class for reading Brainware raw data files with the extension ‘.dam’.

The read_block method returns the first Block of the file. It will automatically close the file after reading. The read method is the same as read_block.

Note:

The file format does not contain a sampling rate. The sampling rate is set to 1 Hz, but this is arbitrary. If you have a corresponding .src or .f32 file, you can get the sampling rate from that. It may also be possible to infer it from the attributes, such as “sweep length”, if present.

Usage:
>>> from neo.io.brainwaredamio import BrainwareDamIO
>>> damfile = BrainwareDamIO(filename='multi_500ms_mulitrep_ch1.dam')
>>> blk1 = damfile.read()
>>> blk2 = damfile.read_block()
>>> print(blk1.segments)
>>> print(blk1.segments[0].analogsignals)
>>> print(blk1.units)
>>> print(blk1.units[0].name)
>>> print(blk2)
>>> print(blk2[0].segments)
class neo.io.BrainwareF32IO(filename=None)

Class for reading Brainware Spike ReCord files with the extension ‘.f32’

The read_block method returns the first Block of the file. It will automatically close the file after reading. The read method is the same as read_block.

The read_all_blocks method automatically reads all Blocks. It will automatically close the file after reading.

The read_next_block method will return one Block each time it is called. It will automatically close the file and reset to the first Block after reading the last block. Call the close method to close the file and reset this method back to the first Block.

The isopen property tells whether the file is currently open and reading or closed.

Note 1:
There is always only one ChannelIndex. BrainWare stores the equivalent of ChannelIndexes in separate files.
Usage:
>>> from neo.io.brainwaref32io import BrainwareF32IO
>>> f32file = BrainwareF32IO(filename='multi_500ms_mulitrep_ch1.f32')
>>> blk1 = f32file.read()
>>> blk2 = f32file.read_block()
>>> print(blk1.segments)
>>> print(blk1.segments[0].spiketrains)
>>> print(blk1.units)
>>> print(blk1.units[0].name)
>>> print(blk2)
>>> print(blk2[0].segments)
class neo.io.BrainwareSrcIO(filename=None)

Class for reading Brainware Spike ReCord files with the extension ‘.src’

The read_block method returns the first Block of the file. It will automatically close the file after reading. The read method is the same as read_block.

The read_all_blocks method automatically reads all Blocks. It will automatically close the file after reading.

The read_next_block method will return one Block each time it is called. It will automatically close the file and reset to the first Block after reading the last block. Call the close method to close the file and reset this method back to the first Block.

The _isopen property tells whether the file is currently open and reading or closed.

Note 1:
The first Unit in each ChannelIndex is always UnassignedSpikes, which has a SpikeTrain for each Segment containing all the spikes not assigned to any Unit in that Segment.
Note 2:
The first Segment in each Block is always Comments, which stores all comments as an Event object.
Note 3:
The parameters from the BrainWare table for each condition are stored in the Segment annotations. If there are multiple repetitions of a condition, each repetition is stored as a separate Segment.
Note 4:
There is always only one ChannelIndex. BrainWare stores the equivalent of ChannelIndexes in separate files.
Usage:
>>> from neo.io.brainwaresrcio import BrainwareSrcIO
>>> srcfile = BrainwareSrcIO(filename='multi_500ms_mulitrep_ch1.src')
>>> blk1 = srcfile.read()
>>> blk2 = srcfile.read_block()
>>> blks = srcfile.read_all_blocks()
>>> print(blk1.segments)
>>> print(blk1.segments[0].spiketrains)
>>> print(blk1.units)
>>> print(blk1.units[0].name)
>>> print(blk2)
>>> print(blk2[0].segments)
>>> print(blks)
>>> print(blks[0].segments)
class neo.io.ElanIO(filename=None)

Class for reading/writing data from Elan.

Usage:
>>> from neo import io
>>> r = io.ElanIO(filename='File_elan_1.eeg')
>>> seg = r.read_segment(lazy=False, cascade=True)
>>> print(seg.analogsignals)
[<AnalogSignal(array([ 89.21203613,  88.83666992,  87.21008301, ...,
    64.56298828, 67.94128418,  68.44177246], dtype=float32) * pA,
    [0.0 s, 101.5808 s], sampling rate: 10000.0 Hz)>]
>>> print(seg.spiketrains)
[]
>>> print(seg.events)
[]
class neo.io.IgorIO(filename=None, parse_notes=None)

Class for reading Igor Binary Waves (.ibw) written by WaveMetrics’ IGOR Pro software.

Support for Packed Experiment (.pxp) files is planned.

It requires the igor Python package by W. Trevor King.

Usage:
>>> from neo import io
>>> r = io.IgorIO(filename='...ibw')
class neo.io.KlustaKwikIO(filename, sampling_rate=30000.0)

Reading and writing from KlustaKwik-format files.

class neo.io.KwikIO(filename)

Class for “reading” experimental data from a .kwik file.

Generates a Segment with an AnalogSignal.

class neo.io.MicromedIO(filename=None)

Class for reading data from Micromed (.trc) files.

Usage:
>>> from neo import io
>>> r = io.MicromedIO(filename='File_micromed_1.TRC')
>>> seg = r.read_segment(lazy=False, cascade=True)
>>> print(seg.analogsignals)
[<AnalogSignal(array([ -1.77246094e+02,  -2.24707031e+02,
    -2.66015625e+02, ...
class neo.io.NeoHdf5IO(filename)

Class for reading HDF5 format files created by Neo version 0.4 or earlier.

Writing to HDF5 is not supported by this IO; we recommend using NixIO for this.

class neo.io.NeoMatlabIO(filename=None)

Class for reading/writing Neo objects in MATLAB format (.mat) versions 5 to 7.2.

This module is a bridge for MATLAB users who want to adopt the Neo object representation. The nomenclature is the same but using Matlab structs and cell arrays. With this module MATLAB users can use neo.io to read a format and convert it to .mat.

Rules of conversion:
  • Neo classes are converted to MATLAB structs. e.g., a Block is a struct with attributes “name”, “file_datetime”, ...
  • Neo one_to_many relationships are cellarrays in MATLAB. e.g., seg.analogsignals[2] in Python Neo will be seg.analogsignals{3} in MATLAB.
  • Quantity attributes are represented by 2 fields in MATLAB. e.g., anasig.t_start = 1.5 * s in Python will be anasig.t_start = 1.5 and anasig.t_start_unit = 's' in MATLAB.
  • classes that inherit from Quantity (AnalogSignal, SpikeTrain, ...) in Python will have 2 fields (array and units) in the MATLAB struct. e.g.: AnalogSignal( [1., 2., 3.], 'V') in Python will be anasig.array = [1. 2. 3] and anasig.units = 'V' in MATLAB.

1 - Scenario 1: create data in MATLAB and read them in Python

This MATLAB code generates a block:

block = struct();
block.segments = { };
block.name = 'my block with matlab';
for s = 1:3
    seg = struct();
    seg.name = strcat('segment ',num2str(s));

    seg.analogsignals = { };
    for a = 1:5
        anasig = struct();
        anasig.signal = rand(100,1);
        anasig.signal_units = 'mV';
        anasig.t_start = 0;
        anasig.t_start_units = 's';
        anasig.sampling_rate = 100;
        anasig.sampling_rate_units = 'Hz';
        seg.analogsignals{a} = anasig;
    end

    seg.spiketrains = { };
    for t = 1:7
        sptr = struct();
        sptr.times = rand(30,1)*10;
        sptr.times_units = 'ms';
        sptr.t_start = 0;
        sptr.t_start_units = 'ms';
        sptr.t_stop = 10;
        sptr.t_stop_units = 'ms';
        seg.spiketrains{t} = sptr;
    end

    event = struct();
    event.times = [0, 10, 30];
    event.times_units = 'ms';
    event.labels = ['trig0'; 'trig1'; 'trig2'];
    seg.events{1} = event;

    epoch = struct();
    epoch.times = [10, 20];
    epoch.times_units = 'ms';
    epoch.durations = [4, 10];
    epoch.durations_units = 'ms';
    epoch.labels = ['a0'; 'a1'];
    seg.epochs{1} = epoch;

    block.segments{s} = seg;
    
end

save 'myblock.mat' block -V7

This code reads it in Python:

import neo
r = neo.io.NeoMatlabIO(filename='myblock.mat')
bl = r.read_block()
print(bl.segments[1].analogsignals[2])
print(bl.segments[1].spiketrains[4])

2 - Scenario 2: create data in Python and read them in MATLAB

This Python code generates the same block as in the previous scenario:

import neo
import quantities as pq
from numpy import array
from numpy.random import rand

bl = neo.Block(name='my block with neo')
for s in range(3):
    seg = neo.Segment(name='segment' + str(s))
    bl.segments.append(seg)
    for a in range(5):
        anasig = neo.AnalogSignal(rand(100)*pq.mV, t_start=0*pq.s, sampling_rate=100*pq.Hz)
        seg.analogsignals.append(anasig)
    for t in range(7):
        sptr = neo.SpikeTrain(rand(40)*pq.ms, t_start=0*pq.ms, t_stop=10*pq.ms)
        seg.spiketrains.append(sptr)
    ev = neo.Event([0, 10, 30]*pq.ms, labels=array(['trig0', 'trig1', 'trig2']))
    ep = neo.Epoch([10, 20]*pq.ms, durations=[4, 10]*pq.ms, labels=array(['a0', 'a1']))
    seg.events.append(ev)
    seg.epochs.append(ep)

from neo.io.neomatlabio import NeoMatlabIO
w = NeoMatlabIO(filename='myblock.mat')
w.write_block(bl)

This MATLAB code reads it:

load 'myblock.mat'
block.name
block.segments{2}.analogsignals{3}.signal
block.segments{2}.analogsignals{3}.signal_units
block.segments{2}.analogsignals{3}.t_start
block.segments{2}.analogsignals{3}.t_start_units

3 - Scenario 3: conversion

This Python code converts a Spike2 file to MATLAB:

from neo import Block
from neo.io import Spike2IO, NeoMatlabIO

r = Spike2IO(filename='spike2.smr')
w = NeoMatlabIO(filename='convertedfile.mat')
blocks = r.read()
w.write(blocks[0])
class neo.io.NestIO(filenames=None)

Class for reading NEST output files. GDF files for the spike data and DAT files for analog signals are possible.

Usage:

>>> import quantities as pq
>>> from neo.io.nestio import NestIO
>>> files = ['membrane_voltages-1261-0.dat',
...          'spikes-1258-0.gdf']
>>> r = NestIO(filenames=files)
>>> seg = r.read_segment(gid_list=[], t_start=400 * pq.ms,
...                      t_stop=600 * pq.ms, id_column_gdf=0,
...                      time_column_gdf=1, id_column_dat=0,
...                      time_column_dat=1, value_columns_dat=2)
class neo.io.NeuroExplorerIO(filename=None)

Class for reading NeuroExplorer (.nex) files.

Usage:
>>> from neo import io
>>> r = io.NeuroExplorerIO(filename='File_neuroexplorer_1.nex')
>>> seg = r.read_segment(lazy=False, cascade=True)
>>> print(seg.analogsignals)
[<AnalogSignal(array([ 39.0625    ,   0.        ,   0.        , ...,
>>> print(seg.spiketrains)
[<SpikeTrain(array([  2.29499992e-02,   6.79249987e-02, ...
>>> print(seg.events)
[<Event: @21.1967754364 s, @21.2993755341 s, @21.350725174 s, ...
>>> print(seg.epochs)
[<neo.core.epoch.Epoch object at 0x10561ba90>,
 <neo.core.epoch.Epoch object at 0x10561bad0>]
class neo.io.NeuroScopeIO(filename=None)
neo.io.NeuroshareIO

alias of NeurosharectypesIO

class neo.io.NixIO(filename, mode='rw')

Class for reading and writing NIX files.

class neo.io.PickleIO(filename=None, **kargs)

A class for reading and writing Neo data from/to the Python “pickle” format.

Note that files in this format may not be readable if using a different version of Neo to that used to create the file. It should therefore not be used for long-term storage, but rather for intermediate results in a pipeline.
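The caveat above comes from pickle itself: unpickling requires compatible class definitions at load time. A minimal sketch with plain pickle (no Neo objects) of the kind of round-trip PickleIO performs:

```python
from io import BytesIO  # stdlib io module, not neo.io
import pickle

# PickleIO essentially delegates to pickle, so this round-trips a plain
# dict in memory to illustrate. With real Neo objects, unpickling needs
# the class definitions from the same (or a compatible) Neo version.
buffer = BytesIO()
data = {"name": "segment 0", "samples": [1.0, 2.0, 3.0]}
pickle.dump(data, buffer)

buffer.seek(0)
restored = pickle.load(buffer)
assert restored == data
```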

class neo.io.PlexonIO(filename=None)

Class for reading data from Plexon acquisition systems (.plx)

Compatible with versions 100 to 106. Other versions have not been tested.

Usage:
>>> from neo import io
>>> r = io.PlexonIO(filename='File_plexon_1.plx')
>>> seg = r.read_segment(lazy=False, cascade=True)
>>> print(seg.analogsignals)
[]
>>> print(seg.spiketrains)
[<SpikeTrain(array([  2.75000000e-02,   5.68250000e-02, ...,
...
>>> print(seg.events)
[]
class neo.io.PyNNNumpyIO(filename=None, **kargs)

Reads/writes data from/to PyNN NumpyBinaryFile format

class neo.io.PyNNTextIO(filename=None, **kargs)

Reads/writes data from/to PyNN StandardTextFile format

class neo.io.RawBinarySignalIO(filename=None)

Class for reading/writing data in a raw binary interleaved compact file.

Usage:
>>> from neo import io
>>> r = io.RawBinarySignalIO(filename='File_ascii_signal_2.txt')
>>> seg = r.read_segment(lazy=False, cascade=True)
>>> print(seg.analogsignals)
...
class neo.io.StimfitIO(filename=None)

Class for converting a stfio Recording to a Neo object. Provides a standardized representation of the data as defined by the Neo project, which is useful for exploring the data with the increasing number of electrophysiology software tools that rely on the Neo standard.

stfio is a standalone file i/o Python module that ships with the Stimfit program (http://www.stimfit.org). It is a Python wrapper around Stimfit’s file i/o library (libstfio) that natively provides support for the following file types:

  • ABF (Axon binary file format; pClamp 6–9)
  • ABF2 (Axon binary file format 2; pClamp 10+)
  • ATF (Axon text file format)
  • AXGX/AXGD (Axograph X file format)
  • CFS (Cambridge electronic devices filing system)
  • HEKA (HEKA binary file format)
  • HDF5 (Hierarchical data format 5; only hdf5 files written by Stimfit or stfio are supported)

In addition, libstfio can use the biosig file i/o library as an additional file handling backend (http://biosig.sourceforge.net/), extending support to more than 30 additional file formats (http://pub.ist.ac.at/~schloegl/biosig/TESTED).

Example usage:
>>> import neo
>>> neo_obj = neo.io.StimfitIO("file.abf")
or
>>> import stfio
>>> stfio_obj = stfio.read("file.abf")
>>> neo_obj = neo.io.StimfitIO(stfio_obj)
class neo.io.TdtIO(dirname=None)

Class for reading data from the Tucker-Davis TTank format.

Usage:
>>> from neo import io
>>> r = io.TdtIO(dirname='aep_05')
>>> bl = r.read_block(lazy=False, cascade=True)
>>> print(bl.segments)
[<neo.core.segment.Segment object at 0x1060a4d10>]
>>> print(bl.segments[0].analogsignals)
[<AnalogSignal(array([ 2.18811035,  2.19726562,  2.21252441, ...,
    1.33056641, 1.3458252 ,  1.3671875 ], dtype=float32) * pA,
    [0.0 s, 191.2832 s], sampling rate: 10000.0 Hz)>]
>>> print(bl.segments[0].events)
[]
class neo.io.WinEdrIO(filename=None)

Class for reading data from WinEDR.

Usage:
>>> from neo import io
>>> r = io.WinEdrIO(filename='File_WinEDR_1.EDR')
>>> seg = r.read_segment(lazy=False, cascade=True)
>>> print(seg.analogsignals)
[<AnalogSignal(array([ 89.21203613,  88.83666992,  87.21008301, ...,  64.56298828,
        67.94128418,  68.44177246], dtype=float32) * pA, [0.0 s, 101.5808 s], sampling rate: 10000.0 Hz)>]
class neo.io.WinWcpIO(filename=None)

Class for reading from a WinWCP file.

Usage:
>>> from neo import io
>>> r = io.WinWcpIO(filename='File_winwcp_1.wcp')
>>> bl = r.read_block(lazy=False, cascade=True)
>>> print(bl.segments)
[<neo.core.segment.Segment object at 0x1057bd350>, <neo.core.segment.Segment object at 0x1057bd2d0>,
...
>>> print(bl.segments[0].analogsignals)
[<AnalogSignal(array([-2438.73388672, -2428.96801758, -2425.61083984, ..., -2695.39453125,
...

Logging

neo uses the standard Python logging module for logging. All neo.io classes have logging set up by default, although not all classes produce log messages. The logger name is the same as the fully qualified class name, e.g. neo.io.hdf5io.NeoHdf5IO. By default, only log messages that are critically important for users are displayed, so users should not disable log messages unless they are sure they know what they are doing. However, if you wish to disable the messages, you can do so:

>>> import logging
>>>
>>> logger = logging.getLogger('neo')
>>> logger.setLevel(100)

Some io classes provide additional information that might be interesting to advanced users. To enable these messages, do the following:

>>> import logging
>>>
>>> logger = logging.getLogger('neo')
>>> logger.setLevel(logging.INFO)

It is also possible to log to a file in addition to the terminal:

>>> import logging
>>>
>>> logger = logging.getLogger('neo')
>>> handler = logging.FileHandler('filename.log')
>>> logger.addHandler(handler)

To log to the file only, and not to the terminal:

>>> import logging
>>> from neo import logging_handler
>>>
>>> logger = logging.getLogger('neo')
>>> handler = logging.FileHandler('filename.log')
>>> logger.addHandler(handler)
>>>
>>> logging_handler.setLevel(100)

This can also be done for individual IO classes:

>>> import logging
>>>
>>> logger = logging.getLogger('neo.io.hdf5io.NeoHdf5IO')
>>> handler = logging.FileHandler('filename.log')
>>> logger.addHandler(handler)

Individual IO classes can have their loggers disabled as well:

>>> import logging
>>>
>>> logger = logging.getLogger('neo.io.hdf5io.NeoHdf5IO')
>>> logger.setLevel(100)

And more detailed logging messages can be enabled for individual IO classes:

>>> import logging
>>>
>>> logger = logging.getLogger('neo.io.hdf5io.NeoHdf5IO')
>>> logger.setLevel(logging.INFO)

The default handler, which is used to print logs to the command line, is stored in neo.logging_handler. This example changes how the log text is displayed:

>>> import logging
>>> from neo import logging_handler
>>>
>>> formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
>>> logging_handler.setFormatter(formatter)

For more complex logging, please see the documentation for the logging module.

Note

If you wish to implement more advanced logging as described in the documentation for the logging module or elsewhere on the internet, please do so before calling any neo functions or initializing any neo classes. This is because the default handler is created when neo is imported, but it is not attached to the neo logger until a class that uses logging is initialized or a function that uses logging is called. Further, the handler is only attached if there are no handlers already attached to the root logger or the neo logger, so adding your own handler will override the default one. Additional functions and/or classes may gain logging during bugfix releases, so code relying on particular modules not having logging may break at any time without warning.