tlpipe.pipeline.pipeline.OneAndOne

class tlpipe.pipeline.pipeline.OneAndOne(parameter_file_or_dict=None, feedback=2)[source]

Base class for tasks that have (at most) one input and one output.

The input for the task will preferably be obtained from the in key. If no input product is specified by the in key, the input will instead be read from the files given by params['input_files'], provided that value is anything other than None; otherwise an error will be raised upon initialization.

If the value of params['output_files'] is anything other than None, the output will be written (using write_output()) to the specified output files.

Attributes

cacheable Override to return True if caching results is implemented.
embarrassingly_parallelizable Override to return True if next() is trivially parallelizable.
history History that will be added to the output file.
iteration Current iteration when iterable is True, None otherwise.
params_init
prefix
__init__(parameter_file_or_dict=None, feedback=2)[source]

Methods

cast_input(input) Override to support accepting pipeline inputs of various types.
copy_input(input) Override to return a copy of the input so that the original input will not be changed by the task.
finish() Final analysis stage of pipeline task.
next([input]) Should not need to be overridden.
process(input) Override this method with your data processing task.
read_input() Override to implement reading inputs from disk.
read_output(filenames) Override to implement reading outputs from disk.
read_process_write(input) Reads input, executes any processing and writes output.
restart_iteration() Re-start the iteration.
setup([requires]) First analysis stage of pipeline task.
show_params() Show all settable parameters of this task and their default values.
stop_iteration([force_stop]) Determine whether to stop the iteration.
write_output(output) Override to implement writing outputs to disk.
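The intended subclassing pattern, in which a task only overrides process() and read_process_write() ties the pieces together, can be sketched as follows. The base class here is a minimal illustrative stand-in so the example runs without tlpipe installed; the real OneAndOne has more machinery (iteration control, caching, history), and the Double task is invented for this example.

```python
class OneAndOne:
    """Illustrative stand-in for tlpipe.pipeline.pipeline.OneAndOne."""

    params_init = {'input_files': None, 'output_files': None}

    def __init__(self, parameter_file_or_dict=None, feedback=2):
        self.params = dict(self.params_init)
        if isinstance(parameter_file_or_dict, dict):
            self.params.update(parameter_file_or_dict)

    def read_input(self):
        # Override to implement reading inputs from disk.
        raise NotImplementedError

    def process(self, input):
        # Override this method with your data processing task.
        raise NotImplementedError

    def write_output(self, output):
        # Override to implement writing outputs to disk.
        pass

    def read_process_write(self, input):
        # Reads input, executes any processing and writes output:
        # prefer the pipeline input, else fall back to input_files.
        if input is None and self.params['input_files'] is not None:
            input = self.read_input()
        output = self.process(input)
        if self.params['output_files'] is not None:
            self.write_output(output)
        return output


class Double(OneAndOne):
    """A toy one-input, one-output task: double every value."""

    def process(self, input):
        return [2 * x for x in input]


task = Double({'output_files': None})
result = task.read_process_write([1, 2, 3])
```

With an input product of [1, 2, 3] from the pipeline, read_input() is never called and the task simply returns the processed list; setting 'output_files' would additionally trigger write_output().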