Understanding operations pipelines

Dcipher tracks the workflow by constructing pipelines of operations based on user interactions with the data.

Written by Tomas Larsson

Keywords: operations pipeline, workflow

An operations pipeline shows the operations that have been applied to your data.

The operations are shown in the Manage Pipeline sidebar in the order they were applied. Here you can expand an operation to see and adjust its settings. If you change the settings and click Apply, the operation reruns, and so do all operations downstream of it in the pipeline.

Each operation results in a new dataset, so just as there is a pipeline of operations, there is also a pipeline of datasets. These are shown in the Pipeline View above the workspace. Hover over a dataset to see which operation generated it, or click a dataset to view it in the Schema workbench.
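To make the rerun behaviour concrete, here is a minimal sketch in Python of how a pipeline like this could be modelled. The Operation and Pipeline classes, their fields, and their methods are illustrative assumptions, not Dcipher's actual internals or API.

```python
# A minimal, hypothetical model of an operations pipeline: each operation takes
# the previous dataset and produces a new one, and changing an operation's
# settings reruns it plus everything downstream of it.
from dataclasses import dataclass, field
from typing import Any, Callable


@dataclass
class Operation:
    name: str
    settings: dict
    run: Callable[[Any, dict], Any]  # (input dataset, settings) -> new dataset


@dataclass
class Pipeline:
    source: Any                                         # the original dataset
    operations: list[Operation] = field(default_factory=list)
    datasets: list[Any] = field(default_factory=list)   # one result per operation

    def run_from(self, index: int) -> None:
        """Rerun the operation at `index` and every operation downstream of it."""
        data = self.source if index == 0 else self.datasets[index - 1]
        for i in range(index, len(self.operations)):
            data = self.operations[i].run(data, self.operations[i].settings)
            if i < len(self.datasets):
                self.datasets[i] = data
            else:
                self.datasets.append(data)

    def apply_settings(self, index: int, new_settings: dict) -> None:
        """Changing settings and clicking Apply reruns from that operation onward."""
        self.operations[index].settings = new_settings
        self.run_from(index)


# Example: a two-step pipeline over a list of strings.
pipeline = Pipeline(source=["Alpha", "Beta", "Gamma"])
pipeline.operations = [
    Operation("lowercase", {}, lambda data, s: [x.lower() for x in data]),
    Operation("filter", {"prefix": "g"},
              lambda data, s: [x for x in data if x.startswith(s["prefix"])]),
]
pipeline.run_from(0)                         # datasets: lowercased list, then ['gamma']
pipeline.apply_settings(1, {"prefix": "b"})  # only the filter step reruns -> ['beta']
```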

If you drag and drop your data onto a new workbench, a local pipeline is created. This lets you work with the dataset locally, in that workbench, without your changes affecting the global pipeline.

Local pipelines can be thought of as branches off the main, global pipeline: changes to the global data ripple through to the downstream local data, while local changes never affect the upstream global data.

Local pipelines can be moved to the global scope from the Manage Pipeline sidebar.
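Under the same assumptions as the sketch above, a local pipeline can be modelled as a branch that reads one of the global datasets and keeps its own results. The LocalPipeline class below reuses the hypothetical Operation and Pipeline classes from that sketch, and its promote method is only an illustrative guess at what moving a local pipeline to the global scope involves.

```python
# Hypothetical branch off the global pipeline: it reads a global dataset and
# applies its own operations, so global changes ripple down into the branch when
# it is rerun, while the branch never writes back into the global datasets.
from dataclasses import dataclass, field
from typing import Any

# Operation and Pipeline are the classes defined in the sketch above.


@dataclass
class LocalPipeline:
    parent: Pipeline        # the global pipeline this branch hangs off
    branch_index: int       # index of the global dataset the branch starts from
    operations: list[Operation] = field(default_factory=list)
    datasets: list[Any] = field(default_factory=list)

    def run(self) -> None:
        """Recompute the branch from its (possibly updated) global input dataset."""
        data = self.parent.datasets[self.branch_index]
        self.datasets = []
        for op in self.operations:
            data = op.run(data, op.settings)
            self.datasets.append(data)   # local results only; parent is untouched

    def promote(self) -> None:
        """Move the local operations into the global scope, after the branch point."""
        insert_at = self.branch_index + 1
        self.parent.operations[insert_at:insert_at] = self.operations
        self.operations = []
        self.parent.run_from(insert_at)
```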


Local pipelines are removed when you close or empty a workbench. To remove an operation and its resulting dataset from the operations pipeline, right-click the dataset in the pipeline and click "Delete".
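Continuing the same hypothetical sketch, deleting an operation could look like the following. Rerunning the remaining downstream operations on the new inputs is one plausible behaviour for illustration, not necessarily what Dcipher does.

```python
# Remove an operation and its resulting dataset, then rerun whatever comes after
# it (assumed behaviour). Uses the Pipeline class from the first sketch.
def delete_operation(pipeline: Pipeline, index: int) -> None:
    del pipeline.operations[index]
    del pipeline.datasets[index]
    if index < len(pipeline.operations):   # anything downstream needs recomputing
        pipeline.run_from(index)
```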
