Dylan Martin edited this page Sep 14, 2022 · 15 revisions

COINSTAC

Decentralized Analyses Made Easy

The Collaborative Informatics and Neuroimaging Suite Toolkit for Anonymous Computation (COINSTAC) is a tool developed to overcome barriers to collaboration through federated analysis and standardized collaboration methods. COINSTAC enables users to create collaborative consortia and run decentralized computations against consortia members' data in place. The data is never moved or shared; only group-level derivatives are passed back. This allows large-scale collaborative analysis across multiple sites without the technological, policy, and administrative challenges that come with centralized analysis.
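The decentralized pattern described above can be sketched in a few lines: each site computes a group-level derivative locally, and only those derivatives are combined. This is a minimal illustrative example, not COINSTAC's actual API; the function names and the federated-mean computation are assumptions chosen for clarity.

```python
def local_summary(site_data):
    """Run at each site: compute a derivative (sum, count) locally.

    Raw data never leaves the site; only this summary is shared.
    """
    return sum(site_data), len(site_data)

def aggregate(summaries):
    """Run at the coordinator: combine site derivatives into a global result."""
    total = sum(s for s, _ in summaries)
    count = sum(n for _, n in summaries)
    return total / count

# Two hypothetical sites, each keeping its data in place
site_a = [1.0, 2.0, 3.0]
site_b = [4.0, 5.0]

summaries = [local_summary(site_a), local_summary(site_b)]
global_mean = aggregate(summaries)  # 3.0, identical to the pooled mean
```

The key property is that `aggregate` sees only `(sum, count)` pairs, yet produces the same answer a centralized analysis over the pooled data would.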

Let's do research!

Team

This section lists our current team members and their preferred roles and responsibilities on this project. Use it to find the right kind of help and to know whom to assign GitHub Issues to.

  • Paul Prae, Technical Project Lead and Data Platform Architect: As a tech lead, Paul enjoys being a success coach. He helps people have fun while building quality software products. As an architect, Paul enjoys working in cloud and data operations. On COINSTAC, his current focus areas are implementing agile software development processes and bringing new innovations to COINSTAC's cloud architecture.

  • Dylan Martin, Software Engineer: Dylan enjoys application development, architecture planning and integrating cloud services.

  • Ross Kelly, Lead Software Engineer: Ross leads COINSTAC's technical design and planning, and enjoys architecting systems and data sharing.

  • Sandeep Panta, Software Development Manager: Sandeep is experienced with various pre-processing pipelines and is familiar with research tools used in the neuroscience community, data standardization projects like BIDS, big data sets, and the challenges of comparing results derived from different collection protocols and processing techniques. He is first author on two machine learning papers: A Tool for Interactive Data Visualization: Application to Over 10,000 Brain Imaging and Phantom MRI Data Sets and Classifying Handedness with MRI. He enjoys researching and developing new tools that further big-data collaborative research, solving data standardization challenges, converting algorithms from research publications into easy-to-access software tools for the community, automating tedious pre-processing pipelines and quality-control metrics for large datasets, and creative problem solving in general.

Key Wiki Resources
