- Data Commons: Full-stack infrastructure for capturing, structuring, organizing, managing, and supporting analysis of heterogeneous data sets (including imaging, time series, etc.) for an organization.
- Data Pipelines: Scalable, high-throughput data processing for complex workflows, from RNA sequencing to machine-learning models.
- Directory and Security: Authorized access to data sets, supporting different data policies, including HIPAA-regulated and otherwise restricted data.
- Data Portal: Data-driven portal to support a variety of enterprise access and use needs.
- Registries: Patient registries for capturing and managing studies and connecting them to discrete types of data for analysis in the data commons.
- Metadata Tools: For managing different types of data models and common data elements.
- Curation Tools: For managing enterprise scale databases and knowledge content.
- Visualization Tools: For accessing and analyzing data in the data commons and ecosystem.
Common models enable quickly constructing a working, cloud-based data science platform, with extensions supporting a wide variety of data types.
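To illustrate the kind of common data element the Metadata Tools component above might manage, here is a minimal sketch: a data element defined as a simple schema plus a validator that checks a record against it. All names (`CDE_AGE_AT_DIAGNOSIS`, `validate`) are hypothetical for illustration and are not part of any actual platform API.

```python
# Hypothetical sketch: a common data element (CDE) expressed as a small
# schema dict, and a validator that checks one record against it.
# Names are illustrative only, not an actual data-commons API.

CDE_AGE_AT_DIAGNOSIS = {
    "name": "age_at_diagnosis",
    "type": int,
    "required": True,
    "range": (0, 120),  # plausible bounds, in years
}

def validate(record: dict, cde: dict) -> list:
    """Return a list of validation errors; an empty list means the record conforms."""
    errors = []
    value = record.get(cde["name"])
    if value is None:
        if cde.get("required"):
            errors.append("missing required element: %s" % cde["name"])
        return errors
    if not isinstance(value, cde["type"]):
        errors.append("%s: expected %s" % (cde["name"], cde["type"].__name__))
        return errors
    lo, hi = cde.get("range", (None, None))
    if lo is not None and not (lo <= value <= hi):
        errors.append("%s: %s outside [%s, %s]" % (cde["name"], value, lo, hi))
    return errors

print(validate({"age_at_diagnosis": 42}, CDE_AGE_AT_DIAGNOSIS))   # no errors
print(validate({"age_at_diagnosis": 150}, CDE_AGE_AT_DIAGNOSIS))  # out of range
```

Registering elements like this in a shared catalog is what lets curation and visualization tools interpret data consistently across studies.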