Scalable compute, with cost controls
- Data scientists can provision compute resources with a single click, either elastically in the cloud or across your on-premises hardware.
- Domino ships with optimized distributions of the most popular data science tools (e.g., R, Python, Jupyter, RStudio), and you can also build your own standard environments to share within your organization.
- Administrators can manage environments, set resource limits, monitor usage, kill runaway jobs, and attribute costs back to users or teams.
Reduce time to business impact
Domino lets data scientists publish models as REST APIs, hosted interactive web apps (e.g., Shiny), or scheduled jobs for generating reports or running ETL tasks. This avoids the delay of re-implementing work in production systems, so the business gets value faster and IT can focus on strategic objectives. Access controls and gatekeeping features let you enforce governance processes that control deployment.
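As a rough sketch of what consuming a published model looks like, a downstream application simply sends JSON to the model's scoring endpoint over HTTPS. The URL, authentication scheme, and payload shape below are illustrative assumptions, not Domino's actual API; substitute the endpoint and key shown when you publish your own model.

```python
import json
from typing import Any

# Hypothetical values -- replace with the endpoint URL and API key
# displayed for your published model (illustrative only).
API_URL = "https://domino.example.com/models/churn/latest/score"
API_KEY = "YOUR_API_KEY"

def build_request(features: dict[str, Any]) -> tuple[dict[str, str], bytes]:
    """Assemble HTTP headers and a JSON body for a model scoring call."""
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"data": features}).encode("utf-8")
    return headers, body

# To actually score a record (requires network access to your instance):
#   import urllib.request
#   headers, body = build_request({"tenure_months": 12, "plan": "pro"})
#   req = urllib.request.Request(API_URL, data=body, headers=headers)
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp))
```

Because the model is exposed as a plain REST endpoint, any system that can make an HTTP call (a CRM, a scheduler, another service) can consume predictions without re-implementing the model.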
Strengthen security and governance
Analytical assets are kept in one central place, on managed, monitored infrastructure — instead of being spread across users’ machines. Granular access controls keep work secured, with activity logs and reports available to administrators.
A system of record designed for data science
Domino keeps code, data, results, discussion, and environments linked together in one place. A central hub increases collaboration, transparency, and reproducibility — accelerating innovation while mitigating compliance and audit risk.