The MATLAB interface for Databricks® enables MATLAB® and Simulink® users to connect to data and compute capabilities in the cloud. Users can access and query big datasets remotely or deploy MATLAB code to run natively on a Databricks cluster.
Scale Big Data
Access cloud data sources through a Databricks cluster by connecting it to MATLAB with Database Toolbox™. Manipulate data remotely and use SQL to access a variety of data formats directly from cloud storage or through Delta Lake. Use Apache Spark™ SQL to access and query data sources and file types.
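As a rough sketch, a query like the one below pulls data from a Delta table into MATLAB through Database Toolbox, assuming an ODBC data source named "Databricks" has already been configured with the cluster's hostname and a personal access token; the data source name, credentials, table, and column names are placeholders.

% Connect through a preconfigured ODBC data source for the Databricks cluster.
% "Databricks", the "token" username, and the access token are placeholders.
conn = database("Databricks", "token", "<personal-access-token>");

% Run Spark SQL against a Delta table and pull the result back as a MATLAB table.
sqlquery = "SELECT pressure, temperature FROM mydatabase.sensor_readings LIMIT 1000";
readings = fetch(conn, sqlquery);

% Work with the result locally, for example a quick scatter plot.
scatter(readings.pressure, readings.temperature)
xlabel("Pressure"); ylabel("Temperature")

close(conn)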
Use Spark Interactively with MATLAB and Databricks Connect
Incorporate the Apache Spark API into MATLAB algorithms using Databricks Connect. Run your code in the cloud and exchange data with it from MATLAB files and live scripts, right from the MATLAB IDE and on any operating system.
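The sketch below shows one possible shape of that workflow using the MATLAB API for Spark classes under matlab.compiler.mlspark; the master URL, application name, and file path are placeholders, and the exact entry point when working through Databricks Connect may differ by release.

% Build a Spark configuration and context (placeholder master URL and app name).
conf = matlab.compiler.mlspark.SparkConf( ...
    'AppName', 'matlabSparkDemo', ...
    'Master', 'local[*]');
sc = matlab.compiler.mlspark.SparkContext(conf);

% Load a text dataset into an RDD and count the lines that mention "ERROR".
lines  = sc.textFile('dbfs:/data/logs/app.log');   % placeholder path
errors = lines.filter(@(s) contains(s, 'ERROR'));
fprintf('Error lines: %d\n', errors.count());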
Bring MATLAB to the Data
Use MATLAB Compiler™, Simulink Compiler™, and MATLAB Compiler SDK™ to package your algorithms for deployment on a Databricks cluster. Deployed algorithms can run as on-demand and scheduled jobs, can be used by other Databricks users, and can become part of data processing pipelines.
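For example, a hypothetical anomaly-detection function could be packaged with MATLAB Compiler SDK as a Python package whose artifacts are then installed as a library on the cluster; the function name, package name, and output folder below are placeholders, and the interface provides its own build targets for Databricks jobs.

% detectAnomalies.m is a hypothetical function to deploy, e.g.:
%   function result = detectAnomalies(data)
%       result = data(abs(zscore(data)) > 3);
%   end

% Package it with MATLAB Compiler SDK; the resulting artifacts can be
% uploaded to the Databricks cluster and called from jobs or notebooks.
buildResults = compiler.build.pythonPackage("detectAnomalies.m", ...
    "PackageName", "anomalydetector", ...
    "OutputDir", "pydeploy");
disp(buildResults.Files)   % files to upload and install on the cluster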
Collaborate on Data and Models Across the Enterprise
With Databricks, MATLAB and Simulink users can access and share a variety of data types, including binary, image, text, and video, enabling teams in engineering, business, and data analytics to interact across a single data platform.
Manage Algorithm and Model Lifecycle with MLflow
Use MLflow with MATLAB to run experiments; keep track of parameters, metrics, and code; and monitor execution results. Share your MATLAB models and discover algorithms in the MLflow Model Registry, saving time and allowing colleagues to benefit from your work.
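As one possible sketch, the MLflow tracking service on a Databricks workspace can be driven from MATLAB over its REST API; the workspace URL, token environment variable, and experiment ID below are placeholders, and the interface may offer higher-level helpers for the same steps.

% Connection details for the workspace (placeholders).
host  = "https://<workspace>.cloud.databricks.com";
token = getenv("DATABRICKS_TOKEN");
opts  = weboptions("HeaderFields", ["Authorization", "Bearer " + token], ...
                   "MediaType", "application/json");

% Start a run in an existing MLflow experiment.
run = webwrite(host + "/api/2.0/mlflow/runs/create", ...
    struct("experiment_id", "1234567890"), opts);
runId = run.run.info.run_id;

% Log a parameter and a metric for this run.
webwrite(host + "/api/2.0/mlflow/runs/log-parameter", ...
    struct("run_id", runId, "key", "learning_rate", "value", "0.01"), opts);
webwrite(host + "/api/2.0/mlflow/runs/log-metric", ...
    struct("run_id", runId, "key", "rmse", "value", 0.42, ...
           "timestamp", int64(posixtime(datetime("now")) * 1000)), opts);

% Mark the run as finished.
webwrite(host + "/api/2.0/mlflow/runs/update", ...
    struct("run_id", runId, "status", "FINISHED"), opts);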
MATLAB Workflows with Databricks