Automated Machine Learning (AutoML) with MATLAB - Video
Get an overview of AutoML and how it simplifies the machine learning workflow. Learn how to build optimized predictive models in three steps:
- Apply wavelet scattering to obtain features from signal or image data without requiring signal processing expertise.
- Use automated feature selection to identify a small subset of features, which helps prevent overfitting, reduces model size, and speeds up training.
- Choose automated model selection to deliver an optimized model in one step with built-in hyperparameter optimization.
The video demonstrates how to apply AutoML to build a classifier of human activity, using sensor data from mobile devices as input.
AutoML in MATLAB delivers optimized models in just three steps. First, you convert raw sensor data to the features machine learning needs as input using wavelet scattering. Next, automated feature selection allows you to reduce large feature sets and thus the ultimate model size. And finally, automated model selection picks the best model for you and optimizes its hyperparameters in the same step. You can apply these steps without machine learning expertise; however, this video uses some technical terms to explain what's happening behind the scenes.
We demonstrate AutoML by building a model to classify activities, such as standing or sitting, using accelerometer data from a mobile phone. The remainder of the video walks you through the steps of AutoML in MATLAB. First, to obtain relevant features from signal or image data, you can decompose signals with wavelets using predefined wavelet and scaling filters; the featureMatrix function applies wavelet scattering to buffers of the signal.
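A minimal sketch of this first step, assuming the buffered accelerometer readings are stored in a samples-by-buffers matrix; variable names such as signalBuffers and fs are illustrative, not taken from the video:

```matlab
% Wavelet scattering features for buffered accelerometer data (illustrative).
% Assumes signalBuffers is a samples-by-buffers matrix sampled at fs Hz.
fs = 50;                                            % assumed sensor sampling rate
sf = waveletScattering('SignalLength', size(signalBuffers,1), ...
                       'SamplingFrequency', fs);

features = [];
for k = 1:size(signalBuffers,2)
    % featureMatrix applies the predefined wavelet and scaling filters,
    % returning scattering coefficients (paths-by-time-windows) for the buffer.
    smat = featureMatrix(sf, signalBuffers(:,k));
    features = [features; mean(smat, 2)'];          % average over time windows
end
```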
Next, we could proceed to train a model on the over 400 features we obtained, but that may result in a large model that does not fit on an embedded device. So, as a second step in AutoML, we apply feature selection to reduce the number of features. The table here helps you choose the appropriate method based on the characteristics of your data.
Here we apply the minimum redundancy maximum relevance (MRMR) algorithm, which works really well on continuous and categorical features for classification. The feature ranking chart suggests that as few as a dozen features capture the majority of the variability in the signal.
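A sketch of this feature-selection step, assuming the scattering features sit in an observations-by-features matrix and the activity labels are a categorical vector (names are illustrative):

```matlab
% MRMR feature ranking for classification (illustrative variable names).
% features: observations-by-features matrix; activity: categorical label vector.
[idx, scores] = fscmrmr(features, activity);  % rank features by MRMR importance

bar(scores(idx))                              % feature ranking chart
xlabel('Ranked feature'); ylabel('MRMR score');

topIdx = idx(1:12);                           % keep roughly a dozen top-ranked features
Xsmall = features(:, topIdx);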
After selecting a small, performant set of features, we can proceed to the third step: identifying the best-performing model. Use fitcauto for classification and fitrauto for regression. We train on just the dozen features we selected in the previous step. The algorithm evaluates many combinations of models and hyperparameters, seeking to minimize the error. In practice, this may take hundreds of iterations to fully converge, though for data of moderate size we see good results in much less time, around 100 iterations. You can speed up execution using parallelization on multiple cores on your local computer or using cloud instances.
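A sketch of the model-selection step; the option values shown (around 100 evaluations, parallel execution) mirror what the narration describes, but the exact settings used in the video are not shown and Xsmall/activity are the illustrative names from the previous sketches:

```matlab
% Automated model and hyperparameter selection (option values illustrative).
opts = struct('MaxObjectiveEvaluations', 100, ...  % ~100 iterations, as in the narration
              'UseParallel', true);                % needs Parallel Computing Toolbox
mdl = fitcauto(Xsmall, activity, ...
               'HyperparameterOptimizationOptions', opts);
```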
Applying AutoML to this data set, we obtain a model with 96.6% accuracy on held-out test data. In summary, AutoML obtained a highly accurate model in a few steps, effectively allowing engineers to build the models themselves without having to rely on data scientists. If you are knowledgeable in machine learning, AutoML saves you time on routine steps, allowing you to focus on advanced optimization techniques such as stacking models and engineering even better features. For more information, visit our AutoML discovery page or the links below.
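For completeness, a small sketch of how the held-out accuracy could be computed, assuming a separate test partition (Xtest, activityTest) prepared the same way as the training data; these names are illustrative:

```matlab
% Accuracy on held-out test data (Xtest/activityTest are assumed to exist).
pred = predict(mdl, Xtest);
testAccuracy = 100 * mean(pred == activityTest);   % percent correctly classified
fprintf('Held-out test accuracy: %.1f%%\n', testAccuracy);
```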
Featured product: Statistics and Machine Learning Toolbox