Autonomous technology will touch nearly every part of our lives, changing the products we build and the way we do business. It’s not just in self-driving cars, robots, and drones; it’s in predictive engine maintenance, automated trading, medical image interpretation, and other applications. Autonomy—the ability of a system to learn to operate independently—requires three elements:
Mary Ann Freeman shows you how engineers and scientists are combining these elements, using MATLAB® and Simulink®, to build autonomous technology into their products and services today—to build their autonomous anything.
Discovering promising new materials is central to our ability to design better batteries, but research over the last several decades has been driven by inefficient guess-and-check searches that have resulted in slow progress. Focusing on solid-state electrolyte materials, Austin Sendek built a data-driven model for predicting material performance by applying machine learning to a small set of 40 experimental data points on crystal structure and ionic conductivity from the literature. He used the resulting model to guide an experimental search for high ionic conductivity electrolyte materials and found that incorporating machine learning into the search leads to several times more discoveries than a comparable guess-and-check effort.
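The screening idea can be sketched in a few lines: fit a simple classifier to a small labeled set, then rank a large pool of unscreened candidates by predicted probability so that experiments target the most promising few. The Python sketch below uses synthetic features and labels; the feature values, model, and thresholds are illustrative assumptions, not Sendek's actual descriptors or MATLAB workflow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: 40 known electrolytes, each described by a few
# structural features (e.g. bond ionicity, anion volume), labeled 1 if the
# measured ionic conductivity exceeds a threshold. All values are synthetic.
X = rng.normal(size=(40, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = (X @ w_true + 0.3 * rng.normal(size=40) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Fit a logistic-regression classifier with plain gradient descent.
w = np.zeros(3)
b = 0.0
for _ in range(2000):
    p = sigmoid(X @ w + b)
    w -= 0.5 * (X.T @ (p - y) / len(y))
    b -= 0.5 * np.mean(p - y)

# Screen a large pool of unmeasured candidates and rank by predicted
# probability of high ionic conductivity.
candidates = rng.normal(size=(1000, 3))
scores = sigmoid(candidates @ w + b)
ranked = np.argsort(scores)[::-1]
print("top candidate score:", scores[ranked[0]])
```

The point of the sketch is the workflow, not the model: even a linear classifier trained on tens of points can prioritize a candidate pool far better than unguided guess-and-check.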
As the size and variety of engineering data have grown, so has the capability to access, process, and analyze those big engineering data sets in MATLAB®. With the rise of streaming data technologies, the volume and velocity of this data have increased significantly, motivating new approaches to handling data in motion. Jim Stewart discusses the use of MATLAB as a data analytics platform alongside best-in-class frameworks and infrastructure to express MATLAB-based workflows that enable real-time decision making through the application of machine learning models. He demonstrates how to use MATLAB Production Server™ to deploy these models on streams of data from Apache® Kafka™. The demonstration shows a full workflow, from developing a machine learning model in MATLAB to deploying it against a real-world-scale problem running on the cloud.
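The streaming pattern can be illustrated independently of any particular broker: messages arrive one at a time, and a deployed model scores each as it lands. This Python sketch substitutes a plain generator for the Kafka topic and a moving-average rule for the trained model; both stand-ins are assumptions for illustration, not the MATLAB Production Server API.

```python
from collections import deque

def stream(records):
    # Stand-in for a Kafka topic: in production, messages would arrive from
    # a consumer subscribed to the broker; here we just yield a list.
    yield from records

def make_scorer(window=3, threshold=10.0):
    # Stand-in "model": flag a reading when its moving average crosses a
    # threshold. A real deployment would call a trained ML model instead.
    buf = deque(maxlen=window)
    def score(x):
        buf.append(x)
        return sum(buf) / len(buf) > threshold
    return score

readings = [2.0, 4.0, 9.0, 14.0, 18.0, 20.0, 5.0]
score = make_scorer()
alerts = [i for i, x in enumerate(stream(readings)) if score(x)]
print("alert indices:", alerts)
```

The essential property is that the scorer keeps only bounded state (here, a three-element window), so it can run indefinitely on data in motion.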
Predictive maintenance—the practice of forecasting equipment failures before they occur—is a high priority for many organizations looking to get business value from historical performance data. New technologies such as machine learning and big data show promising results, but on their own they can fail to capture nuances that may be obvious to domain experts familiar with these systems.
See how machine learning and big data techniques can be used with traditional model-based techniques to create hybrid approaches for predicting failures. Through examples and case studies, Mehernaz Savai shows you how MATLAB® and Simulink® combine to provide a common platform for building predictive maintenance algorithms.
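One common hybrid pattern is to let the physics-based model predict nominal behavior and let a simple data-driven rule monitor the residual between prediction and measurement. The following Python sketch uses a synthetic drift fault and an illustrative residual threshold; the signal model, fault profile, and threshold are assumptions for illustration, not from the talk.

```python
import math

def model_output(t):
    # Physics-based (model-based) prediction of a healthy machine's vibration.
    return math.sin(0.1 * t)

def measurements(t, fault_after=50):
    # Synthetic sensor data: matches the model until a fault slowly grows in.
    drift = 0.02 * max(0, t - fault_after)
    return math.sin(0.1 * t) + drift

# Hybrid scheme: the model supplies the expected behavior, and a data-driven
# rule (residual statistics learned from a healthy baseline) flags the
# departure from it.
residuals = [measurements(t) - model_output(t) for t in range(100)]
baseline = residuals[:40]
mu = sum(baseline) / len(baseline)
sigma = (sum((r - mu) ** 2 for r in baseline) / len(baseline)) ** 0.5 or 1e-9
alarm_t = next(t for t, r in enumerate(residuals)
               if abs(r - mu) > 6 * sigma + 0.1)
print("fault flagged at t =", alarm_t)
```

Because the physics model absorbs the known dynamics, the data-driven part only has to detect departures from normal, which is where small historical data sets are often sufficient.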
Parallel computing enables you to scale applications to bring faster insight from your data. MATLAB® and Simulink® users can leverage the computational power of available hardware to solve and accelerate computation- and data-intensive problems, without needing to be parallel computing experts. Users can seamlessly develop applications and models on their desktops and scale to GPUs, computer clusters, and clouds. Applications include design optimization, deep learning in computer vision, and neural networks.
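A design-optimization sweep is a typical case: each model evaluation is independent, so the work is embarrassingly parallel and the same code can run serially or across workers. A Python sketch (thread-based here for simplicity; CPU-bound work would use process-based workers, analogous to parfor in MATLAB; the objective function is an illustrative assumption):

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(design):
    # Stand-in for an expensive model evaluation (one point of a design
    # sweep); each call is independent of the others.
    return (design - 3.0) ** 2 + 1.0

designs = [i * 0.5 for i in range(20)]

# Serial baseline.
serial = [simulate(d) for d in designs]

# Parallel sweep: executor.map preserves input order, so results can be
# compared element-for-element with the serial run.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(simulate, designs))

best = min(range(len(designs)), key=lambda i: parallel[i])
print("best design:", designs[best], "objective:", parallel[best])
```

The key property the sketch illustrates is that parallelizing the sweep changes only where the evaluations run, not the results, which is what lets desktop code scale to clusters unchanged.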
Satellites and spacecraft in low Earth orbit face collision dangers from more than 500,000 pieces of space debris; untracked debris is especially dangerous because the spacecraft gets no warning to maneuver away from collision zones. Current covariance-driven tracking tactics, however, are vulnerable to orbital variations of space debris clouds, which orbit collectively, because the underlying astrodynamics change constantly under nonlinear celestial disturbances. In Amber Yang's research, the Iterative Closest Point (ICP) algorithm registers the space debris clouds from two successive motion scans as two point clouds for geometric alignment, yielding kinematic patterns used to train an artificial neural network (ANN). The machine-learning backpropagation algorithm performs pattern recognition with the ANN to predict dynamic changes in the ICP kinematic patterns for accurate point-cloud tracking. Yang discusses how MATLAB® and Statistics and Machine Learning Toolbox™ provide a cohesive environment for training and testing innovative applications of artificial intelligence.
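The core geometric step of ICP, finding the rigid transform that best aligns two matched point clouds, has a closed-form SVD (Kabsch) solution. The Python sketch below assumes correspondences are already known (full ICP re-estimates them with nearest-neighbor matching on each iteration) and uses synthetic 2D "scans"; all data here is illustrative, not from Yang's work.

```python
import numpy as np

def best_rigid_transform(P, Q):
    # One ICP update: given matched point clouds P and Q (N x 2), find the
    # rotation R and translation t minimizing ||(P @ R.T + t) - Q|| via the
    # SVD-based Kabsch solution.
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)       # cross-covariance of centered clouds
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cq - R @ cp
    return R, t

# Synthetic "debris cloud" from two successive scans: the second scan is a
# rotated and shifted copy of the first.
rng = np.random.default_rng(1)
P = rng.normal(size=(30, 2))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
Q = P @ R_true.T + np.array([2.0, -1.0])

R, t = best_rigid_transform(P, Q)
print("recovered rotation angle:", np.arctan2(R[1, 0], R[0, 0]))
```

The recovered rotation and translation are exactly the kind of frame-to-frame kinematic pattern that, per the abstract, is fed to the ANN as a training signal.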
Carlos Santacruz-Rosero demonstrates how to solve the pick-and-place problem with a robot manipulator. You can detect and recognize an object with a 3D camera, then perform inverse kinematics and trajectory planning to execute a motion plan for the robot arm.
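For intuition, inverse kinematics has a closed-form solution in the simplest case of a planar two-link arm, and a trajectory is then just a sequence of such solutions along the planned path. This Python sketch is a toy stand-in for the full 3D manipulator workflow in the talk; the link lengths and target point are illustrative assumptions.

```python
import math

def two_link_ik(x, y, l1=1.0, l2=1.0):
    # Closed-form inverse kinematics for a planar 2-link arm: given a target
    # (x, y) for the end effector, return joint angles (one elbow branch).
    # Link lengths l1, l2 are illustrative, not from the talk.
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    q2 = math.acos(c2)
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2

def forward(q1, q2, l1=1.0, l2=1.0):
    # Forward kinematics, used here to verify the IK solution.
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

q1, q2 = two_link_ik(1.2, 0.8)
print("joint angles:", q1, q2, "-> end effector:", forward(q1, q2))
```

Real manipulators have more joints and no single closed form, so toolbox solvers iterate numerically, but the verify-with-forward-kinematics pattern carries over directly.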
Learn how to:
High-speed motor control is a requirement in many industrial, automotive, robotic, and aerospace applications, often requiring implementation in custom hardware. Learn how to use MathWorks products to enable:
For autonomous driving, the vehicle must localize itself on the map very precisely. Additionally, in urban areas with high-rise buildings, autonomous cars face poor GPS signal reception. Using various MATLAB® toolboxes, NIO was able to demonstrate a proof-of-concept algorithm in a short period of time. In his talk, Veera Ganesh Yalla discusses the company's progress in self-localization of a vehicle for autonomous driving.
Computer vision is an enabling technology driving the development of many of today's smart systems, such as self-driving cars, augmented reality, and autonomous robotics. Computer vision applies complex algorithms to images and video to detect, classify, and track objects or events in order to understand a real-world scene. Learn how MATLAB® can simplify the computer vision system design workflow, from algorithm development to implementation on embedded systems.
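The detect step of such a pipeline can be reduced to a minimal example: difference two frames, threshold, and box the changed pixels. This Python sketch with a synthetic frame pair is only a conceptual stand-in for the toolbox-based workflow described above; the threshold, frame size, and object placement are arbitrary.

```python
import numpy as np

def detect_motion(prev, curr, thresh=30):
    # Minimal "detect" stage: difference two grayscale frames, threshold the
    # change, and return the bounding box (x0, y0, x1, y1) of changed pixels.
    diff = np.abs(curr.astype(int) - prev.astype(int)) > thresh
    if not diff.any():
        return None
    ys, xs = np.nonzero(diff)
    return (xs.min(), ys.min(), xs.max(), ys.max())

# Synthetic frames: a bright 10x10 "object" appears at column 20, row 30.
prev = np.zeros((64, 64), dtype=np.uint8)
curr = prev.copy()
curr[30:40, 20:30] = 200

print("detected box:", detect_motion(prev, curr))
```

Classification and tracking then operate on the detected regions frame over frame; production systems replace each stage with far more robust algorithms, but the detect-classify-track structure is the same.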
Model-Based Design provides a path from algorithm to hardware implementation: in this case, a full-custom mixed-signal integrated circuit for Class III implantable medical products. For applications like these, custom integrated circuit die area and power consumption are as critical as circuit performance, because these parameters translate directly into device size and device longevity. The model is first used for design exploration and algorithm validation, then transformed to an RTL description using HDL Coder™, and finally refined to meet power and area constraints by iterating between HDL simulation results and model parameters to reach the lowest possible power design. This presentation shows a typical design flow of modeling, validating, and implementing an algorithm in a custom ASIC, including the steps to take the design into an ASIC simulation environment to validate power and area.
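The refinement loop between model parameters and hardware cost can be illustrated with fixed-point word-length exploration: fewer fractional bits mean smaller, lower-power logic but larger quantization error. This Python sketch searches for the narrowest width meeting an illustrative error budget; the coefficient values and budget are assumptions for illustration, not from the presentation.

```python
def quantize(x, frac_bits):
    # Round a real value to a fixed-point grid with the given number of
    # fractional bits, as happens when refining a model for RTL generation.
    step = 2.0 ** -frac_bits
    return round(x / step) * step

# Design exploration: for each candidate fractional width, measure the
# worst-case quantization error of an (illustrative) filter coefficient set.
# The loop finds the cheapest width that still meets the error budget, the
# same kind of iteration the abstract describes between model parameters
# and HDL simulation results.
coeffs = [0.12345, -0.6789, 0.33333, 0.9]
budget = 1e-3
for frac_bits in range(4, 17):
    err = max(abs(c - quantize(c, frac_bits)) for c in coeffs)
    if err <= budget:
        print("smallest fractional width meeting budget:", frac_bits)
        break
```

In the real flow, the cost side of the trade-off comes from ASIC power and area simulations rather than a simple bit count, but the search structure is the same.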
In recent releases, MathWorks has introduced features that improve the efficiency of generated code by 10%, 20%, and even 50%. Surprisingly, several of these features are not included with Embedded Coder®, but are available in products and capabilities used for algorithm design, data management, and verification. Learn how to use the latest features in the Simulink® product family to produce highly optimized code.