Autonomous technology will touch nearly every part of our lives, changing the products we build and the way we do business. It’s not just in self-driving cars, robots, and drones; it’s in predictive engine maintenance, automated trading, medical image interpretation, and other applications. Autonomy—the ability of a system to learn to operate independently—requires three elements:
Mary Ann Freeman shows you how engineers and scientists are combining these elements, using MATLAB® and Simulink®, to build autonomous technology into their products and services today—to build their autonomous anything.
In the 1960s, the American cartoon The Jetsons was amazingly accurate in predicting our future. Today, we have at our fingertips almost all their inventions, except one: the flying car. However, we are inching our way toward that breakthrough by following one simple and overarching principle: System Design Enablement. In this presentation, you’ll learn how Cadence is solving the challenges of designing and verifying the multitude of electronic disciplines that make up System Design Enablement by harnessing the power of sophisticated EDA solutions, high-performance IP, and strategic partnerships, including MathWorks. Working together, we turn childhood dreams into realities.
Discovering promising new materials is central to our ability to design better batteries, but research over the last several decades has been driven by inefficient guess-and-check searches that have resulted in slow progress. Focusing on solid-state electrolyte materials, Austin Sendek built a data-driven model for predicting material performance by applying machine learning to a small set of 40 experimental data points on crystal structure and ionic conductivity from the literature. He used the resulting model to guide an experimental search for high ionic conductivity electrolyte materials and found that incorporating machine learning into the search leads to several times more discoveries than a comparable guess-and-check effort.
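As an illustrative sketch of the screening idea (in Python with scikit-learn rather than the MATLAB workflow described above, and with synthetic descriptors standing in for the 40 experimental data points), a classifier trained on a small labeled set can rank a large pool of candidates so that experiments focus on the most promising ones:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical structural descriptors for known electrolytes (features),
# with labels 1 = high ionic conductivity, 0 = low. All synthetic.
X_train = rng.normal(size=(40, 5))
w_true = np.array([1.5, -2.0, 0.5, 0.0, 1.0])          # hidden ground truth
y_train = (X_train @ w_true + rng.normal(scale=0.5, size=40) > 0).astype(int)

model = LogisticRegression().fit(X_train, y_train)

# Screen a large pool of candidate materials and keep the best-ranked few,
# replacing guess-and-check with a model-guided shortlist.
X_pool = rng.normal(size=(1000, 5))
scores = model.predict_proba(X_pool)[:, 1]             # P(high conductivity)
top = np.argsort(scores)[::-1][:10]                    # 10 best candidates
```

The shortlist `top` would then be handed to experimentalists, which is the sense in which a model trained on few data points can multiply the discovery rate of a search.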
As the size and variety of your engineering data have grown, so has the capability to access, process, and analyze those (big) engineering data sets in MATLAB®. With the rise of streaming data technologies, the volume and velocity of this data have increased significantly, motivating new approaches to handling data in motion. Jim Stewart discusses the use of MATLAB as a data analytics platform with best-in-class frameworks and infrastructure to express MATLAB-based workflows that enable decision making in real time through the application of machine learning models. He demonstrates how to use MATLAB Production Server™ to deploy these models on streams of data from Apache® Kafka™. The demonstration shows a full workflow, from developing a machine learning model in MATLAB to deploying it against a real-world-sized problem running in the cloud.
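A minimal sketch of the streaming-scoring pattern described here, in plain Python with a simulated message stream and a placeholder threshold model standing in for a deployed MATLAB model and a Kafka consumer:

```python
from collections import deque

def score(window):
    """Placeholder model: flag when the rolling mean exceeds a threshold.
    In the talk's workflow this would be a deployed machine learning model."""
    return sum(window) / len(window) > 0.8

def stream_scorer(messages, window_size=5):
    """Consume a stream of sensor readings and emit one decision per message,
    keeping only a bounded window of recent data (data in motion)."""
    window = deque(maxlen=window_size)
    decisions = []
    for value in messages:
        window.append(value)
        decisions.append(score(window))
    return decisions
```

In a real deployment the `for` loop would be replaced by a Kafka consumer poll loop and `score` by a call into the hosted model; the windowing-and-decide structure is the same.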
Predictive maintenance—the practice of forecasting equipment failures before they occur—is a high priority for many organizations looking to get business value from historical performance data. New technologies such as machine learning and big data show promising results, but they fail to capture nuances that may be obvious to domain experts familiar with these systems.
See how machine learning and big data techniques can be used with traditional model-based techniques to create hybrid approaches for predicting failures. Through examples and case studies, Mehernaz Savai shows you how MATLAB® and Simulink® combine to provide a common platform for building predictive maintenance algorithms.
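One way such a hybrid can work, sketched in Python on synthetic data: residuals between a simplified physics-based model and measurements become the feature a machine learning classifier learns from. The model, data, and failure threshold below are all hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def physics_model(load):
    """Simplified first-principles model of the expected sensor output."""
    return 2.0 * load + 1.0

# Hypothetical history: measured output drifts away from the physics model
# as hidden degradation grows; units failed when degradation was high.
load = rng.uniform(0, 1, 500)
degradation = rng.uniform(0, 1, 500)                    # hidden, not observed
measured = physics_model(load) + degradation * rng.normal(1.0, 0.1, 500)
failed = (degradation > 0.7).astype(int)

# Hybrid feature: the residual (measurement minus model prediction) encodes
# domain knowledge; the classifier learns the failure boundary from data.
residual = (measured - physics_model(load)).reshape(-1, 1)
clf = RandomForestClassifier(random_state=0).fit(residual, failed)
```

The point of the hybrid is that the residual already removes the known operating-condition effects, so the data-driven part only has to model what the physics does not explain.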
Parallel computing enables you to scale applications to bring faster insight from your data. MATLAB® and Simulink® users can leverage the computational power of available hardware to solve and accelerate compute- and data-intensive problems, without the need to be a parallel computing expert. Users can seamlessly develop applications and models on their desktops and scale to GPUs, computer clusters, and clouds. Applications include design optimization, deep learning in computer vision, and neural networks.
Satellites and spacecraft in low Earth orbit face collision dangers from more than 500,000 pieces of space debris, and untracked debris is especially hazardous because spacecraft cannot maneuver away from collision zones in time. Current covariance-driven tracking tactics, however, are vulnerable to orbital variations of space debris clouds, which orbit collectively, because astrodynamics change constantly under nonlinear celestial disturbances. In Amber Yang’s research, the Iterative Closest Point (ICP) algorithm registers the space debris clouds from two successive motion scans as two point clouds for geometric alignment, providing kinematic patterns with which to train an artificial neural network (ANN). The machine-learning backpropagation algorithm trains the ANN to recognize and predict dynamic changes in the ICP kinematic patterns for accurate point-cloud tracking. Yang discusses how MATLAB® and Statistics and Machine Learning Toolbox™ provide a cohesive environment for training and testing innovative applications of artificial intelligence.
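The core of an ICP iteration, nearest-neighbor correspondence followed by a best-fit rigid transform (the Kabsch/SVD solution), can be sketched in a few lines of Python with NumPy. This is a simplification of the full pipeline described above, not the research code:

```python
import numpy as np

def icp_step(source, target):
    """One ICP iteration: match each source point to its nearest target
    point, then solve for the rigid transform (R, t) that best aligns
    source to those matches, via the Kabsch/SVD method."""
    # Brute-force nearest-neighbor correspondence.
    d = np.linalg.norm(source[:, None, :] - target[None, :, :], axis=2)
    matched = target[d.argmin(axis=1)]

    # Centroids and cross-covariance of the matched pairs.
    mu_s, mu_t = source.mean(axis=0), matched.mean(axis=0)
    H = (source - mu_s).T @ (matched - mu_t)

    # Optimal rotation from the SVD of the cross-covariance.
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_t - R @ mu_s
    return R, t
```

Iterating this step until the correspondences stop changing yields the registration; the sequence of recovered (R, t) transforms is the kind of kinematic pattern an ANN could then learn to extrapolate.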
Gain a better understanding of how different MATLAB® data types are stored in memory and how you can program in MATLAB to use memory efficiently. In recent versions, MATLAB introduced several new programming concepts, including new function types. Mike Agostini illustrates and explores the usage and benefits of the various function types under different conditions. You will learn how using the right function type can lead to more robust and maintainable code. Demonstrations show you how to apply these techniques to problems that arise in typical applications.
In order for video-on-demand (VOD) providers to dynamically insert advertisements, users must manually identify ad-break in and out points at the precise video frame boundary. Achieving this precision manually requires a large staff and is time-consuming, so a solution for automating the process is needed. Cyber Resonance Media Detective (mD) software is a highly accurate, ultra-fast automated solution for identifying ad-break points with sub-frame precision. Using Signal Processing Toolbox™, Wavelet Toolbox™, and DSP System Toolbox™ along with a collection of custom-designed algorithms to extract various audio features on a short-term basis, Cyber Resonance Corporation was able to detect and uncover data that would otherwise be hidden within the AC3-decoded audio bitstream. Audio System Toolbox™ provided a streaming solution for capturing audio from the set-top box/DVR directly into MATLAB®. Additionally, Parallel Computing Toolbox™ was used to speed up processing and testing of large datasets. Finally, MATLAB Coder™ was used to convert the company’s tested MATLAB code into C++ code to create the production-ready version of Media Detective.
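The kind of short-term audio feature extraction involved can be sketched in Python with NumPy. Per-frame RMS energy and spectral flux are generic examples of such features, not the company's proprietary algorithms:

```python
import numpy as np

def frame_features(x, frame_len=1024, hop=512):
    """Short-term features over an audio signal: per-frame RMS energy and
    spectral flux. Sharp changes in features like these are the kind of
    cue used to locate ad-break boundaries in an audio stream."""
    frames = [x[i:i + frame_len]
              for i in range(0, len(x) - frame_len + 1, hop)]
    # RMS energy per frame (loudness proxy).
    rms = np.array([np.sqrt(np.mean(f ** 2)) for f in frames])
    # Spectral flux: change in magnitude spectrum between adjacent frames.
    mags = [np.abs(np.fft.rfft(f)) for f in frames]
    flux = np.array([0.0] + [np.linalg.norm(m2 - m1)
                             for m1, m2 in zip(mags, mags[1:])])
    return rms, flux
```

Running a detector over these feature tracks (thresholding, peak picking) is then what turns raw audio into candidate break points.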
The wireless landscape is expanding rapidly, with 5G New Radio (NR), V2X, NB-IoT, and other applications building on top of traditional standards such as LTE and 802.11ac. New challenges arise with the implementation of massive-MIMO systems, hybrid beamforming, new OFDM-based waveforms, LDPC and polar codes, antenna miniaturization, higher carrier frequencies, and wideband signals.
Learn how the latest release of MathWorks wireless products assists you in modeling and simulating existing and upcoming communications systems with realistic waveforms, propagation channels, impairments, and large antenna arrays. Using these wireless products in conjunction with MATLAB® connectivity to SDRs and instruments lets you expedite prototype development.
For autonomous driving, a vehicle must localize itself on the map very precisely. Additionally, in urban areas with high-rise buildings, autonomous cars face problems with poor GPS signal reception. Using various MATLAB® toolboxes, NIO was able to demonstrate a proof-of-concept algorithm in a short period of time. In this talk, Veera Ganesh Yalla discusses the company’s progress in self-localization of a vehicle for autonomous driving.
Computer vision is an enabling technology driving the development of many of today’s smart systems, for example, self-driving cars, augmented reality, and autonomous robotics. Computer vision applies complex algorithms to images and video to detect, classify, and track objects or events in order to understand a real-world scene. Learn how MATLAB® can be used to simplify the computer vision system design workflow from algorithm development to implementation on embedded systems.
Deep learning is transforming a diverse set of engineering and scientific domains including computer vision, video analytics, robotics, autonomous driving, and more. Deep learning can achieve state-of-the-art accuracy for many tasks considered algorithmically unsolvable using traditional machine learning. In this presentation, real-world examples are used to illustrate how deep learning is implemented in a diverse set of applications. Demonstrations illustrate how MATLAB® and NVIDIA® GPUs are enabling these innovations.
Topics covered include advances in AI computing at the edge through the NVIDIA Jetson platform, and the ability to automatically generate high-performance CUDA® code for NVIDIA GPUs from deep learning models in MATLAB.
Model-Based Design is a path from algorithm to hardware implementation, and in this case, a full custom mixed-signal integrated circuit for Class III implantable medical products. For applications like these, custom integrated circuit die area and power consumption are critical for circuit performance as these parameters translate directly to device size and device longevity. The model is used for design exploration and algorithm validation. The model is transformed to an RTL description using HDL Coder™. The model is refined to meet power and area constraints while iterating between HDL simulation results and model parameters to create the lowest possible power design. This presentation shows a typical design flow of modeling, validating, and implementing an algorithm into a custom ASIC. The steps in taking the design into an ASIC simulation environment to validate power and area are also presented.
Don Pakbaz describes the analysis of an external phase-locked loop (PLL) used as a cleanup PLL with a High-Speed Serial (HSS) link, calculating the overall peaking and stability of three chained PLLs. The frequency-domain transfer functions of the PLLs are modeled from hardware data using RF Toolbox™. The PLL transfer functions are then cascaded, and phase margin and gain margin are calculated using Control System Toolbox™. The time-domain step response and jitter of the overall system are also evaluated using MATLAB® and Simulink®.
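The cascade-then-check-margins computation can be sketched in Python with SciPy, using polynomial transfer functions and an illustrative open-loop example; the actual analysis uses RF Toolbox and Control System Toolbox with hardware-derived models:

```python
import numpy as np
from scipy import signal

def cascade(*tfs):
    """Cascade transfer functions given as (num, den) polynomial pairs
    by multiplying numerators and denominators."""
    num, den = np.array([1.0]), np.array([1.0])
    for n, d in tfs:
        num, den = np.polymul(num, n), np.polymul(den, d)
    return num, den

def phase_margin(num, den):
    """Phase margin: 180 degrees plus the open-loop phase at the
    gain-crossover frequency, where |H(jw)| = 1."""
    w = np.logspace(-2, 4, 100_000)
    _, h = signal.freqresp((num, den), w)
    i = np.argmin(np.abs(np.abs(h) - 1.0))   # index nearest gain crossover
    return 180.0 + np.degrees(np.angle(h[i]))
```

For example, cascading an integrator stage 10/s with a pole 1/(0.1s + 1) gives L(s) = 10 / (s(0.1s + 1)), whose phase margin works out to roughly 52 degrees.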
In recent releases, MathWorks has introduced features that improve the efficiency of generated code by 10%, 20%, and even 50%. Surprisingly, several of these features are not included with Embedded Coder®, but are available in products and capabilities used for algorithm design, data management, and verification. Learn how to use the latest features in the Simulink® product family to produce highly optimized code.
Carlos Santacruz-Rosero demonstrates how to solve the pick-and-place problem with a robot manipulator. You can detect and recognize an object with a 3D camera and perform inverse kinematics and trajectory planning to execute a motion plan for the robot arm.
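For the inverse kinematics step, the closed-form solution for a planar two-link arm gives the flavor of what a manipulator solver computes. Sketched in Python, with unit link lengths and the choice of elbow solution as assumptions:

```python
import numpy as np

def ik_2link(x, y, l1=1.0, l2=1.0):
    """Closed-form inverse kinematics for a planar two-link arm:
    joint angles (q1, q2) placing the end effector at (x, y).
    Returns one of the two elbow solutions."""
    c2 = (x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2)
    c2 = np.clip(c2, -1.0, 1.0)              # guard against round-off
    q2 = np.arccos(c2)
    q1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(q2),
                                       l1 + l2 * np.cos(q2))
    return q1, q2

def fk_2link(q1, q2, l1=1.0, l2=1.0):
    """Forward kinematics, used here to verify the IK solution."""
    x = l1 * np.cos(q1) + l2 * np.cos(q1 + q2)
    y = l1 * np.sin(q1) + l2 * np.sin(q1 + q2)
    return x, y
```

A real manipulator has more joints and uses numerical solvers, but the same contract holds: IK maps a desired end-effector pose (from the 3D camera detection) to joint angles, which the trajectory planner then interpolates into a motion plan.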
Learn how to:
Many designers are adopting SoC FPGA devices to integrate processor and FPGA functions on a single device, reducing system power, cost, and board size. The complexity of implementing algorithms on SoCs, however, creates a challenge for algorithm developers, software developers, and hardware designers.
In this presentation, we will demonstrate how to:
In this presentation, Terry Denery demonstrates how mechanical, hydraulic, and electronics design and analysis teams can share their knowledge, help develop high-fidelity component and system models, and improve model accuracy. The example used throughout this presentation is a reciprocating pump, a widely used component in various industries. It makes a good example because it involves interesting mechanics, fluid dynamics, and electronics. Employing the Simscape™ platform, Terry shows that graphical and physically descriptive modeling approaches, like electric circuits, provide access to the broader engineering team. The solution method of Simscape, grounded in the various physical disciplines, offers numerical efficiency, extends model validity beyond the measured data, and can provide insights that attach success and failure mechanisms to the known physical features of the modeled equipment. Delivering this through Simulink® provides an easy path into software development teams employing Model-Based Design. Connecting these design and analysis teams into the embedded software workflow promises a great enhancement in embedded software, from controls applications to the Internet of Things.
MATLAB® and Simulink® products for Model-Based Design and technical computing are the industry-standard tools for designing, implementing, and testing air, space, naval, and land systems. Worldwide, aerospace and defense companies have relied on MathWorks products to help certify software for systems such as the F-35 Joint Strike Fighter and the Mars Exploration Rover. Recently, though, Silicon Valley has seen an increase in new companies creating unmanned drones and self-driving cars. These customers are looking to MathWorks to help make their design and verification workflows more efficient. Through a quadcopter example, you will learn how a Model-Based Design workflow enables our customers to verify and test their autonomous algorithms, ultimately ensuring their designs are safe prior to implementing them on hardware.
This master class covers tools and techniques for efficient simulation workflows in Simulink®. Specifically, Murali Yeddanapudi explores the best techniques for running interactive simulations and for running batch simulations. He demonstrates Simulink features such as: