Proceedings
Featured Presentations
Plenary Sessions
5G and Radar
Rick Gentile, MathWorks
Houman Zarrinkoub, MathWorks
Mike McLernon, MathWorks
Motor and Power Control
Graham Dudgeon, MathWorks
Manuel Fedou, Speedgoat
AI in Engineering
Louvere Walker-Hannon, MathWorks
Rachel Johnson, MathWorks
Rares Curatu, MathWorks
Greg Coppenrath, MathWorks
Algorithm Development and Deployment
Heather Gorr, MathWorks
David Garrison, MathWorks
Sohini Sarkar, MathWorks
Autonomous Systems
FPGA, ASIC, and SoC Design
Steve Kuznicki, MathWorks
Jesson John, MathWorks
Implementation, Verification, and DevOps
Sean de Wolski, MathWorks
Jeff Harper, MathWorks
Wouter van Heijningen, ASML
MathWorks Automotive Conference
Rajkumar Palanisamy, Flux Auto
Shashank Sharma, MathWorks
Alon Davidi, General Motors
Dr.-Ing. Marco Wegener, ZF
Suresh S, Renault Nissan
Shwetha Bhadravathi Patil, MathWorks
Miao Luo, The Qt Company
Modeling and Simulation
Bill Chou, MathWorks
Bernhard Suhm, MathWorks
Max Curzi, Cambridge Consultants
Teaching with MATLAB and Simulink
Magnus Egerstedt, Georgia Institute of Technology
Sebastian Graszk, RWTH Aachen University
Hands-On Workshops
Sarah Mohamed, MathWorks
Shruti Karulkai, MathWorks
Sara Nambi, MathWorks
Sohini Sarkar, MathWorks
Cloud and IoT
Kishen Mahadevan, MathWorks
Igor Alekseev, Amazon Web Services (AWS)
Arvind Hosagrahara, MathWorks
Emma Haley, Leipziger Stadtwerke GmbH
Applying AI to Radar and Lidar Processing
A Low Cost 5G Testbed Development for Futuristic 6G Cellular Standard
Applying AI to Radar and Lidar Processing
Perception is at the heart of autonomous systems and surveillance systems. High resolution sensors such as lidar and radar provide a wealth of data that feed perception algorithms and enable new ways of understanding the surrounding environment.
Radar and lidar engineers leverage deep learning and machine learning to automate processing pipelines and improve their accuracy across a variety of consumer and automotive applications, including target and terrain classification in surveillance systems, object detection and identification in autonomous systems, and AR/VR.
Learn how MATLAB® and Simulink® are used to overcome common challenges including:
- Handling data scarcity for training
- Labeling sparse 3D point clouds and radar signals
- Applying deep learning models designed for images and signals to point clouds and radar returns
- Classifying radar returns including micro-Doppler signatures
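Micro-Doppler classification typically starts from a time-frequency representation of the radar return, which can then be treated like an image by a deep learning model. A minimal sketch, using a synthetic signal and the Signal Processing Toolbox `spectrogram` function (all signal parameters are assumptions for illustration):

```matlab
% Minimal sketch: turn a synthetic radar return into a micro-Doppler
% time-frequency image suitable for an image-based deep learning model.
fs = 1e3;                         % pulse repetition frequency (assumed)
t  = (0:1/fs:2-1/fs).';
% Hypothetical target: body return plus a sinusoidally modulated
% micro-Doppler component (e.g., rotating blades or limb motion)
x  = exp(1j*2*pi*50*t) + 0.5*exp(1j*2*pi*20*sin(2*pi*2*t));

% Time-frequency map (Signal Processing Toolbox)
[s, f, tt] = spectrogram(x, 128, 120, 256, fs, 'centered');
img = rescale(20*log10(abs(s)));  % normalize to [0,1] like an image

% 'img' can now be resized and fed to a CNN, e.g. via trainNetwork
% with an imageInputLayer, exactly as for ordinary image classification.
```

The same pattern extends to labeled datasets of returns, where each spectrogram becomes one training image.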
A Low Cost 5G Testbed Development for Futuristic 6G Cellular Standard
The demand for wireless connectivity has grown exponentially over the last few decades. Fifth generation (5G) communications systems are being tested and will soon be deployed worldwide. We are progressing towards 6G ecosystems that will support more ambitious scenarios and use cases for smart buildings, connected vehicles, satellite communications, digital healthcare, and many more applications. There are several challenges associated with the underlying technologies and architectural changes of these envisioned 6G networks.
In this talk, we present how we used Model-Based Design with MATLAB®, Simulink®, and related toolboxes to develop an end-to-end system design for a 5G test bed that is flexible enough to incorporate 6G capabilities.
Linearization of RF Power Amplifiers – Linking Simulation and Measurements on Physical Devices
Efficiency and easy linearization are the key differentiators for modern RF front ends and PAs, while frequency coverage, gain, and RF performance are defined by the target application. Linearization using digital predistortion (DPD) is a key enabler for operating highly nonlinear PAs close to saturation, where they offer the best efficiency.
Engineers in PA design need to understand the best possible DUT capabilities and performance using optimal DPD, a feature available from Rohde & Schwarz. Once the devices are introduced into real designs, system engineers look for realistic DPD algorithms that can be implemented with affordable effort. MATLAB® simulation, together with built-in features of the latest T&M instrumentation, allows system engineers to derive a model with user-defined complexity. Verifying the model with wideband signals such as 5G proves the DPD model against the best possible behavior. This ensures proper modeling and definition of DPD in real-time applications to achieve maximum linearity and efficiency.
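One common user-defined-complexity DPD model is the memory polynomial, whose coefficients can be fit by least squares from measured PA input/output data. A minimal sketch in plain MATLAB, with a toy PA model standing in for measured data (order, memory depth, and the PA model itself are assumptions):

```matlab
% Minimal sketch of memory-polynomial digital predistortion: fit the
% inverse PA behavior by least squares (indirect learning), assuming
% complex-baseband input u and gain-normalized PA output y.
K = 5; M = 3;                               % nonlinearity order, memory depth (assumed)
N = 2000;
u = (randn(N,1) + 1j*randn(N,1))/sqrt(2);   % stand-in for a wideband stimulus
y = u + 0.1*u.*abs(u).^2;                   % toy PA with 3rd-order distortion

% Build the memory-polynomial basis from the PA *output*
Phi = [];
for m = 0:M-1
    ym = [zeros(m,1); y(1:end-m)];          % delayed output samples
    for k = 1:2:K                           % odd orders only
        Phi = [Phi, ym.*abs(ym).^(k-1)];    %#ok<AGROW>
    end
end
c = Phi \ u;                                % LS fit: postdistorter coefficients

% Applying the same basis to u before the PA yields the predistorter.
```

In practice, y would come from instrument captures, and the order/memory depth would be swept to trade linearization performance against implementation cost.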
5G and Wireless Design with MATLAB
Learn about new capabilities in MATLAB® and Simulink® for wireless communications. Using these features and capabilities, you can accelerate your innovations in three areas of wireless system design:
- Standards-based modeling and simulation
- Unified joint RF-antenna-baseband design
- Deployment and testing of your wireless implementations
Through case studies and reference examples, see how to:
- Perform 5G NR PHY simulation, including uplink and downlink processing
- Generate standards-compliant waveforms for design verification and over-the-air testing with a range of RF instruments
- Develop smart RF technologies, including power amplifier linearization with DPD
- Model massive MIMO antenna arrays and hybrid beamforming architectures
- Visualize antenna sites, communication links, and signal coverage on maps
- Verify 5G system performance using Xilinx® RFSoC and Avnet® RFSoC Development Kits
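Generating a standards-compliant waveform, the starting point for several of the workflows above, can be sketched in a few lines, assuming 5G Toolbox is available (property and field names may vary by release):

```matlab
% Minimal sketch (assumes 5G Toolbox): generate a standards-based
% NR downlink waveform from a default carrier configuration.
cfg = nrDLCarrierConfig;                 % default NR downlink carrier
[wave, info] = nrWaveformGenerator(cfg); % complex baseband IQ samples

% 'wave' can drive link-level simulation or be downloaded to an RF
% instrument for over-the-air testing; 'info' carries grid metadata.
```

The configuration object exposes the numerology, bandwidth parts, and channel settings, so the same two lines scale from a quick smoke test to a fully customized test vector.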
SDR Solutions with NI Hardware and MathWorks Software
In this talk, you’ll hear how NI and MathWorks have been collaborating on software-defined radio (SDR) solutions for over a decade. You’ll discover how to use NI hardware with MATLAB® and Simulink® to perform workflows as simple as over-the-air data capture and as complex as partitioning a hardware/software design for a standard-compliant aircraft tracker. You’ll learn about streaming applications and burst mode use cases, using a vector signal transceiver to parameterize a nonlinear power amplifier, and verifying a deep learning network in a signal intelligence scenario. The talk will conclude with a look at future directions in the NI/MathWorks SDR partnership.
Hardware-in-the-Loop Testing of Control Algorithms for Modular Multi-Level Converters
Multi-level power converters are an enabling technology for equipment that supports a modern power transmission system. Present in flexible alternating current transmission system (FACTS) equipment and high voltage direct current (HVDC) systems, these power converters help ensure stable grid operation and power transmission over long distances.
Learn how to use Simulink®, Simscape Electrical™, and Model-Based Design to develop and test the embedded software that controls a multi-level power converter. You’ll benefit from the presentation whether you’re an engineer who wants to improve your knowledge of simulating power electronics control systems or wants an introduction to code generation from Simulink models.
Through an example of a STATCOM operating in the presence of grid sources, loads, and disturbances, you will see how to:
- Model and simulate the power electronics system
- Perform functional testing with Simulink Test™
- Generate production-ready embedded code and implement it on a TI C2000™ microcontroller
- Validate the embedded software by performing hardware-in-the-loop (HIL) testing of the control algorithms.
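The control algorithms that end up as generated C code on the microcontroller typically reduce to fixed-step discrete regulators. A minimal sketch of a discrete PI loop in plain MATLAB, with hypothetical gains and a stand-in first-order plant (not the STATCOM controller itself):

```matlab
% Minimal sketch (hypothetical gains): a fixed-step discrete PI
% regulator of the kind deployed to a TI C2000 target.
Ts = 100e-6; Kp = 0.8; Ki = 200;     % sample time and gains (assumed)
ref = 1.0; meas = 0; integ = 0;
for k = 1:500
    err   = ref - meas;
    integ = integ + Ki*err*Ts;       % integrator (anti-windup omitted)
    u     = Kp*err + integ;          % controller output
    meas  = meas + Ts*(u - meas);    % stand-in first-order plant
end
```

In the workflow above, this kind of loop lives in a Simulink model, is exercised by Simulink Test, and is then generated to embedded C rather than hand coded.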
Developing Embedded Software for Induction Motor Control Using Model-Based Design
CAF Power & Automation needed to develop a new induction motor control strategy with the main objectives of:
- High efficiency, lower energy consumption motor operation
- Efficient, compact code with less load on the processor
- Certification for safety functions according to EN 50657
- Reduced development time
CAF chose to adopt Model-Based Design using Simulink®, Simscape Electrical™, Embedded Coder®, HDL Coder™, Simulink Real-Time™, and a Speedgoat® real-time target machine.
This presentation shows how CAF implements Model-Based Design. Code generation converts Simulink models into C or HDL production code that integrates into the rest of the software running on the microprocessors, DSPs, or FPGAs. Testing is done first on a fully simulated Simulink environment on a host PC. Then, hardware-in-the-loop testing is performed where the device being tested is the actual controller electronics running the generated code, and a Speedgoat target machine is running the simulated plant model and controlling the execution of the tests. The tests are reused on both the PC and HIL environments, and so are the Simscape Electrical plant models, which are deployed onto an FPGA in the Speedgoat target machine that achieves sample times as low as a microsecond.
Model-Based Design, while helping CAF fulfil all safety requirements, allows for a shorter development cycle because the inherently error-prone activity of hand coding the control algorithms is eliminated. The improved control strategy, already more energy-efficient, has made it possible to execute two separate control algorithms on the same electronics, where previously only one could be executed.
AI for Medical Device Design and Digital Health Applications
Artificial intelligence (AI) is increasingly gaining traction in the medical devices industry, driven by use cases in digital health such as automating analysis of biomedical clinical data, mining digital records to improve healthcare outcomes, personalizing treatments, and developing devices. Discussions on incorporating AI into Software as a Medical Device (SaMD) indicate there is interest in developing digital health-based solutions within medical device and pharmaceutical companies. To be more successful, organizations need to address many challenges, including bridging the gap between medical and data science to develop AI-powered healthcare analytics, improving transparency in models, and complying with regulations and standards in the medical industry.
In this session, learn how MATLAB® and Simulink® can help you develop next generation medical devices and AI-powered digital health applications by:
- Analyzing and exploring large volumes of physiological patient health records (biomedical signals, images, or text)
- Building and optimizing predictive models without coding and AI expertise
- Incorporating MATLAB and Simulink for use in FDA or CE regulated workflows, conforming to standards such as IEC 62304, and leveraging explainability and interpretability techniques
- Deploying models on the edge or integrating models within your cloud applications without recoding
Physics-Informed Machine Learning: Using the Laws of Nature to Improve Generalized Deep Learning Models
Physics-informed machine learning covers several different approaches to infusing the existing knowledge of the world around us with the powerful techniques in machine learning. One area of intense research attention is using deep learning to augment large-scale simulations of complex systems such as the climate. Here, data from satellites is used with simulation data to predict the evolution of these complex systems. While there is a wealth of data and the computational models have achieved remarkable maturity, the tools used in machine learning are often less constrained than the laws that govern physical processes. Non-physical results can be produced by deep learning predictions unless proper constraints are implemented.
Using Deep Learning Toolbox™ in MATLAB® R2020b, new loss functions can be easily implemented and tested on the fly. To demonstrate, this talk discusses a simple case of pendulum dynamics and shows motion prediction using two neural networks: one trained with a traditional loss function and one with a physics-based loss function. The results show that the extra constraints allow the network to predict the motion of the system far more accurately than the conventional approach. While this represents a simple proof of concept, the model features many common aspects of more complex physical systems and provides a fast and informative testing platform.
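The core idea can be sketched as a custom loss that adds a pendulum physics residual to the usual data term, using the automatic differentiation functions in Deep Learning Toolbox (`dlfeval`, `dlgradient`). The network, data, and constants here are placeholders, not the talk's actual model:

```matlab
% Minimal sketch (Deep Learning Toolbox): a physics-informed loss for
% a dlnetwork mapping time t (dlarray) to pendulum angle theta.
function [loss, grads] = physicsLoss(net, t, thetaObs)
    theta = forward(net, t);                 % predicted angle
    dataLoss = mse(theta, thetaObs);         % ordinary data term

    % Physics residual: theta'' + (g/L)*sin(theta) = 0
    dtheta  = dlgradient(sum(theta), t, 'EnableHigherDerivatives', true);
    d2theta = dlgradient(sum(dtheta), t);
    g = 9.81; L = 1;                         % assumed pendulum constants
    physLoss = mean((d2theta + (g/L)*sin(theta)).^2, 'all');

    loss  = dataLoss + physLoss;             % weighting omitted for brevity
    grads = dlgradient(loss, net.Learnables);
end
% Evaluate inside a custom training loop with:
%   [loss, grads] = dlfeval(@physicsLoss, net, t, thetaObs);
```

Dropping the `physLoss` term recovers the conventional network for comparison, which is exactly the experiment the talk describes.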
Predictive Maintenance Using Deep Learning
Predictive maintenance allows equipment operators and manufacturers to assess the condition of machines, diagnose faults, and estimate time to failure. Because machines are increasingly complex and generate large amounts of data, many engineers are exploring deep learning approaches to achieve the best predictive results.
In this talk, you will discover how to use deep learning for:
- Anomaly detection of industrial equipment using vibration data
- Condition monitoring of an air compressor using an audio-based fault classifier
You’ll also see demonstrations of:
- Data Preparation: Generating features using Predictive Maintenance Toolbox™ and extracting features automatically from audio signals using Audio Toolbox™
- Modeling: Training audio and time-series deep learning models using Deep Learning Toolbox™
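Before any deep model, predictive maintenance pipelines usually compute condition-indicator features from raw sensor segments. A minimal sketch with placeholder data, assuming the Signal Processing and Statistics and Machine Learning toolboxes for `rms`, `peak2rms`, `kurtosis`, and `fitcsvm`:

```matlab
% Minimal sketch: condition-indicator features from one vibration
% segment, as a stand-in for the toolbox-based pipelines in the talk.
fs = 10e3;
x  = randn(fs, 1);                % placeholder for 1 s of vibration data
feat = [rms(x), kurtosis(x), peak2rms(x), mean(abs(x))];

% Given a labeled feature table, a baseline classifier is one line:
% mdl = fitcsvm(featureTable, 'FaultLabel');
```

The talk's demonstrations replace these hand-picked features with automatically generated ones (Predictive Maintenance Toolbox, Audio Toolbox) and the baseline classifier with deep networks.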
Artificial Intelligence and Real Time Adaptive Knowledge Delivery Using MATLAB
APTRaise Technologies presents edupme, an interactive knowledge delivery framework. Traditional methods of teaching and learning involve human intelligence and influence to impart knowledge. APTRaise Technologies has developed an adaptive framework that keeps a continuous check on what is not known to the learner and tailors knowledge delivery to each user’s need in real time. Built using MATLAB® and toolboxes for machine learning and AI, edupme is delivered via a mobile app in the front end while working with a cloud-based scalable back end.
The immediate benefits of using MATLAB products are:
- Fast time to market
- Real-time usage along with a mobile app
- Speed of integrating changes in the production code
Our simple aims are “to deliver knowledge to the right person at the right time” and to make the framework content agnostic. The delivery framework is tested on areas of education like commerce, engineering, technology, and logistics. MATLAB products, along with other cutting-edge technologies, help to make the framework as agile and learner-centric as possible.
Deploying AI to Embedded and Enterprise Systems
Deploying AI raises challenges beyond those associated with developing a performant AI model, including:
- Meeting hardware constraints of the deployment environment, such as limited memory and power consumption
- Monitoring and maintaining model performance over the model's lifetime
Learn about expanded capabilities to address the above challenges for both compiler-based and embedded deployment using code generation:
- Quantization: Fixed-point conversion for machine learning models and quantization for deep neural networks allow them to fit on hardware with limited memory and power.
- Incremental learning and model updates: Code generation that separates parameters from prediction code and incremental learning make it possible to improve models continuously.
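The incremental learning workflow can be sketched with the streaming objects in Statistics and Machine Learning Toolbox, where a model is monitored and updated chunk by chunk. The data here is synthetic:

```matlab
% Minimal sketch (Statistics and Machine Learning Toolbox): an
% incremental linear classifier updated on streamed data chunks.
mdl = incrementalClassificationLinear;
for k = 1:10
    X = randn(100, 4);                     % new chunk of streamed features
    y = double(X(:,1) + 0.1*randn(100,1) > 0);  % synthetic binary labels
    mdl = updateMetricsAndFit(mdl, X, y);  % monitor, then train on the chunk
end
```

Because the prediction code is generated separately from the parameters, the updated coefficients can be pushed to a deployed model without regenerating or redeploying the code.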
DevOps provides a framework for managing and governing AI models across their life cycle.
Snow Hazard Index Using Conditional GAN and Semantic Segmentation
Japan recently experienced record heavy snowfall attributed to climate change: thousands of vehicles were stuck on a highway for three days, and a frozen road surface caused a multi-vehicle accident. Road managers are required to provide indicators that alert drivers to snow cover at hazardous locations. At night, the temperature drops and the road surface tends to freeze, so road managers must make decisions on road closures and snow removal work based on nighttime road surface conditions.
This session proposes a custom-loop deep learning application with live image post-processing to automatically calculate a snow hazard indicator. First, we translate the road surface hidden under snow using a generative adversarial network (GAN), pix2pix. Second, we detect the snow-covered and road-surface classes by semantic segmentation using DeepLabv3+ with MobileNet as a backbone. Third, we prepare one-to-one paired images and develop a snowy night-to-day translator, from a night snow image to a fake day output, using a conditional GAN. Using these networks, trained with MATLAB® toolboxes, we can automatically compute the road-to-snow-ratio hazard index, which indicates the amount of snow covering the road surface, even at night.
We demonstrate the results applied to thousands of live snow images from a cold region of Japan. The application has the advantage of using only one live image as input, without mining a before-and-after paired-image dataset. Furthermore, the indicator can be delivered to road managers and users through our pipeline as an automatically computed snow hazard ratio index, with values from 0 to 100 for multi-point comparison. As a result, the fake day-label output for snowy night images closely approximates the real per-pixel labels of snowy day images, with the critical ROI being the snow category for monitoring winter road safety.
How to Use MATLAB with Python
MATLAB® provides flexible, two-way integration with many programming languages, including Python®. In this talk, you’ll learn how to:
- Extend MATLAB with Python libraries
- Collaborate with colleagues who use Python
- Deploy MATLAB algorithms into production software and IT systems
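Calling into Python from MATLAB uses the `py.` prefix directly, assuming a Python interpreter is configured (see `pyenv`). A minimal sketch:

```matlab
% Minimal sketch: calling Python from MATLAB via the py. interface.
seq = py.list({'pad', int32(1), int32(2)});  % build a Python list
seq.append(int32(3));                 % use the Python object's own methods
r = py.math.sqrt(2);                  % call a Python standard-library function
v = double(r);                        % convert the result back to MATLAB
```

The same mechanism extends to third-party Python libraries on the path, and the reverse direction (Python calling MATLAB) goes through MATLAB Engine API for Python.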
Handling, Analysis and Storage of Big Data for Healthcare
Advances in experimental science and technology are extending the limits of the resolution, scale, and throughput of data that can be acquired, creating significant data handling and analysis challenges. We develop automated data processing pipelines in MATLAB® to remove manual steps and parallelize processing. Overcoming these challenges enables us to process large volumes of data from the Rosetta CRUK Grand Challenge Project in order to understand the cancer and tumor microenvironment.
Fighting Fires and Saving Lives with MATLAB
While about 96% of US homes have smoke detectors, only 73% of homes have working smoke detectors—leading to increased fire fatality rates. One primary cause of non-working smoke detectors is people disabling detectors because of too many nuisance alarms. To combat the nuisance alarms, Underwriters Laboratories (UL) created a new standard that requires smoke detector manufacturers to be able to differentiate the sources of smoke—food on the stove versus a burning couch, for example—to make sure that detectors remain on and active to save lives in the event of real fires.
To help smoke detector manufacturers meet this new standard, Analog Devices used MATLAB® to analyze, develop, and test new detection algorithms that greatly reduce the number of false or nuisance alarms by leveraging MATLAB to handle very large data sets, tune algorithms, and generate embedded C code for test automation.
Meeting the Challenges of Design Optimization
Engineers use optimization tools to automate finding the best design parameters while satisfying project requirements and evaluating trade-offs among competing designs. Using these tools results in faster design iterations and allows the evaluation of a larger number of parameters and alternative designs compared with manual approaches.
In this talk, you’ll learn how optimization can be used to meet design challenges for different types of models, from detailed finite element analysis to controls and systems engineering. You’ll learn how to address challenges arising from conflicting design criteria and how to leverage available computing resources. You will also learn about new features that make design studies easier to perform and quicker to complete, including simpler ways to set up and run optimization steps and new solvers designed for computationally expensive black-box models.
Controlling the COVID-19 Epidemic in Italy Using a Network Model
The COVID-19 epidemic hit Italy particularly hard, yielding the implementation of strict national lockdown rules. Previous modeling studies at the national level overlooked the fact that Italy is divided into administrative regions that can independently oversee their own share of the Italian National Health Service. Here, we show that heterogeneity between regions is essential to understand the spread of the epidemic and to design effective strategies to control the disease. We model Italy as a network of regions and parameterize the model of each region on real data spanning almost two months from the initial outbreak.
The model parameterization is a non-convex constrained nonlinear optimization problem requiring the development of an ad-hoc procedure. Specifically, we used a predictor-corrector algorithm developed within Optimization Toolbox™. We identified the time points at which parameter values present significant changes leveraging a gradient descent method based on the MATLAB® fminsearch function. In addition, we enforced soft constraints to ensure continuity of the trajectory between different time windows and to avoid parameters changing too abruptly.
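Soft constraints of the kind described, keeping parameters continuous across time windows without hard bounds, are commonly handled with `fminsearch` by adding a penalty term to the objective. A minimal sketch with a toy objective (the targets and weights are illustrative, not the study's actual formulation):

```matlab
% Minimal sketch: soft constraints via a penalty term in fminsearch,
% in the spirit of the fitting procedure described above.
target  = [1 2];                          % values the parameters should stay near
penalty = @(p) 1e3*sum(max(0, abs(p - target) - 0.5).^2);  % soft band of ±0.5
objective = @(p) sum((p - [1.2 1.8]).^2) + penalty(p);     % data misfit + penalty
pBest = fminsearch(objective, [0 0]);
```

Raising the penalty weight tightens the constraint; because it is soft, the optimizer can still trade a small violation for a much better data fit.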
Furthermore, we confirmed the effectiveness at the regional level of the national lockdown strategy and proposed coordinated regional interventions to prevent future national lockdowns, while avoiding saturation of the regional health systems and mitigating impact on costs.
To cope with parameter uncertainty and possible unmodelled dynamics, we proved numerically the robustness of the designed interventions via a Monte Carlo approach. The parameter realizations were sampled using the Latin hypercube technique, implemented in the lhsdesign function within Statistics and Machine Learning Toolbox™. This allowed us to explore a wide and homogeneous range of parameter variations. To optimize the computational resources, the code has been designed to run on multiple cores with the use of parfor within Parallel Computing Toolbox™.
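The sampling-plus-parallel-evaluation pattern can be sketched as follows, with a toy model standing in for the epidemic simulation (bounds and sample counts are illustrative):

```matlab
% Minimal sketch: Latin hypercube sampling of parameter realizations
% evaluated in parallel (toy model in place of the regional simulation).
nSamples = 200; nParams = 3;
U = lhsdesign(nSamples, nParams);        % samples in the unit hypercube
lb = [0.1 0.01 0.2]; ub = [0.5 0.1 1.0]; % assumed parameter bounds
P = lb + U .* (ub - lb);                 % scale to the parameter ranges

out = zeros(nSamples, 1);
parfor k = 1:nSamples                    % Parallel Computing Toolbox
    out(k) = sum(P(k,:).^2);             % stand-in for one simulation run
end
```

Each `parfor` iteration is independent, so the Monte Carlo runs scale directly with the number of available cores.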
The study and methodology proposed can be easily extended to other levels of granularity to support policy- and decision-makers in Italy and abroad.
Sharing MATLAB Apps and Simulink Simulations as Interactive Web Apps
MATLAB Web App Server™ provides the server infrastructure to deploy and share your MATLAB® apps and Simulink® simulations as interactive web apps. You’ll learn how to complete the workflow from development to deployment in three steps:
- Create apps with App Designer
- Package apps, models, and associated data as web apps using MATLAB Compiler™ or Simulink Compiler™
- Deploy and share apps using MATLAB Web App Server
You’ll also see the authoring tool capabilities of App Designer, and how to use Simulink Compiler to deploy your Simulink simulations.
Application of the Pulp Chemistry Monitor at a Copper Mine in Australia
It has long been observed in the laboratory that pulp chemistry (i.e., pH, pulp potential (Eh), dissolved oxygen, and oxygen demand) varies with changes in mineralogy, reagent additions, and the grinding environment. In the case of the grinding environment, Magotteaux has extensive experience making pulp chemistry measurements within a plant before and after a change in grinding media, and associating these differences with variations in metallurgical performance.
These measurements are collected using handheld laboratory instruments and are collected in short campaigns before and after the change in media. This technique, while valid, has limitations, and the question was asked: can this data be collected on-line and in real time? This led to the development of the pulp chemistry monitor (PCM). However, it quickly became apparent that measuring these parameters is only part of the story.
Measuring is NOT enough! The data must be processed into a form that provides value to the plant through improved reagent utilisation and/or better metallurgical performance. To this end, a PCM was installed on the Jameson Cell feed at Prominent Hill in December 2018. The pulp chemistry data generated were combined with other plant parameters to build algorithms for concentrate grade and recovery control. Subsequent step testing suggested that it would be possible to employ the algorithms to control the Jameson Cell flotation behaviour dynamically. This hypothesis was tested in a short ON/OFF trial. The paper discusses the results of the step testing and of the trial using the algorithms to control Jameson Cell flotation.
Rocket Flight Safety Analysis and Space System Applications Using MATLAB
Commercial space is a rapidly emerging market that was not viable until the miniaturization of satellites. As the space industry moves quickly forward, so too must the generation of software tools that enable it. A key priority in achieving this goal is to focus on capability development as opposed to software development, an area in which MATLAB® excels.
Southern Launch is a launch service provider with the goal of frequently launching orbital rockets from its Whalers Way Orbital Launch Complex in Port Lincoln. To support this endeavour, we have developed Rocket Flight Safety Analysis tools and Space System applications using built-in MATLAB functionality.
The Rocket Flight Safety Analysis tool (named MAGIC) can assess the danger areas, compute the expected number of casualties, and generate exclusion zones for a launch. This is achieved in MATLAB by wrapping around a 6-DoF simulation tool (ASTOS) and conducting a parallelised Monte Carlo analysis by simulating various failure response modes (FRMs) to capture how the rocket could fail and predict where rocket debris would land. The locations for where rocket debris lands can be statistically analysed to calculate the probability of impact, which is input into range safety equations to determine the expected number of casualties resulting from a launch.
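The statistical core of the analysis, turning simulated debris impact points into an expected-casualty figure, can be sketched in plain MATLAB. All numbers below (impact dispersion, population density, lethal area, zone radius) are hypothetical, not MAGIC's actual parameters:

```matlab
% Minimal sketch: Monte Carlo estimate of expected casualties from
% simulated debris impact points (all constants hypothetical).
nRuns   = 1e4;
impacts = 2000*randn(nRuns, 2);             % simulated impact points [m]
popDensity = 1e-6;                          % people per m^2 in the zone (assumed)
lethalArea = 50;                            % casualty area per impact [m^2] (assumed)

inZone = vecnorm(impacts, 2, 2) < 5000;     % impacts inside a populated zone
Ec = mean(inZone) * popDensity * lethalArea; % expected casualties per launch
```

In the real tool, the impact points come from parallelized 6-DoF failure-mode simulations rather than a Gaussian draw, and the casualty expectation feeds the range safety GO/NO-GO criteria.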
Our Space System applications using MATLAB include a range surveillance network and monitoring meteorological conditions at the launch site in support of the GO/NO-GO decision.
Autonomous UAV Development and Evaluation with MATLAB and Simulink
Unmanned aerial vehicle (UAV) usage is continuously increasing in a variety of commercial applications such as agriculture, construction, and delivery. Developing autonomous UAVs requires multidisciplinary skill sets covering various technology domains (controls, perception, motion planning), as well as executing simulations to reduce risk, cost, and rework before flight testing. In this session, learn how you can use MATLAB® and Simulink® in development workflows for autonomous UAV applications that span from design and simulation to deployment and test, including:
- Defining system architectures
- Modeling UAV systems of varying fidelities
- Simulating autonomous UAV applications in cuboid and 3D simulation environments
- Evaluating autonomous algorithms for self-awareness and situational awareness
- Automating verification and validation tasks to ensure system robustness
- Automatically generating code and deploying to UAV hardware such as autopilots and onboard compute boards
- Interactively analyzing post-flight telemetry data with the Flight Log Analyzer app
Smart Maritime Surveillance Systems
With the increase in available computation and the miniaturization of optical sensors, autonomous surveillance systems are now capable of exploiting information gathered by cameras operating in different wave spectra, such as visible and thermal cameras. The use of these systems is particularly fitting for maritime surveillance purposes such as sea patrolling, incoming threat detection, and maritime traffic management.
In this session, you will see how this project aims to realize a system capable of supporting the decision-making process of operators involved in maritime applications. We will show the workflow from the initial design to the first system prototype deployment. We will also cover the acquisition and preprocessing of video flows coming from multispectral sensors, the transfer learning application on YOLOv2 for the detection of vessels, and the integration of the code generated by GPU Coder™ in a Visual Studio project.
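The transfer learning step for the vessel detector can be sketched with Computer Vision Toolbox, assuming a labeled training datastore (e.g., exported from a labeling app); the backbone, feature layer, and anchor boxes below are illustrative assumptions, not the project's actual configuration:

```matlab
% Minimal sketch (Computer Vision Toolbox): transfer learning for a
% YOLO v2 vessel detector. 'trainingData' is a labeled datastore you
% provide; anchors, backbone, and layer name are assumptions.
anchors = [64 64; 128 96];                   % hypothetical anchor boxes
base    = resnet18;                          % pretrained backbone (assumed installed)
lgraph  = yolov2Layers([224 224 3], 1, anchors, base, 'res4b_relu');
opts    = trainingOptions('sgdm', 'MaxEpochs', 20, 'InitialLearnRate', 1e-3);
detector = trainYOLOv2ObjectDetector(trainingData, lgraph, opts);
```

A trained detector like this is then the entry point for GPU Coder, which generates CUDA code for integration into the Visual Studio project.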
The built-in functions provided by MATLAB® toolboxes for deep learning and image processing applications, together with continuous support from our MathWorks friends worldwide, allow us to rapidly test our ideas; with the coder, the resulting algorithms can be translated and implemented in our projects with virtually zero effort. This was fundamental in completing the research project within the tight 12-month deadline and having enough time to manage the characteristics of the environment. The maritime environment, particularly the open sea, represents a challenging scenario for autonomous surveillance systems: cameras mounted on vessels are subject to wave motion, continuous background changes, and an almost total lack of reference points. During the final tests, the system was able to correctly detect and track the selected vessel type after video-flow stabilization, both in the open sea and in near-coast environments.
Smart Factory: Autonomous Industrial Robots from Perception to Motion
Advanced robotics systems are core in the factory of the future. Designing autonomous robotics systems requires knowledge and experience in many engineering domains, including mechanical design, perception, decision making, control design, and embedded systems. In this talk, you’ll hear about a complete autonomous workflow that allows engineers to easily learn and apply the many functional domains of robotics. You will also learn about the key features that enable engineers to develop an end-to-end workflow from perception to motion for industrial robot application designs. Some additional topics you’ll gain insight on include:
- Performing scalable physics simulations
- Designing perception algorithms using computer vision and deep learning
- Setting up co-simulation with sensor and environment models
- Using motion planning for obstacle avoidance
- Achieving advanced control via reinforcement learning
- Connecting hardware through ROS network and deployment
Improving FPGA, ASIC, and SoC Quality with Early Architecture Modeling
Whether you are building a prototype or working with a hardware team, adding hardware and system on a chip (SoC) implementation detail in MATLAB® and Simulink® can help you:
- Partition your design components and test bench for reusability
- Model and simulate SoC architecture to identify and eliminate performance bottlenecks earlier
- Model hardware micro-architecture that addresses common challenges in wireless, DSP, controls, and video/image processing
- Make fixed-point quantization tradeoffs and verify functionality and performance before writing any code
- Verify each stage to eliminate bugs before prototyping or handoff
- Improve handoff to a hardware team by providing verification models
Hardware Agnostic Model-Based Design Workflow for Rapid Prototyping of Image Processing Applications
A prototyping platform with a supporting software toolchain enables efficient development and evaluation, facilitates collaboration, and can serve as a demonstration vehicle. The result is a scalable prototyping platform that caters to a broad range of system engineering needs for seamless modeling of multiple applications in the imaging domain.
A mature hardware-friendly ecosystem of tools and an efficient workflow are essential components to ensure a smooth development cycle. MathWorks toolboxes provide crucial components for simulation, model-based algorithm design, hardware interfacing, and testing; thereby catering to the various stages of the development cycle.
HDL Coder™ and Vision HDL Toolbox™ feature the ability to generate hardware-agnostic HDL code that can be used for deployment to FPGAs and ASIC designs. The HDL Workflow Advisor also streamlines integration of custom board and reference designs to MathWorks tools, thus opening the door for seamless rapid prototyping. The workflow allows the algorithm modeling engineer to stay within the MATLAB® and Simulink® framework for evaluation of custom target hardware.
The well-documented framework and API from MathWorks help designers to configure development boards for rapid prototyping and deployment. The workflow also offers options for scalable solutions downstream and enhances integration experience for a broad variety of applications.
The proof-of-concept and the corresponding workflow utilizing the toolboxes will be presented.
Integrated Workflows for Design, Simulation, Data Visualization, and Analysis of Analog and Mixed-Signal Systems
Continuous Integration with MATLAB and Simulink
MATLAB users are increasingly employing continuous integration (CI) systems to establish a consistent and automated way to build, package, and test their applications. You’ll learn how to run MATLAB® and Simulink® on CI servers like Jenkins™ and Bamboo® and on cloud-based CI services such as Azure® DevOps, CircleCI®, and Travis CI. You’ll also hear about the licensing considerations involved.
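As an illustrative sketch only (pipeline syntax and stage names vary by CI system, and this fragment is an assumption rather than an official template), a CI stage that runs a MATLAB test suite noninteractively might look like:

```yaml
# Hypothetical CI stage: MATLAB is assumed to be installed and licensed
# on the build agent. The -batch flag runs the command noninteractively
# and returns a nonzero exit code on error, which fails the stage.
test:
  script:
    - matlab -batch "results = runtests('IncludeSubfolders', true); assertSuccess(results)"
```

`runtests` and `assertSuccess` are part of the MATLAB unit testing framework; `assertSuccess` errors if any test failed, so the stage fails exactly when the suite does.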
Ensemble Embedded Software Integration Platform
We present a model-based, specification-driven embedded software integration platform (ESiP) we refer to as Ensemble. Its primary objective is to shorten the embedded systems life cycle, accelerating the path from conceptualization through design, development, testing, verification, and transition into production and maintenance. The platform achieves this objective by heavily relying on automated model and code synthesis for portions of embedded software that are not essential to the software's core functionality but are critical to the overall requirements of compatibility and interoperability of embedded applications with modern deployment ecosystems. This partially relieves the burden placed on embedded application developers, allowing them to focus on the application core without being overburdened by the interoperability and portability constraints of their application, thus reducing the overall cost of the development process.
We introduce a new, declarative meta-language called Ensemble iSpec, which allows developers to define systems of systems, data type definitions, and build and deployment specifications. We then present the Ensemble ESiP architecture, which relies on iSpec for development and integration automation. We describe common workflows used in embedded system design automation with the Ensemble platform, and we work through a set of case studies illustrating how the platform helps in embedded system test and evaluation, integration, and deployment. We cover how it applies to the development of complex systems of systems, simultaneously targeting heterogeneous deployment environments consisting of real-time embedded platforms, cloud computing platforms, and the so-called “edge” segments bridging the gaps between clouds and embedded boxes.
Developing Embedded Software with Model-Based Design to Meet Certification Standards
Learn how you can use a reference workflow with Model-Based Design to develop embedded software that meets certification standards such as ISO 26262, DO-178C, IEC 61508, IEC 62304, EN 50128, and others. The workflow spans engineering activities at the system and software levels following a requirements-based process with traceability across all artifacts. Static and dynamic verification and validation activities detect design flaws earlier in the development process. Reports are automatically generated as evidence to comply with standards.
MATLAB and Simulink Integrated in the DO-178C Software Certification Workflow for a New Helicopter
This talk will explain how MATLAB® and Simulink® tools can improve the high-level requirements (HLR) test procedure definition for embedded software certification with respect to the criteria of DO-178C tables A6 and A7.
The main MathWorks capabilities used are Simulink modeling of the requirements (behavior and data), the Test Sequence block for test case creation and execution, and MATLAB scripts for automatically saving test case inputs and expected outputs.
This new method enables the earliest possible validation of test case maturity and coverage through simulation of the HLRs. It shortens the lead time for modifying the model and/or scenario when an HLR changes, and it facilitates the review of HLR models by system engineers.
The objectives linked to integrating the method into the certification process have been met, and the advantages of its application are proven: efficient test model validation (and therefore HLR maturity) and the ability to accommodate more software changes with low validation and verification impact on the host.
What’s New in MATLAB and Simulink for Automated Driving Development
MATLAB®, Simulink®, and RoadRunner help engineers to build automated driving systems with increasing levels of automation. In this session, you will discover new features in R2020b and R2021a that will allow you to:
- Design 3D scenes for driving simulation
- Simulate sensors, scenarios, and vehicle dynamics
- Analyze, calibrate, and label sensor data
- Design detection, localization, sensor fusion, planning, and controls algorithms
- Deploy to C, C++, GPU, and ROS
- Test functionality and code
Designing and Evaluating Sensor Fusion Algorithms for Automated Driving
Learn how to use MATLAB® to design sensor fusion and tracking algorithms. Discover how to simulate different scenario and sensor models using highway lane change as the application. See how to establish quantitative measures and use the tooling in Sensor Fusion and Tracking Toolbox™ to evaluate track quality.
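The session itself uses Sensor Fusion and Tracking Toolbox; as a toolbox-independent sketch of the core estimation idea behind such trackers, here is a minimal one-dimensional constant-velocity Kalman filter in Python (the time step and all noise levels are invented for illustration):

```python
import numpy as np

# One-dimensional constant-velocity Kalman filter: the simplest building
# block behind multi-object trackers. All numbers are illustrative.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (position, velocity)
H = np.array([[1.0, 0.0]])              # we measure position only
Q = 0.001 * np.eye(2)                   # process noise covariance
R = np.array([[0.01]])                  # measurement noise covariance

x = np.array([[0.0], [0.0]])            # initial state estimate
P = np.eye(2)                           # initial estimate covariance

def kalman_step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with measurement z
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Track a target moving at a true 1 m/s from noisy position measurements
rng = np.random.default_rng(0)
for k in range(1, 51):
    z = np.array([[k * dt * 1.0 + rng.normal(0, 0.1)]])
    x, P = kalman_step(x, P, z)

print(float(x[1, 0]))  # estimated velocity, near the true 1.0 m/s
```

Track-quality evaluation in the session builds on this same predict/update cycle, extended to multiple objects, sensors, and association logic.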
Low Velocity Maneuvering Development with the MathWorks Toolchain
The last hundred meters of an autonomous drive entail unique challenges. There is variability in direction, as the vehicle may drive in reverse or perform three-point turns. There are often no road markings, no GPS signal, and no map. This is where a self-driving car is at its most autonomous. The problem statement of the Low Velocity Maneuvering (LVM) team at General Motors is to drive the vehicle in a GPS-denied environment with high accuracy, in a variety of parking scenarios such as street parking, valet parking, and learned parking. In this talk, we review our LVM development cycle, from architecture management and model-based development through requirements and testing coverage, MIL/SIL/HIL simulations, and code generation for multiple platforms.
Developing Realistic Scenes to Enable Virtual Automated Driving Testing
RoadRunner is an interactive editor that lets you design 3D scenes for simulating and testing automated driving systems. The designed scenes can be exported to most driving simulation environments, including MATLAB® and Simulink®. In this session, you will learn how to:
- Design scenes with roads, markings, signs, signals, and other assets
- Export to common simulators and standard file formats like OpenDRIVE®
- Simulate scenes and sensors to test perception and closed-loop controls
Low-Speed Vehicle Motion Control Algorithms for Automated Driving Functions
ZF Friedrichshafen AG has a rich product portfolio with various Advanced Driver Assistance Systems (ADAS). While pursuing the “Full System Supplier” business strategy, ZF uses its many years of experience in model-based engineering and continuously extends the ADAS capabilities to new use cases and a higher degree of autonomy.
In this presentation, a short overview is given of our approach to solving the maneuver planning and vehicle motion control problem for low-speed scenarios. We will show how development frameworks—especially MATLAB® and Simulink®, the corresponding toolboxes, and technical guidance from MathWorks—enabled our engineers to accomplish the complex task of designing highly accurate and robust trajectory planning and vehicle motion control for ZF’s automated driving products.
Developing a Motion Planner for Highway Lane Change Maneuvers
An automated lane change maneuver (LCM) system requires a motion planner to enable safe and smooth lane changes in highway traffic. The LCM system identifies the surrounding traffic environment, including road construction and target vehicles. The motion planner generates a collision-free optimal smooth trajectory. The LCM system executes lateral and longitudinal controls to follow the optimal trajectory. In this session, you will learn how you can use MATLAB® and Simulink® to:
- Design architecture for an LCM system
- Implement behavior layers to configure the motion planner depending on the surrounding traffic environments
- Implement the motion planner to generate an optimal smooth trajectory
- Simulate and evaluate the LCM with a series of driving scenarios
ADAS/AD Virtual Platform for End-to-End Software Development and Testing
ADAS/AD functions integrate complex perception systems, combine sensors of different technologies, and deal with complex traffic scenarios while guaranteeing safety. During software development and testing, a large number of corner cases need to be tested and additional cases have to be discovered; as a result, the final number of kilometers required for certification can grow explosively. To achieve time-to-market and feasible development costs, ADAS/AD requires strong support from simulation. Furthermore, each project has unique requirements and a different scope, and must be implemented across different vehicle platforms with significant hardware and software differences.
However, the perfect tool that gathers all your project requirements does not yet exist. Nonetheless, many high-quality software solutions can simulate sensors, traffic, vehicle dynamics, driver behavior, and realistic environments with the level of detail that every project needs, and a combination of the best solutions currently on the market can meet most project requirements. At Porsche Engineering—more precisely, at the Porsche Engineering ADAS Testing Centre (PEVATeC)—we bring together the best tools, data sources, and experience in automotive and ADAS to create and provide a flexible, modular simulation platform that supports end-to-end ADAS/AD development.
Designing and Deploying Service-Oriented Architectures (SOA) with Simulink
In recent years, the automotive industry has been accelerating its investments in electrification, autonomous driving, connected vehicles, and modern user experience. This trend demands more computing power and innovative electric, electronic, and complex software architectures in cars. The automotive industry is embracing service-oriented architectures (SOA) as a new paradigm to design software applications that are highly reusable, easy to update, and loosely coupled to hardware. SOAs are based on the concept that an application consists of a set of services that are dynamically discovered, published, subscribed, and reconfigured at run time. SOA concepts are used in multiple industry standards, including AUTOSAR, ROS, and DDS.
In this talk, you will learn how to use the Simulink® product family to model, simulate, and deploy application software based on SOA. Highlights include:
- Modeling message-based communication
- Modeling Adaptive AUTOSAR software components
- Generating C++ production code with Adaptive middleware interfaces and exporting AUTOSAR XML
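As a language-agnostic sketch of the publish/subscribe idea at the heart of SOA (the Simulink products above model this natively; the broker class and topic name below are invented for illustration):

```python
from collections import defaultdict

# Minimal publish/subscribe broker illustrating SOA messaging: services
# register interest in a topic and receive messages at run time, without
# compile-time coupling to the publisher.
class Broker:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # A service dynamically discovers a topic by subscribing to it.
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # The publisher is loosely coupled: it never names its consumers.
        for callback in self._subscribers[topic]:
            callback(message)

broker = Broker()
received = []
broker.subscribe("vehicle/speed", received.append)
broker.publish("vehicle/speed", 42.0)
print(received)  # [42.0]
```

This loose coupling is what lets services be discovered, published, subscribed, and reconfigured at run time, as in AUTOSAR, ROS, and DDS.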
The Evolution of E/E Architecture and the Impact on Future Software Development
Explore vehicle E/E architecture trends as well as hardware and software technologies that are designed to meet complex and heterogenous feature demands in the automotive industry. See how safety, security, and over-the-air updates are considered in the AUTOSAR embedded software and how MATLAB® and Simulink® can be integrated by Vector DaVinci. You will also see how continuous integration can speed up the process of software-driven development.
Design the Next-Gen User Experience with Simulink and Qt Design Studio
Integrating a Simulink® model with Qt Design Studio results in functionally complete, accurate simulations of an automotive HMI that can be used to study how the system works as well as how the UI looks and behaves. It also enables a parallel workflow in which a developer builds the simulation model in Simulink while a designer builds the UI in Qt Design Studio, without any overlapping work between them. In this talk, we will walk you through how a designer creates a functional automotive HMI with Qt Design Studio and how the HMI and the Simulink model interact.
High Fidelity Motor Modeling for HIL with FPGAs
Real-time capabilities are drastically expanding the testing that can be done for electric powertrains. Important physical effects such as spatial harmonics and the high-speed switching of wide-bandgap devices can be captured in the same model. Additionally, you can run these models in parallel with virtual vehicles. This allows you to connect the device under test (DUT), the motor controller in this example, with a virtual or real ECU for expansive test coverage. You can also use these same techniques to implement virtual dynamometers if ECU integration is unnecessary. See how the approaches presented are hardware agnostic and can often take advantage of existing lab equipment.
PIL-Target Introduction for Infineon AURIX TC377
The automotive functional safety standard ISO 26262 highly recommends back-to-back testing for ASILs C and D. It notes the importance of testing in a representative target hardware environment and stresses the need to be aware of differences between the test and hardware environments—for example, due to different bit widths of data words and address words of the processors.
Continental developed an AURIX™ PIL-Target with MathWorks consulting services to improve the code quality for brake systems and fulfill ISO 26262 requirements. This presentation highlights how having a PIL-Target was useful to the development teams and enabled automated PIL verification including code execution profiling, code coverage analysis, and testing. The use of PIL-Targets within Jenkins™ CI systems is also highlighted.
Fuel Cell Virtual Vehicle Models for Fuel Economy, Performance, and Thermal Analysis
Learn about recent advances in modeling fuel cell systems for automotive applications. See how Simscape™, Powertrain Blockset™, and other MathWorks products are integrated to assemble a complex, multidomain fuel cell virtual vehicle model that includes:
- A polymer electrolyte membrane (PEM) fuel cell stack with fundamental electrochemical reactions, H2 and air handling systems, and a thermal management system
- An electric powertrain system with a battery, DC/DC converter, and power distribution unit (PDU)
- Supervisory controllers
- Multiple drive-cycle scenarios
Explore how to use this model for fuel-economy study, controller design, thermal analysis, and component selection.
Integrating AI into Model-Based Design
Deep learning and machine learning techniques can solve complex problems that traditional methods can’t adequately model, such as detecting objects in an image or accurately predicting battery state of charge from current and voltage measurements. While these capabilities by themselves are remarkable, the AI model typically represents only a small piece of a larger system. For example, embedded software for self-driving cars may contain adaptive cruise control, lane-keeping control, sensor fusion, lidar logic, and many other components in addition to deep learning-based computer vision. How do you integrate, implement, and test all these different components together while minimizing expensive testing with actual hardware and the vehicle?
In this session, you will learn how to use AI with Model-Based Design to make the complexity of such systems more manageable, use simulation for adequate testing, and deploy to targeted hardware (ECU, CPU, and GPU) using code generation. You will see this approach through a few industry examples.
Accelerating Emergency Ventilator Development
At the start of the COVID-19 pandemic, the UK government asked technology companies to rapidly design and manufacture tens of thousands of emergency ventilators to meet the predicted shortfall in the UK’s ventilator capacity. This would involve compressing a work program that would usually take several years into only a few weeks. This presentation covers how Simulink®, Simscape™, and other MathWorks tools were used to support, enable, and accelerate a number of workstreams such as:
- Maturing the pneumatic architecture design
- Understanding the effect of specific component non-ideal behaviour
- Developing and validating a full suite of clinical alarm algorithms
- Developing and validating a Spontaneous Breathing algorithm
Cambridge Consultants discuss how they were able to accelerate development by implementing a comprehensive system model with Simscape and Stateflow®. Hardware-in-the-loop testing enabled by Simulink Desktop Real-Time™ allowed additional value to be extracted from the system model, as a number of algorithms could be rapidly prototyped in simulation before being deployed to representative hardware. The team explains how they overcame challenges presented by the system architecture and changing requirements to successfully develop and deploy a Spontaneous Breathing algorithm, enabling a number of additional clinical modes in less than two weeks.
Speed Up Automotive, Industrial, and IoT Applications with NXP Model-Based Design Toolbox
Learn how to simulate, test, and program applications for NXP™ processors with MATLAB®, Simulink®, and NXP Model-Based Design Toolbox.
NXP processors enable a wide spectrum of automotive, industrial, and IoT applications, ranging from efficient single-core devices to powerful multicore devices with accelerators. All of them provide impressive computing power within a low power consumption envelope, along with a rich selection of peripherals.
NXP Model-Based Design Toolbox works seamlessly with MATLAB and Simulink to deliver a streamlined way to prototype and develop complex applications, connecting the rich MATLAB and Simulink ecosystem with NXP platform software, drivers, and libraries.
Simulation and code generation capabilities from MATLAB and Simulink allow NXP customers to focus on developing their applications and then easily run them on NXP processors, directly from MATLAB and Simulink.
During this presentation, we will show how using NXP Model-Based Design Toolbox with processors speeds up development of automotive, industrial, and IoT applications such as motor control, battery management systems, process automation, predictive maintenance, and embedded vision.
Allocation Workflows for Architectures and Requirements
Systems engineers manage multiple levels and types of requirements and architectures. For example, they might need to create a functional architecture based on the functional requirements, then allocate those functions to components in a logical architecture based on the system requirements. Managing and analyzing all the relationships between these artifacts can be challenging. System Composer™ and Simulink Requirements™ can make these tasks easier for you. Learn how to create links between requirements and architectures, view links in a compact matrix form, and perform analysis based on your defined criteria.
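As a toy sketch of the compact allocation matrix described above (System Composer and Simulink Requirements provide this view natively; the requirement and component names below are invented), the links can be thought of as a mapping that is easy to pivot and analyze:

```python
# Requirement-to-component allocation links: each requirement is
# allocated to one or more architecture components.
links = {
    "REQ-1 braking distance": ["BrakeController"],
    "REQ-2 lane keeping":     ["SteeringController", "CameraPerception"],
    "REQ-3 diagnostics":      ["BrakeController", "Logger"],
}

components = sorted({c for cs in links.values() for c in cs})

# Render the allocation matrix: rows are requirements, columns components.
matrix = [[c in cs for c in components] for cs in links.values()]
for req, row in zip(links, matrix):
    cells = "  ".join("x" if hit else "." for hit in row)
    print(f"{req:<24} {cells}")

# Impact analysis: which requirements are affected if a component changes?
impacted = [req for req, cs in links.items() if "BrakeController" in cs]
print(impacted)
```

The matrix view makes unallocated requirements (empty rows) and overloaded components (dense columns) visible at a glance, which is the analysis the session demonstrates.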
System Architecture Creation Using System Composer
The requirements-based software development process we follow lacks a standard way of creating requirement specifications. Currently, the specification used by the developer might take the form of a flow chart, textual requirements, or a hand drawing.
A process for analyzing the requirements and creating a system architecture that can be used for software development is therefore necessary.
To address this void, architecture-based software development can be adopted with minimal changes to the conventional process.
System Composer™ can fill this gap: as it is a Simulink® product, the conventional software development process won’t be altered much.
After the presentation, participants will be able to appreciate the advantages of architecture-based software development over requirements-based software development.
The aim is to create a standard process of analyzing the requirements, creating a system level architecture, and using it for software development.
With this process, the developer has access to the entire architecture while implementing the software and can see how the requirements affect the entire system in its functional context. The architecture is easy to understand even for a developer who is new to the topic. Other advantages of using System Composer are requirement traceability and impact analysis of requirement changes on the design, and vice versa.
This process is easiest when the software development is done in Simulink, as the integration between System Composer and Simulink is seamless. Additional scripts might be required if the software is developed in another tool such as TargetLink®.
Integrating External Simulation Components with Simulink
As a simulation integration platform, Simulink® provides multiple and versatile ways to bring in components modeled in third party simulation tools.
In this talk, you will see a demonstration of S-functions and FMI/FMU—the most frequently used ways to integrate a third-party model with Simulink.
You will also hear how Simulink can conveniently bring in your custom C/C++ code, automatically compensate co-simulation signals for better numerical accuracy, and scale up to system-level simulations.
The Digital Drive - FMI Customer Models
Danfoss Drives (DDS) develops variable-speed drives for multiple applications covering a broad spectrum of industry segments. Among customers, there is an increasing demand for system simulation models to benefit from the advantages of Model-Based Design. As simulation tools differ by industrial sector and application, DDS has started providing a standard Digital Drive model for use in various tool environments. The Functional Mockup Interface (FMI) is utilized, as it is the industry standard for sharing models.
DDS is using a Model-Based Design toolchain to generate production code from MATLAB® and Simulink® models. To use the same base asset for customer models, this toolchain is enabled to automatically generate Functional Mockup Unit (FMU) models.
The presentation shows the journey of creating Digital Drive FMI customer models, starting in application development:
- Danfoss Drives and the Model-Based Design toolchain
- The Digital Drive use case
- Generation of FMI models from MATLAB and Simulink model assets
- Digital Drive use in third-party simulation tools
Advancing Engineering Education with Virtual Labs
The COVID-19 pandemic has accelerated ongoing transformations by forcing the adoption of new technologies across industries, including education. One of the most challenging areas in which to adapt existing instructional and development workflows to a new format is the laboratory. In this session, you’ll see the fundamental components of lab work in engineering education and research outlined and mapped to online formats, plus examples of virtual lab activities and workflows using MATLAB® and Simulink®. Get a detailed walkthrough of the Robotarium at Georgia Tech, along with a vision for how virtual labs will enhance current education and research strategies.
Employing Machine Learning to Correlate Fluid Properties
This contribution presents a classroom exercise aimed at first-year science and engineering college students, where a task is set to produce a correlation to predict the normal boiling point of organic compounds from an unabridged data set of >6000 compounds. The exercise, which is fully documented in terms of the problem statement and the solution, guides the students to initially perform a linear correlation of the boiling point data with a plausible relevant variable (the molecular weight) and to further refine it using multivariate linear fitting employing a second descriptor (the acentric factor). Finally, the data are processed through an artificial neural network to eventually provide an engineering-quality correlation. The problem statements, data files for the development of the exercise, and solutions are provided within a MATLAB® environment and are general in nature. A discussion is presented on possible extension of these exercises within the physical sciences.
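As a hedged sketch of the exercise’s first stage, a linear correlation of boiling point against molecular weight, the Python snippet below fits five approximate n-alkane data points by least squares; the real exercise uses MATLAB and an unabridged data set of more than 6000 compounds, so these few points are only illustrative:

```python
import numpy as np

# Linear correlation Tb ≈ a*M + b fitted by least squares, mirroring the
# first stage of the exercise. Data are approximate normal boiling points
# of the first five n-alkanes, a small stand-in for the real data set.
M  = np.array([16.0, 30.1, 44.1, 58.1, 72.2])       # molecular weight, g/mol
Tb = np.array([111.7, 184.6, 231.1, 272.7, 309.2])  # boiling point, K

A = np.vstack([M, np.ones_like(M)]).T
(a, b), *_ = np.linalg.lstsq(A, Tb, rcond=None)

predicted = a * M + b
r2 = 1 - np.sum((Tb - predicted) ** 2) / np.sum((Tb - Tb.mean()) ** 2)
print(f"Tb ~ {a:.2f}*M + {b:.1f}  (R^2 = {r2:.3f})")
```

The residual curvature left by this one-variable fit is what motivates the exercise’s later stages: adding a second descriptor and, finally, a neural network.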
Teaching Electric Power Systems with MATLAB and Simulink
COVID-19, university distancing restrictions, and limited lab space drove the need for virtual teaching of ECE 3033 Electric Power Systems for the Fall 2020 semester. Interactive virtual lecture materials were created using MATLAB® . Four virtual lab experiments were developed using Simulink® and Simscape™ products. These lab experiments included a single-phase transformer, three-phase power system, DC shunt motor, and synchronous generator.
Equivalent circuit models (ECMs) are the foundation of the Electric Power Systems course. Each electric machine’s performance and circuit parameters are related through the ECMs. It was imperative that the virtual lab experiments based on Simulink would also be related to the ECMs taught in the lectures.
The philosophy behind developing the labs was:
- To reproduce what the engineering student would experience during a normal hardware lab
- To compare the results back to equivalent circuit models covered during the lecture
In a hardware lab, the engineering students will first adjust various inputs. Next, they’ll observe how the effect of the input propagates through the system and then record the performance data. The students will repeat this process until the lab is complete. The lab encourages the engineering students to explore—make a change, see what happens—to develop their engineering intuition and furthermore, ask questions.
The students are divided into three-person lab teams. Each team is required to complete the virtual lab and then write a professional report detailing the experiment and associated questions. The student feedback showed that the virtual labs were well accepted and reinforced the concepts from the lectures. The feedback also demonstrated that the labs did not replace the experience of building and troubleshooting a real circuit or electric system. The students felt that instead of replacing a hardware lab, the virtual labs could complement them and provide opportunity for further exploration with no risk to equipment.
Cloud Data Workflows for Scientists and Engineers: What You Should Know
Organizations are generating and collecting more data than ever before, and engineers and scientists need to unlock new discoveries and insights from it. Accessing and managing such large amounts of data is expensive and complex, and is beyond the expertise of most researchers. To overcome these barriers, organizations are moving data to the AWS® cloud and using Databricks to speed up data analysis.
In this session, you will learn how to move data from an on-premises desktop environment to a research production environment on the cloud.
We will review how to:
- Bring data to AWS and use relevant analysis-optimized storage techniques
- Use Databricks, the first and only lakehouse platform in the cloud, to unify your data, analytics, and AI/ML workloads on a simple, open, and collaborative platform
- Use MATLAB® to connect to AWS and Databricks and harness cloud power in the code and toolboxes you trust, right from the desktop
- Share data, algorithms, and models in production at enterprise scale with MATLAB Production Server™ on AWS
Learn from AWS, Databricks, and MathWorks what lies ahead when desktop and local storage limitations no longer constrain innovation.
Exploring Challenges with Artificial Intelligence and Augmented Reality
Artificial intelligence (AI) is used for a variety of applications in various industries. AI can be combined with other technologies to assist with understanding implications of certain aspects of applications. In this workshop, you’ll explore how pose estimation results implemented using deep learning are impacted based on a location provided with augmented reality. You’ll see how these combined technologies provide insight into how poses could be interpreted differently based on a scene. You’ll also consider the consequences of using AI for applications that are different from its originally intended use, which could lead to both technical and ethical challenges.
Mission Planning of a Quadcopter Using a Digital Twin
Simulation is becoming increasingly important in today’s world. Using models of machines and systems, you can run simulations before actual devices are built and deployed. The digital representations used to simulate physical systems are known as digital twins, with each physical entity having its own digital representation. The digital twin has become one of the most promising emerging technologies.
In this hands-on workshop using MathWorks cloud tools, you will use a digital twin of a quadcopter to collect simulation data, analyze collision avoidance controls, and test scenarios to ensure the drone can accomplish its critical mission. The digital twin uses a physics-based modeling approach using Simscape™. You will learn about autonomous navigation with path planning algorithms using the Navigation Toolbox™ and will also see how to predict component failure using the Predictive Maintenance Toolbox™ to successfully replan the mission.
You will walk away from the workshop with a better understanding of these trending technologies that can be used in multiple application areas.
Modeling and Solving Optimization Problems
You can apply mathematical optimization techniques to find optimal designs, estimate parameters, and make optimal decisions for problems in engineering and finance.
In this workshop, you will learn the problem-based approach for modeling and solving optimization problems. You will see how to use an intuitive syntax to represent your problem, starting from its mathematical description. Working through a variety of examples, you will learn how to:
- Define sets of optimization variables using arrays indexed by numbers and strings
- Construct objectives and constraints from expressions of optimization variables using MATLAB® operators and functions
- Include black-box functions
- Use automatic differentiation
- Apply an automatically selected solver
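The workshop uses MATLAB’s problem-based syntax; as an analogous illustration in Python with SciPy (not the workshop’s actual tooling), a tiny linear program can be stated from its mathematical description and handed to a solver like this:

```python
from scipy.optimize import linprog

# A tiny linear program, analogous in spirit to the problem-based
# workflow described above:
#
#   maximize   3x + 2y
#   subject to x + y <= 4,  x <= 2,  x >= 0,  y >= 0
#
# linprog minimizes, so the objective coefficients are negated.
result = linprog(
    c=[-3, -2],                     # objective coefficients (negated)
    A_ub=[[1, 1], [1, 0]],          # inequality constraint matrix
    b_ub=[4, 2],                    # inequality right-hand sides
    bounds=[(0, None), (0, None)],  # x, y >= 0
)
print(result.x)     # optimal point
print(-result.fun)  # optimal objective value
```

The optimum here is x = 2, y = 2 with objective value 10; the appeal of the problem-based approach is that the code mirrors the mathematical statement rather than a solver-specific matrix encoding.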
AI in the Cloud Workflows with MATLAB
Learn how engineers and scientists who work on AI projects decide when to move some or all of their development to the cloud and which cloud platform to choose.
You’ll also learn about MATLAB® cloud options and the advantages they offer across the development stages of a system based on AI.
Through a case study of training, tuning, and deploying a semantic segmentation model, you’ll hear about the following topics:
- Managing data efficiently
- Provisioning MATLAB on AWS®—ready-made for deep learning with NVIDIA® GPUs
- Scaling to GPU cloud compute clusters
- Deploying models for unlimited accessibility
You’ll also learn about the following capabilities:
- Running MATLAB Deep Learning Container on NVIDIA Compute Cloud (NGC)
- Using MATLAB Parallel Server™ on AWS for multi-GPU computing
- Deploying the MATLAB Production Server™ pay as you go (PAYG) environment on Azure®
Digital Twins for Embedded, Edge and Cloud Platforms
Amid the next industrial revolution, many key industry players are compelled to change their conventional processes and practices. To join this major transformation, usually referred to as Industry 4.0, they must pursue extensive R&D efforts in developing cyber-physical systems.
The digital twin concept, as part of an Industry 4.0 strategy, can answer these challenges by integrating and deploying different variants of digital twins (production, product, and performance) of physical assets on various systems such as embedded, edge, and cloud platforms.
This talk will discuss the development of a gas turbine performance digital twin using tools such as Simulink Real-Time™ and Simulink PLC Coder™. The performance digital twin uses real-time, high-speed computing and can be leveraged with various enterprise and IoT cloud platforms. The proposed solutions take the form of a modular software architecture for a range of hardware platforms, with corresponding functionalities to support model-based control strategies and advanced asset health management.
This project explored novel advanced techniques, which can meet the challenging requirements of increased reliability, improved efficiency, and extended operational life of gas turbine assets. The digital twin, based on real-time dynamic engine models, has emerged as the most viable approach for solving challenging control and diagnostics requirements.
The real-time, online digital twin technology can enhance current state-of-the-art offerings, which are predominantly based on non-real-time, offline solutions. The devised solution highlights the next generation of digital twins, which exploit modular functionalities distributed across the whole IoT chain of embedded, edge, and cloud computational platforms. The gas turbine performance digital twin has been deployed on the operational site, and the collected field data have been analyzed and are presented in this study.
Energy Asset Health Monitoring on a Data Science and IIoT Platform
We have to change something! Climate change is one of the biggest challenges of our generation. We need an energy revolution that moves us to renewable energies, and wind energy is one of the most important technologies on the way to that future. Wind farms offer an interesting field of application for data analysis and IIoT platforms: wind turbines generate a lot of data, from which valuable knowledge can be extracted with the help of data analytics. We show how MATLAB® and other technologies can be used to build a technologically sustainable IIoT and data science platform to help us generate the energy of the future.
Scientists and Engineers Save the World
The COVID-19 pandemic has been unlike anything we’ve ever experienced. Scientists and engineers from all over the world pivoted to take on this monumental challenge. From detection, to mitigation, to treatment, they innovated rapidly—while experiencing their own digital transformations—to help us all through this. Many of them rely on MATLAB® and Simulink® as essential tools for their work. This talk will highlight transformative projects and surprising applications throughout this battle, and the role MATLAB and Simulink are playing as leading researchers and engineers work to overcome this challenge and prepare us for a brighter future.
Electrification in the Aerospace Industry
Electrification is being hailed as a pillar of the so-called Fourth Industrial Revolution. Along with digitalization, it’s touted as the harbinger of new efficiencies in supply chains and the platform for new public policy. It will radically change the design of everything from our power grids and roadways to vehicles and urban centers. Electrification offers the opportunity for sustainable growth, reduced carbon emissions, and a fundamental change in the way we power the world around us. The vision of Rolls-Royce Electrical is to be a world-class provider of electrical power and propulsion systems and thus to champion electrification.
Electrification of aircraft bears the promise of more efficient, silent, and sustainable flight, reducing fuel consumption and operating costs for aircraft operators. Power density, reliability, weight, volume, and fault tolerance are of paramount importance for aerospace-intended electrical machines and drives. We tackle these challenges with proven systems integration and systems engineering expertise, together with the professional, experienced flight test and flight operations capability of our team. We have built our long-standing expertise with propulsion systems in the MW class for marine and industrial applications, and we have become a world leader in the development of MW power for hybrid-electric aircraft in the regional aircraft class. Electrification also opens new markets, such as sub-megawatt propulsion for commuter aircraft and urban air mobility, that enable us to grow value beyond today’s core markets and scope of supply. Ongoing developments in More Electric Aircraft technology to eliminate hydraulic, pneumatic, and gearbox-driven aircraft subsystems will contribute to further reductions in fuel consumption and operational costs. Join this session to understand the direction that electrification will take in the aerospace industry.
Building Knowledge in an Interdisciplinary World
Engineering, science, and product development are becoming more interdisciplinary. To be successful in this environment, team members must have broad knowledge across many different fields.
With advancements in artificial intelligence, and with engineering tools like MATLAB® and Simulink® providing more capability, the question becomes: how much do you really need to understand about the underlying engineering and science, versus just being able to run the algorithms?
One person can’t know everything, but it is often helpful to know something. We need some knowledge to be able to perform trade studies to see if a particular technology is worth pursuing, or to communicate with other teams outside of our own expertise, or to understand how these new technologies impact and interface with the products we’re responsible for.
Join Brian Douglas as he talks about what that knowledge looks like and how MATLAB Tech Talks, shipping examples, and documentation can go a long way to providing it.
Transforming the Software Development Paradigm to Meet Unique Needs of Our Industry and Customers
Software-defined features are becoming a key differentiator for automotive, industrial automation, energy, and other industries. Cummins is transforming its software development paradigm in order to meet the unique needs of our industry and customers. Our goal is to increase development speed by utilizing AUTOSAR architecture to leverage a wider supplier base for software and tools while focusing on model-based engineering methods that enable simulation-based product development. To achieve this goal, Cummins launched C-SAR as a key change initiative for our future—growing energy diversity, connectivity, and increasing integration. In this talk, hear about the direction and journey of Cummins thus far.
The Interactions Between Natural and Artificial Intelligence
Modern AI was inspired by brain research more than 60 years ago. Neuroscience has moved forward since, allowing the mapping of neuronal circuit architecture at ever-increasing scale and pace. While this progress itself relies on modern AI to succeed, there is justified hope that connectomes from cognitively capable animals will inform approaches to artificial intelligence that may overcome the energy and label inefficiencies of current AI.
Advancing 5G for a New Decade
We are at the beginning of a decade of 5G wireless technology evolution, with new 5G improvements for mobile broadband experiences and smartphones, and 5G proliferation into new devices, services, spectrum, and deployments. With recent advancements in 5G and AI, we are transforming the intelligent wireless edge, which will unleash new capabilities and efficiencies that weren’t possible previously. We will explore how 5G is enabling a vast variety of industries, including industrial IoT, automotive, and extended reality (XR). For example, 5G will enable the wireless smart factory of the future, providing ultra-reliable and low-latency services required to connect a wide range of industrial IoT services. On the automotive front, 5G will improve our infotainment experience and make our roads and intersections safer. Boundless extended reality (XR) is a new human interface that will make our interactions with the digital and virtual worlds more immersive. These are just a few examples that show how 5G is at the foundation of our increasingly connected world. Join this session to see where we are with 5G today, what’s included in the recently completed 3GPP 5G NR Release 16 specifications, and how we are driving 5G into the future.
What’s New in MATLAB and Simulink R2021a
Learn about new capabilities in MATLAB® and Simulink® to support your research, design, and development workflows. You will see new capabilities for modeling, simulating, and sharing designs, as well as new tools for increasing productivity and authoring better code and models.
Develop a By-Wire Braking System for L4 Autonomy Trucks
System Architectural Design According to Automotive SPICE Using the MathWorks Toolchain
Automotive SPICE is applied in the development of electronic control units of Robert Bosch powertrain systems. Special challenges include the large number of projects and the customer-driven variance in the design, which lead to increased effort in creating system architectural design models based on a SysML/UML approach.
As Simulink® is a well-known tool for software engineering in Robert Bosch powertrain systems, the MathWorks toolchain is used as an alternative. System Composer™ is used to create the functional and logical architectures, including automated generation of models. Simulink Requirements™ is used for allocation and traceability to requirements. The dynamic behavior of the system is described with Simulink and Stateflow®.
The solution has been proven through use in several customer projects, including Automotive SPICE assessments.
DevOps for Software and Systems: Putting Algorithms and Models in Operation
Many organizations using MATLAB® and Simulink® to develop algorithms and models see an increased need to deploy, monitor, and manage them over their lifetime in production. DevOps refers to the set of capabilities needed to operationalize software applications, usually in an IT context. However, it is not straightforward for engineering algorithms and models; Gartner reports that more than 50% of data science projects do not result in business value due to problems with their operationalization. It is even more challenging when they involve physical systems.
DevOps for software and systems is needed by teams responsible for the operational performance of algorithms and models. Those teams often span engineering, software development, IT, and OT (operational technology). Engineers test, deploy, and debug their algorithms and models through the entire lifecycle, including redeploying algorithms after they are in operation. These production systems are typically owned by IT/OT.
In this session, learn how engineering teams are using MATLAB and Simulink product families to operationalize their algorithms and models and to bridge the gap with IT/OT teams.
Accelerating Design, Data Visualization, and Analysis of Analog and Mixed-Signal Systems
Digital, analog, and mixed-signal ICs are a core part of the current wave of innovative technologies that enable connectivity, IoT, medical devices, autonomous systems, and mixed reality. IC development teams are bringing increasingly complex systems to market faster than ever to meet the needs of this wide variety of applications. Design automation workflows that scale and improve various stages of the IC development process, including digital design, analog and mixed-signal design, pre-silicon verification, and post-silicon validation, are critical for the success of these complex engineering tasks. In this presentation, hear from Cadence and MathWorks about the latest advances in the integrated MATLAB® and Virtuoso® ADE workflow for data visualization, analysis, characterization, and verification of advanced AMS designs. You’ll also learn about behavioral modeling and simulation of mixed-signal systems in Simulink® and cosimulation and export of DPI-C System Verilog models for final simulation in Virtuoso.
Introduction to AI
Are you new to machine learning and deep learning? Do you want to learn how to apply these techniques in your work? Machine learning algorithms use computational methods to learn information from data without relying on a predetermined equation as a model. Deep learning is a machine learning method that applies neural networks with many hidden layers. These neural networks learn directly from raw data and can surpass the accuracy of machine learning algorithms in certain applications.
In this hands-on introductory workshop, you will learn how to apply machine learning and deep learning to images and signals. You’ll see how MATLAB® provides an environment to apply advanced techniques without requiring coding or experience in machine learning and deep learning. You will also:
Learn the fundamentals of machine learning and understand terms like “supervised learning”, “feature extraction”, and “feature selection”
Build and evaluate machine learning models based on images and signals
Learn the fundamentals of deep learning and understand terms like “layers”, “networks”, and “hyperparameter tuning”
Build and evaluate deep neural networks for images and signals
Understand the difference between both techniques and their use cases
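To make the first two workshop goals concrete, a supervised-learning model can be built and evaluated in a few lines of MATLAB (Statistics and Machine Learning Toolbox™ assumed; the dataset and model choice below are illustrative examples, not workshop content):

```matlab
% Minimal supervised-learning sketch on a built-in example dataset.
load fisheriris                     % 150 flower samples: measurements + species labels
mdl = fitctree(meas, species);      % train a decision-tree classifier
cvmdl = crossval(mdl, 'KFold', 5);  % 5-fold cross-validation
acc = 1 - kfoldLoss(cvmdl);         % estimated classification accuracy
fprintf('Cross-validated accuracy: %.1f%%\n', 100*acc)
```

The workshop's apps (such as the Classification Learner app) automate exactly this train-and-evaluate loop without requiring any code.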
Holistic Learning by the Integration of a Test Rig into Engineering Teaching
Most students in our study program, “Sustainable Supply of Energy and Resources,” are very passionate about it. Achieving sustainability goals requires highly complex technological developments. However, one of our struggles is to channel the students’ passion into essential foundational areas, such as programming, electronics, modeling, and control theory, that are needed to improve sustainability.
To address this issue, we decided to develop a mini pumped-storage power plant and connect it to problem-solving-based teaching principles. Using the test rig, a range of topics can be directly addressed and integrated into state-of-the-art, hands-on engineering education, from the first year through master’s programs. The theoretical background, connected to student-developed models and system parts, gives students direct feedback on their learned and applied knowledge as they work through their problem-solving approach using MATLAB®.
This approach helps us create a holistic educational concept in which students not only acquire knowledge but also learn to connect disciplines and solve problems, eventually becoming better professional engineers who engineer for a more sustainable future.
In this talk, you will learn, on the one hand, about the setup of the test rig and its integration with MATLAB via the Robot Operating System (ROS). On the other hand, you will learn about its integration and potential in teaching: how students can interact with the test rig and apply their knowledge within the different courses. This brings us toward the creation of a holistic learning culture. Last but not least, we will share our lessons learned.
We, the Institute for Advanced Mining Technologies, work on research, teaching, and transfer for safe, efficient, and responsible raw material extraction through the automation and digitalization of mining machines and processes.
Active Digital Twins at ESA's Control Lab: Enablers for Complex Spacecraft Controls Solutions
Current and future control problems at the European Space Agency (ESA), such as reusable launch systems, high performance satellites, and the next generation of planetary high-precision entry descent and landing systems, require robust stabilization and performance as well as real-time adaptation. The same holds true for the future ESA science, exploration, and robotic missions. These include rendezvous, robotic assembly, refueling and mated flight operations, and spacecraft high-precision pointing and agility. In general, it remains difficult for engineers to acquire and integrate domain-specific models stemming from fluid dynamics, mechanics, thermal, propulsion, and other disciplines.
This talk will show:
- How digital twins enable applications at the ESA through a unified multi-domain toolset for the exploration of novel active controls solutions.
- How using advanced multi-physics modeling and control techniques within a monolithic design framework allows systematic assessment of specifications in time and frequency domains.
- How to cycle rapidly between design options and validation through model-in-the-loop, processor-in-the-loop, and hardware-in-the-loop within in-flight testing environments enabled by automatic code generation.
Model-Driven Production Software Development for Calibration at ASML
ASML develops lithography machines that operate with nanometer precision. Achieving this accuracy requires a large number of calibrations and qualifications, so ASML deploys specific software applications on the machine to maintain performance throughout its entire lifetime.
A challenge is that functional engineers, who are responsible for requirements and design, work in a different domain language than software engineers, who are responsible for implementation. This can lead to misinterpretations and rework.
To deal with this challenge, ASML has successfully ramped up model-driven development of these applications by providing a common language for both groups of engineers, which enables them to work closely together throughout the development process. In this workflow, an executable MATLAB® and Simulink® model is developed and matured gradually. The annotated model acts as the single source of truth for design, documentation, and implementation. The model is tested over a remote connection from Simulink to a real lithography machine, which enables early risk mitigation. Finally, C++ code is generated from the model, eliminating re-implementation.
Model-Based Design for the Next-Generation AURIX Automotive Microcontroller
Current trends in the automotive industry, such as ADAS, xEV, and EE architectures, lead to a demand for high computational power. Fulfilling this demand within the environmental constraints of an automotive system is a challenge. The next AURIX™ generation, TC4x, addresses this with a heterogeneous multicore architecture: a multicore compute cluster with scalar cores is combined with an accelerator capable of processing multiple data elements with a single instruction.
This presentation will demonstrate how such an architecture can be programmed using Model-Based Design. We will focus on the challenge of moving algorithms from a classic scalar architecture to a parallel one. In addition, we will demonstrate a workflow for finding an optimal partition between the different compute clusters on the AURIX SoC. The challenge with parallel architectures is that the market offers many different programming models, most of which are not really compatible with the paradigms of safety-critical software development. We will demonstrate how this can be solved using Embedded Coder® and its customization capabilities. The basis of the partitioning concept is the ability to easily move software from one compute cluster to another, so different partitions of the application can be profiled using the processor-in-the-loop (PIL) methodology. The results can then be used to determine an optimal partition and model it within Simulink®. With the help of SoC Blockset™, it is also possible to simulate different scenarios and further fine-tune the application.
Overall, this presentation demonstrates how the full computational power offered by the next-generation AURIX can be easily utilized.
Accelerating AI and Deep Learning Workflows with MATLAB and NGC
AI has moved beyond research into mission-critical production, solving real-world problems for organizations around the globe. However, developing and deploying AI at scale is hard. The NVIDIA® NGC catalog is a hub for GPU-optimized AI software, such as containers, pretrained models, SDKs, and Helm charts, spanning various industries and use cases. The catalog content allows engineers and scientists to build AI solutions faster, and DevOps and IT teams to streamline development-to-production processes. In this session, we’ll discuss how the NGC catalog helps accelerate AI workflows. We’ll also show how you can use the MATLAB® container available in the NGC catalog to easily build and deploy your AI solution on premises, at the edge, or in the cloud.
Mapping In Vitro Data with Clinical Outcomes in Generic Drug Development Using SimBiology
The pharmacokinetics of inhaled drugs is complex and driven by multiple interdependent variables, such as drug deposition, absorption, and clearance. The present study aims to correlate lab-based experimental data with clinical data on plasma concentrations of the same drug in healthy volunteers. A series of mechanistic compartmental (physiological) models and non-compartmental analyses were built using SimBiology®, with in vitro data as the input to predict the clinical fate of the drug. Fluticasone propionate, a corticosteroid often prescribed in the treatment of asthma and COPD, was used as the molecule of interest. The model included drug-specific physicochemical properties, human pulmonary physiology, and parameters of the delivery mechanism. It captures lab-based in vitro data of delivered fluticasone propionate and the lung physiology parameters of healthy volunteers from the Indian population, and it estimates blood plasma concentration levels of the drug in these volunteers. The model was adequately validated against literature data and verified against a challenge case designed using an in-house developed formulation. The model accuracy is 95% or more for oral inhalation, with a specificity of 80%. The model thus enables early decision making in the planning of clinical trials, screening of multiple formulations, and drug delivery optimization, which can offer a significant upside for organizations seeking first-time-right clinical outcomes.
Deploying Artificial Intelligence on PLCs
Artificial Intelligence (AI) can be leveraged in industrial applications such as predictive maintenance, machine vision for automatic inspection and analysis, virtual sensors, and more.
In this session, see how industrial control systems engineers can use MATLAB® and Simulink® to design and deploy a predictive maintenance AI algorithm on PLCs or other industrial controllers.
You’ll learn how engineers can use MATLAB to quickly try out different approaches and apply their domain expertise to prepare the data and AI models.
You’ll also see how engineers can use Simulink to integrate AI models, automatically generate C/C++ code, and deploy algorithms on industrial controllers such as PLCs. This allows the engineers developing the AI model and control software to remain hardware-platform independent and to focus on the design without having to worry about manual code conversion.
Building AI Applications for Signals and Time-Series Data
AI techniques can be applied to signal and time series data to classify signals, identify events of interest and anomalies, and make intelligent decisions at edge computing nodes. In this session, you will see how to use MATLAB® to build robust real-world applications for communications, digital health, and machine health monitoring. You’ll also learn tips and tricks to speed up data preparation, improve network accuracy and performance, and work with less training data. You’ll see the latest features for:
- Data synthesis through apps, simulation, and GANs
- Automated signal labeling techniques using the Signal Labeler app
- Advanced preprocessing and feature extraction techniques to improve deep networks, including automated feature extraction and single-line deep learning
- Deployment to embedded devices and GPUs
Radar System Engineering for Next Generation Systems
Multifunction radars are increasingly relied on to perform functions beyond detecting, tracking, and classifying objects. The integration of electronically steered phased array front ends enables communications and signal analysis operations from the same system. The resulting system complexity drives the need for increased levels of modeling and simulation.
In this session, you’ll see how you can integrate your modeling and simulation work across the different phases of radar system development, starting with requirements analysis and moving through detailed design, testing, and deployment.
You will learn how to use synthesized radar data to improve your design choices, how to share design models across organizations, and how to use testing data to shorten integration cycles. Using a practical example, you’ll also discover how to:
* Perform link budget analysis and evaluate design trade-offs interactively with the Radar Designer app
* Simulate radars using statistical models and signal-level models to ensure the best balance between fidelity and time
* Model closed-loop, multifunction radars
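For context, the kind of trade-off a link budget captures can be sketched from the classic radar range equation, which the Radar Designer app evaluates interactively (all parameter values below are hypothetical examples, not from the session):

```matlab
% Back-of-envelope maximum-range estimate from the radar range equation:
% Rmax = [Pt * G^2 * lambda^2 * rcs / ((4*pi)^3 * Pmin)]^(1/4)
Pt     = 1e3;        % peak transmit power (W)
G      = 10^(30/10); % antenna gain, 30 dBi converted to linear
lambda = 0.03;       % wavelength (m), roughly X-band (10 GHz)
rcs    = 1;          % target radar cross section (m^2)
Pmin   = 1e-13;      % minimum detectable received power (W)
Rmax   = ((Pt * G^2 * lambda^2 * rcs) / ((4*pi)^3 * Pmin))^(1/4);
fprintf('Maximum detection range: %.1f km\n', Rmax/1e3)
```

Sweeping any one of these parameters while holding the others fixed reproduces, in miniature, the design trade-offs the app visualizes.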
Handling Analysis and Control Development of Commercial Trucks with Volvo Transport Models
Volvo performs complex handling analyses for different truck combinations using Simscape Multibody™ (formerly SimMechanics™). A library of truck subsystems and complete vehicles has been developed by Volvo engineers. These libraries enable easy assembly of truck models and fast simulation of dynamics for small and large trucks. Thanks to its fast performance and animation features, this model library, now named Volvo Transport Models (VTM), has been adopted by multiple teams at Volvo. VTM is also used in the early phases of truck control development.
Transition to ISO 26262 Compliant Development Through Gap Analysis
ISO 26262 has been around for over ten years, is now viewed by the industry as a “best practice,” and may in the future become a regulatory requirement. Because of the lag in adoption due to automotive product life cycles, engineering teams may not have gained the practical experience needed to understand the expectations and work product deliverables required by such a project. Succeeding in an ISO 26262 project requires both formal training and practice in the art. It is also necessary to know the proper time in a project to get outside help for the experience and guidance needed to succeed. An ISO 26262 gap analysis is a good way to receive the feedback needed to direct resources toward deliverables that will meet the expectations of the assessors.
Rapid Prototyping of Medical Image Analytics Used in Clinical Decision Support Systems
Medical imaging and analysis facilitate better diagnostics, patient management, and medical research. Rapid advancements in medical imaging technology and healthcare have increased data volume and complexity and influenced decision-making. Modern personalized healthcare integrates multimodal data to achieve improved clinical decision support and condition management. Research data analytics involves identifying patterns, connections, and relationships in the data to gain insights from quantitative and qualitative sources. MATLAB® enables data analysis and rapid prototyping of decision support systems through its image and signal processing, optimization, and AI modeling and deployment capabilities. In this talk, learn about the challenges of healthcare and how tools like decision support systems help ease the pain points by discussing:
- Transformation in healthcare models
- Pain points in healthcare
- Opportunities and challenges in ML-based radiology
- Examples of pre-clinical and clinical projects
Continuous Modeling with MATLAB and Microsoft Azure DevOps
Is the concept of continuous integration familiar to you? Have you tried, or are you planning, to apply these practices to Model-Based Design? We want to show you how you can succeed at that tough task and present to you our approach to continuous modeling.
The complexity and challenges of any project that involves code are growing, and this also applies to Model-Based Design, which is now used in almost every industry. The development and maintenance of models is no longer a single-player problem. Instead, development teams, often spread across different sites and even countries, are connected remotely (especially during the COVID-19 pandemic) and work on different parts of larger models.
Agile methodologies are here to stay, and adopting them from scratch is an overhead that development teams and companies must overcome. The adoption process for some of the best-known agile methodologies is often underestimated: the adopting teams must be appropriately coached, new roles must be defined within the organization depending on the methodology, and so on. Choosing the right tool, in our case Microsoft Azure® DevOps, has been crucial for success. All of this adds to the challenge of efficient model-based development.
Do the following points sound familiar to you?
- People sharing parts of models via USB flash drive, emails, or shared network folders
- “Big bang” integration when models are put together
- Models developed without any version control system, e.g., Git
- Unmaintainable and unversioned build scripts
- Lack of continuous integration strategy for models
- Lack of release and defect management for models
We want to show you our approach to continuous modeling, which solves all these challenges in an elegant way. To this end, we have defined a modeling framework on top of Microsoft Azure DevOps. This framework allows us to create, maintain, and extend our existing models faster and more easily than before.
Authentic Engineering Assessment: From Quizzes to High-Stakes Examination
This talk highlights developments in two engineering courses, with a goal to deeply integrate “authenticity” into all elements of course design and assessment. Achieving this goal required fundamental changes to the assessment structure and modes of delivery, most prominently the adoption of MATLAB® to augment theoretical concepts.
Despite the abundance of software in professional practice, learning and teaching applications of engineering software are not deeply integrated into curricula. As a discipline, we have struggled to reconcile the need to teach algorithmic thinking and software against a perception that it lowers assessment integrity (and detracts from learning fundamental engineering skills). An unmistakable example of this conflict is the high-stakes, invigilated, pen-and-paper examination, which attempts (and often fails) to isolate a student’s brain from modern conveniences in the interest of “integrity.”
I have redeveloped three courses to have MATLAB live scripts at their core, enabling all activities to mirror professional practice yet remain scaffolded. Rather than being a superficial addition to coursework, software is embedded into all classes, quizzes, projects, and exams.
For every theoretical calculation taught in class using “traditional” hand calculations, a digital counterpart was developed with MATLAB live scripts. These scripts allow for native integration of theory, widgets, simulations, and graphical output. The digital counterparts follow a common three-step framework: replicate the class example, generalize the calculation, then extrapolate to a class of similar problems. Students develop algorithmic problem-solving skills that allow them to decouple conceptual understanding from the underpinning mathematics. Class projects are designed to encourage students to extend the codebase provided to them. Open-web final exams then complete the feedback loop, leveraging the code library and skill sets students have developed over the course.
Integrated Classroom Teaching of Control Using MATLAB and Simulink
Process control, as a university course, is viewed by learners as a challenging subject, primarily due to its inevitably rich mathematical nature. Furthermore, it involves concepts and methods (e.g., algebraic and frequency-domain analysis of dynamical systems) that are unique and novel in the undergraduate engineering curriculum. An added challenge is the increasing discomfort with mathematics-intensive courses. On the other hand, familiarity with tools for control system design based on MATLAB® and Simulink® is considered a valuable skill set in the process industry. The prime objective of this session is to present a consolidated solution to these issues using MATLAB and Simulink. The software environment serves as an excellent platform for conducting virtual experiments that introduce a few core results in control and the concepts of open-loop and feedback control in a non-mathematical way. Furthermore, the same platform is shown to be useful in realizing and visualizing the mathematical concepts, using Control System Toolbox™. Finally, these tools serve to seamlessly integrate modern computational tools with traditional teaching methodology. This approach has been tested in the undergraduate control course taught at IIT Madras over the last three years. The results have been highly encouraging: a curiosity-evoking introduction and an enhanced appreciation of control theory. These successes will be shared along with the challenges encountered in these efforts. This session is particularly useful to academicians and teachers of introductory control courses in engineering programs.
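A minimal sketch of the kind of virtual experiment described above, with Python standing in for MATLAB/Simulink: open-loop versus proportional feedback control of a first-order plant. The plant, gains, and time step are illustrative assumptions, not from the course:

```python
# First-order plant: tau * dy/dt = -y + u, with tau = 2 s, setpoint r = 1.
tau, dt, steps, r = 2.0, 0.01, 1000, 1.0

def simulate(controller):
    y, history = 0.0, []
    for _ in range(steps):
        u = controller(r, y)
        y += dt * (-y + u) / tau       # forward-Euler integration
        history.append(y)
    return history

open_loop = simulate(lambda r, y: r)               # u = r, no feedback
closed_loop = simulate(lambda r, y: 10 * (r - y))  # P control, Kp = 10

# Feedback responds much faster, but P control alone leaves a steady-state
# error of r/(1 + Kp): a classic talking point for a first control lecture.
```

Plotting both histories side by side makes the open-loop/feedback comparison concrete before any transfer-function mathematics is introduced.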
Energy Asset Health Monitoring on a Data Science and IIoT Platform
We have to change something! Climate change is one of the biggest challenges of our generation. We need an energy revolution: a move to renewable energies. Wind energy is one of the most important technologies for the future, and wind farms offer an interesting field of application for data analysis and IIoT platforms. Wind turbines generate a lot of data, from which valuable knowledge can be extracted with the help of data analytics. We show how MATLAB® and other technologies can be used to build a technologically sustainable IIoT and data science platform to help us generate the energy of the future.
eCAL Toolbox for Simulink
Highly complex electronic control units (ECUs) with increased communication needs in today’s vehicles pose a challenge during development and prototyping. Different types of ECUs based on different development platforms must interact and exchange large amounts of data in real time.
This situation is particularly relevant for ADAS and autonomous systems, where a flexible, lightweight, and performant means of communication is necessary. Continental R&D developed a dedicated middleware solution to run AD software components on a wide range of hardware platforms, which can be implemented in different computing languages.
eCAL (enhanced Communication Abstraction Layer) enables scalable, high-performance interprocess communication in heterogeneous networks. It is based on a publish/subscribe pattern, designed for minimal latency and high data throughput, and leverages UDP or shared memory as the transport layer for best performance. Thanks to its lightweight C/C++ API, eCAL has been integrated into Simulink® as an open-source toolbox for simulation on the desktop and for prototyping on Speedgoat® real-time target computers, providing a flexible and highly performant multi-node communication layer.
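The publish/subscribe pattern eCAL is built on can be illustrated with a toy in-process bus. This sketch shows only the pattern; it does not use the real eCAL API, and topic names and payloads are invented:

```python
from collections import defaultdict

class Bus:
    """Toy in-process pub/sub bus illustrating the pattern."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, payload):
        # eCAL would move the payload over shared memory or UDP between
        # processes; here we simply invoke every registered callback.
        for callback in self._subscribers[topic]:
            callback(payload)

bus = Bus()
received = []
bus.subscribe("vehicle/speed", received.append)
bus.publish("vehicle/speed", 13.9)   # m/s
```

The decoupling is the point: publishers never know who (or how many) subscribers exist, which is what makes the multi-node AD setups described above scale.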
Hydrogen Is the New Diesel: Electrifying Heavy-Duty Vehicles with Nuvera Fuel Cells
Shipping ports around the world are concentrated sources of greenhouse gases, and diesel-powered port equipment is responsible for producing pollutants that severely impact air quality and health in surrounding areas. Hydrogen fuel cells have emerged as a smart, practical solution enabling electrification of many types of industrial vehicles and machines.
In a fuel cell, hydrogen and oxygen are combined electrochemically to produce water, liberating electrons that are captured to power an electrical device. Optimizing this process for everyday commercial use requires advanced control algorithms. Nuvera uses Model-Based Design with Simulink® to simulate and optimize the control algorithms’ behavior against a model of the fuel cell engine. Nuvera additionally uses hardware-in-the-loop testing to add rigor to verification of the embedded computer’s performance and to thoroughly test its capabilities.
Nuvera uses Simulink to estimate and control the state of charge of the lithium-ion batteries used in conjunction with the fuel cell power system. Batteries are used to accept power regenerated from braking in a fuel cell electric vehicle, or from lowering the load of a forklift or container handler. These capabilities are vital for creating high-performance heavy-duty vehicles that can compete with their diesel engine counterparts, but without emissions at the point of use, thus creating cleaner port operations.
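As a hedged sketch of the state-of-charge estimation mentioned above, here is coulomb counting, one of the simplest SoC approaches (Nuvera's actual Simulink implementation is not shown in the talk, and the numbers are invented):

```python
def update_soc(soc, current_a, dt_s, capacity_ah):
    """Integrate current to update SoC; negative current = regeneration."""
    soc -= current_a * dt_s / (capacity_ah * 3600.0)
    return min(max(soc, 0.0), 1.0)   # clamp to physical limits

soc = 0.50                                  # start at 50% charge
soc = update_soc(soc, 100.0, 60.0, 50.0)    # 100 A discharge for 60 s
soc = update_soc(soc, -40.0, 60.0, 50.0)    # 40 A regen from braking
```

Regenerated braking energy flows back in as negative current, which is exactly the forklift/container-handler scenario described in the abstract; production estimators typically fuse this integration with a voltage-based correction.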
Simulation Framework for Highly Autonomous Trucks in a Logistics Centre
Vehicles are becoming increasingly connected and autonomous. In the case of connected and autonomous mobility, vehicles may use any number of communication technologies to communicate with each other (vehicle-to-vehicle or V2V) and with the infrastructure (vehicle-to-infrastructure or V2I). One exemplary use case is the simulation of multiple highly autonomous trucks in a logistics center. There are many such applications that lie between the extremes of a single-ego vehicle use case and global vehicular traffic applications.
In this talk, we demonstrate an approach to develop a modular and coherent simulation framework to simulate multiple autonomous trucks maneuvering in a logistics center using MATLAB® and Simulink®. The framework can be used for function development for this as well as other multi-ego simulation use cases. The feasibility of the framework is determined by a proof of concept (POC) for the logistics use case. The POC leverages the Simulink-based automated parking valet example found in Automated Driving Toolbox™ and focuses on the planning, vehicle simulation, multi-agent communication, and visualization aspects of the simulation.
The POC addressed the challenges of a complex simulation framework in which multiple aspects work together seamlessly. Another challenge the POC addresses is integrating realistic vehicle dynamics and real vehicle dimensions into the model. The POC resulted in a successful multi-agent simulation of two trucks following the planned path from the start pose to the goal pose. Throughout the POC, the MathWorks reference example and documentation were helpful. MathWorks also provided technical advisory services for the project and participated in regular meetings and information exchanges.
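The multi-agent structure of such a simulation can be sketched minimally: each truck agent advances along its planned path within a shared simulation loop. The kinematics here are deliberately trivial and the poses invented; the actual POC uses Automated Driving Toolbox vehicle models:

```python
def step_towards(pos, goal, speed, dt):
    """Move pos toward goal by at most speed*dt (point-mass kinematics)."""
    dx, dy = goal[0] - pos[0], goal[1] - pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= speed * dt:
        return goal
    return (pos[0] + speed * dt * dx / dist, pos[1] + speed * dt * dy / dist)

# Two truck agents with start poses and goal poses (hypothetical values).
trucks = {"truck_a": (0.0, 0.0), "truck_b": (50.0, 0.0)}
goals = {"truck_a": (0.0, 30.0), "truck_b": (50.0, 30.0)}

for _ in range(100):                 # 100 steps of 0.1 s at 5 m/s
    for name in trucks:
        trucks[name] = step_towards(trucks[name], goals[name], 5.0, 0.1)
```

The one shared loop per time step is what makes this a multi-ego simulation: every agent sees the same simulated clock, which is the coordination point where collision checking and inter-agent communication would hook in.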
Target Detection and Classification in Radar Point Cloud with MathWorks
In autonomous driving functions, an accurate perception of the vehicle environment is a crucial prerequisite. Our goal was to benchmark ADAS sensor performance, particularly RADAR with different radio frequencies (RF) and fields of view (FoV). Subject to weather conditions, RADAR performs better than other sensors (camera and ultrasonic) in detecting the target and its attributes (range, velocity, and dimensions). RF signals are transmitted and received by the RADARs mounted on the vehicle frames (front, rear, and side) to enable synchronization and data coherence across all antennas. These signals are modulated in the frequency and time domains. The returns are received as point clouds, which are used for target detection and classification. The point clouds are grouped using a clustering technique, and the attributes of all points are obtained to develop classification models. We use the Classification Learner app to choose the best classification models for ADAS-specific use cases. To achieve the best confidence factor for the primary target classes, such as cars, trucks, motorcycles, bicycles, and pedestrians, we train our classification models (algorithms) based on real-time field data to establish ground truth (GT) information. Our goal is an optimized confusion matrix that minimizes the false positive rate (FPR) and improves classification accuracy across different infrastructures, RADAR mounting patterns, and weather conditions. It also enables us to benchmark RADAR performance and determine the ADAS sensor configuration best suited to Renault-Nissan vehicle lines.
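The cluster-then-classify pipeline above can be sketched in simplified form: group point cloud returns by proximity, then extract per-cluster attributes to feed a classifier. The greedy single-linkage clustering, the distance threshold, and the feature set are illustrative assumptions, not the talk's actual method:

```python
import math

def cluster_points(points, eps=1.5):
    """Greedy single-linkage clustering: points closer than eps share a cluster."""
    clusters = []
    for p in points:
        merged = None
        for c in clusters:
            if c and any(math.dist(p, q) < eps for q in c):
                if merged is None:
                    c.append(p); merged = c
                else:                     # p links two clusters: merge them
                    merged.extend(c); c.clear()
        if merged is None:
            clusters.append([p])
    return [c for c in clusters if c]

def cluster_features(cluster):
    """Extent along x/y plus point count: simple classifier inputs."""
    xs, ys = zip(*cluster)
    return (max(xs) - min(xs), max(ys) - min(ys), len(cluster))

points = [(0, 0), (0.5, 0.2), (1.0, 0.1), (10.0, 5.0), (10.4, 5.3)]
features = [cluster_features(c) for c in cluster_points(points)]
```

In the workflow described above, feature vectors like these (with Doppler and RCS attributes added) would be labeled against ground truth and fed to the Classification Learner app.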
Development of Novel Sensor Fusion Architectures for Autonomous Vehicles
Flux Auto focuses on developing self-driving technology for trucks and cars to make commuting and transport more viable. Sensor fusion plays an important role in providing complete situational awareness for the vehicle, even in harsh weather conditions such as snow, rain, and fog, where different sensors perform well in different conditions.
Fusing the data from multiple sensors improves 3D perception and determines the position, velocity, and orientation of other participants in the scenario. Different sensor fusion architectures, such as detection-level fusion and track-level fusion from Sensor Fusion and Tracking Toolbox™, Automated Driving Toolbox™, and MATLAB Coder™, were used to develop a novel architecture for multilevel fusion, which was evaluated using the OSPA and GOSPA metrics.
In this talk, we will cover the following points:
- JPDA-based 3D multi-object tracking
- Density-based spatial clustering for detection-level fusion using GNN Tracker
- Implementation of ToMHT, GM-PHD, and JPDA for track-level fusion
- Effective filters and multi-object trackers for use with the sensors in the autonomous stack
- C++ code generation for the sensor fusion architecture designed in MATLAB® for real-time deployment and testing
- Multi-sensor data association for real-time and synthetic data
- Multi-sensor calibration in real time and concatenation of units
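One building block behind the detection-level fusion listed above is global nearest neighbor (GNN) data association. A toy sketch follows; brute force over permutations is fine for this tiny example, whereas real trackers use the Hungarian algorithm, and the track/detection positions are invented:

```python
import itertools, math

def gnn_associate(tracks, detections):
    """Return the detection index assigned to each track,
    minimizing the total Euclidean distance (globally optimal)."""
    best_cost, best_assignment = math.inf, None
    for perm in itertools.permutations(range(len(detections)), len(tracks)):
        cost = sum(math.dist(tracks[i], detections[j])
                   for i, j in enumerate(perm))
        if cost < best_cost:
            best_cost, best_assignment = cost, perm
    return best_assignment

tracks = [(0.0, 0.0), (10.0, 10.0)]       # predicted track positions
detections = [(9.8, 10.1), (0.3, -0.2)]   # new sensor detections
assignment = gnn_associate(tracks, detections)
```

"Global" is the key property: the assignment minimizes the total cost over all tracks jointly, unlike greedy per-track matching, which can lock in a bad pairing when detections are close together.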
Developing a Driver Monitoring System Using Model-Based Design
Model-Based Design helps manage the complete system design, implementation, test, validation, and certification steps for system and software development. At Elektrobit, a proof-of-concept AUTOSAR Adaptive driver monitoring system was developed using a model-based approach. The driver monitoring system consisted of a camera system, computer vision algorithms, and a simple HMI to show the driver state. An AUTOSAR Adaptive architecture-based ECU was designed to communicate information over an Ethernet interface. This presentation highlights the ease of developing such an application using MathWorks tools and shows results through a live demonstration in which the driver’s presence and state are successfully detected by the system.
Electric Machine Calibration Using Model-Based Calibration
Accurate calibration of electric machines is critical to meeting the aggressive power and efficiency requirements of automotive traction applications. Model-Based Calibration Toolbox™ allows a single toolchain to be used for electric motor calibration: defining the DOE, fitting test results to a model, optimizing for the required system constraints, and finally building the calibration.
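The fit-then-optimize step of that flow can be sketched as follows. The data, the torque-vs-advance-angle setup, and the quadratic model form are illustrative assumptions, not the toolbox's actual workflow:

```python
# DOE: measured torque (Nm) at three advance angles (deg), hypothetical data.
doe = [(20.0, 94.0), (30.0, 98.0), (40.0, 96.0)]

# Fit y = a*x^2 + b*x + c exactly through the three points using
# divided differences (a least-squares fit would be used with more points).
(x0, y0), (x1, y1), (x2, y2) = doe
a = ((y2 - y0) / (x2 - x0) - (y1 - y0) / (x1 - x0)) / (x2 - x1)
b = (y1 - y0) / (x1 - x0) - a * (x0 + x1)

# Optimize: the vertex of the fitted parabola gives the calibration setpoint.
optimal_angle = -b / (2 * a)
```

In practice the toolbox fits multi-dimensional response surfaces over speed and torque and optimizes subject to current and thermal constraints, but the DOE-fit-optimize-build sequence is the same.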
Development of an Enhanced Heavy Duty Truck Autonomous Driving Environment
In this presentation, we introduce an autonomous driving simulation framework for heavy-duty trucks based on MathWorks and third-party technologies. The goal is to assess how existing engine/vehicle control features perform when integrated with autonomous driving, and to explore innovative solutions for new scenarios. System performance under different driving conditions, such as stochastic highway traffic and highway merging scenarios, is evaluated. This framework also enables engineers to simulate real-world highway routes by capturing details such as road curvature and grades. With this simulation platform, different vehicle, powertrain, ADAS, and autonomous driving controls can be integrated and tested.
System Architecture Creation Using System Composer
The process we follow in requirements-based software development lacks a standard way of creating requirement specifications. Currently, the specification used by the developer might take the form of a flow chart, textual requirements, or a hand drawing.
A standard process for analyzing the requirements and creating a system architecture that can be used for software development is therefore necessary.
To fill this void, architecture-based software development can be adopted with minimal changes to the conventional process.
System Composer™ can fill this void. As it is a Simulink® product, the conventional software development process won’t be altered much.
After this presentation, participants will be able to appreciate the advantages of architecture-based software development over requirements-based software development.
The aim is to create a standard process of analyzing the requirements, creating a system level architecture, and using it for software development.
With this process, the developer has access to the entire architecture while implementing the software and can see how the requirements affect the entire system against its functional background. It is easy for developers to understand the entire architecture even if they are new to the topic. Other advantages of using System Composer are requirement traceability and impact analysis of requirement changes on the design, or vice versa.
This process is easiest when the software development is done in Simulink, as the integration between System Composer and Simulink is seamless. Additional scripts might be required if the software is developed in another tool, such as TargetLink®.