Proceedings
Featured Presentations
Dr. Jason Ghidella, MathWorks
AI
Naoya Kaido, All Nippon Airways Co., Ltd.
Algorithm Development and Data Analysis
David Garrison, MathWorks
Lola Davidson, MathWorks
Autonomous Systems and Robotics
Balinder Malhi, Microsoft
Fred Noto, MathWorks
Tohru Kikawada, MathWorks
Cloud, Enterprise, and DevOps
Arvind Hosagrahara, MathWorks
Nicole Bonfatti, MathWorks
Christine Bolliger, MathWorks
Gandharv Kashinath, MathWorks
Electrification
Dr. Danielle Chu, MathWorks
Chris Lee, MathWorks
Ruth-Anne Marchant, MathWorks
Inclusive Innovation: Technology by and for All
Daniela Dobreva-Nielsen, AZO Anwendungszentrum GmbH Oberpfaffenhofen
Mohan Sundaram, ARTILAB Foundation
Eva Pelster, Moderator, MathWorks
Milka Santana, Perfecto Labs
Armani Hrobowski, Argonne National Laboratory
Ji Yang Luo, Moderator, MathWorks
Modeling, Simulation, and Implementation
Dallas Perkins, MathWorks
Yann Debray, MathWorks
Plenary
Dr. Jason Ghidella, MathWorks
Preparing Future Engineers and Scientists
Nattha Jindapetch, Prince of Songkla University (PSU)
Melda Ulusoy, MathWorks
Wireless Connectivity and Radar
Dr. Houman Zarrinkoub, MathWorks
ANA's Predictive Maintenance Challenge: Replace Aircraft Parts Before They Break
All Nippon Airways Group started its air transport business in 1952 and celebrated its 70th anniversary in 2022. The ANA Maintenance Center is working on predictive maintenance for aircraft. We detect signs of failure in various aircraft components based on sensor data collected during flights, aiming to improve safety and on-time performance. To perform predictive maintenance, it is necessary to identify features in the sensor data that predict failures. The ANA Maintenance Center leverages the deep domain knowledge of the maintenance crew and sensor data from more than 20,000 flights around the world. The sensor data includes external factors such as weather conditions, flight path, and number of passengers. In this talk, see how we used MATLAB® to test hypotheses and create machine learning models. We successfully identified robust features that enabled early failure detection of the cabin air compressor (CAC), one of the main air conditioning components. The preprocessing pipeline and the trained model were integrated into an existing system using MATLAB Compiler™ for daily inspection.
Highlights:
- Predictive maintenance for aircraft maintenance
- Hypothesis verification of failure-identifying features using domain knowledge and field data
- A case study on detecting the deterioration of aircraft components using a neural network
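For illustration only (this is not the presenters' code), a minimal MATLAB sketch of the kind of workflow described, training a classifier on features extracted per flight; the file, table, and label names are hypothetical.

% Hypothetical feature table: one row per flight, "Label" = CAC health class
data = readtable("cac_flight_features.csv");
data.Label = categorical(data.Label);

% Hold out 20% of flights for testing
cv = cvpartition(data.Label, "HoldOut", 0.2);
trainData = data(training(cv), :);
testData  = data(test(cv), :);

% Train a simple ensemble classifier on the engineered features
mdl = fitcensemble(trainData, "Label");

% Evaluate early-failure detection on the held-out flights
pred = predict(mdl, testData);
confusionchart(testData.Label, pred)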
Breaking the Boundaries: Integrating GIS, AI, and Lidar for Digital Innovation
A key component of the digital transformation of GIS data is the rapid integration and interrogation of large data sets, lidar point clouds, and visual information to support rapid decision making. For example, smart city automation relies on accurate knowledge of street infrastructure, such as the location and condition of utility poles and power lines, while forestry managers require canopy and trunk estimation and digital terrain mapping to enhance their operations. Significant challenges remain, including the need for rapid but accurate processing of spatial point clouds and the integration of visual and multispectral data with lidar data to provide information beyond a simple (x,y,z) location.
Traditional feature-based methods for processing point cloud data can be slow and imprecise. In some cases, such as on mine sites, feature-based algorithms may misinterpret ground profiles as buildings due to the planar shapes of spoil heaps, leading to inaccuracies. In contrast, at Spacesium we are using cutting-edge deep learning algorithms, such as R-CNN, DeepLab v3, and PointNet++, in conjunction with semi-supervised retraining, to rapidly segment and classify point clouds using cloud-scale resources.
The integration of visual and multispectral data presents a unique set of challenges and opportunities. Fusion provides more than the (x,y,z) position of each point; combining location and visual data can, for example, enable the determination of viable routes for autonomous vehicles. On the challenge side, fusing data sets captured by separate imaging systems is computationally expensive. At Spacesium, we have implemented custom algorithms based on tools from Image Processing Toolbox™ and Computer Vision Toolbox™ to efficiently correct, compute, and fuse these data sets.
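As a point of reference for the feature-based baseline discussed above (not Spacesium's algorithms), a minimal sketch of RANSAC ground-plane fitting with Computer Vision Toolbox; the file name is a placeholder.

% Hypothetical point cloud file; illustrates a classical plane-fitting baseline
ptCloud = pcread("site_scan.pcd");

% Fit a ground plane with RANSAC (maximum point-to-plane distance in meters)
maxDistance = 0.3;
referenceVector = [0 0 1];          % expect the ground to be roughly horizontal
[groundPlane, inlierIdx, outlierIdx] = pcfitplane(ptCloud, maxDistance, referenceVector);

groundPts    = select(ptCloud, inlierIdx);
nonGroundPts = select(ptCloud, outlierIdx);

% Spoil heaps and other planar structures can be misclassified by this approach,
% which is the limitation the deep learning pipeline addresses
pcshowpair(groundPts, nonGroundPts)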
Catching Fire: Using Autonomous Drones to Detect and Track Wildfires
How a Pseudo-Pressure Sensor Improves Diagnostics in a Solenoid Actuated Valve
The Coca-Cola Freestyle dispenser uses a solenoid-actuated valve for dispensing water. There is no pressure sensor on the water line, and it is difficult to distinguish between an upstream pressure loss and an inherent valve failure. This results in the unnecessary replacement of good valves in dispensers in the field. In this talk, hear about the development of a novel pseudo-pressure sensor that estimates the line pressure based on the solenoid valve's current signature. MATLAB® was used to analyze the current signature test data, extract features, and develop a regression model to predict line pressure. This model was then deployed to a memory-constrained ARM Cortex-M microprocessor, with Simulink® used for testing and deployment to production. The pseudo-pressure sensor has improved dispenser diagnostics and reduced valve replacements, leading to a significant drop in field service cost. In addition, learn about plans to develop diagnostics around valve blockages using MATLAB and Simulink for machine learning and deployment.
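A minimal, hypothetical sketch of the regression step described; the feature file and the exact model type are assumptions, not the production design.

% Hypothetical table of features extracted from solenoid current signatures,
% with measured line pressure as the response variable
T = readtable("valve_current_features.csv");

% Train a compact regression tree (small memory footprint for embedded targets)
mdl = fitrtree(T, "LinePressure", "MaxNumSplits", 20);

% Estimate line pressure for new current-signature features
pressureEst = predict(mdl, T(1:5, :));

% A model like this can be brought into Simulink (for example, via a
% RegressionTree Predict block) for code generation to a Cortex-M target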
Machine Learning for Cancer Research and Discovery
The Department of Machine Learning at Moffitt Cancer Center has been developing AI technologies for personalizing cancer care and accelerating cancer discovery. These applications utilize multiscale data from molecular testing, medical imaging, and electronic health records (multi-omics) and create an interesting but challenging application of AI. Discover how deep learning methods are used for better data representation, actuarial analysis to predict time for events, and reinforcement learning for optimizing decision making. The department has also been on the forefront of coupling quantum computing with machine learning to improve robustness. In this session, see examples of cancer research and discovery applications and hear about the successes and inherent challenges of the work.
Reduced Order Modeling with AI: Accelerating Simulink Analysis and Design
With Model-Based Design, you use virtual models to design, implement, and deliver complex systems. Creating high-fidelity virtual models that accurately capture hardware behavior is valuable but can be time-consuming, and these high-fidelity models are not suitable for all stages of the development process. For example, a computational fluid dynamics model that is useful for detailed component design will be too slow to include in system-level simulations used to verify your control system or to perform analyses that require many simulation runs. A high-fidelity model for analyzing NOx emissions will be too slow to run in real time in your embedded system. Does this mean you have to start from scratch to create faster approximations of your high-fidelity models? This is where reduced-order modeling (ROM) comes in. ROM is a set of automated computational techniques that helps you reuse your high-fidelity models to create faster-running, lower-fidelity approximations.
In this talk, learn about different ROM techniques and methods, covering AI-based approaches, linear parameter-varying (LPV) modeling, and strategies for bringing large-scale sparse state-space matrices from FEA tools into MATLAB® and Simulink® for applications such as flexible body modeling and control. The focus of the talk, however, will be on AI-based ROM. See how you can perform a thorough design of experiments and use the resulting data to train AI models using LSTM, neural ODE, and nonlinear ARX algorithms. Learn how to integrate these AI models into your Simulink simulations, whether for hardware-in-the-loop testing or deployment to embedded systems for virtual sensor applications. Finally, learn about the pros and cons of the different ROM approaches to help you choose the best one for your next project.
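A minimal sketch of the AI-based ROM step, assuming XTrain and YTrain hold input/output sequences gathered from design-of-experiments runs of a high-fidelity model; the layer sizes and training options are placeholders.

% XTrain: cell array of input sequences (numInputs x numTimeSteps) from DoE runs
% of the high-fidelity model; YTrain: matching output sequences
numInputs  = 4;        % placeholder sizes
numOutputs = 1;
numHidden  = 128;

layers = [
    sequenceInputLayer(numInputs, "Normalization", "zscore")
    lstmLayer(numHidden, "OutputMode", "sequence")
    fullyConnectedLayer(numOutputs)
    regressionLayer];

options = trainingOptions("adam", ...
    "MaxEpochs", 200, ...
    "Shuffle", "every-epoch", ...
    "Plots", "training-progress");

romNet = trainNetwork(XTrain, YTrain, layers, options);
% The trained network can then be placed in Simulink with a Stateful Predict block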
Understanding and Verifying Your AI Models
Neural networks can obtain state-of-the-art performance in a wide variety of tasks, including image classification, object detection, speech recognition, and machine translation. Due to this impressive performance, there has been a desire to utilize neural networks for applications in industries with safety-critical components such as aerospace, automotive, and medical. While these industries have established processes for verifying and validating traditional software, it is often unclear how to verify the reliability of neural networks. In this talk, we explore a comprehensive workflow for verifying and validating AI models. Using an image classification example, we will discuss explainability methods for understanding the inner workings of neural networks. Learn how Deep Learning Toolbox™ Verification Library enables you to formally verify the robustness properties of networks and determine whether the data your model is seeing during inference time is out of distribution. By thoroughly testing the requirements of the AI component, you can ensure that the AI model is fit for purpose in applications where reliability is of the utmost importance.
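A minimal sketch, assuming the verifyNetworkRobustness interface from the Deep Learning Toolbox Verification Library, of checking that a small perturbation around a test image cannot change its predicted class; net (a dlnetwork without an output layer), img, and label are assumed to exist.

% Build lower and upper bounds around one labeled test image
epsilon = 0.01;                           % allowed perturbation per pixel
X = dlarray(single(img), "SSCB");
XLower = X - epsilon;
XUpper = X + epsilon;

% Formal robustness check for this perturbation set
result = verifyNetworkRobustness(net, XLower, XUpper, label);
summary(result)                           % reports verified / violated / unproven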
Using Deep Learning and Kalman Filters for Temperature Soft Sensing
See how to leverage AI to assess internal temperatures of hydraulic motors. The aim was to create an embedded algorithm for temperature estimation based on the motor loads and environment. The challenges were to account for the load history and to generate data using a single internal tool, with few measurements and little prior data available. The size of the AI model was also a constraint. MATLAB® enabled Poclain Hydraulics to use pretrained AI models, quickly ramp up their expertise in deep learning, deal with load history issues, and accelerate the project. The benefits of this project will be preventing catastrophic field failures due to prolonged operation at unexpectedly high loads and optimizing the size of the overall hydraulic transmission.
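For orientation only (not Poclain Hydraulics' algorithm), a scalar Kalman filter sketch that fuses a simple thermal prediction with an AI-based temperature estimate used as a pseudo-measurement; the dynamics, noise values, loadHistory, features, and aiModel are all hypothetical.

% Placeholder thermal dynamics: T(k+1) = A*T(k) + B*load(k)
A = 0.98;  B = 0.5;
Q = 0.1;   R = 2.0;          % process and pseudo-measurement noise variances
T = 25;    P = 1;            % initial temperature estimate and covariance
Test = zeros(size(loadHistory));

for k = 1:numel(loadHistory)
    % Predict from the load history
    T = A*T + B*loadHistory(k);
    P = A*P*A' + Q;

    % Correct with the AI model's estimate, treated as a pseudo-measurement
    z = predict(aiModel, features(k, :));    % hypothetical trained regression model
    K = P / (P + R);
    T = T + K*(z - T);
    P = (1 - K)*P;

    Test(k) = T;
end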
Cell Array, Table, Timetable, Struct, or Dictionary? Choosing a Container Type
What do you do when you need to work with data that combines numbers, strings, datetimes, categoricals, and other types of data? Container types such as a cell array, table, timetable, struct, and dictionary can store heterogeneous data, but how do you choose the right one for your application? The choice of container type can have a big impact on your productivity and the performance of your code. Learn about the five major container types in MATLAB®, how they represent data, and the advantages and disadvantages of each.
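For a quick side-by-side, here is one small example of each container type (dictionary requires R2022b or later):

% Cell array: any mix of types, indexed by position
c = {42, "hello", datetime("today")};

% Table: column-oriented heterogeneous data with named variables
T = table([1;2], ["on";"off"], 'VariableNames', ["ID","State"]);

% Timetable: like a table, but every row is keyed by a time
TT = timetable(seconds(0:2)', [20;21;19]);

% Struct: named fields holding arbitrary values
s.name = "pump";
s.flow = 3.2;

% Dictionary: fast key-to-value lookup
d = dictionary(["pump" "valve"], [3.2 1.1]);
flowOfPump = d("pump");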
How to Optimize the User Experience of Your MATLAB Apps
With MATLAB®, non-professional software developers can develop professional applications. However, creating visually appealing and user-friendly applications requires design considerations which might elude the engineer who is—rightfully—focused on the app’s functionality rather than aesthetics or usability aspects. Nonetheless, an easy-to-use app is always desirable and sometimes a requirement when the app needs to be published.
In this presentation, learn tips and tricks to design appealing and user-friendly apps with MATLAB, such as:
- customizing graphics objects
- resizing the app components to fit every screen
- exploiting every corner of the canvas space
- using custom HTML components inside the application
Low Code Data Analysis in MATLAB
Data analysis is an integral part of many engineering and scientific workflows, but applying the right data analysis techniques typically takes a lot of manual coding before you get useful results. It can be particularly challenging when you are not familiar with your data. MATLAB® Live Editor Tasks and apps such as Data Cleaner make it easy to explore, clean, and prepare data. These apps and tasks can also automatically generate the equivalent MATLAB code for you to build on. This demo shows how you can analyze data and generate reusable data analysis programs in MATLAB with minimal coding.
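The generated code typically looks something like the following sketch (the file name and cleaning choices are placeholders, not output from a specific session):

% Load a hypothetical sensor log into a timetable
raw = readtimetable("sensor_log.csv");

% Fill gaps and remove spikes
cleaned = fillmissing(raw, "linear");
cleaned = rmoutliers(cleaned, "movmedian", minutes(10));

% Put every signal on a common time base and normalize
cleaned = retime(cleaned, "regular", "linear", "TimeStep", seconds(1));
cleaned = normalize(cleaned);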
MATLAB for Control of Cryogenic DT Fuel for Nuclear Fusion Ignition Experiments
The National Ignition Facility (NIF) houses the world's most energetic laser, which can deliver over 2 MJ of energy at power levels of 500 TW in 351 nm UV light. This immense optical energy is focused on a small target, creating energy densities so high that the unique physics underlying this extreme regime can be explored. One of the primary goals of the NIF is to explore controlled nuclear fusion, in which the energy from the laser is used to compress hydrogen (more specifically, its isotopes deuterium and tritium, or DT) to 100 billion atmospheres, where temperatures reach 100 million kelvin and the atoms can overcome Coulombic repulsion and fuse. This grand challenge of ignition, in which the nuclear fusion energy out is greater than the optical energy in, was successfully achieved in December 2022 in a landmark experiment and is expected to usher in a new age of nuclear fusion research with diverse and far-reaching goals. One of the many challenges of this experiment was the formation of a spherical DT ice layer at ~19 K with very high dimensional precision and smoothness. In this presentation, see a brief overview of the key aspects of the complex and multifaceted system needed to carry out these experiments and a detailed explanation of the use of MATLAB® in forming the DT ice layer for the ignition experiment, as well as for previous experiments that explored the physics needed to achieve it. In particular, the image analysis tools in MATLAB were used to control the ice layering process. This use case serves as a good example of the wide-ranging applications of this software tool.
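As a purely illustrative sketch of layer-characterization image analysis (not the NIF control code; the file name and the smoothness metric are hypothetical):

% Segment the ice boundary in a backlit layer image and measure its smoothness
img = imread("dt_layer_shadowgraph.png");     % placeholder file name
bw  = imbinarize(im2gray(img));
bw  = imfill(bw, "holes");

% Trace the layer boundary and compute radius versus angle about its centroid
B = bwboundaries(bw);
boundary = B{1};
center = mean(boundary);
r = vecnorm(boundary - center, 2, 2);
roughnessRMS = std(r);                        % simple smoothness metric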
Process of Building AI Models for Predicting Engine Performance and Emissions
Simulation-based product development requires efficient engine models that not only predict engine performance and emissions accurately but also produce results at real-time computing speeds. These models are used for prediction within a closed-loop, system-level environment integrated with controls and other components. 0D/1D physics-based software can model engine performance with good accuracy, but those models cannot be deployed in a closed-loop environment with controls and other components because their runtime is significantly higher, about 20 to 40 times slower than real time. An investigation was done to find a low-fidelity model or algorithm with similar performance that could replace the high-fidelity physics-based models in the simulation domain and be deployed in a system-level environment. The ability of AI models to learn robustly from data encouraged Cummins to try several machine learning models. A detailed approach was taken to build LSTM-based deep neural network models that achieved the target accuracy with a runtime of about one-eighth that of real time. Cummins developed a total of 26 models using MATLAB® to predict 26 different engine responses, consisting of temperatures, pressures, and flows across multiple locations of the engine along with emissions, efficiencies, and engine brake torque. In the process of building these deep learning models, multiple techniques were evaluated, such as defining an optimal deep neural network structure (the type and number of layers), using different activation functions, and optimizing the associated hyperparameters. In the future, the process of building these models will be automated, and the use of MATLAB to train the models in parallel will be a significant advantage. As these models meet all the requirements, future work will integrate them in the pure simulation domain, followed by integration with actual hardware and controls.
Using MATLAB with Python
Accelerate Aerial Autonomy with Simulink and Microsoft Project AirSim
Autonomous aerial systems are critical systems that incorporate AI and require rigorous verification to ensure safety and performance. Aerial autonomous technologies are driving advanced air mobility use cases such as power line inspections, logistics, and flying taxis. Integration of airframe physics and sensors in scalable high-fidelity simulations with photorealistic 3D environments is needed to test and evaluate autonomous aerial systems while generating synthetic data to train AI models.
Explore how to leverage Microsoft Project AirSim and Simulink® with Model-Based Design to accurately simulate vehicle physics with sensors and dynamic environments in high-fidelity simulations. Learn how to use co-simulation in software-in-the-loop workflows to safely build, train, test, and validate autonomous aerial systems.
Applying AI to Enable Autonomy in Robotics Using MATLAB
AI applications in robotics have expanded greatly in recent years to include voice command, object identification, pose estimation, and motion planning, to name a few. The use of AI-enabled robots continues to grow in manufacturing facilities, power plants, warehouses, and other industrial sites. Warehouse bin picking is a good example: deep learning and reinforcement learning enable robots to handle various objects with minimal help from humans, reducing workplace injuries due to repetitive motions.
Learn how to empower your robots using AI for perception and motion control in autonomous robotics applications. MATLAB® and Simulink® provide a powerful platform for successful AI adoption in robotics and autonomous systems. You can use the same development platform to design, test, and deploy your AI applications in intelligent bin-picking collaborative robots (cobots), autonomous mobile robots (AMRs), UAVs, and other robotics systems. This reduces development time as well as time-to-market.
Gain insights into:
- Reducing manual effort with automatic data labeling
- Detecting and classifying objects using deep learning for robotics applications
- Motion planning using deep learning
- Controlling robot motion using reinforcement learning
- Deploying deep learning algorithms as CUDA-optimized ROS nodes
Bringing the iCub Humanoid Towards Real-World Applications
The iCub project was launched in 2004 as part of the RobotCub European Project, whose main aim was to study embodied cognition—the theory that an organism develops cognitive skills as it interacts with its environment. The main outcome of the iCub project is a one-meter-tall, 53-degrees-of-freedom humanoid currently being developed at the Italian Institute of Technology (IIT). Over the years, the iCub robot has been used as a research platform for diverse fields of science ranging from neuroscience to engineering. This presentation focuses on the work that the Artificial and Mechanical Intelligence research lab at IIT is carrying out with the iCub along the three axes: physical human-robot collaboration; avatar systems; and aerial humanoid robots (jet powered humanoid robots). The presentation shows how MATLAB® and Simulink® are fundamental for research and development for the iCub in control, planning, estimation, and artificial intelligence—and how these tools can be beneficial for humanoid robotics as a whole.
Development and Validation of ADAS/AD Features Using MATLAB Solutions
Autonomous driving (AD) and advanced driver assistance systems (ADAS) are technologies that will see widespread adoption globally. The design, development, and validation of complex ADAS/AD algorithms are critical to assuring the safety of vehicle occupants. Simulation is an ideal way to develop ADAS/AD algorithms and validate them in a PC-based environment before hardware is available. During the session, hear about the work done by Tata Elxsi to simulate ADAS/AD algorithms using the MATLAB® and Simulink® product portfolio. Dive into topics like sensor modeling, scenario creation, environment modeling, control algorithm development, vehicle dynamics modeling, closed-loop integration, and validation. Learn how ISO safety of the intended functionality (SOTIF) can be validated in a virtual space using MathWorks solutions, and how their intuitive nature enhances productivity. See examples of how MathWorks solutions helped Tata Elxsi accelerate the development process, and watch demonstrations of the interoperability of MATLAB solutions with 3D simulation tools for better visualization of scenarios. Finally, see how MathWorks solutions are used for hardware-in-the-loop validation of ADAS/AD electronic control units.
Model Autonomous Navigation of a Mars Rover
What’s New in MATLAB, Simulink, and RoadRunner for Automated Driving Development
MATLAB®, Simulink®, and RoadRunner help engineers to build automated driving systems with increasing levels of automation. Discover new features and examples in R2022b and R2023a that will allow you to:
- Interactively design scenes and scenarios for driving simulation
- Generate 3D scenes from HD map data
- Generate driving scenarios from recorded sensor data
- Simulate driving applications like emergency braking, lane following, lane change, platooning, and valet parking
A Cloud-Based MATLAB Visual Inspection System
Hear a case study describing a repurposable cloud-based visual inspection system powered by image processing and computer vision algorithms running on MATLAB Production Server™. See how to use mobile device–based workflows to visually catalog and inspect objects of interest via streaming video and image capture, and how to implement a DevOps workflow.
Build Scalable AI Solutions with MATLAB Production Server in Kubernetes on Azure
With more than 150 years of experience in the industry, the AERZEN Group is one of the top 3 application specialists for high-performance blowers. The compressors, blowers, and turbos are mainly used in wastewater treatment plants, in the process industry, and in oil-free pneumatic conveying of bulk materials. Sustainability, smart energy, resource usage, and reliability of machinery are important concerns for AERZEN’s customers when operating their plants all over the world. The Aerzen Digital Systems business unit is working on smart services so that users can operate the machines even more efficiently and reliably with the help of AI and machine learning.
Aerzen Digital Systems designs AI models for forecasting, condition monitoring, and predictive maintenance for this purpose. Functions and models may be used interactively for data exploration as well as automatic processing of streaming IoT data. While planning to operationalize these models, several challenges surfaced and certain requirements for the platform were defined:
- Flexibility and scalability for many unique plants and sets of machinery
- MLOps to monitor and adapt AI models over decades of equipment lifetimes
- Agile, quick deployment of new functionality as improved AI techniques become available
- Integration with applications from different frameworks developed both in-house and by customers
In this talk, we detail a sample solution to these challenges centered around MATLAB Production Server™ running in Kubernetes on Microsoft Azure. The Aerzen Digital Systems libraries of MATLAB functions and AI models are deployed to MATLAB Production Server through an MLOps pipeline. In the present case study, data from a large wastewater treatment plant is analyzed and an anomaly detection algorithm for a single blower is developed. This model is then uploaded to the cloud and trained individually for all blowers. During runtime execution, every model is monitored and automatically retrained if necessary.
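A simplified, hypothetical sketch of the per-blower anomaly detection step, assuming isolation forest support (iforest/isanomaly) in Statistics and Machine Learning Toolbox; the feature files are placeholders, and this is not the AERZEN model.

% Train an isolation forest on healthy-operation features for one blower
healthy = readtable("blower_healthy_features.csv");
[forest, ~, trainScores] = iforest(healthy{:,:});

% At runtime (for example, inside a function deployed to MATLAB Production Server)
newFeatures = readtable("blower_latest_features.csv");
[isAnomaly, scores] = isanomaly(forest, newFeatures{:,:});

if any(isAnomaly)
    % flag this blower for retraining or maintenance review in the MLOps pipeline
end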
DevOps with MATLAB: A Predictive Maintenance System for Streaming Data
Many organizations use MATLAB® and Simulink® to develop algorithms—but how do they deploy, monitor, and manage them over their lifetime? DevOps refers to the set of capabilities, tools, and best practices needed to operationalize software applications, usually in an IT context. But Gartner reports that more than 50% of data science projects do not result in business value due to difficulties with operationalization.
In this session, see a demonstration of a complete predictive maintenance DevOps system for monitoring the state of health (SOH) of a battery fleet. Learn how to develop an SOH prediction model and a drift detection model in MATLAB. Then, see how to automatically test and deploy these algorithms in operation using a CI/CD pipeline, MATLAB Production Server™ on Microsoft Azure®, and dashboards for performance monitoring.
Discover how engineering teams can use MATLAB to operationalize their algorithms and how to bridge the gap with IT/OT teams.
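As one simple illustration of a drift check (not necessarily the method shown in the session), a two-sample test on a monitored feature; the table and variable names are hypothetical.

% Compare the distribution of a monitored feature between training-time and recent data
baseline = trainingFeatures.Voltage;          % feature values from model training
recent   = streamingFeatures.Voltage;         % same feature from recent operation

[driftDetected, pValue] = kstest2(baseline, recent, "Alpha", 0.01);

if driftDetected
    % trigger the CI/CD pipeline to retrain and redeploy the SOH model
end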
MATLAB-Based DevOps Workflow in AWS for Hospital Patient Monitoring Applications
The Maternal Infant Care division at GE HealthCare is interested in remote baby monitoring in neonatal intensive care units (NICUs). Some of the babies admitted to the NICU are fragile and must be isolated for special care in an optimal incubator environment. Remote monitoring enables healthcare professionals to assess a baby's health information and enables parental bonding with the baby in an incubator. This is achieved with a video camera attached to the NICU bed that streams data to MATLAB® in the cloud. Doctors and parents can access baby information via web and mobile applications. MATLAB Production Server™ hosts the MATLAB runtime environment in a Kubernetes-based AWS cloud platform integrated with Edison™ cloud services developed internally by GE HealthCare. As new patient data is collected, the research teams tune the processing algorithms developed in MATLAB following a standard DevOps workflow. Deployment is scaled using MATLAB Production Server for multiple data streams.
Running MATLAB Machine Learning Jobs in Amazon SageMaker
Amazon SageMaker is a popular AI platform in the cloud providing a broad range of capabilities and services for AI developers. One of the great advantages of SageMaker is that it lets developers run jobs on a wide selection of compute instances, including GPU support with the pay-as-you-go pricing model, without managing underlying infrastructure. Learn how to leverage SageMaker to train a machine learning model using MATLAB®.
What's New for Managing, Testing, and Building Your MATLAB Code
Optimize EV Battery Performance Using Simulation
In early vehicle design stages, engineers need to make decisions about battery pack sizing. This often involves trade-offs between competing objectives such as EV range and cost, making it difficult to select an optimal design. It requires significant time and modeling expertise to build the closed-loop system-level models needed to evaluate potential designs.
In this talk, we present an accessible workflow to address these pains. We demonstrate how to create a system-level electric vehicle model using the Virtual Vehicle Composer app from Powertrain Blockset™. The model will be used to iteratively evaluate the vehicle’s performance using optimization techniques. The results of this study can then be used in Simscape Battery™ to generate an appropriate battery pack design and evaluate the pack behavior in more detail. This battery model is then integrated back into the system-level model for verification against the requirements.
This approach provides a rigorous numerical method to quantify trade-offs in the design problem. It also simplifies the process of model building, both at the system level and battery subsystem level. Domain experts who may not be tool experts can now take advantage of these powerful design workflows.
Optimized Motor Control Applications: From Idea to Deployment with NXP MBDT
Developing complex applications for motor control requires a streamlined, efficient process that delivers high-performance results. In this presentation, explore NXP™ Model-Based Design Toolbox (MBDT), which seamlessly integrates with MATLAB® and Simulink® to provide an automated path to applications development on NXP S32 microcontrollers. With extensive automotive math and motor control function libraries, peripheral driver blocks, and processor-optimized motor control functions (running on a specialized co-processor, the Enhanced Time Processor Unit), the toolbox offers multiple simulation modes and code generation capabilities for NXP processors. By using MBDT with MATLAB and Simulink, NXP customers can easily prototype motor control applications. Discover how NXP MBDT enables the development of optimized performance applications for motor control.
Simscape Battery Essentials
Techno-Economic Analysis of the Impact of EV Charging on the Power Grid
With more and more electric vehicles connecting to the power grid every day, there are concerns that existing grid infrastructure will be strained beyond acceptable operational limits. We can address these concerns by bringing operations, pricing, and forecasting into techno-economic models of power systems. Using these models, we can assess feasibility, risk, optimal operations, and profitability of charging infrastructure. These models provide key insights such as expected system performance over time, identification of factors that lead to bad outcomes, and right-sizing of components through optimization studies.
In this talk, we consider a scenario where a system operator can command individual electric vehicle battery units to both store and supply electricity while connected to the grid. The operator applies techno-economic optimization to the charging profiles to minimize electricity cost while accounting for system requirements and constraints, such as limits on state of charge, grid supply, and charge/discharge rate. The optimization provides a fast and automated approach for leveraging all of the units connected to the grid for overall system benefit. Charging profiles are then assessed for the impact on voltage and power flow levels using a grid-level simulation.
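A toy version of such a charging-profile optimization, formulated as a linear program with Optimization Toolbox; all prices, limits, and battery parameters are placeholders, not the presenters' model.

% Choose a charge/discharge power profile p (kW, positive = charging) for one
% vehicle over N hourly steps to minimize electricity cost subject to SOC limits
N     = 24;
price = 0.10 + 0.08*rand(N,1);     % placeholder hourly prices ($/kWh)
dt    = 1;                         % hours per step
cap   = 60;                        % battery capacity (kWh)
soc0  = 0.4*cap;                   % initial stored energy (kWh)

lb = -7*ones(N,1);                 % max discharge 7 kW
ub = 11*ones(N,1);                 % max charge 11 kW

% Keep stored energy within 10%-90% of capacity: soc0 + dt*cumsum(p) in bounds
A = [ tril(ones(N))*dt;
     -tril(ones(N))*dt];
b = [ (0.9*cap - soc0)*ones(N,1);
      (soc0 - 0.1*cap)*ones(N,1)];

% Finish the horizon at 80% state of charge
Aeq = dt*ones(1,N);
beq = 0.8*cap - soc0;

f = price*dt;                      % energy cost per step
p = linprog(f, A, b, Aeq, beq, lb, ub);

plot(soc0 + cumsum(p)*dt)          % resulting stored-energy trajectory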
Toward Zero-Emission Shipping with Fuel Cells and Model-Based Design
The shipping sector accounts for a significant amount of global CO2 emissions. Fuel cell technology enables electrified propulsion and hence a vision of carbon-free transportation. By converting diverse fuels such as ammonia, methane, and hydrogen, together with oxygen, into electricity, fuel cells could replace combustion engines to ensure long-range autonomy in freight transportation and contribute to cleaner and more efficient propulsion. But what are the intrinsic challenges to fuel cells and their commercial upscaling in the maritime sector? Is there one fuel cell technology, such as solid oxide, with an edge? How do engineering methods allow for a thorough yet agile concept evaluation? Attend this presentation by Clara Venture Labs and Alma Clean Power to discover how Model-Based Design is used to evaluate the robustness and efficiency of various fuel cell solutions in high demand in the maritime sector. The benefits of model customization and tunability in the process of developing industrial partnership proposals will also be highlighted.
APAC EMEA Inclusive Innovation: Technology by and for All
Inclusive innovation is of fundamental importance to develop engineering solutions that support the needs of diverse communities. How do we build learning environments and workplaces that recognize community needs in order to engineer inclusive solutions? Hear thought leaders discuss the needs, challenges, and social factors that impact inclusive innovation.
Accelerating Safe Railway Application Development Using Model-Based Design
Alstom’s train traction team uses MATLAB® and Simulink® to develop traction control software adhering to Software Safety Level 2 standard EN 50657 (including former EN 50128).
Hear a brief summary of the earlier talk from MATLAB EXPO 2018, when the traction control team used MATLAB and Simulink for prototyping code generation on traction controllers for the very first time. Learn how Alstom transformed its software development processes and tools, moving from traditional tools such as Visio, textual documents combined with IEC 1131 design tools, and hardware-based testing to MATLAB and Simulink for requirements management, software development, and verification on personal laptops in accordance with safety standards EN 50128/50657. Alstom has developed software products and projects that have trains successfully running in passenger service, with independent safety assessment, using a certified Model-Based Design workflow from MathWorks. Alstom upskilled its software developers and verifiers to adopt this new way of working and the new tools. For certain validation activities, the team has managed to cut costs by 80% compared with the traditional way of working.
Hear about the opportunities Alstom sees in MATLAB and Simulink and what the team is prototyping to create an even more efficient system development process. Learn about some of the challenges faced and how to overcome them, as such transformations don’t always go smoothly.
Formalizing Requirements and Generating Requirements-Based Test Cases
Systems engineers typically capture requirements as text that can be incomplete and inconsistent, resulting in errors that become exponentially more expensive to fix over time. By formalizing requirements using the Requirements Table block, you can define the expected behavior and analyze the requirements for completeness and consistency before you even begin your design. Furthermore, since the requirements model is independent of the design model, you can generate requirements-based tests from your modeled requirements and verify whether your design meets those requirements without needing to manually write thousands of test cases.
Implementing a PLCnext-Based Turbine Control System in Simulink
Model-Based Design provides an efficient approach to modeling and designing control systems. Both the controller and the plant model are implemented in MATLAB® and Simulink®. By linking these models, you can optimally analyze the controller and adapt it to the environment and the plant. Different stages of testing like unit tests or model-in-the-loop tests are optimally implemented with Simulink Test™. Further advantages like a visual representation of data flow, intuitive language learning, and smooth implementation of new features make Simulink the ideal choice for Sokratel’s task of implementing a PLCnext-based wind turbine control system. This talk focuses on the following topics:
- What is Model-Based Design and why does Sokratel use it?
- The iterative process of refining a requirement from the idea to the prototype
- Introduction to the process based on a small example
- Toolchain integration: How Sokratel automates daily work
- Testing concept: Hardware-in-the-loop testing with the PLCnext system
After this talk, you will have an idea of how to successfully lead your team on the road from MATLAB and Simulink development to the final product on hardware.
Infineon, Supporting Dependability for Automotive MCUs with Model-Based Design
Dependability is fundamental to Infineon and encompasses automotive safety, cybersecurity, availability, and commitment to quality and long-term product lifecycle support. As automotive systems become more complex, designing a hardware and software application architecture that is fit for purpose is crucial for our customers. Mastering this complexity requires the support of Model-Based Design methods. In this presentation, learn how Infineon and MathWorks strategically teamed up to offer model-based software design capabilities for Infineon’s latest AURIX™ TC4x microcontroller. The AURIX TC4x hardware support package has already seen success with customers, some examples of which will also be shared in the presentation.
Managing the Complexity of FPGA-Based Rapid Control Prototyping
Siemens Healthineers is exploring rapid control prototyping (RCP) for the development of power electronic components. RCP with FPGAs represents one of the most powerful workflows within the Simulink® framework. The workflow enables engineers to implement sophisticated, high-performing control systems with very fast design iterations. The tight integration between HDL Coder™ and Simulink Real-Time™ eliminates many of the intricacies of FPGA design. Nevertheless, FPGA-based RCP remains one of the most complex Simulink workflows, so managing the workflow complexity is one of the key factors for success. Three main measures can be employed to this end: workflow automation, a suitable model architecture, and a well-customized utility library for handling direct memory access (DMA).
Developing an FPGA-based RCP application requires many iterations of successive builds with FPGA synthesis, each taking approximately one hour. It is highly recommended to automate this workflow with a continuous integration server to separate this procedure from the core modeling tasks. There are also a number of considerations for an effective and sustainable RCP architecture: there needs to be a clear separation between the core responsibilities of experiment design, hardware interfacing, physical modeling, control design, and real-time signal tracing. To be effective, this architecture requires barrier-free communication between the CPU and the FPGA. Basic support for this comes with the DMA blocks from Simulink Real-Time; to be truly barrier-free and flexible, a DMA utility library has been designed. This presentation encourages the user to think about efficient modeling procedures, the robustness of their designs, and the validity of their model architecture.
Model-Based Design and Prototyping of FPGA/SoC in an Aerospace Application
The objective of this project is the development of an embedded controller application using an FPGA with significantly less time and cost and with enhanced reliability. With a conventional approach, these issues impact the project:
- Handshaking between multiple stakeholders and domains drives multiple iterations
- Bugs increase time and cost throughout the project lifecycle
- Limitations with virtual integration and early validation
- Reduced scope for reuse and early prototyping
Adopting Model-Based Design for FPGA development enables seamless integration of different stakeholder needs, virtual integration, and automatic generation of production code. This talk includes topics such as:
- Model-Based Design of an embedded controller for PMSM
- Virtual integration and simulation
- VHDL code generation and co-simulation
This talk will go over a detailed workflow to develop an embedded controller for an aerospace application from system requirements to design and code using Model-Based Design including:
- Requirements to model development in S-domain
- Virtual simulation to validate the architecture
- Development of a fixed-point model
- Development of a model for VHDL code generation
- Design and simulation of control systems with plant models using similar test cases from the MATLAB® environment
- VHDL code generation and co-simulation of code using the same test cases and plant models
- Effective post-processing of results as a part of analysis and validation
This talk will also cover how Model-Based Design led to significant savings in terms of time and cost and how continued support from MathWorks led to effective incorporation of tools and methods enabling successful deployment.
Standardized NPSS Propulsion Model Integration into Simulink Process Using FMI
Hear about newly developed methods for integrating propulsion performance models built in the NPSS (numerical propulsion system simulation) environment into MATLAB® and Simulink® using the FMI (Functional Mockup Interface) industry standard. An NPSS model can now be easily packaged as an FMI version 2–compliant FMU (Functional Mockup Unit) and brought into Simulink using the native FMU Import block. This requires no code to be generated or compiled for the interface, as in previous S-function-based methods. Using an NPSS FMU in Simulink also allows the use of solvers, analysis tools, and other features to analyze and visualize the behavior of the NPSS model. Another feature of the FMU interface is its use of Simulink bus signals, which group large numbers of signals together and make them available to the user, who can then select the parameters of interest for their task. Additionally, with recent development efforts, the NPSS FMU can be wrapped using Simulink Coder™ to enable the entire Simulink model to be packaged as source code for use in a real-time simulation environment. The FMU generation process around the NPSS model was developed in collaboration with the NPSS Consortium and will be publicly available in NPSS version 3.3 later in 2023. This improved process has resulted in faster simulation integration times and added capabilities for process automation and coupling with MBSE tools for requirements testing and verification.
Using Model-Based Design to Develop SOA Applications for In-Vehicle OS
With the growth of the service-oriented architecture (SOA) paradigm in the automotive industry, several middleware options have emerged, along with the widely known AUTOSAR Adaptive Platform. Many companies, such as ZEEKR, are also looking to develop their own middleware for in-vehicle operating systems (OS). ZEEKR implements multiple function clusters for such an in-vehicle operating system for smart EVs. When it comes to developing SOA software, the first approach that comes to mind is handwritten C++, which places high demands not only on the programmer's skill but also on the accompanying toolchain. In contrast, Model-Based Design is the dominant approach in traditional onboard embedded system development, and many engineers are experienced with it. Starting with R2022a, Simulink® provides new features, such as client-server interfaces, that enable SOA software to be modeled instead of handwritten. Companies can use the power of MATLAB® and Simulink to quickly transition from the embedded software era to the emerging SOA software era. There are two main challenges in using Model-Based Design to develop SOA software running on an in-vehicle OS: how to model the software behavior and how to maintain complex software clusters. This talk focuses on solving these two problems. ZEEKR made full use of the capabilities of MATLAB and Simulink for SOA software development, formulated strategies suited to its own conditions, and facilitated the implementation of SOA application software running on the in-vehicle OS.
Using Simulink with Python
Engineers use MATLAB® and Simulink® with other programming languages, for example C/C++ or Python®, to develop algorithm components for their increasingly complex projects. Integrating these components, which are developed in different environments, together for system-level simulation is becoming a key step of the project’s success.
In this presentation, see a demonstration of major workflows using Python with Simulink. Learn how to use Python Importer and other methods to bring Python code (including AI models and data processing pipelines) easily into Simulink. In addition, discover how you can simulate your Simulink model from a Python-based environment through the MATLAB engine or compile your Simulink model as a Python package for deployment.
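A minimal sketch of the MATLAB-to-Python direction that these workflows build on; my_module and its function are hypothetical, and pyrun requires R2021b or later.

% Show which Python interpreter MATLAB is configured to use
pyenv

% Call the Python standard library directly via the py. namespace
r = py.math.sqrt(2);

% Run inline Python code and return a variable to MATLAB
y = pyrun("z = x**2 + 1", "z", x=3);      % returns 10 as a double

% Call a function from a user module on the Python path (hypothetical module)
result = py.my_module.preprocess("sensor_01");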
Verification of Avionics Systems Using Simulink Test and Simulink Real-Time
The current market situation forces companies to search for methods to accelerate the development of new avionics systems in accordance with guidelines and industry standards (including ARP4754A and DO-178C) while maintaining high quality and cost competitiveness. In this talk, hear about challenges related to the automation of the system verification process of avionics devices, solutions using Simulink Test™ and Simulink Real-Time™, and future development directions of the test environment.
How Siemens Energy Enables the Global Energy Transition
The transition to carbon-neutral energy production is one of the cornerstones for limiting global warming. Recent discussions about energy transition have focused on the "energy trilemma"—the need to find the right balance between affordability, reliability, and sustainability of energy.
Part of the solution is to strengthen the power grid for highly fluctuating distributed generation and feed-in from renewable energy sources. Siemens Energy is accomplishing this with solutions such as HVDC (high-voltage direct current transmission) and FACTS (flexible AC transmission).
The power grid of the future relies on power electronics and their control software. Therefore, the energy transition is also driven by software development. For this purpose, it is essential to understand the grid as well as possible and to test the software thoroughly.
Working closely with MathWorks as a technology partner, Siemens Energy's Grid Solutions Control and Protection Department has extended Model-Based Design to study, develop, and verify complex power transmission systems. Simulating the plant on a desktop PC helps to "push left" in the v-model development cycle to test system behavior as early as possible with the same control software that is later deployed on hardware. This enables the development of a digital twin to analyze and understand fault scenarios and incorporate final system validation for the end customer with MathWorks products.
Leveraging this approach as an end-to-end development ecosystem helps in all phases of project execution—be it bidding, engineering, or system studies. Modern cloud-based CI/CD workflows extend this ecosystem to automate test and code generation and make it even more accessible to all engineers.
Using a model-based engineering ecosystem helps Siemens Energy provide answers and solutions to our customers and support them on their journey to clean and available energy for all.
Moonshots: How Engineers and Scientists Are Achieving the Impossible
Moonshots—projects with lofty and seemingly impossible goals—are the engines that drive innovation, increase human knowledge, and improve our standard of living. Over 50 years ago, NASA's Apollo program landed the first people on the moon, accomplishing a literal moonshot and fostering emerging technologies that are now ubiquitous, such as integrated circuits, photovoltaic cells, and digital image processing. Today, engineers and scientists are aiming to generate unlimited clean energy, create advanced medical devices to save and improve lives, travel to Mars, and explore the universe. In this talk, learn about some of these visionary projects, the role MATLAB® and Simulink® played in helping engineers and scientists achieve their goals, and how you can apply these same tools and techniques to your own "moonshots."
Project-Based Learning and Design with Simulation
King's Engineering at King’s College London was relaunched in 2020 as a general engineering department. The aim is to deliver innovative engineering education that addresses new technological and societal challenges through an interdisciplinary, transdisciplinary, and intradisciplinary education. We are reimagining how project-based learning and design can be applied, at scale, in a research-intensive environment and how we can put people at the center of engineering. To support this vision, we are embedding simulation workflows and methodologies within our teaching modules. It allows students to work in agile sprints and fully explore scenarios and iterate design ideas within a multidomain systems environment. MATLAB® and Simulink® become both the learning environment and the assessment tool. In this talk, we discuss our pedagogic approach to designing assignments with Simulink and developing the students' modeling and simulation skills. We argue that project-based and practical learning can include hands-on experience of digital artifacts and that the parallels between Model-Based Design and engineering system design help students in their learning.
What's New in MATLAB and Simulink R2023a
Learn about new capabilities in MATLAB® and Simulink®. This talk highlights new tools such as interactive apps, Live Editor tasks, and high-level functions for completing tasks in MATLAB with little to no coding; and new features in Simulink for defining input signals, managing design variants, and debugging simulations. See new capabilities for building and testing your code and models, using Python® with both MATLAB and Simulink, and integrating with other tools and environments, including Jupyter®, VS Code, and Unreal®.
Engaging First-Year Engineering Students with Deep Learning and IoT
A recent survey by the American Society of Engineering Education Corporate Member Council highlighted two areas in which engineering graduates are inadequately prepared to meet industry demands: AI and Internet of Things (IoT). At the Ira A. Fulton Schools of Engineering at Arizona State University, we’re taking steps to address this skills gap by introducing engineering students to AI and IoT concepts early in their college careers. Specifically, a new learning module was added to the first-year Introduction to Engineering course in which students complete hands-on AI and IoT exercises using MATLAB®. In these exercises, students perform image classification with a deep learning network and then send the results of their classifications to the ThingSpeak IoT analytics platform for aggregation and analysis. The module requires no previous programming experience in MATLAB and no additional hardware—students use their own laptops, tablets, and webcams. Just as importantly, the module requires minimal instructor preparation because the exercises were designed, implemented, and validated by MathWorks engineers and are ready to use in the course.
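A sketch of the kind of exercise described, classifying an image with a pretrained network and logging the result to ThingSpeak; the channel ID and write key are placeholders.

% Pretrained CNN (requires the free SqueezeNet support package), no extra hardware needed
net = squeezenet;

% Use a built-in image, or a frame grabbed from the student's webcam
img = imread("peppers.png");
img = imresize(img, net.Layers(1).InputSize(1:2));

[label, scores] = classify(net, img);

% Send the top classification score to a ThingSpeak channel for aggregation
thingSpeakWrite(1234567, max(scores), "WriteKey", "XXXXXXXXXXXXXXXX");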
Interactive Learning with MATLAB Apps, Live Scripts, and MATLAB Grader
NETH and University Collaboration for Talent Workforce Development
The NEXTY Career Identity Program is NETH’s project for transferring automotive software (SW) development knowledge from industry to academia for future talent workforce development. The project’s objectives are to:
- Acquire a talented workforce with the SW development skills needed to become SW engineers.
- Build an automotive SW development technology network between universities and industry.
- Transfer automotive software development knowledge to universities.
- Develop an automotive SW development curriculum for workforce upskilling.
- Define skill evaluation criteria for target technologies in the embedded systems SW development field.
To transfer industrial knowledge to students, Model-Based Design and embedded SW development training related to electrification was provided to them from November 2021 to November 2022. The challenge of the project was to provide concrete SW development practices and examples with a quality-standards mindset while keeping in mind the students' varied majors and programming backgrounds. Imparting the Model-Based Design philosophy is key to successful training because NETH (also known as Toyota Tsusho NEXTY Electronics (Thailand) Co., Ltd.) uses Simulink® for industrial SW development. With NETH’s university collaboration network and business partnership support, the NEXTY Career Identity Program provided training courses to 52 students, including basic Model-Based Design training, an embedded systems workshop, DC motor modeling with FPGA hardware-in-the-loop, and a battery management system using Stateflow®. Eighteen professors from six universities joined the collaboration program, and nine of them plan to develop further curricula with NETH. These achievements show that our workforce development network is growing.
Preparing Students for Impactful Careers in Industry
In this session, hear about the challenges that deans and department heads face as they evolve their curricula to meet the ever-changing needs of industry and students. See how MathWorks aligns as a university partner by developing teaching tools and resources that help prepare students to fit industry and research workflows. This talk integrates MathWorks resources such as assignment autograding, online training courses, courseware, integrations with open source software (OSS), student programs, and capstone projects in a conversation about enabling educators to teach complex engineering concepts in an effective and engaging manner.
Teaching Robotics and Controls Made Easier
Robotics and controls, as rapidly advancing engineering disciplines, are transforming various industries and sectors across the board. Educators who teach these subjects face limited time to create and update their curriculum while trying to keep up with advancing technology and scientific trends. Discover how to apply tools used in industry and interactive courseware to teach robotics and controls. Analyze a typical academic curriculum from early courses to final projects to understand how you can approach the different branches of knowledge that form these disciplines, from the different phases of the teaching activity. Learn how to supplement your teaching using free self-paced online training, educational videos, and course assignments to automatically grade code. You will also learn how to incorporate interactive apps and scripts, cloud solutions, virtual labs, and low-cost hardware projects developed by MathWorks and leading universities to make your lessons more engaging.
Can You See Me Now? Analyzing Satellite-to-Ground Station Visibility
6G Wireless Technology: Accelerate Your R&D with MATLAB
The world of wireless communications has begun the research and development needed to build sixth-generation (6G) wireless systems. 6G research and development aims to improve on the performance of current 5G systems and to develop networks that are faster and more intelligent, operate with lower latencies, and enable new applications. Enabling technologies for 6G may include new frequencies such as sub-THz communication, as well as artificial intelligence and machine learning, reconfigurable intelligent surfaces, joint communication and sensing, and new digital waveforms. Hear an overview of the goals and vision for 6G systems and the enabling technologies, and learn how MATLAB® wireless communications tools can help you accelerate your 6G R&D with reliable modeling and simulation.
Development of Signal Processor and Extractor Module for 3D Surveillance Radar
In this session, explore the radar signal processing and data extractor modules developed for a 3D surveillance radar in a ground-based air surveillance system. Radar design starts from user specifications such as unambiguous range and the resolution and accuracy in range and azimuth. We evaluated the Radar Designer app to perform initial radar design calculations. Radar waveform selection is crucial for the radar design to meet its performance specifications. Using the Pulse Waveform Analyzer app, we selected and verified radar waveforms, and matched filtering–based digital pulse compression was verified using MATLAB® scripting. A radar signal processor (RSP) is responsible for target detection under environmental clutter. Many RSP algorithms are available, and verifying their performance requires high-fidelity simulated input data. We designed a complete RSP chain using Phased Array System Toolbox™ and Radar Toolbox. We also built a simulated radar data extractor module, consisting of elevation extraction and range and azimuth estimation methods, in MATLAB using these toolboxes to provide a better assessment of site-recorded data, overcoming challenges in range, azimuth, and elevation estimation along the way. MATLAB visualization enabled us to present our findings in a comprehensive manner. The RSP chain designed in MATLAB was validated on actual radar site data and found to match actual performance with high fidelity. MATLAB proved extremely useful in reducing our design cycle, and we were able to demonstrate our concept with a higher degree of confidence.
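For orientation, a generic pulse-compression sketch with Phased Array System Toolbox (parameter values are placeholders, not the actual radar design):

% Linear FM pulse and its matched filter
fs = 10e6;
waveform = phased.LinearFMWaveform("SampleRate", fs, ...
    "PulseWidth", 20e-6, "SweepBandwidth", 2e6, "PRF", 2e3);

x  = waveform();                                       % one transmitted pulse
mf = phased.MatchedFilter("Coefficients", getMatchedFilter(waveform));

% Simulated delayed, attenuated echo in noise, then pulse compression
rx = [zeros(200,1); 0.1*x(1:end-200)] + 0.05*randn(size(x));
y  = mf(rx);
plot(abs(y))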
Intel Agilex FPGA-in-the-Loop Simulation: Enabling DSP Emulation for Space-BACN
The U.S. Defense Advanced Research Projects Agency (DARPA) Space-Based Adaptive Communications Node (Space-BACN) program will provide communication in hard-to-reach areas by using a network of small low-Earth orbit satellites as relay stations to forward and amplify signals from ground-based radios, aircraft, and other communication systems, essentially acting as a space-based repeater system to extend the range of communication systems that would be limited by the curvature of the earth or interference from terrain. In the development phase, DARPA has selected Intel, among others, to design a reconfigurable optical modem that will support both current and new communication standards and protocols to enable interoperability among satellite constellations. Intel is developing its optical modem solution by bringing together experts from its field-programmable gate array (FPGA) product group, packaging technologists from its Assembly Test Technology Development (ATTD) division, and researchers from Intel Labs. Based on its leading-edge low-power Intel® Agilex™ FPGA, Intel is designing new chiplets that will be integrated, using Intel’s embedded multi-die interconnect bridge (EMIB) and advanced interface bus (AIB) packaging technologies, into a single multi-chip package (MCP). To ensure first silicon success, Intel is using FPGA-in-the-loop technologies in MATLAB® and Simulink®. This enables the verification of HDL implementations directly against algorithms in MATLAB or Simulink, the application of data and test scenarios from MATLAB or Simulink to the HDL design on the FPGA, and the integration of existing HDL code with models under development in MATLAB or Simulink. Once completed, Space-BACN will be used for humanitarian assistance, disaster relief, and operations in denied areas. Communication barriers that have long hindered operations in difficult environments will be overcome, allowing for greater efficiency and connectivity.
Optimizing the Design and Operation of Radar and Antenna Systems in MATLAB
Multifunction radar systems perform many tasks including search and track, classification of targets, communications, environmental assessments, and interference mitigation. Engineers designing these systems must deal with multiple design challenges:
- RF spectrum congestion: 5G applications are pushing wireless systems to utilize higher frequency bands, which causes interference challenges for radar applications.
- Resource allocation: Radar systems have limited resources (e.g., bandwidth, transmit energy/time budget, computational resources) that must be allocated to multiple tasks to achieve mission objectives.
- Physical design and cost constraints: For example, designers must minimize the antenna size while maintaining the impedance matching and maximizing the antenna gain.
In this talk, we explore how engineers can address these radar design challenges with automated design optimization workflows using the Optimization Toolbox™. We will cover several examples:
- Optimal beam pattern synthesis to null the power in the direction of interfering signals using Phased Array System Toolbox™
- Optimal resource allocation for a multisector surveillance radar using Radar Toolbox
- Antenna design optimization using Antenna Toolbox™
Transforming Wireless System Design with MATLAB and NI
Wireless communication, radar systems, software-defined radio (SDR), and instrumentation are all highly intricate areas of technology that require advanced mathematical and computational techniques for their design, simulation, and implementation. Engineers and researchers can utilize software and hardware tools from MathWorks and NI to facilitate this process, which enables the characterization, design, simulation, testing, and prototyping of real-world systems for over-the-air testing. These tools, based on flexible COTS systems, allow for the development and real-time testing of signals spanning a wide range of wireless standards such as 5G, LTE, WiFi, FMCW, and pulse radar; the characterization of power amplifiers used for digital predistortion (DPD); and the development of narrow-band bursty waveforms such as Automatic Dependent Surveillance-Broadcast (ADS-B). The integration of mathematical modeling, simulation, code generation, and hardware connectivity capabilities in MATLAB with NI's expertise in data acquisition, instrument control, and real-time testing ensures that any system can be effectively tested in challenging real-world scenarios. This results in a comprehensive and efficient workflow for engineers and researchers across multiple domains, allowing for a streamlined design and implementation process without the need for extensive knowledge of the underlying hardware.
Inclusive Innovation: Technology by and for All
Are you trying to bring diversity, equity, and inclusion to your organization? At #MATLABEXPO, hear a discussion between an engineer, a founder, and an investor who have done so to drive innovation in their organizations.
AMER Inclusive Innovation: Technology by and for All
Inclusive innovation is of fundamental importance to develop engineering solutions that support the needs of diverse communities. How do we build learning environments and workplaces that recognize community needs in order to engineer inclusive solutions? Hear thought leaders discuss the needs, challenges, and social factors that impact their organizations as they drive innovation.