Proceedings
Featured Presentations
Keynote Presentations
Data Science and Predictive Analytics
Jannat Manchanda, Mahindra & Mahindra Ltd
Saifee Aliakbar, Mahindra & Mahindra Ltd
Deep Learning and Autonomous Systems
Kaushik Vasanth, NXP India Pvt. Ltd.
Systems Modeling, Implementation, and Verification
Suchay Sawardekar, Schindler
Nabal Pandey, Mahindra & Mahindra Limited
Pranoti Joshi, Whirlpool of India Ltd
Signal Processing Systems: From Design to Implementation
Dr. Sreenath Ramanath, Lekha Wireless
Vishal Kumar Patil, Lekha Wireless
Master Classes
Customer Poster Presentations
Beyond the “I” in AI
Mike Agostini, MathWorks
Insight. Implementation. Integration.
AI, or artificial intelligence, is transforming the products we build and the way we do business. It also presents new challenges for those who need to build AI into their systems. Creating an “AI-driven” system requires more than developing intelligent algorithms. It also requires:
- Insights from domain experts to generate the tests, models, and scenarios required to build confidence in the overall system
- Implementation details including data preparation, compute-platform selection, modeling and simulation, and automatic code generation
- Integration into the final engineered system
Mike Agostini demonstrates how engineers and scientists are using MATLAB® and Simulink® to successfully design and incorporate AI into the next generation of smart, connected systems.
What's New in MATLAB and Simulink
Prashant Rao, MathWorks
Learn about new capabilities in the MATLAB® and Simulink® product families to support your research, design, and development workflows. This talk highlights features for deep learning, wireless communications, automated driving, and other application areas. You will see new tools for defining software and system architectures, and modeling, simulating, and verifying designs.
Customer Keynote - Bangalore
AI Techniques in MATLAB for Signal, Time-Series, and Text Data
Dr. Shayoni Datta, MathWorks
Developing predictive models for signal, time-series, and text data using artificial intelligence (AI) techniques is growing in popularity across a variety of applications and industries, including speech classification, radar target classification, physiological signal recognition, and sentiment analysis.
In this talk, you will learn how MATLAB® empowers engineers and scientists to apply deep learning beyond the well-established vision applications. You will see demonstrations of advanced signal and audio processing techniques such as automated feature extraction using wavelet scattering and expanded support for ground truth labelling. The talk also shows how MATLAB covers other key elements of the AI workflow:
- Use of signal preprocessing techniques and apps to improve the accuracy of predictive models
- Use of transfer learning and wavelet analysis for radar target and ECG classification
- Interoperability with other deep learning frameworks through importers and ONNX converter for collaboration in the AI ecosystem
- Scalability of computations with GPUs, multi-GPUs, or on the cloud
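To make the wavelet-scattering point above concrete, here is a minimal MATLAB sketch of automated feature extraction; the signal, sampling rate, and sizes are placeholder assumptions rather than material from the talk.

% Wavelet scattering feature extraction (Wavelet Toolbox); data below is synthetic
fs = 250;                                  % assumed sampling rate, e.g., for an ECG record
x  = randn(4096,1);                        % stand-in for a recorded physiological signal
sf = waveletScattering('SignalLength',numel(x),'SamplingFrequency',fs);
features = featureMatrix(sf, x);           % scattering coefficients usable as classifier inputs
size(features)                             % dimensions depend on the default scattering settings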
waterSenz: A Digital Water Management System
Gnarus is a young technology company based in Bengaluru that strives to bring awareness and help people leverage technology in the chores and things that matter in their day-to-day life. The Internet of Things (IoT) offers limitless possibilities for building solutions to the challenges faced in day-to-day life. The team at Gnarus is building innovative products and solutions in this space to create a simpler and smarter life for its customers. Gnarus's flagship product, waterSenz™ (indigenously designed and developed), is an integrated water management system for small-to-large residential and commercial complexes. It combines contact-less sensors, wireless connectivity, and cloud-hosted software, all tightly integrated for simple deployment.
Industrial IoT and Digital Twins
Pallavi Kar, MathWorks
Industrial IoT has brought about the rise of connected devices that stream information and optimize operational behavior over the course of a device's lifetime.
This presentation covers how to develop and deploy MATLAB® algorithms and Simulink® models as digital twin and IoT components on assets, edge devices, or cloud for anomaly detection, control optimization, and other applications. It includes an introduction to how assets, edge, and OT/IT components are connected.
The talk features customer use cases starting from design to final operation, the underlying technology, and results.
Developing and Deploying Machine Learning Solutions for Embedded Applications
Nitin Rai, MathWorks
Machine learning is a powerful tool for solving complex modeling problems across a broad range of industries. The benefits of machine learning are being realized in applications everywhere, including predictive maintenance, health monitoring, financial portfolio forecasting, and advanced driver assistance. However, developing predictive models for signals obtained from sensors is not a trivial task. Moreover, there is an increasing need for smart sensor signal processing algorithms that can be deployed either on edge nodes and embedded devices or on the cloud, depending on the application. MATLAB® and Simulink® provide a platform for exploring and analyzing time-series data and a unified workflow for embedded software development, from prototyping to production, including C code generation, processor-in-the-loop testing, and rapid prototyping on popular hardware platforms.
In this talk, you will learn about:
- Time-frequency feature extraction techniques, such as wavelets, for machine learning workflows
- Automatic C code generation for preprocessing, feature extraction, and machine learning algorithms
- Rapid prototyping on embedded hardware such as Raspberry Pi™ and Android™
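As a rough sketch of the C code generation step for a trained model (the data, model type, and entry-point function are assumptions, not the talk's workflow):

% Train a placeholder classifier and save it for code generation
X = randn(200,6);  Y = categorical(randi(2,200,1));
mdl = fitcsvm(X, Y);
saveLearnerForCoder(compact(mdl), 'faultModel');
% Contents of a hypothetical entry-point file, predictFault.m:
%   function label = predictFault(features) %#codegen
%   mdl = loadLearnerForCoder('faultModel');
%   label = predict(mdl, features);
%   end
codegen predictFault -args {zeros(1,6)}    % emits portable C code for the predict step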
Innovative Method of Deploying MATLAB Based Applications Across an Organization Using MathApps, a Web-Based Platform
Chandrakant Deshmukh, Mahindra & Mahindra Ltd
Jannat Manchanda, Mahindra & Mahindra Ltd
Saifee Aliakbar, Mahindra & Mahindra Ltd
At Mahindra & Mahindra Ltd, component-level design calculations are developed with the help of domain experts to predict key performance parameters. These design calculations are converted into easy-to-use GUIs for input and output interactions. Various MathWorks toolboxes and best practices, such as automated report generation, are integrated with these applications for an improved user experience. The applications are made accessible to designers through a web browser and can be run from their workstations without the need to install additional software. The applications are hosted on a web platform, called MathApps, which serves as a central hub of applications. All applications are restricted to internal use through integration with a server-based database and two-stage authentication. Applications are open to all designers across multiple product development centers. The platform covers diverse application domains ranging from simple spreadsheet-based calculations to multiphysics and nonlinear simulations based on beam theory, test data processing, and data analytics. The website interface enables easy retrieval of calculations, with quick search and filter options. MathApps avoids duplication of effort through a single central repository and paves the way for knowledge management. In conclusion, the innovative platform is transforming the way in which concept calculations are performed and has the potential to be a disruptive force in CAE democratization.
Building and Sharing Desktop and Web Apps
Dr. Lakshminarayan Viju Ravichandran, MathWorks
After algorithms are developed in MATLAB®, engineers often build user interfaces (apps) that can then be distributed across entire organizations. These apps enable teams to automate workflows, get quick quantitative analysis, avoid human error, and collaborate more effectively. Apps and components can be shared both as standalone desktop applications and as software components that integrate with web and enterprise applications.
In this talk, you will learn how to:
- Develop responsive MATLAB applications with rich data visualizations
- Share these applications with other MATLAB users and non-MATLAB users
- Deploy MATLAB applications to enterprise production systems and the web
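A minimal sketch of a programmatic MATLAB app (not an app from the talk); the layout and callback are illustrative only, and such a figure could later be packaged with MATLAB Compiler products for desktop or web sharing:

fig = uifigure('Name','Gain Explorer','Position',[100 100 420 420]);
ax  = uiaxes(fig,'Position',[30 100 360 300]);
sld = uislider(fig,'Position',[40 60 340 3],'Limits',[1 10],'Value',1);
t = linspace(0,1,500);
sld.ValueChangedFcn = @(src,evt) plot(ax, t, src.Value*sin(2*pi*5*t));  % redraw on slider change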
Predictive Maintenance with MATLAB
Amit Doshi, MathWorks
Predictive maintenance reduces operational costs for organizations running and manufacturing expensive equipment by predicting failures from sensor data. However, identifying and extracting useful information from sensor data is a process that often requires multiple iterations as well as a deep understanding of the machine and its operating conditions.
In this talk, you will learn how MATLAB® and Predictive Maintenance Toolbox™ combine machine learning with traditional model-based and signal processing techniques to create hybrid approaches for predicting and isolating failures. You will also see built-in apps for extracting, visualizing, and ranking features from sensor data without writing any code. These features can then be used as condition indicators for fault classification and remaining useful life (RUL) algorithms.
Predictive maintenance algorithms make the greatest impact when they are developed for a fleet of machines and deployed in production systems. This talk will show you how to validate your algorithms, and then integrate them with your embedded devices and enterprise IT/OT platforms.
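For illustration only, the sketch below computes a few common condition indicators from segmented vibration records and trains a classifier; the data, segment length, and labels are made up:

fs = 10e3;
segments = randn(50, fs);                              % placeholder: 50 one-second vibration records
feat = [rms(segments,2), kurtosis(segments,1,2), peak2peak(segments,2)];
labels = categorical(randi(2,50,1), [1 2], {'healthy','faulty'});   % assumed labels
mdl = fitcensemble(feat, labels);                      % fault classifier driven by condition indicators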
Automated Driving System Design and Simulation
Dr. Amod Anandkumar, MathWorks
ADAS and autonomous driving systems are redefining the automotive industry and changing all aspects of transportation, from daily commutes to long-haul trucking. MATLAB® and Simulink® provide the ability to develop the perception, planning, and control components used in these systems.
In this talk, you will learn about these tools through examples that ship in R2019a, including:
- Perception: Design LIDAR, vision, radar, and sensor fusion algorithms with recorded and live data
- Planning: Visualize street maps, design path planners, and generate C/C++ code
- Controls: Design a model-predictive controller for traffic jam assist, test with synthetic scenes and sensors, and generate C/C++ code
- Deep learning: Label data, train networks, and generate GPU code
- Systems: Simulate perception and control algorithms, as well as integrate and test hand code
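The following minimal sketch, loosely in the spirit of the scenario tooling above, builds a two-vehicle scenario with Automated Driving Toolbox; the road geometry, waypoints, and speeds are arbitrary assumptions:

scenario = drivingScenario;
road(scenario, [0 0; 200 0], 'Lanes', lanespec(2));    % 200 m straight two-lane road
ego = vehicle(scenario, 'ClassID', 1);
trajectory(ego, [5 -2 0; 195 -2 0], 15);               % waypoints [x y z] and speed in m/s
lead = vehicle(scenario, 'ClassID', 1);
trajectory(lead, [30 -2 0; 195 -2 0], 10);
plot(scenario)
while advance(scenario)
    pause(0.01);                                       % step the simulation and animate
end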
LiDAR-Based Exploration of Unknown Indoor Space by a Robotic System
Deepak Agarwal, EbyT Technologies Pvt. Ltd.
The exploration of unknown environments can be the fundamental problem for mobile robots, as it involves all the basic capabilities of such systems (e.g., perception, planning, localization, and navigation). From a practical viewpoint, exploration is a central task in many applications, such as planetary missions, intervention in hostile areas, and automatic map building.
This presentation focuses on a technique known as frontier-based exploration. The rationale of this approach is that the robot must move towards the boundary (the frontier) between safely explored areas and unknown territory to maximize the information gain from new perceptions. The talk discusses how to explore indoor spaces in the absence of a predefined map, how to navigate the space while avoiding static obstacles, and how to implement the exploration and navigation modules with ROS in MATLAB® and simulate the virtual robot, Husky, in ROS Gazebo using Robotics System Toolbox™.
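As a rough sketch of the ROS connectivity involved (the master address, topics, and thresholds are assumptions, not details from the talk), a MATLAB node could read a laser scan and issue a velocity command like this:

rosinit('192.168.1.10');                  % hypothetical ROS master address
laserSub = rossubscriber('/scan');        % LiDAR scan topic published by the robot
velPub   = rospublisher('/cmd_vel', 'geometry_msgs/Twist');
velMsg   = rosmessage(velPub);
scan = receive(laserSub, 10);             % wait up to 10 s for a scan
ranges = scan.Ranges;
if min(ranges(~isnan(ranges))) > 0.5      % nothing within 0.5 m of the robot
    velMsg.Linear.X = 0.2;                % keep moving toward the frontier
else
    velMsg.Angular.Z = 0.5;               % rotate to search for free space
end
send(velPub, velMsg);
rosshutdown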
LiDAR Processing for Automated Driving
Avinash Nehemiah, MathWorks
The use of LiDAR as a sensor for perception in Level 3 and Level 4 automated driving functionality is gaining popularity. MATLAB® and Simulink® can acquire and process LiDAR data for algorithm development for automated driving functions such as free space and obstacle detection. With the point-cloud processing functionality in MATLAB, you can develop algorithms for LiDAR processing, and visualize intermediate results to gain insight into system behavior.
This talk shows new capabilities including:
- Acquiring live and offline data from Velodyne® sensors
- Registering LiDAR point clouds
- Segmenting objects and detecting obstacles
- Applying deep learning to LiDAR data
- Generating C/C++ and CUDA® code from LiDAR processing algorithms
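A minimal sketch of these point-cloud steps, assuming a hypothetical Velodyne HDL-32E capture file and arbitrary thresholds:

veloReader = velodyneFileReader('drive.pcap', 'HDL32E');         % hypothetical recording
ptCloud = readFrame(veloReader, 1);
[~, inlierIdx, outlierIdx] = pcfitplane(ptCloud, 0.2, [0 0 1]);  % ground plane within 20 cm
obstacles = select(ptCloud, outlierIdx);
labels = pcsegdist(obstacles, 0.5);                              % Euclidean clustering, 0.5 m threshold
pcshow(obstacles.Location, labels)                               % color points by cluster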
Develop and Test Vehicle Controllers for ADAS and Automated Driving Applications Through System Simulation
Abhisek Roy, MathWorks
When developing and testing sensor fusion algorithms and vehicle controllers for ADAS and automated driving applications, engineers need to consider the complex interplay between multiple sensors and dynamics of the vehicle, while the vehicle operates in a wide variety of scenarios, many of which may not have been available in recorded data during development. With multidomain system modeling and simulation-based testing in MATLAB® and Simulink®, you can close the loop in simulation, thus accelerating the development process and reducing costly and hazardous in-vehicle testing.
In this talk, you will learn how to:
- Design model predictive control-based vehicle controllers
- Create synthetic scenarios and test sensor fusion and control algorithms using system simulation
- Improve simulation fidelity with gaming engine integration, vehicle dynamics modeling, and automated scenario creation from recorded data
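For illustration, the sketch below configures a small model predictive controller in MATLAB; the plant, horizons, and limits are simplified assumptions and not the traffic-jam-assist design from the talk:

Ts = 0.1;
plant = c2d(ss(tf(1, [0.5 1 0])), Ts);         % simplified longitudinal dynamics (assumption)
mpcobj = mpc(plant, Ts);
mpcobj.PredictionHorizon = 20;
mpcobj.ControlHorizon = 2;
mpcobj.MV(1).Min = -3;  mpcobj.MV(1).Max = 2;  % acceleration limits in m/s^2
xmpc = mpcstate(mpcobj);
y = 0;  ref = 1;
for k = 1:50
    u = mpcmove(mpcobj, xmpc, y, ref);         % optimal control move at this step
    y = y + Ts*u;                              % crude plant update, for illustration only
end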
Deploying Deep Neural Networks to Embedded GPUs and CPUs
Dr. Rishu Gupta, MathWorks
Designing and deploying deep learning and computer vision applications to embedded GPU and CPU platforms like NVIDIA® Jetson AGX Xavier™ and DRIVE AGX is challenging because of resource constraints inherent in embedded devices. A MATLAB® based workflow facilitates the design of these applications, and automatically generated C/C++ or CUDA® code can be deployed to achieve up to 2X faster inference than other deep learning frameworks.
This talk walks you through the workflow. Starting with algorithm design, you can employ deep neural networks augmented with traditional computer vision techniques which can be tested and verified within MATLAB. Bring live sensor data from peripheral devices on your Jetson/DRIVE platforms to MATLAB running on your host machine for visualization and analysis. Train your deep neural networks using GPUs and CPUs on the desktop, cluster, or cloud. Finally, GPU Coder™ and MATLAB Coder™ generate portable and optimized CUDA and/or C/C++ code from the MATLAB algorithm, which is then cross-compiled and deployed to Jetson or DRIVE, ARM®, and Intel® based platforms.
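A hypothetical sketch of the GPU code generation step (the entry-point name, network, and input size are assumptions):

cfg = coder.gpuConfig('lib');                               % generate a CUDA static library
cfg.TargetLang = 'C++';
cfg.DeepLearningConfig = coder.DeepLearningConfig('cudnn');
% Contents of a hypothetical entry point, classifyFrame.m:
%   function out = classifyFrame(img) %#codegen
%   persistent net
%   if isempty(net), net = coder.loadDeepLearningNetwork('resnet50'); end
%   out = predict(net, img);
%   end
codegen -config cfg classifyFrame -args {ones(224,224,3,'single')}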
Systems Engineering: Requirements to Architecture to Simulation
Gaurav Dubey, MathWorks
Systems engineering and model-based systems engineering can mean different things to different groups, but most definitions share a common set of concepts, including starting from a set of system-level requirements that drive a system decomposition and requirements allocation process. Trade-off studies are then performed on system architecture alternatives to produce a candidate architecture from which the design is developed and then simulated to verify that the requirements are met.
This presentation shows how MathWorks tools can support this workflow by allowing users to:
- Capture, view, analyze, and manage requirements
- Develop a system architecture model from the requirements, existing Simulink® models, ICDs, and externally created architectures or combinations of the above
- Examine the system architecture model using different views for different concerns
- Allocate (link) requirements to architectural components and perform coverage and change impact analysis
- Perform trade studies to compare, assess, or optimize the system architecture
- Design components specified in the system architecture model
- Simulate the system composition to verify system-level behavior
Model-Based Design of Smart Electric Vehicle
Ather Energy has set out to change the way people perceive electric vehicles. Everyone knows that electric transport is the future, but the lack of good products in that space in the country has been hindering faster adoption of electric vehicle technology. A good product has multiple dimensions that need to be met across performance, user experience, and customer service. The Ather 450, along with the charging infrastructure Point, is the first product out of their factory that looks to meet these dimensions.
The scooter, along with the charging infrastructure, creates a complex ecosystem of devices where the interaction between them needs to be handled well to optimize performance. The company understood the complexity of the system early on, and for them, it was an easy decision to use MathWorks products. The physical model of the system and the algorithms developed went a long way in helping Ather Energy understand and optimize performance faster.
Developing a Battery Management System Using Simulink
Prasanna Deshpande, MathWorks
Battery management systems (BMS) ensure maximum performance, safe operation, and optimal lifespan of battery pack energy storage systems under diverse charge-discharge and environmental conditions. With Simulink®, engineers can use simulations to model feedback and supervisory control algorithms that monitor cell voltage and temperature, estimate state-of-charge (SOC) and state-of-health (SOH) across the pack, control charging and discharging rates, balance SOC across the battery cells, and isolate the battery from source and load when necessary. From early design tradeoffs to hardware-in-the-loop (HIL) testing of BMS hardware, Simulink helps engineers perform desktop simulations to ensure the BMS performs as intended under all desired operating conditions and meets design durability requirements. In this talk, you’ll learn how Simulink helps engineers from electrical, thermal, and software backgrounds collaborate throughout the development cycle of BMS algorithms.
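As a toy illustration of one ingredient of such algorithms, here is a basic Coulomb-counting SOC estimate in MATLAB; the capacity, current profile, and sample time are made-up values:

capacity_Ah = 2.5;  dt = 1;                  % assumed cell capacity (Ah) and sample time (s)
current = 1.2*ones(1,3600);                  % assumed constant 1.2 A discharge for one hour
soc = zeros(1, numel(current));  soc(1) = 1; % start fully charged
for k = 2:numel(current)
    soc(k) = soc(k-1) - current(k)*dt/(capacity_Ah*3600);
end
plot(soc)                                    % SOC drops by roughly 48% over the hour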
Automated Physical Model Verification Framework Using Simulink Test
Maheshwar Dewangan, Schindler
Amrut Ingale, Schindler
Model-based engineering is widely used in industry to build system architectures and engineering applications that enable virtual analysis and qualification. Within Schindler, physics-based generic system models are developed that serve as the basis for model-based validation, worst-case to fit-for-purpose development, and the Executable Elevator Body of Knowledge. The model captures the dynamics of the mechanical components (rope, buffer, and overspeed governor) that are driven by electrical components (the motor). The motor is controlled by the behaviour model of the main elevator control. Custom Simscape™ blocks are used to accurately model the physics of the mechanical components. The elevator controller, drive, and logical components are modelled using Simulink® and MATLAB scripts.
The model development cycle follows the SCRUM methodology. In a sprint of four weeks, the development focus is to improve the accuracy of the physical models as well as to include additional features and variants of the main controller. This generic model, which involves various configurations of elevator control components, mechanical components, and onsite installation (number of floor levels), needs to be verified prior to the sprint release. The challenge is to automate verification tests of models that involve time-variant physical signals.
An automated test framework called Model’s Automated Test Harnesses (MATHS) has been developed using Simulink Test™ to perform system-level tests and model-in-the-loop (MIL) tests for subsystem components. Simulink Test provides a good platform to create test harnesses and write corresponding test cases. With new developments, the interfaces and, to some extent, the logic of the models change. Therefore, it becomes necessary to create a framework that allows maximum reuse of the test harnesses and minimum rework in defining test cases.
Developing and Implementing Digital Control for Power Converters
Naini Dawar, MathWorks
Using a buck-boost power converter example, this talk explains how Simulink® and Simscape Electrical™ are used to develop, simulate, and implement a controller that maintains the desired output voltage in the presence of input voltage variations and load changes and achieves a fast, stable response. The presentation covers:
- Modeling passive circuit elements, power semiconductors, and varying power sources and loads
- Simulating the converter in continuous and discontinuous conduction modes
- Determining power losses and simulating thermal behavior of the converter
- Tuning the controller to meet design requirements such as rise time, overshoot, and settling time
- Generating C code from the controller model for implementation on a Texas Instruments™ C2000™ microcontroller
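For illustration, the controller tuning step could be prototyped on a small-signal model as in the sketch below; the transfer function is an assumed averaged model, not the Simscape plant from the talk:

Gvd = tf(20, [1e-3 0.12 1]);                 % assumed duty-cycle-to-output-voltage model
C = pidtune(Gvd, 'PI', 2*pi*1e3);            % target roughly 1 kHz crossover
T = feedback(C*Gvd, 1);
stepinfo(T)                                  % check rise time, overshoot, and settling time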
Simplifying Requirements-Based Verification with Model-Based Design
Vamshi Kumbham, MathWorks
With Model-Based Design, informal textual requirements can be modeled and simulated to verify behavior earlier and then be automatically generated into code for an embedded target. Such requirements can include temporal properties that define complex timing-dependent signal logic, and informal text can be incomplete or inconsistent, which can lead to errors and miscommunication in design and test.
This talk shows you how you can model requirements and use the Logical and Temporal Assessments editor in Simulink Test™ to translate informal text requirements into unambiguous assessments with clear, defined semantics that can identify inconsistencies. The temporal assessment language, based on metric temporal logic, provides precise, formal semantics that are highly expressive and extensible for authoring readable assessments. You will learn how to enter assessments with conditions, events, signal values, delays, and responses using the interactive form-based editor. You can view an assessment as an English-like statement that is easy to understand, or view graphical representations that allow you to visualize the results and debug design errors.
Enterprise-Scale Software Verification for C/C++ Code
Vaishnavi H.R., MathWorks
Polyspace® products help you statically verify embedded software written in C and C++. They can find bugs and prove the absence of overflow, divide-by-zero, out-of-bounds array access, and other run-time errors, and they check compliance with C/C++ coding standards such as MISRA® and AUTOSAR C++14. Recent developments in Polyspace products help development teams improve software quality, safety, and security across their enterprise. Analysis execution is automated using continuous integration tools such as Jenkins™. Results can be published for web browser-based code review to triage and resolve coding errors. Integration with defect tracking tools like Jira helps manage identified defects. Dashboards display information that development managers can use to monitor software quality, project status, number of defects, and code metrics.
Design, Analysis, and Verification of 5G NR Waveforms Using MATLAB
Tabrez Khan, MathWorks
Dr. Sreenath Ramanath, Lekha Wireless
Vishal Kumar Patil, Lekha Wireless
Development of 5G products is accelerating with the first device and network deployments in 2019. 5G New Radio (NR) technology introduces a flexible architecture that will enable the ultra-fast, low-latency communications needed for next-generation mobile broadband networks and applications, such as connected autonomous cars, smart buildings and communities, digital healthcare, industrial IoT, and immersive education. The flexibility of the 5G NR standard will make design and test more complex. In this talk, learn about key capabilities of 5G Toolbox™ and how Lekha Wireless used 5G Toolbox™ for independent design, analysis, and verification during 5G NR Physical layer stack development.
Simulink for the Analysis of Self-Interference Cancellation in Full-Duplex Transceiver
Dr. Radha Krishna Ganti, IIT Madras
Full-duplex (FD) communications is a lucrative way to double the available system bandwidth. However, self-interference is a major impediment to realizing an FD node. The interference must be cancelled in the RF, analog, and baseband domains, which requires simulation of the entire RF chain along with impairments.
Design and Verification of Mixed-Signal and SerDes Systems
Aniruddha Dayalu, MathWorks
The design and integration of mixed-signal integrated circuits is becoming increasingly challenging due to analog effects that cannot be neglected and to complex embedded digital signal processing algorithms and control logic.
This talk introduces workflows in MATLAB® and Simulink® that help achieve fast system-level simulation speed, perform comprehensive design space exploration, and develop high-quality behavioral models. You will see a realistic PLL example that shows the entire design and simulation workflow. Through practical examples, you will also learn how to reuse a Simulink model or a MATLAB function in EDA tools to develop high quality verification testbenches.
Development of Multi-Target Tracker for Surveillance Radar Using MATLAB
Srihari B R, BEL
A multi-target tracker developed for surface target tracking in a coastal surveillance scenario needs a robust track initiation and maintenance algorithm for tracking small targets and reducing the false alarm rate in the presence of sea clutter. The multi-object trackers in Sensor Fusion and Tracking Toolbox™ are evaluated with time-stamped detections recorded from the field. The GNN and TOMHT tracker module parameters are tuned by varying track initiation methods, estimation filters, thresholds, and hypothesis maintenance parameters. The resulting track data from each approach is benchmarked against recorded track data from the OEM system in the field. The performance of the MATLAB® tracker is compared with reference to track ID maintenance; accuracy of estimated position, speed, and course of targets; and time taken to process each scan of data. The results from the multi-track-line GNN and TOMHT approaches with customized EKF and IMM-EKF are found satisfactory with respect to track maintenance and kinematics. To improve processing time, sub-optimal assignment methods and custom cost-matrix computation are tried out. The covariance fusion function in the toolbox is used for correlating similar confirmed tracks from different tracking lines and removing duplicate tracks from the confirmed track list. MATLAB Coder™ is used to generate a C++ library with tuned tracker parameters. A wrapper function is written in C++ for interfacing radar detection input and track output to the display. Further, custom enhancements are incorporated in the C++ deployable code to improve track maintenance and processing time per scan of data. The deployable tracker module is interfaced with a radar in the field, and performance evaluation is in progress. An attempt is made to tune the tracker parameters generated in the C++ library to adapt to prevailing environmental conditions based on input data attributes. A display interface is provided for the operator to tune thresholds to reduce the false alarm rate.
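As a generic illustration of the toolbox API referenced above (and not the BEL configuration), a GNN tracker can be created and updated with synthetic detections as follows; all thresholds and measurements are made up:

tracker = trackerGNN('FilterInitializationFcn', @initcvekf, ...
    'AssignmentThreshold', 30, 'ConfirmationThreshold', [2 3]);
dets = {objectDetection(0, [100; 50; 0]), objectDetection(0, [400; -20; 0])};
tracks = tracker(dets, 0);                   % update at t = 0 s
dets = {objectDetection(1, [102; 52; 0]), objectDetection(1, [398; -18; 0])};
tracks = tracker(dets, 1);                   % tracks confirm after consistent hits
disp([tracks.TrackID])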
Sensor Fusion and Tracking for Next-Generation Radars
Abhishek Tiwari, MathWorks
The technology required to design and field a multifunction radar drives a corresponding increase in system-level complexity. In addition to performing multiple radar-related tasks such as search and track, these systems are often called upon to perform other functions such as weather observation, communications, or electronic warfare, all with the same hardware resources. To meet the desire for shorter development cycles, modeling and simulation early in this type of project can lower risk and accelerate the pace of the project.
With these trends, sensor fusion and tracking technology is central to data processing and resource management. You can extend your signal processing workflows to directly integrate with data processing algorithms, which also results in efficient resource management and control.
In this talk, you will learn to design, simulate, and analyze systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness for surveillance systems. Through several examples, you will see how to:
- Define and import scenarios and trajectories for simulation
- Generate synthetic detection data for radar, EO/IR, sonar, and RWR sensors, along with GPS/IMU sensors for localization
- Design data association algorithms for real and synthetic data
- Perform “what-if” analysis on tracker configurations
- Evaluate system accuracy and performance with standard benchmarks, metrics, and animated plots
- Model a closed-loop, multifunction radar that performs search and track
Design and Verification of DVB-RCS to DVB-S2 On-Board Processing Payload Using MATLAB and Simulink
Dr. Deepak Mishra, Space Applications Centre (SAC), ISRO
An open-standard-compatible on-board processing system is under development, which provides mesh connectivity between different user terminals. The system is designed to be compatible with the Digital Video Broadcasting - Return Channel via Satellite (DVB-RCS) protocol in the uplink and Digital Video Broadcasting - Satellite - Second Generation (DVB-S2) in the downlink channels. Due to the widespread use of these protocols in DTH, low-cost compatible terminals are widely available in the market. The main elements of this system are the DVB-RCS burst demodulator and decoder, protocol converter, baseband processor with channel estimator, and DVB-S2 modulator.
A complete Simulink® model of the proposed payload has been created, in which DVB-RCS and DVB-S2 baseband frames are generated, modulated, and demodulated as per the standards. The simulation includes complete end-to-end BER verification. During this development, the following modules were developed in Simulink.
A novel 2x-rate RRC filter was developed using MATLAB®. This Nyquist sampling-based RRC filter includes sin(x)/x equalization. The proposed filter has linear-phase FIR coefficients and is free from any FPGA/ASIC vendor-specific logic. The clock rate requirement is only twice the data rate, a capability not available in any design tool. A copyright application for the proposed filter has already been filed. The RRC filter IP core has been tested and verified in hardware with a DVB-S2 downlink frame as well as with a standard off-the-shelf satellite modem. The filter has been tested with standard modulation schemes such as QPSK, 8PSK, and 16APSK.
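As a generic illustration of a 2-samples-per-symbol square-root raised cosine filter (the parameters and coefficients below are not the proprietary design described above):

beta = 0.35;  span = 10;  sps = 2;           % assumed roll-off, filter span, and oversampling
b = rcosdesign(beta, span, sps, 'sqrt');     % linear-phase FIR RRC coefficients
symbols = 2*randi([0 1], 1000, 1) - 1;       % placeholder BPSK symbols
txSig = upfirdn(symbols, b, sps);            % pulse shaping at twice the symbol rate
fvtool(b)                                    % inspect the magnitude and phase response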
Adopting Model-Based Design for FPGA, ASIC, and SoC Development
Hitu Sharma, MathWorks
The competing demands of functional innovation, aggressive schedules, and product quality have significantly strained traditional FPGA, ASIC, and SoC development workflows.
This talk shows how you can use Model-Based Design with MATLAB® and Simulink® for algorithm- and system-level design and verification, including how to:
- Verify the functionality of algorithms in the system context
- Refine algorithms with data types and architectures suitable for FPGA, ASIC, and SoC implementation
- Prototype and debug models running live on hardware connected to MATLAB or Simulink
- Generate and re-generate verified design and verification models for the hardware engineering team
- Keep the workflow connected to speed verification closure and meet functional safety requirements
Deep Learning and Reinforcement Learning Workflows in AI
Avinash Nehemiah, MathWorks
AI, or artificial intelligence, is powering a massive shift in the roles that computers play in our personal and professional lives. Two new workflows, deep learning and reinforcement learning, are transforming industries and improving applications such as diagnosing medical conditions, driving autonomous vehicles, and controlling robots.
This talk dives into how MATLAB® supports deep learning and reinforcement learning workflows, including:
- Automating preparation and labeling of training data
- Interoperability with open source deep learning frameworks
- Training deep neural networks on image, signal, and text data
- Tuning hyper-parameters to accelerate training time and increase network accuracy
- Generating multi-target code for NVIDIA®, Intel®, and ARM®
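As a minimal sketch of the training step (the image folder, network choice, and hyperparameters are assumptions), transfer learning in MATLAB might look like this:

imds = imageDatastore('trainingImages', 'IncludeSubfolders', true, ...
    'LabelSource', 'foldernames');                       % hypothetical labeled image folder
[trainSet, valSet] = splitEachLabel(imds, 0.8, 'randomized');
net = alexnet;                                           % pretrained network as a starting point
inputSize = net.Layers(1).InputSize;
augTrain = augmentedImageDatastore(inputSize(1:2), trainSet);
augVal   = augmentedImageDatastore(inputSize(1:2), valSet);
numClasses = numel(categories(imds.Labels));
layers = [net.Layers(1:end-3)                            % reuse all but the final layers
          fullyConnectedLayer(numClasses, 'WeightLearnRateFactor', 10, 'BiasLearnRateFactor', 10)
          softmaxLayer
          classificationLayer];
opts = trainingOptions('sgdm', 'InitialLearnRate', 1e-4, 'MaxEpochs', 5, ...
    'ValidationData', augVal, 'Plots', 'training-progress');
trainedNet = trainNetwork(augTrain, layers, opts);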
Comprehensive Workflow for AUTOSAR Classic and Adaptive Using Model-Based Design
Durvesh Kulkarni, MathWorks
Modeling and code generation for AUTOSAR software components lets you automate the process of specifying and synchronizing lengthy identifiers in designs, code, and description files. Join us to learn about Simulink® support for AUTOSAR, including modeling AUTOSAR Classic and Adaptive software components, simulating AUTOSAR compositions and ECUs, and generating C and C++ production code. A MathWorks engineer will provide a brief overview of the latest AUTOSAR standards, including the Classic and Adaptive Platforms, and provide product demonstrations showing how you can use Simulink, AUTOSAR Blockset™, and Embedded Coder™ to design, simulate, verify, and generate code for AUTOSAR application software components.
Deploying AI Algorithms on the Cloud for Near Real-Time Decision Making
Pallavi Kar, MathWorks
With the increasing popularity of AI, new frontiers are emerging in predictive maintenance and manufacturing decision science. However, there are many complexities associated with modeling plant assets, training predictive models for them, and deploying these models at scale, including:
- Generating failure data: failure data can be difficult to obtain, but physical simulations can be used to create synthetic data covering a variety of failure conditions
- Ingesting high-frequency data from many sensors: time-alignment requirements make it difficult to design a streaming architecture
This talk will focus on building a system to address these challenges using MATLAB®, Simulink®, Apache™ Kafka®, and Microsoft® Azure®. You will see a physical model of an engineering asset and learn how to develop a machine learning model for that asset. To deploy the model as a scalable and reliable cloud service, we will incorporate time-windowing and manage out-of-order data with Apache Kafka.
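The scoring logic itself can be a small MATLAB function designed for deployment; the sketch below uses a simple 3-sigma rule as a stand-in for a trained model and is not the architecture presented in the talk:

function alert = scoreSensorWindow(window)
% Deployable scoring sketch: flag a window of sensor samples as anomalous
mu = mean(window);
sigma = std(window);
alert = any(abs(window - mu) > 3*sigma);   % placeholder anomaly rule
end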
Model-Based Design for Autonomous Aerial Systems
Naga Chakrapani Pemmaraju, MathWorks
Developing autonomous aerial systems requires engineers to analyze the behavior of various subsystems; simulate sensor, perception, and control algorithms as an integrated platform; and deploy to the actual hardware.
Model-Based Design with MATLAB® and Simulink® is a modular development approach that enables engineering teams to move from internal research and development to design and implementation in a single environment.
In this master class, MathWorks engineers will showcase:
- Single environment for building UAVs, from requirements to deployment
- Modeling environmental effects and 6DOF aircraft simulations
- Designing a drone autopilot and testing its performance under simulated flight conditions
- Simulating communication with multiple agents and with the ground station
- Deploying and testing correctness of the flight controller’s generated code
Machine Learning-Based Tool for Predictive Analysis in Computational Pathology
Dinesh Koka, Onward Health
Vineet Sharma, Onward Health
The advantage of leveraging machine learning techniques in digital pathology is that they provide the capability to learn, extract knowledge, and make predictions from a combination of heterogeneous data (i.e., histological images, patient history, and omics data). The ability to mine sub-visual image features (features that sometimes may not be visually discernible) from digital pathology slide images offers the opportunity for better quantitative modeling of disease appearance and, hence, possibly improved prediction of disease aggressiveness and patient outcomes.
Onward Health has built a tool with expert doctor inputs, state-of-the-art computer vision techniques, and machine learning algorithms. The imaging algorithms are designed to detect different types of cells. Once cells are detected, the algorithm also processes each cell to determine various properties, such as shape, texture, color distribution features, and distance to the tumor and other spatial neighbors. The tool employs machine learning techniques to:
- Automatically identify the location of tumor or stroma
- Cluster patients based on their survival probabilities or risk score similarity
- Combine the image features along with other patient-related features, such as age, gender, history, medications, and genetics, to extract different types of insights and inferences
The company's vision is that the tool will enable multiple benefits, including providing cross-verifiable evidence of tumors, improving resource management, and aiding research and knowledge of tumor behavior.
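As a generic, hypothetical illustration of per-cell feature extraction (the image file, thresholding choices, and features are assumptions and not Onward Health's algorithms):

I = imread('slide_patch.png');                   % hypothetical stained-slide image patch
gray = rgb2gray(I);
bw = imbinarize(gray, 'adaptive', 'ForegroundPolarity', 'dark');  % nuclei darker than background
bw = bwareaopen(bw, 30);                         % remove small specks
stats = regionprops('table', bw, gray, 'Area', 'Eccentricity', 'MeanIntensity', 'Centroid');
head(stats)                                      % per-cell shape and intensity features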
Optimization in Energy Management Systems
Dr. Souvick Chatterjee, MathWorks
Energy management systems (EMS) for homes, buildings, factories, and communities are an important part of the trend towards smarter systems, providing better energy system planning, dispatch, resilience, and operation. Systems are used to manage both generation and consumption to optimally respond to variation in demand, market prices, and environmental conditions.
To develop an EMS, an engineer needs to use data analytics, control, simulation, and optimization for:
- Electric demand forecasting
- Electrical system modeling and simulation
- Operations optimization
- Tradeoff analysis
MATLAB® and Simulink® provide an integrated platform with both data analytics and Model-Based Design. You can build predictive models of demand and optimization models to minimize cost in MATLAB. Then, combine these with a system model built with Simulink and Simscape™ that integrates power electronics and controls. Deployment can be to embedded systems, or to enterprise or cloud environments.
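For illustration only, a toy dispatch optimization in MATLAB's problem-based framework could look like the sketch below; the demand profile, tariff, and battery limits are made-up values:

nHours = 24;
demand = 50 + 20*sin((1:nHours)'/24*2*pi);        % assumed hourly demand profile (kW)
price  = 5 + 3*cos((1:nHours)'/24*2*pi);          % assumed time-of-use tariff
gridPower = optimvar('gridPower', nHours, 'LowerBound', 0);
battery   = optimvar('battery', nHours, 'LowerBound', -10, 'UpperBound', 10);
prob = optimproblem('Objective', sum(price .* gridPower));
prob.Constraints.balance = gridPower + battery >= demand;
prob.Constraints.energy  = sum(battery) == 0;     % battery ends the day where it started
sol = solve(prob);
plot([sol.gridPower, sol.battery])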
Developing Autonomous Robots Using MATLAB and Simulink
Dr. Veer Alakshendra, MathWorks
Autonomous robots are being developed in many industries from industrial warehouses to consumer products. This talk demonstrates new features in MATLAB® and Simulink® for the different functional domains of robotics, including hardware design, perception, planning and decision making, and control design. It describes the challenges and walks through a common robotics workflow. Some of the topics that will be covered include:
- Developing kinematic and dynamic models of robots
- Perception algorithm design using deep learning
- Path planning with obstacle avoidance
- Supervisory logic and control using Stateflow®
- Model-Based Design for developing and testing robotics systems
Optimizing Robotic Systems with Simscape
Dr. Veer Alakshendra, MathWorks
Robotic systems are everywhere—manufacturing lines, amusement parks, and even in your house. Optimizing the performance of a robotic system is a complex task that involves mechanical, electrical, and algorithm design. In this presentation, you will see how Simscape™ enables you to model the physical system so that you can minimize power consumption and increase the robustness of your design.
Control Software Development and Testing Using MATLAB
Aditya Chendke, Mahindra & Mahindra Limited
Nabal Pandey, Mahindra & Mahindra Limited
With the increasing complexity and continuous evolution of control systems in the automotive domain, it is becoming tedious to develop and maintain application software with the conventional methodology of handwritten C code, and even more difficult to use that methodology for technology demonstration projects where time is a critical factor. To address this challenge, the model-based development approach is gaining traction among Tier 1 suppliers and OEMs. Automatic verification, validation, and code generation make the development process much more efficient and effective. This not only saves development time but also avoids error-prone hand coding. Error prevention and early error detection are achieved using model-based development.
In this talk, a model-based approach for application software development is discussed in which the control system is developed using Simulink®, Stateflow®, and MATLAB®. For better optimization, standard library functions are created and reused in various sub-modules. Work ranging from requirements traceability to control function development and test-case tagging is done using Simulink. To maintain traceability of the models to system requirements, a description of the functionality is written into the Simulink blocks when modeling is completed. In this way, ambiguity between the developed model and the system requirements is eliminated, and higher coverage is achieved.
5G New Radio Fundamentals: Understanding the Next Generation of Wireless Technology
Tabrez Khan, MathWorks
Development of 5G products is accelerating with the first device and network deployments in 2019. 5G New Radio (NR) technology introduces a flexible architecture that will enable the ultra-fast, low-latency communications needed for next-generation mobile broadband networks and applications such as connected autonomous cars, smart buildings and communities, digital healthcare, industrial IoT, and immersive education. The flexibility of the 5G NR standard will make design and test more complex.
Engineers developing 5G enabling technologies and connected devices need a solid understanding of the fundamental concepts behind the 5G NR specification.
This talk demonstrates the key 5G physical layer technologies and concepts. You will learn about the structure of 5G waveforms; how the waveforms are constructed, modulated, and processed; beam management in massive MIMO systems; and methods for simulating and measuring link-level performance.
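As a small, generic example of working with 5G NR physical signals in MATLAB (the cell ID is arbitrary and this is not content from the talk):

ncellid = 17;
pss = nrPSS(ncellid);                 % primary synchronization signal (127 BPSK symbols)
ssGrid = zeros(240, 4);               % SS/PBCH block grid: 240 subcarriers x 4 OFDM symbols
ssGrid(nrPSSIndices()) = pss;         % standard-defined resource element mapping
imagesc(abs(ssGrid)); axis xy
xlabel('OFDM symbol'); ylabel('Subcarrier')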
Seamless System Design of RF Transceivers and Antennas for Wireless Systems
Vidya Viswanathan, MathWorks
Wireless engineers are pursuing 5G and other advanced technologies to achieve gigabit data rates, ubiquitous coverage, and massive connectivity for applications such as IoT and V2X. The need to improve performance and coexist with multiple communications standards and devices while reducing overall area and power imposes challenging requirements on RF front ends. Gaining insight into such complex systems and performing architectural analysis and tradeoffs require a design model that includes DSP, RF, antenna, and channel elements, as well as impairments.
In this talk, you will learn how to model antenna arrays and integrate them in RF front ends for the development of wireless communications, including:
- Analyzing the performance of antennas with arbitrary geometry
- Performing array analysis by computing coupling among antenna elements
- Modeling the architecture of RF front ends
- Developing baseband and RF beamforming algorithms
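A minimal sketch of the antenna array analysis steps (the element type, spacing, and frequency are assumptions):

fc = 2.4e9;  c = 3e8;
lambda = c/fc;
arr = linearArray('Element', dipole('Length', lambda/2, 'Width', lambda/200), ...
    'NumElements', 4, 'ElementSpacing', lambda/2);
pattern(arr, fc)                        % 3-D directivity pattern of the array
S = sparameters(arr, fc);               % coupling between the array elements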
End-to-End Airborne Radar System and Signal Processor Design
Yogesh Gharote, Honeywell Technology Solutions Lab Pvt Ltd
Honeywell® IntuVue® RDR-4000 weather radar is an advanced radar system capable of providing 3D display of airborne weather hazards. This is a mechanically scanned radar system capable of providing alerts for weather-related hazards like turbulence and wind shear. It uses sophisticated signal processing techniques to process the raw digitized RF data to produce final weather-related warnings. This is a non-phased array radar system, and hence, not directly amenable to modeling using Phased Array System Toolbox™. Still, the toolbox provides the basic “nuts-and-bolts” to build a sophisticated model.
Developing and Deploying Optimization Strategy for Engine Calibrations
Akansha Saxena, Cummins
Diesel engine calibration is a careful balance of emissions while trying to achieve the lowest fuel consumption possible. In a certain instance of trying to achieve this balance, some areas of the calibration yielded enough smoke to cause EGR cooler fouling issues under the right conditions.
This presentation discusses this challenge for Cummins and how the optimization process and a MATLAB® based toolkit were used to improve robustness to smoke while maintaining the rest of the emissions constraints. The Cummins-developed toolkit utilizes many MATLAB toolboxes, including Statistics and Machine Learning Toolbox™, Parallel Computing Toolbox™, Optimization Toolbox™, and MATLAB Compiler™.
Artificial Intelligence Engine Idle Improvement
Prasanta Sarkar & Suchit Pandey, Tata Motors Limited
Presently, engine idle control at Tata Motors Limited is done by a PID controller. The company is working to improve idling quality, targeting a 10% reduction in overshoot, undershoot, saturation time, and fuel consumption compared to production vehicles, using artificial intelligence (AI) tools from MathWorks. The team is able to run AI-based engine models in software-in-the-loop and plans to build them into an ECU and test them in the vehicle.
Model-in-Vehicle Validation Methodology for ADAS Features
Nikhil Khadloya, Cognizant Technology Solutions
Cognizant Technology Solutions has developed an interface for all sensors in MATLAB® and Simulink® and established vehicle communications using a high-end automotive-grade CPU. This CPU allows the company to run the Simulink model in real time and test ADAS features, along with the sensors, on the vehicle in real-world driving scenarios. The company calls this technique model-in-vehicle (MIV) validation. MIV enables in-vehicle verification and validation of algorithms for scenarios that are difficult to replicate in the virtual world, in the early stages of development and in parallel with the classical methodology of model-in-the-loop (MIL), software-in-the-loop (SIL), and hardware-in-the-loop (HIL) testing.
HIL Testing of AMT Control Strategy Using Simscape Plant Models
Ajitsinh A. Yadav, Tata Motors
The control strategy for an automated manual transmission (AMT) is tested using a gearbox model running on a hardware-in-the-loop (HIL) system. The gearbox is part of a parallel hybrid (P2) powertrain configuration. The gearbox plant model uses components from the Simscape™ and Simscape Driveline™ libraries of Simulink®. The plant model can simulate gearshift forces, including synchronization forces; shift and select detent forces; shift fork leverage; rotational and translational inertias; shift sleeve free fly; and dog clutch engagement. The gearshift is performed using two electromechanical actuators, one for the shift rail and the other for the select rail. Detailed models of the actuators are also created using data provided by the supplier. The actuator models are validated against their technical specifications, such as full-load speed, no-load speed, and current drawn at various loads. The control software for the AMT is designed in Simulink and Stateflow® using Model-Based Design. The control software is flashed on a rapid prototyping ECU. The plant model, consisting of the gearbox and the gearshift actuators, runs on the HIL system.
Home Appliances Controls Development Using Model-Based Design
Priti Madurwar, Whirlpool of India Ltd
Pranoti Joshi, Whirlpool of India Ltd
Over the past two years, Model-Based Design has been used extensively at Whirlpool for embedded software development to overcome the various difficulties and complexities that typically arise during the design lifecycle of embedded software for closed-loop control systems. With Model-Based Design, Whirlpool can provide a single design environment so that developers can use a single model across the entire lifecycle for data analysis, model visualization, testing and validation, and, ultimately, product deployment, with or without automatic code generation.
This presentation discusses how Whirlpool has deployed Model-Based Design for controls algorithm development, starting with a single home-appliance platform and spreading across its other product platforms (cooking, dishwasher, and refrigeration), due to the numerous advantages the workflow offers. It also explains how the team has brought maturity into the testing process using different MathWorks tools. Whirlpool has deployed a model-based methodology, following these steps:
- Requirement capturing and review
- High-level and low-level requirements
- Review and finalization
- Importing Dymola Plant Model into Simulink® using FMU Import (FMI Co-Sim and Model Exchange Interface)
- Development of control model using MATLAB®, Stateflow®, and Simulink
- Data management using data dictionaries
- Model verification using Model Advisor
- Model design error checking using Simulink Design Verifier™
- Model validation using Simulink validation and verification tools and Simulink Test™
- Model validation using rapid control prototyping
- Software-in-the-loop checking
- Code generation
5G NR PHY Implementation, Algorithm Design, and New Waveform Research in MATLAB
Sagar Shriram Salwe, Sooktha Consulting Private Limited
The team at Sooktha Consulting Private Limited is developing the physical layer of 5G NR in an agile fashion. Because the physical layer is a key and highly complex component of their base station product, it is extremely important to test the implementation against an established reference before integrating with an RF front end or attempting to interoperate with third-party implementations on the UE side. Further, if they notice issues when interoperating with third-party UEs, they need a reference to validate their implementation in those scenarios. The team expects their MATLAB® implementation to act as the reference that they use both to speed up their implementation and to validate it. Finally, to customize their product for India's rural connectivity or Internet of Things deployments, they may have to define new algorithms or waveforms and need a reliable reference platform to develop and test them. They are planning to use 5G Toolbox™ and other packages as the reliable reference platform on which they can confidently design and analyze different algorithms and waveforms on top of the baseline 3GPP 5G NR technology and contribute to the specifications through TSDSI and 3GPP.
Antenna Array Simulation and Beamforming for the Expanded GMRT
Kaushal D. Buch, Giant Metrewave Radio Telescope, NCRA-TIFR
The Giant Metrewave Radio Telescope (GMRT) is one of the most sensitive instruments in the world for observing celestial objects at radio frequencies. A project proposal called the Expanded GMRT (eGMRT) is aimed at enhancing the scientific capabilities of the GMRT. The eGMRT aims to expand the field of view of the telescope using a multi-element feed at the focus followed by a beamformer. During the prototype development phase, the team at NCRA-TIFR is building an FPGA-based Focal Plane Array (FPA) beamformer in the L-band with 300 MHz bandwidth and the ability to process 30 independent beams using 144 antenna elements.
The development of the FPA beamformer utilizes MATLAB®, Simulink®, and other MathWorks toolboxes for antenna array simulation, optimization of beamformer weights, FPGA implementation, and data analysis. The simulation of the closely spaced Vivaldi antenna array was carried out using Antenna Toolbox™ and Phased Array System Toolbox™. A GUI-based simulation tool was developed for simulating the radiation pattern through user-configurable antenna selection and array configuration. It also helped in understanding the radiation patterns of beams at different offsets from the direction of the main beam and served as a reference for comparison with the results from practical beamformer testing.
The digital design of the beamformer system was carried out using a model-based approach in Simulink. Xilinx® System Generator blocks were used for implementation of the design. This approach significantly reduced the development time and helped in addressing the incremental changes and debugging.
In this presentation, Kaushal D. Buch describes the array simulation technique, the model-based implementation of the beamformer, and recent results from the ongoing prototype development. The presentation also describes how different MathWorks products are useful to a project that combines antenna theory, signal processing, and FPGA design.
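As a generic illustration of phase-shift beamforming with Phased Array System Toolbox (the element count, frequency, and data are assumptions, not the eGMRT design):

fc = 1.4e9;  c = 3e8;                    % L-band carrier frequency and propagation speed
array = phased.ULA('NumElements', 8, 'ElementSpacing', 0.5*c/fc);
bf = phased.PhaseShiftBeamformer('SensorArray', array, ...
    'OperatingFrequency', fc, 'Direction', [30; 0]);   % steer to 30 degrees azimuth
x = randn(1000, 8);                      % placeholder element-level samples
y = bf(x);                               % beamformed output for the steering direction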
waterSenz: A Digital Water Management System
Nandakumar Katta, Gnarus Solutions Private Limited
Gnarus is a young technology company based in Bengaluru that strives to bring awareness and help people leverage technology in the chores and things that matter in their day-to-day life. The Internet of Things (IoT) offers limitless possibilities for building solutions to the challenges faced in day-to-day life. The team at Gnarus is building innovative products and solutions in this space to create a simpler and smarter life for its customers. Gnarus's flagship product, waterSenz™ (indigenously designed and developed), is an integrated water management system for small-to-large residential and commercial complexes. It combines contact-less sensors, wireless connectivity, and cloud-hosted software, all tightly integrated for simple deployment.
Artificial Intelligence Engine Idle Improvement
Prasanta Sarkar, Tata Motors Limited
Suchit Pandey, Tata Motors Limited
Presently, engine idle control at Tata Motors Limited is done by a PID controller. The company is working to improve idling quality, targeting a 10% reduction in overshoot, undershoot, saturation time, and fuel consumption compared to production vehicles, using artificial intelligence (AI) tools from MathWorks. The team is able to run AI-based engine models in software-in-the-loop and plans to build them into an ECU and test them in the vehicle.
Model-Based Design of Smart Electric Vehicle
Shivaram NV, Ather Energy
Ather Energy has set out to change the way people perceive electric vehicles. Everyone knows that electric transport is the future, but the lack of good products in that space in the country has been hindering faster adoption of electric vehicle technology. A good product has multiple dimensions that need to be met across performance, user experience, and customer service. The Ather 450, along with the charging infrastructure Point, is the first product out of their factory that looks to meet these dimensions.
The scooter, along with the charging infrastructure, creates a complex ecosystem of devices where the interaction between them needs to be handled well to optimize performance. The company understood the complexity of the system early on, and for them, it was an easy decision to use MathWorks products. The physical model of the system and the algorithms developed went a long way in helping Ather Energy understand and optimize performance faster.
Development of Multi Target Trackers for Surveillance Radar Using MATLAB
Srihari B R, BEL
A multi-target tracker developed for surface target tracking in a coastal surveillance scenario needs a robust track initiation and maintenance algorithm for tracking small targets and reducing the false alarm rate in the presence of sea clutter. The multi-object trackers in Sensor Fusion and Tracking Toolbox™ are evaluated with time-stamped detections recorded from the field. The GNN and TOMHT tracker module parameters are tuned by varying track initiation methods, estimation filters, thresholds, and hypothesis maintenance parameters. The resulting track data from each approach is benchmarked against recorded track data from the OEM system in the field. The performance of the MATLAB® tracker is compared with reference to track ID maintenance; accuracy of estimated position, speed, and course of targets; and time taken to process each scan of data. The results from the multi-track-line GNN and TOMHT approaches with customized EKF and IMM-EKF are found satisfactory with respect to track maintenance and kinematics. To improve processing time, sub-optimal assignment methods and custom cost-matrix computation are tried out. The covariance fusion function in the toolbox is used for correlating similar confirmed tracks from different tracking lines and removing duplicate tracks from the confirmed track list. MATLAB Coder™ is used to generate a C++ library with tuned tracker parameters. A wrapper function is written in C++ for interfacing radar detection input and track output to the display. Further, custom enhancements are incorporated in the C++ deployable code to improve track maintenance and processing time per scan of data. The deployable tracker module is interfaced with a radar in the field, and performance evaluation is in progress. An attempt is made to tune the tracker parameters generated in the C++ library to adapt to prevailing environmental conditions based on input data attributes. A display interface is provided for the operator to tune thresholds to reduce the false alarm rate.
Machine Learning-Based Tool for Predictive Analysis in Computational Pathology
Dinesh Koka, Onward Health
Vineet Sharma, Onward Health
The advantage of leveraging machine learning techniques in digital pathology is that they provide the capability to learn, extract knowledge, and make predictions from a combination of heterogeneous data (i.e., histological images, patient history, and omics data). The ability to mine sub-visual image features (features that sometimes may not be visually discernible) from digital pathology slide images offers the opportunity for better quantitative modeling of disease appearance and, hence, possibly improved prediction of disease aggressiveness and patient outcomes.
Onward Health has built a tool with expert doctor inputs, state-of-the-art computer vision techniques, and machine learning algorithms. The imaging algorithms are designed to detect different types of cells. Once cells are detected, the algorithm also processes each cell to determine various properties, such as shape, texture, color distribution features, and distance to the tumor and other spatial neighbors. The tool employs machine learning techniques to:
- Automatically identify the location of tumor or stroma
- Cluster patients based on their survival probabilities or risk score similarity
- Combine the image features along with other patient-related features, such as age, gender, history, medications, and genetics, to extract different types of insights and inferences
The company's vision is that the tool will enable multiple benefits, including providing cross-verifiable evidence of tumors, improving resource management, and aiding research and knowledge of tumor behavior.
Shift-Left Verification of Automotive Radar
Sainath Karlapalem, NXP India Pvt. Ltd.
Kaushik Vasanth, NXP India Pvt. Ltd.
Autonomous vehicles and driver assistance systems are becoming increasingly popular and represent the future of the automotive industry, with radar as a critical sensor. Design verification of such sensor products is extremely challenging, involving a wide variety of digital, analog, and RF IPs and an enormous number of configuration use cases. Current verification practices are highly focused on the hardware implementation perspective, and hence reproducing the field issues faced during design-in activities is becoming extremely challenging. The key reason for this challenge is the gap between the V&V environments used in the IC development phase and the environment in which the sensor product is validated. This presentation illustrates a novel methodology of environment-in-the-loop V&V, which models the on-road scenario and brings the on-road reflections back into current V&V flows, either in the form of analog inputs or digital ADC outputs. In this way, early design-in trials can be enabled in simulation as part of the IC development phase.
In this project, a radar is used as the sensor model. The radar is virtually modelled in MATLAB®, with parameters reflecting the hardware specification. The radar model is then placed in a virtually modelled environment that closely resembles the outside world.
Simulink for the Analysis of Self-Interference Cancellation in Full-Duplex Transceiver
Dr. Radha Krishna Ganti, IIT Madras
Full-duplex (FD) communications is a lucrative way to double the available system bandwidth. However, self-interference is a major impediment to realizing an FD node. The interference has to be cancelled in the RF, analog, and baseband domains, which requires simulation of the entire RF chain along with impairments. In this session, we use SimRF to simulate the RF chain and demonstrate the cancellation using a combination of algorithms in the baseband and RF units.