Model-Based Design has been a reliable and powerful framework for designing complex systems for more than two decades. Today’s engineers are confronting new workflows influenced by trends in software development, the integration of AI, and cloud computing. Learn how MathWorks is investing in Model-Based Design to deliver critical capabilities to automate design tasks, prevent defects, and scale to increasingly complex digital systems.
What’s New in MATLAB & Simulink R2024a/b
10:10~10:40
Abstract
Track A: Wireless Communications, High-Speed Transmission, and Chip Development
A1: Advancing Wireless Communications Development using MATLAB – 5G and NTN Practices
With the rapid evolution of the 3GPP industry, the development of wireless communication necessitates innovative and efficient processes to drive progress. This presentation explores the use of MATLAB in 5G and Non-Terrestrial Network (NTN) applications, showcasing how advanced tools can accelerate the development of wireless communication technologies.
This presentation will cover:
The importance of 5G signal validation, illustrated through two case studies. The first case study examines the validation process of 5G downlink (DL) and uplink (UL) signals, demonstrating how to ensure signal accuracy and reliability. The second case study introduces the application of a 5G Error Vector Magnitude (EVM) meter in Radio Frequency (RF) components, highlighting the significance of precise measurements for component performance evaluation.
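The EVM figure at the heart of the second case study can be sketched numerically. Below is a minimal illustration, in Python/NumPy rather than the MATLAB tooling used in the talk, of how an RMS EVM percentage is computed from measured and ideal reference constellation symbols; the normalization convention and symbol values are illustrative assumptions, not the presenters' implementation.

```python
import numpy as np

def evm_percent(measured, reference):
    """RMS EVM (%) of measured symbols against ideal reference symbols.

    Normalized by the RMS magnitude of the reference constellation,
    one common convention in transmitter conformance tests.
    """
    measured = np.asarray(measured, dtype=complex)
    reference = np.asarray(reference, dtype=complex)
    error_rms = np.sqrt(np.mean(np.abs(measured - reference) ** 2))
    ref_rms = np.sqrt(np.mean(np.abs(reference) ** 2))
    return 100.0 * error_rms / ref_rms

# QPSK reference symbols with a small additive impairment
ref = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]) / np.sqrt(2)
meas = ref + 0.01 * np.array([1, -1j, 1j, -1])
print(f"EVM = {evm_percent(meas, ref):.2f}%")
```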
In the realm of NTN research, we study the interference evaluation of TN/NTN systems and demonstrate how to address challenges in satellite communication through satellite Doppler shift modeling using ephemeris data.
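The Doppler-shift modeling mentioned above reduces to projecting the satellite's ephemeris-derived velocity onto the line of sight to the user. A minimal Python/NumPy sketch follows; the orbital numbers are made up for illustration and a static user is assumed, which is not the talk's actual model.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def doppler_shift(sat_pos, sat_vel, user_pos, f_carrier):
    """Instantaneous Doppler shift (Hz) seen by a static ground user.

    sat_pos/sat_vel come from propagated ephemeris (ECEF-like, m and m/s).
    Positive shift = satellite closing on the user.
    """
    los = np.asarray(user_pos, float) - np.asarray(sat_pos, float)
    los_unit = los / np.linalg.norm(los)
    range_rate = np.dot(np.asarray(sat_vel, float), los_unit)  # closing speed
    return f_carrier * range_rate / C

# Hypothetical LEO satellite 550 km up, moving 7.6 km/s, 2 GHz carrier
sat_pos = [0.0, 0.0, 6_921e3]
sat_vel = [7_600.0, 0.0, 0.0]
user_pos = [500e3, 0.0, 6_371e3]
print(f"Doppler: {doppler_shift(sat_pos, sat_vel, user_pos, 2e9):.0f} Hz")
```

The tens-of-kilohertz shifts this produces are why NTN links need Doppler pre-compensation.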
A2: New Methodology of Mixed-Signal Design Optimization
13:10~13:50
Abstract
As semiconductor processes advance, optimizing mixed-signal designs efficiently becomes ever more important. EDA tools play a key role in achieving this goal and now offer many new methodologies to support it. This session introduces the brand-new Virtuoso optimization solution, AOP (Advanced Optimization Platform), as well as a MATLAB optimization solution that leverages the Virtuoso/MATLAB integration.
A3: Innovative Gigasamples-per-Second Signal Processing Solution for FPGA/SoC
In this talk, we will explore the challenges and solutions associated with processing high-speed data streams in cutting-edge applications such as 5G New Radio (NR), radar, and signal intelligence. With advancements in analog-to-digital converters (ADCs), new architectures of DSP algorithms are emerging to meet the rigorous performance requirements of these applications. We will delve into effective strategies for modeling, exploring, and simulating hardware architecture for various DSP algorithms. Additionally, we will cover methods for generating synthesizable VHDL and Verilog code, enabling seamless deployment on FPGA and SoC platforms.
During this session, you will discover how to:
Apply Model-Based Design methodology to model and simulate DSP algorithms for effectively targeting FPGA/ASIC platforms.
Utilize a Digital Down Conversion (DDC) example to examine and enable efficient sample-based and frame-based processing.
Analyze and enhance hardware designs using best practices to optimize latency, throughput, and resource usage.
Generate readable, synthesizable VHDL and Verilog code for FPGA deployment.
Easily generate HDL code and a DPI-C testbench for DSP algorithms using the DSP HDL IP Designer app.
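The DDC named in the bullets above boils down to three stages: mix the signal to baseband with a numerically controlled oscillator, low-pass filter, then decimate. A minimal Python/NumPy sketch under illustrative parameters follows; the session itself targets Simulink models and generated HDL, not Python.

```python
import numpy as np

def ddc(x, f_if, fs, decim, num_taps=129):
    """Minimal digital down-converter: NCO mix, low-pass FIR, decimate."""
    n = np.arange(len(x))
    baseband = x * np.exp(-2j * np.pi * f_if / fs * n)       # mix IF to 0 Hz
    cutoff = 0.5 * (fs / decim) / fs                         # normalized cutoff
    taps_n = np.arange(num_taps) - (num_taps - 1) / 2
    h = np.sinc(2 * cutoff * taps_n) * np.hamming(num_taps)  # windowed-sinc LPF
    h /= h.sum()
    filtered = np.convolve(baseband, h, mode="same")
    return filtered[::decim]                                 # downsample

fs, f_if, decim = 100e6, 20e6, 8
n = np.arange(4096)
x = np.cos(2 * np.pi * f_if / fs * n)        # real IF tone at 20 MHz
y = ddc(x, f_if, fs, decim)
# After mixing, the tone sits at DC: the decimated output is ~constant 0.5
print(np.round(np.mean(y.real), 3))
```

The frame-based vs. sample-based trade-off discussed in the talk concerns how such a pipeline is scheduled on hardware, not the arithmetic itself.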
A4: Design, Analyze, and Deploy Hardware/Software Applications to SoC Devices
The semiconductor industry is rapidly evolving, bringing out heterogeneous System-on-Chip (SoC) platforms such as Versal ACAPs, MPSoCs, and RFSoCs for prototyping complex applications. At the same time, the competing demands of functional innovation, aggressive schedules, and product quality have significantly strained traditional SoC prototyping workflows. It is becoming increasingly important to make design decisions before touching the actual hardware. This talk will show how you can model and simulate algorithms with the SoC architecture to evaluate performance bottlenecks in the actual hardware context. Additionally, we will cover a streamlined workflow for complete system implementation on hardware.
During this session, you will discover how to:
Create an SoC model of an application by using blocks from SoC Blockset for defining interfaces between ARM processor cores, hardware logic, memory and peripherals.
Simulate algorithms with SoC architectures, including RF Data Converter for RFSoCs, to evaluate interface bandwidth, processing bottlenecks and other design tradeoffs.
Import and integrate existing HDL IP cores into the SoC model using the HDL IP Importer.
Model task execution, multicore execution, and task interrupts to define detailed processor functionality.
Use the SoC Builder app to generate a complete system implementation from the SoC model, including the bitstream for the FPGA and the software executable for the processor.
A5: Optimizing Beam Selection in Millimeter Wave Communications Using Neural Networks
This presentation will explore how neural networks can optimize the beam selection process in millimeter wave (mmWave) communications, reducing system overhead and enhancing performance. As mmWave technology becomes crucial in 5G and future 6G systems, traditional beam management methods struggle to meet the demands of high-frequency communication. This session will guide participants through the application of artificial intelligence to beam selection, showcasing practical examples and discussing the future of AI in wireless communication.
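As a conceptual illustration of treating beam selection as a classification problem, the sketch below (Python/NumPy, invented for this abstract and not the presenters' code) labels synthetic user angles with the strongest DFT codebook beam and fits a one-layer softmax classifier — a stand-in for the deeper networks discussed — to predict it without a beam sweep.

```python
import numpy as np

rng = np.random.default_rng(0)
N_ANT, N_BEAMS = 8, 8

def best_beam(theta):
    """Ground-truth label: DFT codebook index with maximum array gain."""
    steer = np.exp(1j * np.pi * np.arange(N_ANT)[:, None] * np.sin(theta))
    freqs = np.linspace(-1, 1, N_BEAMS, endpoint=False)
    codebook = np.exp(1j * np.pi * np.outer(np.arange(N_ANT), freqs))
    return np.argmax(np.abs(codebook.conj().T @ steer), axis=0)

# Synthetic training set: user angle -> index of the strongest beam
theta = rng.uniform(-np.pi / 3, np.pi / 3, 2000)
X = np.column_stack([np.sin(theta), np.cos(theta)])
y = best_beam(theta)

# One-layer softmax classifier trained by full-batch gradient descent
W = np.zeros((2, N_BEAMS))
b = np.zeros(N_BEAMS)
for _ in range(2000):
    logits = X @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    p[np.arange(len(y)), y] -= 1.0           # softmax cross-entropy gradient
    W -= 1.0 * X.T @ p / len(y)
    b -= 1.0 * p.mean(axis=0)

accuracy = np.mean(np.argmax(X @ W + b, axis=1) == y)
print(f"training accuracy: {accuracy:.2f}")
```

Predicting the beam index directly from context (here, angle) is what removes the exhaustive sweep overhead the talk addresses.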
Track B: Energy Transition, Electric Vehicles, and Autonomous Systems
B1: The Standardized Development Process of Motor Control Software with Model-Based Design
11:20~12:00
摘要
As the scale and complexity of modern automotive software rise sharply, OEMs are imposing increasingly strict requirements on the quality of their suppliers' software designs. To meet the challenges of complex systems, Model-Based Design has long been a key software development solution in the automotive domain, used to satisfy development processes and standards such as ASPICE and ISO 26262.
Accordingly, this talk introduces how Model-Based Design is applied to the development of automotive traction inverter motor control software: establishing a model development and integration platform, applying physical modeling to simulate system behavior effectively and thereby raise the efficiency and quality of software component development, and integrating Model-Based Design's systematic design and verification methods into the standardized ASPICE design process, so that the software design better meets the demands of high-quality, standardized development.
B2: Development and Testing of Intelligent Speed Control Systems via a Learning-Based Adaptive Control Approach
In recent years, automotive manufacturers worldwide have been dedicated to developing vehicles that combine safety and convenience. New vehicle models from these manufacturers continue to incorporate more comprehensive Advanced Driver Assistance Systems (ADAS). However, the ongoing development of ADAS technology faces numerous challenges, including sensor fusion and reliability issues, the need for extensive training data for AI models and their interpretability, handling diverse driving scenarios, standardizing regulations, and ensuring effective driver-system interaction. Considering these challenges, this paper focuses on the Speed Assistance System (SAS) within ADAS, proposing an intelligent speed control solution. This solution integrates automatic emergency braking, adaptive cruise control, and speed adjustment in response to lane-changing maneuvers in adjacent lanes. The speed assistance control strategy is trained based on human driving behavior and employs an Adaptive Neuro-Fuzzy Inference System (ANFIS) for control, selecting the appropriate control mode according to the current driving situation. When a potential collision risk is detected, the system triggers automatic emergency braking. Additionally, the system utilizes data such as the relative longitudinal distance and speed of the preceding vehicle, feeding this information back into ANFIS. This feedback loop effectively mitigates the deceleration response caused by aggressive lane changes from adjacent vehicles and adjusts the relative distance and speed to ensure safe and comfortable vehicle following at various speeds while enhancing overall driving assistance. The entire development process of the smart speed control system was conducted within the MATLAB/Simulink framework, leveraging the powerful toolbox features of MATLAB for design assistance and the advantages of Simulink’s graphical programming language. 
Software simulations and real-vehicle tests on elevated roads demonstrate that the proposed smart speed control strategy, which includes automatic emergency braking, adaptive cruise control, and ANFIS-based control, not only ensures autonomous braking in emergencies but also significantly improves the deceleration response to sudden lane changes by leading vehicles, effectively preventing potential collisions.
B5: Model-Based Design with CI/CD
16:00~16:40
摘要
DevOps (Development and Operations) is a way of working designed to let overall development balance speed and quality. Through collaboration between developers and IT staff, integrated tooling, and process automation, it effectively improves cross-team efficiency and product quality. The CI/CD process grew out of this concept: through continuous integration and continuous deployment, code problems are detected automatically during development and builds are deployed to servers, further raising software quality. This talk explores how Model-Based Design (MBD) can adopt a CI/CD process.
Track C: AI Applications and Practice
C1: Automated Visual Inspection for Semiconductor and Electronics using MATLAB
Semiconductor manufacturing engineers perform metrology, defect inspection, and failure analysis to optimize yield, quality, and reliability. This involves analyzing images captured across different stages of manufacturing. MATLAB enables you to apply traditional image processing/computer vision techniques as well as AI approaches to automate image analysis for defect inspection, failure analysis, or metrology. In this session, we will use a wafer defect detection example to provide an overview of the typical steps, techniques, and tools of a workflow to analyze images, detect objects, and perform measurements.
Highlights
Data access and image preprocessing techniques
Semantic segmentation and labeling of defects and abnormalities
Defect detection and visualizing network decisions using deep learning techniques
Deployment for prototyping and production
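As a toy illustration of the inspection pipeline sketched in the highlights (preprocess, then segment defect pixels), the following Python/NumPy example thresholds deviations from a uniform background on a synthetic image. The workflow in the talk uses real wafer images, MATLAB tooling, and deep learning; this is only the conceptual skeleton, with all numbers invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "wafer" image: smooth background plus two bright defects
image = rng.normal(0.5, 0.02, (64, 64))
image[10:13, 20:23] += 0.40      # injected defect 1 (3x3 pixels)
image[40:44, 50:53] += 0.35      # injected defect 2 (4x3 pixels)

# Preprocess: estimate the background level robustly, then threshold
background = np.median(image)    # median ignores the few defect pixels
spread = image.std()
mask = image > background + 5 * spread   # crude defect segmentation mask

print(int(mask.sum()), "defect pixels found")
```

A semantic segmentation network replaces the fixed threshold when defects are subtle or textured, which is where the deep learning techniques in the session come in.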
C2: Development of Lightweight Deep Learning Single-Lead ECG Algorithms: Verification and Accurate Identification of Atrial Fibrillation Patients and Sleep Apnea Events in REM/NREM Sleep
13:10~13:50
Highlights
The dissertation introduces novel methodologies that employ advanced deep learning techniques to detect obstructive sleep apnea (OSA) by analyzing single-lead ECG data, focusing on identifying OSA patterns during REM and non-REM (NREM) sleep stages and exploring their relationship with atrial fibrillation (AF). The main objective is to develop lightweight deep learning systems using single-lead ECG for detection of REM- and NREM-related OSA events, while also investigating their association with AF for diagnostic and treatment planning purposes. The research involves creating lightweight deep learning-based OSA detection systems to identify apnea-hypopnea (AH) and non-AH episodes, differentiate OSA and central sleep apnea (CSA) events, recognize REM- and NREM-related OSA episodes, and detect AF associated with OSA. This work emphasizes the significance of these innovative approaches in advancing sleep medicine, enhancing diagnostic precision, and enabling personalized interventions for managing health implications linked to OSA.
An AH and non-AH episode detection algorithm, composed of bag-of-features (BoF) extracted from an ECG spectrogram in conjunction with a deep learning framework utilizing a DarkNet-based convolutional neural network (CNN) called Tiny-DarkOSANet, was developed and achieved commendable classification accuracy and exceptional temporal resolution. Validation was conducted using overnight ECG recordings sourced from the PhysioNet Apnea-ECG database (PAED) and the National Cheng Kung University Hospital Sleep Center database (NCKUHSCD), with a total of 83 subjects and an average apnea-hypopnea index (AHI) of 29.63 events per hour. This research identified the optimal spectrogram window duration by examining 10-second and 60-second intervals and isolated frequency bands to generate BoF, determining the most suitable frequency band for precise AH and non-AH episode detection. The algorithm achieved impressive accuracies of 91.2% and 85.6%, sensitivities of 90.5% and 85.8%, specificities of 92% and 85.5%, and AUC values of 0.9757 and 0.9371 for the different time windows and frequency bands. Additionally, a novel AH and non-AH episode detection algorithm was constructed using a 1D deep CNN with empirical mode decomposition (EMD) applied to preprocessed ECG signals. This algorithm was validated with 33 overnight ECG recordings from the PAED. The 1D deep CNN model demonstrated high accuracy in segment-level classification with 93.8% accuracy, 94.9% sensitivity, and 92.7% specificity through 5-fold cross-validation (5fold-CV), and subject-level classification with 83.5% accuracy, 75.9% sensitivity, and 88.7% specificity validated via leave-one-subject-out cross-validation (LOSO-CV). Furthermore, the model was successfully deployed on the NVIDIA Jetson Nano embedded platform, achieving 92.5% accuracy with a mere 1 MB parameter size.
A sleep apnea (SA) event detection system to distinguish normal breathing (NB), OSA, and CSA was proposed, utilizing 1D single-lead ECG signals and 2D single-lead ECG continuous wavelet transform (CWT) spectrograms combined with a vision-transformer-based classification model called ViSATNet. This research underscores the importance of differentiating between CSA and OSA for accurate diagnosis and tailored management, thereby improving patient outcomes. The algorithm was validated using overnight ECG recordings from 60 subjects sourced from the Sleep Heart Health Study Visit 1 database (SHHS1). The 1D-ViSATNet model achieved an overall accuracy of 86.1%, sensitivity of 79%, and specificity of 89.7%, while the 2D-ViSATNet model attained an overall accuracy of 96.9%, sensitivity of 95.4%, and specificity of 97.7%.
A system for detecting REM and NREM-related OSA episodes was developed using a DarkNet-based CNN classification model named OSADeepNet-Lite, which incorporates electrocardiomatrix (ECM) feature visualization of heart rate variability. This algorithm was validated with overnight ECG recordings from 50 subjects from the NCKUHSCD database. The system features a sleep monitoring capability with minimal memory usage, real-time functionality, and user-friendliness. It achieved high performance metrics, including an accuracy of 91.42%, sensitivity of 82.83%, and an F1-score of 0.8283. Each event is processed in approximately 0.2 seconds, allowing for the analysis of an entire night's events within 3 minutes.
A system for detecting AF-related OSA was developed using independent component analysis (ICA) and a DarkNet-based CNN deep learning framework called SHHDeepNet. This system was validated with overnight ECG recordings from the SHHS1 database. The algorithm demonstrated exceptional performance in segment-level binary classification with 5-fold cross-validation (5fold-CV), achieving 98.22% accuracy, 96.8% sensitivity, 99% specificity, and an AUC value of 0.9981. In multi-class classification, it achieved 98.36% accuracy, 97.14% sensitivity, 98.77% specificity, and an AUC value of 0.9975 for 5fold-CV. For subject-level classification using leave-one-subject-out cross-validation (LOSO-CV), the algorithm attained 86.42% accuracy, 79.4% sensitivity, 90.2% specificity, and an AUC value of 0.9372 for binary classification, and 86.7% accuracy, 72.6% sensitivity, 90.4% specificity, and an AUC value of 0.9166 for multi-class classification.
In summary, this dissertation successfully developed single-lead ECG-based lightweight deep learning algorithms to detect OSA events and their associated phenotypes. It represents pioneering research in utilizing lightweight DL models for the detection of various types of SA events, including OSA events during non-REM and REM sleep stages, as well as AF-related OSA detection, yielding promising results. The AH and non-AH episode detection systems demonstrated high accuracy and sensitivity, with excellent temporal resolution and low memory usage. As the first pioneering system using ECG signals and a deep learning framework, a comprehensive SA event detection system was successfully developed, achieving balanced classification in distinguishing NB, OSA, and CSA. Additionally, a precise and robust lightweight deep learning algorithm tailored for single-lead ECG was developed and validated, effectively identifying REM and NREM-related OSA events. The dissertation also introduced a novel system for identifying AF-related OSA, which showed superior performance in distinguishing AF and OSA events compared to conventional methods. These technologies enable immediate and continuous analysis of sleep-related breathing behaviors, offering significant potential for disease management and treatment advancements.
C3: Harnessing AI for Faster System Design: Reduced Order Modeling in MATLAB
14:00~14:40
Abstract
In Model-Based Design, virtual models are crucial for designing complex systems. Creating high-fidelity virtual models that accurately capture hardware behavior is valuable, but these models can be time-consuming and unsuitable for all development stages. Reduced order modeling (ROM) offers faster, lower-fidelity approximations. This talk will focus on AI-based ROM, demonstrating how to conduct a design of experiments and train AI models and integrate these AI models into Simulink simulations for hardware-in-the-loop testing or virtual sensor application in embedded systems. The talk will also cover the pros and cons of various ROM approaches to help you select the best one for your project.
Highlights
Creating AI-based reduced order models using the No/Low Code workflow.
Integrating trained AI models into Simulink for system-level simulation
Generating optimized C code and performing HIL tests
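The ROM idea above — sample an expensive model over a design of experiments, then fit a cheap surrogate — can be sketched as follows. This Python/NumPy example uses a polynomial standing in for the AI models discussed in the talk, and the "full-order model" is an invented one-dimensional function:

```python
import numpy as np

rng = np.random.default_rng(1)

def full_order_model(x):
    """Stand-in for an expensive high-fidelity simulation."""
    return np.sin(3 * x) + 0.5 * x**2

# 1) Design of experiments: sample the input space
x_train = np.linspace(-2, 2, 40)
y_train = full_order_model(x_train)

# 2) Train a cheap surrogate (a polynomial here; an AI model in the talk)
coeffs = np.polyfit(x_train, y_train, deg=9)
rom = np.poly1d(coeffs)

# 3) Validate the ROM on unseen inputs
x_test = rng.uniform(-2, 2, 200)
err = np.max(np.abs(rom(x_test) - full_order_model(x_test)))
print(f"max ROM error on test points: {err:.4f}")
```

The trade-off is exactly the one the abstract names: the surrogate evaluates orders of magnitude faster, at the cost of some fidelity that must be quantified on held-out points before HIL or virtual-sensor use.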
C4: Operationalizing AI with a Drift-Aware Incremental Learning Workflow
Drift-aware incremental learning enhances the operationalization of AI models, allowing them to adapt in real time as data patterns evolve and ensuring sustained accuracy and performance. Discover how to seamlessly incorporate drift-aware techniques into your MLOps pipeline using MATLAB, facilitating efficient model updates and robust deployment. Uncover the advantages of merging incremental learning with drift detection to uphold optimal model health and reliability in dynamic environments.
Highlights
Developing a data-driven model using the low-code workflow
Deploying the model in a production environment
Monitoring data drift
Feedback-based model retraining and operationalizing the workflow
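The drift-monitoring step in the list above can be illustrated with a deliberately simple detector that flags a production batch whose mean strays too far from the training data. This Python/NumPy sketch is a stand-in for the incremental learning and drift detection tooling discussed in the session; the data, threshold, and function name are all invented.

```python
import numpy as np

rng = np.random.default_rng(7)

def detect_drift(reference, window, threshold=4.0):
    """Flag drift when the window mean leaves a z-score band around the
    reference (training) distribution — a minimal drift detector."""
    mu, sigma = reference.mean(), reference.std()
    z = abs(window.mean() - mu) / (sigma / np.sqrt(len(window)))
    return z > threshold

reference = rng.normal(0.0, 1.0, 5000)   # data the model was trained on
stable = rng.normal(0.0, 1.0, 200)       # production batch, same regime
drifted = rng.normal(0.6, 1.0, 200)      # production batch after drift

print(detect_drift(reference, stable))   # expect False
print(detect_drift(reference, drifted))  # expect True
```

In the full workflow, a positive detection is what triggers the feedback-based retraining step rather than retraining on a fixed schedule.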
C5: Large Language Models (LLMs) with MATLAB
This presentation will explore the possibilities of integrating Large Language Models (LLMs) with MATLAB, how to accelerate MATLAB development using LLMs, and how to enhance data analysis and automation capabilities through LLMs integration. One example is the development of Retrieval Augmented Generation (RAG) functionalities on LLMs. LLMs, known for their advantages in natural language processing, are increasingly being applied across various fields to streamline tasks, analyze large datasets, and enhance AI-driven applications. Additionally, the challenges encountered during the integration process will be discussed, along with a forward-looking view on realizing advanced AI solutions in MATLAB environments.
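As a conceptual sketch of the RAG pattern mentioned above — retrieve the most relevant document, then place it in the prompt — here is a toy Python example using bag-of-words vectors in place of real LLM embeddings. The documents and function names are invented for illustration and bear no relation to the presenters' implementation.

```python
import numpy as np

# Toy document store; a real RAG system would embed these with a
# language model rather than count words.
docs = [
    "MATLAB supports calling large language models through web APIs",
    "Simulink is used for model based design of dynamic systems",
    "Retrieval augmented generation grounds LLM answers in documents",
]

def embed(text, vocab):
    """Crude bag-of-words embedding over a shared vocabulary."""
    words = text.lower().split()
    return np.array([words.count(w) for w in vocab], dtype=float)

vocab = sorted({w for d in docs for w in d.lower().split()})
doc_vecs = np.array([embed(d, vocab) for d in docs])

def retrieve(query):
    """Return the document most similar to the query (cosine similarity)."""
    q = embed(query, vocab)
    norms = np.linalg.norm(doc_vecs, axis=1) * (np.linalg.norm(q) or 1.0)
    return docs[int(np.argmax(doc_vecs @ q / norms))]

context = retrieve("how does retrieval augmented generation work")
prompt = f"Answer using this context:\n{context}\n\nQuestion: ..."
print(context)
```

The retrieved `context` grounds the LLM's answer in the document store, which is the core benefit RAG brings to data analysis and automation tasks.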
Track D: Technical Masterclasses
D1: Simulation Models in Engineering Design: Achieving High Fidelity and Flexibility via MathWorks Tools
Reinforcement learning has been gaining attention as a new control design method that can automatically learn complex control and realize high performance. However, reinforcement learning policies often use deep neural networks, which makes it difficult to guarantee the stability of the system with conventional control theory.
In this session, we will introduce ideas on how to use reinforcement learning for control design with MATLAB and Reinforcement Learning Toolbox. We will cover some of the latest features available in the tool, and we will also introduce a workflow for the design, code generation, and deployment of a reinforcement learning controller.
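The reinforcement-learning workflow described above can be illustrated at toy scale. The sketch below uses Python/NumPy and tabular Q-learning as a stand-in for the deep neural-network policies trained with Reinforcement Learning Toolbox; the environment (a 5-state corridor with a goal reward) and all hyperparameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tabular Q-learning on a 5-state corridor: start in state 0; reaching
# state 4 pays reward 1 and ends the episode.
N_STATES = 5
ACTIONS = (-1, +1)                       # move left / move right
Q = np.zeros((N_STATES, len(ACTIONS)))
alpha, gamma, epsilon = 0.5, 0.9, 0.3

for _ in range(200):                     # training episodes
    s = 0
    while s != N_STATES - 1:
        if rng.random() < epsilon:       # explore
            a = int(rng.integers(len(ACTIONS)))
        else:                            # exploit current estimate
            a = int(np.argmax(Q[s]))
        s_next = min(max(s + ACTIONS[a], 0), N_STATES - 1)
        reward = 1.0 if s_next == N_STATES - 1 else 0.0
        Q[s, a] += alpha * (reward + gamma * Q[s_next].max() - Q[s, a])
        s = s_next

# Greedy policy after training: expect "move right" (index 1) everywhere
policy = [int(np.argmax(Q[s])) for s in range(N_STATES - 1)]
print(policy)
```

The stability concern raised above is visible even here: the learned policy is only as trustworthy as the value estimates, which is why pairing learned controllers with conventional analysis matters.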
D4: Automating Report Generation for MATLAB Applications
15:10~15:50
Abstract
MATLAB Report Generator is great for use cases that require periodic reports, whether it is reporting on test results, certification, compliance, regulations, or more. You can also customize reports and work with templates by writing the MATLAB code you are already familiar with. In addition, you can automate the creation of richly formatted Microsoft Word, Microsoft PowerPoint, HTML, or PDF reports using the MATLAB Report Generator.
D5: Advanced Signal Integrity Techniques
16:00~16:40
Abstract
This session focuses on advanced signal integrity techniques, particularly the use of MATLAB RF Toolbox for time-gating S-parameters and leveraging script-based optimization with the Signal Integrity Toolbox.
Participants will learn advanced techniques for automating and optimizing signal integrity processes to enhance the performance and reliability of high-speed digital designs through detailed demonstrations and explanations.
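Time-gating S-parameters, as covered in this session with RF Toolbox, amounts to transforming the frequency response to the time domain, zeroing energy outside a gate around the response of interest, and transforming back. A minimal Python/NumPy sketch on synthetic data follows; the delays are chosen to land on exact FFT bins so the gate is clean, whereas RF Toolbox handles windowing and renormalization far more carefully.

```python
import numpy as np

N, df = 1024, 10e6                      # 1024 frequency points, 10 MHz spacing
f = np.arange(N) * df                   # 0 to ~10.23 GHz
dt = 1.0 / (N * df)                     # time-domain bin width (~97.7 ps)

# Synthetic S11: main reflection at ~0.98 ns plus an unwanted fixture
# re-reflection at ~3.9 ns (both aligned to FFT bins for a clean demo)
tau_main, tau_spur = 10 * dt, 40 * dt
s11 = 0.5 * np.exp(-2j * np.pi * f * tau_main) \
    + 0.2 * np.exp(-2j * np.pi * f * tau_spur)

# 1) Transform to the time domain
h = np.fft.ifft(s11)
t = np.arange(N) * dt

# 2) Apply the time gate: keep only energy near the main response
gate = t <= 2e-9
h_gated = np.where(gate, h, 0)

# 3) Back to the frequency domain: the spur's ripple is removed
s11_gated = np.fft.fft(h_gated)
print(f"|S11| ripple before gating: {np.ptp(np.abs(s11)):.3f}, "
      f"after: {np.ptp(np.abs(s11_gated)):.3f}")
```

Script-based optimization with Signal Integrity Toolbox then automates sweeping such analyses across channels and design corners.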