Divine Sim Suite

ADAS & Autonomous Driving


// Divine Sim Suite


Divine Technology has developed integrated autonomous driving simulation software to address safety concerns in autonomous driving. This software evaluates the performance and safety of autonomous driving sensors, algorithms, and systems.

We have implemented autonomous driving simulation technology to assess the core technologies of autonomous vehicles quickly and safely.


This virtual simulation technology allows for the repeated testing of numerous risky situations to prepare for unexpected events on the road.

Divine Technology provides highly reliable results by applying engineering expertise to the sensor components, making the simulation comparable to real-world conditions.

Through our proprietary high-reliability autonomous driving simulation platform, Divine Sim Suite, we offer a comprehensive solution encompassing compliance with regulations and laws, edge case scenarios, traffic flow simulation, thermal camera simulation, and high-reliability simulations.

// Features


  • High-reliability, Physics-based Virtual Environments
  • 3D Object Library with physical properties
  • Map data in *.xodr format
  • OpenSCENARIO
  • Scalable Platforms via Data Hub (ROS, API, Plugin)
  • Test and Training Datasets with Physics-based Sensor Realism for Algorithm Training
  • AI Learning Models
  • autoGT (GT Data, Labeling, JSON File)
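The feature list mentions autoGT, which exports ground-truth data and labels as JSON. The snippet below is a minimal sketch of what such a per-frame label record could look like; the schema and field names (`frame_id`, `bbox_3d`, `track_id`) are illustrative assumptions, not the actual autoGT format.

```python
import json

def make_gt_label(frame_id, objects):
    """Serialize per-frame ground-truth annotations to a JSON string.
    Field names are hypothetical, chosen for illustration only."""
    return json.dumps({
        "frame_id": frame_id,
        "objects": [
            {
                "class": obj["class"],      # e.g. "car", "pedestrian"
                "bbox_3d": obj["bbox_3d"],  # [x, y, z, w, l, h, yaw]
                "track_id": obj["track_id"],
            }
            for obj in objects
        ],
    }, indent=2)

label_json = make_gt_label(
    42,
    [{"class": "car",
      "bbox_3d": [10.0, 2.0, 0.5, 1.8, 4.5, 1.5, 0.0],
      "track_id": 7}],
)
```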

// Evaluations


  • Edge Case Safety Verification and Evaluation
  • AI Algorithm Performance Verification and Evaluation
  • Optimization of Fusion Sensors
  • SIL / MIL / HIL / DIL / VIL
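One common textbook approach to the fusion-sensor optimization listed above is inverse-variance weighting, which combines independent measurements of the same quantity (e.g. an object range from radar and from LiDAR) so that the fused estimate has lower variance than either input. This is a generic illustration, not necessarily the method used in Divine Sim Suite.

```python
def fuse(z1, var1, z2, var2):
    """Inverse-variance weighted fusion of two independent estimates
    of the same quantity. Returns the fused estimate and its variance,
    which is always smaller than either input variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Example values (assumed): radar reads 50.0 m (variance 0.4),
# LiDAR reads 50.6 m (variance 0.1); the fusion leans on the LiDAR.
est, var = fuse(50.0, 0.4, 50.6, 0.1)
```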

// 3D Object Library


  • High-reliability 3D objects whose surfaces are subdivided into layers, each with defined physical properties
  • To simulate the processing of reflected signals just as real sensors process radio waves (Radar) or light (LiDAR), the platform assigns unique reflectance properties (BRDF), dielectric permittivity, and light sources to every object in the simulation, creating a virtual environment as close as possible to the real world.
  • Vehicle models are divided into 25 layers to define their physical properties, including the body, front and rear windshields, bumpers, left and right headlights, left and right brake lights, wheels, and more.
  • Defined illumination values for streetlights and traffic signals
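To show how a material's permittivity relates to the strength of a reflected radar signal, here is a minimal sketch using the normal-incidence Fresnel power reflectance for a lossless, non-magnetic material. Real sensor models (and the suite's BRDF-based approach) use complex permittivity and angle-dependent scattering; the concrete permittivity value below is an assumed example.

```python
import math

def normal_incidence_reflectance(eps_r):
    """Power reflectance at normal incidence for an air/material
    boundary, lossless non-magnetic case: R = ((1-n)/(1+n))^2
    with refractive index n = sqrt(eps_r)."""
    n = math.sqrt(eps_r)
    r = (1.0 - n) / (1.0 + n)  # amplitude reflection coefficient
    return r * r

# Dry concrete is often modeled with eps_r around 5 (assumed value)
R_concrete = normal_incidence_reflectance(5.0)
```

Higher permittivity means a stronger reflection, which is why defining per-layer permittivity changes what a simulated radar "sees" on each part of a vehicle.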

// Scenario


  • Edge case scenarios encompass a wide range of exceptional situations that autonomous vehicles may encounter on the road, including adverse weather conditions, unpredictable pedestrian behavior, sudden mechanical failures, and complex traffic interactions. Testing autonomous vehicles against such edge cases is essential to enhancing their capabilities and ensuring they can respond effectively to unforeseen situations.
  • Edge case scenarios are developed in OpenSCENARIO format by analyzing the disturbance types presented by JAMA.
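For reference, an OpenSCENARIO (.xosc) file follows the ASAM OpenSCENARIO 1.x XML structure sketched below. This skeleton is illustrative only: the map filename, entity names, and description are placeholders, and a real scenario would fill in the Story/Act/Event hierarchy describing the disturbance.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative OpenSCENARIO 1.x skeleton; names and paths are placeholders -->
<OpenSCENARIO>
  <FileHeader revMajor="1" revMinor="0" date="2024-01-01T00:00:00"
              description="Edge case: pedestrian crossing in rain" author="example"/>
  <RoadNetwork>
    <!-- The road geometry comes from an OpenDRIVE (*.xodr) map -->
    <LogicFile filepath="urban_map.xodr"/>
  </RoadNetwork>
  <Entities>
    <ScenarioObject name="Ego">
      <CatalogReference catalogName="VehicleCatalog" entryName="ego_vehicle"/>
    </ScenarioObject>
  </Entities>
  <Storyboard>
    <Init>
      <Actions/>
    </Init>
    <!-- Story/Act/Event definitions describing the disturbance go here -->
    <StopTrigger/>
  </Storyboard>
</OpenSCENARIO>
```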



// Feasibility Study - Camera Sensor Simulation


  • Based on Time of Day
  • Based on Camera F-number Variations
  • Front Windshield Contamination
  • RGB Color Filter Errors Caused by Heat and Chemical Reactions
  • Thermal Camera
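The F-number variation study above rests on a standard optics relation: for an idealized lens, image-plane illuminance scales with 1/N², so each full stop halves the exposure. A minimal sketch (not the suite's camera model):

```python
def relative_exposure(f_number, reference_f_number=2.8):
    """Relative image-plane exposure versus a reference aperture,
    using the idealized relation: illuminance proportional to 1/N^2."""
    return (reference_f_number / f_number) ** 2

# Stopping down from f/2.8 to f/5.6 (two stops) cuts exposure to a quarter
ratio = relative_exposure(5.6)
```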

// Feasibility Study - LiDAR Sensor Simulation


  • Testing the LiDAR sensor under cut-in scenario conditions on the Seohae Bridge map
  • The vehicle model is divided into 25 layers, including the body, front and rear windshields, wheels, bumpers, and left and right headlights, with unique physical properties defined for each layer.
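The per-layer reflectivity matters because, in the usual LiDAR range equation for a diffuse (Lambertian) target that fully intercepts the beam, received power scales linearly with surface reflectivity and falls off with 1/R². A minimal sketch with assumed parameter values, ignoring atmospheric loss:

```python
import math

def lidar_received_power(p_tx, reflectivity, aperture_area, range_m,
                         efficiency=1.0):
    """Received power for a Lambertian target filling the beam:
    P_r = P_t * eta * rho * A / (pi * R^2). Atmospheric attenuation
    is ignored in this sketch."""
    return (p_tx * efficiency * reflectivity * aperture_area
            / (math.pi * range_m ** 2))

# Assumed values: 1 W pulse, 0.5 reflectivity, 10 cm^2 aperture
p_near = lidar_received_power(1.0, 0.5, 1e-3, 25.0)
p_far = lidar_received_power(1.0, 0.5, 1e-3, 50.0)  # doubling range quarters power
```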

// Feasibility Study - Radar Sensor Simulation


  • Thermal noise
  • Masking (depending on the distance between two objects)
  • Multipath (impact of road and buildings)
  • Radar signal depending on the direction of the object
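The thermal noise item above refers to the receiver's kTB noise floor, which sets the weakest echo a radar can detect. A minimal sketch of that standard relation (about -174 dBm/Hz at the 290 K reference temperature):

```python
import math

BOLTZMANN = 1.380649e-23  # J/K (exact SI value)

def thermal_noise_dbm(bandwidth_hz, temperature_k=290.0):
    """Thermal noise power kTB, expressed in dBm."""
    noise_watts = BOLTZMANN * temperature_k * bandwidth_hz
    return 10.0 * math.log10(noise_watts * 1e3)  # W -> mW, then dBm

n_1hz = thermal_noise_dbm(1.0)   # ~ -174 dBm in a 1 Hz bandwidth
n_1mhz = thermal_noise_dbm(1e6)  # 60 dB higher in a 1 MHz bandwidth
```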

// Feasibility Study - Autonomous Driving


We conducted autonomous driving tests on a custom-built virtual Gangnam road under dark and rainy night conditions. In the virtual environment, all objects are divided into material-specific layers with defined physical properties. Light sources such as streetlights, traffic lights, and headlights are assigned luminance values, and the sensors mounted on the vehicle are modeled to operate like real-world sensors, ensuring physics-based sensor modeling.


To define the physical properties of objects and model physics-based sensors, Ansys AVxcelerate was used and integrated with the driving simulation platform CARLA. The sensor output data is fed into the autonomous driving platform Autoware for perception and decision-making, which then controls the ego vehicle in CARLA.
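The pipeline described above (simulated sensors → Autoware perception and planning → vehicle control in CARLA) can be abstracted as a closed sense-perceive-decide-act loop. The sketch below is a hypothetical pure-Python illustration of that data flow; none of the function names, thresholds, or the rain-attenuation factor correspond to the actual CARLA, Autoware, or AVxcelerate APIs.

```python
# Hypothetical closed-loop skeleton mirroring the data flow described
# above. All names and parameter values are illustrative assumptions.

def sensor_model(true_distance_m, rain_attenuation=0.9):
    """Simulated sensor: rain shortens the measured detection distance."""
    return true_distance_m * rain_attenuation

def perception(measured_m, detect_limit_m=60.0):
    """Report an obstacle only if it lies within the detection range."""
    return measured_m if measured_m <= detect_limit_m else None

def planner(obstacle_m, safe_gap_m=30.0):
    """Decide a control command from the perceived obstacle distance."""
    if obstacle_m is not None and obstacle_m < safe_gap_m:
        return "brake"
    return "cruise"

# Obstacle at 30 m, measured as 27 m in rain -> inside the safe gap
command = planner(perception(sensor_model(30.0)))
```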