Fanless MXM-GPU Computer, Intel® Core™ i9-10900TE (10 cores, up to 4.5 GHz), NVIDIA RTX A2000 (2560 CUDA cores, 8GB GDDR6), -20°C to +60°C
- 10th Gen Intel® Core™ i9-10900TE, Comet Lake-S (10 cores, up to 4.5 GHz)
- NVIDIA MXM RTX A2000 (2560 CUDA cores, 8GB GDDR6)
- 2 x DDR4, up to 64GB
- 2 x SSD with RAID 0/1 support
- TPM security on board
- Multi-display via DisplayPort and HDMI
- 12V DC input (9V~36V optional)
- Extended operating temperature: -20°C to +60°C
- Technical Profile
- Artificial Intelligence (AI) and Machine Learning (ML) as the Backbone of the Success of Driverless Vehicles
With the rapid advance of technology, a wave of cutting-edge vehicles has appeared in recent years, and driverless vehicles have attracted the most attention by far. The road to success has not been effortless, however: driverless vehicles depend on a whole stack of supporting technologies. Among them, Artificial Intelligence (AI) and Machine Learning (ML) form the backbone of their success.
Safety is Crucial for Driverless Vehicles
Since driverless vehicles first began to take off, a number of problems have followed. Safety in particular has drawn public attention, since safety is always the user's first priority. In light of these potential dangers, tracking objects, pedestrians, and other vehicles has become a central concern for driverless cars. Approaches such as high-resolution maps, path planning, and simultaneous localisation and mapping (SLAM) are applied to reduce security risks, and appropriate hardware is deployed to support them. Among that hardware, multi-sensor setups built on computer vision, radar, and LiDAR are the three most familiar technologies behind these approaches.
AI and ML Turn Collected Data into Vehicle Instructions
The massive amount of data collected by the approaches mentioned above is put to further use. Most of it goes into training: driverless-vehicle training pipelines consume the sorted data to drive future improvements and progressively enhance road safety. A single instrumented vehicle can generate over 30TB of data per day, and a fleet of 10 vehicles can accumulate 78PB of raw data. Vehicle operation benefits from the analysed data as well, since it can be used to correct instructions for further development. During driving, data collection still runs in the background, though it is typically selective.
Typically, a full-HD RAW12 camera at 40fps produces roughly 120MB/s, while a radar produces close to 220MB/s. A vehicle with a combination of 6 cameras and 6 radars therefore produces about 2,040MB/s (roughly 2GB/s) of raw data, which comes to around 58TB over an 8-hour test-drive shift.
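The arithmetic above can be verified with a short script. The sensor counts and per-sensor rates are the figures quoted in the text; everything else is straightforward unit conversion:

```python
# Back-of-the-envelope raw data-rate estimate for an instrumented test vehicle,
# using the per-sensor rates quoted above.
CAMERA_RATE_MBPS = 120   # full-HD RAW12 camera at 40 fps, in MB/s
RADAR_RATE_MBPS = 220    # radar, in MB/s
NUM_CAMERAS = 6
NUM_RADARS = 6

# Aggregate raw data rate for the whole vehicle.
total_mbps = NUM_CAMERAS * CAMERA_RATE_MBPS + NUM_RADARS * RADAR_RATE_MBPS

# Total volume over one 8-hour test-drive shift (decimal units: 1 TB = 1e6 MB).
shift_seconds = 8 * 3600
shift_tb = total_mbps * shift_seconds / 1e6

print(f"{total_mbps} MB/s")   # 2040 MB/s, i.e. ~2 GB/s
print(f"{shift_tb:.1f} TB")   # ~58.8 TB per 8-hour shift
```

This confirms the ~2GB/s and ~58TB-per-shift figures in the paragraph above.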
- Comparison of HORUS 330 and HORUS 340
One of the main features of the HORUS 340 is that it is powered by Intel's 10th Gen (Comet Lake-S) i9-10900TE. Compared with 9th Gen Coffee Lake, Comet Lake-S offers up to 10 CPU cores, and Hyper-Threading is enabled on almost all models except Celeron. Operating frequencies also rise: Comet Lake's single-core Turbo Boost reaches up to 5.3 GHz, 300 MHz higher than Coffee Lake-S, and all-core Turbo Boost reaches up to 4.9 GHz. Core i9 additionally supports Thermal Velocity Boost, while Turbo Boost Max 3.0 covers Core i7 and i9. On the memory side, Core i7 and i9 support DDR4-2933, whereas Core i3, Pentium, and Celeron support DDR4-2666.
Achieving AI-based security in the traffic field requires a high-performance GPU+CPU architecture as an essential element. The HORUS340 plays a critical role in sensor-fusion frameworks, which are fundamental to traffic surveillance and management systems. 3D LiDAR enforcement, an increasingly recognized and mature surveillance approach, uses LiDAR camera techniques to achieve accurate and efficient traffic monitoring and detection. Moreover, its sensor-fusion capability lets the HORUS340 serve a wide range of scenarios, from UGVs to smart cities.
- How Autonomous Vehicle Works
Sensors are the key components that make a vehicle driverless. Cameras, radar, ultrasonic sensors, and LiDAR enable an autonomous vehicle to visualize its surroundings and detect objects. Cars today are fitted with a growing number of environmental sensors that perform a multitude of tasks. The sensor-integrated control system of an AV encompasses three layers: perception, decision, and execution.
01. PERCEPTION LAYER
The perception layer enables sensors not only to detect objects, but also to acquire, classify, and ultimately track surrounding objects.
02. DECISION LAYER
Decision-making is one of the most challenging tasks an AV must perform. It encompasses prediction, path planning, and obstacle avoidance, all performed on the basis of previous perceptions.
03. EXECUTION LAYER
The execution layer interconnects the accelerator, brakes, gearbox, and other actuators. Driven by a Real-Time Operating System (RTOS), these devices carry out the commands issued by the driving computer.
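As a purely illustrative sketch of the three layers above (all function names, thresholds, and data shapes here are invented for this example, not part of any real AV stack), the pipeline can be modeled as a simple sense-decide-act loop:

```python
# Illustrative three-layer AV control loop: perception -> decision -> execution.
# Everything here is a toy model; a real stack runs on dedicated middleware and an RTOS.

def perceive(raw_sensor_frames):
    """Perception layer: detect, classify, and track surrounding objects."""
    return [{"kind": "pedestrian", "distance_m": 12.0}
            for frame in raw_sensor_frames if frame.get("object_detected")]

def decide(tracked_objects):
    """Decision layer: predict, plan a path, and avoid obstacles."""
    if any(obj["distance_m"] < 15.0 for obj in tracked_objects):
        return {"throttle": 0.0, "brake": 0.8}   # obstacle close: brake
    return {"throttle": 0.3, "brake": 0.0}       # road clear: cruise

def execute(command):
    """Execution layer: forward the command to the actuators."""
    return f"throttle={command['throttle']}, brake={command['brake']}"

frames = [{"object_detected": True}, {"object_detected": False}]
print(execute(decide(perceive(frames))))  # throttle=0.0, brake=0.8
```

Each layer consumes only the previous layer's output, which is why decisions are always made "on the basis of previous perceptions" as described above.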
- Required High Performance Computing Power
In response to the exponential increase in autonomous-vehicle usage across the globe, Perfectron continuously develops products suited to self-driving cars. Perfectron's GPGPU AI Fusion computers provide a complete architecture for image processing and driving, with remarkable durability under unpredictable conditions and adaptability to multiple uses. They can process data from multiple vision sensors synchronously and offer a high-performance solution for automated driving that supports all relevant sensor interfaces, buses, and networks.
Depending on environmental conditions and the application, an AV requires a different facility composition and system organization. In current development and testing, AVs are commonly deployed in three main fields: load lifters, shuttle buses, and battle MUTTs. To learn more about their operation, please check out the highlighted solutions below.
|CPU||Intel® 10th Gen Core™ i9-10900TE (Comet Lake-S) Processor, AMI® UEFI BIOS|
|Memory||Up to 64GB DDR4 RAM, 2 x Dual Channel DDR4 2400/2666 MHz, 260-pin SODIMM|
|GPU||NVIDIA MXM RTX A2000 (2560 CUDA cores, 8GB GDDR6)|
|Display Port||DisplayPort 1.4, DP++, max resolution up to 4096x2160@60Hz|
|HDMI||HDMI 2.0a, Max resolution up to 4096x2160@60Hz|
|VGA||1 (resolution up to 1920x1200)|
|Expansion||1 x PCIe x16 (Gen3, supports riser card x8/x8 or x8/x4/x4)
1 x M.2 (Key E, 2230) with PCIe x1 and shared USB 2.0 for wireless
1 x M.2 (Key B, 3042) with shared USB 2.0 and SIM for 4G
1 x M.2 (Key M, 2280) with PCIe x4 and SATA3 for SSD
1 x SIM socket connected to M.2 Key B|
|DisplayPort||2 x DP 1.4|
|HDMI||1 x HDMI 2.0a|
|Ethernet||1 x Intel Gigabit Ethernet LAN (10/100/1000 Mbps), 1 x Intel 2.5 Gigabit Ethernet LAN (10/100/1000/2500 Mbps)|
|USB Port||4 x USB 3.2 Gen 2 Standard-A connectors|
|Serial Port||2 x COM (RS-232/422/485)|
|Audio Port||1 x Line-Out, 1 x MIC-In connector|
|DC-IN||4P Rugged Terminal connector|
Applications, Operating System
|Applications||Commercial and military platforms requiring compliance with MIL-STD-810G:
embedded computing, process control, intelligent automation, and manufacturing
applications involving harsh temperature, shock, vibration, altitude, and dust.
Used in all aspects of the military.|
|Operating System||Windows 10 64-bit
Ubuntu 16.04, Ubuntu 18.04, Fedora 28|
|Dimension (W x D x H)||362(W) x 250(D) x 70(H) mm|
|Chassis||Aluminum Alloy, Corrosion Resistant.|
|Finish||Anodized aluminum oxide (color: iron gray)|
|Cooling||Natural Passive Convection/Conduction. No Moving Parts|
|Connectors||DC-IN: PHOENIX CONTACT 1776715
RJ45 Ethernet:
COM: FEN YING SM10-09P
HDMI + DP: JKCR DisplayPort and HDMI female|
|Ingress Protection||Dust Proof (Similar to IP50)|
|MIL-STD-810G Test||Method 507.5, Procedure II (Temperature & Humidity)
Method 516.6, Shock, Procedure V, Non-Operating (Mechanical Shock)
Method 516.6, Shock, Procedure I, Operating (Mechanical Shock)
Method 514.6, Vibration, Category 24, Non-Operating (Category 20 & 24, Vibration)
Method 514.6, Vibration, Category 20, Operating (Category 20 & 24, Vibration)
Method 501.5, Procedure I (Storage/High Temperature)
Method 501.5, Procedure II (Operation/High Temperature)
Method 502.5, Procedure I (Storage/Low Temperature)
Method 502.5, Procedure II (Operation/Low Temperature)
Method 503.5, Procedure I (Temperature Shock)|
|Operating Temperature||-20°C to 60°C (ambient with 0.7m/s airflow)|
|Storage Temperature||-40°C to 85°C|
|EMC||CE and FCC compliance|