E-mail: sales@coolifegroup.com
Phone: 15814041681
Address: Block 3, Wujiyungu, Rongxing Road, Xinci Street, Pingshan District, Shenzhen, China
CL-IT-003: Intelligent Connected Sightseeing Vehicle
I. Product Overview
The autonomous shuttle vehicle showcases a fully self-developed autonomous driving solution, integrating sensor technologies such as cameras, LiDAR, millimeter-wave radar, ultrasonic sensors, and GPS/IMU. Relying on advanced perception algorithms and behavior-prediction technology, the vehicle achieves precise obstacle detection and response. It uses a strategy based primarily on laser positioning, supplemented by RTK positioning, optimizing location accuracy in complex environments and ensuring stable operation of the autonomous vehicle even under signal interference.
To adapt to mixed traffic conditions of pedestrians and vehicles, the shuttle is equipped with a planning and control unit designed specifically for autonomous vehicles. This unit integrates multi-sensor fusion technology, supporting L3/L4-level autonomous driving requirements. The integrated computing platform can process inputs from multiple cameras, LiDAR point-cloud data, millimeter-wave radar signals, and ultrasonic sensor information. Additionally, the built-in IMU enhances the vehicle's positioning and navigation accuracy, while target fusion, combined positioning, and decision-planning capabilities further optimize the vehicle's autonomous driving performance.
The autonomous shuttle vehicle not only supports key autonomous driving functions such as automatic tracking, obstacle identification, automatic braking, station docking, local path planning, and automatic parking but also provides comprehensive services including system setting, calibration, fault diagnosis, and software upgrades. Its autonomous driving algorithm software system consists of perception, fusion, planning, and control modules, which are modularly designed and interconnected through API interfaces. This provides educational institutions with high openness and flexibility, allowing users to customize functional modules to meet various training and teaching needs.
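As a teaching illustration, the "laser-primary, RTK-supplementary" positioning strategy described above can be sketched in a few lines. All names, the 0.5 m quality gate, and the inverse-variance blending are illustrative assumptions for classroom use, not the vehicle's actual implementation.

```python
# Hypothetical sketch: blend a LiDAR map-matching fix with an RTK fix,
# falling back to laser-only positioning when GNSS is degraded.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Fix:
    x: float          # metres east
    y: float          # metres north
    stddev: float     # 1-sigma position uncertainty in metres

def fuse_position(lidar_fix: Fix, rtk_fix: Optional[Fix]) -> Fix:
    """LiDAR always contributes; RTK only tightens the estimate when its
    reported uncertainty is reasonable, so signal interference (huge stddev
    or no fix at all) degrades gracefully to laser-only positioning."""
    if rtk_fix is None or rtk_fix.stddev > 0.5:   # 0.5 m gate: assumed value
        return lidar_fix
    # Inverse-variance weighting of the two independent estimates.
    w_l = 1.0 / lidar_fix.stddev ** 2
    w_r = 1.0 / rtk_fix.stddev ** 2
    s = w_l + w_r
    return Fix(
        x=(w_l * lidar_fix.x + w_r * rtk_fix.x) / s,
        y=(w_l * lidar_fix.y + w_r * rtk_fix.y) / s,
        stddev=(1.0 / s) ** 0.5,
    )
```

With this weighting, a centimetre-grade RTK fix dominates the blend in open areas, while in tunnels or urban canyons the laser fix carries the vehicle alone.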
II. Feature Overview
The autonomous shuttle vehicle integrates advanced autonomous driving technology and a multi-sensor fusion computing platform, designed to meet L3/L4 level autonomous driving needs, with the following main functions and features:
1. Multi-Camera and Visual Recognition: Supports inputs from multiple cameras, with robust visual recognition and automatic parking data processing capabilities.
2. LiDAR Point Cloud Processing: Capable of processing LiDAR-generated point cloud data, supporting precise environmental perception and obstacle identification.
3. Millimeter-Wave Radar Data Processing: Integrates inputs from multiple millimeter-wave radars, providing high-precision moving target detection and tracking.
4. Ultrasonic Sensor Application: Equipped with 12 ultrasonic sensors, enhancing short-range obstacle detection and avoidance capabilities.
5. IMU Integration: Built-in Inertial Measurement Unit (IMU) enhances the vehicle's motion state monitoring and precise positioning.
6. Target Fusion and Positioning: Achieves target fusion and combined positioning functions, improving the accuracy and reliability of decision planning.
7. Vehicle Data Management: Supports vehicle data access and complex data processing, providing support for autonomous driving decisions.
8. Multi-Channel Vehicle Control: Features multi-channel control buses, meeting complex vehicle control requirements.
9. System Setting and Calibration: Provides system setting and calibration functions, optimizing autonomous driving system performance.
10. Fault Diagnosis: Built-in fault diagnosis function, ensuring stable system operation.
11. System and Software Upgrades: Supports system and software upgrades, ensuring continuous technological progress and compatibility.
The autonomous driving algorithm software system uses a modular design, including perception, fusion, planning, control, and a full set of software, with API interfaces for inter-module communication, forming a complete autonomous driving software system. The vehicle offers open API interfaces, allowing educational institutions to independently develop and verify related functional modules, providing an experimental platform for teaching and research in autonomous driving technology, deepening students' understanding and application skills in autonomous driving technology.
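The modular perception–fusion–planning–control chain described above can be sketched as a minimal pipeline where each module talks only through a plain API boundary. The module names match the document; the data shapes and toy logic are assumptions for teaching purposes only.

```python
# Illustrative sketch of the perception -> fusion -> planning -> control
# pipeline with API-only coupling between modules (toy logic, not the
# vehicle's real stack).
from typing import List, Protocol

class Module(Protocol):
    def step(self, data: dict) -> dict: ...

class Perception:
    def step(self, data: dict) -> dict:
        # The real module would consume camera/LiDAR/radar frames;
        # here we simply forward detections as obstacles.
        data["obstacles"] = data.get("raw_detections", [])
        return data

class Fusion:
    def step(self, data: dict) -> dict:
        # Deduplicate obstacles reported by multiple sensors.
        data["tracks"] = sorted(set(data["obstacles"]))
        return data

class Planning:
    def step(self, data: dict) -> dict:
        data["plan"] = "stop" if data["tracks"] else "follow_route"
        return data

class Control:
    def step(self, data: dict) -> dict:
        data["throttle"] = 0.0 if data["plan"] == "stop" else 0.3
        return data

def run_pipeline(modules: List[Module], frame: dict) -> dict:
    for m in modules:
        frame = m.step(frame)   # modules communicate only through this API
    return frame
```

Because each stage sees only the shared frame, a class can swap in its own perception or planning module without touching the rest of the chain, which is exactly the openness the training platform advertises.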
III. Vehicle Configuration
1. By-Wire Shuttle (Parameters)
2. Computing Unit
· CPU: 6 cores / 12 threads, 2.9 GHz base frequency, 12 MB L3 cache, ensuring powerful data processing capability.
· GPU: Discrete graphics processor with 3584 CUDA cores, 15 Gbps memory data rate, 12 GB GDDR6, supporting complex image processing tasks.
· Memory: 16 GB LPDDR4x 2666 MHz, ensuring smooth system operation.
· Storage: 500 GB solid-state drive, providing ample data storage space.
· Interfaces: Gigabit Ethernet + WiFi and USB 3.0, meeting diverse communication needs.
3. Front Camera
· Sensor: IMX291, 1/2.8-inch optical format, providing clear image capture.
· Interface: USB 3.0, ensuring high-speed data transfer.
· Maximum Effective Pixels: 2 MP, resolution 1920×1080, supporting high-definition video capture.
· Output Format: MJPEG / YUY2 (YUYV), meeting different image processing needs.
· Maximum Frame Rate: 50 fps in both YUV and MJPEG modes, ensuring smooth video capture.
· Detection Targets: Vehicles, pedestrians, traffic signs, traffic lights, etc., enhancing environmental perception capabilities.
4. 16-Line LiDAR
· Scanning Channels: 16 lines, providing 360-degree spatial scanning.
· Laser Wavelength: 905nm, suitable for various detection environments.
· Detection Distance: 70 meters to 200 meters, meeting long-range sensing needs.
· Power Supply Range: 9V-36VDC, adaptable to different power systems.
· Communication Interface: Ethernet, with PPS time-synchronization input, ensuring stable data transmission.
· Data: Includes three-dimensional spatial coordinates and point cloud reflectivity, supporting precise environmental modeling.
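Since the LiDAR data stream carries three-dimensional coordinates plus reflectivity per return, a lab exercise often starts by unpacking those records from a raw buffer. The 16-byte little-endian layout below is an assumed example format, not this sensor's documented wire protocol.

```python
# Hedged sketch: unpack (x, y, z, reflectivity) records from a packed
# point buffer. The record layout is an illustrative assumption.
import struct

POINT_FMT = "<fffI"                       # x, y, z float32 metres; reflectivity uint32
POINT_SIZE = struct.calcsize(POINT_FMT)   # 16 bytes per return

def parse_points(payload: bytes):
    """Yield (x, y, z, reflectivity) tuples from a packed point buffer."""
    for off in range(0, len(payload) - POINT_SIZE + 1, POINT_SIZE):
        yield struct.unpack_from(POINT_FMT, payload, off)
```

A real driver would follow the vendor's packet documentation, but the exercise is the same: fixed-size records decoded into spatial coordinates and intensity for environmental modeling.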
5. Combined Positioning Unit
· Positioning Modes: Supports RTK, GNSS single-point, and tri-constellation seven-frequency reception (GPS, BDS, GLONASS).
· Built-in: 6-axis IMU, providing accurate posture recognition.
· Attitude Accuracy: 0.1° (baseline length ≥2m), ensuring high-precision positioning.
· Positioning Accuracy: Single point L1/L2 at 1.2m, DGPS at 0.4m, RTK at 1cm+1ppm, meeting high-precision navigation requirements.
· Input Voltage: 9~32V DC (standard adapted to 12V DC), power consumption
6. Millimeter-Wave Radar
· Operating Frequency: 76GHz to 77GHz, suited for high-precision long-range detection.
· Detection Distance: 0.2m to 250m, covering a wide range of distances.
· Distance Resolution and Accuracy: Close range ±0.39m, long range ±1.79m; close range accuracy ±0.10m, long range accuracy ±0.40m.
· Speed Range: -400 km/h to +200 km/h, supporting the detection of high-speed moving targets.
· Speed Resolution and Accuracy: Long range 0.37km/h, close range 0.43km/h; speed accuracy ±0.1 km/h.
· Detection Targets: Includes targets moving away, approaching, stationary, and crossing.
· Data Output: Supports CAN/CANFD, providing target ID, distance, speed, and radar cross-section (RCS).
· Working Environment: Temperature -40℃ to 85℃, voltage 9-16V, protection level IP67.
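Since the radar reports target ID, distance, speed, and RCS over CAN/CANFD, a typical training task is decoding one target message from a raw frame. The 8-byte layout and scale factors below are illustrative assumptions; a real deployment would follow the radar vendor's DBC file.

```python
# Hypothetical decoder for one radar target message (assumed layout):
#   byte 0    : target ID
#   bytes 1-2 : distance, unsigned big-endian, LSB = 0.1 m
#   bytes 3-4 : speed, signed big-endian, LSB = 0.1 km/h (approaching < 0)
#   bytes 5-6 : RCS, signed big-endian, LSB = 0.1 dBsm
#   byte 7    : reserved
import struct

def decode_radar_target(frame: bytes) -> dict:
    """Decode an assumed 8-byte target frame into engineering units."""
    tid, dist_raw, speed_raw, rcs_raw, _reserved = struct.unpack(">BHhhB", frame)
    return {
        "id": tid,
        "distance_m": dist_raw * 0.1,
        "speed_kmh": speed_raw * 0.1,
        "rcs_dbsm": rcs_raw * 0.1,
    }
```

The signed speed field is what lets the radar distinguish approaching, receding, and stationary targets, as listed in the detection capabilities above.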
7. Ultrasonic Sensors
· Power Supply: +12V to 24V.
· Operating Temperature: -40℃ to +85℃.
· Measuring Range: 130mm to 5000mm, adjustable measuring distance.
· Accuracy: 0.5% of the detection distance.
· Resolution: 5mm.
· Communication Interface: Compatible with CAN2.0A and CAN2.0B.
· Sampling Period: 100 ms, with a probe beam angle of 60 degrees.
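The ultrasonic figures quoted above (130–5000 mm window, 5 mm resolution, 0.5%-of-distance accuracy) combine into a simple validity check that students can reproduce. The helper below is a sketch under those stated specifications; the function name and return shape are assumptions.

```python
# Sketch: apply the quoted ultrasonic specs to one raw reading.
def ultrasonic_reading(raw_mm: float):
    """Return (measured_mm, tolerance_mm), or None if out of range."""
    if not 130 <= raw_mm <= 5000:
        return None                      # outside the specified 130-5000 mm window
    measured = round(raw_mm / 5) * 5     # quantize to the 5 mm resolution
    tolerance = 0.005 * measured         # accuracy: 0.5 % of detection distance
    return measured, tolerance
```

For example, a 1 m reading carries a ±5 mm band, while a reading at the 5 m limit carries ±25 mm, which is why these sensors are reserved for short-range avoidance.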
IV. Vehicle Function Description
1. Complete Autonomous Driving System: The vehicle is equipped with a complete autonomous driving system, ensuring it can operate normally under the management of the system.
2. L4 Autonomous Driving Capabilities: Based on high-precision maps, the system can perform L4-level autonomous driving functions, including but not limited to active path tracking, obstacle identification and avoidance, autonomous braking, precise station docking, and dynamic local path planning.
3. Adjustable Driving Parameters: Provides a user-friendly interface for driving parameter settings, allowing users to adjust the autonomous driving system's strategy according to actual needs.
4. High-Precision Map Generation: The system integrates the function of generating high-precision maps, capable of recording and processing point cloud data, and creating detailed navigation maps in conjunction with map-making software.
5. Sensor Application Training Software: Provides dedicated training software for each sensor, supporting step-by-step functional teaching and technical training.
6. Multi-Technology Fusion Positioning: Combines a variety of advanced positioning technologies, supporting precise path tracking and navigation in indoor and outdoor environments.
V. Supporting Software Description
1. Vision Testing Software:
· Includes modules for vehicle and pedestrian recognition, lane line and traffic light recognition.
· Supports rapid installation, calibration, debugging, data collection, and processing of cameras.
· Provides a complete development toolkit, including Python3, TensorFlow, CUDA, etc.
· Includes machine learning models, training samples, and a real-time processing DEMO, supplemented with a training dataset.
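Recognition modules like the vehicle and pedestrian detectors above typically end with a confidence filter plus non-maximum suppression over raw model output. This pure-Python sketch shows that post-processing step; the box tuple format `(x1, y1, x2, y2, score, label)` and the thresholds are assumptions, not the training software's actual API.

```python
# Sketch: confidence filtering + per-class non-maximum suppression.
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def filter_detections(dets, score_thr=0.5, iou_thr=0.5):
    """Keep confident boxes, suppressing same-class overlaps (greedy NMS)."""
    dets = sorted((d for d in dets if d[4] >= score_thr),
                  key=lambda d: d[4], reverse=True)
    kept = []
    for d in dets:
        # Keep d unless a higher-scoring box of the same class overlaps it.
        if all(d[5] != k[5] or iou(d[:4], k[:4]) < iou_thr for k in kept):
            kept.append(d)
    return kept
```

In the supplied toolkit the detector itself would come from the TensorFlow models and training samples mentioned above; this step is framework-independent.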
2. Radar Testing Software:
· Supports testing of millimeter-wave radar and ultrasonic sensors, including distance detection and target recognition.
· Real-time reception and analysis of radar data streams, applicable for target analysis under different working conditions.
· Provides the capability to read fault information.
3. LiDAR Testing Software:
· Supports interface testing and LiDAR configuration (including network settings, time synchronization, motor parameter adjustments, etc.).
· Real-time reception of LiDAR data streams, visualizing point cloud information.
4. Combined Navigation Testing Software:
· Includes interface testing and calibration of the combined navigation system (initial alignment, navigation modes, coordinate axis configuration, data output, etc.).
· Supports the reception and analysis of combined navigation data, offering the ability to read fault information.
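Receiving and analyzing the combined-navigation data stream usually begins with parsing NMEA sentences from the GNSS side of the unit. The minimal GGA parser below extracts only latitude, longitude, and fix quality; the function name and return shape are assumptions for illustration.

```python
# Sketch: parse one NMEA GGA sentence into decimal degrees and fix quality
# (quality 4 indicates an RTK fixed solution).
def parse_gga(sentence: str):
    """Return (lat_deg, lon_deg, fix_quality), or None if not a valid GGA."""
    f = sentence.split(",")
    if not f[0].endswith("GGA") or not f[2]:
        return None
    def dm_to_deg(value: str, hemi: str) -> float:
        # NMEA encodes position as ddmm.mmmm (degrees + minutes).
        deg, minutes = divmod(float(value), 100.0)
        deg += minutes / 60.0
        return -deg if hemi in ("S", "W") else deg
    return dm_to_deg(f[2], f[3]), dm_to_deg(f[4], f[5]), int(f[6])
```

Watching the fix-quality field step from single-point to DGPS to RTK fixed is a direct, observable way to connect the positioning-accuracy figures in the specification to live data.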