Automotive Collision Avoidance System Field Operational Test Program
FIRST ANNUAL REPORT

APPENDIX A
Function Diagrams and Descriptions


This Appendix includes the process model for the system that was developed as part of the Functional Description Task (Task A1). The process model shows the functional decomposition of the system through data and control flow diagrams. Each circle in the data and control flow diagrams represents a function performed by the system. Double circles represent primitive functions while single circles indicate there is another diagram that decomposes the function into lower level functions. The solid lines and arrows between the functions indicate flow of information. The dashed lines indicate control signals. Double horizontal lines with a name between them represent data storage. Single vertical lines represent flow into a control specification. Control specifications define the states (operating modes) and state transitions of the system.

The Context Diagram (Figure A1) shows the relationship between the functions provided by the system and the entities that interact with the system. The ACC/FCW System takes inputs from sensors that determine the driving environment, the driver’s activities, the host vehicle actuators, and the vehicle dynamics. The ACC/FCW system controls the vehicle speed when ACC is active and produces Forward Collision Warning alerts and warnings for the driver. The ACC maintains a constant speed set by the driver, or a set headway if there is a lead vehicle traveling slower than the set speed. The FCW produces alerts and warnings based on an assessment of the threat of a crash.


Figure A1: Context Diagram


The ACC and FCW Function diagram (Figure A2) shows the interaction between the top-level functions and the entities that interact with the system. The Sensor Specific Functions include radar processing, vision-based lane tracking, map-based road geometry estimation and yaw-based path estimation. These functions use each sensor to determine the road geometry, to estimate the current relationship between the vehicle and the road, and/or to predict the host vehicle’s path. The sensor specific functions also use the radar data to detect, track and classify objects in the forward environment of the host vehicle. Finally the sensor specific functions include vehicle kinematics estimation based upon the GPS data.

Figure A2: ACC and FCW Functions


The Vehicle Sensor Filtering function filters the vehicle kinematics sensors to provide engineering units and to reduce noise in these measurements.

The Data Fusion Functions combine the evidence from the entire sensor suite to develop a higher confidence prediction of the host vehicle’s path and to predict the driver/vehicle response in the event of an alert.

The Target Identification and Threat Assessment Functions determine which targets are likely to cross the path of the host vehicle, determine if a collision warning should be produced, and select the targets for the ACC functions. They also prioritize the targets to help with resource allocation within the Sensor Specific Functions.

The ACC Vehicle Controls maintain the vehicle’s speed or headway when the ACC is on and engaged. The controls are similar to those of a conventional cruise control system with the addition of a headway setting. The output includes throttle and brake actuator control signals. The ACC vehicle control also responds to a brake pulse request by controlling the brake actuator control signals. In headway maintenance mode the ACC gets range and range rate data for the primary target from the Target Selection function that is part of the Threat Assessment Functions.
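The switching between speed maintenance and headway maintenance can be sketched as a simple proportional controller. All function names, gains, and the time-gap value below are illustrative assumptions, not the FOT implementation:

```python
def acc_command(set_speed, speed, target=None, desired_gap_s=1.5, kp=0.4):
    """Toy proportional ACC: returns an acceleration command (m/s^2).

    target, if present, is the (range, range_rate) of the primary
    target supplied by the Target Selection function.  Gains and the
    desired time gap are illustrative, not the program's values.
    """
    # Speed maintenance: close the gap to the driver's set speed.
    speed_cmd = kp * (set_speed - speed)
    if target is None:
        return speed_cmd
    rng, rng_rate = target  # range (m) and range rate (m/s)
    # Headway maintenance: regulate range toward the desired time gap.
    gap_error = rng - desired_gap_s * speed
    headway_cmd = kp * gap_error + 0.8 * rng_rate
    # Never command more acceleration than plain speed control would.
    return min(speed_cmd, headway_cmd)
```

With no target, the command simply tracks the set speed; with a slower lead vehicle the headway term dominates and the (negative) command would be mapped to throttle release or brake actuation.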

The Driver-Vehicle Interface Functions control all of the devices that transmit information to the driver. These include audio, visual, and haptic outputs. The visual display includes a head-up display. The information displayed includes the status of the ACC (on, engaged, set speed, and target detected). The information also includes warnings that indicate maintenance is required or that the vehicle is being operated beyond the range of capability of the ACC/FCW. The warnings may include multiple levels.

The Data Acquisition function collects data from the FOT. The data will include the vehicle kinematics, warning levels, and intermediate results from many of the processing functions. It will also include video of the roadway ahead of the vehicle and of the driver’s head.

The Sensor Specific Functions (Figure A3) include Radar Processing, Vision-Based Lane Tracking, Map-Based Road Geometry Estimation and Yaw-Based Path Estimation.

Figure A3: Sensor Specific Functions


The Vision-Based Lane Tracking function determines the geometry of the road ahead of the vehicle and the relationship between the road and the host vehicle. The road-geometry information includes the curvature and/or offset of the road at selected distances ahead of the vehicle. The relationship between the vehicle and the road includes the lateral position in the lane, the heading angle, and whether a lane change is occurring.

The Map-Based Road Geometry Estimation function uses a roadmap database, DGPS, and dead reckoning to determine the current map position of the vehicle. It then extracts information from the database indicating the geometry of the road ahead of the vehicle, the relationship of the vehicle to the road, and the location of significant features along the road. It also produces vehicle kinematics measurements based upon the GPS data.

The Yaw-Based Path Estimation function predicts the host vehicle’s path using yaw-rate sensor input, vehicle speed and acceleration measurements, and steering wheel angle measurements.
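One common form of yaw-based path prediction assumes a constant yaw rate, which sweeps out a circular arc of radius speed/yaw-rate. The sketch below illustrates that idea only; the program's actual estimator (which also uses acceleration and steering angle) may differ:

```python
import math

def predict_path(speed, yaw_rate, horizon_s=3.0, dt=0.5):
    """Predict (x, y) points along a constant-curvature arc.

    speed in m/s, yaw_rate in rad/s; x is forward, y is left.
    A constant yaw-rate model, shown as an assumption for
    illustration, not the program's exact algorithm.
    """
    pts = []
    t = dt
    while t <= horizon_s + 1e-9:
        heading = yaw_rate * t
        if abs(yaw_rate) < 1e-6:           # straight-line limit
            x, y = speed * t, 0.0
        else:
            r = speed / yaw_rate            # signed turn radius
            x = r * math.sin(heading)
            y = r * (1.0 - math.cos(heading))
        pts.append((x, y))
        t += dt
    return pts
```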

The Radar Processing function is covered below.

The Radar Processing function (Figure A4) includes Target Detection, Multi-Target Tracking, Target Classification, Scene Tracking, Lane Position Estimation, and Auto-Alignment and Blockage Detection.

Figure A4: Radar Processing


The Target Detection function processes the radar signals to produce estimates of the range, range rate, acceleration, and extent of objects. It also reports the amplitude of the return from each detection.

The Multi-Target Tracking function associates detections in each new sample with previously observed tracks. It reports whether any currently stationary objects were ever observed to be moving, and can let a target "coast" if it disappears for a short period of time.
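The association, "ever moving" memory, and coasting behavior can be sketched with a nearest-neighbor update. The gate size, coast limit, and data structure below are assumptions for illustration, not the radar supplier's tracker:

```python
def update_tracks(tracks, detections, gate=5.0, max_coast=3):
    """One cycle of nearest-neighbour track update with coasting.

    tracks: list of dicts {"range", "rate", "coast", "ever_moved"}.
    detections: list of (range, range_rate) tuples.
    Gate size and coast limit are illustrative assumptions.
    """
    unused = list(detections)
    survivors = []
    for trk in tracks:
        # Find the closest detection within the association gate.
        best = min(unused, key=lambda d: abs(d[0] - trk["range"]), default=None)
        if best is not None and abs(best[0] - trk["range"]) < gate:
            unused.remove(best)
            trk["range"], trk["rate"] = best
            trk["coast"] = 0
            # Remember if this object was ever observed moving.
            trk["ever_moved"] = trk["ever_moved"] or abs(best[1]) > 0.5
            survivors.append(trk)
        else:
            trk["coast"] += 1               # no detection: coast the track
            if trk["coast"] <= max_coast:
                survivors.append(trk)        # keep it alive briefly
    # Any leftover detections start new tracks.
    for rng, rate in unused:
        survivors.append({"range": rng, "rate": rate, "coast": 0,
                          "ever_moved": abs(rate) > 0.5})
    return survivors
```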

The Target Classification function looks at the target tracks to determine if any should be associated into a larger object such as a bridge or a truck. If this occurs it indicates which tracks are associated and calculates some composite features of the object.

The Scene Tracking function evaluates the target tracks to estimate the geometry of the road ahead of the vehicle and the vehicle’s relationship to the road.

The Auto-Alignment and Blockage Detection function evaluates the radar returns to detect when the signal seems to be attenuated by a blocked radome. It also looks at target tracks to produce electronic adjustments of the radar alignment. This function also produces control signals that indicate if the radome is blocked or if the alignment is beyond the range that can be corrected.

Data fusion techniques are used to combine results derived from individual sensors into a composite evaluation of the road geometry, host state, environment state, and driver distraction level (Figure A5).

Figure A5: Data Fusion Functions


The Road Geometry Estimation function uses data fusion techniques to produce a road geometry model based upon the sensor specific estimates.
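A textbook way to combine sensor-specific estimates into one higher-confidence value is inverse-variance weighting. The sketch below illustrates the idea for a single curvature value; the program's fusion algorithm is not specified here and may be more elaborate:

```python
def fuse_curvature(estimates):
    """Inverse-variance fusion of per-sensor curvature estimates.

    estimates: list of (curvature, variance) pairs from e.g. the
    vision, map-based, yaw-based, and scene-tracking sources.
    Returns the fused curvature and its (smaller) variance.
    """
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * c for (c, _), w in zip(estimates, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var
```

Note that the fused variance is always below the smallest input variance, which is the sense in which fusion yields a higher-confidence estimate.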

The Host State Estimation function uses data fusion techniques to estimate the relationship between the host vehicle and the road based upon the sensor specific estimates. This includes determining if a lane change is occurring.

The Environment State Estimation function uses data fusion techniques to estimate the condition of the road, the weather, and the visibility, based upon evidence from several vehicle sensors.

The Driver Distraction Level Estimation function keeps track of driver activity to determine if the driver is performing tasks other than driving. It uses this information to derive an estimate of the distraction level of the driver.

The Target Identification and Threat Assessment Functions (Figure A6) identify targets that are likely to cross the host vehicle’s path, estimate the driver’s and vehicle’s response to each threat, determine if any of the threats satisfy the criteria for FCW warnings, and select the target for ACC.

Figure A6: Target Identification and Threat Assessment Functions


The Host Path Prediction function uses the vehicle kinematics, road geometry and host state to predict the path of the host vehicle relative to its current position.

The Lane Position Estimation function estimates the relationship of each tracked target to the roadway geometry derived from the tracks. It determines which lane the target is in, its lateral offset, and its lateral velocity in that lane.
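The lane-assignment part of this function amounts to quantizing a target's lateral offset from the host path into a lane index. The lane width and sign convention below are assumptions for illustration:

```python
def assign_lane(lateral_offset, lane_width=3.6):
    """Map a target's lateral offset (m, positive left of the host
    path centre) to an integer lane index: 0 = host lane, +1 = one
    lane to the left, -1 = one lane to the right.  The 3.6 m lane
    width is an assumed typical value, not a program constant.
    """
    return round(lateral_offset / lane_width)
```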

The Target Selection function evaluates the predicted path of the host vehicle and the objects to determine the threatening targets that will be used for ACC control and for FCW threat assessment. The FCW targets are those that are in the host vehicle’s path or are predicted to cross the host vehicle’s path. They may be moving or stationary.

The Driver/Vehicle Response Prediction function predicts how fast and how hard a driver is likely to brake if a warning is generated. It assesses the environmental conditions, current speed and headway, and other driving conditions that impact reaction time and the intensity of the response.

The Threat Assessment function uses the host vehicle dynamics, the target dynamics, and the expected driver response to determine what level of warning should be generated. The warning algorithm also depends upon whether the ACC is active. When ACC is active a warning is produced if it is predicted that the maximum braking authority will not prevent a collision.
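For the ACC-active case, the braking-authority test can be sketched as a comparison between the constant deceleration needed to null the closing rate within the available range and the ACC's maximum authority. The formula and the 3 m/s² authority below are illustrative assumptions, not the program's warning algorithm:

```python
def acc_warning_needed(rng, rng_rate, a_max=3.0):
    """Decide if an FCW alert is needed while ACC is active.

    Warn when the deceleration required to stop closing within the
    current range exceeds the ACC's maximum braking authority a_max
    (m/s^2).  For a closing target (rng_rate < 0) the constant-
    deceleration requirement is v_rel^2 / (2 * range).
    """
    if rng_rate >= 0.0:           # opening or steady: no threat
        return False
    required = (rng_rate ** 2) / (2.0 * rng)
    return required > a_max
```

For example, closing at 10 m/s with 10 m of range requires 5 m/s² of deceleration, beyond a 3 m/s² authority, so a warning would be produced; the same closing rate at 50 m requires only 1 m/s² and would not.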

The Data Acquisition Functions (Figure A7) record measured and computed values as well as video and audio information. Most variables are recorded continuously. Audio-visual data is recorded in clips at regular intervals and when predefined incidents are detected. The system transmits a summary of the data to a base station at the end of each trip by the host vehicle. The complete set of collected data is offloaded when each subject is finished with the vehicle.

Figure A7: Data Acquisition


The Scale and Condition Data Channels function performs the necessary unit conversions and signal conditioning.
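One plausible form of this step is a linear counts-to-engineering-units conversion followed by a first-order low-pass filter. The per-channel scale, offset, and filter constant below are assumed calibration values for illustration:

```python
def condition(raw_counts, scale, offset, prev, alpha=0.2):
    """Convert raw A/D counts to engineering units and smooth them.

    A sketch of unit conversion plus simple first-order IIR noise
    reduction; the actual per-channel conditioning is defined by
    the data-acquisition design, not by this example.
    """
    value = raw_counts * scale + offset       # counts -> engineering units
    return prev + alpha * (value - prev)      # first-order low-pass filter
```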

The Calculate Derived Values function calculates values from the directly measured values that may be required in real time by other functions.

The Detect Transitions and Episodes function looks for pre-defined conditions that trigger storage of audio/video clips. The count of some detected transitions and episodes may also be stored and/or transmitted at the end of each trip.
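The trigger logic can be sketched as a set of per-sample condition checks that emit episode labels. The two conditions below (hard braking, any FCW warning) are examples of the kind of pre-defined conditions the FOT might use; the real trigger set is defined by the program:

```python
def detect_episode(decel, warning_level, hard_brake=4.0):
    """Return the list of episode labels triggered this sample.

    decel: current deceleration (m/s^2); warning_level: FCW level
    (0 = none).  The 4 m/s^2 hard-braking threshold is an assumed
    illustrative value.
    """
    episodes = []
    if decel >= hard_brake:
        episodes.append("hard_braking")
    if warning_level > 0:
        episodes.append("fcw_warning")
    return episodes
```

Each returned label would both cue the audio-video system and increment the corresponding per-trip count.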

The Update Time History function maintains the log of continuously recorded measurements and derived data.

The Update Histograms and Counts functions maintain the histograms of the measured values and counts of events that are transmitted to the base station at the end of each trip.
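Histogram maintenance reduces to incrementing the bin each new sample falls in. The underflow/overflow bin layout below is an assumption for illustration:

```python
def update_histogram(hist, value, bin_edges):
    """Increment the bin of `hist` that `value` falls in.

    hist has len(bin_edges) + 1 counts: an underflow bin, one bin
    per interval, and an overflow bin.  A minimal sketch of per-trip
    histogram maintenance; the actual bin layout is an assumption.
    """
    i = 0
    while i < len(bin_edges) and value >= bin_edges[i]:
        i += 1
    hist[i] += 1
    return hist
```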

The Cue Audio-Video System is triggered by the detection of an episode. It controls the software that logs audio and video data for a short period before and after each episode. It also causes the audio and video systems to record short clips at regular intervals while the vehicle is operating.

The Digitize Video function controls the frame grabber and collects video from cameras that look out the front window and at the driver.

The Digitize Audio function controls the audio digitizer for recording sound from the passenger compartment.

The Store Frames in Buffer and Store Audio in Buffers functions control first-in first-out buffers so that data that precedes the detection of an episode can be recorded.

The Log Selected Video Buffers and Log Selected Audio Buffers functions transfer data from the first-in first-out buffers when triggered by the Cue Audio-Video System function.
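The store/log pair amounts to a bounded ring buffer that is drained when the cueing function fires, so the frames preceding the trigger survive. This is a generic sketch of that mechanism; the capacity and interface are arbitrary:

```python
from collections import deque

class PreTriggerBuffer:
    """FIFO that keeps only the most recent frames so data preceding
    an episode can be logged.  A generic ring-buffer sketch of the
    Store-in-Buffer / Log-Selected-Buffers pair; not the actual
    implementation.
    """
    def __init__(self, capacity):
        self.frames = deque(maxlen=capacity)   # oldest frames drop off

    def store(self, frame):
        self.frames.append(frame)

    def log_selected(self):
        """Called by the cueing function on an episode: drain the
        buffered pre-trigger frames for permanent storage."""
        clip = list(self.frames)
        self.frames.clear()
        return clip
```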

Figures A8 and A9 show the relationship between the function diagrams and the physical modules in the system. Figure A8 is an enhanced version of the ACC and FCW Function. It augments the basic functions with those required to control the interfaces. In addition to the modules listed above, the enhanced functional diagram shows the Brake, Throttle and HUD modules.

The functions performed by each module are enclosed in polygons on the enhanced functional diagram and the subsequent decompositions. Two functions in the top-level diagram have sub-functions assigned to more than one module. Parts of the Sensor Specific Functions are executed in the Radar, Vision Module, Map-based Road Geometry Module, Path Prediction & Target Selection, and Scene Tracking Modules. Parts of the Threat Assessment Functions are executed in the Path Prediction & Target Selection module and in the FCW Processor. The assignment of each of the sub-functions to each of the modules is shown in the subsequent diagrams.

Figure A8: Enhanced ACC and FCW Function


 

Figure A9: Threat Assessment Functions


 

Figure A10: Sensor Specific Functions


